
Ant colonies and intelligence.

On paths to food, in specific places in the nest, perhaps on other ants...
 
The paths bit I could understand, but when the nest is destroyed, how do they know how to build a new one more quickly? Why would accumulated chemicals (which I assume don't last all that long, nor do the ants carrying them) be able to provide a memory for learning?
 
drkitten, I see your point. Which schools, btw?

Oxford, Cambridge, Toronto, San Diego, and Caltech leap to mind immediately.

I guess my idea is that each ant is a bundle of autonomous algorithms running in parallel, whereas I don't think that this is true in the cat's brain. Yes, parallel operation, but parallel, independent algorithms. Perhaps I am hung up on a distinction that doesn't matter?

I dunno, man. It seems like each individual neuron in a cat's brain is pretty effin' independent.
 
The paths bit I could understand, but when the nest is destroyed, how do they know how to build a new one more quickly? Why would accumulated chemicals (which I assume don't last all that long, nor do the ants carrying them) be able to provide a memory for learning?

Another wild guess: Perhaps the first destruction resulted in more "builders" being born or differentiated.
 
I dunno, man. It seems like each individual neuron in a cat's brain is pretty effin' independent.
I don't see how that is relevant. The neuron is not running a complex algorithm like an ant is.

I'm going to make up a pursuit algorithm such as might happen in an evolved animal.

gen 0: run at prey
result: some prey caught, but lots of deaths due to running into trees, ditches

gen 1: run at prey, stop if obstacle
result: some prey caught, no deaths due to injury

gen 2: run at prey, turn left at obstacle

gen 3: run at prey, turn towards easiest terrain immediately adjacent to obstacle

gen 4: scan terrain, pick terrain with largest groupings of "easy"

gen 5: you get the idea...
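
To make that concrete, here's a toy version of the first few generations in code. All the names and details (the grid, the helper functions, the obstacle set) are my own invention for illustration, not a claim about how any real animal works:

```python
def sign(x):
    return (x > 0) - (x < 0)

def toward(me, prey):
    # Unit grid step from `me` toward `prey`.
    return (sign(prey[0] - me[0]), sign(prey[1] - me[1]))

def gen0(me, prey, blocked):
    # gen 0: run straight at the prey, ignoring obstacles entirely
    return toward(me, prey)

def gen1(me, prey, blocked):
    # gen 1: run at prey, but stop dead if the next cell is an obstacle
    step = toward(me, prey)
    nxt = (me[0] + step[0], me[1] + step[1])
    return (0, 0) if nxt in blocked else step

def gen2(me, prey, blocked):
    # gen 2: run at prey; apply a fixed left turn when blocked
    step = toward(me, prey)
    nxt = (me[0] + step[0], me[1] + step[1])
    if nxt in blocked:
        step = (-step[1], step[0])  # rotate the step 90 degrees left
    return step

# e.g. a predator at (0, 0), prey at (5, 3), one tree in the way:
me, prey, blocked = (0, 0), (5, 3), {(1, 1)}
for policy in (gen0, gen1, gen2):
    print(policy.__name__, policy(me, prey, blocked))
```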

Now, of course no brain evolved that way. But gen 2 is more realistic than you might think. For example, house flies, when attempting to fly away from predators, always launch at 45 degrees. If you know this it is easy to catch a fly with your hand; otherwise they are extremely elusive. Simple patterns are quite good. But we can speculate about a fly evolving more intelligent defensive flying if a predator ended up capitalizing on the 45-degree takeoff.

There is nothing in any of those things that involves parallel problem solving. Yes, of course there are neurons running in parallel, and independently, but that is not what I mean. What I mean is that there are not (I speculate) neurons or bundles of neurons solving the pursuit problem by each bundle projecting a possible pursuit path and calculating its difficulty, and then some super neuron bundle evaluating the solutions of each bundle and using the one with the highest fitness. Which is what the ants are doing: thousands of trails are independently followed, chemicals record which trails are successful or not, and they quickly optimize on the locally maximal path.

Now, do we have evidence that any lower- or higher-order mammal solves pursuit problems like an ant does?
 
I don't see how that is relevant. The neuron is not running a complex algorithm like an ant is.

It's relevant because the neuron -- in parallel with all the other neurons -- is running a complex animal.

We've got no neurobiological evidence for anything like the "algorithms" you describe. What we have neurobiological evidence for are simple parallel operations of neurons.

If you look at what's going on neurobiologically, the "behavior" appears to be (at least at some level) simply an emergent property of large-scale neural events. And the neural events themselves are random -- actually, they're random in a way that ant movements usually aren't, but the relationship between different neural events shifts the chemical balance enough to make certain types of cooperative parallel computing more likely.


What I mean is that there are not (I speculate) neurons or bundles of neurons solving the pursuit problem by each bundle projecting a possible pursuit path and calculating its difficulty, and then some super neuron bundle evaluating the solutions of each bundle and using the one with the highest fitness. Which is what the ants are doing: thousands of trails are independently followed, chemicals record which trails are successful or not, and they quickly optimize on the locally maximal path.

Except that that's not what ants do. Just as a quick example -- you correctly point out that there's no "super neuron bundle evaluating the solutions of each bundle." But there's similarly no super ant (or ant cluster) evaluating the solutions of each individual ant.

The standard metaphor for the human brain is one ant = one neuron. Just as ants solve problems non-symbolically and without any global control, based on a shared chemical environment that individual ants manipulate without large-scale planning or coordination, so do neurons. Just like neurons, ants do not "project" hypothetical paths and evaluate their usefulness. And, in fact, ants do not wander "randomly"; instead they follow previously laid-down trails, but with a stochastic error component. Success feedback rewards the trails with beneficial error, so the overall "trail" becomes shorter over time. There's no "evaluation" in the sense that you propose.
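
To see how little machinery that takes, here's a minimal sketch: three fixed candidate trails, arbitrary deposit and evaporation constants, nothing but reinforce-and-evaporate. Toy numbers, not a real model:

```python
import random

paths = {"long": 10.0, "medium": 6.0, "short": 4.0}  # trail -> length
pheromone = {p: 1.0 for p in paths}                  # start undifferentiated

for _ in range(200):
    # Each "ant" follows a trail with probability proportional to its
    # pheromone level -- stochastic, but biased by previous success.
    total = sum(pheromone.values())
    r = random.uniform(0, total)
    for path, level in pheromone.items():
        r -= level
        if r <= 0:
            break
    # Success feedback: shorter trails get reinforced more strongly.
    pheromone[path] += 1.0 / paths[path]
    # Evaporation: unreinforced trails fade, so nothing has to "evaluate".
    for p in pheromone:
        pheromone[p] *= 0.99

print(max(pheromone, key=pheromone.get))  # almost always "short"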

And we've actually got some rather good evidence that mammals do solve pursuit problems in more or less the same way. First, you've got to be rather specific about what you mean by "pursuit problems": if you're talking about a fox chasing a rabbit, that's a rather different task from that of the ants finding the honeypot. Honeypots don't move.

But taking a slightly different example of "pursuit" -- object tracking in the visual cortex -- we've got fairly good evidence that something ant-like does happen in the brain. In theory, one could move one's eyes in any direction at any time in response to any stimulus, but in practice, what happens is that one "learns" to move one's eyeballs left in response to objects seen on the left side of the fovea, and so forth. The mechanism is approximately the same -- stochastic changes that result in lowered "error" (which can be measured in terms of image stability on the retina) are rewarded and used as the base for further modifications.
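
The same reward-the-lucky-variation loop fits in a few lines. A minimal sketch, with a made-up one-parameter stand-in for the tracking task:

```python
import random

def tracking_error(gain, offsets=(-3, -1, 2, 5)):
    # How far each image lands from the fovea after an eye movement of
    # size gain * offset (a one-parameter stand-in for "tracking").
    return sum(abs(off - gain * off) for off in offsets)

gain = 0.0                                   # no learned response yet
err = tracking_error(gain)
for _ in range(1000):
    candidate = gain + random.gauss(0, 0.1)  # stochastic variation
    cand_err = tracking_error(candidate)
    if cand_err < err:                       # keep only error-lowering changes
        gain, err = candidate, cand_err

print(round(gain, 3))                        # settles near 1.0 (stable image)
```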

Basically, if you want to describe cognition in terms of symbolic, goal-oriented processing, you're welcome to do so. (A lot of psychologists do.) But you have no support whatsoever in neurobiology for doing so. What we have neuroanatomical evidence for really does look an awful lot like an anthill...
 
MY HEAD IS FULL OF ANTS!!!!!

:D

That is interesting. Do you think it's possible for a collective, sentient intelligence to develop? Something like an ant model, but where the whole is capable of self-awareness and communication with other sentient species? I'm delving into sci-fi territory here, but just wondering if there's any theoretical reason to rule it out or to expect it.
 
Dr. Kitten, I grant all your points. Thank you for a fascinating post. I still feel that the metaphor is stretched a bit too far, not so much by you as by others, since the ants have several different behaviors (follow trails, forage, fight, etc.), whereas, to my knowledge, the neurons do not. But I recognize I am merely an interested layperson in this field.
 
OK, for eye candy I knocked up a Randi.org ant pic lol.

I'd upload the program as an attachment, but the zip file's way too big at 1Meg, plus I'm guessing zips containing executables would be rejected anyway.

The complex system of highways is clearly evident (and also nice and curvy, which makes my simulation a little different, I believe).
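
For anyone curious, the core loop is basically this sort of thing -- a stripped-down sketch of the idea, not the actual program:

```python
import random

W = H = 40
pher = [[0.0] * W for _ in range(H)]  # pheromone field
ants = [(H // 2, W // 2)] * 50        # everyone starts at the "nest"

def neighbors(y, x):
    # Four-connected moves, wrapping at the edges to keep it simple.
    return [((y + dy) % H, (x + dx) % W)
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))]

for _ in range(500):
    for i, (y, x) in enumerate(ants):
        opts = neighbors(y, x)
        if random.random() < 0.9:
            # Mostly follow the strongest local trail...
            ny, nx = max(opts, key=lambda c: pher[c[0]][c[1]])
        else:
            # ...but keep a stochastic error component.
            ny, nx = random.choice(opts)
        pher[ny][nx] += 1.0           # lay pheromone while walking
        ants[i] = (ny, nx)
    for row in pher:                  # evaporation keeps trails crisp
        for x in range(W):
            row[x] *= 0.995

# Crude ASCII render: '#' marks the heavily travelled "highways".
for row in pher:
    print("".join("#" if v > 5.0 else "." for v in row))
```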
 

[Attachment: RandiAnts.JPG, 71.4 KB]
MY HEAD IS FULL OF ANTS!!!!!

:D

That is interesting. Do you think it's possible for a collective, sentient intelligence to develop?
Hey, we already did ;)

...Something like an ant model, but where the whole is capable of self-awareness and communication with other sentient species? I'm delving into sci-fi territory here, but just wondering if there's any theoretical reason to rule it out or to expect it.
Facetiousness aside, I can't really see any real reason why it's impossible for high level hive intelligence to develop i.e. parts of the system distributed in different physical bodies.
 
I tried to get a thread started on this topic a while back, but it never took. I had seen something on TV about how a mass of army ants, who do not have permanent residences, suddenly "decide" it's time to move on. It's not known whether this is a cumulative, parallel decision, or if a single ant (say, the queen) signals when it's time to go.

I think the collective intelligence of an ant colony is an excellent model for how the human mind can be conceptualized as an emergent property of brain function, rather than some ethereal thing that transcends time and space.

Does an ant colony have a "soul"?
 
Facetiousness aside, I can't really see any real reason why it's impossible for high level hive intelligence to develop i.e. parts of the system distributed in different physical bodies.

Yeah, I don't see a reason either, I was just wondering if there was any more scientific backing for that feeling...any sort of theory or experiment or anything that might lend more weight to the idea that it's possible or impossible.

I'm already considering a sci-fi type race that communicates via chemical or visual (bioluminescent) signals and has developed into a collective intelligence. Wonder what a brain the size of a planet could do? Besides be depressed, that is ;)
 
I'm already considering a sci-fi type race that communicates via chemical or visual (bioluminescent) signals and has developed into a collective intelligence. Wonder what a brain the size of a planet could do? Besides be depressed, that is ;)
Ever play Sid Meier's Alpha Centauri?
 

In their "crops" (stomach).

Has anyone here ever played Sim Ant? It's an excellent game and the manual contains an amazing amount of info on ants and their behavior. A really fun and informative game.
_______

"The game is essentially a simulation of an ant colony. The game consists of three modes: a Quick Game, a Full Game, and an Experimental Game. It was released for the IBM PC, Commodore Amiga, Apple Macintosh, and Super Nintendo Entertainment System. The later version also added eight scenarios, where the goal in each is to eliminate the enemy red ants in various locales, each with different hazards. However, this version of the game lacks the Experimental Game.

In the game, the player plays the role of an ant in a colony of black ants in the garden of a suburban home. The player can change which ant they are (known as the "yellow ant", as it's colored at any time), and the ant controlled by them can order the other ants (by ordering a certain number to follow it, for instance). The player must battle against the red ants. The ultimate aim of the game is to spread throughout the garden, into the house, and finally to drive the human beings out of the house. In this respect, it differed from other 'Sim' games, which largely had no "win" or "lose" situation."

http://en.wikipedia.org/wiki/SimAnt
__________

http://www.amazon.com/exec/obidos/tg/detail/-/B00002EPYA/qid=1142094006/sr=8-2/ref=sr_8__i2_xgl63/10
 

I'm already considering a sci-fi type race that communicates via chemical or visual (bioluminescent) signals and has developed into a collective intelligence. Wonder what a brain the size of a planet could do? Besides be depressed, that is ;)

This idea was explored very well in The Invincible by Stanislaw Lem.
 
After reading that link (further up) about the ant police, and seeing that these colonies are more like "police states", it kinda makes the whole "emerging intelligence" hard to imagine.

If all the individuals are essentially pressed into slavery, and murder is used by queens and fellow (selfish) workers alike, then it paints a picture of a barely contained system, rather than an AI "about to burst into sentience" system.

Unless mind is also a police state... :boggled:
 
