
Human Echolocation

I wonder what fundies would say about this. According to them, echolocation in bats was supposed to be "intelligently designed" and impossible to arise through evolution.

But if humans can master some form of echolocation, then why not other animals?

It seems the beginnings of echolocation are not that difficult to develop through an evolutionary process after all.
 
A couple of nits here.

Echolocation refers to locating something via the echoes of a sound you have emitted.

Using the audible cues in a video game to locate something is not echolocation; it's just the ordinary human ability to localize sound sources from auditory cues.
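
For a sense of the timescales involved in actual echolocation, here's a back-of-the-envelope sketch (my own illustrative round numbers, nothing more): the emitted click has to travel out to the object and back, so the echo arrives after 2d/c.

# Rough echo-delay arithmetic; the speed of sound and distances are just
# illustrative round numbers, not measurements.
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def echo_delay_ms(distance_m):
    """Round-trip delay, in milliseconds, for an echo off an object distance_m away."""
    return 2.0 * distance_m / SPEED_OF_SOUND * 1000.0

for d in (0.5, 1.0, 2.0, 5.0):
    print(f"object at {d} m -> echo after roughly {echo_delay_ms(d):.1f} ms")
# prints roughly 2.9, 5.8, 11.7 and 29.2 ms respectively

So in a room, the echoes arrive within a few milliseconds to a few tens of milliseconds of the click, which is a very different cue from the sub-millisecond interaural differences used for ordinary sound localization.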

www.aes.org/sections/pnw/ppt/other/2005aud.ppt has some information on the process at a basic level (i.e., what fits in a one-hour talk). N.B.: the deck at that URL, despite any other notices, has been released to the public domain (yes, I'm the author, and it was created while I had no employer).
 
It seems the beginnings of echolocation are not that difficult to develop through an evolutionary process after all.

While I'm very sympathetic to dealing with ID'ers, creationists, and other science-haters, this isn't going to do it.

Humans can localize sounds fairly well. Not as well as a dog or cat with erect, movable ears, but fairly well. For instance, in the right circumstances you can detect a 10 microsecond interaural delay (i.e., the delay between the right and left ears). Some reports put this as low as 5 microseconds; I'm not sure I accept those completely, but they are not excludable based on what science knows about the auditory apparatus.

In other words, to some extent we already can, and with practice most people can do better than folks normally realize.
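
To put that 10 microsecond figure in perspective, here's a quick back-of-the-envelope conversion (simplified far-field plane-wave geometry, and the 0.18 m ear spacing is just an assumed round number, so treat this as a sketch rather than real psychoacoustics):

import math

SPEED_OF_SOUND = 343.0   # m/s
EAR_SPACING = 0.18       # assumed straight-line ear-to-ear distance, m (an assumption)

def itd_us(azimuth_deg):
    """Interaural time delay in microseconds for a distant source at the given azimuth
    (0 degrees = straight ahead), using the simple plane-wave model (d/c)*sin(azimuth)."""
    return EAR_SPACING / SPEED_OF_SOUND * math.sin(math.radians(azimuth_deg)) * 1e6

threshold_s = 10e-6                                   # the 10 microsecond figure above
path_difference_mm = SPEED_OF_SOUND * threshold_s * 1000
angle_deg = math.degrees(math.asin(SPEED_OF_SOUND * threshold_s / EAR_SPACING))

print(f"10 us is a path difference of about {path_difference_mm:.1f} mm between the ears")
print(f"which is an azimuth change of roughly {angle_deg:.1f} degrees near straight ahead")
print(f"for comparison, a source 10 degrees off-axis gives an ITD of about {itd_us(10):.0f} us")
# roughly 3.4 mm, 1.1 degrees, and 91 us with these assumptions

In other words, a 10 microsecond threshold corresponds to resolving sound-path differences of a few millimetres, or source directions a degree or so apart near straight ahead.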
 
What scientific evidence is there for this increase in sensitivity? I don't remember seeing anything to that effect when I was studying psychophysics some years ago, although things might have changed.

I'm curious about this too. I did a little research and found an article that suggests there is little increase in sensitivity.

http://www.spring.org.uk/2008/02/blind-peoples-other-senses-not-more.php

Maybe blind people are just more attuned to their sense of hearing, as people have suggested. It wouldn't surprise me. In engineering you can do some amazing things with signal processing. The things that the brain can do probably make my DSP textbook look like child's play.
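
Just to illustrate the flavour of signal processing I mean (a toy sketch of my own, an engineering analogue of interaural timing cues rather than anything the textbook or the ear actually does): estimating the delay between two microphone channels by cross-correlation is the classic way a machine would "localize" a sound.

import numpy as np

# Toy time-delay estimation by cross-correlation. The signals, sample rate and
# delay are made up for illustration; this is not a model of the auditory system.
rng = np.random.default_rng(0)
fs = 48000                      # sample rate, Hz
true_delay = 20                 # delay of the right channel, in samples (~0.42 ms)

source = rng.standard_normal(4096)                       # broadband "source" signal
left = source + 0.05 * rng.standard_normal(4096)         # left mic: source plus noise
right = np.roll(source, true_delay) + 0.05 * rng.standard_normal(4096)  # delayed copy

# Full cross-correlation; the peak's offset from the centre index is the delay estimate.
xcorr = np.correlate(right, left, mode="full")
estimated_delay = int(np.argmax(xcorr)) - (len(left) - 1)

print(f"true delay: {true_delay} samples, estimated: {estimated_delay} samples")
print(f"that is about {estimated_delay / fs * 1e3:.2f} ms")

The brain, of course, does something far subtler than picking a single cross-correlation peak, which is rather my point.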
 
Sight is the primary sense used by people. It's a visual world. Remove vision, and you'll just have to adapt to interpreting whatever cues your remaining senses supply.

Call it anecdotal if you will. I've never made a formal study of it, but this is based on my experience, repeated, say, eight or ten times. I'm an old-school photographer; I started developing my own film and prints at the age of eight. Darkrooms are second nature to me, and I've frequently introduced newbies to darkroom operations. Their single most frequent complaint is that they still try to see what they are doing. My advice to them was simple: close your eyes. This simple act seems to disconnect the mental switch that makes you try to function visually. Eyes closed (even in the dark), you quit looking around and start using your sense of touch and positional memory for where you left things. I've "found" things in the darkroom (like scissors or a roll of masking tape) by hearing the sound of another person setting the object down on the counter.

Don't have that problem any more, now that I've gone 100% digital.

Beanbag
 
Maybe blind people are just more attuned to their sense of hearing, as people have suggested. It wouldn't surprise me. In engineering you can do some amazing things with signal processing. The things that the brain can do probably make my DSP textbook look like child's play.

For a brief description of the signal processing that the cochlea and auditory periphery do, check out:

www.aes.org/sections/pnw/ppt/other/limitsofhearing.ppt
www.aes.org/sections/pnw/ppt/other/aes2004.ppt
www.aes.org/sections/pnw/ppt/loudness/loudtut.ppt

For that matter, scanning through any of the decks at www.aes.org/sections/pnw/ppt.htm may be interesting (or maybe not).

It is clear that blind people learn to use hearing much more effectively. But as someone has said, you can learn as much by shutting your eyes, feeling, and listening.
 
Everyone seems to be dancing around the point here. Senses per se vary greatly within a population: some people have good eyes, some don't. Some people can hear well up into the teens and twenties of kilohertz, some cannot. Some people can taste "savory", some cannot. Smell is particularly variable; there is a woman who works for one of the coffee makers who is phenomenally able to discern between thousands of quirks in coffees.

But the key here isn't really the senses, it's the processing that happens with their inputs in the brain. People are trainable, to a large extent, to overcome deficiencies and use the capabilities their senses do have. With practice, people integrate the feel of a car sensed by the hands on the wheel with the sounds and sights of driving; most don't ever really notice, until they have to change cars.

Cats (just Siamese cats? I can't remember; I read the article long ago in Scientific American) have a mis-wiring in their optic nerves which causes one segment of their vision to be transposed with another, but experimentation shows that the brain compensates for it. The brain is very good at filling in gaps for things that aren't really there; most people never notice that there is a blind spot in the vision of each eye.

Those with missing senses have to learn to make do with what they do have, and the brain is marvelous at doing that. They also have 24/7/365 practice, with no days off, and they aren't distracted by the missing sense. Everything goes toward making what we consider remarkable use of the remaining senses. It serves to show us what most of us could probably do as well, if we had the right incentive.
 
Everyone seems to be dancing around the point here. Senses per se vary greatly within a population: some people have good eyes, some don't. Some people can hear well up into the teens and twenties of kilohertz, some cannot. Some people can taste "savory", some cannot. Smell is particularly variable;

Some senses vary greatly, some don't. Except for refractive correction, eyesight is pretty standard (barring things like colorblindness and pathologies). Hearing is even more standard, with the only big differences being due to noise exposure (or, again, pathology) and age.

But as the person above pointed out, most people can learn to use the cues presented to us by the ears (and eyes) much better than we normally do.

Smell and taste are enormously variable, and in fact the kinds of tastes, and especially smells, that people can perceive are not shared anything like evenly across the population.

I'm not as well informed about touch; in a lot of ways it's less researched and less understood.
 
I once went up a 2000-foot hill with my eyes shut. It's astonishing how fast you start paying detailed attention to sounds.
Deaf people, on the other hand, are often extremely noisy, because noise doesn't distract them at all.
If this lad lost his eyes so very young, it's also probable that some of the visual processing areas of his brain switched over to processing auditory data, so he is probably far better at it than anyone blinded in adulthood could ever be, because he has a great deal more online sound-processing hardware in action. That won't do a thing for the sensitivity of his ears, but he will be able to extract far more information from the sounds he can hear.
I wouldn't be surprised if this lad could beat that 5msec discrimination jj mentions.
 
I wouldn't be surprised if this lad could beat that 5msec discrimination jj mentions.

Just so it's clear, that's "MICROseconds". The maximum interaural time delay amounts to about 0.9 milliseconds, so yeah, we can all hear a difference that size as long as we have two functioning ears. :)

I don't know that he could beat that, by the way, but what he almost certainly can do is extract a lot more information from what he can hear.
 
Just so it's clear, that's "MICROseconds". The maximum interaural time delay amounts to about 0.9 milliseconds, so yeah, we can all hear a difference that size as long as we have two functioning ears. :)

Yup. I missed that small difference. :o



I don't know that he could beat that, by the way, but what he almost certainly can do is extract a lot more information from what he can hear.
That's what it looks like.
 
