The brain is a computational device. There may be some cognitive scientists out there who disagree with that statement, but I don't think there are many. There is much less agreement on what type of computational device it is.
One possibility is that the brain is a symbol-processing device very much like a computer. A computer can add essentially any two numbers using the same circuitry. It does not have one microchip for adding 1 + 1 and a different one for adding 2 + 2. It has a single algorithm that can be applied to any arbitrary number (assuming the computer can represent that number -- obviously there are numbers too large for any modern machine to handle).
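To make that concrete, here is a minimal sketch (my own illustration, not anything from the literature) of ripple-carry addition in Python: one general procedure that handles any pair of non-negative numbers, rather than a separate rule for each particular sum, much the way a single adder circuit does in hardware.

```python
def ripple_carry_add(a_bits, b_bits):
    """Add two non-negative integers given as lists of bits (least significant first)."""
    result, carry = [], 0
    for i in range(max(len(a_bits), len(b_bits))):
        a = a_bits[i] if i < len(a_bits) else 0
        b = b_bits[i] if i < len(b_bits) else 0
        total = a + b + carry
        result.append(total % 2)
        carry = total // 2
    if carry:
        result.append(carry)
    return result

# The same procedure works for 1 + 1, 2 + 2, or any other pair,
# up to whatever the representation can hold.
print(ripple_carry_add([1], [1]))        # 1 + 1 -> [0, 1]    (binary 10  = 2)
print(ripple_carry_add([0, 1], [0, 1]))  # 2 + 2 -> [0, 0, 1] (binary 100 = 4)
```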
One of the big mysteries of the brain is that it is unclear how to build a symbol-processing/algebraic device out of neurons. This has led some schools of thought, such as the Connectionists, to deny that the brain does symbol-processing or that it works anything like a digital computer (see Marcus's The Algebraic Mind for some pushback). On the flip side, folks like Randy Gallistel have argued that if we don't know how to implement read/write memory in neurons (a related question), then there is a gaping hole in our knowledge about neurons.
This all comes to mind in relation to some work done in the last decade on barn owls. Barn owls locate their prey via both sight and sound, and neuroscientists have located the area of the brain where these two signals are combined. If you put prism goggles on a barn owl so that its vision is offset (e.g., everything looks like it's 10 degrees to the left of where it actually is), the two signals are out of register at first, but eventually the neural map that represents location according to the ears shifts so that it is back in sync with the visual map.
To a computer programmer, the obvious fix would be to just add 10 degrees to the auditory signals across the entire map. However, that's not what the brain does. This can be shown by putting barn owls into goggles that shift only part of the field of vision: only the auditory signals for that region of space shift. That strikes me as very non-algebraic in nature (not that a computer programmer couldn't achieve this effect, but why would she write that ability into the code? Keep in mind that barn owls didn't evolve to wear prism goggles).
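Here is a toy sketch (again my own illustration, with made-up numbers, not a model from the owl work) contrasting the two strategies: the programmer's algebraic fix of adding one offset across the whole auditory map, versus the region-specific recalibration the prism experiments reveal.

```python
# Auditory map: nominal azimuths (degrees) at a few sample locations.
auditory_map = {-20: -20, -10: -10, 0: 0, 10: 10, 20: 20}

def global_shift(aud_map, offset):
    """The algebraic fix: add the same offset everywhere."""
    return {loc: az + offset for loc, az in aud_map.items()}

def local_recalibration(aud_map, offset, shifted_region):
    """What the prism experiments suggest: only locations whose visual
    input was shifted get remapped; the rest of the map is left alone."""
    return {loc: az + offset if loc in shifted_region else az
            for loc, az in aud_map.items()}

# Prisms shift only the region around straight ahead by 10 degrees.
print(global_shift(auditory_map, 10))
print(local_recalibration(auditory_map, 10, shifted_region={-10, 0, 10}))
```

Nothing here is hard to program, but the region-specific version is the kind of special case you would only bother to write if you expected piecemeal distortions in the first place.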
That said, there's no reason that all of the brain must compute algebraically; perceptual systems may be unusual in this respect. Still, since very little is known about how the brain computes anything, this example is very interesting.
For those interested in the barn owl details, check out:
Knudsen, E.I. (2002). Instructed learning in the auditory localization pathway of the barn owl. Nature, 417, 322-328.
1 comment:
> why would she write that ability into the code. Keep in mind that barn owls didn't evolve to wear prism goggles
The shape of an animal's head tends to change (sometimes quite spectacularly) during its life. Besides, we need to calibrate our mental maps anyway when we first start to see, so the same mechanism can be reused.