Introduction to Psychological Science: Integrating Behavioral, Neuroscience and Evolutionary Perspectives - William J. Ray 2021
Sensation and Perception
✵ 5.1 Describe how psychophysics relates to our understanding of how we experience the world.
✵ 5.2 Describe key processes in the visual system.
✵ 5.3 Explain what happens in the auditory system that allows us to hear.
✵ 5.4 Summarize the role that chemical processes play in smell and taste.
✵ 5.5 Describe the different senses of touch and how we experience pain.
James Wannerton lives in London and since he was a child, he has experienced sounds as tastes. Every sound he hears, he also involuntarily experiences a taste. He reports that every time he stops or passes through a London Tube (subway) station, he experiences a taste and a feeling in his mouth. Since he has ridden the London Underground often over the past 50 years, he has created a map of each station and the taste he experiences in response to the different sounds as he goes through that station. Although extra sounds at a particular station could be different on a given day, the tracks and shape of each underground station make for the same sound at each unique station (Cook, 2014).
Another person, John, had no problem talking about an object. He could recall the object from memory and describe it. He could even draw it from memory. However, when he looked at the object, he could no longer access this knowledge. He clearly could describe what he was seeing in terms of size and shape and appearance. He could say the object was long and thin with a shiny surface. The connection between what he was seeing and what he knew about the object disappeared when he looked at the object. He also found it difficult to recognize friends and family from their faces. Further, although John found reading a passage to be difficult and slow, he found it easy to write (Humphreys & Riddoch, 2014).
Another individual whom we shall call Laura reported that when she watched an object in motion, she felt sick. Laura said that she no longer saw movement in objects. Rather, the objects appeared as “restless” or “jumping around.” She could see objects at different distances and at different locations. However, she could not see them move from one location to another. According to Laura, the objects appeared to jump from one location to another, with nothing in between. This caused her real problems when walking in the city. Traffic had become very frightening. Although she could identify cars, she could not tell if they were moving or not (Heywood & Zihl, 1999).
The experiences described can be understood by knowing about our brain and how it creates our experiences. The story of James Wannerton reflects a condition called synesthesia. Synesthesia is the condition in which the experience of one sense automatically produces experience in another. For Wannerton, sound produced tastes and feelings in his mouth. For others, sounds produce colors. Some people see a color when they look at numbers or words. Synesthesia is present in about 1% of the population. Many people with synesthesia would not want to change the way they experience the world.
Figure 5-1 Part of the London subway map as experienced by James Wannerton.
Source: Cook (2014); Spiers and Maguire (2006).
The next two stories of John and Laura are not as pleasant. These cases resulted from damage to the brain through a stroke or an automobile accident. In reading these cases, you can see that our vision of people and things is not a single process but a number of different processes that are put together by our brains. We also understand movement and color of objects and people as part of the world outside of us, although it requires a brain to make it happen.
You could also tell your own story as you ride a roller coaster at your favorite theme park. You receive lots of information that creates sensations. You hear sounds, see the world moving quickly, and have difficulty determining where you are in space. The roller coaster and other rides are designed to give you experiences that are not part of your everyday life. However, even in your everyday life, your brain is busy creating experiences that are influenced by what is going on in the external world.
This chapter is about understanding how the external world influences what is going on in your brain, giving you basic sensory experiences. The brain is actively performing tasks of its own even as external stimulation changes. Thus, your sensory experiences are a combination of what your brain is doing and what is impacting your senses from the external world. Using this information, your brain creates your reality.
The nature of our nervous system actually determines whether we sense something or nothing at all. You can stand in a big city and experience a variety of sights and sounds. However, you don’t experience the radio waves or cell phone signals that are all around you. Although similar in nature to the light that we do see, radio waves and cell phone signals are electromagnetic energy at different frequencies. As humans, we do not have receptors that are sensitive to these frequencies.
In terms of sensory systems across species, we can see how evolution favored one sensory system over the others. In humans, this is seen in the sensory organization in the brain in that the visual connections from the eye to the brain contain more than one million fibers, whereas the auditory system contains only about 30,000 fibers. Thus, it is not surprising that you primarily use vision rather than smell or sound to move around your world. Since seeing is such an important sense for humans, this chapter will focus on vision.
In this chapter, you will learn the basic mechanisms of sensation and perception. It is our nervous system responding to certain types of stimulation that gives us an experience that we interpret as reality. Thus, our experience of reality is based on how our nervous system is constructed. However, as you know from going to the grocery store when you are hungry, our sensory processes can be changed by what is going on in our body. In this chapter, you will see how ongoing activities in the brain are modified by external stimulation to give us experiences.
The manner in which our brain and nervous system take the energy that exists around us and turn it into an experience is a critical aspect of sensation and perception. Sensation refers to the manner in which our receptor systems transform energy into activity that can be interpreted by the brain. Perception is the manner in which the brain makes sense of this activity. Said in other terms, sensation and perception is the study of the way our brain creates a world that we experience as external to ourselves or, in cases such as pain, as internal.
Let’s begin with simple sensations. Since the time of Aristotle (384—322 BCE) more than 2,300 years ago, five sensory systems have been described. These are vision, hearing, touch, taste, and smell. More recently, it has been suggested that pain and temperature should also be included. Each of the sensory systems uses different biological processes referred to as transducers to initiate the sensory process. The transducer in the auditory system turns sound waves into mechanical and then electrical impulses. The transducers in our eye turn light energy into electrical impulses.
There are also connections between the brain areas involved in different sensory processes that give us an integration of sensory processes. Putting these together gives us the feeling of a single event. These specific connections are referred to as pathways and networks. These pathways and networks allow us to see, hear, and feel the world as if it is a coherent whole. However, our brain begins with information from each of the sensory systems alone. Although there is a relationship between the physical characteristics of the stimulus and the way our sensory system responds, there is not a perfect relationship. This was studied in the 1800s with the establishment of the field of psychophysics.
The research of Gustav Fechner (1801—1887) in the 1800s helped to establish the field of psychophysics. Psychophysics is the study of the relationship between the physical characteristics of a stimulus and the manner in which we experience it. Fechner noted that although there is a relationship between the increase in the physical intensity and the subjective experience, it is not a linear relationship. For example, we do not experience a 200-watt light bulb as being twice as bright as a 100-watt bulb. What this suggests is that experiences in the mental world are not the same as changes in the physical world.
Sensory Measurement Thresholds
Part of Fechner’s work was based on studies by Ernst Heinrich Weber (1795—1878). Weber had studied how much physical difference is needed for you to detect a change. He had individuals place a weight in their hand. Then they lifted a second weight and noted if it was different from the first. Weber wanted to know how much heavier the second weight needed to be before people noticed a difference. You can ask a similar question about any of our sensory processes: How much does light need to change in frequency before you notice a different color? How much louder does a sound need to be before you experience it as louder? This difference is referred to as the just-noticeable difference or JND.
Fechner also noticed that the JND changes as the original weights change. That is, if you had a 100-gram weight (about one-fifth of a pound) and added 2 grams, you could tell a difference. However, if you had a 200-gram weight and added 2 grams, you would not notice the difference, but you would notice a 4-gram change. If you did this over a number of different weights you would notice that there is a constant proportion of the original weight that is required to notice a change. Fechner referred to this relationship as Weber’s law. The constant proportion described in Weber’s law varies by modality. It is 0.02 in the weight example just described. However, it is larger when determining light intensity or the saltiness of one taste versus another.
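The proportional relationship in Weber’s law can be sketched in a few lines of Python. The Weber fraction of 0.02 for lifted weights comes from the example above; the function name is our own:

```python
def jnd(intensity, weber_fraction=0.02):
    """Just-noticeable difference under Weber's law:
    the smallest detectable change is a constant proportion
    of the baseline intensity."""
    return weber_fraction * intensity

# A 100-gram weight needs about a 2-gram change to be noticed;
# a 200-gram weight needs about 4 grams.
print(jnd(100))  # 2.0
print(jnd(200))  # 4.0
```

Swapping in a larger Weber fraction, as for light intensity or saltiness, scales up the change needed at every baseline.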
Weber also studied our ability to detect differences in tactile stimulation. For example, you could ask how far apart two pin pricks would need to be on your skin before you could experience them as two pins rather than one. You may notice that this distance is farther on your back than on your fingers, for example. A measure related to JND is called the difference threshold. This is the actual physical measure between the original stimulus and the one first noticed as different.
When there is just one stimulus present, the concept is the absolute threshold. The basic question is what amount of stimulation is necessary for you to detect that it is present. In this chapter you will learn that on a dark night you can detect a candle at 20 miles. In order to determine the absolute threshold, you would simply have someone move the candle away from you until it could no longer be detected. The basics for determining thresholds were described in detail in Fechner’s 1860 book, Elements of Psychophysics.
1. Define the terms below including their role in the human sensory system.
2. How does psychophysics relate to our understanding of how we experience the world?
3. What is just-noticeable difference (JND), and how are difference threshold and absolute threshold related to it?
Our ability to see and use visual information has evolved to aid us on a variety of levels. On a survival level, vision helps us survive and avoid danger. On a sexual level, we see others as attractive and desirable and potential sexual partners. On a social level, we have evolved mechanisms related to those people we want to be with and those we want to avoid. Visual processes play an important role in additional cultural processes such as reading and art. We see some images as beautiful while we see others as awful. At least half of our brain is related directly or indirectly to the processing of visual information including areas involved in the recognition of faces.
If you ask most of us what we see when we look at the world, we might respond that we see a blue sky, or green leaves on a tree, or the texture of a building. We can generally go into great detail about what is there. However, if you were to ask physicists what is there, they would tell you that what exists is energy. It is this energy that our nervous system turns into colors, shapes, movement, and a variety of other sensations.
Although we rarely think about it, our visual system is really amazing. If you were in a desert at night with no external light, you could detect candlelight from more than 20 miles away. You can also see stars that are trillions of miles away. The visual system is also able to produce a visual image whether it is a dark night or a bright day. Let us now begin with the structure of the visual system and how this relates to its function.
The Visual System Is Sensitive to Electromagnetic Energy
Let’s begin with light. Light is electromagnetic energy. Physicists have considered light both as made up of waves and as a stream of particles or photons. Some receptors in our eye require a large number of photons to create a response while other receptors require fewer photons. Researchers interested in vision tend to see light as photons when discussing the firing of individual receptors and as waves when describing the characteristics of light. Both definitions of light are correct.
The light we see is only a small part of the electromagnetic spectrum. Different parts of the overall spectrum are associated with gamma rays, X-rays, ultraviolet rays, infrared rays, radar, microwaves, cell phone activity, and various forms of radio and television broadcasts over the air. The part of the spectrum that we can see is found between ultraviolet rays and infrared rays.
In terms of the visual spectrum, we refer to this light in terms of waves. As such, these waves can be described in terms of two factors. Amplitude refers to the height of the wave. Frequency refers to how many wave cycles occur in a given time. Frequency is also described in terms of wavelength, which is the distance between two consecutive waves (see Figure 5-2).
Figure 5-2 Showing wavelengths and frequencies of colors. Wavelength is the length of a single cycle. Frequency is the number of cycles in a given time—usually a second.
Since graphs of the electromagnetic spectrum may be labeled in terms of either frequency or wavelength, note that a higher frequency corresponds to a shorter wavelength and a lower frequency corresponds to a longer wavelength.
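The inverse relation between frequency and wavelength is simply f = c / λ, where c is the speed of light. A short sketch (the wavelength values are the approximate blue and red wavelengths discussed later in the chapter):

```python
SPEED_OF_LIGHT = 3.0e8  # meters per second, approximate

def frequency_hz(wavelength_nm):
    """Frequency of light from its wavelength: f = c / lambda."""
    return SPEED_OF_LIGHT / (wavelength_nm * 1e-9)

# Shorter wavelengths correspond to higher frequencies:
blue = frequency_hz(430)  # ~7.0e14 Hz
red = frequency_hz(700)   # ~4.3e14 Hz
```

Running this confirms that blue light, with its shorter wavelength, has roughly 1.6 times the frequency of red light.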
When our eye transforms the visual spectrum, we see the lower frequencies (longer wavelengths) as red and orange and the higher frequencies (shorter wavelengths) as blue and violet. The greens and yellows are in between. Why might we humans be sensitive to this particular part of the electromagnetic spectrum?
Evolution offers us one answer to this question. Little more than 500 million years ago, there was an increase in diversity of life forms on Earth. Organisms existing at that time lived in water. One characteristic of water is that it passes some frequencies in the electromagnetic spectrum better than others. As the graph in Figure 5-3 demonstrates, those frequencies that are least affected when passed through water correspond to those frequencies of light to which our visual system is most sensitive. Thus, although the visual system continued to evolve as organisms lived outside of water, the physical frequencies processed by our visual system harken back to adaptations manifested more than 500 million years ago. By the way, you may have also noted in the graph that low frequencies (approximately 100 Hz) were also transmitted without reduction in water. These are frequencies used by electric fish to probe and sense their environment.
Figure 5-3 Attenuation of wavelength in water. This graph shows that the frequencies that are not reduced as they pass through water are the same ones as the colors we see. The higher the line on the y-axis, the more that frequency is attenuated and the less of it could be experienced.
Source: Allman (2001, p. 67).
It also happened that during this period of increasing diversity more than 500 million years ago, new genetic variations developed that produced different types of photoreceptors. A photoreceptor is simply a cell that responds when light hits it. One type of photoreceptor was sensitive to low light. These light-sensitive cells, which later evolved into rods, allow us to see in dim light.
Another type of photoreceptor was produced that required greater illumination (more photons). Over time, the receptors of this system differentially became sensitive to different frequencies in the visual spectrum. These were the forerunners of the cones in our visual system that allow us to experience colors. The genes for cones differ in different species resulting in different sensitivity to different parts of the visual spectrum. In humans, the genes responsible for red and green sensitivity are located close to each other on a single chromosome, whereas the genes for blue sensitivity are located elsewhere. This suggests that blue sensitivity evolved at a different time than red and green sensitivity.
The visual system also evolved to solve a large variety of tasks that differ in different species (Baden, Euler, & Berens, 2020). As humans, we rarely think about how stable the world stays as we move our heads and go through the space around us. If we moved a video camera in the same way, we would see more of a blur. To maintain our visual world in a stable manner, our vestibular system senses movement and sends this information to the hindbrain, which in turn sends signals to the eye muscles. You will be introduced to the role of the vestibular system later in the chapter.
The information from the vestibular system allows our eyes to go in the opposite direction from our movement in order to compensate for movement and leave the image on our retina stable. Pretty clever engineering. Further, corrections take place through the cerebellum, which compares changes in eye velocity and head velocity. The front-facing location of our eyes allows us to determine depth, which would have been helpful in human history for both eye-hand coordination and catching animals. Like us, other animals such as owls and cats with flat faces can experience depth using both eyes. Let us now turn to how the eye functions and uses the brain to create the visual world that we experience.
A Tour of the Eye
In this section you will be taken on a tour of the structure and function of the eye. On this tour, you can note what each aspect contributes to our construction of our visual world. Let us begin with the human eye, which is shown in Figure 5-4.
Figure 5-4 Structure of the human eye: note the fovea is located at the center of the macula, which produces the greatest visual acuity.
Light reaches the cornea where it is bent and the lens where it is focused. Both are located at the front of the eye. The object we see is focused on the retina at the back of the eye. If your eye does not focus the image on the retina exactly, then you are either nearsighted or farsighted. If the image is focused in front of the retina, you are nearsighted (referred to as myopia). If it focuses behind the retina, you are farsighted (referred to as hyperopia). Your glasses or contact lenses cause the image to fall on the retina. Also, as most people age, their eyes lose the ability of the lens to change shape as easily as they did before middle age. This results in difficulty focusing on things close to the eye. This condition is referred to as presbyopia.
The physics of this is that when an object is closer than about 30 feet from our eyes, the light reflected from it is no longer parallel. This requires greater refraction, or bending, of the light waves to produce a sharp image on the retina. This is referred to as accommodation. The lens of your eye is actually elastic—when we view objects at a distance, it becomes flatter, and when we view objects close up, it bulges forward. With aging, the lens becomes less elastic and less refraction is possible with close-up objects. Many people by age 50 find it difficult to see clearly close up or read small print without glasses. Quite a change from the 1-year-old infant who can see objects located just beyond his or her nose.
The pupil is surrounded by muscles contained in the iris, which expand or contract the pupil to allow in more or less light according to the intensity of the light energy. If you want to see this for yourself, stand in front of a mirror with the lights off. Then, turn them on and watch your pupils change in size. By the way, as light is reflected off the iris, its pigmentation determines the color that you see in someone’s eyes. About half of humans worldwide have brown eyes. If you have blue eyes, you probably have an ancestor who lived near the Baltic Sea.
Within the eye there is a clear fluid referred to as the vitreous. Light passes through this fluid and reaches the back of the eye that contains the receptors that change the electromagnetic energy of light into electrical energy. It is this electrical activity that carries visual information to the brain. You may note that, from an optics standpoint, the actual image that falls on the retina is turned upside down. This of course is not critical since it is the neuronal information that determines what we see. Although extremely complex, the retina is only about 0.2 mm thick.
Figure 5-5 shows you the eye with the three main layers of the retina.
Figure 5-5 Human eye with the three main layers of the retina.
Rods and Cones
If you look at the figure of the retina, you may at first think it is backward. However, it is not. It is more like a medieval castle in which modern wiring was placed outward from the wall rather than behind it. The furthest layer of the retina is composed of some 100 million photoreceptor cells that are sensitive to light. There are some 95 million rods and some 5 million cones. Rods and cones are named for their shape (see Figure 5-6).
Figure 5-6 Outer segments of rods and cones. Both rods and cones respond to specific frequencies of light.
The easiest way to think about rods and cones is to consider rods as related to nighttime vision and cones to daytime vision. Rods allow you to see in dim light. They are much more sensitive than cones to the few photons of light energy available on dark nights. When light strikes one of the photoreceptors, the photons of light are absorbed, which in turn modifies the flow of current in the neuron.
There are three types of cones. Historically, these have been referred to as “green,” “red,” and “blue” cones. This is technically not correct since the experience of color is determined by comparing the activity of each type of cone with that of the other types. In each type of cone, the chemical sensitive to light is sensitive to a different frequency. This gives us the ability to experience color. Cones also allow you to see with high spatial acuity, which gives you the ability to see detail.
Rods and cones also differ in their location. There are many more rods in the peripheral parts of the retina and more cones located in a more central region of the retina called the fovea, although there tend to be fewer of the blue cones at the fovea. One implication of having rods in the periphery is that your peripheral vision is very good on dark nights.
Whether we look at something directly or in the periphery can make a difference. Look at the Leonardo da Vinci portrait of the Mona Lisa. Margaret Livingstone (2000) pointed out that if you look at her forehead or other areas away from her mouth, you will rely on the lower-spatial-frequency vision of your periphery and see her as smiling. If you look directly at her mouth, you will use your high-acuity foveal (cone) vision and the smile will be less apparent.
It is the purpose of the rods and cones to transform light energy into electrical activity. This is accomplished by a chemical process that works differently in rods and cones. The photochemical in rods is called rhodopsin, which is very sensitive to light. This is what allows you to see in very dim light. However, bright light such as daytime sunshine will cause rhodopsin to not function in the same way. Thus, in the daytime you see mainly with cones that contain different photochemicals.
A new device for those who have lost functioning in rods and cones is described in the box: Applying Psychological Science—Bionic Eye Implants Give Vision to Those with Profound Vision Loss.
Applying Psychological Science—Bionic Eye Implants Give Vision to Those with Profound Vision Loss
Philip Booth first began losing his sight at the age of 7. Booth has an inheritable disease in which the rods and cones in his retina slowly began to stop working. By age 30 he was totally blind. However, in the last few years, he has been testing a device that allows him to see letters and structures such as doors and windows (da Cruz et al., 2013).
The device is referred to as Argus II. Argus II has two major components. The first is a pair of glasses that contains a video camera with a processing unit that can wirelessly transmit the visual signal. The second component is a grid of electrodes that is placed directly on the retina. The wireless connection means that there are no wires going to the person’s retina. Additionally, only a single operation is needed to initially implant the electrode grid. When the wireless signals reach the electrodes, the cells in the retina are stimulated in relation to the visual image recorded by the camera. These cells produce action potentials that go to the visual areas of the person’s brain.
Argus II has been approved in the United States by the US Food and Drug Administration (FDA) and in some countries of Europe. Research testing the device with a number of participants has shown it to be effective over at least six years (for example, da Cruz et al., 2013). At this point, the Argus II does not produce normal vision. Vision is limited to letters, words, and some large objects. However, it shows that it is possible to create a prosthesis that improves vision. One would expect that with better visual processors and increased electrode arrays, even better vision would be possible.
Thought Question: Why is it particularly important to augment the sensory capabilities of young children? What ideas do you have for innovations to enhance any of our senses?
1. In what specific ways has our ability to see and use visual information evolved to aid us in living in the world?
2. What is the electromagnetic spectrum, and how is it related to vision?
3. Draw your own image of the structure of the human eye. Label the important parts and describe the function of each. Be sure to include at least the following: cones, cornea, iris, lens, pupil, retina, rods, and vitreous fluid.
4. Rods and cones are both photoreceptors. What specific contributions does each provide to visual processing in humans?
Vision and Color
Before leaving the eye to understand visual processing in the brain, let’s look at color. Although much of the information in our visual world comes from shapes and edges, color offers richness that lets us quickly determine evolutionarily valuable information such as “is the fruit ripe?” It was Newton who understood that color is not part of the external world itself but something that humans create from specific parameters of the energy of light. At this point let’s ask how this energy produces the sensations needed to give us the perception of color.
Wavelengths and Color
The cones in the retina are sensitive to different wavelengths of electromagnetic radiation that the brain is able to change into the experience of color. There are three different cone types in terms of frequency sensitivity.
The first type is sensitive to a wavelength of about 430 nanometers (nm), which we experience as blue. A nanometer is one-billionth of a meter. The second is most sensitive to a wavelength of 530 nm, which we experience as green, and the third is most sensitive to 560 nm, which we see as red. However, if we had only one type of cone, we would not experience color at all. Why is that?
Our ability to see colors requires that the signals from the different types of cones be compared in the brain. For example, the experience of orange results from the red cones being the most activated, the green cones being less activated, and the blue cones being least activated. In this way we have the experience of seeing orange.
Our experience of color is actually the result of our nervous system comparing the signals from each of the cones against one another. Because of the computational nature of experiencing color, blue, red, and green cones are more accurately referred to in terms of their wavelength sensitivity as S (short), M (middle), and L (long) cones. But we are getting ahead of the story. Let’s now look at how we historically came to understand color vision.
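The comparison described above can be illustrated with a toy sketch: given hypothetical normalized activations for the S, M, and L cones, we rank them from most to least activated. The values and function name are illustrative assumptions, not measured cone responses:

```python
def activation_order(s, m, l):
    """Rank the three cone types from most to least activated."""
    ranked = sorted([("S", s), ("M", m), ("L", l)],
                    key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in ranked]

# The "orange" pattern from the text: L most activated, M less, S least.
print(activation_order(s=0.1, m=0.5, l=0.9))  # ['L', 'M', 'S']
```

The point of the sketch is that no single value identifies a color; only the relative pattern across all three cone types does.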
Theory of Color
Isaac Newton helped establish our modern understanding of color. He is well known for demonstrating that sunlight could be broken down into a spectrum of colors by using a prism. He further showed that this spectrum could be recombined, by passing it through a second prism, to produce the original white light.
Figure 5-7 Wavelength of light. Each of the receptors in the eye is sensitive to different frequencies of light.
Later, in the early 1800s, the British physicist Thomas Young demonstrated that any color can be produced by mixing red, green, and blue light. He began by reasoning that the retina could not possibly contain a separate receptor for every possible color. Instead, Young suggested that there must be a limited number of receptor types that respond to a limited range of colors.
Young then suggested that three kinds of “fibers” existed in the eye. One is sensitive to red, another to green, and the third to blue. Young determined that in people with normal color vision, three different color lights (red, blue, green) are necessary to match any other possible color.
In the middle of the 1800s, Hermann von Helmholtz expanded Young’s idea and also had individuals match a standard color by varying red, blue, and green light sources. He also sought to elaborate on how the visual system is able to experience color. The idea that variations in three different light colors lie at the basis of our experience of color came to be known as the Young—Helmholtz trichromatic theory of color.
An alternative theory of color vision was developed by Ewald Hering (1834—1918) who demonstrated that there are some aspects of color perception that the Young—Helmholtz trichromatic theory could not explain. This alternative theory is referred to as the opponent-process theory of color vision. As the name implies, certain colors result in opposite responses in the visual system. Let’s begin with an example, as can be seen in Figure 5-8. If you look at the center of the American flag for about 30 seconds and then look at a blank white sheet of paper and blink, you will see the real colors of the American flag.
Figure 5-8 Afterimage demonstration—look at the flag for 30 seconds and then look at a white sheet of paper.
The results are that blue produces a yellow afterimage and yellow produces a blue one. Likewise, green produces a red afterimage and red produces a green one. Hering also postulated a third opponent mechanism that involved light and dark, that is, black and white.
Initially seen as opposing theories, the trichromatic and the opponent process are now seen as complementary. In fact, each theory describes physiological mechanisms for color vision at different locations in the visual system. The trichromatic theory reflects early color processing involving the cones and the opponent process theory reflects later processing by the ganglion cells in the retina.
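One common way this two-stage idea is formalized (the exact weights here are an illustrative assumption, not the book's) computes a red-green signal as L minus M and a blue-yellow signal as S minus the average of L and M:

```python
def opponent_channels(s, m, l):
    """Toy opponent coding from normalized cone signals.
    red_green   > 0 leans red,  < 0 leans green
    blue_yellow > 0 leans blue, < 0 leans yellow
    """
    red_green = l - m
    blue_yellow = s - (l + m) / 2
    return red_green, blue_yellow

# A reddish stimulus: strong L, moderate M, weak S.
rg, by = opponent_channels(s=0.1, m=0.3, l=0.8)
print(rg > 0, by < 0)  # True True
```

This captures the complementary structure of the two theories: the cone stage supplies three trichromatic signals, and a later stage recodes them into opposing pairs, which is consistent with the afterimage pairs (red/green, blue/yellow) described above.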
Color-matching studies in the 1800s revealed that some individuals used only two of the three light sources to match colors. These individuals are commonly called color blind. Some color-blind individuals matched colors to a standard color using only green and blue light, while others used only blue and red light. Individuals who need only two lights are called dichromats. Color blindness is found more often in males than in females, since it is caused by a recessive allele on the X chromosome, as described previously. Figure 5-9 is one common test of color blindness, in which those with normal color vision see a “2” in the middle of the figure.
Figure 5-9 Common test of color blindness—you are to tell the number you see in the image.
At this point, we have discussed the basic structure of the eye as well as how fundamental information such as color is processed. However, basic stimulation of light on the retina needs to be processed by our brain to give us the rich and complex information we experience when we see the world. Let us now turn to how visual information is communicated from the eye to the brain.
1. If color is not part of the external world, how do we create it?
2. What did each of the following researchers contribute to our understanding of how we process colors:
a. Isaac Newton?
b. Thomas Young?
c. Hermann von Helmholtz?
d. Ewald Hering?
3. Most humans have three different types of cones. What happens when a genetic variation causes an individual to have two types of cones? What would happen if an individual only had one type of cone?
The Eye Connects to the Brain
We now leave the eye itself and move to its connection with the brain. That is, the next point on our tour is the place where visual information leaves the eye. The axons of the ganglion cells are long and extend deep into the brain. The point at which these axons leave the eye is referred to as the optic disk or blind spot. This is also the point at which blood vessels enter the eye. Although there are no receptors in this part of the eye to respond to light, the brain fills in the missing information by using information from the other eye and from eye movements. Thus, you see a complete scene without a hole in the image.
The demonstration in Figure 5-10 allows you to experience this lack of receptors at the blind spot. Simply hold this book up to your face with one eye closed. Starting about four inches from your face and focusing on the cross, slowly move the book back until the cow disappears. This is the point at which the cow falls on your blind spot.
Figure 5-10 To find your blind spot, close one eye and look at the cross. Begin with the image about four inches from your face. Slowly move the image away from you. At one point the cow will disappear as it falls on your blind spot.
Your visual field is nothing more than what you see in front of you without moving your head or eyes. Drawing an imaginary vertical line down the middle of this field creates a right visual field and a left visual field.
Your left visual field is projected onto the right side of both of your eyes. In turn, the right visual field falls on the left side of each eye. Information from the left and right visual fields is kept separate in the brain and, as shown in Figure 5-11, goes to different hemispheres. The left visual field information goes to the right hemisphere, and the information from the right visual field goes to the left hemisphere. About 90% of the retinal neurons go to a part of the thalamus, the lateral geniculate nucleus (LGN); information from the eye also goes directly to the superior colliculus.
Figure 5-11 Images from the left and right visual fields go to each eye. The information from each visual field is kept separate. Information from the right visual field goes to the occipital lobe of the left hemisphere. Information from the left visual field goes to the occipital lobe of the right hemisphere.
The Primary Visual Cortex
At this point on our tour of the visual system, we have gone from light-sensitive photoreceptors in the retina to ganglion cells capable of producing action potentials. The pathways continue to the LGN and then to the primary visual cortex. At each level, the brain is able to construct greater abstraction in relation to the visual stimuli.
At the primary visual cortex level, the visual stimuli are seen mainly in terms of lines, edges, movement, and color. This information is also interpreted in terms of the context in which it is presented. To continue the tour, we need to move to areas involved in even more abstract processes including motion, depth, form, and color.
It turns out that as we process visual information, our brains want to know two things. The first is what something is. Is it a house, a tree, or a face? It is also at this point that other processes such as our memory become involved. That is, we recognize a face because we have seen that person before and stored that information in memory. It can be noted that the what pathway is the only visual pathway that leads directly to the hippocampus, an area critically important for memory. This type of connection in the brain allows us to give meaning to basic visual information; we see something and remember what it is.
The second question we seek to answer is where something is located in our world. This pathway has to do with spatially related factors such as motion, depth, and location. Both the what and where pathways begin as light stimulates our retina, which sends information to the primary visual cortex also called V1. In technical terms, we say there is a ventral pathway to the temporal lobe involved in processing what something is, and a dorsal pathway to the parietal lobe involved in processing where something is. This can be seen in Figure 5-12.
Figure 5-12 There are two pathways in the brain—one related to where an object is and one related to what the object is.
Other researchers have emphasized the role the ventral pathway plays in visual perception and the role the dorsal pathway plays in visual control of action (for example, Milner & Goodale, 2006). In this case the ventral pathway helps an organism know what something is and its relation to space and time. The dorsal pathway is involved in using vision for doing things in the world. When you play tennis or baseball or other sports, you are constantly using visual information to direct your actions. We constantly estimate what will come next.
1. What is the blind spot? Since we all have it, how are we able to see a whole scene without a hole in the middle of it?
2. Once visual information has left the eye, what are the primary structures it passes through on its way to and through the brain to be processed?
If you were asked whether the world is three-dimensional, you would quickly answer “yes.” However, as you think about the perception of depth, you have to consider that your visual world began with visual stimuli falling on the retina. That is to say, you begin with a two-dimensional image on the retina, and from that your visual system constructs a three-dimensional world with depth. Quite an amazing feat, don’t you think?
There are two types of information that humans use to perceive depth. The first is a set of cues that can be determined using only one eye, called monocular cues. The second, described later in this section, is binocular cues, which depend on the slightly different views received by the two eyes. How we perceive depth was a critical question for artists of the Renaissance such as Leonardo da Vinci. Many paintings prior to this time, especially religious paintings, were two-dimensional and viewed as flat.
We use a number of cues to determine depth. Some of the cues we use for determining depth with just one eye are as follows:
1. Our knowledge of the object: From experience we know the size of people, trees, houses, buildings, mountains, and so forth. From this information we can determine whether something is close or far.
2. Occlusion: If a person or object hides the view of another person or object, we conclude that what is unhidden is closer.
3. Linear perspective: Parallel lines such as a railroad track appear to come together as they approach the horizon. This is called the vanishing point in art. Greater distance is experienced with a greater convergence.
4. Size perspective: If we assume two objects are the same size, we perceive the smaller one to be farther away.
5. Distribution of light and shadows: The manner in which light and dark fall gives us the impression of depth. Likewise, brighter colors are seen as closer.
6. Motion: Even with one eye, we experience objects that move more slowly across our retina to be farther away than objects that move quickly. As you run through the woods, for example, the bushes next to your path move more quickly across the retina than do the trees farther away. An even more classic example is looking out the window of a fast car. You see the area close to the road moving very quickly and that in the distance moving more slowly. This is referred to as motion parallax.
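The motion parallax cue can be put in simple geometric terms: for an observer moving sideways at speed v, a stationary object directly abeam at distance d sweeps across the retina at an angular speed of roughly v/d radians per second, so nearer objects move faster across the retina. A minimal sketch of that relationship, with made-up speeds and distances:

```python
import math

def angular_speed_deg_per_s(observer_speed_m_s, distance_m):
    """Approximate angular speed (degrees/second) at which a stationary
    object directly abeam sweeps across the retina of an observer moving
    sideways at observer_speed_m_s (small-angle approximation)."""
    return math.degrees(observer_speed_m_s / distance_m)

# Looking out the window of a car moving at 30 m/s (about 67 mph):
roadside = angular_speed_deg_per_s(30, 2)     # bushes about 2 m away
tree_line = angular_speed_deg_per_s(30, 200)  # trees about 200 m away
# The nearby bushes sweep across the retina 100x faster than the distant
# trees, which is why the near scene appears to rush past.
```

The inverse relationship between distance and angular speed is the information the visual system exploits when it uses motion parallax as a depth cue.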
In humans, our eyes are separated from each other by about 6 cm. What this means is that for distances of less than about 100 feet, each eye receives slightly different information when looking at the same scene. You can see this for yourself by quickly closing one eye after the other while looking at an object 5 or 10 feet away. As you close and open each eye, you will see the object shift slightly from side to side.
An even more graphic illustration is to hold your finger up in front of you and again alternate the closing of one eye and then the other. In comparison to the background image, your finger will move from side to side. This is particularly clear if you line up your finger with some vertical line in the background, such as a utility pole or the edge of a bookcase.
The fact that the image falls on a different place on the retina of each eye is called binocular disparity. Thus, the greatest differences or disparity between the images on the two retinas come from objects that are close, whereas objects at a great distance show almost no disparity at all.
Research has shown that there are cells in V1 that are sensitive to the differences in information received by the two eyes in terms of binocular disparity. Following the where or M-pathway from the visual cortex to the parietal area, one can also find cells sensitive to the disparity between the image on the two retinas. Additionally, there appear to be some cells that are sensitive to particular directions of movement and the amount of disparity. For example, some cells fire to a right-moving object that is close to the observer, whereas other cells fire to left-moving objects that are far away.
In the demonstration where you focused on your finger with alternating eyes, another process was at work. To focus on your finger or any other close object requires that each of your eyes moves and turns more inward. When looking at a distant scene, your eyes are more parallel. Whether you focus close or focus far requires different eye-movement positions. The muscle movement of the eyes gives an additional channel of feedback to the brain for determining depth.
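Both binocular cues just described shrink rapidly with distance, and that relationship can be sketched with simple geometry. Treating the eyes as two points about 6 cm apart, the angle between their lines of sight when fixating a point straight ahead (the vergence angle) is roughly separation/distance radians. This is an approximation for illustration, not a model of actual eye mechanics.

```python
import math

EYE_SEPARATION_M = 0.06  # about 6 cm, as noted in the text

def vergence_angle_deg(distance_m):
    """Approximate angle between the two eyes' lines of sight when fixating
    a point straight ahead at distance_m (small-angle approximation)."""
    return math.degrees(EYE_SEPARATION_M / distance_m)

finger = vergence_angle_deg(0.3)   # a finger about 30 cm from the face
far_wall = vergence_angle_deg(30)  # a scene about 30 m away
# Fixating the finger requires the eyes to angle inward by more than
# 11 degrees relative to each other; at 30 m the angle is near zero,
# so the eyes are almost parallel.
```

The same geometry explains why binocular disparity is informative mainly for nearby objects: beyond a few tens of meters the angular differences become too small to matter.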
One aspect of our visual system is that a variety of processes are occurring at the same time. This parallel processing allows us to create an image quickly. However, at times this results in what have come to be called illusions. In Chapter 1 you saw a white triangle that was not there. In Figure 5-13, we all see a white square, but actually there is no square there. It is the construction of our nervous system.
Figure 5-13 Image of white square—it is there, isn’t it?
We see what we expect to see. As you look at Figure 5-14, you read “A, B, C” in a horizontal manner and “12, 13, 14” in a vertical manner. Of course, the middle symbol is exactly the same. This is referred to as a top-down process since what is expected, or the idea of a particular word, determines what is seen. If you just focus on the “13,” you process the information from a bottom-up perspective. That is, your brain analyzes the nature of the stimuli such as a vertical line, curves, and so forth to determine that they are the numbers 1 and 3.
Figure 5-14 Image of A—B—C, 12—13—14—you see 13 or B depending on the context.
We tend to resolve the information that is presented to our sensory system in a coherent manner. This is based both on a top-down process in which our expectations help to form what we see and on a bottom-up process by which our sensory system puts together the individual features into a coherent scene (see Bar & Bubic, 2014 for an overview). One common experience of visual information is that we seek to create a consistent image. As you look at the image below, you see either a man playing a saxophone or the face of a woman (Figure 5-15).
Figure 5-15 Image of either a man playing a saxophone or the face of a woman.
Not only do we see things (the white square) that are not there, but we are also able to see a world that cannot exist. The artist M. C. Escher enjoyed playing with our sensory systems in many of his etchings. You can see these online (for example, https://www.wikiart.org/en/m-c-escher). We can even make a static figure move (see Figure 5-16).
Figure 5-16 Making a static figure “move.”
We think of the figures we just looked at as illusions. However, in some ways all of what we “see” is an illusion or construction of our nervous system (Eagleman, 2001). For example, we do not notice the edges of our visual field. That is, we don’t notice where the top and bottom or left and right of our field of vision ends. Neither do we notice that our vision in the periphery is not as sharp as that in the center of our visual field. When we watch an old 35mm film at the movies, we “see” continuous motion, whereas what is really there is a sequence of individual still pictures presented one at a time. Illusions are important since they help us understand how the visual system works. In fact, illusions have been studied and described for at least 2,000 years.
At the turn of the last century, an approach developed in Germany that emphasized the manner in which our perceptual system organizes the visual world in a predetermined manner. The name of this approach was Gestalt psychology, which you read about previously. As you remember “gestalt” means form or shape. The basic idea is that the whole is more than the sum of the parts. That is to say, the parts of a visual scene become organized in a manner such that a whole image emerges. This also happens in music in which the song you experience is more than just the sum of the individual notes. There is a form quality to the experience of music that transcends the instrument on which it is played or the musical key in which it is played.
The field of Gestalt psychology was developed by Max Wertheimer, Kurt Koffka, and Wolfgang Köhler. Gestalt psychology emphasized the observation of perceptual phenomena in a small number of individuals. For example, some of Wertheimer’s early work examined what he called the phi phenomenon, which is the experience of apparent motion when two lights are turned off and on at an interval of 50-60 milliseconds. The experience of motion from one light to the other emerges from the situation and cannot be explained by just knowing that two lights alternately turn on and off. Individuals see the light move from one location to the other.
In 1915, Edgar Rubin in Denmark showed that our perceptual system organizes ambiguous stimuli in a definite manner. His famous Rubin’s vase clearly demonstrates that given this ambiguous set of stimuli, our perceptual system will organize it in one of two ways, either as two faces or as a vase. This came to be known as the figure—ground relationship. When we see faces, the vase becomes the ground and is no longer viewed as a vase. The opposite is true when we see the vase.
By 1925, Koffka, Wertheimer, and Köhler had defined a number of principles that describe the manner in which our sensory system organizes patterns of sensory stimuli. The basic idea is that there is a set of organizing principles that are part of our evolutionary history. These include:
1. Similarity: We tend to see objects that are similar to one another to be grouped together. For example, in Figure 5-17 we see the dots organized in horizontal lines according to color. It is more difficult to view them as vertical lines.
Figure 5-17 Similarity—which dots are alike?
2. Proximity: We tend to see objects that are close together as part of a group. For example, in Figure 5-18 the four vertical lines in the figure are grouped into two separate groups.
Figure 5-18 Proximity—how do you group the lines?
3. Closure: We tend to fill in missing parts, even when the lines are not present. We have no problem seeing the panda in the illustration in Figure 5-19.
Figure 5-19 Closure—we fill in the missing lines.
4. Continuation: We tend to follow lines to their culmination. We are more likely to see the X figure as composed of two continuous lines in Figure 5-20.
Figure 5-20 Continuation—there is a predictable way that we see the beginning and end of the line.
5. Good figure: We tend to see structures in the manner that makes them as simple as possible. For example, in Figure 5-21, we tend to see three boxes rather than three “L” shapes on one side and three upside-down “L” shapes on the other.
Figure 5-21 Good figure—we create figures such as the boxes.
Learning to See Requires Movement
Until the middle of the last century, most people thought that you learn to see as the result of looking at the object. External stimulation was seen as the key. Held and Hein (1963) showed us that it was more complicated than just having external stimulation. In a very ingenious study, they took ten pairs of kittens. Each pair was from a different litter. One kitten rode in a gondola-like apparatus with limited ability to move. The other kitten was free to move itself in a number of directions as it pulled the gondola in a circle. This procedure is pictured in Figure 5-22.
Figure 5-22 Apparatus for equating motion and consequent visual feedback.
The kittens were exposed to the apparatus for 3 hours a day from 8 weeks to 12 weeks after birth. When not in the apparatus, the kittens were kept with their mothers and litter mates in a dark environment. What difference did riding versus walking have? Held and Hein looked at three different tasks.
The first task was simply holding the kitten in the experimenter’s hand and slowly moving it down toward the edge of a table. Normal kittens anticipate contact with the table and put out their paws. The second task was a visual cliff situation. In this situation, a red checkered tablecloth is placed either directly below a piece of glass or about a yard or so below the piece of glass. The glass is illuminated from below in such a manner that the glass is not apparent. If the animal has depth perception, it will not walk on the glass with the pattern a yard below it. The third task was simply to hold the kitten in a standing position and then for an experimenter to move his or her hand quickly toward the kitten. Normal kittens show an eye blink response.
When the kittens that walked and pulled the gondola were tested on these three tasks, they showed the normal response on each of the tasks. The kittens that rode did not show the normal responses. Overall, this research shows the importance of self-produced movement in the development of visual processes. It can also be noted that human infants do not show a fear response to the visual cliff until they are able to crawl (Campos, Bertenthal, & Kermoian, 1992).
What happens if someone loses his or her vision early in life? Does this result in the person developing a greater sensitivity to sound or touch? This idea has been around at least since the time of the ancient Greeks. Today, a variety of studies have shown that this is indeed the case (Bavelier & Hirshorn, 2010). Individuals who become blind early in life show increased auditory abilities. Likewise, individuals who become deaf early in life show superior visual abilities. What happens to these areas in the brain? The unused visual cortex in individuals who are blind is taken over by other sensory processes such as touch or hearing. Some research has shown that the primary visual cortex (V1) in blind individuals may be involved in reading Braille or in hearing. The same is true in deaf individuals, in whom the auditory cortex is taken over by other senses.
The World Is Your Laboratory—Human Sight Returned
The importance of learning to see has been highlighted through the study of unfortunate individuals who lost sight as children through accidents. Some of these individuals later had transplant operations that restored their ability to see. What they saw was of great interest to scientists.
One such case was a man referred to as S.B., born in 1906, who lost sight in both eyes at about 10 months of age (Gregory & Wallace, 1963). Fifty years later, this man received corneal grafts that restored his sight. The vision psychologist Richard Gregory worked with the physician who performed the operation to study the manner in which sight was restored. The patient’s initial sight was described as follows:
S.B.’s first visual experience, when the bandages were removed, was of the surgeon’s face. He described the experience as follows:—He heard a voice coming from in front of him and to one side: he turned to the source of the sound, and saw a “blur”. He realized that this must be a face. Upon careful questioning, he seemed to think that he would not have known that this was a face if he had not previously heard the voice and known that voices came from faces.
At the time we first saw him, he did not find faces “easy” objects. He did not look at a speaker’s face, and made nothing of facial expressions. On the other hand, he very rapidly (apparently within a couple of days) distinguished between passing lorries [trucks] and cars, and would get up at six each morning to look at them some way off. He “collected” different types of lorry, and took much pleasure recognizing vans, articulated lorries, and so on. His particular interest in cars and lorries may have been in part that they made familiar sounds, which helped in identification; that they could only be driven by sighted people, and so held out particular promise to him. He had spent many hours trying to visualize the shape of cars while washing them, particularly his brother-in-law’s car, which he frequently washed down.
The initial contact with S.B. resulted in Richard Gregory noting that S.B. was able to walk without the need for touching when he went through a doorway. He was also able to tell time from a large clock. S.B. reported that he learned to tell time by feeling the hands on a clock. He had also learned letters of the alphabet through touch. Thus, he was able to make a transition from touch to vision. He could also identify the colors red, white, and black. Other colors confused him.
Later, a variety of illusions such as the Necker cube, the Zöllner illusion, and the Poggendorf illusion were shown to S.B. These are shown in Figure 5-23, Figure 5-24, and Figure 5-25. The Necker cube changes perspective in terms of whether a corner is seen to be close or far away. The vertical lines in the Zöllner illusion are not seen as parallel. The diagonal line in the Poggendorf illusion is not seen as a straight line. In general, S.B. did not see the Necker cube change perspective; that is, he did not see the inside move toward or away from him. He also did not see the lines in the Zöllner illusion as non-parallel or the line in the Poggendorf illusion as anything but straight.
Figure 5-23 Necker cube.
Figure 5-24 Zöllner illusion.
Figure 5-25 Poggendorf illusion.
Richard Gregory also took S.B. to view particular tools, one of which was a lathe. Gregory notes that S.B. was somewhat agitated and could not identify the parts of the tools. Upon being allowed to touch the lathe, S.B. closed his eyes and correctly understood how it worked. S.B. reported, “Now that I’ve felt it I can see.”
Thought Question: S.B. reported being able to understand what he was seeing by being able to touch it. What senses could you use to help you understand your world if you could no longer see?
1. What characteristics built into our visual system allow us to experience illusions?
2. What is the difference between monocular cues and binocular cues for determining depth? Give an example of each.
3. Gestalt psychology is interested in the way our perceptual system organizes ambiguous stimuli in a definite manner. Identify and describe five organizational principles that are part of our evolutionary history.
4. Until the middle of the 20th century, most people thought that you learned to see by merely looking at objects. How have our views changed as a result of the kitten experiments? As a result of this research, what recommendations would you make to parents to provide an environment in which their infants develop their visual systems most effectively?
You are sitting and listening to your favorite music. You realize the complexity of the music but have no difficulty picking out the vocal part from that of the instruments. Then, from the next room you hear the sound of plates dropping and breaking as they hit the floor. This may be followed by your friend saying a few words reflecting his or her anger at the event.
Real-world sounds are extremely complex. We have the ability to make a wide variety of discriminations ranging from the sound of the human voice to a babbling brook to rain on the roof or a piece of music with one particular type of guitar or even sounds that signal danger (see Schnupp, Nelken, & King, 2011 for an overview). How do we do it?
Hearing, or audition, is the manner in which we detect sounds. The auditory system is able not only to detect whether a sound is present but also to determine where the sound is coming from. Unlike vision, in which the receptors in the eye are sensitive to changes in electromagnetic energy, hearing works on a more mechanical basis. It is changes in air pressure at particular frequencies that produce our experience of sound.
As you press a key on a piano, a string is struck. The string then begins to vibrate at a particular frequency. This in turn changes air pressure, which produces waves. Similar to throwing a rock into a still pond, the waves move outward from their source. If it is a large rock, the waves will be big. If it is a small rock, they will be small.
Large sound waves, or those high in amplitude, result in our experience of a loud sound. Small-amplitude waves result in a soft sound. The unit of measure for amplitude is the decibel (dB). Figure 5-26 shows the amplitude, expressed in dB, of common experiences.
Figure 5-26 Sounds experienced in everyday life and the level associated with each.
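The decibel scale is logarithmic rather than linear. Sound pressure level is conventionally defined as 20 · log10(p/p0), where p0 is a reference pressure near the threshold of human hearing (about 20 micropascals), so every tenfold increase in pressure amplitude adds 20 dB. A short sketch of that conversion:

```python
import math

P_REF = 20e-6  # reference pressure in pascals, near the threshold of hearing

def spl_db(pressure_pa):
    """Sound pressure level in decibels relative to the threshold of hearing."""
    return 20 * math.log10(pressure_pa / P_REF)

quiet = spl_db(20e-6)     # at the reference pressure: 0 dB
tenfold = spl_db(200e-6)  # ten times the pressure: 20 dB
# Each tenfold increase in pressure amplitude adds 20 dB, which is why
# the everyday sounds in Figure 5-26 span such a wide range of pressures
# while staying within roughly 0-140 dB.
```

This is why a 120 dB rock concert involves pressure variations a million times larger than a barely audible 0 dB sound, even though the dB numbers differ only by 120.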
If the waves are fast, we experience a high-pitched sound, whereas slower waves are experienced as lower in pitch. At times, we can both hear and feel sounds, as you might in an action movie in a theater. The unit of measure for frequency is cycles per second, or hertz, abbreviated Hz. The typical young adult can detect frequencies between 20 Hz and 20,000 Hz, whereas your dog can hear up to 40,000 or 60,000 Hz. In fact, most animals can hear higher frequencies than humans, with the porpoise, whale, and bat being able to hear above 100,000 Hz. As humans grow older, their hearing becomes less responsive to the upper frequencies.
If you heard the tones shown in Figure 5-27, you would hear simple tones. However, most sounds that we hear are complex. That is, they are made up of a number of sounds. If you think about a violin as it plays a note, what you hear is a complex sound. Timbre is the term used to describe the quality of complex tones such as a train whistle or a church bell ringing. Each of these is not a single-frequency wave but a complex set of waves around a particular frequency.
Figure 5-27 Sound is described in terms of frequency and amplitude.
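The difference between a simple and a complex tone can be sketched numerically: a simple tone is a single sine wave at one frequency, while a complex tone (such as an instrument playing a note) can be modeled as a sum of sinusoids, typically a fundamental plus harmonics at integer multiples of that frequency. The harmonic weights below are invented for illustration, not measurements of real instruments.

```python
import math

def simple_tone(t, freq_hz, amplitude=1.0):
    """Pressure of a pure (single-frequency) tone at time t seconds."""
    return amplitude * math.sin(2 * math.pi * freq_hz * t)

def complex_tone(t, fundamental_hz, harmonic_weights):
    """A complex tone: the fundamental plus weighted harmonics.
    harmonic_weights[k] is the amplitude of the (k+1)-th harmonic."""
    return sum(w * simple_tone(t, fundamental_hz * (k + 1))
               for k, w in enumerate(harmonic_weights))

# Two instruments playing the same 440 Hz note share a pitch but differ
# in timbre because their harmonic weights differ (values made up here).
violin_like = [1.0, 0.6, 0.4, 0.3]
flute_like = [1.0, 0.2, 0.05, 0.0]
sample_violin = complex_tone(0.001, 440, violin_like)
sample_flute = complex_tone(0.001, 440, flute_like)
```

Both waveforms repeat at 440 Hz, so they have the same pitch, but their differing mixtures of harmonics produce different waveform shapes, which we experience as different timbres.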
How the Ear Works
Sound waves enter the ear. It is the pressure of these waves that influences our hearing of sound. The components of a sound wave include frequency, amplitude, and complexity. Frequency determines the pitch that we hear. We hear a high pitch such as that produced by a piccolo or flute differently from the low pitch of a tuba. The intensity or amplitude of the sound wave determines whether we experience it as loud or soft. The complexity of the sound wave determines the richness or timbre of the experience of hearing.
Sound waves are directed through the ear canal to the ear drum, also called the tympanic membrane. These structures are referred to as the outer ear. Your outer ear is shaped in a particular way. This shape acts differently on particular frequencies arriving from different locations, and your brain uses this information to help localize sounds in space.
The sound waves coming from the outer ear move the ear drum. The ear drum transfers this energy to three bones: the hammer, anvil, and stirrup. The stirrup connects to the inner ear. Since the area of the ear drum is larger than the area where the stirrup connects to the inner ear, there is an increase in pressure. This increase in pressure allows for more sensitivity, especially in the middle frequencies of human hearing.
The cavity between the ear drum and the inner ear that contains the three bones is filled with air. In fact, the air comes from the throat through the Eustachian tube. The Eustachian tube is normally closed but will open when the pressure on the outside of the ear drum is different from that of the middle ear. Some people chew gum or yawn when flying in an attempt to equalize the two pressures. The experience is of your ears “popping.”
The sound waves from the middle ear are then transferred to the inner ear. The main structure that changes pressure waves into information that can be processed by the brain is the cochlea. Cochlea comes from the Greek word for snail and, as you can see in Figure 5-28, it looks like the shell of a snail. The human cochlea is wound about three times. The diameter of the coil becomes smaller as it progresses.
Figure 5-28 Auditory pathway.
When energy from the middle ear moves the stirrup, which is connected to the oval window at the end of the cochlea, this energy moves the fluid inside the tube. The oval window is a membrane that is flexible and can be moved to create wave-like action. At the other end of the cochlea is another membrane, the round window, which is also flexible. These two flexible membranes allow the fluid in the cochlea to move.
Like waves at the ocean, the pressure from the sound moves throughout the cochlea. However, the movement does not occur in a uniform manner. The movement of the fluid reflects the amplitude and frequency of the sound waves. Because the cochlea is mechanically more flexible in some areas and more rigid in others, specific frequencies of sound stimulate hair cells in different parts of the tube (see Figure 5-29).
Figure 5-29 Human cochlea.
To summarize, mechanical activity in the environment such as a tree falling produces sound waves. These sound waves are picked up by the ear and channeled down the ear canal. The waves then reach the ear drum, causing it to vibrate. These sound pressure vibrations move the bones of the middle ear. The third bone, the stirrup, moves the oval window in relation to the frequency and amplitude of the sound waves. This moves the fluid of the inner ear, which in turn stimulates hair cells in the cochlea. The stimulated hair cells release neurotransmitters that result in the firing of neurons that send information to the auditory areas of the brain. Notice that sound waves move through a number of physical structures before an action potential can be produced.
Sound waves in the air become mechanical vibrations in the bones of the middle ear, which in turn create fluid waves in the inner ear. Why does sound go through so many transformations? One answer is that from an evolutionary perspective, the auditory system is built on a very old system. Fish, for example, are sensitive to pressure changes in water. Porpoises receive sounds through their bodies rather than through their ears. Other organisms such as moths receive sounds through touch receptors located on their bodies. Thus, in humans we might think of the eardrum as a pressure or touch receptor protected by the ear canal.
Creating Sounds in the Brain
Three types of information come from the hair cells in the cochlea. The first is the pitch, or frequency, of the sound. The second is the amplitude, or how loud the sound is. The third is the temporal nature, or duration, of the sound, also referred to as complexity. From these three types of information, we create a world of music and language.
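The three properties just described map directly onto the parameters of a simple sound wave. As an illustrative sketch (not part of the text's own material), a pure tone can be generated in Python, where frequency corresponds to pitch, amplitude to loudness, and duration to the temporal extent of the sound:

```python
import math

def pure_tone(frequency_hz, amplitude, duration_s, sample_rate=8000):
    """Generate samples of a sine wave. The three properties the text names
    correspond to the three parameters: frequency -> pitch,
    amplitude -> loudness, duration -> temporal extent of the sound."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]

# A quiet 440 Hz "A" lasting 100 ms, sampled at 8 kHz:
tone = pure_tone(440.0, 0.5, 0.1)
print(len(tone))  # 800 samples
```

Changing any one parameter while holding the others fixed changes only one perceptual quality, which is roughly how the auditory system decomposes a sound.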
This information from each ear is initially kept separate as it goes to the initial structure in the brain stem involved with hearing, the cochlear nuclei. The information created in the cochlea of the left ear travels along auditory nerve fibers to a group of neurons referred to as the left cochlear nucleus. Information from the right ear goes to a different set of neurons, the right cochlear nucleus. The cochlear nuclei are located in the brain stem at the junction of the medulla and pons (see Figure 5-30). The cochlear nuclei are organized such that pitch information is mapped by neuron location, similar to the way the keys on a piano run from low-frequency to high-frequency notes.
Figure 5-30 The pathway in the brain that creates the experience of hearing a sound.
Creating sounds in your brain is a complicated process as are the pathways that auditory information follows as it moves through your brain (see Kandel, Schwartz, Jessel, Siegelbaum, & Hudspeth, 2013; Schnupp, Nelken, & King, 2011 for more detailed information). Only the basics will be presented here.
Above the level of the cochlear nuclei, information related to sound goes in parallel fashion to a variety of structures. These pathways generally include information from both ears. Since you have two ears, it is possible for a sound to reach one ear slightly before it reaches the other. A sound off to your left would reach the left ear about 700 millionths of a second before it reached the right ear. Of course, a sound from in front of you would arrive at each ear at the same time.
When a sound arrives at the right and left ears, it is processed in the cochlea and then sent to the cochlear nucleus associated with that ear. From the left and right cochlear nuclei, the information is sent to the superior olive (see Figure 5-30). Information arrives at the superior olive from each ear in the form of action potentials. If an action potential related to sound in one ear is delayed from that related to the other ear, neurons in the superior olive will respond. This set of neurons in humans is able to distinguish time differences as short as 10 millionths of a second (Kandel, Schwartz, Jessel, Siegelbaum, & Hudspeth, 2013).
How do you know where a sound came from in the external world? The superior olive is able to process differential loudness information between the sounds heard at the left and right ears. Using both the time at which a sound arrived at the left and right ears and the loudness of the sound, the superior olive begins to create a map of where in the external world a sound may have originated.
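The timing cue the superior olive uses can be approximated with simple geometry. The following sketch is illustrative only; the head-width value and the straight-path model are assumptions, not figures from the text:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, in air at roughly room temperature
HEAD_WIDTH = 0.21        # m, an assumed typical distance between the two ears

def interaural_time_difference(angle_deg):
    """Approximate interaural time difference (in seconds) for a sound
    source at angle_deg, where 0 = straight ahead and 90 = directly to
    one side. Uses a simple path-length model: the far ear hears the
    sound later by (head_width * sin(angle)) / speed_of_sound."""
    extra_path = HEAD_WIDTH * math.sin(math.radians(angle_deg))
    return extra_path / SPEED_OF_SOUND

# A sound straight ahead arrives at both ears simultaneously:
print(interaural_time_difference(0))              # 0.0 s
# A sound directly to the side arrives earlier at the near ear by a few
# hundred microseconds, on the order of the text's "700 millionths of a
# second" figure:
print(round(interaural_time_difference(90) * 1e6))  # ~612 microseconds
```

That the superior olive can resolve differences as small as 10 microseconds means it can distinguish far finer angular differences than this maximum delay suggests.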
Pathways go from the superior olive to the inferior colliculus and then to the medial geniculate nucleus and then to the primary auditory cortex in the left and right temporal lobes. In the same way that the primary visual cortex is called V1, the primary auditory cortex is called A1. The neurons of the auditory cortex are arranged in a manner that reflects the pitch or frequency of the sound information.
From the auditory cortex, there are two pathways that go to two different prefrontal areas of the cortex. One of the pathways has been called the what pathway and is associated with determining what a sound is. The other pathway is the where pathway and is involved with determining where in space an object is located.
If a person has lost their ability to hear, one approach has been to present information directly to the auditory nerve. This is referred to as a cochlear implant. As seen in Figure 5-31, a microphone, sound processor, and transmitter are placed on the skull above the ear. These are the external devices. The internal components include a receiver and an electrode system, which send signals to different parts of the auditory nerve.
Figure 5-31 With a cochlear implant, external sounds are processed and sent directly to the auditory nerve.
It should be noted that a cochlear implant does not restore normal hearing. Rather, it is a means to help the person use the sounds found in the environment, especially in terms of speech. With the advent of small computer chips, researchers are seeking to improve the quality of what is being heard.
Intact hearing makes learning to speak much easier. This is why children who are born deaf are fitted with cochlear implants as early as possible. Adults who have lost their hearing can also benefit from cochlear implants. These implants allow patients to understand the spoken word, but music is more of a problem. Improvements in hearing can be achieved by using cochlear implants in both ears.
According to the Food and Drug Administration (FDA), as of December 2012, approximately 324,200 people worldwide have received implants. In the United States, roughly 58,000 adults and 38,000 children have received them (http://www.nidcd.nih.gov/health/hearing/pages/coch.aspx). In 2017, the FDA approved a device that allows individuals with certain cochlear implants to stream sound directly from an iPhone or iPad. This will allow for phone calls and music to be sent directly to the cochlear implant.
The Vestibular System and Balance
Before we leave the ear, let’s look at another system that is contained in the inner ear. This is the vestibular system. The vestibular system contributes to your experience of movement, head position, and where you are in space in relation to gravity. It also functions as an internal guidance device.
As you saw in Figure 5-28 previously, there are three semicircular canals that are located near the cochlea. There are also two other structures called the saccule and the utricle. Like the cochlea, these structures and canals contain hair cell receptors. In fact, the hair cells related to hearing and those related to balance and your movement in space work in the same way. As you look at these canals you will notice that they are located in different orientations (see Figure 5-32). As such, each is sensitive to movement in different directions. Further, the saccule and utricle are slightly different in size such that one moves faster than the other. In that way you experience acceleration. Jumping up and down will influence the fluid in these structures and canals differently than moving your head from left to right. As this fluid moves, it puts pressure on the hair cells, resulting in action potentials being sent to a number of areas in the brain.
Figure 5-32 The vestibular system of the inner ear.
1. The auditory system processes sound waves to result in our experience of hearing. How do the following components of sound waves affect what we hear:
2. A book falls off the desk while you’re studying. What are the steps of the process that occur from the time the sound wave hits your outer ear until you identify the sound?
3. How do our two ears—and the separate parts of the brain to which they are connected—work separately and together to give us the information we need to identify what and where a sound is? And what is a “superior olive” anyway?
4. What is the function of the vestibular system and what are its important parts? How does it interact with the visual system?
The Chemical Senses: Smell and Taste
Smell and taste are different from vision and hearing. Rather than photons or pressure, we experience the chemicals in our world with smell and taste. We notice the smell of good food cooking, and we are attracted to where the smell is coming from. As this is happening, our digestive system prepares to process the food we are about to eat. However, if we open a container with rotten food, that is a different story. We are repulsed just by a quick experience of the odor. We have a long evolutionary history of protecting ourselves through the smells that we experience. The same is true with taste. Ask any 6-year-old. Humans generally seek sweet tastes and reject bitter ones. This makes sense in terms of evolution because, in nature, poisons are generally bitter rather than pleasant tasting.
Olfaction: The Smelling Sense
The wild pig’s sense of smell is extremely well developed (much better than both their eyesight and hearing), and they rely strongly on it to detect danger and search out food (Estabrook, 2015). They are capable of sensing some odors 5 to 7 miles away and may be able to detect odors as much as 25 feet underground! (If you want to learn more about wild pigs, see https://feralhogs.tamu.edu/frequently-asked-questions/frequently-asked-questions-wild-pigs/).
Unlike other animals such as the wild pig or your cat, you do not see humans using their sense of smell to find their way. However, we can do so (Porter et al., 2007). In one study, humans were able to track a scent trail outdoors, and practice improved this ability. The information received separately by each nostril also helped to improve the tracking. By the way, the participants were tracking the smell of chocolate.
Since humans rely more on vision than smell as a major sense, you might think that fewer areas of our brain are devoted to these discriminations. However, it is not that simple. It is estimated that we may be able to distinguish more than 10,000 different chemical smells (Buck & Bargman, 2013). Perfumers, food critics, and wine tasters can make even finer discriminations. Research shows that human mothers and their infants learn to detect the smell of each other.
Olfaction refers to your sense of smell. In your nose are sensory neurons that are responsive to chemicals that we experience as odors (see Buck & Bargman, 2013 for an overview). These sensory neurons are located in the olfactory epithelium, which is part of the nasal cavity. The cells in the olfactory epithelium are relatively short-lived and replaced by new cells every 30 to 60 days.
Surprisingly enough, mammalian species tend to have roughly the same number of neurons in the olfactory bulb relative to their overall brain size (McGann, 2017). As you see in the Myths and Misconceptions feature in this chapter, humans are actually excellent in terms of the ability to detect odors. When we perceive an odor, receptors produce electrical signals and send a pattern of activity to the olfactory bulb located below the frontal areas of the brain. The olfactory bulb processes the signal and serves as a relay station to other areas of the brain including the limbic area (see Figure 5-33). Thus, it is not surprising that we often have emotional reactions to odors. How about your grandmother's apple pie? Amusement parks such as Disney World use scents to enhance the park experience.
Figure 5-33 Smells stimulate the cells in your nose. Action potentials go from these to the olfactory bulb and then to other areas, especially those associated with emotionality.
Does food smell different when you are hungry? As you know from experience, the answer is yes. Not only are there pathways going from the olfactory bulb to the brain areas related to detecting odors, but there are also pathways from the brain to the olfactory bulb. This allows you to pay attention to important odors such as the smell of food when hungry. It also lets you do the opposite, that is, ignore odor sensations when not necessary. Thus, your physiological state can determine what and how you experience smells.
Across a number of animal species odors are able to bring forth specific innate behaviors. You may have heard of the chemicals called pheromones. Pheromones play an important role in mating and other behaviors. In some species, pheromones can signal when the female is receptive and able to conceive. They can also bring forth aggressive responses. Pheromones are excreted from the organism in sweat, urine, and other bodily fluids. This is how your dog, as well as other animals, marks his territory. In animals, pheromones (like odors) are detected by structures in the nasal passage but also by an additional tubular structure in the nasal area referred to as the vomeronasal organ (VNO). Humans do not have the vomeronasal structure, which is why we are less affected by pheromones.
Myths and Misconceptions—Humans and the Ability to Smell
If asked, most of us would say that we humans do not have the same ability to smell odors as do other animals. Our dogs seem to spend the majority of their life smelling things, especially trees and other dogs. Well, the idea that we humans have a terrible sense of smell turns out to be a myth. New research shows that humans have an excellent sense of smell (McGann, 2017).
Much like the claim that you should drink eight glasses of water a day, the idea that humans were inferior to other mammals in terms of olfaction had never been tested. It was just an idea that dated back to the 19th century and the work of Paul Broca. In a somewhat complicated interaction between the church and the scientists of the day, Broca had pointed out that the frontal lobes had increased in size over our evolutionary history. This was interpreted to support the idea of humans having free will. He further pointed out that the olfactory bulb of humans had not increased in size. It was also reported that a number of animals have a larger olfactory bulb relative to the size of other parts of their brain. As with many myths, the basic idea that humans had an inferior sense of smell was passed on from generation to generation without research.
In this century with better research techniques for identifying both chemical substances and neurons in different parts of the brain, a new picture of human abilities in terms of smell is appearing. For example, one study in the journal Science suggested that humans can discriminate more than one trillion different smells (Bushdid, Magnasco, Vosshall, & Keller, 2014). This is in comparison to being able to discriminate several million different colors, and half a million different tones. Even if the trillion number is high, our ability is clearly better than the 10,000 number often listed in textbooks.
Recent reviews support the idea that humans are strongly influenced by olfaction (McGann, 2017). Research has shown that environmental odors can prime specific memories and emotions, which can be related to post-traumatic stress disorder (PTSD) experiences. Odors can also influence autonomic nervous system activation, which can shape perceptions of stress and affect, which in turn results in approach and avoidance behaviors. Humans can even follow outdoor scent trails and exhibit dog-like behaviors when trails change direction. Also, humans are sensitive to the smell associated with other people. Of course, as with all of our senses, age, gender, and developmental stage can influence the ability to smell. Although humans may have a better sense of smell than many have suggested, dogs do have some advantages. These include more neurons in the brain structures associated with smell and noses that can breathe out without contaminating the scent trail. Dogs with their cold noses can also sense sources of heat (Bálint et al., 2020). Although dogs have important abilities, it is clear that the human sense of smell is much better and more important than previously thought.
Thought Question: How could our knowledge of smell be informed by a myth for such a long period of time? What are some other research studies that could be conducted based on this new knowledge of our sense of smell?
The Gustatory System: A Matter of Taste
Although smell is associated with a number of processes, taste is mainly related to food (see Buck & Bargman, 2013 for an overview). That is, the taste system determines the chemical makeup of foods and beverages in terms of nutrient content, agreeableness, and potential toxicity (Roper & Chaudhari, 2017). Previously we thought that humans and other mammals were sensitive to four qualities of taste. These were sweet, bitter, salty, and sour. We now know there is a fifth taste—umami. Umami is a Japanese word and can be translated as “pleasant savory taste.” This is the taste associated with amino acids, which some experience as similar to monosodium glutamate (MSG). Both sweet and umami are found to be pleasant, whereas bitter brings forth aversive reactions. Salts are important to help organisms maintain electrolyte balance, and this taste typically adds flavor to foods. To complicate the story a little more, there is now a suggestion that some of the receptors on the tongue that were thought to be sensitive to sour are really sensitive to water (Zocchi, Wennemuth, & Oka, 2017).
By definition, taste refers to the five qualities processed by our gustatory system—sweet, bitter, salty, sour, and umami. Flavor—which is more complex—refers to the combination of sensory processes that we experience in terms of taste, smell, and texture as well as our experience of chewing. Not only do you smell food through your nose but also odors are transmitted through the back of your mouth. If your mouth is dry or you have a cold, the flavor of the food you eat is greatly changed. Further, genetic differences in alleles determine whether you experience foods such as brussels sprouts and broccoli as bitter or not (Roper & Chaudhari, 2017).
As seen in Figure 5-34, taste signals are produced in the mouth and go through the brain stem and the thalamus to the area of the brain related to taste. There is also a pathway from the thalamus to the hypothalamus that controls feeding behaviors. Most of our taste buds are located on the tongue, although there are some in other areas of the mouth and throat. The bumps on your tongue are not taste buds but papillae. Your taste buds lie buried around these papillae. In order to taste something, it must first be dissolved in the saliva in your mouth. The combination of the food and your saliva makes its way to the taste buds, which are garlic-shaped structures.
Figure 5-34 Taste signals begin with your taste buds and from there go to a number of pathways in the brain. These include those connected with eating. There are also facial muscles and those in the mouth that react to tastes we find repulsive.
1. What evolutionary role do the sensory processes of smell and taste provide for humans?
2. What are pheromones, and what role do they play in many animal species? Why aren’t they as relevant to humans?
3. What are the five qualities of taste that humans can detect?
4. Which three sensory processes combine to produce the human experience of flavor, and how is that different from taste?
Touch and Pain
We move through the world. We pick up a bottle of water. We write on our computer, text on a cell phone, and open a door without paying much attention to our sensation of touch. If we touch something that is hot, we feel pain. If we break a bone or have a headache, we also feel pain. Both touch and pain are critical to our ability to live our lives. This section describes the mechanisms involved and the experience of touch and pain.
The Sense of Touch
Touch plays a critical role in our lives. It helps us use tools. We use touch to tell us how much force is needed to pick up a cup of coffee or glass of water. If you are a cook, you might use touch to see if meat or bread or a cake is cooked correctly. If you work with wood or clay you use touch to measure smoothness. If you are blind you actually use touch to read using braille. Mothers of many species use touch to soothe their infants. In fact, infants of many species including humans show delayed developmental processes without touch. And, of course, individuals use touch to intensify romantic relationships.
Historically, a distinction has been made between active and passive touch. Active touch is when you move your body, generally your hands, against another object or person. Passive touch is when another person or object rubs against your skin. Both passive and active touch use the same receptors in the skin and the same pathways to the brain (Gardner & Johnson, 2013); however, the cognitive features may differ. If something rubs you as you walk through the woods, you initially react and seek to determine what it is. You may also pick up something in the woods and feel it actively to determine what it is. Although the question is the same—what is it?—in each case, the cognitive and emotional processes differ. There is also a close connection between active touch and the motor system. It is also the case that active and passive touch show different patterns of brain activation (Simões-Franklin, Whitaker, & Newell, 2011).
Think for a second as to what happens when you place an object between your fingers. Your skin conforms to the shape of the object. As your skin moves in relation to the object, information is sent to your brain. This information creates a tactile image, which includes the object’s shape and texture as well as the amount of force needed to hold the object. Holding Jell-o is not the same as holding a pen.
Your hand has four types of receptors that supply information to your brain. As can be seen in Figure 5-35, the four types—Meissner corpuscles, Merkel cells, Pacinian corpuscles, and Ruffini endings or corpuscles—are shaped differently and are located at different depths of the skin. Merkel receptors are related to sensing fine details and fire constantly as long as a stimulus is present. Meissner corpuscles fire when the stimulus is first applied and then again when it is removed. These are associated with controlling handgrips. Deeper in the skin are the Ruffini endings, which respond continuously when the stimulus is present and are related to the stretching of the skin. Pacinian corpuscles are also deeper and are sensitive to rapid vibrations and fine texture.
Figure 5-35 Your skin contains four types of cells that are sensitive to different aspects of touch—shape, texture, movement, and pressure.
The four types of receptors work together to give you a sense of shape, texture, movement, and pressure. The receptors perform this task by being sensitive to skin stretch, edges, lateral motion, and vibration (Figure 5-36). Similar to receptor cells in the eye, touch cells have receptive fields. These receptive fields help you experience the tip of a sharp pencil differently from the blunt eraser end. Some of these receptors are quick-acting and respond to changes. For example, you notice when you initially put your smart watch or fitness tracker on, but after a time you no longer feel its pressure on your skin. Other longer-term receptors respond as long as the stimulus is present.
Figure 5-36 We use the receptors in our fingers to determine what an object is.
One way researchers study touch is to take two objects such as straight pins and place them at different distances on the skin. They then note when the pins are experienced as two pins versus one pin. The most sensitive areas of your body are your fingers and lower face (lips and cheek). Think of how you use your fingers and lips versus your back and legs. The least sensitive areas are your calf and thigh.
The information from the touch receptors goes through the spinal cord and then to the brain stem and the thalamus. As the information goes from the spinal cord to the brain, the fiber tracts cross to the other side so that information from your right hand goes to your left hemisphere and information from your left hand goes to your right hemisphere. From the thalamus, touch information goes to the parietal lobe.
The area of the parietal lobe involved in sensing touch is referred to as the somatosensory cortex. Touch information on the somatosensory cortex is organized in terms of locations on the body. More brain area is associated with greater sensitivity, as shown by the size of body parts in Figure 5-37. It should be noted that directly in front of the sensory cortex is the motor cortex. The motor cortex is organized similarly to the sensory cortex. This allows for an integration of the experience of touch with making actions.
Figure 5-37 The sensory cortex receives information from our body, whereas the motor cortex controls movement.
Unlike blindness or deafness, which are not uncommon in the population, the inability to experience pain is rare. Those who cannot experience pain typically have a genetic disorder; they may still experience touch. One such person was a 10-year-old boy in Pakistan, a street entertainer who made money by putting knives through his arms or walking on burning coals. Unfortunately, he died at age 13 after jumping off a roof (described in Cox et al., 2006).
As noted in the story of Tom at the beginning of Chapter 1, phantom limb is the feeling that a body part still exists even after it has been amputated. This experience occurs in the majority of people who have lost an arm or leg. Some of these individuals not only experience the limb as still present but also experience pain in it. The nature of this pain varies from constant pain to short-lived episodes, and it can vary in intensity (described in Flor et al., 2006).
Although most of us do not seek the experience of pain, historically it is critical for our survival (see Basbaum & Jessell, 2013 for an overview). Pain helps us know when we need to withdraw from a situation such as touching something hot. Pain also helps us know if there is a problem with our internal bodily systems such as a toothache or broken bone. Since pain tends to make us withdraw from a situation, it has the ability to help us minimize additional damage to our bodies.
The description of discomfort comes in many forms. We talk about shooting pain, dull pain, throbbing pain, burning pain, stinging pain, and cool pain to name a few. We also use the term soreness to reflect discomfort in our muscles. Pain can be short term, persistent, or even long term. As with other sensory processes, pain is the result of a complex sensory experience interpreted by the brain.
The simplest way we have to determine if pain is present in people is to ask them. The verbal report is the most common technique used in pain research. However, genetic factors make some individuals more sensitive to pain than others. Environmental conditions also play a role. Thus, there is an individual factor in all pain reports. Unfortunately for scientists, there is no direct measure of pain.
What if you were working on a project nailing some wood together and missed and hit your finger? Your first pain sensation would be a sharp sensation. This would be followed by a more prolonged pain that might have a burning sensation. The sharp sensation involves A delta (Aδ) pain fibers and the following dull sensation involves C fibers.
The cells related to the experience of pain outside of the brain are referred to as nociceptors. By the way, the brain itself cannot feel pain, as it contains no nociceptors. So why, you might ask, do you get headaches? There are nociceptors in the tissues between your brain and skull. These tissues can be influenced by chemicals released from blood vessels, and blood flow itself can trigger migraine headaches. Outside the brain, nociceptors are located in the skin and in the structures below the skin such as muscles and joints. Overall, these receptors convert thermal, mechanical, or chemical stimuli into action potentials.
These receptors connect to pathways that lead to the spinal cord and then the thalamus in the brain. The experience of pain is complex in that a number of areas of the brain are involved. The areas involved depend on the context in which the pain is experienced as well as the person's previous experience of pain. As expected, the limbic system through the cingulate is involved in the emotional aspects of pain. The insula, which is associated with the internal state of the body, is also involved in processing pain. Damage to the insula results in the person being able to distinguish between different types of pain, such as dull versus sharp, but not showing emotional responses to the pain. This suggests that the insula is an area in which the sensory, affective, and cognitive components are integrated.
Figure 5-38 Examples of pain assessment scales.
You may have noticed there is an exception to what has been said about pain so far. That exception is illustrated in the phenomenon of phantom limb pain. There is pain, but there are no receptors where we would expect them to be. Pain in this case originates from the central nervous system (brain and spinal cord). This is referred to as neuropathic pain. Generally, neuropathic pain is caused by damage to the central nervous system or damage to nerves in the periphery of our body.
1. Define active touch and passive touch. In what ways are they the same, and in what ways are they different?
2. What are the four types of touch receptors that are in your hand? What is the role of each, and how do they work together for you to experience an object?
3. Trace the touch sensory pathway from the finger through to the somatosensory cortex.
4. What is unique about the organization of the motor cortex and the sensory cortex? What advantage does that organization provide?
5. What evolutionary role does the sensory process of pain provide for humans?
6. How can a phantom limb cause pain since there are no longer any nociceptors present?
Learning Objective 1: Describe how psychophysics relates to our understanding of how we experience the world.
The manner in which our brain and nervous system take energy that exists around us and turn it into an experience is a critical aspect of sensation and perception. Sensation refers to the manner in which our receptor system transforms energy into activity that can be interpreted by the brain. Perception is the manner in which the brain makes sense of this activity. Simple sensations are processed differently in each of our sensory systems. Each of the sensory systems uses different biological processes referred to as transducers to initiate the sensory process. Our brain begins with information from each of the sensory systems alone. Our nervous system then processes this information, which gives us an experience that we interpret as reality. Thus, our experience of reality is based on how our nervous system is constructed.
Psychophysics (established by Gustav Fechner in the 1800s) is the study of the relationship between the physical characteristics of a stimulus and the manner in which we experience it. The relationship between an increase in physical intensity and the subjective experience is logarithmic rather than linear. In each sensory system, there is a constant that reflects the ability to notice differences between two stimuli. This smallest detectable difference is referred to as the just-noticeable difference, or JND. Fechner also noted that the JND changes as the original stimulus (for example, a weight) changes, remaining a constant proportion of it. This smallest detectable difference between two stimuli is also called the difference threshold. When there is just one stimulus present, the smallest detectable amount of stimulation is the absolute threshold.
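The constant-proportion observation and the logarithmic relationship can be made concrete with a short sketch. The Weber fraction used here is a hypothetical value chosen for illustration, not a figure from the text:

```python
import math

def weber_jnd(intensity, weber_fraction):
    """Smallest detectable change (JND) for a stimulus of a given
    intensity, under the constant-proportion rule: delta I = k * I."""
    return weber_fraction * intensity

def fechner_sensation(intensity, threshold=1.0, k=1.0):
    """Fechner's logarithmic law: subjective sensation grows with the
    logarithm of physical intensity, measured from the absolute threshold."""
    return k * math.log(intensity / threshold)

# With a hypothetical Weber fraction of 0.02 for lifted weights,
# a 100 g weight needs ~2 g added to feel different,
# but a 1000 g weight needs ~20 g:
print(weber_jnd(100, 0.02))   # 2.0
print(weber_jnd(1000, 0.02))  # 20.0

# Doubling intensity always adds the same amount of sensation,
# whether going from 1 to 2 or from 100 to 200 (logarithmic growth):
print(fechner_sensation(2) - fechner_sensation(1))      # ~0.693
print(fechner_sensation(200) - fechner_sensation(100))  # ~0.693
```

This is why turning up an already loud stereo by one notch is barely noticeable, while the same physical increase from silence is obvious.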
Learning Objective 2: Describe the key processes in the visual system.
Vision is our most important sense. At least half of our brain is related directly or indirectly to the processing of visual information, including areas involved in the recognition of faces.
Light is electromagnetic energy. The light we see is only a small part of the electromagnetic spectrum. In the visual spectrum, we refer to this light as waves, which can be described in terms of: (1) amplitude or how large it is; (2) frequency or how often it recurs; and (3) length or how much distance there is between two consecutive waves. When our eye transforms the visual spectrum, we see the lower frequencies (longer wavelengths) as red and orange and the higher frequencies (shorter wavelengths) as blue and violet. The greens and yellows are in between.
Light reaches the cornea where it is bent and the lens where it is focused. Both are located at the front of the eye. The object we see is focused on the retina at the back of the eye. The pupil is surrounded by muscles contained in the iris, which let the pupil allow in more or less light according to the intensity of the light energy. Within the eye there is a clear fluid referred to as the vitreous. Light passes through this fluid and reaches the back of the eye, which contains the receptors that change the electromagnetic energy of light into electrical energy. This electrical energy carries the information that the brain uses to construct what we see.
Rods permit nighttime vision and allow us to see in dim light. Cones permit daytime vision and provide high spatial acuity, which gives you the ability to see detail. Rods and cones also differ in their location: there are many more rods in the peripheral parts of the retina, and more cones in a central region of the retina called the fovea. The cones in the retina are sensitive to different wavelengths of electromagnetic radiation that the brain is able to change into the experience of color. There are three different cone types in terms of frequency sensitivity: (1) blue, (2) green, and (3) red. Our ability to see colors requires that the signals from the different types of cones be compared in the brain. The idea that variations in three different light colors lie at the basis of our experience of color came to be known as the Young-Helmholtz trichromatic theory of color.
Some individuals use only two of the three light sources to match colors. These individuals are commonly called color blind; some match colors to a standard color by using only green and blue light, while others use only blue and red light. Color blindness is found more often in males than females since it is caused by a recessive allele on the X chromosome.
The point at which axons leave the eye is referred to as the optic disk or blind spot. Although there are no receptors in this part of the eye to respond to light, the brain fills in the missing information by using information from the other eye and eye movement. Thus, you see a complete scene without a hole in the image.
As information leaves the eye, it travels by long axons—the optic nerve—to the lateral geniculate nucleus (LGN), part of the thalamus. Different types of information such as color and contrast go to different areas in the brain. Your visual field is what you see in front of you without moving your head or eyes. The visual information from each eye remains separate in the LGN and is conveyed to a variety of areas in the primary visual cortex in the occipital lobe. Each of the areas in the primary visual cortex is involved in different aspects of visual processing such as depth perception, color, movement, and form.
If the direct pathway from the eye to the LGN is lost through brain damage, the experience of vision is lost.
Information from one type of ganglion cell, the P or small type cells, travels to the primary visual cortex and then continues to the temporal lobe along what has been referred to as the what pathway, which is involved with knowing what a given visual object is. The other type of ganglion cell is the M or large type cells. Their information moves along a separate pathway to a different part of the primary visual cortex. It then goes to the parietal lobe and is referred to as the where pathway, which is involved with knowing where an object is in space. Together, the what pathway helps an organism know what something is and its relation to space and time, while the where pathway is involved in using vision for doing things in the world.
Our visual experience of motion is a combination of muscular and position feedback of the head and eyes and the visual stimuli on the retina. The illusion of apparent motion clearly shows that the perception of motion cannot be explained by the position of the image on the retina alone and that perceiving the position of an object is performed separately from determining motion.
There are two types of information that humans use to perceive depth. The first is a set of cues that can be determined using only one eye, or monocular cues: (1) our knowledge of the object, (2) occlusion, (3) linear perspective, (4) size perspective, (5) distribution of light and shadows, and (6) motion. The second type of information for perceiving depth comes from binocular cues related to the fact that humans have two eyes that are separated from one another by about 6 cm. Thus, for distances of less than about 100 feet, each eye receives slightly different information when looking at the same scene, called binocular disparity. The greatest differences or disparity between the images on the two retinas come from objects that are close. When looking at a distant scene, your eyes are more parallel and there is almost no disparity between the images. Whether you focus close or focus far requires different eye-movement positions. The muscle movement of the eyes gives an additional channel of feedback to the brain for determining depth.
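The geometry behind these binocular cues can be sketched with the roughly 6 cm eye separation mentioned above. The function name and the example distances below are illustrative, not from the chapter:

```python
import math

EYE_SEPARATION_CM = 6.0  # approximate human interocular distance (from the text)

def convergence_angle_deg(distance_cm):
    """Angle between the two eyes' lines of sight when fixating a point
    straight ahead at the given distance (a simple geometric sketch)."""
    return math.degrees(2 * math.atan((EYE_SEPARATION_CM / 2) / distance_cm))

# A nearby object produces a much larger convergence angle (and retinal
# disparity) than a distant one, which is why disparity is a strong
# depth cue only at close range.
print(convergence_angle_deg(30))    # ~11.4 degrees (reading distance)
print(convergence_angle_deg(3000))  # ~0.11 degrees (30 m away, eyes nearly parallel)
```

The steep falloff of this angle with distance matches the text's point that beyond roughly 100 feet the two retinal images are nearly identical, leaving monocular cues to carry the depth information.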
A variety of processes in our visual system are occurring at the same time. This parallel processing allows us to create an image quickly. However, at times this results in illusions, based both on a top-down process in which our expectations help to form what we see and on a bottom-up process by which our sensory system puts together the individual features into a coherent scene. Not only do we see things that are not there, but we are also able to see a world that cannot exist, such as in the etchings by the artist M. C. Escher. We can make a static figure move. In some ways, all of what we “see” is an illusion or construction of our nervous system. For example, we do not notice the edges of our visual field. Illusions are important since they help us understand how the visual system works.
Gestalt psychology was developed in Germany at the beginning of the 20th century and emphasized the manner in which our perceptual system organizes the visual world in a predetermined manner. The parts of a visual scene become organized in a manner such that a whole image emerges; the organizing principles include: (1) similarity, (2) proximity, (3) closure, (4) continuation, and (5) good figure.
Individuals who become blind early in life have been shown to have increased auditory abilities. Likewise, individuals who become deaf early in life have been shown to have superior visual abilities. The unused visual cortex in individuals who are blind is taken over by other sensory processes such as touch or hearing. The same is true for deaf individuals, with the auditory cortex being taken over by other senses.
Learning Objective 3: Explain what happens in the auditory system that allows us to hear.
Mechanical activity in the environment, such as a tree falling, produces sound waves that enter the ear, and it is the pressure of these waves that influences our hearing of sound. The components of a sound wave include (1) frequency, which determines the pitch that we hear; (2) amplitude or intensity, which determines whether we experience it as loud or soft; and (3) complexity, which determines the richness or timbre of the experience of hearing. Sound waves are picked up by the ear and channeled down the ear canal to the ear drum, which they cause to vibrate. These sound pressure vibrations move the bones of the middle ear. The third bone, the stirrup, moves the oval window in relation to the frequency and amplitude of the sound waves. This moves the fluid of the inner ear, which in turn stimulates hair cells in the cochlea. The stimulated hair cells release neurotransmitters that result in the firing of neurons that send information to the auditory areas of the brain. Sound waves move through a number of physical structures before an action potential can be produced. Why so many transformations? From an evolutionary perspective, the auditory system is built on a very old system.
Three types of information come from the hair cells in the cochlea: (1) pitch or frequency; (2) amplitude or loudness; and (3) temporal nature or duration of the sound, or complexity. This information from each ear is initially kept separate as it goes to the left and right cochlear nuclei. Using both the time at which a sound arrived at the left and right ears and the loudness of the sound, the superior olive begins to create a map of where in the external world a sound may have originated. Pathways then go from the superior olive to the inferior colliculus and then to the medial geniculate nucleus and then to the primary auditory cortex. From the auditory cortex, there are two pathways that go to two different prefrontal areas of the cortex: (1) the what pathway, which is associated with determining what a sound is; and (2) the where pathway, which is involved with determining where in space an object is located.
The vestibular system, also located in the inner ear, contributes to your experience of movement, head position, and where you are in space in relation to gravity. There are three semicircular canals located near the cochlea, as well as two other structures called the saccule and the utricle. Like the cochlea, they contain hair cell receptors related to balance that work in the same way as those related to hearing. The canals are located in different orientations. As such, each is sensitive to movement in different directions. The saccule and utricle help you experience acceleration. As you move, information from the vestibular system is sent to the brain and in turn coordinates your movement to keep you balanced. The system is also able to stabilize your vision such that moving your head does not cause the object you are looking at to become blurred.
Learning Objective 4: Summarize the role that chemical processes play in smell and taste.
Smell and taste are different from vision and hearing. Rather than photons or pressure, we experience the chemicals in our world with smell and taste. We notice the smell of good food cooking, and we are attracted to where the smell is coming from. As this is happening, our digestive system prepares to process the food we are about to eat. However, if we open a container with rotten food, that is a different story. We are repulsed just by a quick experience of the odor.
Olfaction refers to your sense of smell. In your nose are sensory neurons that are responsive to chemicals that we experience as odors. These receptors produce electrical signals and send a pattern of activity to the olfactory bulb, which processes the signal and serves as a relay station to other areas of the brain including the limbic area. Your physiological state can determine what and how you experience smells. We have a long evolutionary history of protecting ourselves through the smells and tastes that we experience. It is estimated that we may be able to distinguish more than 10,000 different chemical smells.
Taste refers to the five qualities processed by our gustatory system: (1) sweet, (2) bitter, (3) salty, (4) sour, and (5) umami. Flavor, which is more complex, refers to the combination of sensory processes that we experience in terms of taste, smell, and texture as well as our experience of chewing. Taste signals are produced in the mouth—most of our taste buds are located on the tongue—and go through the brain stem and the thalamus to the area of the brain related to taste.
Learning Objective 5: Describe the different senses of touch and what happens when we experience pain.
Touch plays a critical role in our lives. Historically, a distinction has been made between active and passive touch: (1) active touch is when you move your body, generally your hands, against another object or person; and (2) passive touch is when another person or object rubs against your skin. Both use the same receptors in the skin and the same pathways to the brain; however, there are differences: (1) the cognitive features may differ; (2) there is a close connection between active touch and the motor system; and (3) active and passive touch show different patterns of brain activation. Your hand has four types of receptors that supply information to your brain: (1) Meissner corpuscles, (2) Merkel cells, (3) Pacinian corpuscles, and (4) Ruffini endings. The four types of receptors work together to give you a sense of shape, texture, movement, and pressure; they do this by being sensitive to skin stretch, edges, lateral motion, and vibration. The information from the touch receptors goes through the spinal cord and then to the brain stem, the thalamus, and the somatosensory cortex in the parietal lobe. Touch information on the somatosensory cortex is organized in terms of locations on the body. The motor cortex is organized similarly to the sensory cortex, allowing for an integration of the experience of touch with making actions.
Pain is critical for our survival. As with other sensory processes, pain is the result of a complex sensory experience interpreted by the brain. Since there is no direct measure of pain, the verbal report is the most common technique used in pain research. Like other sensory processes, there are particular receptors in our body that are sensitive to pain. The cells related to the experience of pain outside of the brain are referred to as nociceptors. These receptors connect to pathways that lead to the spinal cord and then the thalamus in the brain. The experience of pain is complex in that a number of areas of the brain are involved depending on the context in which the pain is experienced as well as the person’s previous experience of pain. The phenomenon of phantom limb pain is an exception—there is pain, but there are no receptors where we would expect them to be. Pain in this case originates from the central nervous system (brain and spinal cord) and is referred to as neuropathic pain.
1. The author states that “[we] do create the world we experience.” How is this different from the idea that we experience the world directly as it really is? Use concepts presented in this chapter such as synesthesia, illusions, psychophysics, and signal detection theory in developing your response.
2. If you only had one type of cone, you would not be able to see the color associated with it or any color at all for that matter. Why?
3. You are sitting on a bench just looking at your visual field opening up in front of you. Suddenly, without turning your head, you see a change in the left part of your visual field. What are the multiple pathways that new information takes to your eyes and ultimately to your brain?
4. Do we live in a three-dimensional world? Of course we do, but since our visual perception starts with a two-dimensional image on the retina, what cues does our visual system use to add depth to that image and construct a three-dimensional world?
5. Before you read this chapter, it was easy moving around the world, taking in different sensory stimuli, identifying them, and responding appropriately. Now you know that each of these sensory systems is sensitive to very different types of stimuli that follow different and very complex paths through the nervous system and brain before you are able to identify what you are perceiving. How do all of these stimuli and processes work together to produce a stable and meaningful world in which we feel comfortable living? How does the story of S.B. whose sight was restored inform your understanding of how incredibly hard this all really is?
6. The author adopts an evolutionary perspective to answer the question of why sound goes through so many transformations. What does an evolutionary perspective have to offer in helping us understand our other sensory processes—vision, smell, taste, touch, and pain?
7. A chemical in the environment is detected by your nose. Describe the process that kicks into gear for you to experience that chemical as an odor and react to it.
8. A substance is placed on your tongue. Describe the process that enables you to identify the quality of the taste and then experience it as a flavor.
9. An object is placed in your hand. Describe the process that allows you to identify by touch what it is. What roles do the sensory cortex and the motor cortex play in that process?
10. Describe the pain process, focusing on the two main types of fibers/axons and the types of nociceptors that are located on them. What kinds of sensations activate each type?
11. You are reading this book to learn about psychology, so go back through each of the sensory processes covered in this chapter—vision, hearing, smell, taste, touch, and pain—and note instances in which these sensory processes interact with psychological processes.
For Further Reading
✵ Cytowic, R., Eagleman, D., & Nabokov, D. (2011). Wednesday Is Indigo Blue: Discovering the Brain of Synesthesia. Cambridge, MA: MIT Press.
✵ DeSalle, R. (2018). Our Senses. New Haven, CT: Yale University Press.
✵ Frith, C. (2007). Making up the Mind: How the Brain Creates Our Mental World. Malden, MA: Blackwell Publishing.
✵ Kandel, E. (2016). Reductionism in Art and Brain Science. New York: Columbia University Press.
✵ Cochlear implant—http://www.nidcd.nih.gov/health/hearing/pages/coch.aspx
Key Terms and Concepts
hearing or audition
just-noticeable difference or JND
opponent-process theory of color vision
optic disk or blind spot
Young-Helmholtz trichromatic theory of color