Sensation and perception
Brain and behaviour
Loraine Townsend, Kirston Greenop & Oliver Turnbull
After studying this chapter you should be able to:
•define the terms sensation and perception
•explain what is studied in the field of psychophysics
•understand the role of thresholds in the process of sensation and perception
•describe the structure of the eye, ear, tongue and nose, and their respective sensory pathways to the brain
•contrast theories of colour vision
•explain the forms of visual perception
•discuss the theoretical assumptions about the organisation of vision
•identify and describe the principal categories of neuropsychological disorders of vision that are observed after a brain lesion has occurred
•explain the role of sound waves in hearing
•compare the theories of hearing
•compare the chemical senses of taste and smell
•explain why people experience pain differently
•outline the kinaesthetic sense and the vestibular sense.
Melinda was sitting outside on the grass in the middle of summer. She could feel the warm sun on her skin; she could smell the flowers that were growing nearby. Melinda could also feel the soft grass tickling her. She started to feel drowsy and so decided to gently stretch her legs and lie down. As she looked up she saw her friend coming towards her through the dark green trees with the bright yellow flowers. She heard the birds twittering softly and then heard her friend calling to her. Melinda sat up and watched her friend approach. Her friend had brought some ice-cold juice, which Melinda drank quickly; the cold, fresh sensation it left lingered on her tongue.
Melinda found it fascinating to think of all the complicated brain processes that allowed her to be able to perform the simple task of seeing. She was lucky enough to have no problems with vision, and using her sight felt completely natural to her. It was hard to imagine what it must feel like not to be able to see these things, to see them in confusing ways or, perhaps worst of all, to see them but not be able to make sense of them. As she learned about how many neurological problems could affect vision, her ability to see felt even more amazing. ’Imagine being able to see faces, but not recognise them!’ she thought to herself. Melinda imagined that this must be a very disturbing experience. It was quite frightening to think about how much could go wrong with the complex workings of the brain, but learning about these helped her to understand and appreciate the remarkable things her brain was able to do.
Melinda knew that she often took her senses for granted, but when she did think about them she was amazed at how they allowed her to make meaning of the world around her.
As we discussed in Chapter 7, neurons are nerve cells, and different neurons have different roles to play in our nervous systems. For nearly all people, when a flower is in front of their eyes, their neurons responsible for sight will fire and they will see a flower. Similarly, when an oven door opens while bread is baking, most people’s neurons for smell will fire and they will smell baking bread.
However, for a very small number of people, the sensations they receive may be processed as an overlapping experience. So, for example, they may see smells or taste sounds. This is called synesthesia, where one experience overlaps with another (Simner, 2012). A person may hear a particular music note but perceive it as a sugary taste, or a person may experience someone else’s voice as the taste of chocolate and coffee. Cases of synesthesia are very rare, but they illustrate the need to study the different senses and how people make meaning of sensation. They can also help to demonstrate how sensation and perception may affect a person’s psychology: imagine having an overlapping sense while everyone else’s senses remain separate. Also imagine what happens to one’s thoughts, feelings and behaviour when drugs, brain damage or emotion affect perception. Understanding sensation and perception is fundamental to understanding how people think, behave and feel, and plays an important role in the field of psychology.
This chapter describes each sense (with a particular focus on vision) and explains what happens as the sensory receptors pick up signals from the environment and send them to the brain. This chapter also explains the usual pathways that these experiences take in the brain, as well as what happens when the pathways are disrupted or damaged in a way that affects the sensation or perception experience. However, before each sense is explained in detail, both sensation and perception must be defined, and the field of psychophysics must be introduced.
Figure 8.1 What kind of object is this?
Sensation and perception
Sensation is a passive process during which the sensory receptors and the brain receive information from the environment. Perception, on the other hand, is a process that entails actively choosing information from sensation, organising it and interpreting it to make meaning of the world.
Imagine you are looking at an object. Sensation occurs when the signals reach the eye, move to the brain and register as different lines, colours and shapes. Perception occurs when you make meaning of all those different lines, colours and shapes, and in so doing see the object as, for example, a flower. The process of seeing an object and recognising it as a flower would, therefore, involve the following steps:
Step 1:Energy signals in the environment reach the specialised receptor cells in the eyes.
Step 2:These specialised receptor cells turn the energy signal into an electrochemical impulse as part of a process called transduction.
Step 3:The impulse is then sent to the relevant brain region(s), and sensation occurs.
Step 4:The brain makes meaning of the message, and perception occurs.
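The four steps above can be sketched as a toy Python pipeline. Everything here is illustrative only: the function names are ours, and the strings simply stand in for the physical and neural signals involved.

```python
def transduce(energy_signal: str) -> str:
    """Step 2: a receptor cell turns an energy signal into an impulse (toy stand-in)."""
    return f"impulse({energy_signal})"

def send_to_brain(impulse: str) -> str:
    """Step 3: the impulse travels to the relevant brain region, and sensation occurs."""
    return f"sensation({impulse})"

def interpret(sensation: str) -> str:
    """Step 4: the brain makes meaning of the message, and perception occurs."""
    return f"perception of {sensation}"

def sense_and_perceive(energy_signal: str) -> str:
    """Steps 1-4 chained together: energy in, perception out."""
    return interpret(send_to_brain(transduce(energy_signal)))
```

The point of the sketch is simply that each stage transforms the previous stage's output, so perception depends on every earlier step working.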
The specialised receptor cells in the sensory organs are only activated when energy in the correct form hits them. For example, the receptor cells in the eye will only fire when light waves hit them, not when sound waves hit them. The energy signals to which each sense responds are set out in Table 8.1.
While sensation involves picking up the bits of signals from the environment, perception involves making meaning of this information. Making sense of many small pieces of information depends on what you already know, what you have learnt as well as the social and cultural context in which you live. For example, suppose you are at a party where you are surrounded by many people in conversation. If you listen to someone talking to another person quite a distance away from you, you may not hear the whole conversation. The pieces of the conversation you do hear are like the pieces of sensation you pick up from the environment.
Table 8.1 The energy signals to which the various senses respond (light waves for vision, sound waves for hearing, chemical molecules for taste and smell, and temperature and pressure signals for touch)
Making sense of the pieces of conversation is similar to perception. However, your perception may be incorrect. You may hear ’dinner … tired … struggle … murder’. If the person whom you overheard is a small, dainty lady with lots of make-up and jewellery talking to a similar lady, you may perceive the conversation to be about her having to make dinner when she was so tired and that it was murder in a figurative sense, meaning that it was unpleasant. If the two people in conversation are very large, muscular men, you may think they are talking about an actual murder. You heard the same pieces of information (similar to the bits of sensation signals we receive) and, in making meaning of them, you took into account the sociocultural context as well as the appearances of the people (perception).
This example illustrates that perception is always subjective as it depends on the person who is interpreting the sensations in order to make meaning from them. This is a personal process based on past experience, learning and the environment in which the person lives. Therefore, while we all make meaning of our sensations, this meaning depends on who we are, where we live, and what our past experiences are. You have probably experienced this when you heard the same event described by different people: they all seem to tell a slightly different story, focusing on different aspects.
It is not only the medical world that uses psychophysics — manufacturers use it all the time to avoid creating unpleasant experiences for consumers.
For example, imagine you wanted to buy a piece of fish. You walk into a shop and have to wait a few moments for your eyes to adjust to the light so that you can find the fish counter. You look around but cannot see any fish, so you decide to ask an assistant. However, the music is playing so loudly that the assistant thinks you are looking for a dish. This frustrating experience would probably make you leave the shop without buying any fish.
People who design the interiors of shops use psychophysics to determine what level of light is needed for consumers to see the products yet not be overwhelmed by brightness. The level of background music is also studied so that the consumers can experience it as pleasant but still be able to hear someone talking to them.
Psychophysics is a special field in psychology that studies sensations, their limits and how they are perceived. Sternberg (2004) explains psychophysics as the study of how physical energy stimulating the sensory organs results in meaningful psychological experience. This field of study asks questions such as: ’How loud must a sound be in order for a person to hear it?’ or ’How bright must a light be in order for a person to see it?’
Sternberg (2004) observes how psychophysics is used every day by health care workers, who are constantly evaluating how great a specific stimulus needs to be in order for us to perceive it. The implication is that if the level of stimulus that is required for a person to detect it is too high or too low, there may be something wrong with that person. For example, suppose a man had his hand crushed in an accident. In measuring whether he can feel pin pricks on his hand, the doctor or nurse can determine his levels of feeling in that hand as well as whether or not his nerves have been damaged. When we go for routine medical checks, the health care worker asks questions such as: ’Can you feel this?’, ’Can you see this?’ or ’Can you hear this?’ The answers to these help the health care worker determine the body’s level of functioning.
Besides these levels of stimuli, psychophysics also studies the thresholds needed to detect stimuli, the ability to discriminate between stimuli, the errors we make in detection and how we become used to all the stimuli around us.
A threshold is the level of energy that a stimulus must have in order for an organism to perceive it. Think of this as the threshold you cross when you walk through a door. When you are outside the door (below the threshold), you cannot detect who is in the room, but as soon as you cross the threshold, you can see that there are, for example, 10 people there. You used energy to cross the threshold and can now perceive the stimuli of 10 people. Similarly, a stimulus must have energy in order to cross the threshold and be noticed.
The absolute threshold is the minimum amount of energy required for an organism to detect a stimulus. For example, how much energy must be detected before you know a mosquito is sitting on your leg? A real-world example comes from the field of safety precautions. Ole-Herman (2004) investigated the absolute threshold of sounds presented over loudspeakers and emergency transmission systems so that the voices that were transmitted would definitely be heard. The voices needed to exceed the absolute threshold of hearing so that people could hear them.
Signal detection theory
According to signal detection theory, people do not always detect a signal at the same time or in the same way, and sometimes they even get it wrong. Being able to detect the presence of a stimulus depends on many factors, including the individual, fatigue, expectations and past events (Holt et al., 2012).
In a test situation, someone might be asked to say whether they saw a flash of light. If the person said ’yes’ when the light appeared, this would be called a hit. If the person said ’no’ when the light was present, this would be called a miss. If the person said ’yes’ when the light did not flash, this would be called a false alarm. Finally, if the person said ’no’ when the light did not flash, this would be called a correct rejection. These four options are set out in Table 8.2.
Table 8.2 Possible outcomes of signal detection (adapted from Sternberg, 2004)
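The four outcomes follow mechanically from two facts: whether the signal was present and whether the person said ’yes’. A minimal Python sketch makes the logic explicit (the function name `sdt_outcome` is ours, not a standard one):

```python
def sdt_outcome(signal_present: bool, said_yes: bool) -> str:
    """Classify one trial into the four signal detection outcomes."""
    if signal_present:
        return "hit" if said_yes else "miss"           # signal was there
    return "false alarm" if said_yes else "correct rejection"  # no signal

# Example: the light flashed and the observer said 'yes'
sdt_outcome(signal_present=True, said_yes=True)  # "hit"
```

Response bias then shows up as a tendency to say ’yes’ (more hits but also more false alarms) or ’no’ (more correct rejections but also more misses).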
In tests, people also differ in the way they decide whether or not they have sensed a certain stimulus. Some people may guess ’yes’ even when they are unsure, while others may guess ’no’. This is called a response bias.
Bourne and Russo (1998) provide the example of a soldier on patrol. The soldier may notice certain sounds when out on normal patrol (probably more sounds than he would if he was relaxing at home). He would probably notice even more sounds if a comrade had recently been shot. This example shows that even in the realm of sensation, people are not objective; their past experience has an impact. When studying signal detection, we must take into account subjects’ individual differences as well as the context in which the study is being performed.
Discriminating between stimuli
Psychophysics not only studies how we detect stimuli, but also how we detect differences between stimuli. In many situations it is important to notice the differences between stimuli. Holt et al. (2012) provide the example of a piano tuner who needs to be able to detect the slightest change in sounds or pitch so that the piano, once tuned, sounds perfect. People also need to be able to detect changes in the taste of food, for example, so that they can tell when the food is going off.
The difference threshold explains these judgements. This threshold is ’the line one has to cross’ in order to tell that stimulus A is different from stimulus B. However, people may make mistakes when deciding whether two stimuli differ. The just noticeable difference (JND) is the level at which people will notice a difference between two stimuli 50 per cent of the time; it is the minimum difference required for a sense to register that a stimulus has changed. The JND therefore defines the difference threshold operationally.
The German physiologist Ernst Weber found that, in order for the JND to be reached, a stimulus need not change by the same absolute amount every time. If you increase the volume of the radio by one level when it is playing very softly, you will notice the change; however, if you increase it by one level when you are already playing it very loudly, you are unlikely to notice the change. Weber’s law, the First Law of Psychophysics (Holt et al., 2012), states that noticing a change depends on the proportion by which the stimulus has changed. For example, if food has a very low level of salt (e.g. a level of 1), you would need to increase this by 20 per cent (to an overall level of 1.2) to notice a difference. If you wanted to notice a change in very salty food (e.g. with a level of 15), you would also need to increase this by 20 per cent (to an overall level of 18). While 20 per cent of 1 is 0.2, 20 per cent of 15 is 3, which shows that the noticeable change is not a fixed amount, but a fixed proportion.
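The salt arithmetic above is just a multiplication by the Weber fraction, which a short Python sketch can show (the function name is ours, chosen for illustration):

```python
def just_noticeable_increment(intensity: float, weber_fraction: float) -> float:
    """Weber's law: the smallest noticeable change is a fixed proportion
    (the Weber fraction) of the current stimulus intensity."""
    return intensity * weber_fraction

# The salt example from the text, with a Weber fraction of 20 per cent:
just_noticeable_increment(1, 0.2)    # 0.2, so level 1 must rise to about 1.2
just_noticeable_increment(15, 0.2)   # about 3, so level 15 must rise to about 18
```

The absolute increment grows with intensity, but the proportion stays constant; that is the whole content of the law.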
If you look at Table 8.3, you will notice that only a small change is needed for the average person to notice a difference in a visual stimulus (1.6 per cent). However, a 20 per cent difference is required for the average person to notice a change in a salty taste, and a 10.4 per cent difference to notice a change in smell. This reflects how heavily humans rely on vision. (These JND levels would not apply to animals that rely more on other senses; such animals would notice changes in odour at much smaller proportions than humans do.)
Table 8.3 Percentages by which a stimulus has to change before one can notice the difference, expressed as difference thresholds (Weber fractions) for senses including vision, taste, smell, lifted weight (kinaesthetics) and pressure on skin (touch) (Teghtsoonian, 1971 in Holt et al., 2012)
Adaptation to stimuli
If we responded to everything that we noticed in our environment, we would not be able to cope. So we adapt to our environment by tuning out some stimuli. Adaptation occurs when we are constantly surrounded by a particular stimulus and so start to block it out. For example, if you walk into a room and smell a horrible stench, the smell will seem less strong the longer you stay in the room.
Our eyes go through this process of adaptation all the time as they adapt to bright or dark light. Adaptation occurs at the same rate regardless of how recently we adapted to the stimuli and it is a process over which we have little conscious control (Sternberg, 2004). For example, if you get into a hot bath the water will feel very hot initially, but as you gradually adapt to the temperature it will feel colder, and you may even add more hot water after a while. If you get out of the water at that point, the outside temperature will seem very cold, but you will adjust to that as well. If you then get back into the bath, it will seem hot again even if you only stepped out for a short time to fetch a bar of soap.
•Sensation and perception depend on neurons firing when physical energy reaches them. Some people have synesthesia, in which sensation experiences overlap.
•To understand sensation and perception, we have to know the pathways that these sensory experiences take in the brain, and how the brain uses these signals to perceive an experience.
•Sensation is a passive process during which the receptors in the sensory organs are activated by energy stimuli from the environment.
•Perception is a process that entails actively choosing information from sensation, organising it and interpreting it to make meaning of the world.
•Psychophysics is the special field in psychology that studies sensations, their limits and how they are perceived. It is interested in the thresholds needed to detect stimuli, the ability to discriminate between stimuli, the errors we make in detection and how we become used to all the stimuli around us.
•A threshold refers to the level of energy that a stimulus must have in order for an organism to perceive it. The absolute threshold is the minimum amount of energy required for an organism to detect a stimulus.
•Signal detection theory argues that different people detect signals at different times and in different ways. This depends on individual differences, fatigue, context and personal experience.
•People can detect changes in the levels of stimuli. The just noticeable difference (JND) is the level at which people will notice a difference between two stimuli 50 per cent of the time; this is the minimum difference required for a sense to register that a stimulus has changed.
•Adaptation refers to our ability to tune out some stimuli so that we are not overwhelmed by them.
The sensory systems
Each sensory system has certain unique characteristics. These are discussed below, together with what may happen when certain systems are disrupted.
In brief, visual sensation and visual perception involve an energy signal hitting the receptors in the eye, this signal being sent to the brain, and some meaning being made out of the signal in the brain. However, in this section the mechanisms underlying vision are discussed in more detail. Once you understand these mechanisms, you will be able to track the visual system’s pathways and relate disorders of visual processing to them.
8.2 THE CASE OF MR S
Source: Shenker (2005)
Mr S, a 75-year-old man, went to his doctor because he was afraid he was losing his sight. He said that his vision had become blurry and he thought that he should perhaps get stronger glasses. He was quite worried, however, because he experienced problems with his vision only when he was reading. When asked to read, he said that he could see the words but that they did not seem real. He could catch a ball, walk around without bumping into the furniture and identify colours. When his vision was tested, he could see everything in his left visual field but nothing in his right visual field. He couldn’t read any words but he could write them. When he wrote the words, however, he would then say, ’What does it say?’ He couldn’t read what he had written. When an MRI was done on his brain, doctors found that he had had a stroke in his left occipital lobe that had also affected some of the fibres of the back part of his corpus callosum.
Mr S was diagnosed as having a condition known as alexia (the inability to read) without agraphia (the inability to write), and without having lost the abilities to detect stimuli or to detect changes in stimuli.
In order to explain the disturbance to Mr S’s vision, we would have to evaluate all the components of his visual system. To do this we would first try to establish whether the energy from his environment was being received at the appropriate level. We would then investigate whether the structure of his eye was normal and working correctly in order to receive this signal.
The energy signal that the human eye receives is light. Light is a form of electromagnetic radiation (Holt et al., 2012) that travels in waves. Humans can only detect wavelengths ranging between 350 and 750 nanometres (or billionths of a metre), which is a small range of the electromagnetic spectrum (Sternberg, 2004). Bees, for example, can see ultraviolet light, which we cannot see (Sternberg, 2004). Figure 8.2 illustrates the electromagnetic spectrum, and shows that humans can see only a very limited portion of all the wavelengths that exist.
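The visible band quoted above is just a numeric range, so a one-line check can express it. This is an illustrative sketch using the 350–750 nm figures from the text; the function name is ours.

```python
def visible_to_humans(wavelength_nm: float) -> bool:
    """Return True if a wavelength (in nanometres) falls inside the
    roughly 350-750 nm band that human photoreceptors respond to."""
    return 350 <= wavelength_nm <= 750

visible_to_humans(550)  # True: a mid-spectrum wavelength humans see as green
visible_to_humans(300)  # False: ultraviolet, outside the human range
```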
The structure of the eye
Specialist cells in the eye pick up light in the form of wavelengths. However, before this light energy reaches these cells, it first has to travel through the eye itself. Figure 8.3 illustrates the structure of the eye and the path of light through the eye. The route that light travels to reach the photoreceptor cells at the back of the eye can be outlined in four steps:
Figure 8.2 The portion of the electromagnetic spectrum that humans can see (adapted from Coon, 2004)
Figure 8.3 The structure of the eye and the path light travels to reach the photoreceptor cells
1.The light hits the cornea of the eye. This is the curved covering on the outside of the eye, which protects the eye and keeps its shape.
2.Light passes through the pupil. The pupil appears as the black centre of the eye, but it is in fact an opening. The pupil can increase or decrease in size depending on how bright or dim the light is. If a person is looking at an object in bright light, the pupil will be small because little light is needed to see the object. However, if a person is looking at an object in the dark or in low light, the pupil will be large so that more light can enter the eye and the person can see the object more clearly. The iris is the coloured part of the eye that surrounds the pupil. Sternberg (2004) explains that the iris is a circular band of muscle that makes the pupil bigger or smaller. (A person’s eyes appear blue, green or brown because the iris reflects particular wavelengths of light away from the eye, and we perceive this reflected light as a specific colour.)
3.The light passes through the lens. This is like the lens on a pair of glasses. The lens has a bulging shape (just like glasses), but is flexible and can change the extent of this bulge. The lens focuses light onto the back of the eye and, in order for the object we are seeing to be in focus, that light needs to be bent at a particular angle — therefore the lens adapts constantly. If the lens does not focus the light correctly, we need glasses to correct this. The lens in glasses helps the lens of the eye to work correctly. In this instance, light will be bent going through the lens of the glasses and then again through the lens of the eye. Another interesting feature of the lens is that it turns the image we are seeing backwards and upside down, although the brain corrects this for us later.
Figure 8.4 If the lenses of our eyes do not focus light correctly, we can use the lenses in a pair of glasses to rectify the problem
4.The bent light that is focused onto the back of the eye hits the retina. The retina is very thin; in fact, it is as thin as this page (Sternberg, 2004). However, it contains all the cells that pick up light. These specialised neurons cover the whole of the back of the eye except one area, which is called the blind spot. The retina’s specialised neurons include cells called photoreceptor cells, which change the electromagnetic energy of light into electrochemical energy (the neural impulse), which can be relayed to the brain.
There are two kinds of photoreceptors: rods and cones, named for their distinctive shapes (see Figure 8.3). There are about 120 million rods and six million cones in the eye (Holt et al., 2012). Davey (2004) explains that rods and cones each have their own specific type of photopigment that responds to a specific wavelength of light. The rods enable us to see in low light: they are sensitive to light and dark but cannot detect colour. The cones, on the other hand, pick up colour and function best in bright light. Rods are found all over the retina, while cones are found mainly in the fovea, with their numbers reducing dramatically outside the fovea. The fovea is the area where the best visual acuity, or sharpest vision, occurs.
All the axons of the photoreceptor cells bundle together and exit the eye at the optic nerve. (Refer to Chapter 7 for more information about the process of electrochemical transmission.) Because the point where the optic nerve exits the eye contains only axons and no photoreceptor cells, it forms the blind spot. The blind spot leaves a gap in our vision, but we do not notice it because the brain fills it in.
Returning to the case study about Mr S, and having explained the structure and functioning of the eye, it would seem that Mr S does not have a difficulty at this level. If a person were to have a problem with the structure of the eye or the cells at the back of the eye, this would result in diminished overall vision or blindness. Mr S can still see nearly everything and has difficulty with words only.
Therefore, to explain Mr S’s difficulty, we need to investigate the message travelling from the eye to the brain.
The pathway to the brain
The electrochemical signals follow a route from the optic nerve through to the occipital lobe in the back of the brain. From the occipital lobe, the message may then be sent to other areas of the brain for further processing. This route is illustrated in Figure 8.5.
As the figure illustrates, everything you see in your right visual field is transmitted to the left hemisphere, and everything you see in your left visual field is transmitted to the right hemisphere. Notice that each eye picks up images from both the right and the left visual fields. At the optic chiasma the two pathways are sorted, and the fibres carrying information from each visual field cross over to the opposite hemisphere. The messages are then sent to the occipital lobes, which are responsible for receiving the sensory input and for the initial processing of the information into visual perception. Depending on the type of information, it is then sent for further processing to the temporal lobes (which tell you what things are) and the parietal lobes (which tell you where things are).
Figure 8.5 Visual pathways in the brain (adapted from Sternberg, 2004, p. 127)
Returning to the case study about Mr S, it now seems likely that Mr S has no problems with the general sensations of vision as he can see quite well, but he cannot make meaning of written words, suggesting that he cannot form a perception of words. This is clearly a difficulty in Mr S’s processing abilities. However, before coming to this conclusion, we should also evaluate the specific aspects of Mr S’s vision. He indicated that he sometimes sees a fuzzy image of the words. In order to assess this, we need to evaluate the quality of his vision (his visual acuity) and his colour vision.
Visual acuity refers to how well a person can see objects and distinguish between objects in the environment. If you have poor visual acuity, then you may not be able to see objects at a distance or the fine detail of an object. Generally, the better your acuity, the better your vision. Cones affect acuity as they are concentrated in one area of the eye (the fovea). Cones are also responsible for seeing in bright light.
We can still see in low light, however. This is called dark adaptation and occurs when the rods become activated in low light. Cones operate in bright light and are not of much use in the dark. The rods, however, function in the dark although they cannot detect colour. When you enter a dark room, you probably won’t see anything at first. As your eyes become used to the dark over the first five to ten minutes, you will start detecting the shapes and brightness of objects. The rods continue to become more sensitive over a period of about half an hour (Holt et al., 2012).
Returning to our case study, Mr S does not seem to have difficulties with visual acuity as his vision is normal for most objects. His colour vision is also normal as he did not report any difficulty in picking out different colours.
Seeing colour involves a combination of three factors (Coon & Mitterer, 2013). The three properties of colour are as follows:
1.Hue. This is determined by the wavelength of light. Sternberg (2004) states that we see the shortest wavelength as violet and the longest as red.
2.Saturation. This is determined by how pure the colour appears or how much it has been combined with white.
3.Brightness. This is determined by the amplitude of the light wave, which is the amount of light we see coming from the wavelengths.
The various theories of colour vision all attempt to explain how we can see so many different colours at different brightness and saturation levels. These theories are still being debated, and we will highlight only the two main theories here.
The trichromatic theory, proposed by Thomas Young and later modified by Hermann von Helmholtz (1852, in Sternberg, 2004), focuses on the primary colours of red, blue and green. Young and Helmholtz argued that because these three primary colours can be combined to produce every colour we can perceive, it seems logical that the photoreceptors in the eye are specialised to pick up these primary colours; it would, after all, be impossible to have a separate receptor for every colour that exists. They therefore proposed that we have three kinds of cone, each most sensitive to red, green or blue. The many different colours we see arise from these photoreceptors being activated to different degrees (Sternberg, 2004). This is similar to mixing paint: if you want violet, you mix a lot of blue with a little red. When we see violet, therefore, our blue-sensitive photoreceptors are highly activated and our red-sensitive ones less so.
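The core trichromatic idea is that a colour experience corresponds to a pattern of activation across three cone types. The toy Python sketch below illustrates only that idea; the activation values and function name are made up for illustration, not physiological data.

```python
def strongest_cone(red: float, green: float, blue: float) -> str:
    """Given activation levels (0.0-1.0) for the three cone types,
    report which cone type responds most strongly."""
    levels = {"red cone": red, "green cone": green, "blue cone": blue}
    return max(levels, key=levels.get)

# Violet-like stimulation in the text's paint-mixing example:
# a lot of blue activation with a little red.
strongest_cone(red=0.3, green=0.0, blue=0.9)  # "blue cone"
```

The perceived colour depends on the whole pattern, not just the strongest response; the sketch picks out only the dominant cone to keep the illustration short.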
Young and Helmholtz took support for their theory from the observation that some people who are colour blind are blind to only one colour: blue (exceptionally rare), green or red.
The other main theory of colour vision takes a different view. The opponent-process theory states that we have neurons in our retina that are able to process pairs of colours. These pairs are red-green, yellow-blue, and black-white. When stimulated, these neurons react to one side of the pair more than the other, resulting in your seeing more red than green, for example. The pairs are therefore called opponents because they work against each other. This theory can account for the experience of after-images. Find a coloured picture of the South African flag. Stare at the middle of the flag for 30 seconds, then look at a white piece of paper. You will see the flag, but in different colours — this is an after-image.
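The opponent pairs behave like a simple lookup: under the theory, fatiguing one member of a pair leaves the opposing member dominant, which is what the after-image shows. A minimal Python sketch of this pairing (the names are ours, chosen for illustration):

```python
# The three opponent pairs from the theory, stored in both directions.
OPPONENT = {
    "red": "green", "green": "red",
    "yellow": "blue", "blue": "yellow",
    "black": "white", "white": "black",
}

def after_image_colour(stared_at: str) -> str:
    """Under opponent-process theory, staring at a colour fatigues that
    side of its pair, so the after-image appears in the opposing colour."""
    return OPPONENT[stared_at]

after_image_colour("red")  # "green": a red patch leaves a green after-image
```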
Sternberg (2004) notes that we seem to need both theories to account for colour vision. Trichromatic theory is correct when it states that we have three kinds of cones, but the opponent-process theory is also correct at a higher level of the neuron. The opponent-process theory can also explain after-images, which the trichromatic theory cannot.
•Visual sensation and visual perception involve an energy signal hitting the receptors in the eye.
•This energy signal is light; light is a form of electromagnetic radiation which travels in the form of wavelengths. Humans can only detect a limited range of wavelengths.
•The light coming into the eye first reaches the cornea, then passes through the pupil. The pupil is surrounded by the iris, the coloured part of the eye.
•Behind the pupil is the lens which focuses light onto the back of the eye. The lens also turns the image backwards and upside down.
•The focused light waves hit the retina at the back of the eye. The retina contains the photoreceptor cells that pick up light. These cells change the electromagnetic energy of light into electrochemical energy which can be relayed to the brain.
•There are two kinds of photoreceptors: rods and cones. Rods enable us to see in low light while cones pick up colours and function best in bright light.
•Rods are found all over the retina, while cones are found mainly in the fovea. The fovea is the area where the best vision occurs.
•All the axons of the photoreceptor cells bundle together and exit the eye at the optic nerve. This area forms the blind spot.
•The electrochemical signals follow a route from the optic nerve through to the occipital lobe in the back of the brain. All the signals from your right visual field are transmitted to the left hemisphere, and everything you see in your left visual field is transmitted to the right hemisphere. This crossing over occurs at the optic chiasma.
•Visual acuity refers to how sharp a person’s vision is. Cones are most responsible for good acuity. Rods are activated in low light and adapt to allow us to see in the dark.
•Humans can see in colour. There are three aspects of colour vision: hue, saturation and brightness. The two theories of colour vision mentioned in this chapter are the trichromatic theory (Young and Helmholtz) and the opponent-process theory.
8.3 IS COLOUR BLINDNESS GENETIC?
According to Montgomery (2005), about 10 million men in America are colour blind. This is seven per cent of the male population, compared to only 0.4 per cent of the female population. This difference indicates that colour blindness has a genetic basis, which is supported by what we know of human chromosomes. The disorder is carried on the X chromosome, of which men have only one. Women have two X chromosomes, so they are protected to some extent: if the gene on one X chromosome is faulty, the copy on the other can compensate.
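The sex difference above follows from simple X-linked arithmetic. This sketch assumes random mating (Hardy-Weinberg proportions) and treats the seven per cent male rate as the frequency of the faulty allele; both are simplifying assumptions.

```python
# Why colour blindness is common in men but rare in women: the gene sits on
# the X chromosome. A man (XY) is affected if his single X carries the faulty
# allele; a woman (XX) needs it on both. Assuming random mating, the male
# rate directly estimates the allele frequency in the population.

male_rate = 0.07                # 7% of men affected (Montgomery, 2005)
allele_freq = male_rate         # men have one X, so rate = allele frequency
female_rate = allele_freq ** 2  # women need two faulty copies

print(f"Predicted female rate: {female_rate:.2%}")  # close to the 0.4% quoted
```

The predicted figure of about 0.49 per cent is close to the 0.4 per cent reported for women, which is what we would expect if the trait is X-linked and recessive.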
In the case of Mr S, it was found that his sensation was not affected, as his abilities to pick up light waves and to see colours were all intact. It was suggested that Mr S probably had a problem with his perception of visual elements. In order to make the visual world meaningful, the process of perception relies on the elements of sensation that enter the brain, as well as memory, past experience and the culture in which one lives.
Processing of visual signals in the brain
Visual signals are processed at different levels in the brain. Some regions deal with lower-level tasks and other regions with higher-level tasks. In addition, some regions deal with specialised tasks. The brain regions that are responsible for lower-level or more elementary tasks are called the primary visual areas, while the regions that are responsible for higher-level or more psychologically sophisticated tasks are called the secondary visual areas and the tertiary visual areas. A highly simplified diagram of the arrangement of this system is shown in Figure 8.6.
Primary visual areas
The primary visual areas are lower-level or less psychologically complex areas that lie towards the top of the occipital lobes (see Figure 8.6). This area is often referred to as the primary visual cortex.
The primary visual cortex is the input end of the brain’s visual system; it is the place where the more elementary aspects of processing occur (such as perception of light intensity and edge detection). Damage to the primary visual area causes a condition referred to as cortical blindness, where visual experience stops because key aspects of the input end of the visual system are disrupted. People experience this damage as blindness.
Figure 8.6 The regions of the brain that are responsible for the hierarchically ordered tasks of vision
Secondary and tertiary visual areas
In front of and below the primary visual cortex lies a more psychologically sophisticated system, which is composed of the secondary visual areas (see Figure 8.6). These are dedicated to a range of specialised visual processing tasks, such as the recognition of objects, and colour and motion processing.
Damage to the secondary visual areas causes more complex disorders of visual processing. Patients with damage to these areas lose, for example, the ability to recognise specific objects, or the ability to perceive colour or movement.
At the highest level of the visual system are the tertiary visual areas (see Figure 8.6). These carry out the most abstract and psychologically sophisticated aspects of visual processing, and also integrate information from other sensory modalities, such as hearing and touch. They are involved in important parts of arithmetic, writing, constructional operations and spatial attention, and in some respects represent the output end of the normal perceptual system.
Damage to the tertiary visual areas does not exactly affect visual perception, but rather causes more abstract disorders that go beyond concrete perception. Possible effects of damage to these areas are an inability to calculate, to write or to construct complex forms.
Top-down and bottom-up processing
Coon and Mitterer (2013) and Holt et al. (2012) distinguish between bottom-up and top-down processing in visual perception.
Feature-detection theory is an example of bottom-up processing. According to this theory, the neurons in the retina send information along the optic nerve to the brain via the thalamus. Of the many parts of the brain to which information is sent, the primary visual cortex is the main one; it is located in the occipital lobe at the back of the brain. Holt et al. (2012) note that research has shown that specific neurons in the retina make contact with specific regions of the primary visual cortex, in an almost one-to-one mapping. Some of these neurons in the primary visual cortex respond only to certain visual stimuli. For example, some may fire only for horizontal lines, others only for vertical lines. These are called feature detectors because they look out for certain features or characteristics. We seem to have feature detectors for many visual elements, such as colour, shape and motion. When we see something in front of us, these many feature detectors fire together (in parallel) and we integrate the information to form an image. This is a bottom-up process in that it takes all the elements of a visual array and combines them into something bigger and more meaningful.
Top-down processing works the other way round. Bourne and Russo (1998) use the following example. What is the middle image in Figure 8.7? The letter B? Or the number 13?
Figure 8.7 Top-down processing uses the context to understand an object
If you read the numbers in Figure 8.7 from left to right, you would expect the object in the middle to be 13, as it comes after 12; but if you read from top to bottom, you would read the object as B, as it comes after A. What you decide you see in this instance depends on past experience and learning, and that is the essence of top-down processing.
Rather than being competing ideas, top-down and bottom-up processing are both used in visual perception. We form our perceptions based on what we sense as well as what already exists in our brains (Holt et al., 2012).
Returning to our case study, it appears that Mr S can process information in a bottom-up fashion, as visual information activates feature detectors in his primary visual cortex. However, he does not seem to be able to use top-down processing to draw on his past experience and knowledge of words and letters in order to identify and read the words.
In the 1920s, gestalt psychologists identified and explained a set of principles that we use in order to perceive our world visually. These principles state that we take the elements that make up an object and form a meaningful whole from them. According to the gestalt psychologists, ’the whole is greater than the sum of its parts’. The gestalt laws of organisation include the following (Feldman, 2014):
•Proximity. Objects closest together are perceived as belonging together. Have a look at the row of dots in Figure 8.8(a). Instead of seeing a row of single dots, you see a row of pairs of dots. Another example: you see a friend sitting on a bench next to a person who looks a little like your friend, but much older. When you walk over, you may say ’hello’ to the other person, thinking she is your friend’s mother, whereas in reality the two do not know each other at all. You grouped them into a meaningful unit as a result of proximity.
Figure 8.8 Diagram of gestalt laws of organisation
•Similarity. Things that look the same are grouped together. Have a look at the blocks and crosses in Figure 8.8(b). You form a cross with the blocks because they are similar in form.
•Closure. People close or ignore the gaps in objects to form a meaningful whole. Have a look at the triangle in Figure 8.8(c). Even though there are holes in the triangle, you complete them to form a meaningful whole.
Holt et al. (2012) suggest that we see the world in quite a stable and constant way — if we did not, we would have to rediscover the shapes and other visual aspects of objects each time we encountered them. Much like the way in which the gestalt psychologists said we make sense of our world, the principles of perceptual constancy show that we use cues in the environment to keep our world predictable and stable. This stability exists in spite of the fact that the sensations we receive from the world are constantly changing. There are various kinds of constancy:
•Colour constancy. This is when the perception of a colour stays the same, although the image of it may change. For example, whether a friend sees you standing in bright sunlight, under a shady tree or in a dimly lit room, he/she still perceives your T-shirt as red. The perception of the colour stays the same (i.e. constant), even though the retinal image changes in different settings owing to varying levels of lighting.
•Size constancy. This refers to the fact that, even though an object gets smaller on the retina as you get further away from it, you know its size remains the same. For example, as a friend who is your height walks away from you, her size will decrease on your retina, but you know that she is not shrinking.
•Shape constancy. This is when the shape of something changes on the retina, but you know its shape remains the same. For example, when you look at a closed door, it is rectangular in shape. When it starts to open, the shape changes to a more trapezoid shape, but you know it is still a rectangle.
Figure 8.9 A door appearing to change shape as it opens
If we relied on bottom-up processing only, we would think that our friend’s red T-shirt was fading, that our other friend was shrinking, and that the door was changing shape right in front of our eyes. Top-down processing allows us to maintain perceptual constancy: we have learned things about our world through experience, and this learning enables us to make sense of changing stimuli.
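The shrinking-friend example of size constancy can be checked with basic geometry: the size of the retinal image depends on the visual angle an object subtends, which roughly halves each time the viewing distance doubles. A minimal sketch; the 1.7 m height and the distances are illustrative values.

```python
import math

# The retinal image size of an object depends on its visual angle, which
# shrinks as the object moves away even though its real size is fixed.
# Illustrative numbers: a friend 1.7 m tall viewed at increasing distances.

def visual_angle_deg(size_m, distance_m):
    """Visual angle (in degrees) subtended by an object of a given size."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

height = 1.7
for d in (2, 4, 8):
    print(f"{d} m away: {visual_angle_deg(height, d):.1f} degrees")

# Doubling the distance roughly halves the visual angle, yet size constancy
# means we still perceive the friend as 1.7 m tall at every distance.
```

This is exactly the changing sensory input that top-down knowledge corrects for: the angle shrinks, but our perception of the friend’s size stays constant.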
Depth perception is the ability to perceive the three-dimensional quality of our world. This is amazing, considering the fact that the retina is like a flat piece of paper and is therefore two-dimensional. We use both monocular and binocular cues from the environment to tell us about depth.
Monocular depth cues depend on one eye only. These cues are the ones most artists use in their artworks and include the following:
•Linear perspective. Parallel lines, such as the rails of a railway track, look as though they move closer together the further away they are.
•Patterns of light and shadow. These have been used by artists like Escher to create perception of depth.
•Relative size. If we see things that we know are similar in size, then we will know that the one that appears smaller is further away.
•Motion parallax. Things that are far away from you look as though they are moving more slowly than things that are closer to you (Holt et al., 2012).
Binocular depth cues depend on both eyes. They include the following:
•Convergence. Hold your finger in front of your face at arm’s length. Look at your finger as you slowly move it towards your face. Can you feel the muscles in your eyes start to tighten the closer the finger gets to your face? This information from the muscles goes to the brain to provide information about distance. The further away something is, the less tension is placed on the eye muscles.
•Retinal disparity. Hold your finger in front of your face again, about 20 cm away. Look at the finger with one eye open, then with the other eye open. Do you notice how your finger seems to jump from one side to the other as you focus on it with each eye? This is due to retinal disparity, as each eye is picking up a different picture of the finger. When the images from both eyes are put together, depth perception occurs.
Santrock (2003) notes that our perception is most often correct. However, when we pick up signals from the environment and come to an incorrect perception, this is called an illusion. See Box 8.4 for some examples of illusions.
Bi-stable images are not illusions, but they do illustrate how our visual perception can change spontaneously to alter what we perceive. A bi-stable image is a two-dimensional image that can be perceived in two different ways, though never both at the same time: our perception switches from one to the other. Figure 8.10 gives an example of a bi-stable image.
Figure 8.10 An example of a bi-stable image: a wine goblet versus two facing profiles
The following are two examples of illusions:
•The Müller-Lyer illusion: Look at Figure 8.11 (a) and (b). If asked to judge which line is longer, most people say the one with the inverted arrows on the end. However, both lines are the same length.
•The Ponzo illusion: Which line in Figure 8.11 (c) is longer, the one at the top or the bottom of the converging lines? Because we think of converging lines as representing distance, we see the furthest line as longer, but they are both the same length.
Figure 8.11 The Müller-Lyer illusion (a) and (b), and the Ponzo illusion (c)
8.5 VISUAL PERCEPTUAL DISORDERS — A CASE STUDY
Although a variety of spatial disorders can result from damage to the brain, an interesting specific case was recorded by Luria (1972); this referred to a patient called Zazetsky.
Zazetsky was injured in the left parietal region by a bullet during World War II and after this, suffered from many deficits in his spatial ability. Luria recorded Zazetsky’s personal descriptions of them.
One area of Zazetsky’s difficulties was with the verbal labels left and right. Zazetsky was examined by a doctor to test his vision, and was asked to tell in which direction a semicircle was facing — a question that he should have easily been able to answer. As he described this experience, he simply could not begin to think of an answer, and merely looked at her, so that she became annoyed with him: ’Why don’t you answer? Which direction is the semicircle pointing — to the right or the left?’ It was only then that he understood what she was asking:
I looked at the semicircle but couldn’t judge since I didn’t know what left or right meant … I could see [the semicircle] … it was so clear you couldn’t miss it. But I didn’t understand the doctor’s question … I just sat and stared at the figure but wasn’t able to answer her since I didn’t know what the words meant (Luria, 1972, p. 54).
Zazetsky also seemed to have forgotten the shapes (letters) that we normally use to represent particular sounds. He was quite familiar with the sound ’b’, could say it and knew that it was the sound that starts the word ’boat’. However, he had great difficulty remembering the shape that we normally make on a page to represent the sound ’b’ — the ability to recall that it consists of one vertical line and a curved line: b. He recalled, ’Even after I thought I knew the letters, I couldn’t remember how they were formed. Each time I wanted to think of a particular letter I’d have to run through the alphabet until I found it’ (Luria, 1972, p. 72). It seems that Zazetsky had suffered a disturbance of the ability to generate and recognise even individual letter shapes.
Perceptual deficits of vision
A deficit of perception, in which recognition fails even though basic sensation is intact, is called an agnosia. Visual-object agnosia occurs when people fail to recognise all types of visual objects. Recent work suggests that recognition problems can also be restricted to particular categories, such as animate or inanimate objects, scenes, body parts and even particular types of emotional expression, such as fear. The case of Mr S illustrates what happens when there is a deficit in perception and a visual agnosia results. Even though Mr S’s sensory processing abilities were intact, his ability to process information into a meaningful whole, such as words, was disordered.
Another agnosia is prosopagnosia, where people cannot recognise faces (Sternberg, 2004). The patient cannot map the new perceptual experience of seeing someone’s face (such as their mother) onto the memory they have for that person’s face. Like other forms of agnosia, the basic perceptual abilities are normal, or near normal. They can still judge where objects are, as well as their size and shape. In fact, patients with prosopagnosia generally have such good visual abilities that they can tell that they are looking at a face, and can usually point to the various parts of the face (the eyes, the mouth, etc.), but cannot say who they are looking at. This failure to recognise familiar people can include the faces of famous persons, friends, members of their own family, and sometimes even their own face viewed in a mirror.
However, patients with prosopagnosia do not struggle to name people, because they use other information to recognise and then name a person. For example, while they may not recognise their brother by sight, they may hear his voice and recognise him from that.
Patients with prosopagnosia may still recognise other kinds of objects and can still read, and this finding supports the theory that prosopagnosia is a specific visual disorder. This has led to the suggestion that faces may be a special category requiring a special brain region.
•Visual signals are processed at different levels in the brain; lower-level tasks are processed in the primary visual areas, while higher-level tasks are processed in the secondary and tertiary visual areas. Damage to the primary visual cortex results in cortical blindness.
•The secondary visual areas allow us to see objects, colour and movement, while the tertiary visual areas work with other senses to allow us to write and do arithmetic, among others.
•Two theories of visual perception refer to top-down and bottom-up processing.
•In bottom-up processing, we use feature detectors to take the elements of a visual stimulus and integrate the information to form an image. In top-down processing, we use our past experience and learning to perceive the image.
•According to the gestalt school, we use a set of principles to perceive forms; these principles include proximity, similarity and closure.
•We also have the ability to perceive the world in a stable manner even when the visual input changes. Kinds of visual constancy include colour, size and shape constancy.
•Depth perception allows us to perceive the three-dimensional quality of our world. This process uses both monocular and binocular depth cues.
•Visual illusions occur when we pick up signals from the environment and come to an incorrect perception.
•People can suffer from various perceptual deficits, called agnosias. These include visual object agnosia (failure to recognise objects) and prosopagnosia (failure to recognise faces).
•Visual perception may also be affected by disorders of spatial ability. Perceptual disorders are commonly caused by brain lesions in specific areas.
After vision, hearing is probably the sense on which we rely the most. The energy signals that come from the environment to our ears are in the form of sound waves.
Sound waves are pressure waves. Think about a very large speaker: when someone increases the volume of a speaker to a very loud level, objects in front of the speaker move as the air moves (Holt et al., 2012).
Sound has three characteristics (see Figure 8.12):
•Amplitude. This is the size (height) of the sound wave. Amplitude determines loudness and is measured in decibels (dB). For example, the sound waves produced by a man shouting would be much larger than those produced by a child whispering.
•Frequency. This is the number of waves that occur per second. This is measured as cycles per second or hertz (Hz). When you increase the number of cycles per second, the pitch of the sound increases (Holt et al., 2012). The sound produced by a high-pitched whistle would have more cycles per second than the sound produced by a big bass drum.
•Timbre. This relates to the quality of the sound. For example, the notes on a piano would have a different quality to an explosion (Sternberg, 2004).
Figure 8.12 Illustration of amplitude and frequency waves (adapted from Sternberg, 2004, p. 150)
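Amplitude and frequency can be illustrated with a toy pure-tone model: a sound wave as a sine function whose height sets loudness and whose cycle rate sets pitch. A minimal sketch in plain Python; the sample rate and tone values are arbitrary choices made for illustration.

```python
import math

def sample_tone(amplitude, freq_hz, duration_s=1.0, sample_rate=8000):
    """Return pressure samples of a sine-wave tone."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n)]

def zero_crossings(samples):
    """Count sign changes; a tone of f Hz gives roughly 2*f per second."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

quiet_low = sample_tone(amplitude=0.2, freq_hz=100)   # soft, low-pitched
loud_high = sample_tone(amplitude=1.0, freq_hz=1000)  # loud, high-pitched

print(max(loud_high) > max(quiet_low))                        # True: louder
print(zero_crossings(loud_high) > zero_crossings(quiet_low))  # True: higher
```

Raising the amplitude stretches the wave vertically (louder); raising the frequency packs more cycles into each second (higher pitch), exactly the two properties contrasted in Figure 8.12.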
The structure of the ear
Figure 8.13 shows the structure of the ear. The ear is divided into the outer, middle and inner ear. The route that sound waves take through the ear can be explained in a step-by-step process, given on the next page.
The outer ear
Step 1: The sound waves are collected by the pinna, the outer part of the ear, i.e. the part that you can see.
Step 2: The sound waves then move down the auditory canal to the eardrum.
Step 3: The sound waves make the eardrum vibrate. Higher frequencies lead to faster vibrations.
The middle ear
Step 4: The middle ear has three bones that collect the vibrations.
Step 5: These bones (the malleus, incus and stapes) amplify the vibrations and send them to the inner ear, or cochlea.
The inner ear
Step 6: The vibrations reach the oval window, which is the start of the cochlea.
Step 7: The cochlea is made up of three channels separated by membranes. One of these is the basilar membrane, which carries small hairs. These hairs float in the fluid of the cochlea and are our auditory receptors; the vibrations move them.
Step 8: The movement of the hairs starts the electrochemical message (neural transmission) that is then sent to the brain.
The pathway to the brain
The electrochemical message from the cochlea is sent via the auditory nerve to the brain. Sternberg (2004) notes that this auditory pathway runs to the medulla oblongata, then to the midbrain, through the thalamus, and finally to the auditory cortex (in the temporal lobes).
Theories of hearing
The theory explaining how we hear loudness
Holt et al. (2012) explain that loudness is transmitted to the auditory nerve in two ways:
1. Loud sounds have a high amplitude. This high-amplitude sound wave makes the hair cells on the basilar membrane bend more, causing the neurons to fire at a higher rate and thus registering that it is a loud sound.
2. Some neurons have a higher threshold for firing. A high-amplitude sound wave will cross this threshold and make these specific neurons fire, and their firing signals that the sound is loud.
Figure 8.13 The structure of the ear
Theories explaining how we hear pitch
According to Holt et al. (2012), place theory states that we hear pitch because the vibrations caused by each frequency make a specific place on the basilar membrane vibrate. For example, high-frequency waves cause the area close to the oval window in the cochlea to vibrate, while low-frequency sounds cause the basilar membrane at the end of the cochlea to vibrate.
However, because the vibration starts at the oval window and travels along the whole cochlea, low-frequency sounds do not produce a sharply localised peak of vibration. Place theory therefore does not explain well how we hear low-frequency sounds, and other mechanisms need to be considered.
Frequency theory attempts to address the shortcomings of place theory. According to frequency theory, our ability to distinguish different pitches is related to the number of times the auditory nerve fires. The nerve will fire more often for higher sounds than for lower ones (Holt et al., 2012). This theory is useful in that it can explain low-frequency sounds quite well.
However, our neurons can fire at most about 1 000 times per second, yet we can hear pitches of up to 20 000 Hz. Frequency theory cannot explain this on its own.
According to Sternberg (2004), researchers have proposed the volley principle to explain how we can hear these very high sounds. They argue that neurons work together when they are stimulated by high-frequency sounds. When the vibration of such a high-frequency sound enters the cochlea, the neurons act as a cooperative group, taking turns to fire: while one neuron is resting, another fires. In this way, a very rapid combined firing rate is possible. You can think of it as a row of men filling a hole with sand using spades. If one man were doing it, the process would be very slow. However, if they all stand in a line, one man can throw sand into the hole while another lifts more; when the first man turns to fetch more sand, the second throws his in, and so on.
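The take-turns arrangement can be sketched as a simple rotation. A minimal sketch; the 1 000 spikes-per-second ceiling is the approximate figure quoted above, and the scheduling function is a deliberate idealisation, not a model of real neural timing.

```python
# Sketch of the volley principle. A single auditory neuron can fire at most
# about 1 000 times per second, yet we hear pitches up to 20 000 Hz. If a
# group of neurons takes turns (fires in volleys), each cycle of the sound
# wave is covered by some neuron while the others recover.

MAX_RATE = 1000  # approximate maximum spikes per second for one neuron

def neurons_needed(tone_hz):
    """Smallest team of alternating neurons that can follow tone_hz."""
    return -(-tone_hz // MAX_RATE)  # ceiling division

def volley_schedule(tone_hz, n_neurons):
    """Assign each wave cycle (one second's worth) to a neuron in rotation."""
    return [cycle % n_neurons for cycle in range(tone_hz)]

team = neurons_needed(4000)
schedule = volley_schedule(4000, team)
per_neuron = schedule.count(0)  # cycles handled by neuron 0
print(team, per_neuron)         # -> 4 1000: each neuron stays within its limit
```

A 4 000 Hz tone is far beyond any single neuron, but a team of four, each firing 1 000 times per second in rotation, jointly matches the frequency, just like the row of men with spades.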
It appears that some combination of place theory and frequency theory is necessary to explain pitch.
Animals are often much better than humans at identifying the direction of sounds. A dog’s ears, for example, are shaped so that they form a tunnel for sounds to travel down. The ears trap the sound, and the dog can move its ears alone, instead of its whole head, to pinpoint where the sound is coming from.
Human ears cannot move around in this way. Instead, humans rely on something called a sound shadow to locate a sound. To understand this concept, imagine you are in a house for the first time and a phone is ringing. You want to answer it, but you do not know where the phone is, so you must locate it by sound. Because the sound has to travel an extra distance to reach the ear further from the phone, the nearer ear hears the ringing slightly earlier. In addition, your head partially blocks the sound waves travelling to the far ear, casting a ’shadow’ that lowers the intensity of the sound reaching it. Our brain uses these two pieces of information, the small time delay and the slight drop in intensity, to work out the direction of the sound.
It is more complicated if the sound reaches both ears at the same time because there is no shadow to tell us where the sound is. When this happens, people tend to move their heads one way or the other to create a sound shadow deliberately.
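The arithmetic behind the sound shadow is small but concrete. A minimal sketch; the head width of 0.18 m and the speed of sound of 343 m/s are typical approximate values assumed for illustration.

```python
# Rough arithmetic behind the sound shadow: a sound from one side must travel
# an extra distance (up to roughly the width of the head) to reach the far
# ear. Both constants below are typical approximate values, not exact data.

SPEED_OF_SOUND = 343.0  # metres per second in air (approximate)
HEAD_WIDTH = 0.18       # metres, a typical adult head (assumption)

def interaural_delay_s(extra_path_m):
    """Time difference between the two ears for a given extra path length."""
    return extra_path_m / SPEED_OF_SOUND

max_delay = interaural_delay_s(HEAD_WIDTH)
print(f"Maximum delay: {max_delay * 1000:.2f} ms")  # about half a millisecond
```

Even though the maximum delay is only about half a millisecond, the auditory system is sensitive enough to use it, together with the intensity difference, to judge direction.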
•After vision, hearing is probably the sense we depend on the most. The energy signals that come from the environment to our ears are in the form of sound waves, which are pressure waves.
•Sound has three characteristics: amplitude, frequency and timbre.
•The structure of the ear is quite complex. The outer ear consists of the pinna and the eardrum. The middle ear contains the small bones (malleus, incus and stapes). The inner ear consists of the cochlea which starts at the oval window and is made up of three channels separated by membranes. The basilar membrane carries small hairs which are our auditory receptors.
•Sound waves are collected by the pinna and make the eardrum vibrate. The vibrations are passed to the small bones which amplify them and pass them to the oval window. The vibrations cause the hairs on the basilar membrane to move and this starts the electrochemical message that is then sent to the brain.
•The message is sent via the auditory nerve along a complicated route to the auditory cortex in the temporal lobes.
•Loud sounds have a higher amplitude that triggers specific neurons to fire.
•Pitch theories include place and frequency theory.
•Humans locate sounds using the sound shadow, which refers to the small differences in arrival time and intensity between a signal reaching our two ears.
Taste is called a chemical sense because a substance must be dissolved in something in order for neural transmission to occur. In other words, in order for us to taste something, some of its molecules need to dissolve in our saliva. Then we are able to taste sweetness, bitterness, saltiness and/or sourness. An additional taste, umami, a pleasant savoury taste caused by glutamate and ribonucleotides, is sometimes added to this list.
Sternberg (2004) notes that while our threshold for taste is low, the just noticeable difference (JND) is often very high. Imagine sitting blindfolded while someone places different things in your mouth to taste. People often identify the taste incorrectly or cannot detect when the taste has changed slightly.
The tongue has thousands of taste buds (see Figure 8.14) that only last about ten days as more taste buds are created continuously. Each taste bud has a finger-like extension at the top that is sensitive to the chemicals surrounding it. When this protrusion detects a chemical, it sends a message to the brain by making the neuron fire. The message travels from the neurons to the thalamus and then to the somatosensory cortex. Some of the information also goes to the hypothalamus and the limbic system (two areas of the brain involved in emotion).
Our sense of taste is not very sophisticated or sensitive. When we get a cold and our noses become blocked, our sense of taste seems almost to disappear. This is because much of what we taste depends on being able to smell it. This relationship is part of perceiving flavour, which is a blend of taste and smell; in addition, touch and heat/ cold contribute to our experiences of eating. For example, some food is soft or slimy; other food is hard or crunchy. Some food (like chocolate) melts in the warmth of the mouth, contributing to the pleasant experience of eating it.
Like taste, smell is a chemical sense. We smell something when molecules in the air dissolve in the mucus in the nose (see Figure 8.15). Molecules in the air enter the nose through the nostrils when we inhale. They are then transferred to the olfactory epithelium, the mucus-secreting membrane of the nose, which lies just below and behind the eyes. On the olfactory epithelium, the molecules activate the olfactory receptor cells. These cells last only four to eight weeks.
Figure 8.14 The structure of the tongue and the pathway to the brain
Figure 8.15 The structure of the nose and pathway to the brain
When the receptor cells are activated, the specialised neurons fire. The information from the neurons then comes together in the olfactory nerve, which leaves the nose and enters the brain through the skull. The olfactory nerve goes straight to the olfactory bulb, which lies beneath the frontal lobes; from here, information passes to the olfactory cortex in the temporal lobes. This direct route from a sensory organ to the brain is unique: information from the other senses must travel from the sense organs to the thalamus, which is like a conductor telling the information where to go, and only then to the area of the brain responsible for processing.
Some information also goes to the hypothalamus and the limbic system, which is possibly why smells often elicit memories and emotions.
We need our sense of smell to be able to taste a full range of food. Anosmia, the loss of the sense of smell, is a rare condition that can occur after a head injury. People with anosmia often report a lack of interest in food, as their ability to smell and taste it is reduced. Our sense of smell also declines with age.
Compared with that of many other animals, our sense of smell is not very good.
We feel pressure and temperature largely through the skin, which is our primary organ of touch. There are specialised receptor cells in the skin. Some parts of our bodies have more receptors than others and thus are more sensitive. For example, there are many more receptors in your fingers compared to your back (Holt et al., 2012).
Intense, potentially harmful pressure or temperature is experienced as pain. The ability to feel pain serves an important survival function. Imagine you could not tell when you were feeling pain. You might touch a hot stove plate and only realise it when you smelt something burning. Pain is our body’s way of telling us that we are in danger. Yet there are a few extremely rare cases of people who cannot perceive a painful stimulus, and of people who can perceive a painful stimulus but are unable to react to it. This condition is known as congenital analgia. People with this condition, particularly when they are young, are extremely vulnerable to serious injury. However, the vast majority of people experience pain, and it is exceptionally distressing to them.
The perception of pain may be influenced by cultural or situational factors. For example, soldiers have reported not feeling very severe injuries on the battlefield, yet the same types of injury received in a surgical procedure were perceived as far more painful (Beecher, 1959, as cited in Holt et al., 2012).
This can be explained by the gate control theory of pain (Melzack & Wall, 1965; Wall & Melzack, 1989). According to this theory, receptors in the skin send a message to the brain when they are activated, causing one to feel pain. This message opens the gate to the brain. However, we have other receptors that can close the gate and so reduce the pain. This can be done in two ways:
1.Create impulses that take over the pain pathway. (This happens when you hurt yourself and then rub the site; the rubbing overwhelms the pain pathway and alleviates the pain. Based on this approach to pain relief, midwives recommend hot baths and back rubs for women in labour.)
2.Shut the gate through psychological means. (This is what happened to the soldiers to whom we referred earlier. The soldiers’ perception of pain may have been influenced by the fact that they were relieved to be alive; as a result they did not notice the pain.)
The kinaesthetic sense
Sternberg (2004) describes kinaesthesia as the sense that monitors the body’s position by noting the skeleton’s position and movement. The body is able to do this because it has receptors in the joints, muscles, tendons and skin that monitor movements of the skeleton. The neural impulses created from this movement go to the brain. Specifically, information travels to the somatosensory cortex and the cerebellum, which are responsible for coordinated movement. We need our kinaesthetic sense in order to move well and to maintain our balance. This, together with the vestibular sense, is used, for example, by a person doing a handstand.
The vestibular sense
The vestibular sense is responsible for our sense of balance and resides in the inner ear. Our ears have semi-circular canals, which are three fluid-filled tubes in the inner ear (Holt et al., 2012). You can think of these as three bottles of water that lie on their side. As you walk, the fluid moves from side to side as the head changes angle or rotates. Next to the semi-circular canals are the vestibular sacs, which contain small crystals called otoliths. These are responsible for sensing the movement of our bodies when we move forwards or backwards, fast or slow, and up or down.
•Taste is a chemical sense as a substance must be dissolved in saliva or other mucus for neural transmission to occur. We are able to taste sweetness, bitterness, saltiness and sourness, as well as umami.
•People often struggle to distinguish between similar tastes; this relates to the just noticeable difference.
•The tongue has thousands of taste buds that last only about ten days. When the protrusion on the taste bud detects a chemical, the neuron sends a message to the thalamus and then to the somatosensory cortex, and also to the hypothalamus and the limbic system.
•The senses of taste and smell are closely related as smell is also a chemical sense.
•Molecules in the air reach the olfactory epithelium’s receptor cells. These cells last four to eight weeks only.
•The information goes to the olfactory nerve, which passes through the skull to the olfactory bulb beneath the frontal lobes. This differs from messages from the other senses, which go through the thalamus. As with taste, some of the information also goes to the hypothalamus and the limbic system.
•Anosmia occurs when a person loses their sense of smell.
•The sense of touch involves feeling pressure and temperature, largely through the skin. Pain is intense negative pressure or temperature. Perception and context play a major role in the experience of pain. Gate control theory says that people can disrupt the pain pathway and/ or can alter their perception of pain through psychological means.
•The kinaesthetic sense monitors the body’s position in space through receptors in the joints, muscles, tendons and skin. It works with the vestibular sense, which depends on the fluid-filled semi-circular canals and on small crystals (otoliths) in the vestibular sacs of the inner ear. These senses help us maintain our balance and to move in a coordinated way.
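The just noticeable difference mentioned in the summary above follows Weber’s law (the First Law of Psychophysics): a change in a stimulus is noticed only when it is a large enough proportion of the original intensity. A minimal sketch in Python; the Weber fraction k = 0.1 used here is purely illustrative, not a measured value for any particular sense:

```python
# Weber's law sketch: a change is noticed when the ratio delta/intensity
# reaches the Weber fraction k. The value k = 0.1 is an illustrative
# assumption, not a measured constant for any sensory modality.
def is_noticeable(intensity, delta, k=0.1):
    """Return True if the change `delta` is a noticeable proportion of `intensity`."""
    return delta / intensity >= k

# The same absolute change is noticed against a weak stimulus ...
print(is_noticeable(10, 2))   # True: 2/10 = 0.2 is above k
# ... but not against a strong one.
print(is_noticeable(100, 2))  # False: 2/100 = 0.02 is below k
```

This proportional rule is why a radio that is already playing loudly must be turned up much further before a listener hears any difference in volume.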
This chapter has shown the importance of sensation and perception to our psychological functioning. As Sternberg (2004, p. 165) states, ’when [our senses] are damaged or lacking, life as we know it is radically different’.
Psychophysics is the study of how physical energy stimulates the sensory organs, resulting in meaningful psychological experience. In terms of vision, in the last few decades researchers have demonstrated that there is a great deal of specificity in the way the visual system seems to be designed, which is most notable in the specialisation of individual brain regions. The chapter has also discussed how visual perception works, including form perception, perceptual constancy, depth perception and visual illusions, as well as noting the various disorders of visual perception. After vision, hearing is the most important of the senses for humans. A number of theories have attempted to explain how we hear loudness and pitch. The last part of the chapter gave an overview of the remaining senses: taste, smell and touch, as well as the kinaesthetic and vestibular senses.
absolute threshold: the minimum amount of energy required for an organism to detect a stimulus
adaptation: the process whereby we start to block out a stimulus that constantly surrounds us
after-image: an image that remains even when the stimulus is no longer presented
agnosia: deficit of perception
agraphia: the condition whereby a person cannot write
alexia: the condition whereby a person cannot read
amplitude: the size of sound or light waves
anosmia: the condition when a person has lost the sense of smell
auditory nerve: the nerve that takes signals from the ear to the brain
auditory receptors: hairs in the inner ear that receive the amplified vibrations and send a neural impulse to the brain
binocular depth cues: the signs that we sense through two eyes and use to perceive depth
blind spot: the place where the optic nerve leaves the eye, meaning that no photoreceptors can occupy this space
brightness: a property of colour that is determined by the amplitude of the light wave, which is the amount of light we see coming from the wavelengths
bottom-up processing: processing that starts with the smallest, individual elements of a stimulus and proceeds to more complex elements
chromosomes: units of genetic information on which the design of a person’s body is based
closure: a gestalt law of organisation whereby people ’close’ or ignore the gaps in objects to form a meaningful whole
colour constancy: a feature of perceptual constancy whereby the perception of a colour stays the same or constant even though the image on the retina is not as bright owing to different levels of lighting
cones: cone-shaped photoreceptors that are sensitive to colour
congenital analgia: the condition whereby people are unable to perceive pain or unable to react to the pain they perceive
cornea: the transparent curved covering on the outside of the eye
cortical blindness: when the first line of processing of visual input is disrupted and people experience this as ’blindness’
depth perception: the ability to perceive the three-dimensional quality of our world
difference threshold: the threshold, or ’the line one has to cross’, in order to tell when stimulus A is different from stimulus B
eardrum: the membrane inside the outer ear that sound waves cause to vibrate
feature detectors: neurons in the primary visual cortex that only respond to certain visual stimuli, such as horizontal lines
feature-detection theory: a theory of visual perception which proposes that specialised neurons respond to specific features of a visual stimulus (such as lines and edges), and that these features are combined to produce a whole percept
First Law of Psychophysics: whether a change in a stimulus is noticed depends on the proportion by which the stimulus has changed
fovea: the area of the retina where the best visual acuity or best vision occurs
frequency: a characteristic of sound that refers to the number of waves that occur per second
frequency theory: a theory that states that our ability to distinguish different pitches is related to the number of times the auditory nerve fires
gate control theory: a theory that states that our experience of pain can be reduced if receptors carrying different messages block the path of the pain to the brain
hue: a property of colour that is determined by the wavelength of light
illusion: an incorrect visual perception
iris: the coloured band of circular muscles that surrounds the pupil
just noticeable difference (JND): the level at which people will notice a difference in stimuli 50 per cent of the time
kinaesthesia (the kinaesthetic sense): the sense that monitors the body’s position by noting the skeleton’s position and movement
lens: a transparent structure behind the pupil that focuses light onto the retina
lesion: damage to an area of the brain
monocular depth cues: the signs that we sense through one eye and use to perceive depth
olfactory epithelium: the membrane of the nose that secretes mucus
opponent-process theory: a theory of colour vision that states that we have neurons in our retina that are able to process three pairs of colours, namely: red-green, yellow-blue and black-white
optic nerve: the nerve that carries signals from the eye to the brain
otoliths: small crystals in the vestibular sacs of the inner ear responsible for sensing the movement of our bodies when we move forwards or backwards, fast or slow, and up or down
pain: intense negative pressure or temperature
perception: a process that entails actively choosing information from sensation, organising it and interpreting it to make meaning of our world
photoreceptor cells: the rod-shaped cells and the cone-shaped cells in the retina that change the electromagnetic energy of light into electrochemical energy (the neural impulse) that can be relayed to the brain
place theory: a theory that states that we hear pitch because the vibrations caused by each frequency make a specific place on the basilar membrane vibrate
primary visual areas (primary visual cortex): the area of the brain to which information travels almost directly from the retina
prosopagnosia: a condition where people cannot recognise faces
proximity: a gestalt law of organisation where those objects closest together are perceived as belonging together
psychophysics: a field in psychology that studies sensations, their limits and how they are perceived
pupil: the opening in the eye that controls the amount of light that is let through
retina: a very thin part of the eye that contains all the cells that pick up light
rods: rod-shaped photoreceptors that are sensitive to black and white
saturation: a property of colour that is determined by how pure the colour appears or how much it has been combined with white
secondary visual areas: a more psychologically sophisticated system than the primary visual areas, dedicated to a range of specialised visual-processing tasks
sensation: a passive process during which the sensory receptors and the brain receive information from the environment
sensory modalities: our sensory abilities, such as hearing, seeing, feeling, smelling and tasting
shape constancy: the feature of perceptual constancy that refers to when the shape of something changes on the retina, but we perceive the shape of the real object as remaining stable
signal detection theory: a theory of stimulus detection that says that noticing a signal depends on many factors besides its physical intensity
similarity: a gestalt law of organisation where things that look the same are grouped together
size constancy: a feature of perceptual constancy where, even though an object gets smaller on the retina as it becomes further away, the person looking at it knows its size remains the same
sound shadow: a feature of hearing where the ear closest to a noise will hear the sound first (and at a slightly higher intensity), and a person’s head will block the sound waves travelling to the other ear to a certain degree, causing a ’shadow’ and a lowering of the intensity of the sound
spatial ability: the ability to be aware of where something is
synaesthesia: the condition where different sense experiences overlap
tertiary visual areas: the highest level of the visual system that operates the most abstract and psychologically sophisticated aspects of visual processing
threshold: the level of energy that a stimulus must have in order for you to perceive it
timbre: a characteristic of sound that relates to the quality of the sound
top-down processing: processing that starts with the highest level or ’whole’ of a stimulus and moves to the more basic elements
transduction: a process whereby energy signals turn into an electrochemical impulse
trichromatic theory: a theory of colour vision that proposes that red, blue and green light form the basis of every possible colour of light we can think of, and that the photoreceptors in the eye are specialised to pick up these primary colours of light
vestibular sense: our sense of balance that resides in the inner ear
vision: the ability to sense and make meaning from the light waves from the environment that enter the eye and are sent to the brain
visual acuity: a person’s ability to see the fine detail of objects, to see objects at different distances and to distinguish between objects in the environment
visual-object agnosia: a condition that occurs when people can sense the visual field, but cannot identify or put a name to an object
Multiple choice questions
1.Which one of the following statements about sensation and perception is false (wrong)?
a)Overall sensation is a passive process and perception is an active process.
b)Perception is always an objective process of stimulus input and processing.
c)Sensation occurs when specific energy signals hit specific sensory cells.
d)We make meaning of our world through perception.
2.The minimum amount of energy required for you to detect a stimulus is called:
b)the best threshold
c)the difference threshold
d)the absolute threshold.
3.Melinda is taking part in a psychology experiment. She has been asked to tell the experimenter when she sees a light on the screen in front of her. If she says ’yes’ when no light appears, this is called a:
4.Imagine that you live in the middle of a big city for most of the year, but for one week of the year you leave the city to visit family in a rural area with only a few houses. When you return to the city, you notice a strong smell of car fumes. You comment to your friend that the city smells very bad all of a sudden, but she does not know what you are talking about. This experience can be explained by:
b)the just noticeable difference
5.In the retina, rods are responsible for __________, while cones are responsible for __________.
a)subjective experience; objective experience
b)colour vision; black-and-white vision
c)day vision; colour vision
d)dim-light vision; bright-light vision.
6.You see a friend walking towards you. As he gets closer, the image on your retina gets bigger and bigger, but you do not get a fright and think that your friend is turning into a giant. This is because of:
7.Place theory of hearing states that:
a)we hear pitch depending on the number of times the auditory nerve fires
b)the volley principle is important
c)pitch depends on where the basilar membrane vibrates
d)the sound shadow helps us to locate sounds.
8.The sense of smell is unique because the olfactory pathway to the brain:
a)goes through the limbic system
b)goes through the thalamus
c)goes straight to the olfactory bulb in the brain
d)does not enter the brain.
9.Taste and smell are different to hearing and vision because:
a)taste and smell are less important
b)taste and smell are chemical senses
c)taste and smell are connected to survival
d)taste and smell have no JND.
10.You are feeling very sick with flu. You have a blocked nose and feel light-headed. The doctor says you may have an ear infection that causes you to feel light-headed because:
a)the vestibular sense resides in the inner ear
b)no blood is getting to your brain
c)you cannot see properly because your eyes are watering
d)the vestibular sense is connected to the nose.
1.Explain to a friend why he has to turn the volume knob of the radio up more to hear a difference in the sound when it is already loud. In your answer explain the JND theory.
2.Compare and contrast the trichromatic and opponent-process theories of colour vision.
3.In hearing, is place theory more correct than frequency theory? Explain your answer.
4.Your friends are arguing about whether men or women experience more pain. What would you tell them about the sensation and perception of pain?
REFERENCES FOR PART 4
Bourne, L. E. & Russo, N. F. (1998). Psychology: Behaviour in context. New York: Norton.
Burgess, C., O’Donohoe, A. & Gill, M. (2000). Agony and ecstasy: A review of MDMA effects and toxicity. European Journal of Psychiatry, 15, 287-294.
Carlson, N. R. (2005). Foundations of physiological psychology (6th ed.). Boston, MA: Pearson.
Chakravarty, V. S., Joseph, D. & Bapi, R. S. (2010). What do the basal ganglia do? A modelling perspective. Biological Cybernetics, 103(3), 237-253.
Coon, D. & Mitterer, J. O. (2013). Introduction to psychology: Gateways to mind and behavior (13th ed.). Independence, KY: Wadsworth/Cengage Learning.
Davey, G. (Ed.). (2004). Complete psychology. London: Hodder & Stoughton.
Feldman, R. S. (2014). Essentials of understanding psychology (11th ed.). Boston, MA: McGraw-Hill.
Gauthier, I. & Nelson, C. A. (2001). The development of face expertise. Current Opinion in Neurobiology, 11, 219-223.
Harvey, P. D. (2012). Clinical applications of neuropsychological assessment. Dialogues in Clinical Neuroscience, 14(1), 91-99.
Holt, N., Bremner, A., Sutherland, E., Vliek, M., Passer, M. & Smith, R. (2012). Psychology: The science of mind and behaviour (2nd ed.). London: McGraw-Hill.
Kalat, J. W. (2001). Biological psychology (7th ed.). Belmont, CA: Wadsworth/Cengage Learning.
Kalat, J. W. (2009). Biological psychology (10th ed.). Belmont, CA: Wadsworth/Cengage Learning.
Luria, A. R. (1972). The man with a shattered world. Cambridge MA: Harvard University Press.
Luria, A. R. (1973a). The working brain. New York: Basic Books.
Luria, A. R. (1973b). The man with a shattered world. New York: Basic Books.
Luria, A. R. (1979). The making of mind: A personal account of Soviet psychology. Cambridge, MA: Harvard University Press.
Martin, G. N. (2006). Human neuropsychology (2nd ed.). Harlow, Essex: Pearson.
Melzack, R. & Wall, P. D. (1965). Pain mechanisms: A new theory. Science, 150, 971-979.
Montgomery, G. (2005). Colour blindness: More prevalent among males. Seeing, hearing, and smelling the world. Retrieved June 21, 2005 from www.hhmi.org/sesnes/b130.html.
Numan, M. & Insel, T. R. (2003). The neurobiology of parental behavior. New York: Springer.
Ogden, J. A. & Corkin, S. (1991). Memories of H.M. In W. C. Abraham, M. Corballis, & K. G. White (Eds), Memory mechanisms: A tribute to G.V. Goddard (pp. 195-215). Hillsdale, NJ: Lawrence Erlbaum.
Ole-Herman, B. (2004). Measure speech intelligibility with a sound level meter. Sound and vibration. Retrieved July 20, 2005 from www.findarticles.com/p/articles/mi_qa4075/is_200410/ai_n9469295/print.
Parrott, A., Morinan, A., Moss, M. & Scholey, A. (2004). Understanding drugs and behaviour. Chichester: John Wiley.
Peterson, C. (1997). Psychology: A biopsychosocial approach (2nd ed.). New York: Longman.
Santrock, J. W. (2003). Psychology (7th ed.). Boston, MA: McGraw-Hill.
Shenker, J. I. (2005). Teaching biology in a psychology class. Retrieved June 4, 2005 from http://www.psychologicalscience.org/observer/getArticle.cmf?id=1766.
Simner, J. (2012). Defining synaesthesia. British Journal of Psychology, 103, 1-15.
Solms, M. & Turnbull, O. H. (2002). The brain and the inner world: An introduction to the neuroscience of subjective experience. London: Karnac.
Sternberg, R. (2004). Psychology (4th ed.). Orlando, FL: Thomson.
Wall, P. D. & Melzack, R. (1989). Textbook of pain. New York: Churchill Livingstone.
Zillmer, E. A. & Spiers, M. V. (2001). Principles of neuropsychology. Belmont, CA: Wadsworth.