Subliminal: How Your Unconscious Mind Rules Your Behavior - Leonard Mlodinow, 2013

Sorting People and Things
The Social Unconscious

We would be dazzled if we had to treat everything we saw, every visual input, as a separate element, and had to figure out the connections anew each time we opened our eyes. —GARY KLEIN

IF YOU READ someone a list of ten or twenty items that could be bought at a supermarket, that person will remember only a few. If you recite the list repeatedly, the person’s recall will improve. But what really helps is if the items are mentioned within the categories they fall into—for example, vegetables, fruits, and cereals. Research suggests that we have neurons in our prefrontal cortex that respond to categories, and the list exercise illustrates the reason: categorization is a strategy our brains use to more efficiently process information.1 Remember Shereshevsky, the man with the flawless memory who had great trouble recognizing faces? In his memory, each person had many faces: faces as viewed from different angles, faces in varying lighting, faces for each emotion and for each nuance of emotional intensity. As a result, the encyclopedia of faces on the bookshelf of Shereshevsky’s brain was exceptionally thick and difficult to search, and the process of identifying a new face by matching it to one previously seen—which is the essence of what categorization is—was correspondingly cumbersome.

Every object and person we encounter in the world is unique, but we wouldn’t function very well if we perceived them that way. We don’t have the time or the mental bandwidth to observe and consider each detail of every item in our environment. Instead, we employ a few salient traits that we do observe to assign the object to a category, and then we base our assessment of the object on the category rather than the object itself. By maintaining a set of categories, we thus expedite our reactions. If we hadn’t evolved to operate that way, if our brains treated everything we encountered as an individual, we might be eaten by a bear while still deciding whether this particular furry creature is as dangerous as the one that ate Uncle Bob. Instead, once we see a couple of bears eat our relatives, the whole species gets a bad reputation. Then, thanks to categorical thinking, when we spot a huge, shaggy animal with large, sharp incisors, we don’t hang around gathering more data; we act on our automatic hunch that it is dangerous and move away from it. Similarly, once we see a few chairs, we assume that if an object has four legs and a back, it was made to sit on; or if the driver in front of us is weaving erratically, we judge that it is best to keep our distance.

Thinking in terms of generic categories such as “bears,” “chairs,” and “erratic drivers” helps us to navigate our environment with great speed and efficiency; we understand an object’s gross significance first and worry about its individuality later. Categorization is one of the most important mental acts we perform, and we do it all the time. Even your ability to read this book depends on your ability to categorize: mastering reading requires grouping similar symbols, like b and d, in different letter categories, while recognizing that renderings as disparate as a printed b, an italic b, and an ornately scripted b all represent the same letter.

Classifying objects isn’t easy. Mixed fonts aside, it is easy to underestimate the complexity of what is involved in categorization because we usually do it quickly and without conscious effort. When we think of food types, for example, we automatically consider an apple and a banana to be in the same category—fruit—though they appear quite different, but we consider an apple and a red billiard ball to be in different categories, even though they appear quite similar. An alley cat and a dachshund might both be brown and of roughly similar size and shape, while an Old English sheepdog is far different—large, white, and shaggy—but even a child knows that the alley cat is in the category feline and the dachshund and sheepdog are canines. To get an idea of just how sophisticated that categorization is, consider this: it was just a few years ago that computer scientists finally learned how to design a computer vision system that could accomplish the task of distinguishing cats from dogs.

As the above examples illustrate, one of the principal ways we categorize is by maximizing the importance of certain differences (the orientation of d versus b or the presence of whiskers) while minimizing the relevance of others (the curviness of an ornately scripted b versus a plain b, or the color of the animal). But the arrow of our reasoning can also point the other way. If we conclude that a certain set of objects belongs to one group and a second set of objects to another, we may then perceive those within the same group as more similar than they really are—and those in different groups as less similar than they really are. Merely placing objects in groups can affect our judgment of those objects. So while categorization is a natural and crucial shortcut, like our brain’s other survival-oriented tricks, it has its drawbacks.

One of the earliest experiments investigating the distortions caused by categorization was a simple study in which subjects were asked to estimate the lengths of a set of eight line segments. The longest of those lines was 5 percent longer than the next in the bunch, which, in turn, was 5 percent longer than the third longest, and so on. The researchers asked half their subjects to estimate the lengths of each of the lines, in centimeters. But before asking the other subjects to do the same, they artificially grouped the lines into two sets—the longer four lines were labeled “Group A,” the shorter four labeled “Group B.” The experimenters found that once the lines were thought of as belonging to a group, the subjects perceived them differently. They judged the lines within each group as being closer in length to one another than they really were, and the length difference between the two groups as being greater than it actually was.2

Analogous experiments have since shown the same effect in many other contexts. In one experiment, the judgment of length was replaced by a judgment of color: volunteers were presented with letters and numbers that varied in hue and asked to judge their “degree of redness.” Those who were given the color samples with the reddest characters grouped together judged those to be more alike in color and more different from the other group than did volunteers who appraised the same samples presented without being grouped.3 In another study, researchers found that if you ask people in a given city to estimate the difference in temperature between June 1 and June 30, they will tend to underestimate it; but if you ask them to estimate the difference in temperature between June 15 and July 15, they will overestimate it.4 The artificial grouping of days into months skews our perception: we see two days within a month as being more similar to each other than equally distant days that occur in two different months, even though the time interval between them is identical.

In all these examples, when we categorize, we polarize. Things that for one arbitrary reason or another are identified as belonging to the same category seem more similar to each other than they really are, while those in different categories seem more different than they really are. The unconscious mind transforms fuzzy differences and subtle nuances into clear-cut distinctions. Its goal is to erase irrelevant detail while maintaining information on what is important. When that’s done successfully, we simplify our environment and make it easier and faster to navigate. When it’s done inappropriately, we distort our perceptions, sometimes with results harmful to others, and even ourselves. That’s especially true when our tendency to categorize affects our view of other humans—when we view the doctors in a given practice, the attorneys in a given law firm, the fans of a certain sports team, or the people in a given race or ethnic group as more alike than they really are.

A CALIFORNIA ATTORNEY wrote about the case of a young Salvadoran man who had been the only nonwhite employee at a box-manufacturing plant in a rural area. He had been denied a promotion, then fired for habitual tardiness and for being “too easy-going.” The man claimed that the same could be said of others but that their tardiness went unnoticed. With them, he said, the employer seemed to understand that sometimes a sickness in the family, a problem with a child, or trouble with the car can lead to being late. But with him, lateness was automatically attributed to laziness. His shortcomings were amplified, he said, and his achievements went unrecognized. We’ll never know whether his employer really did overlook the Salvadoran man’s individual traits, lumping him into the general category “Hispanic” and then interpreting his behavior in terms of a stereotype. The employer certainly disputed that accusation. And then he added, “Mateo’s being a Mexican didn’t make any difference to me. It’s like I didn’t even notice.”5

The term “stereotype” was coined in 1794 by the French printer Firmin Didot.6 It referred to a type of printing process by which cookie-cutter-like molds could be used to produce duplicate metal plates of hand-set type. With these duplicate plates, newspapers and books could be printed on several presses at once, enabling mass production. The term was first used in its current sense by the American journalist and intellectual Walter Lippmann in his 1922 book Public Opinion, a critical analysis of modern democracy and the role of the public in determining its course. Lippmann was concerned with the ever-growing complexity of the issues facing the voters and the manner in which they developed their views on those issues. He was particularly worried about the role of the mass media. Employing language that sounds as if it was pulled from a recent scholarly article on the psychology of categories, Lippmann wrote, “The real environment is altogether too big, too complex, and too fleeting for direct acquaintance.… And although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage with it.”7 That simpler model was what he called the stereotype.

Lippmann recognized that the stereotypes people use come from cultural exposure. His was an era in which mass-circulation newspapers and magazines, as well as the new medium of film, were distributing ideas and information to audiences larger and more far-flung than had ever before been possible. They made available to the public an unprecedentedly wide array of experiences of the world, yet without necessarily providing an accurate picture. The movies, in particular, conveyed a vivid, real-looking portrait of life, but one often peopled by stock caricatures. In fact, in the early days of film, filmmakers combed the streets looking for “character actors,” easily identifiable social types, to play in their movies. As Lippmann’s contemporary Hugo Münsterberg wrote, “If the [producer] needs the fat bartender with his smug smile, or the humble Jewish peddler, or the Italian organ grinder, he does not rely on wigs and paint; he finds them all ready-made on the East Side [of New York].” Stock character types were (and still are) a convenient shorthand—we recognize them at once—but their use amplifies and exaggerates the character traits associated with the categories they represent. According to the historians Elizabeth Ewen and Stuart Ewen, by noting the analogy between social perception and a printing process capable of generating an unlimited number of identical impressions, “Lippmann had identified and named one of the most potent features of modernity.”8

[Image: People, categorized according to the animal they resemble. Courtesy of the National Library of Medicine.]

Though categorizations due to race, religion, gender, and nationality get the most press, we categorize people in many other ways as well. We can probably all think of cases in which we lumped athletes with athletes, or bankers with bankers, in which we and others have categorized people we’ve met according to their profession, appearance, ethnicity, education, age, or hair color or even by the cars they drive. Some scholars in the sixteenth and seventeenth centuries even categorized people according to the animal they best resembled, as pictured above, in images from De Humana Physiognomonia, a kind of field guide to human character written in 1586 by the Italian Giambattista della Porta.9

A more modern illustration of categorization by appearance played out early one afternoon in an aisle of a large discount department store in Iowa City. There, an unshaven man in soiled, patched blue jeans and a blue workman’s shirt shoved a small article of clothing into the pocket of his jacket. A customer down the aisle looked on. A little later, a well-groomed man in pressed dress slacks, a sports jacket, and a tie did the same, observed by a different customer who happened to be shopping nearby. Similar incidents occurred again and again that day, well into the evening, over fifty more times, and there were a hundred more such episodes at other nearby stores. It was as if a brigade of shoplifters had been dispatched to rid the town of cheap socks and tacky ties. But the occasion wasn’t National Kleptomaniacs’ Day; it was an experiment by two social psychologists.10 The researchers, working with the full cooperation of the stores involved, aimed to study how the reactions of bystanders would be affected by the social category of the offender.

The shoplifters were all accomplices of the researchers. Immediately after each shoplifting episode, the thief walked out of earshot of the customer but remained within sight. Then another research accomplice, dressed as a store employee, stepped into the vicinity of the customer and began rearranging merchandise on the shelves. This gave the customer an easy opportunity to report the crime. The customers all observed the identical behavior, but they did not all react to it in the same way. Significantly fewer of the customers who saw the well-dressed man commit the crime reported it, as compared to those who had watched the scruffy individual. Even more interesting were the differences in attitude the customers had when they did alert the employee to the crime. Their analysis of events went beyond the acts they had observed—they seemed to form a mental picture of the thief based as much on his social category as on his actions. They were often hesitant when reporting the well-dressed criminal but enthusiastic when informing on the unkempt perpetrator, spicing up their accounts with utterances along the lines of “that son of a bitch just stuffed something down his coat.” It was as if the unkempt man’s appearance was a signal to the customers that shoplifting must be the least of his sins, an indicator of an inner nature as soiled as his clothes.

We like to think we judge people as individuals, and at times we consciously try very hard to evaluate others on the basis of their unique characteristics. We often succeed. But if we don’t know a person well, our minds can turn to his or her social category for the answers. Earlier we saw how the brain fills in gaps in visual data—for instance, compensating for the blind spot where the optic nerve attaches to the retina. We also saw how our hearing fills gaps, such as when a cough obliterated a syllable or two in the sentence “The state governors met with their respective legislatures convening in the capital city.” And we saw how our memory will add the details of a scene we remember only in broad strokes and provide a vivid and complete picture of a face even though our brains retained only its general features. In each of these cases our subliminal minds take incomplete data, use context or other cues to complete the picture, make educated guesses, and produce a result that is sometimes accurate, sometimes not, but always convincing. Our minds also fill in the blanks when we judge people, and a person’s category membership is part of the data we use to do that.

The realization that perceptual biases of categorization lie at the root of prejudice is due largely to the psychologist Henri Tajfel, the brain behind the line-length study. The son of a Polish businessman, Tajfel would likely have become a forgotten chemist rather than a pioneering social psychologist were it not for the particular social category to which he himself was assigned. Tajfel was a Jew, a category identification that meant he was banned from enrolling in college, at least in Poland. So he moved to France. There he studied chemistry, but he had no passion for it. He preferred partying—or, as one colleague put it, “savoring French culture and Parisian life.”11 His savoring ended when World War II began, and in November 1939, he joined the French army. Even less savory was where he ended up: in a German POW camp. There Tajfel was introduced to the extremes of social categorization that he would later say led him to his career in social psychology.

The Germans demanded to know the social group to which Tajfel belonged. Was he French? A French Jew? A Jew from elsewhere? If the Nazis thought of Jews as less than human, they nevertheless distinguished between pedigrees of Jew, like vintners distinguishing between the châteaus of origin of soured wine. To be French meant to be treated as an enemy. To be a French Jew meant to be treated as an animal. To admit being a Polish Jew meant swift and certain death. No matter what his personal characteristics or the quality of his relationship with his German captors, as he would later point out, if his identity were discovered, it would be his classification as a Polish Jew that would determine his fate.12 But there was also danger in lying. So, from the menu of stigmatization, Tajfel chose the middle dish: he spent the next four years pretending to be a French Jew.13 He was liberated in 1945 and in May of that year, as he put it, was “disgorged with hundreds of others from a special train arriving at the Gare d’Orsay in Paris … [soon to discover] that hardly anyone I knew in 1939—including my family—was left alive.”14 Tajfel spent the next six years working with war refugees, especially children and adolescents, and mulling over the relationships between categorical thinking, stereotyping, and prejudice. According to the psychologist William Peter Robinson, today’s theoretical understanding of those subjects “can almost without exception be traced back to Tajfel’s theorizing and direct research intervention.”15

Unfortunately, as was the case with other pioneers, it took the field many years to catch up with Tajfel’s insights. Even well into the 1980s, many psychologists viewed discrimination as a conscious and intentional behavior, rather than one commonly arising from normal and unavoidable cognitive processes related to the brain’s vital propensity to categorize.16 In 1998, however, a trio of researchers at the University of Washington published a paper that many see as providing smoking-gun evidence that unconscious, or “implicit,” stereotyping is the rule rather than the exception.17 Their paper presented a computerized tool called the “Implicit Association Test,” or IAT, which has become one of social psychology’s standard tools for measuring the degree to which an individual unconsciously associates traits with social categories. The IAT has helped revolutionize the way social scientists look at stereotyping.

IN THEIR ARTICLE, the IAT pioneers asked their readers to “consider a thought experiment.” Suppose you are shown a series of words naming male and female relatives, such as “brother” or “aunt.” You are asked to say “hello” when presented with a male relative and “good-bye” when shown a female. (In the computerized version you see the words on a screen and respond by pressing letters on the keyboard.) The idea is to respond as quickly as possible while not making too many errors. Most people who try this find that it is easy and proceed rapidly. Next, the researchers ask that you repeat the game, only this time with male and female names, like “Dick” or “Jane” instead of relatives. The names are of unambiguous gender, and again, you can fly through them. But this is just an appetizer.

The real experiment starts now: in phase 1, you are shown a series of words that can be either a name or a relative. You are asked to say “hello” for the male names and relatives and “good-bye” for the female names and relatives. It’s a slightly more complex task than before, but still not taxing. What’s important is the time it takes you to make each selection. Try it with the following word list; you can say “hello” or “good-bye” to yourself if you are afraid of scaring away your own relatives who may be within earshot (hello = male name or relative; good-bye = female name or relative):

John, Joan, brother, granddaughter, Beth, daughter, Mike, niece, Richard, Leonard, son, aunt, grandfather, Brian, Donna, father, mother, grandson, Gary, Kathy.

Now for phase 2. In phase 2 you see a list of the names and relatives again, but this time you are asked to say “hello” when seeing a male name or female relative and “good-bye” when you see a female name or male relative. Again, what’s important is the time it takes you to make your selections. Try it (hello = male name or female relative; good-bye = female name or male relative):

John, Joan, brother, granddaughter, Beth, daughter, Mike, niece, Richard, Leonard, son, aunt, grandfather, Brian, Donna, father, mother, grandson, Gary, Kathy.

The phase 2 response times are typically far greater than those for phase 1: perhaps three-fourths of a second per word, as opposed to just half a second. To understand why, let’s look at this as a task in sorting. You are being asked to consider four categories of objects: male names, male relatives, female names, and female relatives. But these are not independent categories. The categories male names and male relatives are associated—they both refer to males. Likewise, the categories female names and female relatives are associated. In phase 1 you are asked to label the four categories in a manner consistent with that association—to label all males in the same manner, and all females in the same manner. In phase 2, however, you are asked to ignore your association, to label males one way if you see a name but the other way if you see a relative, and to also label female terms differently depending upon whether the term is a name or a relative. That is complicated, and the complexity eats up mental resources, slowing you down.

That is the crux of the IAT: when the labeling you are asked to do follows your mental associations, it speeds you up, but when it mixes across associations, it slows you down. As a result, by examining the difference in speed between the two ways you are asked to label, researchers can probe how strongly a person associates traits with a social category.

For example, suppose that instead of words denoting male and female relatives, I showed you terms related to either science or the arts. If you had no mental association linking men and science or women and the arts, it wouldn’t matter if you had to say “hello” for men’s names and science terms and “good-bye” for women’s names and arts terms, or “hello” for men’s names and arts terms and “good-bye” for women’s names and science terms. Hence there would be no difference between phase 1 and phase 2. But if you had strong associations linking women and the arts and linking men and science—as most people do—the exercise would be very similar to the original task, with male and female relatives and male and female names, and there would be a considerable difference in your response times in phase 1 and phase 2.
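To make the logic of the comparison concrete, here is a minimal sketch, in Python, of how an IAT-style effect might be scored. The response times below are invented for illustration, and real IAT scoring is more elaborate (it filters outlier trials and penalizes errors), but the core move is the one described above: compare average latencies between the block whose labels follow your associations and the block whose labels cut across them.

```python
import statistics

# Hypothetical per-trial response times in seconds, for illustration only.
# "Compatible" block: the labels align with the test-taker's associations
# (e.g., male names and male relatives share one response), so answers are fast.
compatible = [0.48, 0.52, 0.50, 0.47, 0.55, 0.51, 0.49, 0.53]

# "Incompatible" block: the labels cut across associations
# (e.g., male names and female relatives share a response), so answers slow down.
incompatible = [0.74, 0.79, 0.71, 0.77, 0.82, 0.75, 0.73, 0.80]

def iat_effect(compatible, incompatible):
    """Return a simplified IAT effect: the difference in mean latency
    between the two blocks, scaled by the standard deviation of all trials.
    (Actual IAT scoring also drops outlier trials and penalizes errors.)"""
    diff = statistics.mean(incompatible) - statistics.mean(compatible)
    pooled_sd = statistics.stdev(compatible + incompatible)
    return diff / pooled_sd

score = iat_effect(compatible, incompatible)
print(f"IAT effect (D-like score): {score:.2f}")
# A strongly positive score indicates a strong association between the
# "compatible" pairings; a score near zero indicates no measurable bias.
```

On data like these, the score comes out strongly positive, mirroring the large phase 1 versus phase 2 gap described above; a test-taker with no association linking the paired categories would produce latencies that differ only by noise, and a score near zero.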

When researchers administer tests analogous to this, the results are stunning. For example, they find that about half the public shows a strong or moderate bias toward associating men with science and women with the arts, whether they are aware of such links or not. In fact, there is little correlation between the IAT results and measures of “explicit,” or conscious, gender bias, such as self-reports or attitude questionnaires. Similarly, researchers have shown subjects images of white faces, black faces, hostile words (awful, failure, evil, nasty, and so on), and positive words (peace, joy, love, happy, and so on). If you have pro-white and anti-black associations, it will take you longer to sort words and images when you have to connect positive words to the black category and hostile words to the white category than when black faces and hostile words go in the same bin. About 70 percent of those who have taken the test exhibit this pro-white association, including many who are (consciously) appalled at learning that they hold such attitudes. Even many black people, it turns out, exhibit an unconscious pro-white bias on the IAT. It is difficult not to when you live in a culture that embodies negative stereotypes about African Americans.

Though your evaluation of another person may feel rational and deliberate, it is heavily informed by automatic, unconscious processes—the kind of emotion-regulating processes carried out within the ventromedial prefrontal cortex (VMPC). In fact, damage to the VMPC has been shown to eliminate unconscious gender stereotyping.18 As Walter Lippmann recognized, we can’t avoid mentally absorbing the categories defined by the society in which we live. They permeate the news, television programming, films, all aspects of our culture. And because our brains naturally categorize, we are vulnerable to acting on the attitudes those categories represent. But before you recommend incorporating VMPC obliteration into your company’s management training course, remember that the propensity to categorize, even to categorize people, is for the most part a blessing. It allows us to understand the difference between a bus driver and a bus passenger, a store clerk and a customer, a receptionist and a physician, a maître d’ and a waiter, and all the other strangers we interact with, without our having to pause and consciously puzzle out everyone’s role anew during each encounter. The challenge is not how to stop categorizing but how to become aware of when we do it in ways that prevent us from being able to see individual people for who they really are.

THE PSYCHOLOGY PIONEER Gordon Allport wrote that categories saturate all that they contain with the same “ideational and emotional flavor.”19 As evidence of that, he cited a 1948 experiment in which a Canadian social scientist wrote to 100 different resorts that had advertised in newspapers around the holidays.20 The scientist drafted two letters to each resort, requesting accommodations on the same date. He signed one letter with the name “Mr. Lockwood” and the other with the name “Mr. Greenberg.” Mr. Lockwood received a reply with an offer of accommodations from 95 of the resorts. Mr. Greenberg received such a reply from just 36. The decisions to spurn Mr. Greenberg were obviously not made on Mr. Greenberg’s own merits but on the religious category to which he presumably belonged.

Prejudging people according to a social category is a time-honored tradition, even among those who champion the underprivileged. Consider this quote by a famed advocate for equality:

Ours is one continued struggle against degradation sought to be inflicted upon us by the European, who desire to degrade us to the level of the raw Kaffir [black African] … whose sole ambition is to collect a certain number of cattle to buy a wife with, and then pass his life in indolence and nakedness.21

That was Mahatma Gandhi. Or consider the words of Che Guevara, a revolutionary who, according to Time magazine, left his native land “to pursue the emancipation of the poor of the earth” and helped overthrow the Cuban dictator Fulgencio Batista.22 What did this Marxist champion of poor oppressed Cubans think of the poor blacks in the United States? He said, “The Negro is indolent and lazy, and spends his money on frivolities, whereas the European is forward-looking, organized and intelligent.”23 And how about this famous advocate for civil rights:

I will say then that I am not, nor ever have been in favor of bringing about in any way the social and political equality of the white and black races … there is a physical difference between the white and black races which I believe will forever forbid the two races living together on terms of social and political equality … and I as much as any other man am in favor of having the superior position assigned to the white race.

That was Abraham Lincoln in a debate at Charlestown, Illinois, in 1858. He was incredibly progressive for his time but still believed that social, if not legal, categorization would forever endure. We’ve made progress. Today in many countries it is difficult to imagine a serious candidate for national political office voicing views such as Lincoln’s—or if he did, at least he wouldn’t be considered the pro–civil rights candidate. Today culture has evolved to the point where most people feel it is wrong to willfully cheat someone out of an opportunity because of character traits we infer from their category identity. But we are only beginning to come to grips with unconscious bias.

Unfortunately, if science has recognized unconscious stereotyping, the law has not. In the United States, for example, individuals claiming discrimination based on race, color, religion, sex, or national origin must prove not only that they were treated differently but that the discrimination was purposeful. No doubt discrimination often is purposeful. There will always be people like the Utah employer who consciously discriminated against women and was quoted in court as having said, “Fucking women, I hate having fucking women in the office.”24 It is relatively easy to address discrimination by people who preach what they practice. The challenge science presents to the legal community is to move beyond that, to address the more difficult issue of unconscious discrimination, of bias that is subtle and hidden even from those who exercise it.

We can all personally fight unconscious bias, for research has shown that our tendency to categorize people can be influenced by our conscious goals. If we are aware of our bias and motivated to overcome it, we can. For example, studies of criminal trials reveal one set of circumstances in which people’s bias regarding appearance is routinely overcome. In particular, it has long been known that people’s attributions of guilt and recommendations of punishment are subliminally influenced by the looks of the defendant.25 But typically, more attractive defendants receive more lenient treatment only when accused of minor crimes such as traffic infractions or swindles, and not with regard to more serious crimes like murder. Our unconscious judgment, which relies heavily on the categories to which we assign people, is always competing with our more deliberative and analytical conscious thought, which may see them as individuals. As these two sides of our minds battle it out, the degree to which we view a person as an individual versus a generic group member can vary on a sliding scale. That’s what seems to be happening in criminal trials. Serious crimes usually involve longer, more detailed examination of the defendant, with more at stake, and the added conscious focus seems to outweigh the attractiveness bias.

The moral of the story is that if we wish to overcome unconscious bias, it requires effort. A good way to start is by taking a closer look at those we are judging, even if they are not on trial for murder but, instead, are simply asking for a job or a loan—or our vote. Our personal knowledge of a specific member of a category can easily override our category bias, but more important, over time repeated contact with category members can act as an antidote to the negative traits society assigns to people in that category.

I recently had my eyes opened to the way experience can trump bias. It happened after my mother moved into an assisted living center. Her cohorts there are mainly around ninety. Since I have had little exposure to large numbers of people that age, I initially viewed all of them as alike: white hair, slouched posture, tethered to their walkers. I figured that if they’d ever held a job, it must have been building the pyramids. I saw them not as individuals but, rather, as exemplars of their social stereotype, assuming they were all (except my mother, of course) rather dim and feebleminded and forgetful.

My thinking changed abruptly one day in the dining room, when my mother remarked that on the afternoons when the hairdresser visited the assisted living center, she felt pain and dizziness as she leaned her head back to have her hair washed. One of my mother’s friends said that this was a very bad sign. My initial thoughts were dismissive: What does she mean by a bad sign? Is that an astrological prediction? But the friend went on to explain that my mother’s complaints were the classic symptoms of an occluded carotid artery, which could lead to a stroke, and urged that she see her physician about it. My mother’s friend wasn’t just a ninety-year-old; she was a doctor. And as I got to know others in the home, over time, I started to see ninety-year-olds as varied and unique characters, with many different talents, none of which related to the pyramids.

The more we interact with individuals and are exposed to their particular qualities, the more ammunition our minds have to counteract our tendency to stereotype, for the traits we assign to categories are products not just of society’s assumptions but of our own experience. I didn’t take the IAT before and after, but my guess is that my implicit prejudice concerning the very old has been considerably reduced.

IN THE 1980S, scientists in London studied a seventy-seven-year-old shopkeeper who had had a stroke in the lower part of his occipital lobe.26 His motor system and memory were unaffected, and he retained good speaking and visual skills. For the most part he seemed cognitively normal, but he did have one problem. If shown two objects that had the same function but were not identical—say, two different trains, two brushes, or two jugs—he could not recognize the connection between them. He could not tell, even, that the letters a and A meant the same thing. As a result, the patient reported great difficulty in everyday life, even when attempting simple tasks such as setting the table. Scientists say that without our ability to categorize we would not have survived as a species, but I’ll go further: without that ability, one could hardly survive even as an individual. In the previous pages, we’ve seen that categorization, like many of our unconscious mental processes, has both up- and downsides. In the next chapter, we’ll find out what happens when we categorize ourselves, when we define ourselves as being connected, by some trait, to certain other individuals. How does that affect the way we view and treat those within our group and those on the outside?