Psychology For Dummies - Adam Cash 2013
Thinking and Speaking
Thinking and Feeling and Acting
Five Biases in Judgment and Decision Making
Anchoring: Putting too much weight on one piece of information
Availability: Overestimating the probability of an event occurring again because it’s “most available” to memory
Confirmation bias: A tendency to seek only information that agrees with your decision or judgment
Gambler’s fallacy: The mistaken belief that, for independent events, past occurrences influence future probabilities — as in “I’ve been losing so much, I’m bound to win soon.”
In This Chapter
Thinking about thinking
Booting up the mind
Finding out about the idea process
Understanding memory
Comprehending the decision process
Discovering facts about intelligence
Understanding language
Before I get into the complex psychological concept of thinking, here’s a little mental experiment. Imagine yourself lying in your bed having just awakened from a good night’s sleep. You reach over to shut off your annoying alarm, toss the blankets to the side, and head for the bathroom. Now, here’s the experimental part. When you get to the bathroom, you forget why you’re in there. The answer may seem obvious because you just got out of bed and walked straight to the bathroom, but you’ve forgotten. You look around and can’t figure out where you are. Nothing seems familiar, and you’re surrounded by a strange world of shapes, figures, objects, sounds, and lights. You look into an object that reflects an image of some other thing back at you, but you don’t know what it is. You’re confused, disoriented, and basically lost. Your mind is completely blank. You can’t even think of anything to say in order to cry out for help. You’re stuck there. What are you going to do?
If the example seems a little strange, or at least a bit abstract, it’s because the situation would be strange. Without the ability to think, life would be similar to the bathroom situation I just described. Thinking enables you to recognize objects, solve problems, and communicate. You’d really be in trouble if you couldn’t think. You wouldn’t even be able to figure out how to get out of the bathroom.
In this chapter, I describe the concepts of thinking (cognition) and language, along with their component processes, such as attention, memory, decision making, intelligence, speech and nonverbal language (such as American Sign Language), and comprehension.
Finding Out What’s On Your Mind
What exactly is thinking? A bit later in this chapter, I ask you to analyze your own thought processes, so it would help if you knew what you’re analyzing. In psychology, the term cognition refers to the mental processing of information, including memorizing, reasoning, problem solving, conceptualizing, and imagining.
Studying thinking (or, more strictly, cognition) is pretty difficult. Why? It’s hard to see! If I opened up your skull and looked inside, would I see thinking? No, I’d see a wrinkly, grayish-pink thing (your brain). In the early years of psychological research on thinking, psychologists asked people participating in studies to engage in something called introspection. Introspection is the observation and reporting of one’s own inner experience. Psychologists gave participants a simple math problem to solve and asked them to talk out loud as they performed the calculations. These exercises were intended to capture the steps involved in the thinking process. It is important to keep in mind, though, that a great deal of thinking or cognition occurs outside of conscious awareness. In that sense, introspection would not be very helpful now, would it?
Try it! Get a piece of blank paper and a pencil. Your instructions are to solve the following math problem and write down each step that you take, one by one:
47,876 + 23,989
The answer is 71,865. If you didn’t get this right, don’t worry about it; everyone has weaknesses. Actually, if you got the wrong answer, the introspection technique may be able to reveal what you did wrong. Take a second to go over each of the steps you went through to solve the problem.
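If you want to see those introspective steps written out mechanically, the column-by-column addition most people report can be sketched in a few lines of Python. This is purely a toy illustration of my own (nothing from psychology here), and it assumes both numbers have the same number of digits:

```python
def add_with_steps(a, b):
    """Add two numbers digit by digit, narrating each step
    the way an introspecting participant might."""
    digits_a = [int(d) for d in str(a)][::-1]  # ones digit first
    digits_b = [int(d) for d in str(b)][::-1]
    carry, result = 0, []
    for da, db in zip(digits_a, digits_b):  # assumes equal digit counts
        total = da + db + carry
        result.append(total % 10)
        carry = total // 10
        print(f"{da} + {db} -> write {total % 10}, carry {carry}")
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))

print(add_with_steps(47876, 23989))  # 71865
```

Each printed line is one “introspective report” of a single mental step: add the column, write a digit, carry the rest.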
You just participated in a psychological experiment, and it didn’t hurt a bit, did it?
Now, imagine how hard it would be to use introspection to analyze all of your thinking. It would be pretty difficult — impossible, in fact. Part of the reason psychologists don’t rely on introspection anymore is that people are blind to most of what their minds (or brains) are doing. Introspection can’t capture sophisticated thought processes. These days, psychologists use computer modeling, formal experiments, and other complex means for researching thinking. Psychologists try to build model systems that think the way people do.
Thinking Like a PC
Finding out how thinking works has been a subject of inquiry since Aristotle, through the Renaissance with Descartes, and into the modern era. A useful tool that thinkers use to explain the workings of the mind is metaphor. Numerous metaphors have been developed through the ages, including the mind as a steam engine, a clock, and even a computer. In this section, I introduce you to the concept of the mind and thinking as computational processes, in which representations of information are manipulated as the actual process of thought.
Computing
With the advent of the modern computer, psychologists and related investigators began to look at the operations performed by computers, called computing or computation, as potential models for human thought processes. This was a significant breakthrough. Using the computer as a model for how thinking occurs is called the computational-representational model of mind (and thinking). The idea is both profound and simple: The mind and all of its complex processes, such as perceiving, thinking, problem solving, and so forth, make up an information-processing machine that performs computations.
What is a “computation”? A computation is a manipulation of symbols according to a preset rule that transforms one set of symbols into another. For example, assume that there is the following rule for representing the letters W, S, D, O, and R:
W = 1
S = 2
D = 3
O = 4
R = 5
So the word “words” would be represented as “14532,” and the word “sword” would be represented as “21453.” That transformation is a computation. So, the brain, as a computational “device,” transforms one type of information, such as light waves (or the letters in the “words” example), into another type of information, neural patterns (or the numbers in the “words” example).
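The letter-to-number rule above is simple enough to run as code. Here’s a minimal Python sketch (my own toy illustration; the function name is invented for this example):

```python
# The preset rule: each letter symbol maps to a number symbol.
rule = {"W": "1", "S": "2", "D": "3", "O": "4", "R": "5"}

def compute(word):
    """Transform one set of symbols (letters) into another (digits)."""
    return "".join(rule[letter] for letter in word.upper())

print(compute("words"))  # 14532
print(compute("sword"))  # 21453
```

The function does exactly what the definition of a computation calls for: it applies a preset rule that transforms one string of symbols into another.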
Representing
Computations are performed on mental representations. A mental representation is a symbol of a particular stimulus (like a tree) in the mind. The term “symbol” is used here in a very loose sense to refer to something standing in for something else. So the stimulus “tree,” for example, may come to be symbolized by the activation of a specific set of neurons in the brain.
Sit back for a second, get comfortable, and conjure up an image of a pink rose in your mind. Concentrate so that the image is clear; see the green stem and leaves, the pink petals, the thorns. Try to imagine the rose in detail. If someone comes into the room as you’re doing this and asks if there is a rose in the room, what would you say? If there isn’t actually a rose in the room, then you’d say “no.” But consider the idea that there is actually a rose in the room, or at least it’s in the room because there’s one inside your mind, the rose that you’re imagining. So, if I cut open your skull and looked into your head, I’d see a pink rose, right? Of course not! The rose only exists in symbolic or representational form inside of your mind.
Remember, thinking consists of symbols that represent information about the world, the objects within it, and the manipulation of those symbols. The mental manipulation of symbols is based on combining, breaking down, and recombining symbols into more complex strings or systems of symbols that have meaning. Take the word “rose” again. It consists of simpler parts called letters, and the specific combination of those letters gives rise to the specific word and image of the object called “rose.” The letters can be rearranged to spell the word “sore,” which is an entirely different thing and thus an entirely different thought. This reveals that even a simple system like the alphabet can give rise to an almost infinite set of larger and more meaningful symbols or representations.
Where do all these symbols come from? Symbols are generated by sensing things in the world. When I see a rose, there’s a corresponding symbolic representation of that rose in my mind as I think about it. From this point on, the Psychology for Dummies definition of thinking is the mental processing of information as computations performed on representations and the various operations involved in that process.
Processing
Your mind goes through four basic steps when processing information:
1. Stimulus information from your senses reaches the brain. (You see LeBron James slam dunk for the first time.)
2. The information is analyzed. (Your brain thinks, “Wow. Those are impressive moves.”)
3. Different possible responses are generated. (Your brain tries to figure out how he’s doing it.)
4. A response is executed and monitored for feedback. (You throw on your basketball shoes and head for the court.)
Turing’s challenge
Alan Turing (1912–1954) was a British mathematician and computer scientist who was instrumental in helping crack secret German submarine codes during World War II. He came up with something called the Turing test. In his time, a popular parlor game involved placing a man and a woman behind two different doors; guests had to communicate with them by typewritten notes. The point was to guess correctly whether the man or the woman was behind a particular door based only on their answers to the guests’ questions. Turing proposed that the comparison be changed from a man and a woman to a computer and a human being to find out if guests could determine whether the computer or the human was answering their questions. If guests couldn’t tell the difference, then the computer would have to be considered a suitable “stand-in” for the human. That is, such a test result would mean that a computer could “represent” human thought in its own way — in a symbolic architecture, or computer language. The Turing test is a demonstration of how computation can be performed on symbols or representations and is an analogy for how the human mind can perform computations on symbols and represent the real world in mental terms.
These basic mechanisms of information processing are sometimes called the architecture of thought. They are the rules of thought, and they require all of the following basic components:
Input: Sensory information that comes in from the world, or from within your own mind, for consideration.
Memory: System necessary for storing knowledge. Information about people and other elements in the world is stored in the mind and memory.
Operations: Rules that determine how information in the memory system is utilized (reasoning, problem solving, and logical analysis). Take math as an example: If I have 100 numbers stored in my memory and am confronted with a mathematical problem, operations determine how I solve that problem.
Output: Action “programs” that involve telling the rest of the mind and body what to do after the thinking operations have been carried out.
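To make the four components concrete, here’s a deliberately silly Python sketch of the input-memory-operations-output loop. None of this comes from an actual cognitive model; the names and logic are invented purely for illustration:

```python
# Memory: stored knowledge about the world.
memory = {"slam dunk": "an impressive basketball move"}

def think(stimulus):
    percept = stimulus.lower()                       # Input: sense the stimulus
    knowledge = memory.get(percept)                  # Memory: retrieve what's stored
    if knowledge is None:                            # Operations: reason about it
        memory[percept] = "something newly learned"  # ...and store it for next time
        response = f"Hmm, analyzing '{stimulus}'..."
    else:
        response = f"I know that! It's {knowledge}."
    return response                                  # Output: the action program

print(think("slam dunk"))  # I know that! It's an impressive basketball move.
print(think("euro step"))  # Hmm, analyzing 'euro step'...
```

The toy “mind” takes input, consults memory, applies an operation (recognize or learn), and produces output, which is the whole architecture in miniature.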
Exploring Operations of the Mind
Jerry Fodor (born 1935) is an American philosopher (What? Seriously consider the advice of someone who’s not a psychologist? Blasphemy!) who proposed that the mind’s complicated information-processing system can be divided into specific operations, or modules, that perform specific thinking tasks — an attention module, a problem-solving module, and so forth. In the sections that follow, I introduce some of the more significant mental operations or procedures, including attention, memory, concept formation, and problem solving.
Focusing your attention
There’s a great scene from one of my favorite movies, Dumb and Dumber (1994), starring Jim Carrey (Lloyd) and Jeff Daniels (Harry), in which Harry tells Lloyd a story about how a girlfriend broke up with him in high school.
Harry tells Lloyd that he didn’t know why she ended the relationship — something about him not listening to her — but he wasn’t paying attention when she told him the reason.
Harry failed at one of the most critical and primary thinking processes: paying attention. The world and your own mind are full of information and stimulation. It’s extremely noisy, buzzy, fuzzy, colorful, blurry, and chock-full of stuff. How can you select and focus on what is important and ignore all the rest?
As part of information processing, attention is defined as the cognitive process of selecting stimuli and information from the environment for further processing while excluding or inhibiting other stimuli and information. There’s simply too much information buzzing around in people’s minds and in the environment to attend to it all. Efficient and effective information processing requires selection.
Psychologists Daniel Simons and Christopher Chabris conducted an experiment that illustrates this selective feature of attention. In short, experimental subjects were asked to watch a video of people innocently passing a basketball to each other. Partway through the video, a person in a gorilla suit enters the frame, pounds on his chest, and then leaves the frame. When subjects were asked if they remembered seeing the gorilla, about half of them reported never noticing it. This is known as inattentional blindness: when a person is focusing on something, he misses unrelated information. This is an active principle in magic shows and sleight-of-hand tricks. Sorry, I don’t think David Copperfield is a wizard, but he sure knows how to harness inattentional blindness!
Different types of attention processes exist:
Focused attention: Concentration on one source of input to the exclusion of everything else
Divided attention: Focus on two or more inputs simultaneously
Psychologist Donald Broadbent developed a cognitive model of attention in which attention is characterized as a channel with limited capacity for input to pass through. Sensory information is processed first, and then semantic (or meaning) information is processed. The key to Broadbent’s model is that inputs need to be attended to in order to be further processed.
Broadbent’s model does not explain all the data being collected through research, so other models have been developed. One research finding known as the cocktail party effect presented a challenge to Broadbent’s model. Ever been at a noisy party and suddenly heard your name mentioned from across the room? Well, you weren’t particularly attending to or waiting to hear your name, but you heard it anyway because it’s important. Your mind attends to important information.
Experiments have led cognitive psychologists to characterize attention as a dynamic process in which both selection for attention and selection for “non-attention” are made at the same time. Guided search theory is a dynamic model of attention that proposes that attention is guided by salient information from previous searches or episodes of attending. It is considered a top-down model of attention in that people are considered to be actively searching rather than passively receiving inputs, as is more the case in Broadbent’s model.
Packing it away in the ol’ memory box
Thinking involves the manipulation of mental symbols that you store as concepts — representations of the objects you encounter in the world. How are these mental symbols stored? Memory!
To conceptualize memory, envision a bank. Think about your checking account and your savings account. Each of these accounts does something a little different with your money because they have different purposes. Checking accounts typically function for everyday and short-term use. Savings are intended to be for longer-term storage. Your memories store information in different ways as well.
Three separate storage systems are involved with memory: sensory memory, short-term memory, and long-term memory.
Sensory storage
Sensory memory is a split-second memory system that stores information coming in through your senses. Have you ever looked at the sun and then closed your eyes and looked away? What happens? You can still see an image of the sun in your mind. This afterimage is a visual sensory memory known as an iconic memory. For auditory stimuli, it is called echoic memory. This process happens so fast that it is sometimes considered a part of the perceptual process (for more on perception, see Chapter 5), but it is, in fact, part of the overall memory system.
Short-term memory
Short-term memory (STM), also known as working memory, consists of the information that is active in your consciousness right now, the things you’re aware of. The light on the book page, these words being read, the grumbling in your stomach, and the sound of traffic outside your window are all parts of your conscious awareness, and they’re all being stored in your STM. Things you are not aware of can simply be forgotten in many cases.
How much information can your STM store? The general consensus is that it can store seven items of information, plus or minus two items. This is sometimes called the “magical number seven” of STM capacity.
Does that mean that I can only store seven words, seven numbers, or seven other simple items in my STM? No. Thanks to a process called chunking, I can store a lot more information than that. A classic example of chunking is the use of mnemonics, which enables you to take a big chunk of information and break it down into a little phrase, so it’s easier to remember.
Here’s an easy way to form a mnemonic. If you have a list of things you want to memorize, take the first letter of each word on the list and make a catchy phrase out of it. I learned this one in eighth grade and never forgot it: “Kings play chess on fine green silk.” Do you know what that stands for? It stands for the way biologists classify different organisms on the earth: Kingdom, Phylum, Class, Order, Family, Genus, and Species.
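If you’re curious whether a mnemonic actually matches its list, the first-letter trick is easy to check in Python (a toy sketch; the variable names are mine):

```python
taxonomy = ["Kingdom", "Phylum", "Class", "Order", "Family", "Genus", "Species"]
mnemonic = "Kings play chess on fine green silk"

def initials(words):
    """First letter of each word, uppercased."""
    return [w[0].upper() for w in words]

# The mnemonic works because its initials line up with the list's.
print(initials(taxonomy))                            # ['K', 'P', 'C', 'O', 'F', 'G', 'S']
print(initials(taxonomy) == initials(mnemonic.split()))  # True
```

The check makes the chunking idea visible: seven items collapse into one memorable phrase whose initials regenerate the list.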
The duration of memory for the STM system is approximately 18 seconds. You can extend the length of time that you store information in STM only by engaging in something called rehearsal. Rehearsal is the process of actively thinking about something. Rote rehearsal is repeating something over and over again in your mind or out loud so that you don’t forget it. Rote rehearsal will work, but not as effectively as a more effortful type of rehearsal, such as building a mnemonic like “Kings play chess on fine green silk.”
Long-term memory
If the information in STM is rehearsed long enough, it eventually ends up in your memory’s savings account, the long-term memory (LTM). You basically have two ways to deposit information into your long-term memory bank:
Maintenance rehearsal: You transfer the information from your STM through repetition until it’s committed to long-term storage.
Elaborative rehearsal: Your mind elaborates on the information, integrating it with your existing memories. When information is meaningful and references something that you already know, remembering is easier and forgetting becomes harder.
The more you process the information, linking it to what you already know, the better you will remember it!
You can break down the LTM into three basic divisions:
Episodic memory: Events and situations unique to your experiences (marriages, birthdays, graduations, car accidents, what happened yesterday, and so on)
Semantic memory: Factual information such as important holidays, the name of the first president of the United States, and your Social Security number
Procedural memory: Information on how to do things like riding a bike, solving a math problem, or tying your shoes
Theoretically, the size and time capacity of LTM is unlimited; researchers haven’t found a way to measure its limits, but it clearly has enough capacity to get the job done. This may sound kind of strange when you consider how much information you seem to forget. If the information is in there somewhere, why do you forget it?
Fahgetaboudit!
Have you ever been told to forget about something? Try this: Forget about cheese. Did it work? Did you forget about cheese or did you actually think more about cheese? The irony of someone telling you to forget something is that it’s impossible to forget about something you’re actively thinking about, making “forget about it” bogus advice. If someone really wants you to forget about something, then she shouldn’t mention it to you at all.
Forgetting information stored in LTM is more of an issue of not being able to access it rather than the information not being there. Two forms of access failure occur, and both involve other information getting in the way:
Retroactive interference: Having a hard time remembering older information because newer information is getting in the way
Proactive interference: Having a hard time remembering newer information because older information is getting in the way
The next time you watch a sitcom on television, try to remember the details of the first 10 to 12 minutes, the middle 10 to 12 minutes, and the last 10 to 12 minutes of the program. Or, listen to a lecture and try to remember what was said during the beginning, middle, and end of the presentation. You may notice something psychologists call the serial position effect. Information from the beginning and end of the show or lecture is easier to remember than information from the middle. Why is that?
The serial position effect occurs because the information at the beginning of the show or lecture is usually committed to long-term memory due to the amount of time that elapses. The information at the end of the show is being kept in your short-term memory because it’s fresh in your mind. The middle stuff? It’s just gone.
Conceptualizing
When was the last time you went out with a friend just to talk? Did you go to a coffeehouse? Did you talk about recent romances and frustrating relationships in your life? Did you talk about politics or the weather? It doesn’t matter; you were talking about a concept.
A concept is a thought or idea that represents a set of related ideas. Romance is a concept. Relationship is a concept. Politics is a concept. Weather is a concept. All these concepts are represented as symbols in your information-processing system of thought, and they get into this system through learning. In other words, concepts are derived and generated; they are formed. When objects share characteristics, they can represent the same concept. Some concepts are well defined; others are not.
Consider the following words:
Tail, Fur, Teeth, Four Legs
What do these words describe? It could be a cat, a dog, a lion, or a bear. The fact is, you really can’t tell just from those words. Some crucial detail is missing, some piece of information that clearly defines the concept and separates it from others.
Now, consider this list of words:
Tail, Fur, Teeth, Four Legs, Bark
What is being described now? This has to be a dog. Why? Cats, lions, and bears don’t bark. The feature “bark” uniquely defines the concept of “dog.” “Bark” is the concept’s defining feature. It is an attribute that must be present in order for the object to be classified as an example of a particular concept. Consider the following words:
Feathers, Beak, Eggs, Fly
These words describe a bird. Hold on a minute. Aren’t there at least two birds that don’t fly? Penguins and ostriches don’t fly, but they’re still birds. So, flying is not a defining feature of a bird because animals don’t have to have that attribute in order to be considered a bird. However, most birds do fly, so “flying” is what is called a characteristic feature or attribute. It is an attribute that most, but not all, members of a concept group possess.
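The defining-feature idea in the dog and bird examples behaves like a little classification routine. Here’s a Python sketch of that logic (the feature sets are deliberately simplified and invented for illustration; this is not a serious model of concepts):

```python
# Each concept's set of defining features (deliberately simplified).
concepts = {
    "dog": {"tail", "fur", "teeth", "four legs", "bark"},
    "cat": {"tail", "fur", "teeth", "four legs", "meow"},
}

def classify(observed_features):
    """Return every concept consistent with the observed features."""
    observed = set(observed_features)
    return sorted(name for name, attrs in concepts.items() if observed <= attrs)

print(classify(["tail", "fur", "teeth", "four legs"]))          # ['cat', 'dog']
print(classify(["tail", "fur", "teeth", "four legs", "bark"]))  # ['dog']
```

Without the defining feature, the observed features fit more than one concept; adding “bark” narrows the answer to exactly one.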
Think about a chair. Try to imagine and picture a chair. Now, describe it. (Describe it to someone else or you’ll look funny describing an imaginary chair to yourself.) Your imagined chair probably consists of wood, four legs, a rectangular or square seat, and a back constructed of two vertical supports on each side of the seat connected by a couple of horizontal slats. This is a typical chair. It’s common. In fact, it may be considered as a prototypical chair. A prototype is the most typical example of an object or event within a particular category. It is the quintessential example of the concept being represented.
Thinking is much more complex than simple one-word descriptions. When developing a thought, single word concepts are combined into sentence-long concepts, sentence-long concepts into paragraph-long concepts, and so on. In other words, attributes are combined into concepts. Concepts are combined into propositions. Multiple propositions are combined into mental models. Finally, mental models are combined into schemas, which are used to represent the world in a language of thought.
For example, check out this process, which continuously builds upon itself:
Proposition: “War is hell” is a combination of two related concepts, war and hell.
Mental model: Clustering thoughts helps you understand how things relate to each other:
• War is hell.
• World War II was a war.
• World War II was hell.
Schemas: Organizing mental models into larger groups forms basic units of understanding that represent the world. An example here is “Some of the soldiers who fought in WWII experienced psychological trauma. Some people believe that this was due to the extreme nature of war. Some people have even said that war is hell.”
Another example: Consider the concept “book.” Combine the concept of book with another concept, like reading. Then connect those concepts to another concept, like library. Now you have three related concepts: book, reading, and library. This set of concepts may form the proposition of studying (as opposed to reading for pleasure). You can then embed studying into larger divisions, or schemas, such as school or a certification process.
Concepts are formed from co-occurring features encountered in experience. They represent ideas in relation to each other. That is, in order to understand, comprehend, or grasp the meaning of a concept, the representational mind relates concepts to other concepts.
For example, anyone who regularly interacts with kids will tell you how difficult it is to explain certain concepts to a curious toddler because it’s tough to find words she already knows and can use as a reference.
Child: “What’s a computer?”
Parent: “A computer is like a . . . a . . . um . . . TV . . . um . . . but . . . um . . . you can type on it.”
Child: What’s type mean?
See where this is going?
Some cognitive scientists and psychologists have found their way out of this concept trap by suggesting that all concepts are innate and inborn. One of the more interesting and fruitful approaches to this problem comes from the theories of embodied cognition (EC) and embodied simulation (ES).
The core idea behind EC and ES is that the mind comprehends or grasps concepts through a process of simulation that uses the motor and perceptual parts of the thinker’s brain to represent the experience. This simulation allows for comprehension because people understand concepts through reference to the bodily experiences they associate with them.
“In over your head”
“Slap in the face”
“Eye-opening experience”
All capture the essence of EC and ES. You understand the meaning of these phrases in terms of bodily experience. Bodily, sensory, and motor experiences are the meaning; they are the experience, the concept. If I want to understand a new concept, I use my body, sensory, and motor experiences to come to an understanding. I understand “eye-opening experience” because I have opened my eyes and know (because I’ve experienced it) what that means.
Interestingly, EC/ES proponents say that the parts of the brain that you use to actually open your eyes, or move your arm, or see the sunrise are the same parts of the brain that you use when you conceptualize the meaning of “eye-opening” and other such phrases. Ultimately, EC/ES are relative newcomers to the cognitive psychology arena, but they hold a lot of promise and are being hotly researched.
Making decisions
Go left and get there five minutes late. Go right and maybe get there on time; it’s a risk because sometimes there’s major traffic that way. It may mean you’ll be 20 minutes late. For many people, the morning commute is a daily problem. How can you get to work on time with the least amount of travel time, encountering the least amount of traffic, and experiencing the least amount of stress? You have to solve this problem, and part of solving this problem requires you to make some decisions. Perhaps you’ll use reasoning to choose your course; maybe you won’t.
People solve problems all day long and make hundreds, if not thousands, of decisions every day. In fact, a condition related to mental exhaustion is called “decision fatigue” — simply having too many problems to solve and too many decisions to make in a given day. Some choices are life or death, some less so, but it all adds up.
Decision making is the act of choosing an option or action from a set of options based on criteria and a strategy. The study of decision making is really a complex cross-discipline science in and of itself, spanning economics, political science, computer science, management and business, and marketing. And I don’t know about you, but sometimes I just flip a coin.
I once worked at a job in which the most important decision made all day long (at least as perceived by my co-workers) was where to go for lunch. We had a “Wheel-O-Lunch” we’d spin; where it stopped was where we’d go — theoretically. Usually we ended up discussing and ultimately overriding the wheel’s “choice” due to recollections of bloating, large bills, or “I had that for dinner last night” qualifiers and caveats. This illustrates a point well discussed in decision-making research: There is more than one way to make a decision, and most people use a variety of approaches.
Choosing
Flipping a coin is one way to make a decision, but that’s not really a cognitive process, is it? It’s simply a way of letting chance choose for you — it’s not really choosing at all. But people do make choices by using processes such as intuitive decision making, which refers to choices based on what is most easy, familiar, or preferred. I’m sure you can see how this may work some of the time, but sometimes a decision to do what is preferred is the wrong choice. Just think about the last time you ended up eating that fourth piece of cake only to regret it later.
Decisions can also come from a more scientific approach based on empirical evidence through trial and error, experiment, estimation, experience, or consultation with an expert. Consumer Reports provides experimental evidence to tell you which blender to buy or which brand of deodorant to use.
If you are pressed for time or need to make a lot of decisions with limited resources, then your choice for choosing may be to use a heuristic, which is a mental shortcut based on principles, rules, maxims, and so forth. Ethical decision making can be considered heuristic: choosing based on a code of ethics. My religious beliefs may factor into my decision making.
Amos Tversky and Daniel Kahneman studied heuristic decision making and identified several distinct types. Here are two commonly used heuristics:
Representative heuristic: Making a choice based on the situation in question being similar to another situation. If you were lost in the forest while on a horse-riding outing, you might try to retrace your tracks to find your way out. You start looking for your tracks and, by using the representative heuristic, decide that the tracks you find are, in fact, horse tracks because you are familiar with and know what horse tracks look like. Too bad you don’t know what bear tracks look like!
Availability heuristic: Making a decision based on how easily or readily information comes to mind. This is the “first thing that comes to mind” approach to choosing. News agencies are guilty of spreading the use of this heuristic. Oftentimes they all report on the same story or similar stories, so the most recent story about the latest fad diet is fresh in your mind. The next time you choose a diet, let the heuristic choose for you: fad all the way!
Reasoning
Reasoning is a thinking process that involves two basic components:
Premises: These are statements about some object or event that support a conclusion. Premises declare some state of affairs such as, “All fire trucks are red.” Another premise may be, “My dad drives a fire truck at work.”
Conclusions: The points derived from the premises. They are only valid if they can be logically or reasonably drawn from the premises. A logical conclusion for the premises stated here may be, “My dad drives a red truck at work.”
Reasoning is the act of drawing conclusions based on the truth of the premises that precede the conclusion. Reasoning can help people figure out if their conclusions are valid or if they make logical sense. When arguments make logical sense, reasoning is good. It makes logical sense that my dad drives a red fire truck at work because this follows from the premises.
But what if it went like this: All fire trucks are red. My dad’s truck is red. Therefore, my dad’s truck is a fire truck. This is not logical, because the first premise doesn’t state that all red trucks are fire trucks, only that all fire trucks are red. Other kinds of trucks can be red, too; my dad may drive a red Toyota. Logic is like a measuring stick for verifying our reasoning.
There are two basic types of reasoning:
Inductive: In inductive reasoning, you begin by making observations (the premises) in order to collect facts that support or disconfirm some hypothesized outcome or situation (the conclusion).
Consider the following:
Monday it rained.
Tuesday it rained.
Therefore, I conclude that Wednesday it is going to rain.
This is an example of inductive reasoning. Two observations or premises are used to predict a third outcome. I think my local weather person uses inductive logic to make his forecasts, not the million-dollar computer technology that the TV station advertises.
Deductive: Deductive reasoning uses premises that claim to provide conclusive proof of the conclusion’s truth. A conclusion based on deductive logic is by necessity true, provided that it begins with true premises. Deduction often begins with generalizations and reasons down to particulars.
Consider the following example of deductive reasoning:
All men should be free.
I am a man.
Therefore, I should be free.
The conclusion follows logically from the two premises. It has to be that way based on what is stated in the premises. Here’s an example of a false conclusion:
All chickens lay eggs.
My bird laid an egg.
Therefore, my bird must be a chicken.
This is false because the first premise refers to a subset (chickens) of the larger category, birds. The second premise refers to that larger category and therefore covers cases the first premise says nothing about; plenty of egg-laying birds aren’t chickens. If you turn the two premises around, you can create a logically valid argument:
All birds lay eggs.
My chicken laid an egg.
Therefore, my chicken must be a bird.
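For the logically inclined, the difference between the valid and invalid forms can be treated as a hunt for countermodels: an argument form is invalid if you can build even one toy world where the premises hold but the conclusion fails. Here’s a playful sketch of that idea (the animal names and the tiny hand-built world are made up purely for illustration):

```python
# A toy "world" of animals: name -> (is_bird, is_chicken, lays_eggs)
world = {
    "henrietta": (True, True, True),    # a chicken
    "polly":     (True, False, True),   # a parakeet
}

# Shared premise: all birds in this world lay eggs.
assert all(lays for is_bird, _, lays in world.values() if is_bird)

# Invalid form ("my bird laid an egg, so it's a chicken") has a
# countermodel: an egg-layer that is not a chicken.
counterexamples = [name for name, (b, chick, lays) in world.items()
                   if lays and not chick]
print(counterexamples)  # ['polly']

# Valid form ("my chicken laid an egg, so it's a bird") survives:
# nothing in this world is a chicken without also being a bird.
assert not [n for n, (b, chick, _) in world.items() if chick and not b]
```

One toy world can only disprove validity, not prove it, but that’s exactly the point of the chicken example: a single parakeet is enough to sink the invalid argument.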
People make decisions in some pretty haphazard ways. What happened to deliberation, to thinking things through, to being rational? Rational decision-making models are all based on the assumption that people make decisions after weighing the costs and benefits of options and ultimately choosing the option in which the benefits outweigh the costs. These costs may include factors such as utility, risks, functionality, and quality.
The famous psychologist Herbert Simon proposed that although humans can be rational decision makers, this rationality has bounds, or limits. In his theory of bounded rationality, Simon proposes that because the environment is so complex, a prospective decision maker cannot possibly weigh all the options rationally in order to come to the optimal decision. So rational decisions are bounded; decisions must therefore be based on limited information, shortcuts, and reasonable estimation. Simon does not lament this situation; he states that bounded rationality is a fact of cognition and the human mind, and that it usually results in reasonably solid decisions.
Dan Ariely, in his well-known book Predictably Irrational: The Hidden Forces That Shape Our Decisions (2008), further addressed the concept of limited rationality. He experimentally identified many situations in which decisions are not only based on incomplete information but are sometimes downright irrational. Bottom line: people make irrational decisions. What’s interesting is that Ariely shows that people make irrational decisions in predictable ways:
Relativity: Sometimes a choice between two options is based on the relation between them, not on the absolute quality of each choice. For example, in US presidential elections, many people say they dislike both candidates, but end up picking the one they determine is “less bad.”
It’s Free: Free things are good, right? Well, not always; sometimes a worse choice is made (irrationally) because of its free status. A free gift with purchase isn’t always worth a three-hour wait in line.
Hot and Cold: Decisions made when you’re emotionally aroused are different from the choices you make when you’re calm. This is an oldie but goodie when it comes to common sense and decision making; yet people violate it all the time. Decisions made when emotions run strong can be predictably irrational.
Supposedly, reasoning and the ability to solve problems logically are two of the primary abilities that set humans apart from animals. In case you’re wondering, humans can reason, animals can’t. I know that this fact may be up for debate, especially if you consider all the human behavior and decisions that are clearly not based on reason, but bear with me. I’m talking about capability, not performance.
Solving problems
Problem solving sounds pretty straightforward. You have a problem, and you solve it. Did you ever watch that television show MacGyver from the 1980s? MacGyver could solve just about any problem that came his way. He could turn a toothpick into a Jet Ski or a rocket launcher. I’d sit back and watch in amazement, and then I’d get out my trusty toolbox and dismantle the toaster, trying to turn it into a satellite receiver — four hours later, I’d just have a pile of parts and no way to make toast. MacGyver clearly had better problem-solving skills than me.
Newell and Simon (1972) are pretty much the godfathers of problem-solving psychology. Nearly every research study on the topic cites their study. They defined these basic steps of the problem-solving process:
1. Recognizing that a problem exists
2. Constructing a representation of the situation that includes the initial state of the problem and the eventual goal (a solution)
3. Generating and evaluating possible solutions
4. Selecting a solution to attempt
5. Executing the solution and determining if it actually works
These steps are sometimes identified by the acronym IDEAL, which Bransford and Stein formulated in 1993:
I — Identify the problem
D — Define and represent the problem
E — Explore possible strategies
A — Act on the chosen strategy
L — Look back and evaluate the effects
The world has as many problem-solving strategies as there are problems, but most people tend to use the same ones over and over again. For instance, trial and error is a popular way to solve a problem. I’ve seen young children use trial and error when trying to put shapes into their respective holes in a bucket. A child will pick up the circle block and try putting it in every cut-out until it fits, and then move on to the next block.
The trial-and-error strategy is pretty inefficient, but sometimes it’s the only strategy available, particularly for situations that have no clear definition for the problem, when part of the process is figuring out what the problem is.
Here are a couple more common problem-solving techniques:
Means-ends analysis: This strategy involves breaking a problem into smaller subproblems and solving each one in turn, gradually closing the gap between the current state and the end goal.
Working backwards: This strategy starts from the goal and works in reverse toward the starting point, like taking something apart and putting it back together in order to figure out how the object (or problem) is built.
Brainstorming: A technique that involves generating as many solutions to the problem as possible without editing them in any way. It doesn’t matter how implausible, unfeasible, idiotic, or ridiculous the solutions are; you put them all out there and eliminate them only after you can’t think of any more. Even my idea to have Superman use his super-cool breath to stop global warming gets included with this technique.
Analogies and metaphors: These strategies involve using a parallel or similar problem that has already been solved to solve a previously unrelated problem. The Cuban Missile Crisis was like a nuclear-powered game of chicken, and whoever flinched, blinked, or chickened out first was the loser. I guess President Kennedy was pretty good at playing chicken.
Thinking You’re Pretty Smart
Acting intelligently is perhaps the grandest cognitive achievement of the human mind. After all, the cognitive processes of managing information, paying attention, remembering, and so forth ought to produce something useful, right? Indeed, you can view intelligence as the collective output of human cognition, an output that yields the ability to achieve goals, adapt, and function in the world. This is intelligent behavior.
Psychologists have been trying to figure out what intelligence is for a long time. Plenty of examples exist to support the theory that humans lack intelligence. Just take a look at those goofy home-video shows. A guy forgets to turn off the electricity before rewiring a room, or a hiker tries to feed a bear and almost becomes dinner. Maybe I’m entertained by these misfortunes of others caught on videotape because these people couldn’t have been any less intelligent. Or perhaps I feel giddy because I did not suffer their fate.
People differ in their abilities to solve problems, learn, think logically, use language well, understand and acquire concepts, deal with abstractions, integrate ideas, attain goals, and so on. This impressive list of human abilities represents some of the ideas of what intelligence actually is; these abilities are the stuff of intelligence. For a more concrete definition, intelligence can be understood as a collection of cognitive abilities that allows a person to learn from experience, adapt successfully to the world, and go beyond the information presented in the environment.
Considering the factors of intelligence
Sure, intelligence is a collection of cognitive abilities, but a unifying construct called “intelligence” that can be measured and quantified must exist, right? Psychologists think so, and they’ve been working tirelessly to test and measure intelligence for a long time. As part of this work, psychologists have developed intelligence tests and worked with militaries, schools, and corporations, trying to sort individual differences in intelligence in the service of job selection, academic honors, and promotions. From all this testing has emerged the concept of “g” as a general and measurable intelligence factor.
The g-factor is made up of subcomponents known as s-factors. Together, the g- and s-factors make up what is called the two-factor theory of intelligence:
g-factor: Some psychologist comes up with a test of mental abilities and gives it to a lot of people. When a score is calculated and averaged across abilities, a general intelligence factor is established. This is factor one of the two-factor theory, commonly referred to as the g-factor, or the general intelligence factor. It is meant to represent how generally intelligent you are based on your performance on this type of intelligence test.
s-factor: The individual scores on each of the specific ability tests represent the s-factors. An s-factor score represents a person’s ability within one particular area. Put all the s-factors together, and you get the g-factor. Commonly measured s-factors of intelligence include memory, attention and concentration, verbal comprehension, vocabulary, spatial skills, and abstract reasoning.
So, intelligence according to the psychometric theory is a score on an intelligence test. How can this be? Each test is made up of subtests, and typically, people who score high on one test do well on the other tests, too. This reveals a relationship among the individual abilities as measured by the subtests; the general intelligence concept underlies that relationship.
In a related theory, psychologist and intelligence research pioneer Louis Thurstone (1887–1955) came up with a theory of intelligence called primary mental abilities. It’s basically the same concept as the s-factor part of the two-factor theory, with a little more detail. For Thurstone, intelligence is represented by an individual’s different levels of performance in seven areas: verbal comprehension, word fluency, number, memory, space, perceptual speed, and reasoning. Thurstone’s work, however, has received very little research support.
Getting a closer look
Psychologists continue to divide general intelligence into specific factors. The Cattell-Horn-Carroll Theory of Cognitive Abilities (CHC Theory) proposes that “g” is made up of multiple cognitive abilities that, taken as a whole, produce “g.” Early work by the individual contributors to CHC theory, Raymond Cattell, John Horn, and John Carroll, converged to produce a model of general intelligence consisting of ten broad abilities, each containing numerous narrow abilities. The ten broad abilities are as follows:
Crystallized intelligence (Gc): comprehensive and acquired knowledge
Fluid intelligence (Gf): reasoning and problem-solving abilities
Quantitative reasoning (Gq): quantitative and numerical ability
Reading and writing ability (Grw): reading and writing
Short-term memory (Gsm): immediate memory
Long-term storage and retrieval (Glr): long-term memory
Visual processing (Gv): analysis and use of visual information
Auditory processing (Ga): analysis and use of auditory information
Processing speed (Gs): thinking quickly and automatically
Decision and reaction speed (Gt): coming to a decision and reacting swiftly
Researchers continue to work with the CHC model and have developed research programs looking into adding to the ten broad abilities. Many professionals believe that sensory and motor abilities need to be more fully included in this theory, and researchers are looking at “tentative” factors such as tactile abilities (touch), kinesthetic ability (movement), olfactory ability (smell), and psychomotor ability and speed. Wait a minute, you mean I can be a smart smeller?
Many intelligence researchers and practitioners have accepted CHC as a triumph of psychological science and the consensus model of psychometric conceptions of intelligence. It is, however, a working model, and many intelligence investigators and theorists consider CHC theory as a strong beginning or second act but not the final word on intelligence.
Adding in street smarts
Robert Sternberg developed the triarchic theory of intelligence in part to address the street smarts controversy, which holds that many intelligent people may be smart when it comes to academics or in the classroom but lack common sense in real life or practical matters. An urban myth claims that Albert Einstein, unquestionably gifted in mathematics and physics, couldn’t tie his own shoes. I don’t know if this is true or not, but Sternberg seems to agree that an important aspect of being intelligent is possessing a good level of common sense or practical intelligence. The three intelligence components of his theory are as follows:
Componential: Componential intelligence depends on the same factors measured by traditional intelligence tests (memory, verbal fluency, and so on). This is the book smarts aspect of intelligence. Sternberg emphasized that these abilities are often disconnected from ordinary life, issues, and problems. Einstein seemed to have possessed this component.
Experiential: Experiential intelligence encompasses the ability to deal with two different types of problems: new problems and routine problems. It requires the ability to recognize new problems, as opposed to everyday problems; search for and generate solutions; and implement solutions.
Contextual: Contextual intelligence is a type of practical intelligence that allows people to go about their daily lives without walking in front of cars, telling police officers to get lost, or letting the trash pile up to the ceiling. This is the street smarts aspect of intelligence that psychologists sometimes seem to lack in the eyes of their clients.
Excelling with multiple intelligences
Have you ever wondered what makes Michael Jordan such a good basketball player? What about Mozart, who reportedly wrote entire operas in one sitting without editing? That’s pretty impressive! According to Howard Gardner (1983), each of these men displays a specific type of intelligence.
Gardner generated a theory known as multiple intelligences from observing extremely talented and gifted people. He came up with seven types of intelligence that are typically left out of conventional theories of intelligence:
Bodily kinesthetic ability: Michael Jordan seems to possess a lot of this ability. People high in bodily kinesthetic ability have superior hand-eye coordination, a great sense of balance, and a keen understanding of and control over their bodies while engaged in physical activities.
Musical ability: If you can tap your foot and clap your hands in unison, then you’ve got a little musical intelligence — a little. People high in musical intelligence possess the natural ability to read, write, and play music exceptionally well.
Spatial ability: Have you ever gotten lost in your own backyard? If so, you probably don’t have a very high level of spatial intelligence. This intelligence involves the ability to navigate and move around in space and the ability to picture three-dimensional scenes in your mind.
Linguistic ability: This is the traditional ability to read, write, and speak well. Poets, writers, and articulate speakers are high in this ability.
Logical-mathematical ability: This intelligence includes basic and complex mathematical problem-solving ability.
Interpersonal ability: The gift of gab and the used-car salesman act are good examples of interpersonal intelligence. A “people person” who has good conversational skills and knows how to interact and relate well with others is high in interpersonal ability.
Intrapersonal ability: How well do you know yourself? Intrapersonal intelligence involves the ability to understand your motives, emotions, and other aspects of your personality.
Any one of us can have varying degrees of Gardner’s intelligences. I may be one heck of a baseball player and a singing math whiz, but I may also be unable to carry on a conversation, get lost walking home from the grocery store, and have no idea how I feel about all that.
Making the grade — on a curve
Psychologists like to measure stuff, especially stuff related to human behavior and thought processes, like cognitive abilities. Measuring and documenting individual differences is at the core of applied psychological science (find more on applied psychology in my free online article "Applying Psychology For a Better World" at www.dummies.com/extras/psychology).
Whether you subscribe to CHC, Sternberg’s model, or the concept of multiple intelligences, don’t forget the concept of average. Intelligence is considered to exist in the human population along what is called a normal distribution, a statistical concept that describes the range of any particular trait or psychological phenomenon across a population.
Individuals vary in how intelligent they are. A normal distribution (see Figure 6-1) assumes that if the full population took an intelligence test, most people’s scores would cluster around the average, with some variation from slightly below average to slightly above average. A normal distribution is also referred to as a bell curve because it looks like a bell, with a bulky center and flattened right and left ends. Most people are somewhere in the range of average intelligence; increasingly fewer people sit at intelligence levels closer to the highest and lowest ends of the spectrum.
Figure 6-1: Normal distribution.
At the high end of the intelligence curve are people considered intellectually gifted; at the low end are those considered intellectually disabled (see Chapter 13 for more on intellectual disability).
Shining bright
Einstein was a genius, right? So what exactly is a genius?
Psychologists typically refer to super-smart people as intellectually gifted rather than use the term genius, but there is no uniform cutoff score on an intelligence test that determines giftedness. An average standard intelligence score is 100, and, generally speaking, any score above 120 is considered superior. Giftedness is typically reserved for people in the top 1 to 3 percent of the population. That is, out of 100 people, only one, two, or three are considered gifted.
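Those percentages can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below assumes the conventional deviation-IQ scale (mean 100, standard deviation 15), which isn’t spelled out here, so treat the exact figures as illustrative:

```python
from math import erf, sqrt

def fraction_above(score, mean=100.0, sd=15.0):
    """Fraction of a normally distributed population scoring above `score`."""
    z = (score - mean) / sd
    # Survival function of the standard normal, via the error function
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

print(f"Above 120: {fraction_above(120):.1%}")  # roughly 9%
print(f"Above 130: {fraction_above(130):.1%}")  # roughly 2%
```

On that scale, a score of about 130 lands in roughly the top 2 percent, which squares with reserving giftedness for the top 1 to 3 percent, while the “superior” cutoff of 120 takes in closer to the top 9 percent.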
Many psychologists are wary of defining giftedness in such numerical and statistical terms and warn that cultural and societal context must be factored in. One culture’s genius is another culture’s madman? I’m not sure it’s that dramatic, but it is important to consider that giftedness is multifaceted and not so easily tied to a cutoff score.
Numerous attempts have been made to pin down a definition of intellectual giftedness. Eminent American psychologist Robert Sternberg proposed that giftedness is more than superior skills related to information processing and analysis; it also includes superior ability to capitalize on and learn from one’s experiences to quickly solve future problems and automatize problem solving. He proposed that gifted people are especially skilled at adapting to and selecting optimal environments in a way that goes beyond basic information processing and what is considered general intelligence or “g.”
Researchers continue to examine the concept of intellectual giftedness, and one consistent finding is that gifted individuals have stronger metacognitive skills, or knowledge of their own mental processes and how to regulate them. These three specific metacognitive strategies are often used by gifted individuals:
Selective Encoding: Distinguishing between relevant and irrelevant information
Selective Combination: Pulling together seemingly disparate elements of a problem for a novel solution
Selective Comparison: Discovering new and nonobvious connections between new and old information
Figuring Out Language
Intelligence is definitely a crowning achievement of cognition, bringing together component processes to produce learning, adapting beings (you and me!). But stopping with intelligence sells human cognition short, because another amazing and sophisticated set of processes accomplishes something just as remarkable: language.
The human mind produces, uses, and comprehends language, one of the most sophisticated and unique cognitive abilities humans possess. Yes, other species communicate with sounds and have “language” of sorts (think whales and birds), but when was the last time a dolphin told a story or wrote the Sus scrofa domesticus (pig) equivalent of Romeo and Juliet? (Would that be Hamlet?)
Language as a cognitive process has been extensively studied and hotly debated, and continues to be a central focus of cognitive psychology. The study of language in general is called linguistics, and the psychological study of language is called psycholinguistics.
Babel-On
Perhaps one of the most amazing things about language is that you and I ever learn how to do it. Babies don’t come out talking. They take their time, absorbing and learning. Eventually sounds, words, sentences, paragraphs, stories, and treatises all come out.
In Chapter 12, I introduce language developmental milestones to give you a sense of what kinds of language should be happening when. This section describes the underlying cognitive models of how language develops in the first place.
Numerous models of language development exist in the fields of linguistics and psycholinguistics, but the three most prominent are Nativist, Behaviorist, and Interactionist.
Nativist
Noam Chomsky, a philosopher, linguist, and political thinker, is the foremost proponent of nativist theory. The core of the nativist argument is that language is innate, essentially inborn, and wired into your DNA and brain development. In many ways it simply unfolds as does the development of the brain, liver, pancreas, and that weird-shaped birthmark on your back.
The rules of language are inborn in what Chomsky calls Universal Grammar. All speakers in the world, regardless of individual language differences, possess this universal grammar as part of their human genetic endowment. Children can even create their own languages, or various forms of slang with their own grammatical rules and structure, because of something Chomsky calls the language acquisition device, an inborn cognitive module or mechanism that is triggered by language in the environment. Note that, according to this view, the module is only triggered; no learning from the environment is going on.
Behaviorist
The behaviorist model says language is learned. People learn it by observing speakers in the world and through classical and operant conditioning processes (see Chapter 8 for more on classical and operant conditioning). Evidence in support of the behaviorist perspective includes the fact that it can take months and sometimes years for a person to develop proper language skills. Behaviorists believe this illustrates the learning process.
Interactionist
Nature or nurture? Both! Language is innate and learned according to this model. A prominent version of interactionist theory is the social interactionist approach, which holds that parents and mature language speakers model and provide a learning scaffold for language learners, guiding them toward mature and correct language use through social interaction.
Sounds, bites, pieces, and pieces
To understand how the mind does language, researchers have broken it down into different parts. Language, it seems, can be understood in terms of the rules of its use, known as grammar. Grammar is divided into three parts:
Phonology: the smallest units of speech (phonemes) that determine how sounds are used to make words
Syntax: ways in which words and phrases are combined to make sentences
Semantics: meaning
An estimated 800 or so different phonemes exist across human languages. English speakers use 52; some languages use well over 100. Think about how various languages sound. Some seem faster. Some use guttural sounds; others seem smooth and airy. Different languages simply have different sounds and different phoneme inventories.
Consider this sentence: Store the to Jon went. Does that sound right to you? Probably not, because the rules of syntax determine how words are put together in a way that makes sense in a given language. Also, subtle changes in positions of words within a sentence can change the meaning of a particular phrase and convey a very different thought. Take the following words, for example, and see how rearranging them changes the meaning of each phrase drastically: robbed, Louis, bank, the.
Louis robbed the bank.
The bank robbed Louis.
These mean two very different things. Same words, different meanings depending on syntax. Either way, poor Louis.
I’ve noticed an interesting phenomenon when conducting cognitive testing and administering vocabulary tests. Certain wrong definitions occur over and over again. I ask a person the meaning of the word “yesterday,” and she answers, “The things I did yesterday.” Now, I’m not sure what that means per se, but the fact that this definition, albeit not really the right one, occurs again and again is fascinating. To these people, “yesterday” refers to the set of actions they engaged in on the previous day. The word means something different to them than it does to me.
Because I don’t really know what they’re talking about, we aren’t really communicating or understanding each other. The other person isn’t responding to the semantic rule of grammar that determines the shared, agreed-upon definition of the word “yesterday.” Semantic rules establish that the meanings of words are widely understood and agreed upon, which makes communication possible. When you say “elephant,” I know you’re talking about a large animal with a long trunk because of semantics. This all applies to signed languages, too, which have their own grammar and which develop in the same way as spoken languages, provided a deaf child grows up in a rich signing environment.