
4. How do we use what is in the mind? Thinking, reasoning, and communicating

  • Gillian Butler
  • Freda McManus

Abstract

‘How do we use what is in the mind? Thinking, reasoning, and communicating’ introduces Daniel Kahneman's two systems of thinking, distinguished by their use of energy and different benefits. In order to function well and adapt as we go, we need to achieve a balance between knowing when to snap into action and when to stop and think. Psychologists studying problem-solving have been particularly interested in the way it is influenced by past experience — by information stored in memory. Inevitably, the two systems of thinking influence the ways in which we communicate. Humans have feelings as well as thoughts, which may help us to understand why we do the things that we do.

Behaving thoughtlessly, not stopping to think, being unreasonable or illogical, and being unable to explain our reasons for doing something are failings to which everyone is susceptible. The assumption is that when we do these things we are failing: we should think before we act, be thoughtful and reasonable, and be able to communicate clearly. However, psychologists have made some rather surprising discoveries about this type of behaviour, and some of the more recent ones are outlined next. In 2002, Daniel Kahneman, described by the psychologist Steven Pinker as ‘among the most influential psychologists in history’, was awarded a Nobel prize in economic sciences. Why should this prize be given to a psychologist? The answer is to be found in his work on thinking and reasoning. This work helped to define the new field of behavioural economics, to which various governments in the Western world now pay close attention. For instance, in 2011 the UK government established a number of behavioural insight teams whose task is to apply specific psychological findings to the business of government. Behavioural economists study the effects of social, cognitive, and emotional factors on the economic decisions of individuals and institutions; these decisions are subject to the same influences as the other judgements and decisions that we make, and are also central subjects for psychologists.

In Chapters 2 and 3 it was argued that what gets into the mind, and what stays there subsequently, is not solely determined by the nature of objective reality, but also by the processes involved in perception, learning, and remembering. If we can make sense of what we perceive, recall information when it is needed, and use it when we think, reason, and communicate, then we can make plans, have ideas, solve problems, imagine more or less fantastic possibilities, and tell others all about it.

Thinking: the costs and the benefits

Thinking uses up energy, and our use of that energy has evolved to be super-efficient. The human brain runs on about a quarter of the power of an 80–100 watt bulb, yet it controls everything we do: our senses, movements, internal states such as digestion, and our thinking. The evolutionary pressure has been towards systems that invest energy only when there is something to be gained—hence, perhaps, the relevance to economics. One of Kahneman's most important suggestions is that we have evolved two systems of thinking which are distinguished by their use of energy: a fast one that costs us little, and a slow one that costs much more. These two systems also have different benefits.

System 1 is the fast, intuitive, and automatic one. The benefit is that it takes minimal effort; the cost is that it means we take short-cuts and make mistakes. Much of the time it works well: complete the sentence ‘bread and … ’; add these numbers 2 + 2 = … ; decide when to overtake the car in front; recognize at a glance that someone is angry. These tasks take little effort or voluntary control, as we can do them without thinking—once we have acquired some linguistic, mathematical, driving or social skills. System 1 helps us to make intuitive judgements—so we escape from danger without having to think about it—and it makes use of the information we have learned and collected in associative memory—so we can distinguish surprises from normal events and automatically use acquired skills. Although System 1 serves us well most of the time, it also means that we jump to conclusions, relying on rules of thumb or ‘heuristics’, which introduce biases into our thinking and reasoning.

System 2 is slow, deliberate, and effortful thinking. It brings with it the benefits of conscious thinking and rationality, but has high energy costs. It requires attention, of which we have limited capacity. As we saw in Chapter 2, when our attention is fully occupied in counting the passes made by basketball players we are blind to the gorilla passing behind them. System 2 thinking (deciding which car to buy, filling in the tax return, explaining how to work the DVD player) is tiring. It can also be moderately aversive, as our brains have evolved to conserve energy and conscious thinking demands application. The ‘law of least effort’ operates throughout our thinking processes: if there are several ways of doing the same thing, people will gravitate towards the easiest one, which is why Kahneman described System 2 thinking as lazy. Paying attention helps us acquire and use our knowledge, but this comes at a cost—quite literally.

The division of labour between System 1 and System 2 is highly efficient: System 1 runs automatically and generates intuitions or impressions, while System 2 switches on when effort is needed. For example, I can easily guess whether I have been given the right change, but in order to check I have to stop and think. This division of labour can also draw on culturally and personally relevant information: for example, I immediately realize that my colleague is unwell and think about how to take the pressure off her during the meeting. System 2 is interrupted if attention is drawn away from it. Sometimes this is efficient: you lose the thread of the conversation when you hear a child yell (automatically alerted by System 1); at other times it is disruptive: you lose track when irritated by the munching of the person next to you in the cinema.

To make efficient use of a short supply of energy we tend to rely on System 1 whenever possible, and this accounts for some common mistakes. The risk we run is illustrated by one of Kahneman's examples. Read the following simple puzzle and allow your intuition to come up quickly with an answer.

A bat and a ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

What number did you come up with? The quick and easy (System 1) answer is 10 cents. But this intuitive answer is wrong: it leaves the bat costing $1.00, which is only 90 cents more than the ball. Engage System 2 to work it out and you will discover the correct answer. As you can quickly tell, this uses more energy and at least some concentrated effort (solution in Box 13).
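
For readers who want to see the System 2 working written out, the puzzle can be set as simple algebra (a standard rendering added here for illustration, not Kahneman's own notation; the answer also appears in Box 13):

\[
\text{ball} = x,\qquad \text{bat} = x + 1.00,\qquad x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05 .
\]

So the ball costs 5 cents and the bat $1.05, which is indeed one dollar more.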

System 1 automatically makes use of associative links in the brain, including links to the body and to emotions, which subsequently influence our actions and feelings—again, in ways that we are unaware of and therefore cannot prevent. Research on priming is revealing. For example, after being exposed to signs of money (floating dollar bills on a screensaver) people behaved differently: they became more independent, persevered longer with difficult problems, sat further away from others, and were less helpful (they picked up fewer of the pencils dropped by a clumsy research confederate). After being exposed to words associated with old age in the USA (Florida, forgetful, bald), hidden in scrambled sentences and without any mention of age, young people subsequently walked more slowly down the hallway as they left the building. Even though they were completely unaware of noticing these words, seeing them influenced their actions and produced an ideomotor effect. This works the other way round as well: ask people to walk slowly and they become faster at recognizing words associated with old age. Thinking about stabbing a colleague in the back leaves people more inclined to buy soap, disinfectant, or detergent rather than batteries, juice, or candy bars. As Kahneman puts it: ‘feeling that one's soul is stained appears to trigger a desire to cleanse one's body’—the ‘Lady Macbeth effect’. The effect even involves links to different parts of the body: if you ask someone to tell a lie to a stranger over the phone, they later prefer mouthwash over soap; telling the lie by email shifts their preference to the soap. These findings support theories of embodied cognition, which suggest that almost all aspects of cognition depend on and make use of ‘low-level’ facilities such as the sensorimotor system and emotions, and so are rooted in the body as well as the mind. The degree to which our behaviour is influenced by System 1 (thinking that is effortless, automatic, and inaccessible to reflection) is particularly well illustrated by the experiment in Box 8.

Analysis of System 1 shows that our thinking is, and always will be, subject to influences that we cannot be aware of. Indeed, thinking consciously about some activities that have become automatic (running downstairs) is remarkably disruptive. Relegating them to the subconscious increases efficiency, allowing us to do them without thinking even at the cost of occasional absent-mindedness—putting the frozen peas in the bread bin, or driving home and forgetting to make a planned detour to the postbox on the way. It leaves spare thinking capacity for more important matters. The study of such cognitive failures (e.g. absent-mindedness) shows that they increase with stress, fatigue, or confusion, and can be reduced by ‘stopping to think’.

Non-conscious mental activities demonstrably affect our thinking even though they remain outside awareness. Solutions to problems, or creative ideas, may pop into our heads apparently without previous thought, enabling us to see new ways forward: how to negotiate a deal or secure a broken window.

Box 8 Hidden influences on our behaviour

Near an honesty box in which people placed contributions to a coffee fund, researchers at Newcastle University in the UK alternately displayed images of eyes and of flowers. Each image was displayed for a week at a time. During the weeks in which eyes were displayed, bigger contributions were made than in the weeks with flower images. Over the ten weeks of the study, contributions during the ‘eyes weeks’ were almost three times higher than those made in the ‘flowers weeks’. It was suggested that ‘the evolved psychology of co-operation is highly sensitive to subtle cues of being watched’, and that the findings may have implications for how to provide effective nudges towards socially beneficial outcomes.

In this ‘real world’ field study, the honesty box was run by someone likely to be known to the participants, which might have influenced their behaviour. A follow-up study measuring the amount of litter left in a large cafeteria also found that people were less likely to litter in the presence of posters of eyes than of flowers, and that their behaviour was independent of whether the posters exhorted people to clear up or displayed unrelated messages. There appears to be good support for strong links between images of eyes, the sense of being observed, and the decision to engage in this type of cooperative behaviour.

Bateson, Nettle, and Roberts (2006); Ernest-Jones, Nettle, and Bateson (2011).

More surprisingly, we can also make a decision to act without being aware of doing so. Olympic sprinters can take off within one-tenth of a second of the starting gun, before they can consciously perceive its sound, and changes in brain activity can be identified before people are aware of their intention to move. Did they make the decision to move? Or not?

Reasoning: using your head

One might suppose that reasoning using System 2 is more reliable—and allows us consciously to notice, or to ignore, information that could otherwise influence us unawares, such as the suggestion in advertisements that success comes with owning particular expensive products. We acquire the building blocks of rationality: we think with images and with words; we use concepts, create them, define them, recognize their clear or fuzzy boundaries, and use them to define prototypes and to recognize stereotypes. We arm ourselves with the skills needed to lead rational lives, such as methods of deductive or inductive reasoning. However, even these are demonstrably influenced by psychological processes. Deductive reasoning follows formal rules of logic, allowing us to draw conclusions which necessarily follow from the premises. From the two premises ‘everyone with fair hair has blue eyes’ and ‘Sam has fair hair’ we can validly draw the conclusion that ‘Sam has blue eyes’. The conclusion may be false if either of the premises is false (as the first one clearly is), but the reasoning remains valid. Yet even when we succeed in reasoning logically, biases and mistakes creep in. For example, our thinking is biased towards reinforcing our current beliefs and away from accepting information that contradicts them. Research results showing that smoking causes cancer, or that the performance of a group of skilled investors on the stock market remained at chance level, were unwelcome to those selling cigarettes or stocks, and (initially) difficult for them to accept.
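
Written out in standard logical notation (an editorial illustration, not part of the original text), the fair-hair argument is an instance of universal instantiation followed by modus ponens, and its validity does not depend on whether the premises are true:

\[
\forall x\,\bigl(\text{FairHair}(x) \rightarrow \text{BlueEyes}(x)\bigr),\qquad \text{FairHair}(\text{Sam}) \;\vdash\; \text{BlueEyes}(\text{Sam}).
\]

The inference is valid whatever the facts turn out to be; it is sound only if both premises are also true, which the first premise here plainly is not.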

Inductive reasoning involves drawing conclusions that are probably true even though information yet to be discovered might show them to be false. It is commonly used in everyday life: ‘Mary criticized what I said and dismissed my arguments out of hand’—‘Therefore Mary is a critical person’. It often works well but is also subject to common biases: for example, seeking out information that confirms our conclusions (or suspicions) rather than going through the effortful process of looking for disconfirming information (in this case, that I make many mistakes rather than that Mary is always critical). As William James put it, ‘a great many people think they are thinking when they are merely rearranging their prejudices’.

System 2 thinking, when the effort is put in, helps us to be rational. It enables us to work things out: to follow rules and to make deliberate comparisons, choices, and decisions. But it has limited capacity and needs to conserve energy. Its activities are determined by the workings as well as by the structure of our brains, and honed by our evolutionary history, with the result that we are averse to mental effort. So we engage System 1, employ effort-reducing heuristics, and remain subject to biases. One of the big debates, for instance in behavioural economics, has been about the degree to which we behave as rational beings, and many of the findings suggest that we do so far less than we suppose.

Anchoring is perhaps the best known, and most pervasive, cognitive bias that distorts our powers of reasoning. Kahneman and his colleague Tversky rigged a wheel of fortune, marked from 0 to 100, to stop only at the numbers 10 or 65. After spinning the wheel, they asked participants in this experiment some completely unrelated questions, such as ‘What is your best guess of the percentage of African nations in the UN?’. Those who saw the number 10 guessed 25 per cent; those who saw the number 65 guessed 45 per cent. They were all influenced by a completely irrelevant number, and their guesses were ‘drawn’ towards whichever anchor they had seen. When valuing a house, estate agents primed with a high, but irrelevant, anchor come up with a higher value than those primed with a low anchor—and a higher starting price can suggest a higher value, and bring in higher offers. This appears to be both a matter of suggestion (about value) and a consequence of failing to adjust sufficiently despite knowing about the bias (that the asking price could be inflated). If you ask people to nod or to shake their heads, as if they were saying yes or no, they adjust less (stay closer to the anchor offered) if they nod than if they shake.

The availability heuristic involves estimating the probability of a certain type of event on the basis of how easy it is to bring to mind relevant instances. The more readily available, the more likely it will seem to us to be. So when the printer does not work I check whether I turned it on, as my usual mistake springs readily to mind. The heuristic often brings with it problem-solving advantages that outweigh its disadvantages. The main disadvantage is that there are many determinants of availability—of what springs readily to mind—such as whether information has been accessed recently, is especially vivid, or emotionally charged, and all of these factors may be logically irrelevant. People who are frightened of flying tend to overestimate the likelihood of plane crashes, but they do so more dramatically if they have recently read about a crash. Some more of the biases that influence our thinking are listed in Box 9.

In most areas of life and much of the time, we are making judgements and decisions under conditions of uncertainty. We are thinking about what to do, or about what will happen, without knowing the answers. Will it rain? Can I afford a holiday? How am I doing at work? We have the ability to reason logically and to avoid some obvious sources of irrationality, and we can save energy by switching into automatic modes without putting our lives at risk (driving on the motorway while having an interesting conversation). When we are faced with problems, both systems have their uses.

Psychologists studying problem-solving have been particularly interested in the way it is influenced by past experience—by information stored in memory. It sounds obvious that, in general, we solve problems more easily as we accumulate experience. This is known as the positive transfer effect, and it helps to explain why adults solve problems more easily than children, and experts solve them more easily than novices.

Box 9 Sources of some typical thinking errors

Base-rate neglect: Making a judgement about the probability of an outcome (that your business will succeed) while ignoring the general base rate (a 25 per cent success rate in this area).

Framing: Different ways of presenting the same information evoke different emotions (the glass is half full or half empty), and different decisions.

Halo effect, or exaggerated emotional coherence: Noticing one good (or bad) characteristic leads to the assumption that the rest are also good (or bad).

Intensity matching: Assuming that attributes that can be measured on a dimension are easily matched, e.g. Jim is as tall as he is clever.

Loss aversion: Losses loom larger to us than gains. We will work harder not to lose £500 than to win £500.

Overconfidence: We overestimate the amount we know and underestimate the role of chance.

Endowment effect: The tendency to value something more highly when we own it than when someone else owns it.

Experts are better at solving a chess problem, for instance, but both novices and experts benefit from a period of incubation during which they are not (consciously) thinking about the problem at all. Once a strategy for solving a problem has been identified, it may take skill to apply it (rescuing the curdled mayonnaise), and reasoning skills are needed to evaluate progress. Experts are demonstrably better at recognizing patterns, retrieving relevant rules, and eliminating dead-end strategies. But experts can also fail to solve problems precisely because they use the same strategies and rules as they have used to solve previous problems. Developing a mental set saves us from having to reinvent the wheel each time, but it slows us down when we are faced with a new set of difficulties. It is remarkable how blind experts can become (see Box 10).

Box 10 Mental set

University students were presented with a problem which involved looking at a series of cards, on each of which were written the two letters A and B, and working out the ‘correct’ sequence (e.g. the letter on the left should be selected on the first card, and the letter on the right on the second card). After several ‘position sequence’ problems had been solved, the type of problem was changed so that selecting the letter A was always correct and the letter B was always incorrect. Eighty per cent of the students failed to solve this trivial problem within 100 trials, and none of those who had failed selected the correct solution from amongst six possibilities.

Levine (1971).

Functional fixedness, or thinking about objects only in terms of their functions, is another kind of mental set. An envelope is something to put a letter into rather than a container for sugar when you are having a picnic. Solving the sugar problem requires thinking about envelopes in new and creative ways. Creativity has been measured in various ways: for example, by testing the degree to which people think divergently, exploring ideas freely and generating many solutions, or convergently, following a set of steps which appear to converge on one correct solution to a problem. The more uses people can think of for a common object such as a house brick, the more divergent or creative they are said to be. Creativity can be used in many ways: creative people are better than others at rationalizing their actions, and have been observed to cheat more too.

We know that creativity is present at an early age: that young children can use familiar concepts in new and imaginative ways, and that environments that foster independent thinking in a safe way increase creativity. Creativity is important in the arts, in science, in the kitchen, or in the office, and it may confer adaptive advantages by fostering the inventiveness needed in constantly changing conditions. It requires flexibility of thinking and an ability to step over boundaries (see Box 11), and, surprisingly to some people, it is only weakly correlated with intelligence. Characteristics such as nonconformity, confidence, curiosity, and persistence are at least as important as intelligence in determining creativity.

Communicating: getting the point across

Inevitably, the two systems of thinking illustrated throughout this chapter influence the ways in which we communicate. Much of the time we are guided by the impressions from quick and easy System 1 thinking, so communication will be more effective if it grabs attention: using simple expressions or rhymes (‘woes unite foes’ instead of ‘woes unite enemies’), or using bright, clear colours for writing. Communication is also subject to the same biases. People respond to requests to pay off their credit card bills differently according to the anchor with which they are presented. Suggesting a low minimum payment leads to people paying less than if a high minimum payment is suggested, and results in lower actual payments overall.

Box 11 The 9-dot problem

Using no more than four straight lines, and without lifting the pen from the paper, connect all the dots in the diagram below.

• • •

• • •

• • •

See Box 13 for solutions.

The higher the suggested minimum payment, or anchor, the more people pay, and the greater the chance that they will pay off their bill completely.

The processes that underlie thinking clearly influence our understanding and our behaviour. The theory of linguistic relativity suggested that language fosters habits of perception as well as of thinking, and that different views of reality are reflected in different languages: Eskimos supposedly had many different words for snow. However, this claim has been described by Steven Pinker as ‘The Great Eskimo Vocabulary Hoax’, with no supporting evidence behind it. We can understand the distinctions made in languages other than our own, but linguistic information alone does not settle the point. The experiment in Box 12 demonstrates how a combination of clear thinking, accurate observations, and cultural awareness may help to provide an answer to such questions.

Concluding points

Work on the cognitive skills involved in thinking, reasoning, and communicating is still expanding, focusing for instance on the acquisition and development of these abilities, on problems arising with them, on interactions between them, and on associated activities in the brain. Perhaps the point to emphasize is that, in order to function well and adapt as we go, we need to achieve a balance between knowing when to snap into action and when to stop and think. If we operated entirely on the basis of logic, like a computer, we would be unable to adapt flexibly to the complexities and uncertainties of the everyday world. Hence there are still some respects in which our abilities appear superior to those of artificially intelligent machines, even though the machines may have larger memories and be able to test hypotheses faster than we can. In particular, of course, we have feelings as well as thoughts, which may help us to understand why we do the things that we do.

Box 12 Does language influence the acquisition of mental skills?

Children speaking Asian languages do consistently better at mathematics than English-speaking children, and their number words reflect a base-10 system (e.g. 12 is represented as ‘ten-two’). First-year school children from three Asian and three Western countries were asked to stack blue blocks, each representing 10 units, and white blocks, each representing 1 unit, into piles to show particular numbers. More Asian than Western children made two correct constructions for each number. The Asian children used the 10-unit blocks more than the Western children did, and the Western children used the single-unit blocks more than the Asian children did.

Conclusion: language differences may influence mathematical skills.

The evidence is strengthened by the finding that bilingual Asian-American children also score more highly on mathematical tests than do those who speak only English.

Miura and colleagues (1994).

Box 13 Solutions

Solution to bat and ball problem: 5 cents.

Solution to 9-dot problem, Box 11: This problem can only be solved by continuing some of the lines outside the boundary of the square defined by the dots, or by breaking the ‘boundary’ in some other way: e.g. cutting the dots into three rows and arranging them in one continuous line.