
Class 1

Illusions in reasoning

Psychologists, philosophers, cognitive scientists, economists and computer scientists all take an interest in how human beings reason. In other words, they all take an interest in how people like us form beliefs about the world or about themselves and how they make decisions. The experts usually distinguish theoretical from practical reasoning. Theoretical reasoning (also called theoretical rationality) concerns the way we form beliefs and especially the way we form beliefs on the basis of evidence derived from other beliefs. To explain: each of our beliefs is formed in one of two ways. (1) Non-inferentially: some beliefs are just immediate to us; they are formed via the information we receive through our senses. They are not inferred from other beliefs. For example, our basic perceptual beliefs, such as my belief that there is a tree outside this window, are probably formed in this way. (2) Inferentially: many of our beliefs are formed on the basis of (that is, inferred from) other beliefs. The inferential processes required for such belief formation are the ones that usually feature in discussions of theoretical reasoning. For example, let's imagine that I believe it's Wednesday today. Let's also imagine that I believe I have a class every Wednesday at 1 o'clock. Well, because I believe both these things, I had better believe that I have a class at 1 o'clock today. Logic demands that I hold this last belief. Furthermore, if I don't believe I've got a class at 1 o'clock today, I'm likely to get myself into some practical difficulties. What I believe in this case matters for how things go for me. (Not all our beliefs matter, though; it is likely that you have beliefs about things that have no impact on decisions that you will ever be required to make. What about your belief (or lack thereof) about aliens building the pyramids, or the existence of life on Europa? What other examples of unimportant beliefs can we come up with?) 
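The Wednesday example is an instance of a very old logical rule, modus ponens: from "if P then Q" and "P", conclude "Q". Purely as an illustration (none of this code is part of the handout, and the strings are just labels standing in for beliefs), the inference can be mimicked in a few lines of Python:

```python
# Illustrative sketch only: belief formation by inference, as in the
# Wednesday example. Beliefs are modelled crudely as a set of sentences.
beliefs = {"today is Wednesday",
           "I have a class every Wednesday at 1 o'clock"}

def infer(beliefs):
    # One hand-coded rule, standing in for modus ponens: if both
    # premises are believed, the conclusion is added to the belief set.
    if ("today is Wednesday" in beliefs
            and "I have a class every Wednesday at 1 o'clock" in beliefs):
        return beliefs | {"I have a class at 1 o'clock today"}
    return beliefs

beliefs = infer(beliefs)
print("I have a class at 1 o'clock today" in beliefs)  # prints: True
```

The point of the toy model is just that the new belief is not given by the senses; it is generated from two old beliefs by a rule.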
These issues about the logical processes involved in weighing up evidence and forming new beliefs are issues about theoretical reasoning. Practical reasoning (also called practical rationality) concerns the way we make decisions regarding how we will behave. In particular, people who study practical reasoning are interested in how our beliefs and desires combine to produce intentions to act. For example, let's carry on the mundane (well, Wednesdane) story. I believe that I have a class at 1 o'clock. Let's say that I also want very much to be in that class (perhaps they serve drinks or give away twenty-dollar notes in that class). Well, it makes good sense for me to say that because of what I believe and desire, I intend to be at the class at 1 o'clock today. In other words, I intend to perform the action of getting myself along to the class at 1 o'clock today. This is all very obvious, but it's worth spelling out because it reminds us that what we do (how we behave) is a product of what we believe and what we desire.

(Don't be misled by the terms theoretical and practical. They are just the names given to two kinds of reasoning. Both kinds of reasoning are equally practical, in the standard sense of that word. This course focuses, as it happens, on theoretical reasoning - on how we do and how we should form new beliefs on the basis of old ones. Bear in mind, though, that any new beliefs we form via the processes of theoretical reasoning are then available as inputs into our intention-forming processes and can help to determine which actions we perform. So theoretical reasoning is often used in the service of practical reasoning.) Cognitive psychologists are interested in how we do reason. What beliefs do we form? What decisions do we make (that is, what intentions do we form)? And why? Philosophers of rationality, on the other hand, are interested in how we should reason. What kinds of belief-forming and decision-making processes are best suited to helping us find our way through the world and achieve our aims?

The Problem
People make a lot of mistakes when they form beliefs and intentions. Not only that, they make systematic mistakes - ones which psychologists can predict they will make (that's why they are systematic) and which philosophers can show to be not in their best interests (that's why they are mistakes). Here's an example of a common mistake from the field of theoretical rationality. (There will be many more examples throughout the course.) Let's pretend that I have a son and that Hilda is a young woman whom I would like him to marry. I might see my son talking in an animated way to Hilda. I might see them together often. I would be right to believe that they hold animated conversations and that they are together often. But would I be right to conclude from this that they are interested romantically in each other? Probably not, at least, not on the basis of only this evidence. But wishful thinking can take us the extra mile and I might be led to conclude that they are interested in each other on the basis of this slender evidence, just because this is the outcome I want. My desire for them to get together might blind me to other explanations for the enthusiasm they seem to show for each other. (What other explanations might there be?)

Daniel Gilbert is a professor of psychology at Harvard and he has this to say about wishful thinking. (From I'm O.K., You're Biased by Daniel Gilbert, New York Times: Opinion, April 16, 2006.) The human brain knows many tricks that allow it to consider evidence, weigh facts and still reach precisely the conclusion it favors. When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn't misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn't, we subtly tip the scales in our favor. 
Research suggests that the way we weigh ourselves in the bathroom is the way we weigh evidence outside it. Two psychologists, Peter Ditto and David Lopez, told subjects that they were being tested for a dangerous enzyme deficiency. Subjects placed a drop of saliva on a test strip and waited to see if it turned green. Some subjects were told that the strip would turn green if they had the deficiency, and others were told that the strip would turn green if they did not. In fact, the strip was just an ordinary piece of paper that never changed color. So how long did subjects stare at the strip before accepting its conclusion? Those who were hoping to see the strip turn green waited a lot longer than those who were hoping not to. Good news may travel slowly, but people are willing to wait for it to arrive. The same researchers asked subjects to evaluate a student's intelligence by examining information about him one piece at a time. The information was quite damning, and subjects were told they could stop examining it as soon as they'd reached a firm conclusion. Results showed that when subjects liked the student they were evaluating, they turned over one card after another, searching for the one piece of information that might allow them to say something nice about him. But when they disliked the student, they turned over a few cards, shrugged and called it a day.

Here's another example of a systematic mistake in reasoning, this time from the field of practical reasoning. Suppose somebody buys me a $30 CD voucher as a gift. I don't normally buy CDs; I usually download from iTunes and share music files. But that's OK; there is plenty of my kind of music on CD. However, this is a one-off. I am not likely to be buying another CD in the foreseeable future, not least because I never have much money to play with. The first thing I notice in the shop is that there aren't any CDs which cost exactly $30. That's a pain because I know they won't give me change for my voucher. I narrow the choice down to two options: the latest outing by the Cheapskates, selling for $28.95, and the Dearstalkers' debut album, selling for $32.90. 
I would be equally happy to own either CD, so which should I buy? I reason as follows. If I get the cheaper one, I won't get any change. Thus I will in effect be spending thirty dollars on a $28.95 item, so I will be wasting some of the voucher's buying potential. In order to make the most of the voucher, I should buy the dearer CD and pay the extra $2.90 out of my own money. So that's what I do. I buy the dearer CD. Arguably, I made the wrong call. Remember: I am equally happy to own either CD. The voucher was a gift, so if I exchange it for the cheaper item, I have a free CD. It cost me nothing and I am happy to own it. On the other hand, the dearer item cost me $2.90 of my own cash, so although I have a CD I am happy with, it comes at a small price. So although there's not much money involved, there is a good reason for thinking that the cheaper option is the better option. (What do you think? The people who sell CD vouchers know all about this kind of purchasing decision, of course.)

Michael Scriven and Richard Paul, who have written extensively on these issues, summarise our general problem as follows: "Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed or downright prejudiced. Yet the quality of our life and that of what we produce, make, or build depends precisely on the quality of our thought. Shoddy thinking is costly, both in money and in quality of life. Excellence in thought, however, must be systematically cultivated." (Scriven & Paul, www.criticalthinking.org/about CT.)

A well-cultivated thinker, write Scriven and Paul, (1) raises vital questions and problems, formulating them clearly and precisely; (2) gathers and assesses relevant information, using abstract ideas to interpret it effectively, comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards; (3) thinks open-mindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; and (4) communicates effectively with others in figuring out solutions to complex problems. These are important character traits, and acquiring them will prove to be useful in virtually any academic discipline, any profession and any domestic or commercial environment in which making the right decision matters. Crucially, making the right decision often requires avoiding the systematic errors we have been describing. As noted above, psychologists know that we make these sorts of mistakes and they describe patterns of thought in which our reasoning is highly susceptible to these systematic illusions. (Psychology is, after all, the scientific study of how the mind works--scientific, because its theories must be testable against experiments and other controlled observations.) Philosophers, meanwhile, come up with accounts of what good reasoning is like. (Philosophy is, roughly, the study of fundamental questions which cannot (as yet) be usefully investigated by testing theories against the results of experiments and other careful observations. Philosophers, typically, must rely on reasoning alone to corroborate their theories, so they need a good account of good reasoning.) 
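Returning for a moment to the CD voucher: the decision comes down to simple arithmetic, since I am equally happy either way and only the out-of-pocket cost differs. As a purely illustrative sketch (the two prices and the $30 voucher are taken from the example; everything else is made up for illustration):

```python
# Illustrative sketch of the CD-voucher arithmetic from the example above.
# Only the prices and the $30 voucher come from the text.
VOUCHER = 30.00
PRICE_CHEAP = 28.95   # the Cheapskates CD
PRICE_DEAR = 32.90    # the Dearstalkers CD

def out_of_pocket(price, voucher):
    """Cash I must add myself; no change is given for unused voucher value."""
    return round(max(price - voucher, 0.0), 2)

cost_cheap = out_of_pocket(PRICE_CHEAP, VOUCHER)  # 0.0: the CD is free to me
cost_dear = out_of_pocket(PRICE_DEAR, VOUCHER)    # 2.9: paid from my own cash

# Equally happy with either CD, so the lower out-of-pocket cost wins,
# even though the cheaper choice "wastes" $1.05 of voucher value.
print(cost_cheap, cost_dear)  # prints: 0.0 2.9
```

The "wasted" $1.05 of voucher value is a sunk benefit either way; only the cash I hand over varies between the options.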
It has been common since the 1960s to apply the term critical thinking to what we are calling good reasoning. Here are Scriven and Paul again. "Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action." (ibid.) Scriven and Paul distinguish two components of critical thought--of good reasoning: (1) "A set of information and belief generating and processing skills." (ibid.) (This is excellence in theoretical reasoning.) (2) "The habit, based on intellectual commitment, of using those skills to guide behavior." (ibid.) (This is excellence in practical reasoning.) Wouldn't it be great if we could get people to reason in accordance with principles of good reasoning? Wouldn't it be wonderful if we could teach a way of thinking which would guarantee that we formed true beliefs and performed right actions most of the time?

Our Brief for this Course


It would indeed be wonderful if we could teach methods for reasoning correctly. Unfortunately, psychologists' investigations into the way we reason and the way we learn are not sufficiently advanced to ensure a reliable result. What we can do is explore the principles of good reasoning (of Critical Thinking, if you like) that are well understood. We can also explore the sorts of systematic mistakes that people make: the fallacies they commit. Hence, our brief in this course is to: (1) Investigate the principles of good reasoning that philosophers have identified. (2) Investigate the systematic and common failures in reasoning that psychologists have identified. (3) Consider how the principles of good reasoning can help us avoid the systematic failures in reasoning. The hope is that we will start to reason according to the principles of good reasoning more often than we do now and that we will learn to avoid the fallacies. That will, among other things, make us less susceptible to marketing ploys, less manipulable by politicians, better able to engage in public debate, more useful on juries and more likely to make smart decisions. That, at any rate, is what we hope to achieve. The first step is to get some basic ideas from logic under our belts so that we can formulate our principles of right reasoning. Then we will start exploring the fallacies: the systematic divergences from good reasoning patterns.
