
6.3 Consequential fallacy
Politics can enter discussions about science (behavior genetics, in our case) through another avenue: not by focusing on causes (motivation) but on consequences. Here the idea is that envisaged consequences of the possible acceptance (or rejection) of a given scientific theory should be an input in deciding whether that theory should be accepted (or rejected). This reasoning is sometimes called the consequential fallacy.[8]

[6] Similarly, the only publication to which Peter Singer refers the reader for a critique of The Bell Curve is a paper about its tainted sources (Singer 1996: 229).

[7] Actually no author has more references in Mackintosh's bibliography than Lynn. (And yes, Rushton is quoted there too.)

[8] Closely related to accepting or rejecting a claim because of its source (rather than because of evidence) is accepting or rejecting a claim because of the harm or good that might be caused by holding the belief (its consequences). This fallacy has no special name ("the consequential fallacy" has been suggested) but "it can occur whenever our desires intrude on our reasons for belief" (Salmon 1995: 187).

Philip Kitcher defends it with the following argument in Vaulting Ambition:[9]

    Everybody ought to agree that, given sufficient evidence for some hypothesis about humans, we should accept that hypothesis whatever its political implications. But the question of what counts as sufficient evidence is not independent of the political consequences. If the costs of being wrong are sufficiently high, then it is reasonable and responsible to ask for more evidence than is demanded in situations where mistakes are relatively innocuous. (Kitcher 1985: 9)

Kitcher also defended the same position more recently:

    where the stakes are high we must demand more of those who claim to resolve the issue. So, it was relevant to point to the political consequences of accepting some of Wilson's claims about human nature because doing so makes us aware of the need for more rigorous arguments and greater certainty in this area. (Kitcher 1997: 280; italics supplied)

[9] Although this book is an attack on sociobiology, it is interesting that the theory that Kitcher picks out in the stage-setting, melodramatic first chapter to illustrate political dangers of hereditarianism is not sociobiology but heritability of IQ (see section 1.5.2). As others have also noted: "Sir Cyril Burt's name is used to give a foul smell to research quite unconnected to his own; for instance, by Kitcher in his assessment of pop and human sociobiology" (Caryl & Deary 1996).

Note that Kitcher talks about acceptance of theories, not just about using or applying theories, or about acting on the assumption that a given theory is true (or false), where the expected utility approach would indeed be perfectly appropriate. His concern is not about application but about acceptance.[10]

[10] The idea that politics should shape epistemic standards is also defended by Anne Fausto-Sterling: "I impose the highest standards of proof, for example, on claims about biological inequality, my high standards stemming directly from my philosophical and political beliefs in equality" (Fausto-Sterling 1992: 11-12).

What does Kitcher mean by acceptance? As Arthur Fine said, acceptance is a "specification-hungry concept" (Fine 1990: 109). It is "nicely ambiguous, allowing for various specifications (accept as true, as useful, as expedient, for the nonce, for a reductio, etc.)" (Fine 1991: 87). There are three reasons to think that by "accept" Kitcher actually meant "regard as true". First, since this is a commonsense interpretation of theory acceptance (cf. Worrall 2000: 349) and since Kitcher did not explain the meaning of that term, probably the commonsense connotation was intended. Second, since his book Vaulting Ambition was written as much for philosophers as for scientists and the general reader, it is unlikely that he used the term "acceptance" in some esoteric philosophical sense. Third, and most importantly, Kitcher himself connects acceptance with truth when, in arguing for the need to look at political implications when deliberating about acceptance, he says that a theory choice cannot be just about truth, pure and simple: "Lady Bracknell's reminder is apposite: the truth is rarely pure and never simple" (Kitcher 1985: 9).[11]

[11] Not that it matters much, but it was not Lady Bracknell who said that. It was her nephew Algernon.

Kitcher's basic worry is that the fact of scientists merely accepting a given hypothesis (regarding it as true) can have bad social and political effects. For, it is on the authority of science that the wider public might also accept that hypothesis, and through a presumably complicated process it might all lead to some very bad outcome. Should we follow Kitcher and decide to raise the hurdle for accepting dangerous theories? No, for two reasons: (1) it would lead to irrational behavior, and (2) it would not achieve its purpose.

6.3.1 Irrationalism with a human face

The target of our cognitive efforts (in science and otherwise) is truth. That is, by using fallible truth indicators we try to judge whether H is true or false, and the way we normally function is that only something that we regard as relevant evidence can sway us into accepting or rejecting H. Kitcher's recommendation that political considerations should play a role in determining when evidence is sufficiently strong for acceptance is a recipe for epistemic irrationality. He exhorts us to "over-believe" theories with beneficial political consequences, and "under-believe" theories with harmful political consequences. (The terms "over-belief" and "under-belief" come from Haack 1996: 60.) The result is that the fine Humean advice that a wise man proportions his belief to the evidence is thereby being replaced with the advice that a wise man proportions his belief (at least in part) to the envisaged consequences of his belief.

It is interesting that the Kitcherian consequence-based approach to the evaluation of theories was very common in Hume's time, but then the main concern was over religious and moral repercussions rather than political consequences. Hume condemned that approach in clear terms: "There is no method of reasoning more common, and yet none more blamable, than, in philosophical disputes, to endeavour the refutation of any hypothesis, by a pretence of its dangerous consequences" (Hume 1999: 160).

Of course, Kitcher does not advocate the idea that the acceptance of scientific theories should always or exclusively be shaped by their consequences. Yet to the extent that he does so, he indeed urges scientists to behave irrationally. Since his proposal is inspired by good intentions it can be called irrationalism with a human face. Take a scientist who accepted H on the basis of what he regarded as adequate reasons. Suppose that he realizes later that his accepting H could have some unwanted social consequences. Is it not glaringly obvious that, as long as he remained epistemically rational, he simply could not reject H just because this would be more beneficial, politically?

One can imagine extreme situations in which one would look at envisaged political consequences in deliberating whether to publicly defend certain scientific views.[12] But Kitcher is not talking about this. He is talking about accepting these views. To see how radical his claim is, consider a non-scientific example. I accept that Oswald killed Kennedy. Now suppose that I am told, credibly, that something horrible will happen to me if I continue accepting the Oswald theory and if it turns out that the theory is false. Obviously, under the circumstances it would be in my strong interest to apply higher standards of evidence to my opinion, in the hope that in this way I will be able to get rid of it, and thereby avoid the danger of horrible consequences. But how could I do it? As much as I might wish to become a skeptic about the Kennedy assassination, I simply could not transform myself into one, merely on account of the possible consequences. True, if given enough time to weave a web of self-deception, I could manipulate my own cognitive abilities (by self-indoctrination, hypnosis, brainwashing, etc.) and I could eventually end up being unsure whether it was Oswald or Lyndon Johnson who killed Kennedy, but this just illustrates that the project can only be executed in an irrational way.

[12] In my opinion, such a decision would be justified only under very exceptional circumstances. For example, a scientist living under a totalitarian regime would have a moral obligation not to defend certain views that could be grist for the mill of the oppressive ideology (mainly because he would not have an opportunity to explain the true meaning of his opinion). I think that this kind of scientific self-censorship in theory choice is almost never called for in liberal-democratic countries.

Another misconception that clouds the issues is the idea that if I accept H (i.e., regard H as true), then this automatically implies that, in any situation, I will behave as if H is true. This is wrong. Acceptance is not so rigidly linked to action. For example, I do accept the Oswald theory about Kennedy's assassination, but I would certainly not bet the life of my daughter on it. Similarly, if scientists accept H, this means that they think H is much better empirically supported than its rivals, that the evidence for H is very strong, etc., but it does not mean that they are advising people to behave as if H is true on all occasions. After one hears the scientific opinion, even if one trusts it completely, one still has to use one's own judgment in deciding what to do. For example, if geologists say that they believe that no big earthquake will hit Texas in the near future, this makes it reasonable for Texans not to buy earthquake insurance for their homes, and in general to go about their lives as if a big quake will never happen. But it certainly does not make it reasonable for people who build nuclear power plants in Texas to behave in the same way. They also probably trust geological science, yet it is their duty to prepare for the eventuality that the accepted scientific prediction is wrong. Kitcher's idea that scientists should accept hypotheses depending on the consequences sounds quite odd here. Wouldn't it be strange if geologists informed the general public that Texas is in the seismic zone 0 (i.e., at virtually no risk of damage from earthquakes), but if they then told the nuclear power plant builders that, given the more serious consequences of a mistake in this context, the zone 0 hypothesis can no longer be accepted?
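The contrast between accepting the geologists' prediction and acting on it can be put in rough expected-cost terms. The figures and symbols below are invented purely for illustration (a probability of 0.001, a $300,000 house with a $2,000 premium, a plant whose failure would cost $500 billion against a $50 million reinforcement); nothing here comes from the text itself:

\[
p \cdot L_{\text{home}} = 0.001 \times \$300{,}000 = \$300 \;<\; \$2{,}000 = c_{\text{insurance}},
\]
\[
p \cdot L_{\text{plant}} = 0.001 \times \$500{,}000{,}000{,}000 = \$500{,}000{,}000 \;>\; \$50{,}000{,}000 = c_{\text{reinforcement}}.
\]

The probability p, which is all that the accepted geological prediction supplies, is the same in both inequalities; only the loss terms differ, and that alone flips the rational decision. Nothing about the geologists' degree of belief has to be recalibrated for the homeowner and the plant builder to act differently.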

What makes Kitcher's advice especially extravagant is his insistence (Kitcher 1985: 9) that the rationality of adopting a scientific hypothesis should depend not only on the political costs of accepting it, but also on the possible political costs of failing to accept it.[13] This means that in cases where the political costs of not accepting a certain empirically dubious hypothesis were considered to be sufficiently high, we would actually be goaded to make a politically motivated effort to accept that theory, against our better (epistemic) judgment. So apparently Kitcher would have nothing but words of praise for the behavior of scientists who first rejected hypothesis H because they thought it inadequately supported by evidence but who, later, after learning that the rejection of H (or indecision about H) could have harmful social effects, promptly lowered their critical standards and obligingly accepted H.

[13] Stranger still, he argues that acceptance of a theory should depend also on the costs and benefits of adopting it, if it is true. This seems to go against his previous claim that any theory should be accepted, given sufficient evidence. For, if the costs of adopting a theory, if it is true, become prohibitively high, why shouldn't this be a reason, on Kitcher's own logic, not to accept that theory even when we have the best possible evidence that it is true?

Here is another odd implication of Kitcher's view. Suppose that scientists had to choose between two rival hypotheses, H1 (bad consequences if accepted-but-false) and H2 (no bad consequences if accepted-but-false). Suppose also that the two hypotheses have exactly the same degree of empirical support (before the introduction of political considerations). Now, in this situation the scientists who took to heart the lesson about politically responsible science would gerrymander methodological standards in the Kitcherian spirit (by holding H1 to higher standards of evidence) and they would then declare that they are closer to accepting H2 than H1, although they would not be able to point to any theoretical or empirical advantage of H2 over H1!

An additional problem arises because the degree of confirmation of a hypothesis would become oddly context dependent. Take a dangerous hypothesis H1, which is initially not well supported by evidence. Imagine now that we notice that accepting another highly confirmed hypothesis H2, which belongs to a politically neutral research area, would make H1 itself strongly confirmed. It seems that Kitcher's imperative of political responsibility would require that we hold the neutral hypothesis H2 to higher standards of evidence, because under the circumstances a too easy acceptance of H2 would do social harm in an indirect way, by vindicating H1. So political considerations would urge us to assign different degrees of confirmation to the hypothesis in different contexts, although the hypothesis would in itself have no political content whatsoever.

Implementing Kitcher's proposal would inevitably lead to the high politicization of scientific discussions. Scientists are likely to differ widely in their judgments about the degree of political danger of different hypotheses. For example, if the political Pandora's box is opened, socialists would probably want to apply higher standards of acceptance to theories that they see as threatening to their ideology, whereas conservatives would disagree, saying that they envisage the possible truth of these theories with equanimity (or with glee!), and that they see no reason for panic or any kind of protective measures. With other theories, the roles may be reversed. In any case, the evaluation of evidence in theoretical conflicts would no longer be possible strictly on the empirical merits of rival theories. Sectarian politics would become an integral part of science. (Of course, we know that in fact political considerations are already influencing scientific discussions in some contexts, but at present scientists at least have a bad conscience if they realize that such external factors determine their views. Kitcher would help them feel pride, instead of guilt.)

Finally, if the politically inspired tightening of standards of criticism is going thus to increase in proportion to the degree of political hazard ascribed to scientific theories, then with respect to some hypotheses containing what some see as very dangerous knowledge, methodological requirements for acceptance would at some point become so tough that, given the notorious fallibility of human judgment and the essentially conjectural nature of all science, these theories would, as a matter of fact, be put effectively and forever beyond our ken.[14] Is this political sensitivity or dogmatic rejection, or both?

[14] Some scholars think that in his criticism of sociobiology Kitcher already raised the methodological bar so high that not only human sociobiology but much evolutionary biology would fail to meet his standards, and that the life sciences "simply could not proceed at all if they had to satisfy Kitcher's unrealistic demands" (Rosenberg 1987: 80). This is probably one of the main reasons why E. O. Wilson didn't think it worthwhile to respond to Kitcher's book. But Wilson must have also been put off by Kitcher's mixing of science and politics, which he (Wilson) has always regarded as opprobrious: "Opprobrium, in my opinion, is deserved by those who politicize scientific research, who argue the merits of analysis according to its social implications rather than its truth" (Wilson 1977).

All this shows that Kitcher's politically calibrated methodology of science inevitably leads to the intellectual corruption and degradation of science. Kitcher's proposal to introduce higher standards of evidence seems to be already in place in what Linda Gottfredson calls "one-party science": the disfavored line of work is subjected to intensive scrutiny and nearly impossible standards, while the favored line of work is held to lax standards in which flaws are overlooked (Gottfredson 1994: 57).

6.3.2 ...and it is self-defeating too

Let's forget about irrationality and ask whether Kitcher's proposal would be effective. To answer this question, first note that if politics-laden methodological standards are to be adopted, they have to be publicly advertised as an addition to the existing norms of science. So, if the new "politically concerned" science became widespread, everyone would know that scientists tend to downgrade the real plausibility of politically sensitive theories. It does not matter whether scientists would actually manage sincerely to believe that these theories are less plausible, or whether they would just lie about it (for the sake of the good effects of that noble lie). The important point is that, since the strategy would be a matter of public knowledge, for this very reason it couldn't work.

For example, if you learn that, according to the new rule of the game, a scientific theory is pronounced less acceptable in proportion to the perceived political danger of its acceptance, you will immediately realize that what is happening here is that scientists just started using a kind of coded language, and that you have to do some translating to get at the true meaning of their statements. Roughly, if scientists declared that the genetic explanation of the racial IQ difference is extremely implausible, you would have to take into account that they probably marked down the confirmation degree of that hypothesis because of its possible political dangers, and you would accordingly add a correction factor and interpret them as really saying that the hypothesis has some empirical support. In a similar vein, the phrase "has some empirical support" would be understood to mean "plausible", and "plausible" would be translated as "proved".[15]

So, Kitcher's proposal would achieve nothing. Strictly speaking, this is not quite true. It would actually have a perverse effect of making some politically dangerous theories look more plausible than they really are.

For instance, take one of these theories, H, that is extremely implausible (judged strictly by empirical evidence, i.e., before its political implications are even considered). Now if all social scientists stated (correctly) that H is extremely implausible, in the world regulated by Kitcher's norms they would simply be unable to convey the meaning of that statement to the public. To see why, imagine, for the sake of simplicity, that they use the seven-point scale in Table 6.1 to rank scientific theories according to their plausibility. Now look what will happen if the scientific consensus puts H (a politically dangerous hypothesis) in category 7 (extremely implausible). Since everyone would know that dangerous theories tend to be shifted

[15] Far from being merely a philosopher's suggestion, it sometimes really happens that words in science change their meaning with political context. For example, in the report Genetics and Human Behavior issued by the Nuffield Council on Bioethics, we read that black individuals score on IQ tests slightly lower than white individuals (Nuffield 2002: 69, italics supplied), and then on the same page we read that the rise in average IQ in Great Britain since World War II (due to the Flynn effect) was "particularly great". But what is the difference between "particularly great" and "slight" here? Surprisingly, in terms of the number of IQ points it turns out that there is no difference at all. If we compare the rise in IQ in Great Britain with the measured IQ difference between whites and sub-Saharan blacks we are talking about a difference of essentially the same magnitude (about two standard deviations). So it is one and the same quantitative difference that in a neutral context (Flynn effect) comes to be called "particularly great", and in the politically sensitive context (race) becomes de-emphasized and categorized as "slight". Is it likely that this game of semantic hide-and-seek will do any good?

Table 6.1

Rank  Degree of confirmation
1     Proved
2     Probably true
3     Plausible
4     As likely to be true as false
5     Promising
6     Implausible but worth exploring
7     Extremely implausible
