[ROUGH DRAFT]
Technology and Our Epistemic Situation: Two Problems of Ignorance
In this paper, I argue that there are two distinct problems of ignorance: a problem
of size and a problem of type. Both are more pressing today than ever before, given
the extraordinary expansion of collective human knowledge, and both pertain to
epistemic limitations intrinsic to our evolved cognitive systems. After delineating
these problems in detail, I examine one possible way of overcoming our “relative”
and “absolute” ignorance about the universe: enhancement technologies. I then
argue that, given our epistemic situation, resources currently being spent on normal
research would be far better spent on developing cognition-enhancing technologies
– technologies that promise to help solve the size and type problems previously
sketched.
1. Distinguishing Between Size- and Type-Ignorance
Knowledge is like a sphere; the greater its volume, the larger its contact with the
unknown. – Blaise Pascal
While the concept of knowledge has held center stage in the theater of
Western Philosophy for some twenty-five hundred years, ever since Plato
articulated his tripartite analysis of it, considerably less has been said about its
epistemological antagonist: ignorance. The issue of ignorance is, I believe, more
germane today than ever, not just because of well-known studies concerning
scientific illiteracy and uninformed voters (Mooney and Kirshenbaum 2009;
Shenkman 2008), but because of the extraordinary expansion of collective human
knowledge. But note a difference between these two phenomena: the former is
entirely contingent, since that sort of nescience about “elementary” facts of the
universe and all it envelops is clearly rectifiable, for example, by ameliorating the
educational system, instilling in students a passion for learning, and so on. In
contrast, the latter is a matter of necessity, since it is no longer possible, given
limits intrinsic to the human mind,i for even the most erudite of individuals to fully
comprehend even a single domain of scientific or humanistic inquiry. There is
simply too much to know.
In addition to (principled) reasons why a single individual can no longer
internalize more than a relatively tiny sliver of the knowledge spectrum, there is a
further issue complicating our epistemic predicament. Consider the case of
cognitive neuroscience: tremendous advancements have led to a fairly
sophisticated understanding of how the brain works. But while there are promising
solutions to what some philosophers, specifically David Chalmers, have labeled
the “easy” problems of consciousness, e.g., the problems pertaining to how our
nervous systems “discriminate, categorize, and react to environmental stimuli”
(Chalmers 1995), there remains the so-called “hard” problem: subjective
experience. The crucial point here is that, given the peculiarity of consciousness –
enhanced creativity and intelligence: just as individuals who speak two or more
languages can linguistically (re-)formulate problems in different ways, so too can
the individual familiar with multiple disciplines approach a given problem from
different angles or perspectives.
Unfortunately, though, interdisciplinary work is often restricted to domains
of inquiry that are more-or-less contiguous. When one ventures out beyond the
small cluster of disciplines surrounding one’s field of expertise, one quickly
encounters the obstacle of what we might call – borrowing from Kuhn (1996) –
disciplinary incommensurability.xi This incommensurability may be not just
methodological but observational and semantic in nature as well: people trained in
different intellectual traditions often see the world in radically different ways, and
the further one ventures from “home base,” the more radical the differences in the
argot of distant disciplines become.xii This makes communication between
individuals from different knowledge-areas extremely arduous, if not impossible.
On a personal note, I have often been frustrated – as a neuroscientist and
philosopher trained in the analytic tradition – by discussions with individuals of a
postmodernist bent. This is not in the least because of any prior bias against their
preferred tradition: I am genuinely interested in understanding the postmodern
approach. The problem is that my Po-Mo interlocutors and I each use terms, such
as ‘epistemology’ and ‘ontology’, in different ways; we each employ significantly
different methods in our research (Analytic Philosophy is mostly centered on
conceptual analysis, not deconstruction); and our most basic orientations
toward reality often fail to align well enough to permit an intelligible exchange of
ideas.xiii
Interdisciplinarity thus ends up being for the most part limited to
(sub-)disciplines not too distant on the globe of intellectual inquiry (with each of
C.P. Snow’s “two cultures” located at the poles). And as the many specialized
fields of academia continue to rapidly ramify, the very possibility of making
significant connections across large areas of knowledge becomes less feasible in
proportion: there is simply too much to know for any one individual, or even any
small team of individuals exhibiting some division of cognitive labor (Weisberg
and Muldoon forthcoming), to master the relevant domains of human knowledge.
There are several specific reasons for this.xiv Consider, on the one hand, the
issue of time: no individual lives long enough to master more than one
knowledge-domain (if even one), no matter how much she wants to. We are
perennially trapped in, to borrow a term from Christopher Cherniak, our “finitary
predicament” (Cherniak 1990). This predicament yields what I will call the
breadth-depth tradeoff: the more one knows about any single topic, the fewer
topics one knows about; and the more topics one knows about, the less one knows
about any single topic. Fortunately, this vexatious problem is more practical than
principled, and indeed the development
don’t know about; our Homo ancestors from the Pleistocene were ignorant of dark
matter, just like we are. The difference between us and them is, of course, that we
have positive knowledge about our negative ignorance – following the apophatic
theologians (such as Nicholas of Cusa), one might call this a kind of learned
ignorance. (Or, in keeping with the “dark matter” metaphor, call it a kind of
enlightened benightedness). The next step then is to convert this state of semi-
knowledge to one of genuine knowledge by constructing an explanatory theory that
adequately accounts for this mysterious and ubiquitous substance.
The point is that the tripartite distinction made above falls entirely within the
supercategory of knowables. By definition, this supercategory contains truths that
we human beings can come to know in principle. In contrast, there almost certainly
exists another (species-relative) supercategory of unknowables.xv Also by
definition, this supercategory contains truths that we cannot come to know in
principle. Using a more philosophically respectable (and non-Rumsfeldian)
phraseology, token puzzles falling within the former category may be called
“problems” and those falling within the latter “mysteries.” Consider, for example,
the case of the bat.xvi While the bat has been evolutionarily optimized to navigate
caves via its sonic sense of echolocation, no matter how hard it might try, it could
never learn to do basic arithmetic, nor could it ever form the concept of (e.g.) a
black hole. This is a modal claim: it concerns what is and what is not in principle
possible for the bat, given its particular cognitive apparatus.
As finite beings with an evolutionary history of our own, it stands to reason
that there (may) exist entire constellations of facts, phenomena, theories, or
whatever, that we Homo sapiens could never come to know no matter how hard we
might try, since we lack the right mental machinery to form the requisite concepts
(despite our flattering binomial self-description as “wise men”).xvii In the
transcendental naturalism, or “New Mysterianism,” of Colin McGinn, we are said
to be cognitively closed to such facts, phenomena and theories as the result of
limits intrinsic to our minds.xviii As alluded to in section one, McGinn argues that
the venerable mind-body problem (i.e., What exactly is the connection between
minds and bodies, given that each seems so profoundly different from the other?)
falls within the class of mysteries, and other cognitive scientists, such as Noam
Chomsky, have tentatively suggested that the “causation of behavior” might also
be permanently insoluble (Chomsky 1975, 157), as well as the origin of the
language organ (see Dennett 1996, 389). Whether these particular claims end up
being veridical, though, is completely independent of the more
general claim that cognitive closure is a real feature of our biological situation.
Thus, there appear to be areas of knowledge that are accessible to some
possible minds but inaccessible to the actual human mind.xix And from this it
follows that, even if we had unlimited time to conduct research in the sciences and
leads one to ask the grander question “How much can any one of us, or the
collective whole of humanity, ever know about the cosmos?” In both cases we have
identified principled constraints on our individual and collective capacities to know
arising from our evolved cognitive apparatuses – constraints that preclude us from
making certain progress along either of the two aforementioned axes.xxii
I would now like to look at possible solutions to this epistemological
conundrum. Although I am rather pessimistic about the technological future of
humanity, given the explosion of types (not to mention tokens) of existential risks
anticipated in the twenty-first century ([author citation]), when one focuses
parochially on the knowledge-problem at hand there seems to be a single salient
and promising solution: technology. Before exploring this possible fix, though,
consider first the only other solution available: good old-fashioned Darwinian
evolution. On the assumption that evolutionary change is gradualistic, it follows
that humans acquired (the ability to grasp) highly abstract concepts like electron
and social justice – concepts that are, apparently, only available to us – through a
naturalistic process of piecemeal cognitive development.xxiii Thus, it stands to
reason that further cognitive development of this sort may make available to our
phylogenetic descendants knowledge of our world that is, at present, permanently
beyond our ken. If “encephalization” were to continue, in other words, our
descendants might be able to explore at least some regions of the epistemic
landscape that we Homo sapiens cannot traverse (nor, in some cases, even see to be
there: we can’t recognize what we can’t even cognizexxiv).
The problem with this possibility is that there is – or at least there appears to
be – no significant pressure in our highly artificialized selective environment for
the development of a “more advanced” neocortex (although this was not always
the case, as I discuss below).xxv That is to say, the more intelligent individuals
among us are not any fitter than those who are less intelligent.xxvi Thus, it follows
that such evolution would have to occur through the analogous intentional process
that Darwin termed “artificial selection” (as observed in the case of many
domesticated animals). But, when applied to human beings, artificial selection is
nothing other than eugenics, and eugenics has long been rightly relegated to the
trash bin of ethical opprobrium.xxvii So, we can cross the Darwinian option of
cognitive enhancement off the list of acceptable possibilities.
The only remaining option, aside from humbly accepting our circumscribed
epistemic condition, is technological in nature. To begin, then, let us underline that
the idea and practice of cognitive enhancement is not as radical and revolutionary
as it may initially seem. As Bostrom and Sandberg write:
Most efforts to enhance cognition are of a rather mundane nature, and some
may have been practiced for thousands of years. The prime example is
education and training, where the goal is often not only to impart specific
– the notepad should count as literally part of the individual’s “coupled” cognitive
system.xxviii Furthermore, Clark and Chalmers argue that such couplings, or
biotechnological hybrids, are far more common than one might think. In fact,
extensions of cognition are traceable back to the very first humans, over two
million years ago. This leads Clark, in a separate publication, to assert that Homo faber
(“man the maker”) has always been to some degree technologically-constituted: we
are, as he puts it, “natural-born cyborgs” (Clark 2004).xxix
Having thus attempted to normalize both the technological and
enhancement aspects of technological enhancements,xxx let us return to the two
problems delineated above: size and type. As an issue of size, the former problem
is quantitative in nature: what we need to overcome it are cognitive capacities
different from what we already have only in degree. In contrast, as an issue of type,
the latter problem is qualitative in nature: what we need to overcome it are
cognitive capacities different in kind. This means, essentially, that a solution to the
latter problem would entail a redefinition of the venerable boundary between
mysteries and problems, as understood by the New Mysterians. Thus, while
science is busy converting “unknown unknowns” to “known unknowns” and then
to “known knowns,” the aim of enhancing the quality of the human mind would
entail gentrifying, so to speak, the supercategory of “unknowables” (with all of its
“unknowns”) such that truths once domiciled within it would subsequently find
their home in the alternate supercategory of “knowables.” With the right sort of
techno-change to the brain, such truths would be catapulted within our epistemic
reach.
Now, I need not, for the present purposes, provide any strong argument to
support the prognostication that, if pursued, future technology will effectuate a
change in the boundary separating mysteries and problems (the qualitative problem
of type), or for that matter will significantly increase our capacity for
memorization, or the speed of cerebration, and so on (the quantitative problem of
size). All one needs to accept is that the artifactual products of the inchoate
genetics, nanotechnology and robotics revolutionxxxi might very well bring about
changes in our cognitive systems as extraordinary and profound as those brought
about by Darwinian evolution – say, in the past 2.6 million years, since Homo
habilis first began manufacturing tools around Olduvai Gorge.xxxii This relatively
“weak” assertion about technological possibility is perfectly adequate as a first
premise in an argument for “enhancement” technologies.xxxiii More formally put,
this argument goes as follows:
Premise 1: Future technologies offer the serious possibility of unlimited, or
at least less limited, knowledge-growth along both the vertical and
horizontal axes;
Works Referenced:
[author citation]
Bostrom, Nick. 2008. Three Ways to Advance Science. Nature. Podcast, URL =
<http://www.nickbostrom.com/views/science.pdf>.
Bostrom, Nick and Rebecca Roache. Smart Policy: Cognitive Enhancement and
the Public Interest. Forthcoming in Enhancing Human Capabilities.
Clark, Andy. 2004. Natural-Born Cyborgs: Minds, Technologies, and the Future
of Human Intelligence. Oxford: Oxford University Press.
Clark, Andy and David Chalmers. 1998. The Extended Mind. Analysis. 58(1): 7-
19.
Craver, Carl. 2007. Explaining the Brain. Oxford: Oxford University Press.
Dawkins, Richard. 2006. The God Delusion. New York: Mariner Books.
Dennett, Daniel. 1996. Darwin’s Dangerous Idea: Evolution and the Meanings of
Life. New York: Simon & Schuster.
Ihde, Don. 1990. Technology and the Lifeworld: From Garden to Earth.
Bloomington: Indiana University Press.
Jacobs, Gregg. 2003. The Ancestral Mind: Reclaim the Power. London: Penguin.
Kurzweil, Ray. 2005. The Singularity Is Near: When Humans Transcend Biology.
New York: Viking.
McGinn, Colin. 2006. Can We Solve the Mind-Body Problem? In The Philosophy
of Mind: Classical Problems/Contemporary Issues. Brian Beakley and Peter
Ludlow (eds). 321-337. Cambridge: MIT Press.
Retherford, Robert and William Sewell. 1988. Intelligence and Family Size
Reconsidered. Social Biology. 35(1-2): 1-40.
Shenkman, Rick. 2008. Just How Stupid Are We?: Facing the Truth About the
American Voter. New York: Basic Books.
Weisberg, Michael and Ryan Muldoon. Epistemic Landscapes and the Division of
Cognitive Labor. Forthcoming in Philosophy of Science.
Wright, Ronald. 2004. A Short History of Progress. New York: Da Capo Press.
i. This pertains to the “breadth-depth tradeoff” that I discuss below.
ii. I borrow this distinction, made in a slightly different context, from McGinn 2006, 331.
iii. Thus, allowing one to learn more per increment of time.
iv. Note: I will bracket a number of extremely important issues concerning the ethicality of cognitive enhancement (Bostrom and Sandberg forthcoming; Allhoff et al. 2009), cognitive enhancement and personal identity (Schneider 2008), etc. What concerns me at present is only the possibility of solving the size and type problems of ignorance via cognition-enhancing methods. Further discussion would indeed examine the ethical and philosophical implications of cognitively enhancing the human mind.
v. Note that the term ‘vertical knowledge’ has been used in a number of different disciplinary contexts, such as contemporary digital humanities. The definition in this field, though, is almost exactly opposite the way I use the metaphor of verticality in this paper: someone who has “vertical knowledge” has tremendous knowledge about a single topic – he or she is an expert. My sense refers more to wide knowledge, to interdisciplinarity or, at the ideal extreme, the ability for one “to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term” (Sellars 1956, 37). That is verticality.
vi. This problem is typically approached in the sciences through abstraction and idealization. See Godfrey-Smith 2009 for discussion.
vii. This has led to a theory in economics and decision theory called “rational ignorance theory.” See, e.g., Caplan 2001.
viii. Ronald Wright quotes an unidentified person who “once defined specialists as ‘people who know more and more about less and less, until they know all about nothing’” (Wright 2004, 29). Another memorable witticism comes from Robert Theobald, who contends that “when information doubles, knowledge halves and wisdom quarters” (Theobald 1996). According to a calculation made by Kevin Kelly and the Google economist Hal Varian, in fact, “world-wide information has been increasing at the rate of 66% per year for many decades” (Kelly 2008). Yet another datum to add to the heap.
ix. As the critic of technology Langdon Winner writes: “If ignorance is measured by the amount of available knowledge that an individual or collective ‘knower’ does not comprehend, one must admit that ignorance, that is relative ignorance, is growing” (Winner 1977, 283).
x. There are, of course, the obvious political and social costs of ignorance – costs that are in no way trivial.
xi. Kuhn’s notion of incommensurability applies diachronically to single disciplines. In contrast, the sense used here applies synchronically to multiple disciplines. Of course, all that is needed to overcome this kind of incommensurability is greater familiarity with the methodological, observational and semantic peculiarities of those disciplines foreign to some individual – but therein lies the problem!
xii. Consider the case of ‘faith knowledge’, a term I came across in Daniel Migliore’s book Faith Seeking Understanding. Migliore specifies as a principle of Christology that “Knowledge of Jesus Christ is not simply ‘academic’ or historical knowledge; it is faith knowledge” (Migliore 2004, 167; emphasis in original). Similarly, the Vatican Council, III, iv, states that “the Catholic Church has always held that there is a twofold order of knowledge, and that these two orders are distinguished from one another not only in their principle but in their object; in one we know by natural reason, in the other by Divine faith; the object of the one is truth attainable by natural reason, the object of the other is mysteries hidden in God, but which we have to believe and which can only be known to us by Divine revelation.” From the perspective of analytic philosophy, though, the collocation of ‘faith’ and ‘knowledge’ is utterly oxymoronic – a nonsensical locution, since faith and knowledge are exact epistemological opposites.
xiii. One almost feels as if she were in the Quinean predicament of radical translation, with her interlocutor gesturing at rabbits, or undetached rabbit parts, or time-slices of rabbits, and so on, shouting “Gavagai!” But which of these disjuncts is the “real” referent seems, at times, inscrutable.
xiv. That is, there are several senses in which the human situation is fixed and finite.
xv. Truths within this supercategory are, of course, unknown to us humans because they are unknowable.
xvi. I allude here to Thomas Nagel’s (1974) famous paper on the phenomenology of bat echolocation – a “what it is like” that seems to permanently lie outside the realm of purely objective science.
xvii. As Dawkins insightfully writes: “[I want to pursue the point] that the way we see the world, and the reason why we find some things intuitively easy to grasp and others hard, is that our brains are themselves evolved organs: on-board computers, evolved to help us survive in a world – I shall use the name Middle World – where the objects that mattered to our survival were neither very large nor very small; a world where things either stood still or moved slowly compared with the speed of light; and where the very improbable could safely be treated as impossible. Our mental burka window is narrow because it didn’t need to be any wider in order to assist our ancestors to survive” (Dawkins 2006, 367-368).
xviii. In a slightly different terminology, Jerry Fodor (1983) labels this same idea “epistemic boundedness.” Thus, we are epistemically bounded from exploring certain regions of possible knowledge about the cosmos.
xix. Interestingly, McGinn distinguishes between relative and absolute cognitive closure. He writes: “A problem is absolutely cognitively closed if no possible mind could resolve it; a problem is relatively closed if minds of some sorts can in principle solve it while minds of other sorts cannot” (McGinn 2006, 329). For the purposes of this paper, I want to avoid getting entangled in the net of abstruse issues concerning the possibility of absolute cognitive closure.
xx. Note the semantic origin of ‘organism’: it comes from ‘organ’, which derives from the Greek etymon organon. In Greek, this word meant “tool, instrument, engine of war, …” (OED). Thus, from the etymological perspective, the metaphor “organisms are artifacts” is analytically true.
xxi. In fact, McGinn conjectures that the mind-body problem might be rather simple, even though completely opaque to us humans.
xxii. See [author citation] for a thorough critique of techno-progressionism, especially as it manifests itself in the contemporary transhumanist movement. Obviously, if there is a final theory, then any movement towards this end would indeed count as scientific progress in the strongest sense. But one should always be wary of the millennialist accretions that build up around notions of absolute progress. See also Ruse 1996 for a thorough discussion of progressionism.
xxiii. See also the “Baldwin effect.”
xxiv. Any allusion here to Donald Davidson’s “Swampman” is merely incidental.
xxv. See Dawkins’ 1992 article on evolutionary progress.
xxvi. In fact, there appears to be a negative correlation between measured intelligence and fertility rate. See, e.g., Retherford and Sewell 1988. The well-known Flynn effect appears to be the result of environment rather than genes.
xxvii. Today, a “softer” kind of eugenics is finding expression in the growing field of reprogenetics – a portmanteau of ‘reproductive’ and ‘genetics’. See Kevles 1992 for an informative overview of eugenics.
xxviii. Clark and Chalmers also argue that their thesis entails that the self is itself extendible beyond traditional organismic boundaries – that is, if one takes the self to be constituted not just by occurrent beliefs (those beliefs one has right now) but by dispositional beliefs too.
xxix. Don Ihde (1990) has much to say about the phenomenological relations between human users and the technologies used; there are connections to be made between Ihde and Clark/Chalmers, no doubt, though no one has yet made them.
xxx. Of course, to say that technology has, as a matter of fact, played a part in our evolution is not to say that it ought to have played such a role, or that it ought to play such a role in the future. The transhumanist project must be justified.
xxxi. See Kurzweil 2005 and [author citation] for more on the “GNR” revolution.
xxxii. As Dennett writes in a hostile review of McGinn: “His thesis about the likely limitations of our brains would be uncontroversially true if it weren't for our clever trick of expanding the powers of our naked brains by off-loading much of the work to artifacts we have designed and built just for this purpose. The brains we were born with are no doubt quite incapable of grasping long division--let alone calculus or photosynthesis--without the aid of pencil and paper or chalk and blackboard. We have to work to acquire some of our concepts, but we don't have to do all the work in our heads, thank goodness. One might think, then, that in order to defend a thesis about the outer limits of our powers, one should at least take a peek at the concepts made available to those who have armed themselves with the new technology” (Dennett 1991). Note that Dennett’s target is not cognitive closure per se, but the claim that the mind-body problem is off-limits for us humans. The present thesis is actually defended in the spirit of Dennett’s critique.
xxxiii. That is, we can’t know for sure until we’ve tried. And the argument here is that resources would be far better spent trying than continuing to work on the profusion of micro-problems that currently occupy our minds.
xxxiv. These roughly correspond, of course, to the three premises above.