
James Gleick's History of Information

By GEOFFREY NUNBERG
The universe, the 18th-century mathematician and philosopher Jean Le Rond d'Alembert said, "would only be one fact and one great truth for whoever knew how to embrace it from a single point of view." James Gleick has such a perspective, and signals it in the first word of the title of his new book, "The Information," using the definite article we usually reserve for totalities like the universe, the ether and the Internet. Information, he argues, is more than just the contents of our overflowing libraries and Web servers. It is "the blood and the fuel, the vital principle" of the world. Human consciousness, society, life on earth, the cosmos: it's bits all the way down.
Books of The Times
THE INFORMATION
A History. A Theory. A Flood. By James Gleick.

Claude Shannon, the father of information theory.

Gleick makes his case in a sweeping survey that covers the five millenniums of humanity's engagement with information, from the invention of writing in Sumer to the elevation of information to a first principle in the sciences over the last half-century or so. It's a grand narrative if ever there was one, but its key moment can be pinpointed to 1948, when Claude Shannon, a young mathematician with a background in cryptography and telephony, published a paper called "A Mathematical Theory of Communication" in a Bell Labs technical journal.

For Shannon, communication was purely a matter of sending a message over a noisy channel so that someone else could recover it. Whether the message was meaningful, he said, was irrelevant to the engineering problem. Think of a game of "Wheel of Fortune," where each card that's turned over narrows the set of possible answers, except that here the answer could be anything: a common English phrase, a Polish surname, or just a set of license plate numbers. Whatever the message, the contribution made by each signal (what he called, somewhat provocatively, its "information") could be quantified in binary digits (i.e., 1s and 0s), a term that conveniently condensed to "bits."

Shannon's paper, published the same year as the invention of the transistor, instantaneously created the field of information theory, with broad applications in engineering and computer science. Beyond that, it transformed information from a term associated with requests to telephone operators to an intellectual buzzword, bandied about so loosely that Shannon was moved to write a gently cautionary note called "The Bandwagon." But unlike the equally voguish discipline of cybernetics proposed that same year by Norbert Wiener, which left little behind it but a useful prefix, information theory wound up reshaping fields from economics to philosophy, and heralded a dramatic rethinking of biology and physics.

In the 1950s, Francis Crick, the co-discoverer of the structure of DNA, was still putting "information" in quotation marks when describing how one protein copied a sequence of nucleic acids from another. But molecular biologists were soon speaking of information, not to mention codes, libraries, alphabets and transcription, without any sense of metaphor. In Gleick's words, "Genes themselves are made of bits." At the same time, physicists exploring what Einstein had called the "spooky" paradoxes of quantum mechanics began to see information as the substance from which everything else in the universe derives. As the physicist John Archibald Wheeler put it in a paper title, "It From Bit."

Gleick ranges over the scientific landscape in a looping itinerary that takes the reader from Maxwell's demon to Gödel's theorem, from black holes to selfish genes. Some of the concepts are challenging, but as in previous books like "Chaos" and "Genius," his biography of Richard Feynman, Gleick provides lucid expositions for readers who are up to following the science and suggestive analogies for those who are just reading for the plot. And there are anecdotes that every reader can enjoy: Shannon building a machine called Throbac I that did arithmetic with Roman numerals; the Victorian polymath Charles Babbage writing to Tennyson to take exception to the arithmetic in "Every minute dies a man / Every minute one is born." But unlike chaos, information also has a human history.
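Shannon's yardstick, described above, is easy to state: a message drawn from N equally likely possibilities carries log2(N) bits, and each yes/no question that halves the remaining possibilities yields exactly one bit. A minimal Python sketch (an illustration added here, not code from Shannon's paper or Gleick's book):

```python
import math

def bits_needed(num_possibilities: int) -> float:
    """Bits required to single out one message from a set of
    equally likely possibilities: log2(N)."""
    return math.log2(num_possibilities)

# A Wheel of Fortune-style guessing game: halving the candidate
# set with one question is worth exactly one bit.
print(bits_needed(2))    # a coin flip: 1.0 bit
print(bits_needed(26))   # one letter of the alphabet: about 4.7 bits
print(bits_needed(256))  # one byte's worth of values: 8.0 bits
```

The equal-likelihood case is the simplest instance of Shannon's measure; his full definition weights each possible message by its probability.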
In a series of chapters, Gleick recounts oft-told tales about the invention of writing systems and the compilation of the Oxford English Dictionary along with the stories of lesser-known structures of coding and communication. In the late 18th century, long before Samuel Morse, for example, the Chappe brothers of France invented the first telegraph in the form of a network of hundreds of towers topped by semaphore arms with which the government could relay messages from Paris to Bordeaux in less than a day, weather permitting. One French deputy described the Chappes' ingenious signaling system as one of the great inventions of history, along with the compass, printing and gunpowder. And once the Chappes' optical telegraph had been replaced by the more democratic and versatile electric version, frugal customers hit on the similarly ingenious expedient of using economical abbreviations for common messages, like "gmlet" for "give my love to": texting avant la lettre.

This is all engagingly told, though Gleick's focus on information systems occasionally leads him to exaggerate the effects technologies like printing and the telegraph could have all by themselves. For example, he repeats the largely discredited argument, made by the classicist Eric Havelock in the 1970s, that it was the introduction of the alphabet that led to the development of science, philosophy and "the true beginning of consciousness."

Such errors are mostly minor. But Gleick's tendency to neglect the social context casts a deeper shadow over the book's final chapters, where he turns from explicating information as a scientific concept to considering it as an everyday concern, switching roles from science writer to seer. For him, the information we worry is engulfing us is just another manifestation of the primal substance that underlies all of biological life and the physical universe; we are "creatures of the information," in his phrase, in more than just our genetic or chemical makeup.

In an epilogue called "The Return of Meaning," Gleick argues that to understand how information gives rise to belief and knowledge, we have to renounce Shannon's "ruthless sacrifice of meaning," which required jettisoning "the very quality that gives information its value." But Shannon wasn't sacrificing meaning so much as ignoring it, in the same way that a traffic engineer doesn't care what, if anything, the trucks on the highway are carrying. Once you start to think of information as something meaningful, you have to untether it from its mathematical definition, which leaves you with nothing to go on but the word itself. And in its ordinary usage, information is a hard word to get a handle on (even after a recent revision, the Oxford English Dictionary still makes a hash of its history). It's one of those words, like "objectivity" and "literacy," that enable us to slip from one meaning to the next without letting on, even to ourselves, that we've changed the subject.

That elusiveness is epitomized in the phrase "information age," which caught on in the 1970s, about the same time we started to refer to computers and the like as information technology. Computers clearly are that, if you think of information in terms of bits and bandwidth. But the phrases give us license to assume that the stuff sitting on our hard drives is the same as the stuff that we feel overwhelmed by, that everybody ought to have access to, and that wants to be free. Like most people who write about the information age, Gleick can't avoid this semantic slippage. When he describes the information explosion, he reckons the increase in bytes, citing the relentless procession of prefixes (kilo-, mega-, giga-, tera-, peta-, exa-, and now zetta-, with yotta- in the wings) that's mirrored in the proliferation of smartphones, tablets, game consoles and windowless server farms.
But there's no road back from bits to meaning. For one thing, the units don't correspond: the text of "War and Peace" takes up less disk space than a Madonna music video. Even more to the point, is information just whatever can be stored on silicon, paper or tape? It is if you're Cisco or Seagate, who couldn't care less whether the bytes they're making provision for are encoding World of Warcraft or home videos of dancing toddlers. (Americans consume more bytes of electronic games in a year than of all other media put together, including movies, TV, print and the Internet.) But those aren't the sorts of things we have in mind when we worry about the growing gap between information haves and have-nots or insist that the free exchange of information is essential to a healthy democracy.

Information, in the socially important sense (stuff that is storable, transferable and meaningful independent of context), is neither eternal nor ubiquitous. It was a creation of the modern media and the modern state (Walter Benjamin dated its appearance to the mid-19th century). And it accounts for just a small portion of the flood of bits in circulation. Even so, there's enough information coming at us from all sides to leave us feeling overwhelmed, just as people in earlier ages felt smothered by what Leibniz called "that horrible mass of books that keeps on growing." In response, 17th-century writers compiled indexes, bibliographies, compendiums and encyclopedias to winnow out the chaff. Contemplating the problem of turning information into useful knowledge, Gleick sees a similar role for blogs and aggregators, syntheses like Wikipedia, and the vast, collaborative filter of our connectivity. "Now, as at any moment of technological disruption," he writes, "the old ways of organizing knowledge no longer work."

But knowledge isn't simply information that has been vetted and made comprehensible. Medical information, for example, evokes the flood of hits that appear when you do a Google search for back pain or vitamin D. Medical knowledge, on the other hand, evokes the fabric of institutions and communities that are responsible for creating, curating and diffusing what is known. In fact, you could argue that the most important role of search engines is to locate the online outcroppings of the old ways of organizing knowledge that we still depend on, like the N.I.H., the S.E.C., the O.E.D., the BBC, the N.Y.P.L. and ESPN. Even Wikipedia's guidelines insist that articles be based on reliable, published sources, a category that excludes most blogs, not to mention Wikipedia itself.

Gleick wouldn't deny any of this, but his focus on information as a prime mover and universal substance leads him to depict its realm as a distinct place at a remove from the larger social world, rather than as an extension of it. As he puts it, in the vatic tone that this topic tends to elicit, "Human knowledge soaks into the network, into the cloud" (more of those totalizing definite articles). In an evocative final paragraph, he pictures humanity wandering the corridors of Borges's imaginary Library of Babel, which contains the texts of every possible book in every language, true and false, scanning the shelves in search of "lines of meaning among the leagues of cacophony and incoherence." If it comes to that, though, we'll have lots of help identifying the volumes that are worth reading, and not just from social networks and blogs but from libraries, publishers and other bulwarks of the informational old order. Despite some problems, a prodigious intellectual survey like "The Information" deserves to be on all their lists.
Geoffrey Nunberg teaches at the School of Information at the University of California, Berkeley.
