In today's excerpt--comments on punctuation from David Crystal,
author, co-author, or editor of over 100 books, including the
Cambridge Encyclopedia of Language and the Cambridge Encyclopedia of
the English Language:
"Early English manuscripts had no punctuation. They often didn't even
have spaces between words. The earliest conventions were introduced as
a guide to phrasing when reading aloud became an important
activity. ... There was a great deal of experiment. Over thirty marks
can be found in medieval manuscripts--various combinations of dots,
curls, and dashes. Most of them disappeared after the arrival of
printing. Some of them look like modern marks, but their function was
not the same: a point, for example, represented a pause, rather than a
sentence ending, and the height of a point could vary to express
degrees of pause.
"Printers had to make decisions about punctuation and capitalization
as well as about spelling. The earliest European printers generally
followed the marks they found in the manuscripts, the actual shapes
depending on the typeface used. Most recognized three kinds of pause,
represented by a point, a virgule (/), and a mark of interrogation.
[William] Caxton chiefly used a virgule and a point (.), occasionally
a colon (:) and a paragraph mark. Word-breaks at the end of a line
were shown by a double-virgule (//). The comma began to replace the
virgule in the 1520s. ... Towards the end of the fifteenth century,
semicircular parentheses, the question mark, and the semicolon, as
well as the comma, were introduced in Europe, but it took some time
for them all to appear in England. ...
"There was a great deal of inconsistency of usage, especially when
several people worked on the same book. ... Even in modern editions a
comparison of two editions (e.g. of Shakespeare's Sonnets) will bring
to light a remarkable range of [differences]. ... Uncertainty always
surrounds a new punctuation mark. In the sixteenth century there was
great confusion among compositors over the use of the apostrophe. At
first they only used it as a marker of an omitted letter; its use as a
marker of possession came much later, in the eighteenth century. ...
It took a long time for the use of these marks to achieve some sort of
stability. In fact, of course, they never did totally stabilize. ...
Publishers compile [guidelines] to ensure consistency. No two
publishers have the same list. I know, because I have published with
many firms. ...
"The history of punctuation shows that the complexity does not
disappear. Rather, it changes as time goes by. And it is continuing to
change. The biggest punctuation changes since the Renaissance are
about to hit us, because of the Internet."
David Crystal, The Fight for English, Oxford, Copyright 2006 by David
Crystal, pp. 139-142.
-------------------------------------------------------------------------------------------------

In today's excerpt--Mao Tse-tung, absolute ruler over one-quarter of
the world's population and ultimately responsible for over 70 million
deaths in peacetime, became well known for his many wives, his
mistresses, and his indifference to his children:
"Mao acquired a wife--his third--almost as soon as he settled into
outlaw country. A pretty young woman with large eyes, high cheekbones,
an almond-shaped face and a willowy figure, Gui-yuan was just
turning eighteen when she met Mao. ... Mao at once began to court her,
and by the beginning of 1928 they were 'married'--with no binding
ceremony but a sumptuous banquet prepared by Mrs. Yuan. ... Unlike
[his previous wife], who was madly in love with him, Gui-yuan married
Mao with reluctance. A beautiful woman in a crowd of men, she had many
suitors and considered Mao, at thirty-four, 'too old' and 'not worthy'
of her. ...
"By the end of a year of marriage, Gui-yuan had resolved to leave Mao.
She confided to a friend that she was unlucky to have married him.
When Mao decided to leave outlaw land, in January 1929, she tried
desperately to stay behind. ... However, Mao ordered her to be taken
along 'at any cost.' She cried all the way, repeatedly falling behind,
only to be fetched by Mao's guards. ...
"[In 1937] for Gui-yuan, Mao's years of flagrant womanizing was the
last straw. Over their marriage of nearly ten years, she had had to
live with her husband's heartlessness. ... She was bitter that
although he was indifferent to children and had not cared when
four of theirs had died or been abandoned, he repeatedly made her
pregnant. ... Gui-yuan left for Russia [for hospital care] in early
October 1937. Their new one-year-old daughter remained in Yenan. ...
It was in this freezing world that she gave birth to a boy, to whom
she gave the Russian name of Lyova. He died of pneumonia after only
six months, and Gui-yuan sank into inconsolable grief, staring at his
grave, murmuring his name and weeping.
"Their was no warmth from her husband. When the baby was born, she had
written to Mao to say that the boy looked just like him. Mao did not
reply. No word either on his son's death. Then, in summer 1939, nearly
two years after they had parted, Gui-yuan learned by chance that Mao
had remarried. ... Because he had remarried, Mao did not want Gui-yuan
back in China. ... As a result, the infant daughter she had left in
Yenan spent her first few years as a virtual orphan. ... When she was
four, the daughter was taken to Russia to join her mother. Gui-yuan
hugged her daughter long and hard through streams of tears, ...[but]
she later had a breakdown, and the brunt of her rage was borne by her
daughter; others often heard the daughter screaming as her mother beat
her. Gui-yuan was put in a mental institution, howling as she was torn
away from her room and bundled in a car. Her terrified daughter, now
seven, ran away and hid in the woods, and grew up an introverted and
silent child."
Jung Chang and Jon Halliday, Mao, Vintage Books, Copyright 2005 by
Globalflair Ltd., pp. 60-61, 203-204.
-------------------------------------------------------------------------------------------------

In today's excerpt--black and green tea:
"The earliest unambiguous reference to tea is from the first century
BCE, some twenty-six centuries after Shen Nung's supposed discovery of
tea. Having started out as an obscure medicinal and religious
beverage, tea first seems to have become a domestic drink in China
around this time. ...
"Tea is first mentioned [as an import from China] in European reports
from the region in the 1550s. ... The first tea was green tea, the
kind that had always been consumed by the Chinese. Black tea, which is
made by allowing the newly picked green leaves to oxidize by leaving
them overnight, only appeared during the Ming dynasty; its origins are
a mystery. It came to be regarded by the Chinese as suitable only for
consumption by foreigners and eventually dominated exports to Europe.
Clueless as to the origins of tea, Europeans wrongly assumed green and
black tea were two entirely different botanical species. ...
"Tea got its start [in England] when it became fashionable at the
English court following the marriage in 1662 of Charles II to
Catherine of Braganza, daughter of King John IV of Portugal. Her
enormous dowry included the Portuguese trading posts of Tangier and
Bombay, ... a fortune in gold, and a chest of tea. Catherine was a
devoted tea drinker and brought the custom with her. ...
"It is not too much exaggeration to say that almost nobody in Britain
drank tea at the beginning of the eighteenth century, and nearly
everybody did at the end of it. Official imports grew from six tons in
1699 to eleven thousand tons a century later, and the price of a pound
of tea was one-twentieth of the price at the beginning. ...
[Consumption was actually greater than imports would indicate because
of] the widespread practice of adulteration, the stretching of tea by
mixing it with ash and willow leaves, sawdust, flowers, and more
dubious substances--even sheep's dung, according to one account--often
colored and disguised using chemical dyes. Tea was adulterated in one
way or another at almost every stage along the chain from leaf to
cup. ... Black tea became more popular, partly because it was more
durable than green tea on long voyages, but also as a side effect of
this adulteration. Many of the chemicals used to make fake green tea
were poisonous, whereas black tea was safer, even when adulterated. As
black tea started to displace the smoother, less bitter green tea, the
addition of sugar and milk helped to make it more palatable."
Tom Standage, A History of the World in Six Glasses, Walker, Copyright
2005 by Tom Standage, pp. 178, 185-189.
-------------------------------------------------------------------------------------------------

In today's excerpt--the Koran, or Qur'an, believed by Muslims to be
the revelation of God given to Muhammad (570-632 C.E.) by the angel
Gabriel over a period of 23 years, and standardized in its written
form under the third caliph in about 650 C.E. Here, Princeton
Professor Philip Hitti describes the Koran in his classic 1943 history
of the Arabs:
"Although there are approximately twice as many Christians as Moslems
in the world, it can safely be said that the Koran is the most widely
read book ever written. For besides its use in worship it is the
textbook from which practically every young Moslem learns to read
Arabic. ... The first translation into English appeared in 1649, made
from the French, by Alexander Ross, Vicar of Carisbrooke: 'The Alcoran
of Mohamet ... newly Englished, for the satisfaction of all that
desire to look into the Turkish vanities.' ...
"Compared to the Bible, the Koran offers but few textual
uncertainties. The first, final and only canonized version of the
Koran was collated nineteen years after the death of Muhammad, when it
was seen that the memorizers of the Koran were becoming extinct
through the battles that were decimating the ranks of the
believers. ... Its 6,239 verses, its 77,934 words, even its 323,621
letters have since been painstakingly counted. ... The Book is not
only the heart of a religion, the guide to the Kingdom of Heaven, but
a compendium of science and a political document, embodying the code
of law for a kingdom on earth. ...
"The parallels between the Old Testament and the Koran are many and
striking. Almost all the historical narratives of the Koran have their
biblical counterparts. Among the Old Testament characters, Adam, Noah,
Abraham (mentioned about seventy times), Ishmael, Lot, Joseph, Moses
(whose name occurs in thirty-four chapters), Saul, David, Solomon,
Elijah, Job and Jonah figure prominently. The story of the Creation
and Fall of Adam is cited five times, the flood eight and Sodom eight.
Of the New Testament characters Zacharias, John the Baptist, Jesus and
Mary are the only ones emphasized. ... Certain miraculous acts
attributed to Jesus the child in the Koran, such as speaking out in
the cradle and creating birds out of clay, recall similar acts
recorded in the Apocryphal Gospels. ...
"In the [surahs, or chapters of the Koran], the oneness of Allah, His
attributes, the ethical duties of man and the coming retribution are
the favorite themes. ... Critics consider the statutes relating to
divorce the most objectionable, and those about the treatment of
slaves, orphans, and strangers the most humane portions of Islamic
legislation. The freeing of slaves is encouraged as something most
pleasing to God and an expiation for many a sin. ...
"Its length is four-fifths that of the New Testament in Arabic. This
book, a strong, living voice, is meant for oral recitation and should
be heard in the original to be appreciated. No small measure of its
force lies in its rhyme and rhetoric and in its cadence and sweep,
which cannot be reproduced when the book is translated."
Philip K. Hitti, A Short History of the Arabs, Gateway, Copyright 1943
by Princeton University Press, pp. 42-46.
-------------------------------------------------------------------------------------------------

In today's excerpt--explanations of the causes of the industrial
revolution. Among the most important events in human history, the
industrial revolution, which began in Britain, untethered mankind from
dependence on animals, wind and water as its sources of energy. In
doing so, it created an unmatched explosion in wealth and population.
Because of its importance, historians have attempted to explain its
causes--and have marched out reasons ranging from the Protestant work
ethic to the scientific revolution. Here, Robert Marks argues that it
was instead the scarcity of land and the proximity and availability of
coal in England:
"Population growth and agricultural development put pressure on the
land resources of England. Indeed, by 1600 much of southern England
had already been deforested, largely to meet the needs of the growing
city of London for fuel for heating and cooking. Fortunately for the
British, veins of coal were close enough to the surface of the ground
and close enough to London to create both a demand for coal and the
beginnings of the coal industry. By 1800, Britain was producing ten
million tons of coal, or 90 percent of the world's output, virtually
all destined for the homes and hearths of London. As the surface
deposits were depleted, mine shafts were sunk, and the deeper they
went in search of coal, the more the mines encountered groundwater
seeping into and flooding the mine shafts. Mine operators had a
problem, and they began devising ways to get the water out of the
mines.
"Ultimately what they found useful was a device that used steam to
push a piston. Early versions of this machine, developed first by
Thomas Newcomen in 1712 and then vastly improved by James Watt in the
1760s, were so inefficient that the cost of fuel would have rendered
them useless, except for one thing: at the mine head, coal was in
effect free. Between 1712 and 1800, there were 2,500 of the
contraptions built, almost all of which were used at coal mines. ...
"Eurocentric explanations of the Industrial Revolution typically
invoke the 'scientific revolution,' the immensely interesting and
ultimately very important development beginning in the sixteenth
century, whereby some Europeans began to think of nature as a separate
entity that could be understood and modeled mathematically. Although
it is absolutely true that science has become an integral part of the
world, and while it has come to play a leading role, especially from
the late 1800s on, in developing new industries, there is little
evidence to tie European science to the beginnings of the Industrial
Revolution or to the technologies that fueled it. ...
"In fact, the principles of technologies used in the Industrial
Revolution were well known in China; what explains their development
in England and not China, as suggested above, were the particular
circumstances that made the first, extremely inefficient steam engines
effectively free. China did not have that good fortune."
Robert B. Marks, The Origins of the Modern World, Rowman & Littlefield,
Copyright 2007 by Rowman & Littlefield, pp. 108-112.
-------------------------------------------------------------------------------------------------

In today's excerpt--in the progress of the scientific revolution (1543
to the present), all individuals were replaceable, and it was instead
the incremental progress of hundreds and then thousands of individuals
that led to each breakthrough achievement:
"It is natural to describe key events in terms of the work of
individuals who made a mark in science-- Copernicus, Vesalius, Darwin,
Wallace and the rest. But this does not mean that science has
progressed as a result of a string of irreplaceable geniuses possessed
of a special insight into how the world works. Geniuses maybe (though
not always); but irreplaceable certainly not. Scientific progress
builds step by step, and as the example of Darwin and Wallace [who
independently established the principle of evolution at close to the
same time] shows, when the time is ripe, two or more individuals may
make the next step independently of one another. It is the luck of the
draw, or historical accident, whose name gets remembered as the
discoverer of a new phenomenon. What is much more important than human
genius is the development of technology, and it is no surprise that
the start of the scientific revolution 'coincides' with the
development of the telescope and the microscope.
"I can only think of one partial exception to this situation, and even
there I would qualify the exception more than most historians of
science do. Isaac Newton was clearly something of a special case, both
because of the breadth of his scientific achievements and in
particular because of the clear way in which he laid down the ground
rules on which science ought to operate. Even Newton, though, relied
on his immediate predecessors, in particular Galileo Galilei and Rene
Descartes, and in that sense his contributions followed naturally from
what went before. If Newton had never lived, scientific progress would
have been held back by a few decades, but only by a few decades.
Edmond Halley or Robert Hooke might well have come up with the famous
inverse square law of gravity; Gottfried Leibnitz actually did invent
calculus independently of Newton (and made a better job of it); and
Christiaan Huygens's superior wave theory of light was held back by
Newton's espousal of the rival particle theory."
John Gribbin, The Scientists, Random House, Copyright 2002 by John and
Mary Gribbin, pp. xix-xx.
-------------------------------------------------------------------------------------------------

In today's encore excerpt--the Golden Age of Piracy in the Caribbean,
1715 to 1725, which was led by a clique of twenty to thirty pirate
commodores and a few thousand crewmen:
"Engaging as their legends are--particularly as enhanced by Robert
Louis Stevenson and Walt Disney--the true story of the pirates of the
Caribbean is even more captivating: a long lost tale of tyranny and
resistance, a maritime revolt that shook the very foundations of the
newly formed British Empire, bringing transatlantic commerce to a
standstill and fueling the democratic sentiments that would later
drive the American revolution. At its center was a pirate republic, a
zone of freedom in the midst of an authoritarian age. ...
"They ran their ships democratically, electing and deposing their
captains by popular vote, sharing plunder equally, and making
important decisions in an open council--all in sharp contrast to the
dictatorial regimes in place aboard other ships. At a time when
ordinary sailors received no social protections of any kind, the
Bahamian pirates provided disability benefits for their crews. ...
"They were sailors, indentured servants, and runaway slaves rebelling
against their oppressors: captains, ship owners, and the autocrats of
the great slave plantations of America and the West Indies. ... At the
height of the Golden Age, it was not unusual for escaped slaves to
account for a quarter or more of a pirate vessel's crew, and several
mulattos rose to become full-fledged pirate captains. ... The
authorities made pirates out to be cruel and dangerous monsters,
rapists and murderers who killed men on a whim and tortured children
for pleasure, and indeed some were. Many of these tales were
intentionally exaggerated, however, to sway a skeptical public. ... In
the voluminous descriptions of [Samuel 'Black Sam'] Bellamy's and
Blackbeard's [Edward Thatch's] attacks on shipping--nearly 300
vessels in all--there is not one recorded instance of them killing a
captive. More often than not, their victims would later report having
been treated fairly by these pirates, who typically returned ships and
cargo that did not serve their purposes. ... At the height of their
careers, each commanded a small fleet of pirate vessels, a company
consisting of hundreds of men, and ... a flagship capable of
challenging any man-of-war in the Americas."
Colin Woodard, The Republic of Pirates, Harcourt, 2007, pp. 1-8.
-------------------------------------------------------------------------------------------------

In today's excerpt--as discussed by political advisor Frank Luntz, the
sequential arrangement of information often creates the very meaning
of that information:
"[In film, when] two unrelated images are presented, one after the
other, the audience infers a causal or substantive link between them.
A shot of a masked killer raising a butcher knife, followed by a shot
of a woman opening her mouth, tells us that the woman is scared. But
if that same image of a woman opening her mouth is preceded by a shot
of a clock showing that it's 3 a.m., the woman may seem not to be
screaming, but yawning. The mind takes the information it receives and
synthesizes it to create a third idea, a new whole. ...
"The essential importance of the order in which information is
presented first hit home for me early in my career when I was working
for Ross Perot during the 1992 presidential campaign. I had three
videos to test: a) a Perot biography, b) testimonials of various
people praising Perot, and c) Perot himself delivering a speech.
Without giving it much thought, I'd been showing the videos to various
focus groups of independent voters in that order--until, at the
beginning of one session, I realized to my horror that I'd failed to
rewind the first two videotapes. So I was forced to begin the focus
group with the tape of Perot himself talking.
"The results were stunning.
"In every previous focus group, the participants had fallen in love
with Perot by the time they'd seen all three tapes in their particular
order. No matter what negative information I threw at them, they
could not be moved off their support. But now, when people were seeing
the tapes in the opposite order, they were immediately skeptical of
Perot's capabilities and claims, and abandoned him at the first
negative information they heard. ... I repeated this experiment
several times, reversing the order, and watched as the same phenomenon
took place. Demographically identical focus groups in the same cities
had radically different reactions--all based on whether or not they
saw Perot's biographical video and the third-party testimonials (and
were therefore predisposed and conditioned to like him) before or
after the candidate spoke for himself.
"The language lesson: A+B+C does not necessarily equal C+B+A. The
order of presentation determines the reaction."
Dr. Frank Luntz, Words that Work, Hyperion, Copyright 2007 by Dr.
Frank Luntz, pp. 40-41.
-------------------------------------------------------------------------------------------------

In today's encore excerpt--many of the reasons that English spelling
contains silent letters and other complexities date from the 15th
century, around the time of William Caxton's 1476 introduction of the
printing press in England:
"In spelling, the [English] language was assimilating the consequences
of having a civil service of French scribes, who paid little attention
to the traditions of English spelling that had developed in Anglo-Saxon times. Not only did French qu arrive, replacing Old English cw
(as in queen), but ch replaced c (in words such as church--Old English
cirice), sh and sch replaced sc (as in ship--Old English scip), and
much more. Vowels were written in a great number of ways. Much of the
irregularity of modern English spelling derives from the forcing
together of Old English and French systems of spelling in the Middle
Ages. People struggled to find the best way of writing English
throughout the period. ... Even Caxton didn't help, at times. Some of
his typesetters were Dutch, and they introduced some of their own
spelling conventions into their work. That is where the gh in such
words as ghost comes from.
"Any desire to standardize would also have been hindered by the ...
Great English Vowel Shift, [which] took place in the early 1400s.
Before the shift, a word like loud would have been pronounced 'lood';
name as 'nahm'; leaf as 'layf'; mice as 'mees'. ...
"The renewed interest in classical languages and cultures, which
formed part of the ethos of the Renaissance, had introduced a new
perspective into spelling: etymology. Etymology is the study of the
history of words, and there was a widespread view that words should
show their history in the way they were spelled. These weren't
classicists showing off. There was a genuine belief that it would help
people if they could 'see' the original Latin in a Latin-derived
English word. So someone added a b to the word typically spelled det,
dett, or dette in Middle English, because the source in Latin was
debitum, and it became debt, and caught on. Similarly, an o was added
to peple, because it came from populum: we find both poeple and
people, before the latter became the norm. An s was added to ile and
iland, because of Latin insula, so we now have island. There are many
more such cases. Some people nowadays find it hard to understand why
there are so many 'silent letters' of this kind in English. It is
because other people thought they were helping."
David Crystal, The Fight for English: How language pundits ate, shot,
and left, Oxford, 2006, pp. 26-9.
-------------------------------------------------------------------------------------------------

In today's encore excerpt, Kevin Phillips, long-time Republican pundit
and author of the ground-breaking 1969 book The Emerging Republican
Majority, on his recently articulated thesis on the association of war
and religion:
"Although the Europe of 1900-1914 represented the world's most
advanced civilization, talk of Armageddon and crusadership flourished.
By 1919 military recruiting posters showed St. George, St. Michael,
angels and even Christ in the background. ... The most extreme
blessing of the cannons came from the bishop of London, A.F.
Winnington-Ingram, who called the war 'a great crusade--we cannot deny
it--to kill Germans.' He advised The Guardian that 'you ask for my
advice in a sentence as to what the church is to do--I answer MOBILIZE
THE NATION FOR HOLY WAR.' ...
"Few historians have paid much attention to the loss of faith, but one
explanation may be safely ventured. Organized religion did not profit
from the great disillusionment when the various chosen peoples turned
out not to be. For Britain, the lesson followed a century in which
British Christianity had moved in many of the directions that we have
later seen in the United States--evangelical religion, global
missionary intensity, end-times anticipation, and sense of biblical
prophecy beginning to come together in the Middle East.
"But when the Armageddon of 1914-1918 brought twenty million deaths
instead of Christ's return, the embarrassment was not limited to
flag-bedecked Anglican churches and nonconformist chapels that had
joined in the parade. Religion in general seemed to have failed, and
British church attendance shrank--and then shrank again."
Kevin Phillips, American Theocracy, Viking, 2006, pp. 250-1, 382-3.
-------------------------------------------------------------------------------------------------

In today's excerpt--from Robert Wright's groundbreaking and
controversial book, The Moral Animal, this background on monogamy
versus polygyny (multiple wives), which he discusses as a precursor to
his discussion of the logic of monogamy contrasted against the
historical predominance of polygyny:
"A huge majority [of human societies]--980 of the 1,154 past or
present societies for which anthropologists have data--have permitted
a man to have more than one wife. And that number includes most of the
world's hunter-gatherer societies, societies that are the closest
thing we have to a living example of the context of human
evolution. ...
"There is a sense in which polygynous marriage has not been the
historical norm. For 43 percent of the 980 polygynous cultures,
polygyny is classified as 'occasional.' And even where it is 'common,'
multiple wives are generally reserved for a relatively few men who can
afford them or qualify for them via formal rank. For eons and eons,
most marriages have been monogamous, even though most societies
haven't been. Still, the anthropological record suggests that polygyny
is natural in the sense that men given the opportunity to have more
than one wife are strongly inclined to seize it. ...
"[For] societies that have hovered right around the subsistence
level, ... where little is stowed away for a rainy day, a man who
stretches his resources between two families may end up with few or no
surviving children. And even if he were willing to gamble on a second
family, he'd have trouble attracting a second wife. ... The general
principle is that economic equality among men--especially, but not
only, if near subsistence level--tends to short-circuit polygyny. This
tendency by itself dispels a good part of the monogamy mystery, for
more than half of the known monogamous societies have been classified
as 'nonstratified' by anthropologists. What really demand explanation
are the six dozen societies in the history of the world, including the
modern industrial nations, that have been monogamous yet economically
stratified. These are true freaks of nature. ...
"Laura Betzig has shown that in pre-industrial societies, extreme
polygyny often goes hand in hand with extreme political hierarchy, and
reaches its zenith under the most despotic regimes. ... In Inca
society, the four political offices from petty chief to chief were
allotted ceilings of seven, eight, fifteen, and thirty women,
respectively. It stands to reason that as political power became more
widely disbursed, so did wives. And the ultimate widths are one-man-one-vote and one-man-one-wife. Both characterize most of today's
industrial nations."
Robert Wright, The Moral Animal, Vintage, Copyright 1994 by Robert
Wright, pp. 90-94.
--------------------------------------------------------------------------------------------------

In today's excerpt--hangovers:
"A hangover peaks when alcohol that has been poured into the body is
finally eliminated from it--that is, when the blood-alcohol level
returns to zero. The toxin is now gone, but the damage it has done is
not. By fairly common consent, a hangover will involve some
combination of headache, upset stomach, thirst, food aversion, nausea,
diarrhea, tremulousness, fatigue, and a general feeling of
wretchedness. Scientists haven't yet found all the reasons for this
network of woes, but they have proposed various causes.
"One is withdrawal, which would bring on the tremors and also
sweating. A second factor may be dehydration. Alcohol interferes with
the secretion of the hormone that inhibits urination. Hence the heavy
traffic to the rest rooms at bars and parties. The resulting
dehydration seems to trigger the thirst and lethargy. While that is
going on, the alcohol may also be inducing hypoglycemia (low blood
sugar), which converts into light-headedness and muscle weakness, the
feeling that one's bones have turned to jello. Meanwhile, the body, to
break down the alcohol, is releasing chemicals that may be more toxic
than alcohol itself; these would result in nausea and other symptoms.
Finally, the alcohol has produced inflammation, which in turn causes
the white blood cells to flood the bloodstream with molecules called
cytokines.
"Apparently, cytokines are the source of the aches and pains and
lethargy that, when our bodies are attacked by a flu virus--and
likewise, perhaps, by alcohol--encourage us to stay in bed rather than
go to work, thereby freeing up the body's energy for use by the white
cells in combatting the invader. In a series of experiments, mice that
were given a cytokine inducer underwent dramatic changes. Adult males
wouldn't socialize with young males new to their cage. Mothers
displayed 'impaired nest-building.' ...
"But hangover symptoms are not just physical; they are cognitive as
well. People with hangovers show delayed reaction times and
difficulties with attention, concentration, and visual-spatial
perception. A group of airplane pilots given simulated flight tests
after a night's drinking put in substandard performances. Similarly,
automobile drivers, the morning after, get low marks on simulated road
tests. Needless to say, this is a hazard, and not just for those at
the wheel. There are laws against drunk driving, but not against
driving with a hangover. ...
"Some words for hangover, like ours, refer prosaically to the cause:
the Egyptians say they are 'still drunk,' the Japanese 'two days
drunk,' the Chinese 'drunk overnight.' The Swedes get 'smacked from
behind.' But it is in languages that describe the effects rather than
the cause that we begin to see real poetic power. Salvadorans wake up
'made of rubber,' the French with a 'wooden mouth' or a 'hair ache.'

The Germans and the Dutch say they have a 'tomcat,' presumably
wailing. The Poles, reportedly, experience a 'howling of kittens.' My
favorites are the Danes, who get 'carpenters in the forehead.' "
Joan Acocella, "A Few Too Many," The New Yorker, May 26, 2008, pp.
32-33.
-------------------------------------------------------------------------------------------------

In today's excerpt--naming America. After four increasingly
unsuccessful voyages to the New World, Columbus died in 1506 in ruin
and disgrace. "I am ruined," he wrote in one of his last surviving
letters, "alone, desolate, infirm, daily expecting death. ... Weep for
me, whoever has charity, truth, and justice.":
"In a final insult [to Columbus], the most enduring honor of all went
to a fellow Italian who had befriended Columbus in his last years. 'He
is a very honorable man and always desirous of pleasing me,' wrote
Columbus, ever a poor judge of character, 'and is determined to do
everything possible for me.' The man's name was Amerigo Vespucci.
"A well-connected Florentine merchant and a scion of the Medicis,
Vespucci moved to Seville and outfitted fleets crossing the Atlantic.
He sailed to the Indies several times between 1499 and 1502, under
both Spanish and Portuguese auspices, and claimed to be a great
navigator. But his true genius was for hype and self-promotion.
" 'I hope to be famous for many an age,' he wrote, in one of the
embellished accounts he gave of his voyages. Vespucci invented some
episodes and lifted others from Columbus's writing. Unlike the
Admiral, though, he showed great flair for lubricious tales designed
to titillate his European audience.
"Native women, he claimed, were giantesses--'taller kneeling than I am
standing'--and impervious to age and childbearing, with taut wombs and
breasts that never sagged. ... Best of all, they were 'very desirous
to copulate with us Christians.' Not surprisingly, Vespucci's account
became an instant best seller. ...
"[Unlike Columbus, who never gave up his belief that the lands he
discovered were part of Asia], Vespucci referred to the [South
American] region as 'a new world' unknown to 'our ancestors.' ... In
1507, a year after Columbus's death, the German geographer Martin
Waldseemuller published a text and map adding a 'fourth part' to the
known world of Europe, Asia, and Africa. 'I see no reason why one
should justly object to calling this part Amerige,' Waldseemuller
wrote, 'or America, after Amerigo, its discoverer, a man of great
ability.' His revised world map had 'America' engraved next to a
landmass roughly resembling Brazil.
"Waldseemuller later changed his mind and dropped the name from a
subsequent edition. But 'America' was reprised in 1538 by the great
cartographer Gerard Mercator, who applied it to continents both north
and south."
Tony Horwitz, A Voyage Long and Strange, Henry Holt, Copyright 2008 by
Tony Horwitz, pp. 77-79.
-------------------------------------------------------------------------------------------------

In today's encore excerpt--atoms:
"There are more than than one hundred different types of atoms, from
lightweights like hydrogen and helium through welterweights like tin
and iodine and out to such mumbling mooseheads as ununpentium and
ununquadium, but they're all much the same nearly nil size. You can
fit more than three atoms in a nanometer, meaning it would take 10 to
the 13th power, or ten trillion of them, to coat the disk of our
pinhead. And the funny thing about an atom is that its outlandish
smallness is still too big for it: almost all of its subnanometer span
is taken up by empty space. The real meat of an atom is its core, its
nucleus, which accounts for about 99.9 percent of an atom's matter.
When you step on your bathroom scale, you are essentially weighing the
sum of your atomic nuclei. If you could strip them all from your body,
go on a total denuclear diet, you'd be down to about twenty grams, the
weight of four nickels, or roughly the weight of the doornail that you
would be as dead as.
"Those remaining twenty grams belong to your electrons, the
fundamental particles that orbit an atom's nucleus. An electron has
less than 1/1,800 the mass of a simple atomic nucleus. ... Viewed from
the more impressive angle of volumetrics, we see that, while the
nucleus may make up nearly all of an atom's mass, ... it takes up only
a trillionth of its volume.
"Here it is worth a final reversion to metaphor. If the nucleus of an
atom were a basketball located at the center of Earth, the electrons
would be cherry pits whizzing about in the outermost layer of Earth's
atmosphere. Between our nuclear [basketball] and the whizzing pits,
there would be no Earth: no iron, nickel, magma, soil, sea, or
sky, ... nothing, literally, to speak of. ... We live in a universe
that is largely devoid of matter. Yet still the Milky Way glows, and
still our hemoglobin flows, and when we hug our friends, our fingers
don't sink into the vacuum with which all atoms are filled. If in
touching their skin we are touching the void, why does it feel so
complete?"
Natalie Angier, The Canon, Houghton Mifflin, 2007, pp. 85-86.
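
[A rough check of the pinhead arithmetic above, assuming a pinhead
about 2 mm across--the excerpt's pinhead is introduced earlier in the
book, so its size here is an assumption: the disk's area is pi x
(1,000,000 nm)^2, or about 3.1 x 10^12 square nanometers; at more than
three atoms per nanometer, at least nine atoms fit per square
nanometer, giving roughly 2.8 x 10^13--on the order of the 10^13, or
ten trillion, atoms the excerpt cites.]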

-------------------------------------------------------------------------------------------------

In today's encore excerpt--the current outpouring of opposition to
immigration reflects similar outpourings at many points along the way
in American history. One such example is the fear of Asian immigration
in the early 1900s. However, not only was that fear unfounded, but
fundamental global demographics have now changed:
"In 1907, William Randolph Hearst's San Francisco Examiner published a
two-part Sunday supplement which warned that 'the Yellow Peril is
here.' Hearst was hardly alone in his prediction that 'Japan May Seize
the Pacific Coast.' Those who did not fear outright invasion feared
being out-bred. ... 'If California is to be preserved for the next
generation as a 'white man's country' there must be some movement
started that will restrict the Japanese birthrate in California.' ...
How different it is today. Fertility rates in Asia have reached such
low levels that population loss is inevitable throughout much of the
region.
"When asked how long it will take for the world's population to
double, nearly half of all Americans say 20 years or less. This is
hardly surprising, given the sensations of overcrowding all of us feel
in our day-to-day lives and the persistent reports we hear of teeming
Third World megacities. Yet looking beneath the surface of events, we
can see that world population growth has already slowed dramatically
over the last generation and is headed on a course for absolute
decline. ... These predictions come with considerable certainty. The
primary reason is the unprecedented fall in fertility rates over the
last generation that is now spreading to every corner of the globe. In
both hemispheres, in nations rich and poor, in Christian, Taoist,
Confucian, Hindu, and especially Islamic countries, one broad social
trend holds constant at the beginning of the twenty-first century. As
more and more of the world's population moves to crowded urban areas,
and as women gain in education and economic opportunity, people are
producing fewer and fewer children.
"Today, global fertility rates are half what they were in 1972. No
industrialized nation still produces enough children to sustain its
population over time. ... Germany could easily lose the equivalent of
the current population of East Germany over the next half century.
Russia's population is already decreasing by over three-quarters of a
million a year. Japan's population is meanwhile expected to fall by as
much as one-third. ... Yet the steepest drops in fertility, and the
most rapid rates of population aging, are now occurring in the
developing world. ... Today, when Americans think of Mexico, for
example, they think of televised images of desperate, unemployed
youths swimming the Rio Grande or slipping through border fences.
However, because Mexican fertility rates have dropped so dramatically,
by mid-century Mexico [and most of Latin America] will be a less
youthful country than the United States."
Phillip Longman, The Empty Cradle, Basic Books, 2004, pp. 47, 7-8.
-------------------------------------------------------------------------------------------------

In today's excerpt--the democratic wave in Latin America:
"In 1977 all but four Latin American countries were dictatorships. By
1990 only Cuba was, while Mexico had begun to move along its slow road
to democracy. Even as academic treatises were being published claiming
that Latin America suffered from 'blocked societies,' incapable of
democratic modernization, winds of change blew through the region. The
democratic wave began in the Dominican Republic and moved quickly to
Peru, Ecuador and beyond. At first, some analysts saw this as just
another swing of the pendulum.
"Yet it soon became clear that several deeper factors were at work.
The first was that the international climate was changing. ... In the
late 1970s, Jimmy Carter had proclaimed the importance of human rights
in American foreign policy, especially as regards Latin America. That
led to friction between the United States and some of the
dictatorships. As important, in Spain and Portugal, mortality put [an
end] to the longstanding fascist dictatorships of Francisco Franco and
Antonio de Oliveira Salazar. The Iberian transition to democracy in
the 1970s was highly influential across the Atlantic. Second, state
terror and long years of exile caused the left to reflect on the folly
of its conduct in the 1960s and 1970s. Many left-wingers came to
accept the value of civil liberties and of democracy--without the
derogatory adjectives, such as 'bourgeois' or 'formal,' with which
they had previously vilified it.
"An analogous re-evaluation took place on the right and among
businessmen. Many of them had assumed that dictatorships, free of the
need to satisfy voters, would be able to take the unpopular decisions
required to put in place policies that would guarantee faster economic
growth in the medium to long term. Yet it had not turned out like
that--and this was the third and most important factor behind the turn
to democracy. Most of the dictatorships had proved as incapable of
grappling with the economic challenges facing the region as their
civilian predecessors had been. In fact, if not in left-wing myth,
military officers around the world tend to be hostile to free-market
economics. In Latin America, that was partly because the armed forces
themselves had a vested interest in a large state, since this provided
jobs for officers and subsidies for military enterprises such as arms
factories."
Michael Reid, Forgotten Continent, Yale, Copyright 2007 by Michael
Reid, pp. 120-122.
-------------------------------------------------------------------------------------------------

In today's excerpt--the phrase 'entangling alliances' and the policy
of isolationism, both often falsely attributed to George Washington:
"The central interpretive strain of [George Washington's] Farewell
Address has been to read it as the seminal statement of American
isolationism. Ironically, the phrase most associated with this
interpretive tradition, 'entangling alliances with none,' is not
present in the Farewell Address. (Double irony, it appears in
Jefferson's first inaugural, of all places). Here are the salient
words, which isolationists hurled against Woodrow Wilson in 1917 and
Franklin Roosevelt in 1941: 'Europe has a set of primary interests,
which to us have none, or very remote relation. Hence she must be
engaged in frequent controversies, the causes of which are foreign to
our concerns. ... 'Tis our true policy to steer clear of permanent
Alliances, with any portion of the foreign world.'
"In truth, Washington's isolationist prescription rests atop a deeper
message about American foreign policy, which deserves more recognition
than it has received as the seminal statement in the realistic
tradition. Here are the key words: 'There can be no greater error than
to expect, or calculate upon real favours from Nation to Nation. 'Tis an
illusion which experience must cure, which a just pride ought to
discard.' Washington was saying that the relationship between nations
was not like the relationship between individuals, which could
periodically be conducted on the basis of mutual trust. Nations always
had and always would behave solely on the basis of interest.
"It followed that all treaties were merely temporary arrangements
destined to be discarded once those interests shifted. In the context
of his own time, this was a defense of the Jay Treaty, which
repudiated the Franco-American alliance and aligned America's
commercial interests with British markets as well as protection of the
all-powerful British fleet."
Joseph J. Ellis, His Excellency, Knopf, Copyright 2004 by Joseph J.
Ellis, p. 235.
-------------------------------------------------------------------------------------------------

In today's excerpt--evolutionary biologist Robert Trivers (b. 1943)
argues that, consciously or subconsciously, we keep our rationales for
our actions and beliefs carefully arrayed near the surface--ready as
necessary for our defense:
"The reason the generic human arguing style feels so effortless is
that, by the time the arguing starts, the work has already been done.
Robert Trivers has written about the periodic disputes ... that are
often part of a close relationship, whether a friendship or a
marriage. The argument, he notes, 'may appear to burst forth
spontaneously, with little or no preview, yet as it rolls along, two
whole landscapes of information appear to lie already organized,
waiting only for the lightning of anger to show themselves.'
"The proposition here is that the human brain is, in large part, a
machine for winning arguments, a machine for convincing others that
its owner is in the right--and thus a machine for convincing its owner
of the same thing. The brain is like a good lawyer: given any set of
interests to defend, it sets about convincing the world of their moral
and logical worth, regardless of whether they in fact have any of
either. Like a lawyer, the human brain wants victory, not truth; and,
like a lawyer, it is sometimes more admirable for skill than virtue.
"Long before Trivers wrote about the selfish uses of self-deception,
social scientists had gathered supporting data. In one experiment,
people with strongly held positions on a social issue were exposed to
four arguments, two pro and two con. On each side of the issue, the
arguments were of two sorts: (a) quite plausible, and (b) implausible
to the point of absurdity. People tended to remember the plausible
arguments that supported their views and the implausible arguments
that didn't, the net effect being to drive home the correctness of
their position and the silliness of the alternative.
"One might think that, being rational creatures, we would eventually
grow suspicious of our uncannily long string of rectitude, our
unerring knack for being on the right side of any dispute over credit,
or money, or manners, or anything else. Nope. Time and again--whether
arguing over a place in line, a promotion we never got, or which car
hit which--we are shocked at the blindness of people who dare suggest
that our outrage isn't warranted."
Robert Wright, The Moral Animal, Vintage, Copyright 1994 by Robert
Wright, pp. 280-281.
-------------------------------------------------------------------------------------------------

In today's encore excerpt--ethnic conflict, which is more pervasive
today than ever before, is tragically fundamental to history and is
essential to understanding situations such as present-day Iraq:
"The list of ethnic massacres is a long one. A nonexclusive list of
victims of ethnic massacres since the Romans includes: the Danes in
Anglo-Saxon England in 1002; the Jews in Europe during the First
Crusade, 1096-1099; the French in Sicily in 1282; the French in Bruges
in 1302; the Flemings in England in 1381; the Jews in Iberia in 1391;
converted Jews in Portugal in 1507; the Huguenots in France in 1572;
Protestants in Magdeburg in 1631; Jews and Poles in the Ukraine,
1648-1654; indigenous populations in the United States, Australia, and
Tasmania in the eighteenth and nineteenth centuries; Jews in Russia in
the nineteenth century; the French in Haiti in 1804; Arab Christians
in Lebanon in 1841; Turkish Armenians in 1895-1896 and 1915-1916;
Nestorian, Jacobite, and Maronite Christians in the Turkish Empire in
1915-1916; Greeks in Smyrna in 1922; Haitians in the Dominican
Republic in 1936; the Jewish Holocaust in German-occupied territory,
1933-1945; Serbians in Croatia in 1945; Muslims and Hindus in British
India in 1946-1947; the Chinese in 1965 and the Timorese in 1974 and
1998 in Indonesia; Igbos in Nigeria in 1967-1970; the Vietnamese in
Cambodia in 1970-1978; the Bengalis in Pakistan in 1971; the Tutsis in
Rwanda in 1956-1965, 1972, and 1993-1994; Tamils in Sri Lanka in 1958,
1971, 1977, 1981, and 1983; Armenians in Azerbaijan in 1990; Muslims
in Bosnia in 1992; Kosovars and Serbians in Kosovo in 1998-2000. To
show how far from exhaustive this list is, the political scientist Ted
Gurr counted fifty ethnically based conflicts in 1993-1994 alone. ...
"As Scientific American said in September 1998, 'Many of the world's
problems stem from the fact that it has 5,000 ethnic groups but only
190 countries.' ...
"Ethnic diversity does not automatically imply ethnic conflict,
violent or otherwise; it merely reflects the potential for such
conflict, if opportunistic politicians try to exploit ethnic divisions
to gain an ethnic power base. Apparently such opportunism is
common. ... High ethnic diversity is a good predictor of civil war and
genocide. The risk of civil war is two and a half times higher in the
most ethnically diverse quarter of the [countries in the] sample as
compared to the least ethnically diverse quarter. The risk of
genocide is three times higher in the same comparison. ...
"[However,] ethnically diverse countries with good institutions tend
to escape the violence, poverty, and redistribution usually associated
with ethnic diversity. Democracy also helps neutralize ethnic
differences; ethnically diverse democracies don't seem to be at an
economic disadvantage relative to ethnically homogeneous democracies."
William Easterly, The Elusive Quest for Growth, The MIT Press,
Copyright 2001 by MIT, pp. 268-278.
-------------------------------------------------------------------------------------------------

In today's excerpt--seeing ourselves in photographs and mirrors:
"Researchers have determined that mirrors can subtly affect human
behavior, often in surprisingly positive ways. Subjects tested in a
room with a mirror have been found to work harder, to be more helpful
and to be less inclined to cheat, compared with control groups
performing the same exercises in nonmirrored settings. Reporting in
the Journal of Personality and Social Psychology, C. Neil Macrae,
Galen V. Bodenhausen and Alan B. Milne found that people in a room
with a mirror were comparatively less likely to judge others based on
social stereotypes about, for example, sex, race or religion.
" 'When people are made to be self-aware, they are likelier to stop
and think about what they are doing,' Dr. Bodenhausen said. 'A
byproduct of that awareness may be a shift away from acting on
autopilot toward more desirable ways of behaving.' Physical self-reflection, in other words, encourages philosophical self-reflection,
a crash course in the Socratic notion that you cannot know or
appreciate others until you know yourself. ...
"In a report titled 'Mirror, Mirror on the Wall: Enhancement in SelfRecognition,' which appears online in The Personality and Social
Psychology Bulletin, Nicholas Epley and Erin Whitchurch described
experiments in which people were asked to identify pictures of
themselves amid a lineup of distracter faces. Participants identified
their personal portraits significantly quicker when their faces were
computer enhanced to be 20 percent more attractive. They were also
likelier, when presented with images of themselves made prettier,
homelier or left untouched, to call the enhanced image their genuine,
unairbrushed face. Such internalized photoshoppery is not simply the
result of an all-purpose preference for prettiness: when asked to
identify images of strangers in subsequent rounds of testing,
participants were best at spotting the unenhanced faces.
"How can we be so self-delusional when the truth stares back at us?
'Although we do indeed see ourselves in the mirror every day, we don't
look exactly the same every time,' explained Dr. Epley, a professor of
behavioral science at the University of Chicago Graduate School of
Business. There is the scruffy-morning you, the assembled-for-work
you, the dressed-for-an-elegant-dinner you. 'Which image is you?' he
said. 'Our research shows that people, on average, resolve that
ambiguity in their favor, forming a representation of their image that
is more attractive than they actually are.' "
Natalie Angier, "Mirrors Don't Lie. Mislead? Oh, Yes." The New York
Times, Science Times, July 22, 2008, F1.
-------------------------------------------------------------------------------------------------

In today's excerpt--divorce customs, ancient and not-so-ancient:
"For nearly a thousand years, an Englishman sick of his wife could
slip a halter around her neck, lead her to market--the cattle market--
and sell her to the highest bidder, often with her willing
participation. This informal route to divorce for the lower classes
lasted, amazingly, until at least 1887. ... [As reported by nonfiction authors Lawrence Stone in The Family, Sex, and Marriage and
Samuel Menefee in Wives for Sale], a drunken husband sells his wife in
the opening chapter of Thomas Hardy's The Mayor of Casterbridge
(1886), much to the astonishment of contemporary critics. Oblivious to
the informal, unlawful marriage and divorce customs of the less
literate brethren ('wife-sale' dates back to c. 1073), they could not
imagine such a thing happening on British soil in the nineteenth
century, even though popular broadsides depicting the practice (one of
which illustrates the cover of Menefee's book) were still being
produced and widely circulated during that same century. ...
"[In the Old Testament, the law allowed for divorce because of
infertility, and] Israelite men could divorce their wives for reasons
far more vague than infertility. (Wives couldn't divorce their
husbands for any reason.) If, for instance, 'she fails to please him
because he finds something obnoxious about her,' there's no need to
hire a pricey lawyer. He simply 'writes her a bill of divorcement,
hands it to her, and sends her away from his house.' He'd better be
sure this is what he wants, because he can't have her back again. ...
"The Bible, leaving nothing to chance, provides soldiers with a lesson
on the fine art of taking enemy women to wife after the enemy has been
vanquished. ... You don't just throw her to the ground and have your
way with her then and there. You don't throw her on the ground at all.
And you don't have your way with her for an entire month. No, 'you
shall bring her into your house, and she shall trim her hair, pare her
nails, and discard her captive's garb. She shall spend a month's time
in your house lamenting her father and mother; after that you may come
to her and possess her, and she shall be your wife.' The lesson
includes instruction on how to get rid of her, too. No bill of
divorcement is required, but restrictions do apply: 'Then, should you
no longer want her, you must release her outright. You must not sell
her for money; since you had your will of her, you must not enslave
her.' "
Susan Squire, I Don't: A Contrarian History of Marriage, Bloomsbury,
Copyright 2008 by Susan Squire, pp. 36-44, 227
-------------------------------------------------------------------------------------------------
In today's excerpt--in a Roman empire that was ruled by a small number
of elites, heavily populated by slaves and the poor, and possessed of
a flaccid paganism, Christianity grew from ten thousand believers in
100 CE to six million in 300 CE. It was the fastest spread of a
religion in history until the rise of Islam in the seventh century CE.
And it spread in spite of the difficulty of maintaining uniform
beliefs given poor communication, poor literacy and wide geographical
dispersion:
"Early Christianity was tiny and scattered. No precise figures
survive, but best estimates suggest that there were considerably fewer
than ten thousand Christians in 100 CE, and only about two hundred
thousand Christians in 200 CE, dispersed among several hundred towns.
The late-second-century figure equals only 0.3 percent of the total
population of the Roman empire (which was about 60 million). ... The
rapidity of its growth rate helps explain why coded statements of
belief, rather than complex rules of practice, were the passport to
full membership. The very small size of Christianity helps explain why
the Roman state paid so little attention to suppressing it
effectively. ...
"In the early stages of Christianity, at any one time, perhaps only a
few dozen Christians could read or write fluently. On the numbers
which I have just cited, and even if we allow for a significantly
higher rate of literacy among Christians than among pagans (outside of
the ruling elite), by the end of the first century all Christianity is
likely to have included, at any one time, less than fifty adult men
who could write or read biblical texts fluently. ...
"Religion was not a frontier along which the Roman elite considered it
needed to defend itself with vigor, at least not until the middle of
the third century. And when the state did attack the Christian church
on a massive scale ... the number of Christians, in spite of temporary
setbacks, continued to grow. ... By the end of the third century,
perhaps 10 percent of the empire's population--6 million out of 60
million people--were Christians. The emperor Constantine openly
converted to Christianity in 312, and the emperors who succeeded him
were also Christian. ... It is difficult to decide whether this [turn
of events], which had so much influence on the future course of
western culture, should be called a triumph of the Christian church or
the triumph of the Roman state."
"What is amazing is that in spite of the practical difficulties of
size, dispersion, rapid growth, and illegality, and in spite of the
startling variety of early Christian beliefs, Christian leaders
actively pursued and preserved the ideal of unity and orthodoxy."
Keith Hopkins, A World Full of Gods, Plume, Copyright 1999 by Keith
Hopkins, 2001, pp. 82-84.
-------------------------------------------------------------------------------------------------
In today's excerpt--after the Black Death, the terrifying plague that
killed one-third to one-half of all Europe's inhabitants from 1347 to
1349, there was a change in the names parents gave their children:
"The centrality of religion in medieval European life is impossible to
overstate. ... If you want to pray, you go to your parish and submit
to the direction of a priest. If you want to confess, you sit in the
confessional and [tell] your sins to the man on the other side of the
partition, who pronounces judgement and penance. ...
"Then along comes the Black Death, mowing down the sinful and the
sinless indiscriminately. ... You can be healthy on Monday, infected
on Tuesday, and a corpse on Saturday, leaving precious little time to
wipe the sin slate clean by confessing and repenting in preparation
for your personal judgement day. The biggest hurdle of all might have
been luring the priest, any priest, to one's deathbed of contagion in
order to perform last rites, the final cleansing. If a cleric does
show up, he might charge an outrageous price for mumbling a few
prayers. Stories of deathbed fee-gougers also abound, adding to the
popular perception that extravagance and greed motivate more often
than not. ...
"Once the epidemic is over, the survivors increasingly turn away from
organized religion. Instead, they put their faith in the saints,
especially those associated with pain and suffering. One modern
historian conducted a comparative study of the most popular names for
boys in Florence following the Black Death, in part to determine its
effect on religious practice. That effect appears to be, in a word,
enormous. Virtually no Florentine born before 1350 was named
'Antonio,' after Anthony of Padua, the patron saint of the oppressed,
the elderly, the poor, and the starving. After 1427 the name ranked
second. At number six, also unknown preceding the plague, is
Bartolomeo--after one of the original twelve apostles; he was
purportedly flayed alive and crucified by the Romans, surely
qualifying him for the pain-and-suffering category. (Michelangelo's
The Last Judgement shows Bartholomew clutching his skin, the organ of
the body that most visibly bears the signs of Black Death.)
"Also rising out of nowhere to the heights of post-plague fashion is
Lorenzo. Here the inspiration is Lawrence of Rome, a third-century
deacon who achieved martyrdom by being roasted on a gridiron. The
sudden vogue for 'Christopher,' patron saint of pestilence, needs no
further explanation."
Susan Squire, I Don't, Bloomsbury, Copyright 2008 by Susan Squire, pp.
166-167.
-------------------------------------------------------------------------------------------------
In today's excerpt--how China, once the world's economic and
technological leader, fell behind. It closed its doors to the outside
world in 1434, and with this isolation from commerce and ideas began a
centuries-long period of stagnation:
"China's population of 1.3 billion constitutes more than a fifth of
humanity. Asia's population, in total, includes 60 percent of
humanity. Asia's fate is truly the world's fate. ... China and India
are ancient civilizations that in important ways were far ahead of
Europe not so many centuries ago. The rise of the West--the western
part of the Eurasian landmass--was one of the great ruptures of human
history, overturning a millennium or more in which Asia
rather than Europe had the technological lead. [Today], Asia is not
merely catching up with Europe and the United States, it is also
catching up with its own past as a technological leader. ...
"Where did China stumble, and why? ... Around the start of the
sixteenth century, just after Columbus had found the sea route to the
Americas and Vasco da Gama had circled the Cape of Good Hope to reach
Asia by sea, China was clearly the world's technological superpower,
and had been so for at least a millennium. Europe conquered Asia after
1500 with the compass, gunpowder, and the printing press, all Chinese
innovations. There was nothing fated about such a turnaround. China's
dominance, it appears, was squandered, and 1434 is increasingly
understood to be a pivotal year.
"In that year, the Ming emperor effectively closed China to
international trade, dismantling the world's largest and most advanced
fleet of ocean vessels. Between 1405 and 1433, the Chinese fleet,
under the command of the famed eunuch admiral, Zheng He, had visited
ports of the Indian Ocean all the way to East Africa, showing the
flag, transmitting Chinese culture and knowledge, and exploring the
vast lands of the Indian Ocean region. Then, all at once, the imperial
court decided that the voyages were too expensive, perhaps because of
increased threats of nomadic incursions over China's northern land
border. For whatever reason, the emperor ended ocean-going trade and
exploration, closed down shipyards, and placed severe limitations on
Chinese merchant trade for centuries to come. Never again would China
enjoy technological leadership in naval construction and navigation,
or command the seas even in its own neighborhood. ...
"In 1975, China's per capita income was a mere 7.5 percent of Western
Europe's. Since then ... China has soared, reaching around 20 percent
of Europe's income level by 2000. ... China is ending extreme poverty,
and is on its way to reversing centuries of relative decline."
Jeffrey Sachs, The End of Poverty, Penguin, Copyright 2005 by Jeffrey
Sachs, pp. 149-151.
-------------------------------------------------------------------------------------------------
In today's excerpt--in the chivalrous twelfth century, relationships
and sex, viewed as dutiful and dispassionate under the Church, begin
to emerge as rapturous and transcendent. The new age of courtly love
sweeps through the courts of Europe and engenders a new genre of songs
and poems. Aiding in this transformation are Eleanor of Aquitaine
(1122-1204) and the troubadours:
"The [new] game of courtly love is an elaborate blueprint for the
building of desire, as opposed to the quenching of it. The higher it
builds without fulfillment, the more perfect a lover the knight proves
himself to be. ...
"Consummated or not, courtly love is by definition adulterous. The
knight who jousts on horseback, sword in hand, competes against other
knights for a highly desirable lady. But they're not fighting for her
hand in marriage, or even for the privilege of courting her. She
already has a husband. Initially, at least, they're not even fighting
for the privilege of sleeping with her. They're fighting for the
privilege of loving her--synonymous with serving her. ...
"In 1154, Henry, Duke of Normandy, captures the English throne as
Henry II, making his wife Eleanor [of Aquitaine] a queen for the second
time--and [through her] bestowing upon the English court a resident
expert on the rules of the game. From there the ideal of love ... will
be converted into the middle-class ideal of marriage: the melding of
two minds, bodies, and hearts into one. ... Eleanor and her kin would
find it next to unimaginable that the heady quality of adultery would
one day converge with the dutiful, dispassionate quality of marriage
as they experience it.
"Maybe that's what finally enables the convergence: Love enters
marriage through the extramarital back door. As [Christian author]
C.S. Lewis noted in his study of courtly doctrine, The Allegory of Love,
'Any idealization of sexual love, in a society where marriage is
purely utilitarian, must begin by an idealization of adultery.' ...
"What troubadors bring about is the reinvention of love. They make its
pursuit desirable, even admirable. Previously, epic tales of sexual
desire ended in mutually assured destruction for all concerned. ...
[Now], to gamble all you have, even your life, on romantic rapture
becomes the route to transcendence. The most memorable romantic lovers
of courtly literature--Tristan and Isolde, Lancelot and Guinevere,
Troilus and Cressida--meet tragic ends, but noble ones. They martyr
themselves for the glory of the faith. The new religion of love is a
wedge to the future."
Susan Squire, I Don't, Bloomsbury, Copyright 2008 by Susan Squire, pp.
151-159.
--------------------------------------------------------------------------------------------------

In today's excerpt--the dominant ingredient in the American diet,


directly or indirectly, is corn, which is in a quarter of the fortyfive thousand items found in a typical supermarket:
"An American supermarket ... is dominated by a single species:Zea
mays, the giant tropical grass most Americans know as corn. Corn is
what feeds the steer that becomes the steak. Corn feeds the chicken
and the pig, the turkey and the lamb, the catfish and the tilapia and,
increasingly, even the salmon, a carnivore by nature that the fish
farmers are reengineering to tolerate corn. ... The milk and cheese
and yogurt, which once came from dairy cows that grazed on grass, now
typically come from Holsteins that spend their working lives indoors
tethered to machines, eating corn.
"Head over to the processed foods and you find ever more intricate
manifestations of corn. A chicken nugget, for example, piles corn upon
corn: what chicken it contains consists of corn, of course, but so do
most of a nugget's other constituents, including the modified corn
starch that glues the thing together, the corn flour in the batter
that coats it, the corn oil in which it gets fried. Much less
obviously, the leavenings and lecithin, the mono-, di-, and
triglycerides, the attractive golden coloring, and even the citric
acid that keeps the nugget 'fresh' can all be derived from corn.
"To wash down your chicken nuggets with virtually any soft drink is to
have some corn with your corn. Since the 1980s virtually all the sodas
and most of the fruit drinks sold in the supermarket have been
sweetened with high-fructose corn syrup--after water, corn sweetener
is their principal ingredient. Grab a beer for your beverage instead
and you'd still be drinking corn, in the form of alcohol fermented
from glucose refined from corn.
"Read the ingredients on the label of any processed food and, provided
that you know the chemical names it travels under, corn is what you
will find. For modified or unmodified starch, for glucose syrup and
maltodextrin, for crystalline fructose and ascorbic acid, for lecithin
and dextrose, lactic acid and lysine, for maltose and HFCS, for MSG
and polyols, for the caramel color and the xanthan gum, read: corn.
Corn is in the coffee whitener and Cheez Whiz, the frozen yogurt and
TV dinner, the canned fruit and ketchup and candies, the soups and
snacks and cake mixes, the frosting and gravy and frozen waffles, the
syrups and hot sauces, the mayonnaise and mustard, the hot dogs and
the bologna, the margarine and the shortening, the salad dressings and
the relishes and even the vitamins. Yes, it's even in the Twinkies,
too. There are some forty-five thousand items in the average American
supermarket and more than a quarter of them now contain corn.
"Even in produce on a day when there's ostensibly no corn for sale
you'll nevertheless find plenty of corn: in the vegetable wax that
gives the cucumbers their sheen, in the pesticide responsible for the
produce's perfection, even in the coating on the cardboard it was
shipped in. Indeed, the supermarket itself--the wallboard and joint
compound, the linoleum and fiberglass and adhesives out of which the
building itself has been built--is in no small measure a manifestation
of corn."
Michael Pollan, The Omnivore's Dilemma, Penguin, Copyright 2006 by
Michael Pollan, pp. 18-19.
-------------------------------------------------------------------------------------------------
In today's encore excerpt--explanations rob events of their emotional
impact:
"Explanations allow us to make full use of our experiences, but they
also change the nature of those experiences. As we have seen, when
experiences are unpleasant, we quickly move to explain them in ways
that make us feel better, and indeed, studies show that the mere act
of explaining an unpleasant event can help defang it. ... But just as
explanations ameliorate the impact of unpleasant events, so too do
they ameliorate the impact of pleasant events. ...
"For example, college students volunteered for a study in which they
believed they were interacting in an online chat room with students
from other universities. In fact, they were actually interacting with
a sophisticated computer program that simulated the presence of other
students. After the simulated students had provided the real students
with information about themselves, the researcher pretended to ask the
simulated students to decide which of the people in the chat room they
liked most ... in just a few minutes, something remarkable happened:
Each real student received e-mail messages from every one of the
simulated students indicating they liked that student best!
"Now, here's the catch: Some real students (informed group) received
e-mail that allowed them to know which simulated student wrote each of
the messages, and other real students (uninformed group) received
e-mail messages that had been stripped of that identifying
information. ... Hence, real students in the informed group were able
to generate explanations for their good fortune ('Eva appreciates my
values because we're both involved in Habitat for Humanity') ...
whereas real students in the uninformed group were not ('Someone
appreciates my values, I wonder who?') ... Although real students in
both groups were initially delighted to have been chosen as everyone's
best friend, only the real students in the uninformed group remained
delighted fifteen minutes later. If you've ever had a secret admirer,
then you understand why. ...
"The reason why unexplained events have a disproportionate emotional

impact is that we are especially likely to keep thinking about them.


People spontaneously try to explain events, and studies show that when
people do not complete the things they set out to do, they are
especially likely to think about and remember their unfinished
business. Once we explain an event, we can fold it up like fresh
laundry, put it away in memory's drawer, and move on to the next one;
but if an event defies explanation, it becomes a mystery ... and
refuses to stay in the back of our minds. ... Explanation robs events
of their emotional impact because it makes them seem likely and allows
us to stop thinking about them."
Daniel Gilbert, Stumbling on Happiness, Knopf, 2006, pp. 186-189.
-------------------------------------------------------------------------------------------------
In today's excerpt--happiness:
"Imagine a long, terrible dental procedure. You are rigid in the
chair, hands clenched, soaked with sweat--and then the dentist leans
over and says, 'We're done now. You can go home. But if you want, I'd
be happy to top you off with a few minutes of mild pain.'
"There is a good argument for saying 'Yes. Please do.'
"The psychologist and recent Nobel laureate Daniel Kahneman conducted
a series of studies on the memory of painful events, such as
colonoscopies. He discovered that when we think back on these events,
we are influenced by the intensity of the endings, and so we have a
more positive memory of an experience that ends with mild pain than of
one that ends with extreme pain, even if the mild pain is added to the
same amount of extreme pain. At the moment the dentist makes you his
offer, you would, of course, want to say no--but later on, you would
be better off if you had said yes, because your overall memory of the
event wouldn't be as unpleasant.
"Such contradictions arise all the time. If you ask people which makes
them happier, work or vacation, they will remind you that they work
for money and spend the money on vacations. But if you give them a
beeper that goes off at random times, and ask them to record their
activity and mood each time they hear a beep, you'll likely find that
they are happier at work. Work is often engaging and social; vacations
are often boring and stressful. Similarly, if you ask people about
their greatest happiness in life, more than a third mention their
children or grandchildren, but when they use a diary to record their
happiness, it turns out that taking care of the kids is a downer--
parenting ranks just a bit higher than housework, and falls below sex,
socializing with friends, watching TV, praying, eating, and cooking."
Paul Bloom, "First Person Plural," The Atlantic, November 2008, pp.
90-92.
-------------------------------------------------------------------------------------------------
In today's excerpt--over the last several decades, programs to aid the
global poor that have been conceived at high levels within foreign aid
organizations have routinely fallen short. Those conceived in the
field have the better record of effectiveness:
"At the World Economic Forum in Davos in 2005, celebrities from Gordon
Brown to Bill Clinton to Bono liked the idea of bed nets as a major
cure for poverty. Sharon Stone jumped up and raised a million dollars
on the spot (from an audience made up largely of middle-aged males)
for more bed nets in Tanzania. Insecticide-treated bed nets can
protect people from being bitten by malarial mosquitoes while they
sleep, which significantly lowers malaria infections and deaths. But
if bed nets are such an effective cure, why hadn't planners already
gotten them to the poor? Unfortunately, neither celebrities nor aid
administrators have many ideas for how to get bed nets to the poor.
Such nets are often diverted to the black market, become out of stock
in health clinics, or wind up being used as fishing nets or wedding
veils. ...
"[Field workers at] Population Services International (PSI),
headquartered in Washington, D.C., ... stumbled across a way to get
insecticide-treated bed nets to the poor in Malawi, with initial
funding and logistical support from official aid agencies. PSI sells
bed nets for fifty cents to mothers through antenatal clinics in the
countryside, which means it gets the nets to those who both value them
and need them. (Pregnant women and children under five are the
principal risk groups for malaria.) The nurse who distributes the nets
gets nine cents per net to keep for herself, so the nets are always in
stock. PSI also sells nets to richer urban Malawians through privatesector channels for five dollars a net. The profits from this are used
to pay for the subsidized nets sold at the clinics, so the program
pays for itself. PSI's bed net program increased the nationwide
average of children under five sleeping under nets from 8 percent in
2000 to 55 percent in 2004, with a similar increase for pregnant
women.
"A follow-up survey found almost universal use of the nets by those
people who paid for them. By contrast, a study of a program to hand
out free nets in Zambia to people, whether they wanted them or not
(the favored approach of the centralized planners), found that 70
percent of the recipients didn't use the nets. The 'Malawi model' is
now spreading to other African countries."
William Easterly, The White Man's Burden, Penguin, Copyright 2006 by
William Easterly, pp. 13-14.

-------------------------------------------------------------------------------------------------
In today's excerpt--science becomes a profession:
"It is natural to describe key [scientific] events in terms of the
work of individuals who made a mark in science--Copernicus, Vesalius,
Darwin, Wallace and the rest. But this does not mean that science has
progressed as a result of the work of a string of irreplaceable
geniuses possessed of a special insight into how the world works.
Geniuses maybe (though not always); but irreplaceable certainly not.
Scientific progress builds step by step, and as the example of Charles
Darwin and Alfred Russel Wallace [who independently and simultaneously
put forward the theory of evolution] shows, when the time is ripe, two
or more individuals may make the next step independently of one
another. It is the luck of the draw, or historical accident, whose
name gets remembered as the discoverer of a new phenomenon.
"What is much more important than human genius is the development of
technology, and it is no surprise that the start of the scientific
revolution 'coincides' with the development of the telescope and the
microscope. ... If Newton had never lived, scientific progress might
have been held back by a few decades. But only by a few decades.
Edmond Halley or Robert Hooke might well have come up with the famous
inverse square law of gravity; Gottfried Leibniz actually did invent
calculus independently of Newton (and made a better job of it); and
Christiaan Huygens's superior wave theory of light was held back by
Newton's espousal of the rival particle theory. ...
"Although the figure of Charles Darwin dominates any discussion of
nineteenth-century science, he is something of an anomaly. It is
during the nineteenth century--almost exactly during Darwin's
lifetime--that science makes the shift from being a gentlemanly hobby,
where the interests and abilities of a single individual can have a
profound impact, to a well-populated profession, where progress
depends on the work of many individuals who are, to some extent,
interchangeable. Even in the case of the theory of natural selection,
as we have seen, if Darwin hadn't come up with the idea, Wallace would
have, and from now on we will increasingly find that discoveries are
made more or less simultaneously by different people working
independently and largely in ignorance of one another. ... The other
side of this particular coin, unfortunately, is that the growing
number of scientists brings with it a growing inertia and resulting
resistance to change, which means that all too often when some
brilliant individual does come up with a profound new insight into the
way the world works, this is not accepted immediately on merit and may
take a generation to work its way into the collective received wisdom
of science. ...

"In 1766, there were probably no more than 300 people who we would now
class as scientists in the entire world. By 1800, ... there were about
a thousand. By ... 1844, there were about 10,000, and by 1900
somewhere around 100,000. Roughly speaking, the number of scientists
doubled every fifteen years during the nineteenth century. But
remember that the whole population of Europe doubled, from about 100
million to about 200 million, between 1750 and 1850, and the
population of Britain alone doubled between 1800 and 1850, from
roughly 9 million to roughly 18 million."
John Gribbin, The Scientists, Random House, Copyright 2002 by John and
Mary Gribbin, pp. xix-xx, 359-361.
-------------------------------------------------------------------------------------------------
In today's excerpt--people's ability to rationalize choices after they
are already made:
"Unspoken assumptions and implied information are important to both
the perception of a trick and its subsequent reconstruction. Magician
James Randi ("the Amaz!ng Randi") notes that spectators are more
easily lulled into accepting suggestions and unspoken information than
direct assertions. Hence, in the reconstruction the spectator may
remember implied suggestions as if they were direct proof.
"Psychologists Petter Johansson and Lars Hall, both at Lund University
in Sweden, and their colleagues have applied this and other magic
techniques in developing a completely novel way of addressing
neuroscientific questions. They presented picture pairs of female
faces to naive experimental subjects and asked the subjects to choose
which face in each pair they found more attractive. On some trials the
subjects were also asked to describe the reasons for their choice.
Unknown to the subjects, the investigators occasionally used a
sleight-of-hand technique, learned from a professional magician named
Peter Rosengren, to switch one face for the other--after the subjects
made their choice.
"Thus, for the pairs that were secretly manipulated, the result of the
subject's choice became the opposite of his or her initial intention.
Intriguingly, the subjects noticed the switch in only 26 percent of
all the manipulated pairs. But even more surprising, when the subjects
were asked to state the reasons for their choice in a manipulated
trial, they confabulated to justify the outcome--an outcome that was
the opposite of their actual choice! Johansson and his colleagues call
the phenomenon "choice blindness." By tacitly but strongly suggesting
the subjects had already made a choice, the investigators were able to
study how people justify their choices--even choices they do not
actually make."
Susana Martinez-Conde and Stephen L. Macknik, "The Magic and the
Brain," Scientific American, December 2008, pp. 77-78.
-------------------------------------------------------------------------------------------------
In today's excerpt--Reinhold Niebuhr, Missouri-born theologian and
cassandric commentator on American culture in the mid-twentieth
century, is invoked by a twenty-first century cassandra, Andrew
Bacevich, in his commentary The Limits of Power:
"As pastor, teacher, activist, theologian, and prolific author,
Niebuhr was a towering presence in American intellectual life from the
1930s through the 1960s. Even today, he deserves recognition as the
most clear-eyed of American prophets. Niebuhr speaks to us from the
past, offering truths of enormous relevance to the present. As
prophet, he warned that what he called 'our dreams of managing
history' ... posed a potentially mortal threat to the United
States. ...
"Niebuhr wrote after World War II [that] ... a position of apparent
preeminence placed the United States 'under the most grievous
temptations to self-adulation.' ...
"Niebuhr once wrote disapprovingly of Americans, their 'culture soft
and vulgar, equating joy with happiness and happiness with
comfort.' ... In Niebuhr's words, they will cling to 'a culture which
makes 'living standards' the final norm of the good life and which
regards the perfection of techniques as the guarantor of every
cultural as well as every social-moral value.' ...
"Niebuhr [also] wrote, "One of the most pathetic aspects of human
history is that every civilization expresses itself most
pretentiously, compounds its partial and universal values most
convincingly, and claims immortality for its finite existence at the
very moment when the decay which leads to death has already
begun.' ...
" 'The trustful acceptance of false solutions for our perplexing
problems,' he wrote a half century ago, 'adds a touch of pathos to the
tragedy of our age.' ... For all nations, Niebuhr once observed, 'The
desire to gain an immediate selfish advantage always imperils their
ultimate interests. If they recognize this fact, they usually realize
it too late.' "
Andrew J. Bacevich, The Limits of Power, Metropolitan, Copyright 2008
by Andrew J. Bacevich, pp. 8-12, 182.
--------------------------------------------------------------------------------------------------

In today's excerpt--happiness is contagious, and it is more contagious
than unhappiness. In a twenty-year study of over 4,700 individuals,
Harvard professor Nicholas Christakis carefully examined the effects of
happiness within a social network, and documented the results in the
British Medical Journal. The following is excerpted from the much more
detailed paper presented in a recent issue of that journal:
"Emotional states can be transferred directly from one individual to
another by mimicry and 'emotional contagion,' perhaps by the copying
of emotionally relevant bodily actions, particularly facial
expressions, seen in others. People can 'catch' emotional states they
observe in others over time frames ranging from seconds to weeks. For
example, students randomly assigned to a mildly depressed room-mate
became increasingly depressed over a three-month period, and the
possibility of emotional contagion between strangers, even those in
ephemeral contact, has been documented by the effects of 'service with
a smile' on customer satisfaction and tipping. ...
"While there are many determinants of happiness, whether an individual
is happy also depends on whether others in the individual's social
network are happy. Happy people tend to be located in the centre of
their local social networks and in large clusters of other happy
people. The happiness of an individual is associated with the
happiness of people up to three degrees removed in the social network.
Happiness, in other words, is not merely a function of individual
experience or individual choice but is also a property of groups of
people. Indeed, changes in individual happiness can ripple through
social networks and generate large scale structure in the network,
giving rise to clusters of happy and unhappy individuals. These
results are even more remarkable considering that happiness requires
close physical proximity to spread and that the effect decays over
time. ...
"These models show that happy alters (friends) consistently influence
[an individual's] happiness more than unhappy alters, and only the
total number of happy alters remains significant in all
specifications. In other words, the number of happy friends seems to
have a more reliable effect on ego happiness than the number of
unhappy friends. Thus, the social network effect of happiness is
multiplicative and asymmetric. ...
"All these relations indicate the importance of physical proximity,
and the strong influence of neighbours suggests that the spread of
happiness might depend more on frequent social contact than deep
social connections. ...
"Happiness spreads significantly more through same sex relationships
than opposite sex relationships. ...

"Conclusions: Human happiness is not merely the province of isolated


individuals. ... The better connected are one's friends and family,
the more likely one will attain happiness in the future. ... People's
happiness depends on the happiness of others with whom they are
connected. This provides further justification for seeing happiness,
like health, as a collective phenomenon."
James H Fowler and Nicholas A Christakis, "Dynamic spread of happiness
in a large social network: longitudinal analysis over 20 years in the
Framingham Heart Study," British Medical Journal, 4 December 2008.
-------------------------------------------------------------------------------------------------
In today's encore excerpt--raw versus cooked. Since brain tissue
requires 22 times the food energy that skeletal muscle does, Homo
erectus would have had to chew raw food for six hours each day to
obtain enough food energy to sustain its brain size. This fact has led
to Harvard anthropologist Richard Wrangham's controversial theory that
fire and cooking were the necessary steps that allowed for the
evolution of larger brains:
" 'I tend to think about human evolution through the lens of chimps,'
Wrangham remarks. 'What would it take to convert a chimpanzee-like
ancestor into a human?' Fire to cook food, he reasoned, which led to
bigger bodies and brains. And that is exactly what he found in Homo
erectus, our ancestor that first appeared 1.6 million to 1.9 million
years ago. H. erectus's brain was 50 percent larger than that of its
predecessor, H. habilis, and it experienced the biggest drop in tooth
size in human evolution. 'There's no other time that satisfies the
expectations that we would have for changes in the body that would be
accompanied by cooking,' Wrangham says.
"The problem with his idea: proof is slim that any human could control
fire that far back. Other researchers believe cooking did not occur
until perhaps only 500,000 years ago. ...
"So Wrangham did more research. He examined groups of modern huntergatherers all over the world and found that no human group currently
eats all their food raw. Humans seem to be well adapted to eating
cooked food: modern humans need a lot of high-quality calories (brain
tissue requires 22 times the energy of skeletal muscle); tough,
fibrous fruits and tubers cannot provide enough. Wrangham and his
colleagues calculated that H. erectus (which was in H. sapiens's size
range) would have to eat roughly 12 pounds of raw plant food a day, or
six pounds of raw plants plus raw meat, to get enough calories to
survive. Studies on modern women show that those on a raw vegetarian
diet often miss their menstrual periods because of lack of energy.
Adding high-energy raw meat does not help much, either--Wrangham found
data showing that even at chimps' chewing rate, which can deliver them
400 food calories per hour, H. erectus would have needed to chew raw
meat for 5.7 to 6.2 hours a day to fulfill its daily energy needs.
When it was not gathering food, it would literally be chewing that
food for the rest of the day. ... [Animals] expend less effort
breaking down cooked food than raw. Heat alters the physical structure
of proteins and starches, thereby making enzymatic breakdown easier.
"Wrangham's theory would fit together nicely if not for that pesky
problem of controlled fire. Wrangham points to some data of early
fires that may indicate that H. erectus did indeed tame fire."
Rachael Moeller Gorman, "Cooking Up Bigger Brains," Scientific
American, January 2008, pp. 102-104.
-------------------------------------------------------------------------------------------------
In today's excerpt--with the fall of New Orleans in 1862, the defeat of
the South in the American Civil War became almost assured, since with
the fall of New Orleans, the South lost its primary means of paying
for the war:
"The fall of Vicksburg is always seen as one of the great turning
points in the war. And yet, from a financial point of view, it was
really not the decisive one. The key event had happened more than a
year before, two hundred miles downstream from Vicksburg, where the
Mississippi joins the Gulf of Mexico. On 29 April 1862, Flag Officer
David Farragut had run the guns of Fort Jackson and Fort St Philip to
seize control of New Orleans. This was a far less bloody and
protracted clash than the siege of Vicksburg, but equally disastrous
for the Southern cause. ...
"In the final analysis, it was as much a lack of hard cash as a lack
of industrial capacity or manpower that undercut what was, in military
terms, an impressive effort by the Southern states. ...
"When, [in order to finance its war,] the Confederacy tried to sell
conventional bonds in European markets, investors showed little
enthusiasm. But the Southerners had an ingenious trick up their
sleeves. The trick (like the sleeves themselves) was made of cotton,
the key to the Confederate economy and by far the South's largest
export. The idea was to use the South's cotton crop not just as a
source of export earnings, but as collateral for a new kind of cotton-backed bond. When the obscure French firm of Emile Erlanger and Co.
started issuing cotton-backed bonds on the South's behalf, the
response in London and Amsterdam was more positive. ...
"Yet the South's ability to manipulate the bond market depended on one
overriding condition: that investors should be able to take physical
possession of the cotton which underpinned the bonds if the South
failed to make its interest payments. Collateral is, after all, only
good if a creditor can get his hands on it. And that is why the fall
of New Orleans in April 1862 was the real turning point in the
American Civil War. ...
"With its domestic bond market exhausted and only two paltry foreign
loans, the Confederate government was forced to print unbacked paper
dollars to pay for the war and its other expenses, 1.7 billion
dollars' worth in all. Both sides in the Civil War had to print money,
it is true. But by the end of the war the Union's 'greenback' dollars
were still worth about 50 cents in gold, whereas the Confederacy's
'greybacks' were worth just one cent, despite a vain attempt at
currency reform in 1864. With ever more paper money chasing ever fewer
goods, inflation exploded. Prices in the South rose by around 4,000
percent during the Civil War. By contrast, prices in the North rose by
just 60 percent. Even before the surrender of the principal
Confederate armies in April 1865, the economy of the South was
collapsing, with hyperinflation as the sure harbinger of defeat."
Niall Ferguson, The Ascent of Money, Penguin, Copyright 2008 by Niall
Ferguson, pp. 92-97.
-------------------------------------------------------------------------------------------------
In today's excerpt--the powers of the United States president:
"According to James Madison's Notes of Debates in the Federal
Convention of 1787, the [powers of the president] received
surprisingly little attention at the Constitutional Convention in
Philadelphia. ...
"In the end, the Framers were artfully vague about the extent and
limits of the president's powers. Article I, Section 8 of the
Constitution, which empowers Congress, runs 429 words; Article II,
Section 2, the presidential equivalent, is about half as long. The
powers assigned to the president alone are few: he can require Cabinet
members to give him their opinions in writing; he can convene a
special session of Congress 'on extraordinary occasions,' and may set
a date for adjournment if the two houses cannot agree on one; he
receives ambassadors and is commander in chief of the armed forces; he
has a veto on legislation (which Congress can override); and he has
the power to pardon.
"The president also shares two powers with the Senate--to make
treaties, and to appoint federal judges and other 'officers of the
United States,' including Cabinet members. And, finally, the president
has two specific duties--to give regular reports on the state of the
union, and to 'take care that the laws be faithfully executed.'

"All in all, the text of Article II, while somewhat ambiguous--a flaw
that would be quickly exploited--provided little warning that the
office of president would become uniquely powerful. Even at the
convention, Madison mused that it 'would rarely if ever happen that
the executive constituted as ours is proposed to be would have
firmness enough to resist the legislature.' In fact, when citizens
considered the draft Constitution during the ratification debates in
1787 and 1788, many of their concerns centered on the possibility that
the Senate would make the president its cat's-paw. Few people foresaw
the modern presidency, largely because the office as we know it today
bears so little relation to that prescribed by the Constitution. ...
"[In contrast], under the pen name 'Pacificus,' Alexander Hamilton
wrote a defense of [a president's] power to act without congressional
sanction. The first Pacificus essay is the mother document of the
'unitary executive' theory that Bush's apologists have pushed to its
limits since 2001. Hamilton seized on the first words of Article II:
'The executive power shall be vested in a President of the United
States of America.' He contrasted this wording with Article I, which
governs Congress and which begins, 'All legislative powers herein
granted shall be vested in a Congress of the United States.' What this
meant, Hamilton argued, was that Article II was 'a general grant
of ... power' to the president. Although Congress was limited to its
enumerated powers, the executive could do literally anything that the
Constitution did not expressly forbid. Hamilton's president existed,
in effect, outside the Constitution.
"That's the Bush conception, too. In 2005, John Yoo, the author of
most of the administration's controversial 'torture memos,' drew on
Hamilton's essay when he wrote, 'The Constitution provides a general
grant of executive power to the president.' Since Article I vests in
Congress 'only those legislative powers 'herein granted,' ' Yoo
argued, the more broadly stated Article II must grant the president
'an unenumerated executive authority.' "
Garrett Epps, "The Founder's Great Mistake," The Atlantic, January/
February 2009, pp. 68-71.
-------------------------------------------------------------------------------------------------
In today's encore excerpt--70% of the world's population resides on
just 7% of the world's land:
"Today, there are just over 6 billion people on earth. Six hundred
years ago, in 1400, humankind was just 6 percent of that, or about 350
million, slightly more than the current population of the United
States. ... The 350 million people living in 1400 were not uniformly
distributed across the face of the earth, but rather clustered in a
very few pockets of much higher density. Indeed, of the 60 million
square miles of dry land on earth, most people lived on just 4.25
million square miles, or barely 7 percent of the dry land. The reason,
of course, is that that land was the most suitable for agriculture,
the rest being covered by swamp, steppe, desert, or ice.
"Moreover, those densely populated regions of earth corresponded to
just fifteen highly developed civilizations, the most notable being
(from east to west) Japan, Korea, China, Indonesia, Indochina,
Islamic West Asia, Europe, Aztec, and Inca.
Astoundingly, nearly all of the 350 million people alive in 1400 lived
in a handful of civilizations occupying a very small proportion of the
earth's surface. Even more astoundingly, that still holds true today:
70 percent of the world's six billion people live on those same 4.25
million square miles."
Robert B. Marks, The Origins of the Modern World, Rowman and
Littlefield, Copyright 2007 by Rowman and Littlefield Publishers, pp.
23-24.
-------------------------------------------------------------------------------------------------
In today's encore excerpt--the language of the French nation. With the
arrival of the 18th and 19th centuries came the rise of the "nation"
in a sense not previously seen in Europe. This nationalism--which was
to be achieved through a common language and a common sense of
identity and purpose--was needed in part as a replacement for the
divine right of kings and the unifying influence that it had long
provided. As Graham Robb's book The Discovery of France shows, this
new nationalism came more readily in large cities than in remote
villages in the period following the Revolution:
"Again and again, Robb shows how the centralizing ambitions of the
metropolis were thwarted by peoples who barely considered themselves
to be 'French' and did not even speak the language.
" 'O Oc Si Bai Ya Win Oui Oyi Awe Jo Ja Oua' is the title of one of
Robb's chapters--just a few of the many words for 'Yes' in the
micro-dialects of France. In 1794, the Abbé Grégoire sent out a
questionnaire to town halls asking how patois--which the Encyclopédie
defined as 'corrupt language as spoken in almost all the provinces'--
could be destroyed. His survey revealed that France contained a mere 3
million pure French-speakers, 11 percent of the population. More than
6 million were in total ignorance of the French language. The abbé
found this alarming. 'In liberty, we are the advance-guard of nations.
In language, we are still at the Tower of Babel.' He followed this up
with a report, 'The Necessity and Means of Exterminating Patois and
Universalizing the Use of the French Language'. To the abbé, the Babel
of patois was dangerous because it undermined patriotism. How could
there possibly be a nation without a common language?

"Reading Robb, one is left suddenly uncertain as to whether France


ever really was a complete nation, at least until the early twentieth
century. Even in 1863, a quarter of army recruits spoke only patois.
As late as 1880, only a fifth of the population was entirely at ease
in the French language. And this linguistic alienation went hand in
hand with a hostility to the idea of France itself. The abbé was right
to have been worried. In Gascony and Provence, they spoke
contemptuously of the 'Franchiman' and the 'Franciot', by which they
meant the people from the north [of France]. Elsewhere, Robb depicts
fierce local communities in which there was violent prejudice both
against visitors and against neighboring settlements. In the early
1740s, a cartographer taking part in the Cassini mission to make for
the first time a reliable map of France was hacked to death in a tiny
village in the Massif Central called Les Estables. A savage and
irrational act? Not according to Robb, who argues that these people
'were defending themselves against an act of war'. To be mapped out
was eventually, over time, to be phased out of existence."
Bee Wilson, "The Truth is Out There," a review of Graham Robb's new
book The Discovery of France in The Times Literary Supplement, January
4, 2008, p. 13.
-------------------------------------------------------------------------------------------------
In today's excerpt--the concept of good taste. As a larger merchant
class and middle class began to emerge in England and Western Europe,
money alone was no longer a sufficient way for the wealthy to
distinguish themselves from the lesser classes. So the concept of
"taste" emerged as a means for the "elite" to assert superiority over
those whose wealth was beginning to ascend:
"'Taste' is a term which first acquired prominence in England in the
later 17th century. As goods multiplied, it became ... an important
form of cultural differentiation. As a contemporary noted in 1633,
'great folks' always had a tendency to 'think nothing of that which is
common and ordinary people may easily come by.' Taste involved
transcending mere financial criteria when assessing the value of
goods, introducing instead a subtler and more elusive yardstick.
"It implied a capacity for discrimination of the kind shown in 1606 by
the wine connoisseur Captain Dawtrey, who, 'taking the glass in his
hand, held it up awhile betwixt him and the window, as to consider the
colour; and then putting it to his nose he seemed to take comfort in
the odour of the same.' It required the ability to choose the best out
of a wide range of functionally indistinguishable options, like the 50
different patterns of wallpaper that on one occasion in 1752
confronted the poet William Shenstone. The essayist Joseph Addison
compared a person who had true taste in literary matters with the man
who could identify each of ten different kinds of tea or any
combination of them. ...
"Taste was notoriously a quality which the vulgar lacked, for they
were without the necessary education and experience, whereas
connoisseurs were cultivated, well travelled and 'conversant with the
better sort of people.' 'Those who depend for food on bodily labour,'
ruled the critic Lord Kames in 1762, 'are totally devoid of taste.'
The middle-class inhabitants of the London suburbs were scorned by
their social superiors for their bad taste, manifested in the
embarrassingly derivative style of their houses and gardens. Taste was
the prerogative of the 'polite.' It was a faculty which required
education, foreign travel and close conformity to the standard set by
an elite minority. In Samuel Johnson's words, 'a few, a very few,
commonly constitute the taste of the time' (1754). ...
"The competition thus shifted away from the conspicuous display of
opulence to a more restrained demonstration of elegance, refinement
and fastidious discrimination. ... The ownership of culturally
esteemed objects became a symbol of status; and the claim to superior
sensibilities, defined as the capacity to feel pain at what causes no
pain to others, emerged, in Jeremy Bentham's words, as 'a mark of ...
belonging to the ruling few.' The purchasing power of the middling and
lower classes might rise, but the elite could hold on to its monopoly
of cultural capital by asserting that wealth was not enough."
Keith Thomas, "To Buy or Not to Buy," History Today, Volume 59, Issue
2, pp. 12-13.
-------------------------------------------------------------------------------------------------
In today's excerpt--beef. In nature, cows graze and eat prairie grass.
In the beef industry, cows are taken to CAFOs--Concentrated Animal
Feeding Operations--where they live in stalls and are fed corn. It is
this beef that ends up on our dinner tables:
"So then why [aren't steers fed grass]? Speed, in a word, or, in the
industry's preferred term, 'efficiency.' Cows raised on grass simply
take longer to reach slaughter weight than cows raised on a richer
diet, and for half a century now the industry has devoted itself to
shortening a beef animal's allotted span on earth. 'In my
grandfather's time, cows were four or five years old at slaughter,' [a
CAFO operator] explained. 'In the fifties, when my father was
ranching, it was two or three years old. Now we get there at fourteen
to sixteen months.' Fast food, indeed. What gets a steer from 80 to
1,100 pounds in fourteen months is tremendous quantities of corn,
protein and fat supplements, and an arsenal of new drugs. ...
"[At the CAFO's] thundering hub, three meals a day for thirty-seven

thousand animals are designed and mixed by computer. A million pounds


of feed pass through the mill each day. Every hour of every day a
tractor trailer pulls up to the loading dock to deliver another fifty
tons of corn. ... [to which are added] thousands of gallons of
liquefied fat and protein supplements, vats of liquid vitamins and
synthetic estrogen, and ... fifty-pound sacks of antibiotics--Rumensin
and Tylosin. Along with alfalfa hay and silage (for roughage), all
these ingredients will be automatically blended and then piped into
the parade of dump trucks that three times a day fan out from here to
keep the [CAFO's] eight and a half miles of trough filled. ...
"We've come to think of 'corn-fed' as some kind of old-fashioned
virtue, which it may well be when you're referring to Midwestern
children, but feeding large quantities of corn to cows for the greater
part of their lives is a practice neither particularly old nor
virtuous....
"Cattle rarely live on feedlot diets for more than 150 days, which
might be about as much as their systems can tolerate. 'I don't know
how long you could feed them this ration before you'd see
problems,' [Veterinarian] Dr. Mel Metzin said; another vet told me the
diet would eventually 'blow out their livers' and kill them. Over time
the acids eat away at the rumen wall, allowing bacteria to enter the
animal's bloodstream. These microbes wind up in the liver, where they
form abscesses and impair the liver's function. Between 15 percent and
30 percent of feedlot cows are found at slaughter to have abscessed
livers. ... What keeps a feedlot animal healthy--or healthy enough--are antibiotics."
Michael Pollan, The Omnivore's Dilemma, Penguin, Copyright 2006 by
Michael Pollan, pp. 71-78.
-------------------------------------------------------------------------------------------------
In today's encore excerpt--the mad scramble for Africa, during which
western European powers took over essentially the entire continent in
a thirty year period:
"For centuries, Europeans found penetration of Africa to be almost
impossible: various diseases endemic to the tropical parts of the
continent, especially malaria, restricted slave-trading Europeans to
coastal enclaves free from the disease. By the nineteenth century,
steamships may have permitted access to the interior on Africa's
various rivers, but malaria still killed most of the explorers.
Although the cause of malaria was not discovered until 1880 and the
means of transmission by mosquito not uncovered until 1897, a process
of trial and error led to the realization by mid-nineteenth century
that the bark of the cinchona tree native to South America contained
quinine, a substance that prevented malaria. British military
personnel then successfully planted cinchona seeds in India and by the
1870s had greatly increased the supply of quinine to their troops.
"The subsequent 'scramble for Africa' may have been initiated in the
1870s by French insecurities from their defeat by the Germans in 1871,
by the bizarre ... scheming of Belgium's King Leopold II, and by
British determination to protect their colonial interests in India,
but all of those motivations would have been irrelevant had it not
been for ... quinine, ... for steamboats, or for new technology in
weapons that killed more efficiently. ...
"The American Civil War and a European arms race in the 1860s and
1870s revolutionized guns. ... The pinnacle of perfection came in the
1880s with the invention of a reliable machine gun, named after its
inventor Hiram Maxim. ... Africans put up a valiant and stiff
resistance [to Europeans], but their technology was no match for the
Maxim gun. The most famous and perhaps deadly instance was at the 1898
Battle of Omdurman, where British troops confronted the 40,000-man
Sudanese Dervish army. As described by Winston Churchill, ... 'The
charging Dervishes sank down in tangled heaps.' ... After five hours,
the British had lost 20 soldiers; 10,000 Sudanese were killed. As a
saying had it:
"Whatever happens we have got
The Maxim gun and they have not.
"With such a technological advantage, by 1900 most of Africa had been
divided up among a handful of European powers, in particular Britain,
France, Germany, Belgium, with Portugal hanging onto its seventeenth-century colonial possession in Angola. Only Ethiopia, under the
extraordinary leadership of King Menelik, defeated the weakest
European power, Italy, and thereby maintained its independence."
Robert B. Marks, The Origins of the Modern World, Rowman &
Littlefield, Copyright 2006 by Rowman & Littlefield, pp. 142-144.
-------------------------------------------------------------------------------------------------
In today's excerpt--Socrates. The popular image of Socrates as a man of
immense moral integrity was largely the creation of his pupil Plato.
If we examine evidence of his trial, argues Robin Waterfield, a
different picture emerges, of a cunning politician opposed to Athenian
democracy:
"We are blessed by having more extant words written about Socrates
than any other man of his time, and cursed by the fact that we cannot
tell which, if any, of these words are true. We can be certain that
Plato and Xenophon were not committed to factual reporting. ...
Socrates himself wrote nothing, and the work of his immediate
followers, after his death, is not historically reliable.
"Generations of classical scholars ... [have] chosen to privilege
Plato's portrait over those of Xenophon or anyone else. And so the
Socrates who is likely to be familiar is Plato's Socrates: the
merciless interrogator, committed to nothing but the truth, and
determined, by means of incisive argument, to lay his own and our
moral lives on a foundation of knowledge rather than opinion; a
specialist in moral philosophy and moral psychology; a man of immense
moral integrity, who was unjustly put to death (by drinking a cup of
poison hemlock), aged sixty-nine or seventy, by the classical Athenian
democracy under which he lived.
"But the uncomfortable truth is that little or nothing of this picture
of Socrates may be accurate. Plato's description of Socrates'
philosophy was actually a clever way of outlining and introducing
Plato's own philosophy.
"In the course of his speech 'Against Timarchus', the politician
Aeschines referred to Socrates' trial, saying that the Athenian people
condemned him for having been the teacher of Critias. Aeschines was
speaking in 345 BC, fifty-four years after Socrates' trial. ...
Critias was one of the leaders of the Thirty Tyrants, the oligarchic
junta which Sparta imposed on the Athenians in 404 BC after defeating
them in the Peloponnesian War. The Thirty aspired to turn Athens into
a hierarchical, Spartan-style society. They restricted the number of
citizens to 3,000, disarmed everyone else, awarded themselves the
power of life or death over all non-citizens, and expelled all
non-citizens from living within the city itself. Non-citizens were to be
the farmers, manufacturers and merchants for the elite 3,000, while
all political power was effectively vested in the Thirty and their
henchmen. In order to see through their radical program of social
reform, and in order to raise much needed cash (the city had been
bankrupted by the war), they murdered about 1,500 people in a few
weeks. Many more fled into exile.
"The Thirty were soon defeated. Critias was killed and the rest fled
or were allowed to leave. But Critias had long been a friend and
student of Socrates, who had become tainted by the association.
[Socrates] tolerated even the excesses of the Thirty because he was,
at least to a degree, sympathetic to their aims ... of favouring a
Spartan-style society over Athenian democracy.
"There can be no doubt, then, that Socrates' trial was politically
motivated, and there can be no doubt that, from the point of view of
the Athenian democracy, he was guilty as charged. He was no true
citizen of the democracy. There can be no doubt, either, that
attention to the historical facts surrounding the case must lead us to
qualify the Platonic-Xenophontic portrait of Socrates. He was put on
trial as a political undesirable, and his radical political vision was
indeed anti-democratic. This is not the Socrates with whom we are
comfortably familiar, but it is more likely to be closer to the truth
than the fictions that permeate the literary evidence."
Robin Waterfield, "The Historical Socrates," History Today, January
2009, pp. 26-29.
-------------------------------------------------------------------------------------------------In today's excerpt--the brain can grow new neurons, but these disappear
unless cognitively challenged:
"Fresh neurons arise in the brain every day. ... Recent work, albeit
mostly in rats, indicates that learning enhances the survival of new
neurons in the adult brain, and the more engaging and challenging the
problem, the greater the number of neurons that stick around. These
neurons are then presumably available to aid in situations that tax
the mind. It seems, then, that a mental workout can buff up the brain,
much as physical exercise builds up the body. ...
"In the 1990s scientists rocked the field of neurobiology with the
startling news that the mature mammalian brain is capable of sprouting
new neurons. Biologists had long believed that this talent for
neurogenesis was reserved for young, developing minds and was lost
with age. But in the early part of the decade Elizabeth Gould, then at
the Rockefeller University, demonstrated that new cells arise in the
adult brain--particularly in a region called the hippocampus, which is
involved in learning and memory. ...
"Studies indicate that in rats, between 5,000 and 10,000 new neurons
arise in the hippocampus every day. (Although the human hippocampus
also welcomes new neurons, we do not know how many.) The cells are not
generated like clockwork, however. Instead their production can be
influenced by a number of different environmental factors. For
example, alcohol consumption has been shown to retard the generation
of new brain cells. And their birth rate can be enhanced by exercise.
Rats and mice that log time on a running wheel can kick out twice as
many new cells as mice that lead a more sedentary life. ...
"Exercise and other actions may help produce extra brain cells. But
those new recruits do not necessarily stick around. Many if not most
of them disappear within just a few weeks of arising. Of course, most
cells in the body do not survive indefinitely. So the fact that these
cells die is, in itself, not shocking. But their quick demise is a bit
of a puzzler. Why would the brain go through the trouble of producing
new cells only to have them disappear rapidly?
"From our work in rats, the answer seems to be: they are made 'just in
case.' If the animals are cognitively challenged, the cells will
linger. If not, they will fade away."
Tracey J. Shors, "Saving New Brain Cells," Scientific American, March
2009, pp. 47-48.
-------------------------------------------------------------------------------------------------In today's encore excerpt--Robert Wright argues that it is new
technologies--especially information technologies such as alphabets,
money, and contracts for investing--that lead to decentralized
economic opportunity, which then leads to decentralized power, which
in turn leads to democracy:
"The Greeks ... helped test a thesis that as a society's technologies
become broadly accessible, the result can be not just economic
vibrance but political freedom. The Greeks added vowels to the
phonetic alphabet, carrying it to its height of accessibility. They
grasped the virtues of coins and started minting their own. ... On
balance, the results of the test were encouraging. Classical Athens,
in its better moments, was economically vibrant, broadly literate (by
ancient standards), and democratic (ditto).
"Actually, the general notion that economic decentralization disperses
political power had gotten some support from earlier phases in
cultural evolution as well. The (relatively) market-oriented Aztecs
had their unusually egalitarian legal code. And in (relatively)
market-oriented Mesopotamia justice was sometimes administered by
citizens' assemblies. In early northern Mesopotamia, a profusion of
clay contracts speaks of a robust private sector ... [aided by a]
simplified, less esoteric cuneiform script used in contracts, ... and
we find evidence of something like democracy. The documents from
community assemblies show them not merely meting out justice, but
assuming a deliberative, quasi-legislative function. ... If economic
freedom ... brings the wealth that makes states powerful, and if
economic freedom tends to entail political freedom, then history might
turn out to be on the side of political freedom. After all, powerful
states have a tendency to prevail over weaker ones. ...
"[Much later] as some historians have noted, the Magna Carta, though
rightly regarded as a milestone in democracy's history, looms too
large in the popular mind. Issued by King John in 1215 under pressure
from his barons, it guaranteed them the right to be consulted about
taxation. But this was not so new. The English monarchy had consulted
councils of nobles on major decisions for centuries. The bigger
development was the expanding realm of representation over the
following century, as the king's council--the parliament--came to
include burgesses, representatives of the towns. As markets
distributed economic power more broadly across British society,
political power followed. That's how the world works."
Robert Wright, Nonzero, Vintage, Copyright 2000 by Robert Wright, pp.
120-121, 153.
-------------------------------------------------------------------------------------------------In today's excerpt--the last great economic calamity in U.S. history:
the recession of 1981-82, a very different crisis than the one we now
face. In the late 70s, inflation in excess of 10% was ravaging the
economy, the result of loose monetary policy, oil shocks, and
Vietnam-era spending. Some surveys of that time show that an astounding 70
percent of Americans cited it as the major problem. By the
mid-80s, inflation had returned to low levels after Fed Chairman Paul
Volcker, a Jimmy Carter appointee, had attacked it with high interest
rates, and then-president Ronald Reagan had allowed Volcker's
approach to proceed. But the cost was high:
"Americans detested inflation. We seemed to have lost control, both as
individuals and as a society, over our fate. ... Among government
officials, there was a widespread fatalism about continued inflation.
President Carter often seemed forlorn at the prospect. Early in 1980,
he was asked at a press conference what he planned to do about the
problem. He replied, 'It would be misleading for me to tell any of you
that there is a solution to it.' His resignation was common. ...
Later, Carter himself judged that inflation had been the decisive
issue [in his election loss], more important than his mishandling of
the Iranian hostage crisis. ...
"Inflation was rationalized as a reflection of the deeper ills of
American society. It was not a cause of our problems; it was a
consequence of our condition. Specifically, it was said to show that
the nation was becoming ungovernable. Americans had more wants than
could be met. ...
"Volcker [took the view that inflation was simply a monetary
phenomenon-the government had printed too much money, and] took a
sledgehammer to inflationary expectations. He raised interest rates,
tightened credit, and triggered the most punishing economic slump
since the 1930s. In December 1980, banks' 'prime rate' (the loan rate
for the worthiest business borrowers) hit a record 21.5 percent.
Mortgage and bond rates rose in concert. By the summer of 1981,
consumers had trouble borrowing for homes and cars. Many companies
couldn't borrow for new investment. Industrial production dropped 12
percent from mid-1981 until late 1982. In many industries, declines
were steeper. In autos, it was 34 percent (from June 1981 to January
1982), and in steel it was 56 percent (from August 1981 to December
1982). By 1982 the number of business failures had tripled from 1979.
Construction starts of new homes in 1982 were 40 percent below the
1979 level. Worse, unemployment exploded. By late 1982, it was 10.8
percent, which remains a post-World War II record. ...
"There was an outpouring of bills and resolutions to impeach Volcker,
roll back interest rates, or require the appointment of new Fed
governors sympathetic to farmers, workers, consumers, and small
businesses. Rep. Jack Kemp (R-N.Y.), a prominent Republican
'supply-sider,' wanted Volcker to resign. In August 1982, Sen. Robert C. Byrd
of West Virginia, the Democratic floor leader, introduced the Balanced
Monetary Policy Act of 1982, which would have forced the Fed to reduce
interest rates.
"Reagan's popularity ratings collapsed. In May 1981, early in his
presidency, Reagan's approval had reached a high of 68 percent. By
April 1982, it was 45 percent (46 percent disapproved); by January
1983, it was 35 percent, the low point (56 percent disapproved). ...
"Even now, the social costs of controlling inflation seem horrendous.
Over a four-year period (1979-82), the U.S. economy's output barely
increased. Since 1950, there had been nothing like that. Unemployment
peaked in 1982 near 11 percent--a figure that, a few years earlier,
would have been widely judged inconceivable. ... The number of
business failures in 1982 (24,908) was nearly 50 percent higher than
in any other year since World War II, and it would double to 52,078 by
1984. From 1979 to 1983, farm income declined almost 50 percent."
Robert J. Samuelson, "Lessons from the Great Inflation," Reason,
January 2009, pp. 51-54.
-------------------------------------------------------------------------------------------------In today's excerpt--by 1800 AD, the economic plight of the average man,
according to economic historian Gregory Clark in A Farewell to Alms: A
Brief Economic History of the World, was no better than that of his
hunting and gathering ancestors. After 1800 the economic well-being of
men in some societies skyrocketed upward, and the well-being of others
plummeted:
"Before 1800 income per person-the food, clothing, heat, light, and
housing available per head-varied across societies and epochs. But
there was no upward trend. A simple but powerful mechanism ... ensured
that short-term gains in income through technological advances were
inevitably lost through population growth.
"Thus the average person in the world of 1800 was no better off than
the average person of 100,000 BC. Indeed in 1800 the bulk of the
world's population was poorer than their remote ancestors. The lucky
denizens of wealthy societies such as eighteenth-century England or
the Netherlands managed a material lifestyle equivalent to that of the
Stone Age. But the vast swath of humanity in East and South Asia,
particularly in China and Japan, eked out a living under conditions
probably significantly poorer than those of cavemen.
"The quality of life also failed to improve on any other observable
dimension. Life expectancy was no higher in 1800 than for
hunter-gatherers: thirty to thirty-five years. Stature, a measure both of the
quality of diet and of children's exposure to disease, was higher in
the Stone Age than in 1800. And while foragers satisfy their material
wants with small amounts of work, the modest comforts of the English
in 1800 were purchased only through a life of unrelenting drudgery.
Nor did the variety of material consumption improve. The average
forager had a diet, and a work life, much more varied than the typical
English worker of 1800, even though the English table by then included
such exotics as tea, pepper, and sugar. Incomes rose sharply in many
countries after 1800 but declined in others."
-------------------------------------------------------------------------------------------------In today's excerpt--supersizing and the "thrifty gene":
"That distinction [of inventing supersizing] belongs to a man named
David Wallerstein. Until his death in 1993, Wallerstein served on the
board of directors at McDonald's, but in the fifties and sixties he
worked for a chain of movie theaters in Texas, where he labored to
expand sales of soda and popcorn--the high-markup items that theaters
depend on for their profitability. As the story is told in John Love's
official history of McDonald's, Wallerstein tried everything he could
think of to goose up sales--two-for-one deals, matinee specials--but
found he simply could not induce customers to buy more than one soda
and one bag of popcorn. He thought he knew why: Going for seconds
makes people feel piggish.
"Wallerstein discovered that people would spring for more popcorn and
soda--a lot more--as long as it came in a single gigantic serving.
Thus was born the two-quart bucket of popcorn, the sixty-four-ounce
Big Gulp, and, in time, the Big Mac and the jumbo fries, though Ray
Kroc himself took some convincing. In 1968, Wallerstein went to work
for McDonald's, but try as he might, he couldn't convince Kroc, the
company's founder, of supersizing's magic powers.
" 'If people want more fries,' Kroc told him, 'they can buy two bags.'
Wallerstein patiently explained that McDonald's customers did want
more but were reluctant to buy a second bag. 'They don't want to look
like gluttons.'
"Kroc remained skeptical, so Wallerstein went looking for proof. He
began staking out McDonald's outlets in and around Chicago, observing
how people ate. He saw customers noisily draining their sodas, and
digging infinitesimal bits of salt and burnt spud out of their little
bags of French fries. After Wallerstein presented his findings, Kroc
relented and approved supersized portions, and the dramatic spike in
sales confirmed the marketer's hunch. Deep cultural taboos against
gluttony--one of the seven deadly sins, after all--had been holding us
back. Wallerstein's dubious achievement was to devise the dietary
equivalent of a papal dispensation: Supersize it! He had discovered
the secret to expanding the (supposedly) fixed human stomach.
"One might think that people would stop eating and drinking these
gargantuan portions as soon as they felt full, but it turns out hunger
doesn't work that way. Researchers have found that people (and
animals) presented with large portions will eat up to 30 percent more
than they would otherwise. Human appetite, it turns out, is
surprisingly elastic, which makes excellent evolutionary sense: It
behooved our hunter-gatherer ancestors to feast whenever the
opportunity presented itself, allowing them to build up reserves of
fat against future famine. Obesity researchers call this trait the
'thrifty gene.' And while the gene represents a useful adaptation in
an environment of food scarcity and unpredictability, it's a disaster
in an environment of fast-food abundance, when the opportunity to
feast presents itself 24/7. Our bodies are storing reserves of fat
against a famine that never comes."
Michael Pollan, The Omnivore's Dilemma, Penguin, Copyright 2006 by
Michael Pollan, pp. 105-106.
-------------------------------------------------------------------------------------------------In today's encore excerpt--the banana. Originally from the fertile
coastal soils of Asia and India, and named after the Arabic word for
finger, bananas are the world's largest fruit crop and the
fourth-largest product grown overall, after wheat, rice, and corn. Though a
food staple for poor people in many parts of the world, bananas face a
disease epidemic with no known cure that could lead to their
extinction:
"Americans eat more bananas per year than apples and oranges combined;
and in many other parts of the world, bananas--more than rice, more
than potatoes--are what keep hundreds of millions of people alive. ...
A banana tree isn't a tree at all; it's the world's largest herb. The
fruit itself is actually a giant berry. Most of us eat just a single
kind of banana, a variety called the Cavendish, but over one thousand
types are found worldwide, including dozens of wild varieties, many no
bigger than your pinky and filled with tooth-shattering seeds. ...
"There is no country on earth that loves bananas more than India.
There are more varieties of the fruit there than anywhere else. If you
visit, I recommend that you search for the lovely Thella Chakkarakeli,
a candy-sweet fruit that is moist enough to almost be considered
juicy. ... India grows 20 percent of the world's bananas--about 17
million tons--each year. That's three times more fruit than the
world's number two banana-producing nation--Ecuador. ... More than 670
types of bananas, cultivated and wild, grow in the country. Thirty-two
forest bananas are so rare that only a single plant or two have been
discovered. ...
"The Cavendish is not the fruit your grandparents enjoyed. That banana
was the Gros Michel--by all accounts a more spectacular banana than
our Cavendish. It was larger, with a thicker skin, a creamier texture,
and a more intense, fruity taste. ... But the Gros Michel disappeared.
A disease began to ravage banana crops ... [and] by 1960, fifty years
after the malady was first discovered, the Gros Michel was effectively
extinct. The banana industry was in crisis, itself threatened with
disappearance. It was only at the last minute that a new banana was
adopted--the Cavendish, which was immune to the disease. ...
"Today, [a new] blight is tearing through banana crops worldwide. It
has spread to Pakistan, the Philippines, and Indonesia. It is on the
rise in Africa. While it has yet to arrive in our hemisphere, in
dozens of interviews I have conducted since 2004, I couldn't find a
single person studying the fruit who seriously believes it won't. For
the past five years, banana scientists have been trying--in a race
against time--to modify the fruit to make it resistant to [this new]
disease."
Dan Koeppel, Banana, Hudson Street Press, Copyright 2008 by Dan
Koeppel, pp. xii-xviii, 30-31.
-------------------------------------------------------------------------------------------------In today's excerpt--Thomas Jefferson, though famed and revered for his
"small-government" philosophy, was in his actions one of early
America's most active practitioners of large-scale government
intervention and activism, buying the Louisiana territory without
constitutional authority, selling federal lands at below cost to
ensure widespread ownership, and developing a $20 million plan for
building roads and canals:
"Jefferson was explicitly averse to expensive central government,
federal indebtedness, and a central bank. He was an admirer of Adam
Smith's 'invisible hand,' and probably read The Wealth of Nations
years after it was published in England in 1776. ... A good
government, Jefferson summarized in a letter during Washington's term,
must be 'a wise and frugal government, which shall restrain men from
injuring one another, which shall leave them otherwise free to
regulate their own pursuits of industry and improvement, and shall not
take from the mouth of labor the bread it has earned.'
"Thus, the philosophy was set in principle, but Jefferson violated it
in practice. The pragmatic basis of American prosperity and freedom
resided in one fact, about which Jefferson did not delude himself.
Land was widely and inexpensively available. [Thus] Jefferson bought
the Louisiana territory from Napoleon for $15 million, which he agreed
to borrow. He willingly defied the Constitution's limits on his
authority to do so without congressional approval. ...
"But there was another critical choice Jefferson made. The broad
distribution of land he thought ideal could be accomplished only
through government control and regulation. The federal and state
governments owned almost all the unclaimed land at the start of the
nation, and Jefferson was among the early political leaders who were
determined to be sure the land was sold at affordable prices and was
widely owned. ... This was a powerful use of government, even if land
availability compared to the size of the population made low prices
easier to implement. ... As the historian Frank Bourgin points out,
the federal government was by far the largest owner of land in the
nation, holding some 1.6 million acres at one point. It could have
been a significant source of federal revenues. ...
"For all his disdain for government, Jefferson always sought to set
aside federal land for schools. He was initially ideologically
hesitant to use federal moneys for new roads, but by his second term
he had changed his mind regarding the federal financing of roads. ...
"Jefferson had already approved the building of the Cumberland Road to
connect the Potomac and the Ohio Rivers in 1806, which would become
the largest public works project undertaken until the Erie Canal. He
asked Albert Gallatin to prepare a comprehensive program of roads and
canals to be implemented once the national debt was nearly paid off.
Gallatin drew up an ambitious ten-year plan for the development of
transportation that would cost a stunning $20 million, to be financed
with bonds and paid off over time through tariffs and land sales.
Jefferson believed a constitutional amendment was needed to authorize
the spending. Perhaps he would have gotten it, but the embargo he
imposed in 1808 on trade with Britain ended all such ambitious plans.
'The planning that took place in Jefferson's second term of office
remains to this day so little known,' writes Frank Bourgin, 'that the
student of American history must marvel at this fact.' "
Jeff Madrick, The Case for Big Government, Princeton, Copyright 2009
by Princeton University Press, Kindle Loc. 353-400.
-------------------------------------------------------------------------------------------------In today's excerpt--grammar myths:
"Take the idea that it is wrong to say If a student comes before I get

there, they can slip their test under my office door, becausestudent
is singular and they 'is plural.' Linguists traditionally observe that
esteemed writers have been using they as a gender-neutral pronoun for
almost a thousand years. As far back as the 1400s, in the Sir Amadace
story, one finds the likes of Iche mon in thayre degree ("Each man in
their degree").
"Maybe when the sentence is as far back as Middle English, there is a
sense that it is a different language on some level than what we
speak--the archaic spelling alone cannot help but look vaguely
maladroit. But Shakespeare is not assumed to have been in his cups
when he wrote in The Comedy of Errors, 'There's not a man I meet but
doth salute me / As I were their well-acquainted friend' (Act IV,
Scene iii). Later, Thackeray in Vanity Fair tosses off 'A person can't
help their birth.' ...
"Or there's the objection to nouns being used as verbs. These days,
impact comes in for especial condemnation: The new rules are impacting
the efficiency of the procedure. People lustily express that they do
not 'like' this, endlessly writing in to language usage columnists
about it. Or one does not 'like' the use of structure as in I
structured the test to be as brief as possible.
"Well, okay--but that means you also don't 'like' the use of view,
silence, worship, copy, outlaw, and countless other words that started
as nouns and are now also verbs. Nor do many people shudder at the use
of fax as a verb. ...
"Over the years, I have gotten the feeling that there isn't much
linguists can do to cut through this. ... There are always books out
that try to put the linguists' point across. Back in 1950, Robert Hall's
Leave Your Language Alone! was all over the place, including a late
edition kicking around in the house I grew up in. Steven Pinker's The
Language Instinct, which includes a dazzling chapter on the grammar
myths, has been one of the most popular books on language ever
written. As I write, the flabbergastingly fecund David Crystal has
just published another book in the tradition, The Fight for English:
How Language Pundits Ate, Shot, and Left. But the air of frustration
in Crystal's title points up how persistent the myths are. ...
"English is shot through with things that don't really follow. I'm the
only one, amn't I? Shouldn't it be amn't after all? Aren't, note, is
'wrong' since are is used with you, we, and they, not I. There's no 'I
are.' Aren't I? is thoroughly illogical--and yet if you decided to
start saying amn't all the time, you would lose most of your friends
and never get promotions. Except, actually, in parts of Scotland and
Ireland where people actually do say amn't--in which case the rest of
us think of them as 'quaint' rather than correct!"
John McWhorter, Our Magnificent Bastard Tongue, Gotham, Copyright 2008
by John McWhorter, pp. 65-69, 80.
-------------------------------------------------------------------------------------------------In today's excerpt--the Bible. David Plotz, a Jewish journalist,
decides to read the Old Testament as an honest exercise in discovering
the roots of his heritage. In his short new book, Good Book--The
Bizarre, Hilarious, Disturbing, Marvelous, and Inspiring Things I
Learned When I Read Every Single Word of the Bible--he reports
chapter-by-chapter on the entire Old Testament, and the following
excerpt regarding Moses and the Passover captures the style and tone
of his reporting:
"Back to Egypt: Aaron and Moses pay a visit to Pharaoh, and at first
request merely that he allow the Israelites a few days off for a camp
meeting in the wilderness. When the negotiations falter, Moses and
Aaron increase their demands, eventually insisting that Pharaoh
liberate the Israelites. As Pharaoh resists, Moses begins inflicting
plagues on the Egyptians.
"Curiously, the most compelling characters in the drama of the plagues
are not Moses, Aaron, or Pharaoh, but Pharaoh's anonymous sorcerers. I
am fascinated by these guys. We are introduced to them when Moses and
Aaron first visit Pharaoh. To impress Pharaoh, Aaron throws down his
rod and it turns into a snake. The cocky sorcerers toss down their
rods, which turn into snakes, too. But then Aaron's snake gobbles
theirs up. God 1, Sorcerers 0.
"The sorcerers, of course, don't learn their lesson. Aaron and Moses
begin delivering plagues, and the sorcerers keep thinking they can
trump God. When Moses and Aaron turn the Nile to blood, the sorcerers
do 'the same with their spells.' Aaron and Moses cover Egypt with
frogs. The sorcerers do 'the same with their spells.' Moses and Aaron
bring lice. But the sorcerers, their powers waning, can't conjure up
lice. (That's really lame. Even I could conjure up lice: I would just
drop by my daughter's first-grade classroom and rub a few heads.) A
couple of plagues later, the sorcerers' defeat is total. Moses
afflicts the Egyptians with boils. The sorcerers, summoned to work
their counter-magic, don't even show up: they can't, because they're
covered with boils. The increasing feebleness of their dark arts makes
for great black comedy--and hilariously effective testimony for God's
power. The sorcerers are the gangster's dumb sidekicks, ... and it's
wonderful to see them meet the deserved misfortune of flunkies
everywhere.
"Except for the trouncing of the sorcerers, however, the plagues don't
speak well for God. In fact, this is the most disturbing story in the
Bible so far--even more troubling than the Flood. The ten plagues
basically go like this. Moses and Aaron unleash a plague. Pharaoh
promises to let the Israelites go if God will lift the plague. The
plague ceases, and Pharaoh immediately reneges, so that another plague
is unleashed. The mystery, of course, is: why does Pharaoh renege?
Exodus tells us the answer: he reneges because God has 'stiffened his
heart.'
"Why would God keep hardening Pharaoh's heart so that He can inflict
yet another monstrous plague? Why would God prolong the Egyptians'
suffering? God tells us why. Listen carefully:
"For I have hardened his heart ... in order that I may display these
My signs among them, and that you may recount in the hearing of your
sons and your sons' sons how I made a mockery of the Egyptians and how
I displayed My signs among them--in order that you may know I am the
Lord.
"In other words, God is causing the plagues so that we can tell
stories about the plagues. He's torturing the Egyptians so that we
will worship Him. What kind of insecure and cruel God murders children
so that His followers will obey Him, and will tell stories about
Him? ... He even performs the last and worst plague--the slaying of
the firstborn--Himself. He wants the plagues to persist and worsen, so
that we will tell stories about them. And lo and behold, 3,500 years
later, that's exactly what we do every Passover.
"[And] how stupid is Pharaoh? Egypt has been pummeled by frogs,
vermin, lice, cattle disease, hail, and other plagues; it has lost all
its firstborn males (the plague that finally leads to freedom for the
Israelites); its gods are manifestly impotent against the wrath of our
God. But that doesn't deter the idiotic monarch from pursuing the
Israelites across the Red Sea."
David Plotz, Good Book, HarperCollins, Copyright 2009 by David Plotz,
Kindle Loc. 677-720.
-------------------------------------------------------------------------------------------------In today's excerpt--decisions. It has long been held that the rational
parts of our brains make the best decisions, and better decisions are
made when the emotional parts of our brains are suppressed in the
decision-making process. It turns out that the opposite is true--we
cannot make decisions without employing emotions:
"In 1982, a patient named Elliot walked into the office of neurologist
Antonio Damasio. A few months earlier, a small tumor had been cut out
of Elliot's cortex, near the frontal lobe of his brain. Before the
surgery, Elliot had been a model father and husband. He'd held down an
important management job in a large corporation and was active in his
local church. But the operation changed everything. Although Elliot's
IQ had stayed the same--he still tested in the 97th percentile--he now
exhibited one psychological flaw: he was incapable of making a
decision.
"This dysfunction made normal life impossible. Routine tasks that
should have taken ten minutes now required several hours. Elliot
endlessly deliberated over irrelevant details, like whether to use a
blue or black pen, what radio station to listen to, and where to park
his car. ... His indecision was pathological. Before long, Elliot was
fired from his job. That's when things really began to fall apart. He
started a series of new businesses, but they all failed. He was taken
in by a con man and was forced into bankruptcy. His wife divorced him.
The IRS began an investigation. He moved back in with his parents. ...
"But why was Elliot suddenly incapable of making good decisions? What
had happened to his brain? Damasio's first insight occurred while
talking to Elliot about the tragic turn his life had taken. ...
Damasio remembers, 'I never saw a tinge of emotion in my many hours of
conversation with him: no sadness, no impatience, no frustration.' ...
The results [of tests] were clear: Elliot felt nothing. He had the
emotional life of a mannequin.
"This was a completely unexpected discovery. At the time, neuroscience
assumed that human emotions were irrational. A person without any
emotions--in other words, someone like Elliot--should therefore make
better decisions. ... [However], when we are cut off from our
feelings, the most banal decisions become impossible. A brain that
can't feel can't make up its mind. ... Other patients with similar
patterns of brain damage ... all appeared intelligent and showed no
deficits on any conventional cognitive tests. And yet they all
suffered from the same profound flaw: because they didn't experience
emotion, they had tremendous difficulty making any decisions. ... The
crucial importance of our emotions--the fact that we can't make
decisions without them--contradicts the conventional view of human
nature, with its ancient philosophical roots. ...
"How does this emotional brain system work? The orbitofrontal cortex
(OFC), the part of the brain that Elliot was missing, is responsible
for integrating visceral emotions into the decision-making process. It
connects the feelings generated by the 'primitive' brain--areas like
the brain stem and the amygdala, which is in the limbic system--to the
stream of conscious thought. When a person is drawn to a specific
receiver, or a certain entree on the menu, or a particular romantic
prospect, the mind is trying to tell him that he should choose that
option. It has already assessed the alternatives--this analysis takes
place outside of conscious awareness--and converted that assessment
into a positive emotion. And when he sees a receiver who's tightly
covered, or smells a food he doesn't like, or glimpses an
ex-girlfriend, it is the OFC that makes him want to get away. The world
is full of things, and it is our feelings that help us choose among
them.
"When this neural connection is severed--when our OFCs can't
comprehend our own emotions--we lose access to the wealth of opinions
that we normally rely on. All of a sudden, you no longer know what to
think about the receiver running a short post pattern or whether it's
a good idea to order the cheeseburger for lunch. The end result is
that it's impossible to make decent decisions. This is why the OFC is
one of the few cortical regions that are markedly larger in humans
than they are in other primates. While Plato and Freud would have
guessed that the job of the OFC was to protect us from our emotions,
to fortify reason against feeling, its actual function is precisely
the opposite. From the perspective of the human brain, man is the most
emotional animal of all."
Jonah Lehrer, How We Decide, Houghton Mifflin, Copyright 2009 by Jonah
Lehrer, Kindle Loc. 233-87.
-------------------------------------------------------------------------------------------------In today's excerpt--population trends in Europe. Statisticians tell us
that a population must have an average of 2.1 children per family to
maintain its numerical level. Any less than that, and the population
declines:
"Three deeply misleading assumptions about demographic trends have
become lodged in the public mind. The first is that mass migration
into Europe, legal and illegal, combined with an eroding native
population base, is transforming the ethnic, cultural, and religious
identity of the continent. The second assumption, which is related to
the first, is that Europe's native population is in steady and serious
decline from a falling birthrate, and that the aging population will
place intolerable demands on governments to maintain public pension
and health systems. The third is that population growth in the
developing world will continue at a high rate. Allowing for the
uncertainty of all population projections, the most recent data
indicate that all of these assumptions are highly questionable and
that they are not a reliable basis for serious policy decisions. ...
"One fact that gets lost among ... is that the birthrates of Muslim
women in Europe--and around the world--have been falling significantly
for some time. Data on birthrates among different religious groups in
Europe are scarce, but they point in a clear direction. Between 1990
and 2005, for example, the fertility rate in the Netherlands for
Moroccan-born women fell from 4.9 to 2.9, and for Turkish-born women
from 3.2 to 1.9. ...
"In some Muslim countries--Tunisia, the United Arab Emirates, Bahrain,
Kuwait, and Lebanon--fertility rates have already fallen to
near-European levels. Algeria and Morocco, each with a fertility rate of
2.4, are both dropping fast toward such levels. Turkey is experiencing
a similar trend. ... Reports suggest that in Indonesia, the country
with the world's largest Muslim population, the fertility rate for the
years 2010-15 will drop to 2.02, a shade below replacement level. ...
"Iran is experiencing what may be one of the most dramatic demographic
shifts in human history. Thirty years ago, after the shah had been
driven into exile and the Islamic Republic was being established, the
fertility rate was 6.5. By the turn of the century, it had dropped to
2.2. Today, at 1.7, it has collapsed to European levels. The
implications are profound for the politics and power games of the
Middle East and the Persian Gulf, putting into doubt Iran's dreams of
being the regional superpower and altering the tense dynamics between
the Sunni and Shiite wings of Islam. ...
"The falling fertility rates in large segments of the Islamic world
have been matched by another significant shift: Across northern and
western Europe, women have suddenly started having more babies.
Germany's minister for the family, Ursula von der Leyen, announced in
February that the country had recorded its second straight year of
increased births. Sweden's fertility rate jumped eight percent in 2004
and stayed put. Both Britain and France now project that their
populations will rise from the current 60 million each to more than 75
million by midcentury. ...
"By contrast, the downward population trends for southern and eastern
Europe show little sign of reversal. Ukraine, for example, now has a
population of 46 million; if maintained, its low fertility rate will
whittle its population down by nearly 50 percent by mid-century. The
Czech Republic, Italy, and Poland face declines almost as drastic.
"In Russia, the effects of declining fertility are amplified by an
[AIDS and alcoholism] phenomenon so extreme that it has given rise to
an ominous new term--hypermortality. ... It is important to consider
what this means for the future of the Russian economy. Identified by
Goldman Sachs as one of the BRIC quartet (along with Brazil, India,
and China) of key emerging markets, Russia has been the object of
great hopes and considerable investments. But a very large question
mark must be placed on the economic prospects of a country whose young
male work force looks set to decrease by half."
Martin Walker, "The World's New Numbers," The Wilson Quarterly, Spring
2009, pp. 25-28.
-------------------------------------------------------------------------------------------------In today's excerpt--the Columbine massacre, at which 13 students were
murdered and many more injured. As reported by Dave Cullen, local law
enforcement had a large amount of incriminating evidence on Eric
Harris, the leader of the massacre, months and years before April 20,
1999, the tragic day--including a felony arrest, evidence of multiple
hate crimes, a venomous web site, and the urgent complaints of other
parents. However, they not only failed to follow up diligently on
these, they denied their existence for years following the tragedy:
"Jeffco [Jefferson County law enforcement] had a problem. Before Eric
[Harris] and Dylan [Klebold] shot themselves, officers had discovered
files on the boys. The cops had twelve pages from Eric's web site,
spewing hate and threatening to kill. For detectives, a written
confession, discovered before the killers were captured, was a big
break. It certainly simplified the search warrant. But for commanders,
a public confession, which they had sat on since 1997--that could be a
PR disaster.
"The Web pages had come from Randy and Judy Brown [parent's of Eric's
classmate Brooks Brown] . They had warned the sheriff's department
repeatedly about Eric, for more than a year and a half. Sometime
around noon, April 20, the file was shuttled to the command center in
a trailer set up in Clement Park. Jeffco officials quoted Eric's site
extensively in the search warrants executed that afternoon, but then
denied ever seeing it. (They would spend several years repeating those
denials. They suppressed the damning warrants as well.) Then Sheriff
Stone fingered [the innocent] Brooks as a suspect on The Today Show.
"It was a rough time for the Brown family. The public got two
conflicting stories: Randy and Judy Brown had either labored to
prevent Columbine or raised one of its conspirators. Or both.
"To the Browns it looked like retribution. Yes, their son had been
close to the killers--close enough to see it coming. The Browns had
blown the whistle on Eric Harris over a year earlier, and the cops had
done nothing. After Eric went through with his threats, the Browns
were fingered as accomplices instead of heroes. They couldn't believe
it. They told The New York Times they had contacted the sheriff's
department about Eric fifteen times. Jeffco officials would insist for
years that the Browns never met with an investigator--despite holding
a report indicating they had.
"The officers knew they had a problem, and it was much worse than the
Browns realized. Thirteen months before the massacre, Sheriff's
Investigators John Hicks and Mike Guerra had investigated one of the
Browns' complaints. They'd discovered substantial evidence that Eric
was building pipe bombs. Guerra had considered it serious enough to
draft an affidavit for a search warrant against the Harris home. For
some reason, the warrant was never taken before a judge. Guerra's
affidavit was convincing. It spelled out all the key components:
motive, means, and opportunity. A few days after the massacre, about a
dozen local officials slipped away from the Feds and gathered
clandestinely in an innocuous office in the county Open Space
Department building. It would come to be known as the Open Space
meeting. The purpose was to discuss the affidavit for a search
warrant. How bad was it? What should they tell the public?
"Guerra was driven to the meeting, and told never to discuss it
outside that group. He complied. The meeting was kept secret, too.
That held for five years. ... He described it as 'one of those
cover-your-ass meetings.' ...
"At a notorious press conference ten days after the murders, Jeffco
officials suppressed the affidavit and boldly lied about what they had
known. They said they could not find Eric's Web pages, they found no
evidence of pipe bombs matching Eric's descriptions, and had no record
of the Browns' meeting with Hicks. Guerra's affidavit plainly
contradicted all three claims. Officials had just spent days reviewing
it. They would repeat the lies for years."
Dave Cullen, Columbine, Hachette Book Group, Copyright 2009 by Dave
Cullen, pp. 165-166.
-------------------------------------------------------------------------------------------------In today's excerpt--the persecution of Christians in the Roman Empire,
and the subsequent persecution of "pagans" by Christians, as discussed
by James J. O'Donnell, a Princeton- and Yale-educated historian who has
taught at Bryn Mawr, Cornell, and Penn, and has published widely on the
history and culture of the late antique Mediterranean world:
"Christianity, born and bred in the Jewish matrix, made the rest of
the world what it called pagan by detaching the Jewish assertion of
uniqueness from place of origin, and opening membership to all
humankind. 'Go and teach all nations,' Jesus was said to have taught,
and Christians most often took this teaching quite seriously, even if
it didn't move most of them to relocate and teach in strange lands.
They followed in this regard not only Jesus but Paul, for it was
Paul's reading of Christianity--as something far more ambitious than
the revival or fulfillment of traditional Judaism--that prevailed in
the end.
"Forcing a message of uniqueness and exclusivity allowed Christians to
make themselves satisfyingly unpopular. Persecution became their badge
of success. Popular imagination probably still thinks of a long period
of time in which hard-nosed Roman governors regularly pulled brave,
dewy-eyed, idealistic Christians off the streets, tortured them, and
then fed them to the lions. The facts are less glamorous, but the
influential church historian Eusebius, a fourth-century contemporary
and supporter of Constantine, imbued this idea with long life in his
account of ten waves of persecution that mirrored Egypt's ten plagues
in the time of Moses. What really happened was episodic, local, and
highly inconsistent. ...
"Most Christians lived and died like their fellow Romans, undisturbed
by government, quarreling now and then with some of their neighbors.
In the 250s, the emperor Decius ordered the suppression of
Christianity, and in the early 300s, the emperor Galerius launched the
most systematic attempt ever to deter and uproot Christian practice.
In such times, suspect Christians were required to perform some
minimal public religious act and get a certificate to prove they had
done so. There is no sign that such fits of suppression and
persecution had any lasting effect.
"Christians resisted persecution well--both the ordinary spasmodic
kind and the infrequent broader campaign--because their communities
were many-headed, did not have substantial real property, and lived so
fully intermingled with Roman society that they could not simply be
carved out and attacked. A century after Galerius, when Christian
emperors set out to--we might as well use the word--persecute 'pagan'
communities and practices, they were far more devastatingly effective.
They halted the supply of state funds for traditional practices,
crippling much of what had been long familiar. Then they seized
buildings and banned ritual in them, sweeping the landscape nearly
clean of the old ways. What survived--and much did--was personal,
small-scale, or highly localized. Over a relatively short time, the
new bludgeoned the old into submission and eventually supplanted it.
That's what real persecution could do, unafraid to use violence but
not needing to use very much of it. But Christianity never faced
anything like what it would later visit on the traditional cults."
James J. O'Donnell, The Ruin of the Roman Empire, HarperCollins,
Copyright 2008 by James J. O'Donnell, pp. 150-151.
-------------------------------------------------------------------------------------------------In today's excerpt--the Columbine massacre and the media. In the hours
after the April 20, 1999 Columbine massacre, the press began reporting
rumors as fact--that the killers were "targeting" jocks, were victims
of bullying, were Goths, and belonged to a gang called the Trench Coat
Mafia. The myths they promulgated in those first few hours were all
incorrect yet persist as explanations in the popular mind even to this
day:
"The Trench Coat Mafia [explanation] was mythologized because it was
colorful, memorable, and fit the existing myth of the school shooter
as outcast loner. All the Columbine myths worked that way. And they
all sprang to life incredibly fast--most of the notorious myths took
root [in the few hours] before the killers' bodies were found.
"We remember Columbine as a pair of outcast Goths from the Trench Coat
Mafia snapping and tearing through their high school hunting down
jocks to settle a long-running feud. Almost none of that happened. No
Goths, no outcasts, nobody snapping. No targets, no feud, and no
Trench Coat Mafia. Most of those elements existed at Columbine--which
is what gave them such currency. They just had nothing to do with the
murders. The lesser myths are equally unsupported: no connection to
Marilyn Manson, Hitler's birthday, minorities, or Christians. Few
people knowledgeable about the case believe those myths anymore. Not
reporters, investigators, families of the victims, or their legal
teams. And yet most of the public takes them for granted. Why? ...
"In a school of two thousand, most of the student body didn't even
know the boys. Nor had many seen gunfire directly. Initially, most
students told reporters they had no idea who attacked them. That
changed fast. Most of the two thousand got themselves to a television
or kept a constant cell phone vigil with viewers. It took only a few
TV mentions for the trench coat connection to take hold. It sounded so
obvious. Of course! Trench coats, Trench Coat Mafia! ...
"Repetition was the problem. Only a handful of students mentioned the
Trench Coat Mafia (TCM) during the first five hours of CNN
coverage--virtually all fed from local news stations. But reporters homed in on
the idea. ... Kids 'knew' the TCM was involved because witnesses and
news anchors had said so on TV. They confirmed it with friends
watching similar reports. ... Pretty soon, most of the students had
multiple independent confirmations. They believed they knew the TCM
was behind the attack as a fact. From 1:00 to 8:00 P.M., the number of
students in Clement Park citing the group went from almost none to
nearly all. They weren't making it up, they were [simply] repeating it
back. ...
"The writers assumed kids were informing the media. It was the other
way around. Most of the myths were in place by nightfall. By then, it
was a given that the killers had been targeting jocks. The target myth
was the most insidious, because it went straight to motive. The public
believes Columbine was an act of retribution: a desperate reprisal for
unspeakable jock-abuse. Like the other myths, it began with a kernel
of truth.
"Bullying and racism? Those were known threats. Explaining it away was
reassuring. By evening, the target theory was dominating most
broadcasts; nearly all the major papers featured it. ... Reuters
attributed the theory to 'many witnesses' and USA Today to
'students.' ... If students said targeting, that was surely it. Police
detectives ... were baffled by this media consensus."
Dave Cullen, Columbine, Hachette Book Group, Copyright 2009 by Dave
Cullen, pp. 149-152.

-------------------------------------------------------------------------------------------------In today's excerpt--psychopaths. Eric Harris, leader of the duo that
carried out the Columbine massacre, was a psychopath:
"Eric Harris was neither normal nor insane. Psychopathy (si-COP-uh-thee)
represents a third category. Psychopathic brains don't function
like those in either of the other groups, but they are consistently
similar to one another. Eric killed for two reasons: to demonstrate
his superiority and to enjoy it.
"To a psychopath, both motives make sense. Psychopaths ... can torture
and mutilate their victims with about the same sense of concern that
we feel when we carve a turkey for Thanksgiving dinner.' ...
"Psychopaths have likely plagued mankind since the beginning, but they
are still poorly understood. In the 1800s, as the fledgling field of
psychology began classifying mental disorders, one group refused to
fit. Every known psychosis was marked by a failure of reasoning or a
debilitating ailment: paralyzing fear, hallucinations, voices,
phobias, and so on. In 1885, the term psychopath was introduced to
describe vicious human predators who were not deranged, delusional, or
depressed. They just enjoyed being bad.
"Psychopaths are distinguished by two characteristics. The first is a
ruthless disregard for others: they will defraud, maim, or kill for
the most trivial personal gain. The second is an astonishing gift for
disguising the first. It's the deception that makes them so dangerous.
You never see him coming. (It's usually a him--more than 80 percent
are male.) Don't look for the oddball creeping you out. Psychopaths
don't act like Hannibal Lecter or Norman Bates. They come off like
Hugh Grant, in his most adorable role. ...
"Psychopaths take great personal pride in their deceptions and extract
tremendous joy from them. Lies become the psychopath's occupation, and
when the truth will work, they lie for sport. 'I like to con people,'
one subject told a researcher during an extended interview. 'I'm
conning you right now.' Lying for amusement is so profound in
psychopaths, it stands out as their signature characteristic. 'Duping
delight,' psychologist Paul Ekman dubbed it. ...
"Symptoms appear so early, and so often in stable homes with normal
siblings, that the condition seems to be inborn. Most parents report
having been aware of disturbing signs before the child entered
kindergarten. Dr. Robert Hare described a five-year-old girl
repeatedly attempting to flush her kitten down the toilet. 'I caught
her just as she was about to try again,' the mother said. 'She seemed
quite unconcerned, maybe a bit angry--about being found out.' When the
woman told her husband, the girl calmly denied the whole thing. Shame
did not register; neither did fear. Psychopaths are not individuals
losing touch with those emotions. They never developed them from the
start. ...
"Researchers are still just beginning to understand psychopaths, but
they believe psychopaths crave the emotional responses they lack. They
are nearly always thrill seekers. They love roller coasters and hang
gliding, and they seek out high-anxiety occupations, like ER tech,
bond trader, or Marine. Crime, danger, impoverishment, death--any sort
of risk will help. They chase new sources of excitement because it is
so difficult for them to sustain.
"They rarely stick with a career; they get bored. Even as career
criminals, psychopaths underperform. They 'lack clear goals and
objectives, getting involved in a wide variety of opportunistic
offenses, rather than specializing the way typical career criminals
do,' Dr. Hervey Cleckley wrote. They make careless mistakes and pass
up stunning opportunities, because they lose interest. They perform
spectacularly in short bursts--a few weeks, a few months, a yearlong
big con--then walk away. ...
"Rare killer psychopaths nearly always get bored with murder, too.
When they slit a throat, their pulse races, but it falls just as fast.
It stays down--no more joy from cutting throats for a while; that
thrill has already been spent."
Dave Cullen, Columbine, Hachette Book Group, Copyright 2009 by Dave
Cullen, pp. 239-244.
-------------------------------------------------------------------------------------------------In today's excerpt--eerie coincidences. Things often happen that seem
like omens or fateful coincidences--the pianist at the bar starts
playing a song you'd just been thinking of, or you pass the window of
a pawnshop and see the heirloom ring that had been stolen from your
apartment eighteen months ago, or a long lost friend calls just after
you learn of a personal tragedy. Natalie Angier explains that often
things we view as omens are instead reasonably probable outcomes:
"The more one knows about probabilities, the less amazing the most
woo-woo coincidences become. ...
"John Littlewood, a renowned mathematician at the University of
Cambridge, formalized the apparent intrusion of the supernatural into
ordinary life as a kind of natural law, which he called 'Littlewood's
Law of Miracles.' He defined a 'miracle' as many people might: a one-in-a-million event to which we accord real significance when it
occurs. By his law, such 'miracles' arise in anyone's life at an
average of once a month. Here's how Littlewood explained it: You are
out and about and barraged by the world for some eight hours a day.
You see and hear things happening at a rate of maybe one per second,
amounting to 30,000 or so 'events' a day, or a million per month. The
vast majority of events you barely notice, but every so often, from
the great stream of happenings, you are treated to a marvel: the
pianist at the bar starts playing a song you'd just been thinking of,
or you pass the window of a pawnshop and see the heirloom ring that
had been stolen from your apartment eighteen months ago. Yes, life is
full of miracles, minor, major, middling C. It's called 'not being in
a persistent vegetative state' and 'having a life span longer than a
click beetle's.'
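Littlewood's arithmetic is easy to check. Here is a minimal Python
sketch of the excerpt's own figures--eight alert hours a day, roughly
one noticed 'event' per second; the names are illustrative, not
Littlewood's notation:

    # Back-of-envelope check of Littlewood's Law of Miracles.
    SECONDS_PER_HOUR = 3600
    events_per_day = 8 * SECONDS_PER_HOUR     # 28,800 -- the "30,000 or so"
    events_per_month = events_per_day * 30    # 864,000 -- roughly a million

    # At one-in-a-million odds per event, about one "miracle" is
    # expected each month.
    print(events_per_month / 1_000_000)       # ~0.86 miracles per month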
"And because there is nothing more miraculous than birth, [Professor]
Deborah Nolan also likes to wow her new students with the famous
birthday game. I'll bet you, she says, that at least two people in
this room have the same birthday. The sixty-five people glance around
at one another and see nothing close to a year's offering of days
represented, and they're dubious. Nolan starts at one end of the
classroom, asks the student her birthday, writes it on the blackboard,
moves to the next, and jots likewise, and pretty soon, yup, a
duplicate emerges. How can that be, the students wonder, with less
than 20 percent of 365 on hand to choose from (or 366 if you want to
be leap-year sure of it)? First, Nolan reminds them of what they're
talking about--not the odds of matching a particular birthday, but of
finding a match, any match, somewhere in their classroom sample.
"She then has them think about the problem from the other direction:
What are the odds of them not finding a match? That figure, she
demonstrates, falls rapidly as they proceed. Each time a new birth
date is added to the list, another day is dinged from the possible 365
that could subsequently be cited without a match. Yet each time the
next person is about to announce a birthday, the pool the student
theoretically will pick from remains what it always was--365. One
number is shrinking, in other words, while the other remains the same,
and because the odds here are calculated on the basis of comparing
(through multiplication and division) the initial fixed set of
possible options with an ever diminishing set of permissible ones, the
probability of finding no birthday match in a group of sixty-five
plunges rapidly to below 1 percent. Of course, the prediction is only
a probability, not a guarantee. For all its abstract and
counterintuitive texture, however, the statistic proves itself time
and again in Nolan's classroom a dexterous gauge of reality.
"If you're not looking for such a high degree of confidence, she adds,
but are willing to settle for a fifty-fifty probability of finding a
shared birthday in a gathering, the necessary number of participants
accordingly can be cut to twenty-three."
Natalie Angier, The Canon, Houghton Mifflin, Copyright 2007 by Natalie
Angier, pp. 50-52.
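Nolan's classroom figures are straightforward to reproduce. A minimal
Python sketch of the calculation described above--multiplying out the
shrinking odds of avoiding a match--using the excerpt's own group
sizes of 23 and 65:

    # Probability that n people all have distinct birthdays (365-day year).
    def prob_no_match(n: int) -> float:
        p = 1.0
        for i in range(n):
            p *= (365 - i) / 365  # each newcomer must miss every earlier birthday
        return p

    for n in (23, 65):
        print(n, "P(shared birthday) =", round(1 - prob_no_match(n), 4))
    # 23 people -> ~0.5073, the fifty-fifty point
    # 65 people -> ~0.9977, i.e. the chance of no match is below 1 percent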

-------------------------------------------------------------------------------------------------In today's encore excerpt--from the annals of evolutionary psychology,
the observation that males want lots of sex, and sometimes bring
gifts:
"In species after species, females are coy and males are not. Indeed,
males are so dim in their sexual discernment they may pursue things
other than females. Among some kinds of frogs, mistaken homosexual
courtship is so common that a 'release call' is used by males who find
themselves in the clutches of another male to notify them that they
are both wasting their time. Male snakes, for their part, have been
known to spend a while with dead females before moving on to a live
prospect. And male turkeys will avidly court a stuffed replica of a
female turkey. In fact, a replica of a female turkey's head suspended
fifteen inches from the ground will generally do the trick. The male
circles the head, does its ritual displays, and then (confident,
presumably, that its performance has been impressive) rises into the
air and comes down in the proximity of the female's backside, which
turns out not to exist. The more virile males will show such interest
even when a wooden head is used, and a few can summon lust for a
wooden head with no eyes or beak. ...
"For a species low in [the need] for male parental [involvement], the
basic dynamic of courtship, as we've seen, is pretty simple: the male
really wants sex; the female isn't so sure. She may want time to
(unconsciously) assess the quality of his genes, whether by inspecting
him or letting him battle with other males for her favor. She may also
pause to weigh the chances that he carries a disease. And she may try
to extract a precopulation gift, taking advantage of the high demand
for her eggs. This 'nuptial offering'--which technically constitutes a
tiny male parental investment, since it nourishes her and her eggs--is
seen in a variety of species, ranging from primates to black-tipped
hanging flies. The female hanging fly insists on having a dead insect
to eat during sex. If she finishes before the male is finished, she
may head off in search of another meal, leaving him high and dry. If
she isn't so quick, the male may repossess the leftovers for
subsequent dates."
Robert Wright, The Moral Animal, First Vintage, Copyright 1994 by
Robert Wright, pp. 46-47, 59-60.
-------------------------------------------------------------------------------------------------In today's excerpt - Jonah Lehrer proposes that morality is a form of
decision-making, and is based on emotions, not logic:
"Psychopaths shed light on a crucial subset of decision-making that's
referred to as morality. Morality can be a squishy, vague concept, and
yet, at its simplest level, it's nothing but a series of choices about
how we treat other people. When you act in a moral manner - when you
recoil from violence, treat others fairly, and help strangers in need
- you are making decisions that take people besides yourself into
account. You are thinking about the feelings of others, sympathizing
with their states of mind.
"This is what psychopaths can't do. ... They are missing the primal
emotional cues that the rest of us use as guides when making moral
decisions. The psychopath's brain is bored by expressions of terror.
The main problem seems to be a broken amygdala, a brain area
responsible for propagating aversive emotions such as fear and
anxiety. As a result, psychopaths never feel bad when they make other
people feel bad. ... Hurting someone else is just another way of
getting what he wants, a perfectly reasonable way to satisfy desires.
The absence of emotion makes the most basic moral concepts
incomprehensible. G. K. Chesterton was right: 'The madman is not the
man who has lost his reason. The madman is the man who has lost
everything except his reason.'
"At first glance, the connection between morality and the emotions
might be a little unnerving. Moral decisions are supposed to rest on a
firm logical and legal foundation. Doing the right thing means
carefully weighing competing claims, like a dispassionate judge. These
aspirations have a long history. The luminaries of the Enlightenment,
such as Leibniz and Descartes, tried to construct a moral system
entirely free of feelings. Immanuel Kant argued that doing the right
thing was merely a consequence of acting rationally. Immorality, he
said, was a result of illogic. ... The modern legal system still
subscribes to this antiquated set of assumptions and pardons anybody
who demonstrates a 'defect in rationality' - these people are declared
legally insane, since the rational brain is supposedly responsible for
distinguishing between right and wrong. If you can't reason, then you
shouldn't be punished.
"But all of these old conceptions of morality are based on a
fundamental mistake. Neuroscience can now see the substrate of moral
decisions, and there's nothing rational about it. 'Moral judgment is
like aesthetic judgment,' writes Jonathan Haidt, a psychologist at the
University of Virginia. 'When you see a painting, you usually know
instantly and automatically whether you like it. If someone asks you
to explain your judgment, you confabulate ... Moral arguments are much
the same: Two people feel strongly about an issue, their feelings come
first, and their reasons are invented on the fly, to throw at each
other.'
"Kant and his followers thought the rational brain acted like a
scientist: we used reason to arrive at an accurate view of the world.
This meant that morality was based on objective values; moral
judgments described moral facts. But the mind doesn't work this way.
When you are confronted with an ethical dilemma, the unconscious
automatically generates an emotional reaction. (This is what
psychopaths can't do.) Within a few milliseconds, the brain has made
up its mind; you know what is right and what is wrong. These moral
instincts aren't rational. ...
"It's only after the emotions have already made the moral decision
that those rational circuits in the prefrontal cortex are activated.
People come up with persuasive reasons to justify their moral
intuition. When it comes to making ethical decisions, human
rationality isn't a scientist, it's a lawyer. This inner attorney
gathers bits of evidence, post hoc justifications, and pithy rhetoric
in order to make the automatic reaction seem reasonable. But this
reasonableness is just a facade, an elaborate self-delusion. Benjamin
Franklin said it best in his autobiography: 'So convenient a thing it
is to be a reasonable creature, since it enables one to find or make a
reason for everything one has a mind to do.'
"In other words, our standard view of morality - the philosophical
consensus for thousands of years - has been exactly backward. We've
assumed that our moral decisions are the byproducts of rational
thought, that humanity's moral rules are founded in such things as the
Ten Commandments and Kant's categorical imperative. Philosophers and
theologians have spilled lots of ink arguing about the precise logic
of certain ethical dilemmas. But these arguments miss the central
reality of moral decisions, which is that logic and legality have
little to do with anything."
Jonah Lehrer, How We Decide, Houghton Mifflin Harcourt, Copyright
2009 by Jonah Lehrer, Kindle Loc. 1922-79.
-------------------------------------------------------------------------------------------------In today's encore excerpt - the discovery of America. Author Tony
Horwitz muses on the discovery of America after hearing from a
Plymouth Rock tour guide named Claire that the most common question
from tourists was why the date etched on the rock was 1620 instead of
1492:
" 'People think Columbus dropped off the Pilgrims and sailed home.'
Claire had to patiently explain that Columbus's landing and the
Pilgrims' arrival occurred a thousand miles and 128 years apart. ...
"By the time the first English settled, other Europeans had already
reached half of the forty-eight states that today make up the
continental United States. One of the earliest arrivals was Giovanni
da Verrazzano, who toured the Eastern Seaboard in 1524, almost a full
century before the Pilgrims arrived. ... Even less remembered are the
Portuguese pilots who steered Spanish ships along both coasts of the
continent in the sixteenth century, probing upriver to Bangor, Maine,
and all the way to Oregon. ... In 1542, Spanish conquistadors
completed a reconnaissance of the continent's interior: scaling the
Appalachians, rafting the Mississippi, peering down the Grand Canyon,
and galloping as far inland as central Kansas. ...
"The Spanish didn't just explore: they settled, from the Rio Grande to
the Atlantic. Upon founding St. Augustine, the first European city on
U.S. soil, the Spanish gave thanks and dined with Indians - fifty-six
years before the Pilgrim Thanksgiving at Plymouth. ... Plymouth, it
turned out, wasn't even the first English colony in New England. That
distinction belonged to Fort St. George, in Popham, Maine. Nor were
the Pilgrims the first to settle Massachusetts. In 1602, a band of
English built a fort on the island of Cuttyhunk. They came, not for
religious freedom, but to get rich from digging sassafras, a commodity
prized in Europe as a cure for the clap. ...
"The Pilgrims, and later, the Americans who pushed west from the
Atlantic, didn't pioneer a virgin wilderness. They occupied a land
long since transformed by European contact. ... Samoset, the first
Indian the Pilgrims met at Plymouth, greeted the settlers in English.
The first thing he asked for was beer."
Tony Horwitz, A Voyage Long and Strange, Henry Holt, Copyright 2008 by
Tony Horwitz, pp. 3-6.
-------------------------------------------------------------------------------------------------In today's excerpt - 1848, the year of revolution - perhaps the most
pivotal year in the violent transformation of the countries of Europe
from monarchies to democracies. By 1848, the industrial revolution was
powerfully transforming Europe, creating a population explosion and a
large middle class. This middle class began to form an irrepressible
counterbalance to absolute monarchy and ultimately would not be denied
the right to participate in government - and thus the industrial
revolution itself spawned the democracies of the West. But the
industrial revolution also created a new class of laborers subject to
the hard and unending demands of the new manufacturing era, and these
political revolutions were also replete with ethnic conflict:
"In 1848 a violent storm of revolutions tore through Europe. With an
astounding rapidity, crowds of working-class radicals and middle-class
liberals in Paris, Milan, Venice, Naples, Palermo, Vienna, Prague,
Budapest, Krakow and Berlin toppled the old regimes and began the task
of forging a new, liberal order. Political events so dramatic had not
been seen in Europe since the French Revolution of 1789 - and would
not be witnessed again until the revolutions of Eastern and Central
Europe in 1989, or perhaps the less far-reaching Bolshevik Revolution
of 1917. ... The brick-built authoritarian edifice that had imposed
itself on Europeans for almost two generations folded under the weight
of the insurrections. ...
"For the Germans, Italians, Hungarians, Romanians, Poles, Czechs,
Croats and Serbs, the year was to be the 'Springtime of Peoples', a
chance to assert their own sense of national identity and to gain
political recognition. In the cases of the Germans and the Italians,
it was an opportunity for national unification under a liberal or even
democratic order. Nationalism, therefore, was one issue that came
frothing to the surface of European politics in 1848. While rooted in
constitutionalism and civil rights, it was a nationalism that,
ominously, made little allowance for the legitimacy of claims of other
national groups. In many places such narrowness of vision led to
bitter ethnic conflict, which in the end helped to destroy the
revolutionary regimes of Central and Eastern Europe. ...
"The revolutions were scarred almost everywhere by a bitter, often
violent, political polarization. Moderates wanted parliamentary
government - but not necessarily to enfranchise everyone - and they
were challenged by radicals who wanted democracy - frequently combined
with dramatic social reform - without delay....
"A third issue that came boiling to the surface in 1848 and never left
the European political agenda was the 'social question.' The abject
misery of both urban and rural people had loomed menacingly in the
thirty or so years since the Napoleonic Wars. The poverty was caused
by a burgeoning population, which was not yet offset by a
corresponding growth in the economy. Governments, however, did little
to address the social distress, which was taken up as a cause by a
relatively new political current - socialism - in 1848. The
revolutions therefore thrust the 'social question' firmly and
irrevocably into politics. Any subsequent regime, no matter how
conservative or authoritarian, ignored it at its peril. In 1848,
however, the question of what to do about poverty would prove to be
one of the great nemeses of the liberal, revolutionary regimes."
Mike Rapport, 1848, Basic Books, Copyright 2008 by Mike Rapport, pp.
ix-x.
-------------------------------------------------------------------------------------------------In today's excerpt - memory and fear. The emotion of each memory is
chemically encoded in the brain's amygdala. And each memory is changed
- chemically altered - each time we retrieve it, for better or for
worse. Therapists try to use this in helping patients overcome fears:
"Learned fears [such as stage-fright] are acquired in part in
circuitry centering on the amygdala, which Joseph LeDoux likes to call
the brain's 'Fear Central.' LeDoux knows the neural terrain of the
amygdala intimately; he's been studying this clump of neurons for
decades at the Center for Neural Science at New York University. The
cells in the amygdala where sensory information registers, and the
adjacent areas that acquire fear, LeDoux has discovered, actually fire
in new patterns at the moment a fear has been learned.
"Our memories are in part reconstructions. Whenever we retrieve a
memory, the brain rewrites it a bit, updating the past according to
our present concerns and understanding. At the cellular level, LeDoux
explains, retrieving a memory means it will be 'reconsolidated,'
slightly altered chemically by a new protein synthesis that will help
store it anew after being updated.
"Thus each time we bring a memory to mind, we adjust its very
chemistry: the next time we retrieve it, that memory will come up as
we last modified it. The specifics of the new consolidation depend on
what we learn as we recall it. If we merely have a flare-up of the
same fear, we deepen our fearfulness.
"But, ... if at the time of the fear we tell ourselves something that
eases its grip, then the same memory becomes reencoded with less power
over us. Gradually, we can bring the once-feared memory to mind
without feeling the rush of distress all over again. In such a case,
says LeDoux, the cells in our amygdala reprogram so that we lose the
original fear conditioning. One goal of therapy, then, can be seen as
gradually altering the neurons for learned fear.
"Treatments sometimes actually expose the person to whatever primes
their fear. Exposure sessions begin with getting the person relaxed,
often through a few minutes of slow abdominal breathing. Then the
person confronts the threatening situation, in a careful gradation
culminating in the very worst version.
"[For example], one New York City traffic officer confided that she
had flown into a rage at a motorist who called her a 'low-life bitch.'
So in her exposure therapy that phrase was repeated to her, first in a
flat tone, then with increasing emotional intensity, and finally with
added obscene gestures. The exposure succeeds when, no matter how
obnoxious the repeated phrase, she can stay relaxed - and presumably
when back on the street she can calmly write a traffic ticket despite
insults."
Daniel Goleman, Social Intelligence, Bantam, Copyright 2006, pp.
78-79.
-------------------------------------------------------------------------------------------------In today's excerpt - the 80/20 rule, the expression commonly used to
state that a small percentage of the total of any set accounts for a
large percentage of the output or effect of that set:
"Have you ever heard of the 80/20 rule? It is the common signature of
a power law - actually it is how it all started, when Vilfredo Pareto
made the observation that 80 percent of the land in Italy was owned by
20 percent of the people. Some use the rule to imply that 80 percent
of the work is done by 20 percent of the people. Or that 80 percent
worth of effort contributes to only 20 percent of results, and vice
versa.
"As far as axioms go, this one wasn't phrased to impress you the most:
it could easily be called the 50/01 rule, that is, 50 percent of the
work comes from 1 percent of the workers. This formulation makes the
world look even more unfair, yet the two formulae are exactly the
same. How? Well, if there is inequality, then those who constitute the
20 percent in the 80/20 rule also contribute unequally - only a few of
them deliver the lion's share of the results. This trickles down to
about one in a hundred contributing a little more than half the total.
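The equivalence of 80/20 and (roughly) 50/01 can be checked with one
line of algebra. For a Pareto distribution--a common model for this
kind of inequality, though the model and the tail index below are an
illustration, not Taleb's--the share of the total produced by the top
fraction p is p ** (1 - 1/alpha). A minimal Python sketch:

    # For a Pareto(alpha) population, the top fraction p of contributors
    # accounts for p ** (1 - 1/alpha) of the total output.
    ALPHA = 1.16  # tail index that yields the classic 80/20 split

    def top_share(p: float, alpha: float = ALPHA) -> float:
        return p ** (1 - 1 / alpha)

    print(f"top 20%: {top_share(0.20):.0%}")  # ~80% -- the 80/20 rule
    print(f"top  1%: {top_share(0.01):.0%}")  # ~53% -- about the 50/01 restatement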
"The 80/20 rule is only metaphorical; it is not a rule, even less a
rigid law. In the U.S. book business, the proportions are more like
97/20 (i.e., 97 percent of book sales are made by 20 percent of the
authors); it's even worse if you focus on literary nonfiction (twenty
books out of close to eight thousand represent half the sales).
"Note here that it is not all uncertainty. In some situations you may
have a concentration, of the 80/20 type, with very predictable and
tractable properties, which enables clear decision making, because you
can identify beforehand where the meaningful 20 percent are. These
situations are very easy to control. For instance, Malcolm Gladwell
wrote in an article in The New Yorker that most abuse of prisoners is
attributable to a very small number of vicious guards. Filter those
guards out and your rate of prisoner abuse drops dramatically. (In
publishing, on the other hand, you do not know beforehand which book
will bring home the bacon. The same with wars, as you do not know
beforehand which conflict will kill a portion of the planet's
residents.)"
Nassim Nicholas Taleb, The Black Swan, Random House, Copyright 2007 by
Nassim Nicholas Taleb, pp. 235-236.
-------------------------------------------------------------------------------------------------In today's excerpt - the emptiness of space, the distances between
planets and stars:
"To gain a richer sense of cosmic proportions, we can paraphrase
William Blake, and see the Earth as a fine grain of sand. The sun,
then, would be an orange-sized object twenty feet away, while Jupiter,
the biggest planet of the solar system, would be a pebble eighty-four
feet in the other direction - almost the length of a basketball court
- and the outermost orbs of the solar system, Neptune and Pluto, would
be larger and smaller grains, respectively, found at a distance of two
and a quarter blocks from Granule Earth.
"Beyond that, the gaps between scenic vistas become absurd, and it's
best to settle in for a nice, comfy coma. Assuming our little orrery
of a solar system is tucked into a quiet neighborhood in Newark, New
Jersey, you won't reach the next stars - the Alpha Centauri triple
star system - until somewhere just west of Omaha, or the star after
that until the foothills of the Rockies. And in between astronomical
objects is lots and lots of space, silky, sullen, inky-dinky space,
plenty of nothing, nulls within voids. Just as the dominion of the
very small, the interior of the atom, is composed almost entirely of
empty space, so, too, is the kingdom of the heavens. Nature, it seems,
adores a vacuum.
" 'The universe is a pretty empty place, and that's something most
people don't get,' said Michael Brown of Caltech. 'You go watch Star
Wars, and you see the heroes flying through an asteroid belt, and
they're twisting and turning nonstop to avoid colliding with
asteroids.' In reality, he said, when the Galileo spacecraft flew
through our solar system's asteroid belt in the early 1990s, NASA
spent millions of dollars in a manic effort to steer the ship close
enough to one of the rubble rocks to take photos and maybe sample a
bit of its dust. 'And when they got lucky and the spacecraft actually
passed by two asteroids, it was considered truly amazing,' said Brown.
'For most of Galileo's journey, there was nothing. Nothing to see,
nothing to take pretty pictures of. And we're talking about the solar
system, which is a fairly dense region of the universe.'
"Don't be fooled by the gorgeous pictures of dazzling pinwheel
galaxies with sunny-side bulges in their midsections, either. They,
too, are mostly ghostly: the average separation between stars is about
100,000 times greater than the distance between us and the Sun. Yes,
our Milky Way has about 300 billion stars to its credit, but those
stars are dispersed across a chasmic piece of property 100,000 light-years in diameter. That's roughly 6 trillion miles (the distance light
travels in a year) multiplied by 100,000 ... miles wide. Even using
the shrunken scale of a citrus sun lying just twenty feet away from
our sand-grain Earth, crossing the galaxy would require a trip of more
than 24 million miles."
Natalie Angier, The Canon, Houghton Mifflin, Copyright 2007 by Natalie
Angier, pp. 81-82.
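The model's closing figure can be verified with a little arithmetic. A
minimal Python sketch using standard round values (93 million miles
from Earth to the sun, roughly 6 trillion miles to a light-year) and
the excerpt's twenty-foot scale:

    # Scale model: a sand-grain Earth with an orange-sized sun twenty feet away.
    AU_MILES = 93_000_000    # real Earth-sun distance, in miles
    LY_MILES = 5.88e12       # one light-year, about 6 trillion miles
    GALAXY_LY = 100_000      # Milky Way diameter, in light-years

    feet_per_real_mile = 20 / AU_MILES
    model_feet = GALAXY_LY * LY_MILES * feet_per_real_mile
    print(model_feet / 5280)  # ~24 million model miles to cross the galaxy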
-------------------------------------------------------------------------------------------------In today's excerpt - the U.N. estimates that sea levels will rise
about a foot over the rest of this century. Yet since 1860, our planet
has experienced a sea-level rise of about a foot without calamity.
Bjorn Lomborg, a leading voice of the "emerging pragmatic center" in
the highly-charged debate on climate change, fully acknowledges a
global warming crisis and its damaging consequences, but advocates
cost-effective alternatives to Kyoto, and cautions against warnings of
catastrophe unsupported by current science:
"In its 2007 report, the U.N. estimates that sea levels will rise
about a foot over the rest of the century. While this is not a trivial
amount, it is also important to realize that it is certainly not
outside historical experience. Since 1860, we have experienced a sea-level rise of about a foot, yet this has clearly not caused major
disruptions. ...
"Often, the risk of sea-level rise is strongly dramatized in the
public discourse. A cover story of U.S. News & World Report famously
predicted that 'global warming could cause droughts, disease, and
political upheaval' and other nasty effects, from pestilence and
famine to wars and refugee movement. We will return to these concerns
later, but their primary projection for sea-level rise was this: 'By
midcentury, the chic Art Deco hotels that now line Miami's South Beach
could stand waterlogged and abandoned.'
"Yet sea-level increase by 2050 will be about five inches - no more
than the change we have experienced since 1940 and less than the
change those Art Deco hotels have already stood through. Moreover,
with sea-level changes occurring slowly throughout the century,
economically rational foresight will make sure that protection will be
afforded to property that is worth more than the protection costs, and
settlement will be avoided where costs will outweigh benefits. The
Intergovernmental Panel on Climate Change (part of the U.N. World
Meteorological Organization) cites the total cost for U.S. national
protection and property abandonment for more than a three-foot sea-level rise (more than triple what is expected in 2100) at about $5
billion to $6 billion over the century. Considering that the adequate
protection costs for Miami would be just a tiny fraction of this cost
spread over the century, that the property value for Miami Beach in
2006 was close to $23 billion, and that the Art Deco National Historic
District is the second-largest tourist magnet in Florida after Disney
World, contributing more than $11 billion annually to the economy,
five inches will simply not leave Miami Beach hotels waterlogged and
abandoned.
"But this of course is exactly the opposite of what we often hear."
Bjorn Lomborg, Cool It, Vintage, Copyright 2007 by Bjorn Lomborg, pp.
60-61.
-------------------------------------------------------------------------------------------------In today's excerpt - new thoughts on the "clash of civilizations" from
Robert Wright, author of the highly influential books Nonzero and The
Moral Animal, in his new book The Evolution of God:
"It sounds paradoxical. On the one hand, I think gods arose as
illusions, and that the subsequent history of the idea of god is, in
some sense, the evolution of an illusion. On the other hand: (1) the
story of this evolution itself points to the existence of something
you can meaningfully call divinity; and (2) the 'illusion,' in the
course of evolving, has gotten streamlined in a way that moved it
closer to plausibility. In both of these senses, the illusion has
gotten less and less illusory.
"Does that make sense? Probably not. I hope it will by the end of the
book. For now I should just concede that the kind of god that remains
plausible, after all this streamlining, is not the kind of god that
most religious believers currently have in mind.
"There are two other things that I hope will make a new kind of sense
by the end of this book, and both are aspects of the current world
situation. One is what some people call a clash of civilizations - the
tension between the Judeo-Christian West and the Muslim world, as
conspicuously manifested on September 11, 2001. Ever since that day,
people have been wondering how, if at all, the world's Abrahamic
religions can get along with one another as globalization forces them
into closer and closer contact.
"Well, history is full of civilizations clashing, and for that matter,
of civilizations not clashing. And the story of the role played by
religious ideas - fanning the flames or dampening the flames, and
often changing in the process - is instructive. I think it tells us
what we can do to make the current 'clash' more likely to have a happy
ending.
"The second aspect of the current world situation I'll address is
another kind of clash - the much-discussed 'clash' between science and
religion. Like the first kind of clash, this one has a long and
instructive history. It can be traced at least as far back as ancient
Babylon, where eclipses that had long been attributed to restless and
malignant supernatural beings were suddenly found to occur at
predictable intervals - predictable enough to make you wonder whether
restless and malignant supernatural beings were really the problem.
"There have been many such unsettling (from religion's point of view)
discoveries since then, but always some notion of the divine has
survived the encounter with science. The notion has had to change, but
that's no indictment of religion. After all, science has changed
relentlessly, revising if not discarding old theories, and none of us
think of that as an indictment of science. On the contrary, we think
this ongoing adaptation is carrying science closer to the truth.
"Maybe the same thing is happening to religion. Maybe, in the end, a
mercilessly scientific account of our predicament ... is actually
compatible with a truly religious worldview, and is part of the
process that refines a religious worldview, moving it closer to truth.
These two big 'clash' questions can be put into one sentence: Can
religions in the modern world reconcile themselves to one another, and
can they reconcile themselves to science? I think their history points
to affirmative answers."
Robert Wright, The Evolution of God, Little, Brown and Company,
Copyright 2009 by Robert Wright, pp. 4-6.
-------------------------------------------------------------------------------------------------In today's excerpt - veteran prison counselor Sunny Schwartz embarks
on the task of bringing a highly innovative program called
'restorative justice' to the San Francisco County Prison system. Like
most prisons, these prisons - which house prisoners with records
fairly typical of jails throughout the country - are commonly referred
to by guards and others associated with them as "monster factories."
The United States has the world's highest incarceration rate:
"I had to make good on the promise of [introducing] constructive
programs, and my first big push went to getting a school in the jail.
I discovered that our neighbor to the north, just behind the razor-wire fence and a stand of trees, was Skyline Community College. I gave
them the hard sell, told them our population needed their classes more
than anyone else, and they'd said they would try. One of the first
things the college did was help us perform a survey of our
population's needs so I would know what kind of programs I should
start:
"75 percent [of the prisoners] were reading somewhere between the
fourth- and sixth-grade levels. 90 percent never had a legal job. 90
percent were self-identified addicts. 80 percent were self-identified
victims of sexual or physical violence as a child. 65 percent had been
placed in a special-education class at some point. 75 percent were
high school dropouts.
"It was dismal. If there was ever a set of numbers that spoke more
plainly to the need for some alternative to warehousing people, I
hadn't seen it. Even I was surprised that 80 percent said they had
been abused in the past, and I was stunned that 90 percent had never

had a legal job. These were incredible obstacles. ...
"[I learned of a program called] Restorative Justice. The name alone
piqued my interest. Nothing I'd seen in the criminal justice system
had ever been in the business of "restoring" anything. I'd seen crimes
committed, I'd seen people punished, lives and families ruined, but
never restoration. ... The three principles of restorative justice are
offender accountability, victim restoration and community involvement
to heal the harm caused by crime. ... The goal of restorative justice
was to heal the victims, for perpetrators to take responsibility for
their actions and make meaningful restitution and for governments and
communities to be part of the process. ...
"Most people, I think, believe that prison or jail should be a
horrible experience. People don't think of it as a deterrent so much
as just deserts. 'They' hurt 'us,' therefore 'we' should hurt 'them.'
For years, politicians have won elections by promising to take away
cable television and weight rooms and anything seen to make prison
cushy. We have a culture where jokes about prison rape are made out in
the open. The prevailing wisdom is that prisoners deserve to be
treated like animals; they should fear prison and suffer while they
are there. Anyone who has spent time working with prisoners knows this
has largely come to pass. What most people don't realize is the
consequences of making prisons a living nightmare. Most of the inmates
I'd worked with, particularly when I was a law intern, felt punished,
but not many of them took responsibility for their crimes, or felt any
remorse.
"Martin Aguerro, the pedophile, the first client I had when I started
in 1980, was a case in point. He complained about the squalid
treatment and living conditions in jail, he felt wronged, but I never
got the sense that he thought about his crimes. In fact, everything
about the system of prosecution and defense is set up so that
criminals get into a habit of denying their responsibility. Every step
of the way between the arrest and the trial, people accused of crimes
deny everything, or keep silent. It's what their defense attorneys
tell them to do. After their trial, if they're convicted, many don't
change their mind-set. Why should they? To truly confront what they've
done requires confronting the shame and fear and the reality of their
situation. Few people choose to do this, because it's difficult. After
all, it's hard for noncriminals to take responsibility for doing the
wrong thing, much less someone sitting in a prison cell. So criminals
blame someone or something else - the cop who caught them, or their
lousy upbringing - for their circumstances and spend their time
growing angrier and angrier about being treated like an animal. They
are usually full of rage when they are released, and less prepared to
function as citizens; the predictable products of the monster
factory."
Sunny Schwartz and David Boodell, Dreams from the Monster Factory: A
Tale of Prison, Redemption, and One Woman's Fight to Restore Justice
to All, Scribner, Copyright 2009 by Sunny Schwartz and David Boodell,
pp. 93-94, 126-127.
-------------------------------------------------------------------------------------------------In today's excerpt - controversial hints that the Old Testament god
Yahweh (Jehovah) had a female companion named Asherah, the Canaanite
goddess famed for her wisdom who was also known as Athirat. In the
Canaanite religion, or Levantine religion as a whole, El (the name
just means "god") was the supreme god, the father of humankind and all
creatures, and Asherah's husband. Archeologists have established that
there were numerous female idols among the carved idols in households
of the region, and some theorize that Asherah was worshipped in
ancient Israel as the consort of El and in Judah as the Queen of
Heaven and consort of Yahweh. The Hebrews baked small cakes for her
festival. Some scholars further theorize that El, the preeminent god
in northern Canaan, was eventually merged with Yahweh, the preeminent
god of southern Canaan, in a manner very characteristic of the
combinations and exchanges of gods that occurred regularly in ancient
religious history:
"One oft-claimed difference [by biblical scholars] between the
Abrahamic god and other gods in the vicinity is that whereas the pagan
gods had sex lives, Yahweh didn't. 'Israel's God,' as [biblical
scholar Yehezkel] Kaufmann put it, 'has no sexual qualities or
desires.' It's true that there's no biblical ode to Yahweh that
compares with the Ugaritic boast that Baal copulated with a heifer '77
times,' even '88 times,' or that El's penis 'extends like the sea.'
And it seems puzzling: If Yahweh eventually merged with El, and El had
a sex life, why didn't the postmerger Yahweh have one? Why, more
specifically, didn't Yahweh inherit El's consort, the goddess Athirat?
"Maybe he did. There are references in the Bible to a goddess named
Asherah, and scholars have long believed that Asherah is just the
Hebrew version of Athirat. Of course, the biblical writers didn't
depict Asherah as God's wife - this isn't the sort of theological
theme they generally championed - but rather heap disdain on her, and
on the Israelites who worshipped her. However, in the late twentieth
century, archaeologists discovered intriguing inscriptions, dating to
around 800 BCE, at two different Middle Eastern sites. The
inscriptions were blessings in the name not just of Yahweh but of 'his
Asherah.' The word 'his' puts an intriguing spin on a passage in 2
Kings reporting that, near the end of the seventh century, Asherah was
spending time in Yahweh's temple. A priest who didn't favor polytheism
'brought out the image of Asherah from the house of the Lord, outside
Jerusalem, to the Wadi Kidron, burned it at the Wadi Kidron, beat it
to dust and threw the dust of it upon the graves of the common
people.' (2 Kings 21:7, 23:4-6)"
Robert Wright, The Evolution of God, Little, Brown, Copyright 2009 by
Robert Wright, pp. 118-119.
-------------------------------------------------------------------------------------------------In today's excerpt - the curse of abundant oil resources in developing
countries. Developing countries without oil grow four times faster
than those with oil. Developing countries with oil are far more likely
to be militarized and devolve into civil war:
"[With its oil wealth], Venezuela began to import more and more and
produce less, a typical symptom of Dutch disease, where resource-rich
countries see other parts of their economies wither. (Venezuela
actually had Dutch disease before the Dutch, but that term wouldn't be
invented until the natural gas boom in the Netherlands in the 1960s
torpedoed the country's economy. The condition should be called the
Caracas cramp.)
"[After the discovery of oil in Venezuela in 1921], nobody paid taxes.
If you're an oil state, it's far more efficient to ask oil buyers for
more money than to collect taxes from your population, which requires
a vast network of tax collectors, a bureaucracy, laws that are fair,
and a justice system to administer them. Collecting oil money, by
contrast, requires a small cadre of intellectuals to set policy and
diplomats to make it happen. ... The political, economic, and
psychological ramifications of this ... are profound.
" 'Systematically the government went after oil money rather than
raising taxes,' says economist Francisco Monaldi. 'There is no
taxation and therefore no representation here. The state here is
extremely autonomous.' Whether it's a dictatorship, a democracy, or
something in between, the state's only patron is the oil industry, and
all of its attention is focused outward. What's more, the state owes
nothing more than promises to the people of Venezuela, because they
have so little leverage on the state's income.
"When a state develops the ability to collect taxes, the bureaucracy
and mechanisms it creates are expensive. They perpetuate their
existence by diligently collecting as much money as possible and
encouraging the growth of a private economy to collect taxes from. A
strong private economy, so the thinking goes, creates a strong civil
society, fostering other centers of power that keep the state in
check. Like other intellectuals I talk with in other oil states,
Monaldi finds taxes more interesting and more useful than abstract
ideas about democracy and ballot boxes. Taxes aren't democracy, but
they seem to connect taxpayers and government in a way that has
democratizing effects. Studies by Michael L. Ross at UCLA found that
taxes alone don't foster accountability, but the relationship of taxes
to government services creates a struggle for value between the state
and citizens, which is some kind of accountability. ...
"Abdoulaye Djonouma, president of Chad's Chamber of Commerce, says oil
brought about economic and agricultural collapse in Nigeria and Gabon.
For Chad, which has fewer resources, he fears worse: militarization.
He ticks off all the former French colonies that have become
militarized. Virtually all. (One study found that oil-exporting
countries spend between two and ten times more on their militaries
than other developing countries.) ...
"At Stanford, Terry Lynn Karl's analysis of Venezuela's economy during
the 1970s and '80s shows that countries whose economy is dominated by
oil exports tend to experience shrinking standards of living - something that Chad can hardly afford. Oil has opportunity costs: A
study by Jeffrey Sachs and Andres Warner showed that of ninety-seven
developing countries, those without oil grew four times as much as
those with oil. At UCLA, Michael L. Ross did regression studies
showing that governments that export oil tend to become less
democratic over time. At Oxford, Paul Collier's regression studies
show that oil- and mineral-exporting countries have a 23 percent
likelihood of civil war within five years, compared to less than 1
percent for nondependent countries."
Lisa Margonelli, Oil on the Brain, Nan A. Talese/Doubleday, Copyright
2007 by Lisa Margonelli, pp. 146-147, 174-176.
-------------------------------------------------------------------------------------------------In today's excerpt - Israel, like many other nations, had gone from
polytheism to monolatry - the worship of one god in preference to
others - primarily because kings could enhance their perceived power
if they were closely identified with the primary god and then all
other gods were subordinated. But what then brought about the next
decisive and epoch-making step: Israel's transition from monolatry to
true monotheism - the disappearance of other gods? Monotheism was the
innovation that then became the basis for the three great and
interrelated Abrahamic religions - Judaism, Christianity and Islam - that are embraced by half the world's population today. Some scholars
suggest that monotheism was born out of the terrifying, hideous trauma
of the Babylonian exile of the Israelites in 586 BCE:
"When King Zedekiah of Judah rebelled against the Babylonians, they
captured him, killed his sons before his eyes, plucked out those eyes,
then burned Yahweh's (Jehovah's) temple to the ground. And they
completed a process they'd started years earlier, the transfer of
Israel's upper classes to Babylon. Now, as of 586 BCE, the Babylonian
exile - the most famous trauma in the story of ancient Israel - was in
full swing. No doubt the Babylonians, following theological
conventions of the day, took all this to signify Yahweh's humiliation
at the hands of their national god, Marduk. ...
"The retributive impulse is universally human, almost certainly
grounded in the genes of our species. And it is deeply, often hotly,
felt. But, however laden with emotion, it has an intrinsic logic, and
in terms of this logic Israel's monotheism makes sense. The core of
the logic is, as the Bible puts it, an eye for an eye, a tooth for
a tooth; punishment is proportional to the original transgression. And
what was the magnitude of the transgression that Israel's exiles had
suffered? The Babylonians hadn't just conquered their land and
belittled their god. They had removed them from their land and,
ostensibly, killed their god. Whereas [previously] Assyria had
stripped Jerusalem's temple of its treasures, the Babylonians had
destroyed the temple itself. And a god's temple was, in the ancient
Middle East, literally the god's home.
"The ultimate transgression calls for the ultimate punishment. An apt
response when a people kills your god is to kill theirs - to define it
out of existence. And if other nations' gods no longer exist, and if
you've already decided that Yahweh is the only god in your nation,
then you've just segued from monolatry to monotheism.
"This isn't to say that monotheism followed from retributive logic as
rigorously as four follows from two plus two. ... [Yet] there is a
sense of humiliation so massive that counterbalancing it would require
Yahweh's elevation to unprecedented heights - which meant the demotion
of the world's other gods to unprecedented depths, perilously near the
subsistence level. Monotheism was, among other things, the ultimate
revenge."
Robert Wright, The Evolution of God, Little, Brown, Copyright 2009 by
Robert Wright, pp. 165, 177-178.
-------------------------------------------------------------------------------------------------In today's excerpt - the risk inherent in positive emotions:
observations from the psychiatrist George Vaillant, who has long been
the chief curator of the Harvard Study of Adult Development, one of
the longest-running - and probably the most exhaustive - longitudinal
studies of mental and physical well-being in history. Begun in 1937 as
a study of healthy, well-adjusted Harvard sophomores (all male), it
has followed its subjects for more than 70 years:
"As Freud was displaced by biological psychiatry and cognitive
psychology - and the massive data sets and double-blind trials that
became the industry standard - Vaillant's work risked obsolescence.
But in the late 1990s, a tide called 'positive psychology' came in,
and lifted his boat. Driven by a savvy, brilliant psychologist at the
University of Pennsylvania named Martin Seligman, the movement to
create a scientific study of the good life has spread wildly through
academia and popular culture (dozens of books, a cover story in Time,
attention from Oprah, etc.).
"Vaillant became a kind of godfather to the field, and a champion of
its message that psychology can improve ordinary lives, not just treat
disease. But in many ways, his role in the movement is as provocateur.
Last October, I watched him give a lecture to Seligman's graduate
students on the power of positive emotions - awe, love, compassion,
gratitude, forgiveness, joy, hope, and trust (or faith). 'The
happiness books say, 'Try happiness. You'll like it a lot more than
misery' - which is perfectly true,' he told them. But why, he asked,
do people tell psychologists they'd cross the street to avoid someone
who had given them a compliment the previous day?
"In fact, Vaillant went on, positive emotions make us more vulnerable
than negative ones. One reason is that they're future-oriented. Fear
and sadness have immediate payoffs - protecting us from attack or
attracting resources at times of distress. Gratitude and joy, over
time, will yield better health and deeper connections - but in the
short term actually put us at risk. That's because, while negative
emotions tend to be insulating, positive emotions expose us to the
common elements of rejection and heartbreak.
"To illustrate his point, he told a story about one of his
'prize' [Harvard] Study men, a doctor and well-loved husband. 'On his
70th birthday,' Vaillant said, 'when he retired from the faculty of
medicine, his wife got hold of his patient list and secretly wrote to
many of his longest-running patients, 'Would you write a letter of
appreciation?' And back came 100 single-spaced, desperately loving
letters - often with pictures attached. And she put them in a lovely
presentation box covered with Thai silk, and gave it to him.' Eight
years later, Vaillant interviewed the man, who proudly pulled the box
down from his shelf. 'George, I don't know what you're going to make
of this,' the man said, as he began to cry, 'but I've never read it.'
'It's very hard,' Vaillant said, 'for most of us to tolerate being
loved.' "
Joshua Wolf Shenk, "What Makes Us Happy?" The Atlantic, June 2009, pp.
47-48.
-------------------------------------------------------------------------------------------------In today's encore excerpt - in the early 20th century, with American
industry just beginning to expand overseas, and with Latin America
still just emerging from its colonial shackles, bananas became one of
America's first powerhouse industries:
"Bananas are the world's largest fruit crop and the fourth-largest
product grown overall, after wheat, rice and corn. ... In Central
America, [American banana companies] built and toppled nations: a
struggle to control the banana crop led to the overthrow of
Guatemala's first democratically elected government in the 1950s,
which in turn gave birth to the Mayan genocide of the 1980s. In the
1960s, banana companies - trying to regain plantations nationalized by
Fidel Castro - allowed the CIA to use their freighters as part of the
failed Bay of Pigs invasion of Cuba. ... Eli Black, the chairman of
Chiquita, threw himself out of the window of a Manhattan skyscraper in
1974 after his company's political machinations were exposed. ...
"On August 12, 1898, Spain surrendered [Cuba as a result of their loss
to America in the Spanish-American War], and the United States gained
control over the island, opening a naval base at Guantanamo Bay. Over
the next thirty-five years, the U.S. military intervened in Latin
America twenty-eight times: in Mexico, in Haiti, the Dominican
Republic, and Cuba in the Caribbean; and in Panama, Honduras,
Nicaragua, Guatemala, Costa Rica, and El Salvador in Central America.
The biggest consequence of those incursions was to make the region
safe for bananas. One of the first businesses to enter Cuba was United
Fruit. The banana and sugar plantations it established would
eventually encompass 300,000 acres. An 1899 article in the Los Angeles
Times described Latin America as 'Uncle Sam's New Fruit Garden,'
offering readers insight into 'How bananas, pineapples, and cocoanuts
can be turned into fortunes.' ...
"[But the U.S.] public knew little about events like the 1912 U.S.
invasion of Honduras, which granted United Fruit broad rights to build
railroads and grow bananas in the country. They weren't aware that in
1918 alone, U.S. military forces put down banana workers' strikes in
Panama, Colombia, and Guatemala. For every direct intervention, there
were two or three softer ones, accomplished by proxy through local
armies and police forces controlled by friendly governments. One of
the few observers to take note of the situation was Count Vay de Vaya
of Hungary, who ... upon returning from a visit to Latin America,
described the banana as 'a weapon of conquest.' "
Dan Koeppel, Banana, Hudson Street, Copyright 2008 by Dan Koeppel, pp.
xiii-xiv, 63-64.
-------------------------------------------------------------------------------------------------In today's excerpt - our ignorance regarding mushrooms:
"We don't know the most basic things about mushrooms.
"Part of the problem is simply that fungi are very difficult to
observe. What we call a mushroom is only the tip of the iceberg of a
much bigger and essentially invisible organism that lives most of its
life underground. The mushroom is the 'fruiting body' of a
subterranean network of microscopic hyphae, improbably long rootlike
cells that thread themselves through the soil like neurons. Bunched
like cables, the hyphae form webs of (still microscopic) mycelium.
Mycologists can't dig up a mushroom like a plant to study its
structure because its mycelium is too tiny and delicate to tease from
the soil without disintegrating. ... To see the whole organism of
which [the mushroom] is merely a component may simply be impossible.
Fungi also lack the comprehensible syntax of plants, the orderly and
visible chronology of seed and vegetative growth, flower, fruit, and
seed again. The fungi surely have a syntax of their own, but we don't
know all its rules, especially the ones that govern the creation of a
mushroom, which can take three years or thirty, depending. On what? We
don't really know. ...
"Fungi, lacking chlorophyll, differ from plants in that they can't
manufacture food energy from the sun. Like animals, they feed on
organic matter made by plants, or by plant eaters. Most of the fungi
we eat obtain their energy by one of two means: saprophytically, by
decomposing dead vegetable matter, and mycorrhizally [like
chanterelles and morels], by associating with the roots of living
plants. Among the saprophytes, many of which can be cultivated by
inoculating a suitable mass of dead organic matter (logs, manure,
grain) with their spores, are the common white button mushrooms,
shiitakes, cremini, Portobellos, and oyster mushrooms. Most of the
choicest wild mushrooms are impossible to cultivate, or nearly so,
since they need living and often very old trees in order to grow, and
can take several decades to fruit. The mycelium can grow more or less
indefinitely, in some cases for centuries, without necessarily
fruiting. A single fungus recently found in Michigan covers an area of
forty acres underground and is thought to be a few centuries old. So
inoculating old oaks or pines is no guarantee of harvesting future
mushrooms, at least not on a human time scale. Presumably, these fungi
live and die on an arboreal time scale.
"Mycorrhizal fungi have coevolved with trees, with whom they've worked
out a mutually beneficial relationship in which they trade the
products of their very different metabolisms. If the special genius of
plants is photosynthesis, the ability of chlorophyll to transform
sunlight and water and soil minerals into carbohydrates, the special
genius of fungi is the ability to break down organic molecules and
minerals into simple molecules and atoms through the action of their
powerful enzymes. The hyphae surround or penetrate the plant's roots,
providing them with a steady diet of elements in exchange for a drop
of simple sugars that the plant synthesizes in its leaves. The network
of hyphae vastly extends the effective reach and surface area of a
plant's root system, and while trees can survive without their fungal
associates, they seldom thrive. It is thought that the fungi may also
protect their plant hosts from bacterial and fungal diseases.

"The talent of fungi for decomposing and recycling organic matter is


what makes them indispensable, not only to trees but to all life on
earth."
Michael Pollan, The Omnivore's Dilemma, Penguin, Copyright 2006 by
Michael Pollan, pp. 374-375.
-------------------------------------------------------------------------------------------------In today's excerpt - the concept of a nation, in the modern sense of a
place which is a primary source of identity for its inhabitants, is a
very recent phenomenon:
"Of the many ways in which we can define ourselves, the nation, at
least for the last two centuries, has been one of the most enticing.
The idea that we are part of a very large family, or, in Benedict
Anderson's words, an imagined community, has been as powerful a force
as Fascism or Communism. Nationalism brought Germany and Italy into
being, destroyed Austria-Hungary, and, more recently, broke apart
Yugoslavia. People have suffered and died, and have harmed and killed
others, for their 'nation.'
"History provides much of the fuel for nationalism. It creates the
collective memories that help to bring the nation into being. The
shared celebration of the nation's great achievements - and the shared
sorrow at its defeats - sustain and foster it. The further back the
history appears to go, the more solid and enduring the nation seems - and the worthier its claims. Ernest Renan, the nineteenth-century
French thinker who wrote an early classic on nationalism, dismissed
all the other justifications for the existence of nations, such as
blood, geography, language or religion. 'A nation', he wrote, 'is a
great solidarity created by the sentiment of the sacrifices which have
been made and those which one is disposed to make in the future.' As
one of his critics preferred to put it, 'A nation is a group of people
united by a mistaken view about the past and a hatred of their
neighbours.' Renan saw the nation as something that depended on the
assent of its members. 'The existence of a nation is a plebiscite of
every day, as the existence of an individual is a perpetual
affirmation of life.' For many nationalists, there is no such thing as
voluntary assent; you were born into a nation and had no choice about
whether or not you belonged, even if history had intervened. When
France claimed the Rhineland after World War I, one of the arguments
it used was that, even though they spoke German, its inhabitants were
really French. Although ill fortune had allowed them to fall under
German rule, they had remained French in essence, as their love of
wine, their Catholicism, and their joie de vivre so clearly
demonstrated.

"Renan was trying to grapple with a new phenomenon because nationalism


is a very late development indeed in terms of human history. For many
centuries, most Europeans thought of themselves not as British (or
English or Scottish or Welsh), French, or German but as members of a
particular family, clan, region, religion or guild. Sometimes they
defined themselves in terms of their overlords, whether local barons
or emperors. When they did define themselves as German or French, it
was as much a cultural category as a political one, and they certainly
did not assume, as modern national movements almost always do, that
nations had a right to rule themselves on a specific piece of
territory.
"Those older ways of defining oneself persisted well into the modern
age. When commissions from the League of Nations tried to determine
borders after World War I in the center of Europe, they repeatedly
came upon locals who had no idea whether they were Czechs or Slovaks,
Lithuanians or Poles. We are Catholic or Orthodox, came the answers,
merchants or farmers, or simply people of this village or that. Danilo
Dolci, the Italian sociologist and activist, was astonished to find in
the 1950s that there were people living in the interior of Sicily who
had never heard of Italy, even though, in theory, they had been
Italians for several generations. They were the anomalies, though,
left behind as nationalism increasingly became the way in which
Europeans defined themselves. Rapid communications, growing literacy,
urbanization, and above all the idea that it was right and proper to
see oneself as part of a nation, and a nation, moreover, which ought
to have its own state on its own territory, all fed into the great
wave of nationalism which shook Europe in the nineteenth century and
the wider world in the twentieth."
Margaret MacMillan, The Uses and Abuses of History, Profile, Copyright
2008, 2009 by Margaret MacMillan, pp. 81-83.
-------------------------------------------------------------------------------------------------In today's excerpt - the American Revolution, population trends, and holes. In
the 1770s, Great Britain's population was roughly nine million people,
and the combined population of the thirteen American colonies was
almost three million and growing, diminishing the likelihood that
Britain could keep the Americans subjugated in colonies. The
population gap between Britain and America was closing rapidly - America was experiencing far higher rates of immigration, and at seven
children per family, American birthrates significantly exceeded
Britain's, leading Ben Franklin to say after one battle that
"Britain ... has killed 150 Yankees this campaign. ... During the same
time 60,000 children have been born in America":
"Are there lessons for America in its own revolution that can be
applied to Iraq and Afghanistan? In a Dissent symposium on exit
strategies, historian Stanley Weintraub, author of Iron Tears:
America's Battle for Freedom, Britain's Quagmire: 1775-1783 (2005),
says he has found a few.
"In the 1760s, Britain, fresh from defeating the French in North
America, saw 'only profit and prestige ahead' in its colonies. First,
though, it was deemed necessary to rebuild the British economy, which
had been pinched by fighting a seven-year war 3,000 miles from home.
To Parliament, it made perfect sense to tax those who had benefited
most from the war. But the colonists saw things differently, objecting
to their lack of representation in Parliament, among many other
grievances. British observers, such as Samuel Johnson, grumbled that
the colonists were no less politically excluded than inhabitants of
some of the teeming districts of London. Americans, he said, were 'a
race of convicts, and ought to be thankful for anything we allow them
short of hanging.'
"Johnson's contempt was matched by that of many royal supremacists. In
the 1770s, few of them recognized that 'the sprawling overseas
colonies, more than 1,800 miles north to south, would become more
populous than the mother country and would be impossible to subdue.'
Yet as early as 1775, when hostilities broke out, Benjamin Franklin
was able to do the math. 'Britain, at the expence of three millions,
has killed 150 Yankees this campaign, which is £20,000 a head. ...
During the same time 60,000 children have been born in America.' It
was easy enough to 'calculate the time and expense necessary to kill
us all, and conquer our whole territory.' The Yankee war effort didn't
have to be brilliant, just protracted.
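A quick worked check of Franklin's arithmetic, using only the figures quoted above and assuming his 'three millions' are pounds sterling:

\[
\frac{\pounds 3{,}000{,}000}{150 \text{ killed}} = \pounds 20{,}000 \text{ per head},
\qquad
\frac{60{,}000 \text{ born}}{150 \text{ killed}} = 400 \text{ births per casualty}
\]

At that rate of exchange, a war of attrition could only run in the colonies' favor.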
"King George III helped matters by dispatching a series of disastrous
commanders, 'ambitious careerists, with promotions, titles, and
parliamentary gratuities dancing in their heads.' As these commanders
of noble birth fumbled in the colonies, a London newspaper jeeringly
remarked on the contrast with the rebel generals: 'a boat builder, a
servant, a milkman, a jockey, a clerk.'
"In 1776, London dispatched an armada to take New York City and Long
Island that would not be surpassed in numbers until D-Day. But the
British never managed to wipe out the rebels, while attrition
gradually sapped the redcoats' ranks and spirit. Parliament took to
hiring Hessians and other mercenaries. By February 1781, almost six
years after the first shot was fired, a member of the House of Commons
moved to end 'this mad war,' but the measure failed by a single vote.
The game finally ended at Yorktown in October, with French
intervention tipping the balance: 'Third forces are often crucial,'
Weintraub notes.
"The real failing of the British, he writes, is that they 'had no exit
strategy other than victory.' Only after defeat did King George III
recognize 'the first rule of holes: When you realize you're in one,
stop digging.' A lesson learned, but seldom followed. Weintraub adds,
'Future governments would pour vast resources into subjugating, yet
failing to assimilate ... the subcontinent of India' as well as large
parts of Africa, 'all at staggering cost to the home islands. It was
always foolhardy to be tempted to stay, and always too late to get
out.' "
Stanley Weintraub, "The First Rule of Holes," The Wilson Quarterly,
Summer 2009, pp. 79-80, Source: "The American Colonies" by Stanley
Weintraub, in Dissent, Winter 2009.
-------------------------------------------------------------------------------------------------In today's excerpt - with the productivity gains of the Industrial
Revolution, the percentage of the population required to feed the world
dropped from up to 80% to 2%, allowing the world to move from
villages to cities:
"Most Malthusian (pre-Industrial Revolution) economies had 70 or even
80 percent of the population employed in agriculture. By 1861 that
share had dropped to 21 percent in England. But that switch to
industry, as we shall see, was due to the idiosyncrasies of England's
geography and demography. There is, in fact, nothing inherently
industrial about the Industrial Revolution. Since 1800 the
productivity of agriculture has increased by as much as that of the
rest of the economy, and without these gains in agriculture modern
growth would have been impossible. We have to resign ourselves to the
fact that one of the defining events in human history has been
mislabeled. ...
"Material well-being has marched upward in successful economies since
the Industrial Revolution to levels no one in 1800 could have
imagined. After six hundred years of stasis, income has increased
nearly tenfold since 1800. ...
"As income marched upward the share of farm products in consumption
trended downward, and the share of
farmers among producers declined in step. In preindustrial economies
farmers made up 50 - 80 percent of the population. Today, if we had a
free market in food, 2 percent of the population could feed everyone.
The farm population share in the United States, for example, is 2.1
percent. Half of these people are kept in farming by government
subsidies that futilely try to stem the inexorable exodus from the
land and from rural communities. A mountain of European Union
subsidies keeps 3.3 percent of the French in their beloved campagne.
The less sentimental British, with a more efficient agriculture,
employ only 1.2 percent of the population in farming. The Industrial
Revolution looks peculiarly industrial largely because of the switch
of population and production out of agriculture and into industry
thanks to higher incomes.

"The switch of labor out of agriculture has profoundly affected social


life. In Malthusian societies most of the population lived in small
rural settlements of a few hundred souls. They had to be close to the
daily grind of their work in the fields, since they walked to work. In
the southeast of England, for example, villages in the eighteenth
century were on average only two miles apart. Typically they had fewer
than a hundred residents. The countryside was densely settled because
of all the labor required in inefficient preindustrial agriculture:
plowing, reaping, threshing, hauling manure, tending animals.
"With an ever-dwindling proportion of the population tied to the land
through agriculture, modern populations are footloose. People can
locate anywhere, but they have concentrated increasingly in urban
centers because of the richer labor market and the social amenities
they offer."
Gregory Clark, A Farewell to Alms: A Brief Economic History of the
World, Princeton University Press, Copyright 2007 by Princeton
University Press, Kindle Loc. 3439-86
-------------------------------------------------------------------------------------------------In today's excerpt - the Inka (Inca) empire, a now lost empire that
was located in the Andean region on the west coast of South America,
and Inkan royal mummies:
"In 1491 the Inka ruled the greatest empire on earth. Bigger than Ming
Dynasty China, bigger than Ivan the Great's expanding Russia, bigger
than Songhay in the Sahel or powerful Great Zimbabwe in the East
African tablelands, bigger than the cresting Ottoman Empire, bigger
than the Triple Alliance (as the Aztec empire is more precisely
known), bigger by far than any European state, the Inka dominion
extended over a staggering thirty-two degrees of latitude - as if a
single power held sway from St. Petersburg to Cairo. ...
"The Inka empire, the greatest state ever seen in the Andes, was also
the shortest lived. It began in the fifteenth century and lasted
barely a hundred years before being smashed by Spain. ...
"The Inka sovereign had the title of 'Inka"- he was the Inka - but he
could also include 'Inka' in his name. In addition, Inka elites
changed their names as they went through their lives. Each Inka was
thus known by several names, any of which might include 'Inka.' ...
"People in Andean societies viewed themselves as belonging to family
lineages. (Europeans did, too, but lineages were more important in the
Andes; the pop-cultural comparison might be The Lord of the Rings, in
which characters introduce themselves as 'X, son of Y' or 'A, of B's
line.') Royal lineages, called panaqa, were special. Each new emperor
was born in one panaqa but created a new one when he took the fringe.
To the new panaqa belonged the Inka and his wives and children, along
with his retainers and advisers. When the Inka died his panaqa
mummified his body. Because the Inka was believed to be an immortal
deity, his mummy was treated, logically enough, as if it were still
living. Soon after arriving in Qosqo (the Empire's capital), Pizarro's
companion Miguel de Estete saw a parade of defunct emperors. (Pizarro
was the Spaniard who conquered the Inkans). They were brought out on
litters, 'seated on their thrones and surrounded by pages and women
with flywhisks in their hands, who ministered to them with as much
respect as if they had been alive.'
"Because the royal mummies were not considered dead, their successors
obviously could not inherit their wealth. Each Inka's panaqa retained
all of his possessions forever, including his palaces, residences, and
shrines; all of his remaining clothes, eating utensils, fingernail
parings, and hair clippings; and the tribute from the land he had
conquered. In consequence, as Pedro Pizarro realized, 'the greater
part of the people, treasure, expenses, and vices were under the
control of the dead.' The mummies spoke through female mediums who
represented the panaqa's surviving courtiers or their descendants.
With almost a dozen immortal emperors jostling for position, high-level Inka society was characterized by ramose political intrigue of a
scale that would have delighted the Medici. Emblematically, (new
emperor) Wayna Qhapaq could not construct his own villa in [his
country] - his undead ancestors had used up all the available space.
Inka society had a serious mummy problem."
Charles C. Mann, 1491, Vintage, Copyright 2005, 2006 by Charles C.
Mann, pp. 71, 75, 98-99.
-------------------------------------------------------------------------------------------------In today's excerpt - opium, and the British business of trading opium
for Chinese tea. The British were buying so much tea from China that
they had a balance-of-payments crisis. The solution was ingenious -
grow opium in India and smuggle it into China for sale:
"For several thousand years, humans have dried the juice of the common
poppy, Papaver somniferum, into opium. As with many modern crops, the
poppy is a cultivar, that is, a cultivated variety that does not grow
easily in the wild, which suggests that agricultural societies take
their drugs as seriously as their food.
"Humans probably first extracted opium for consumption in southern
Europe, and the Greeks and Romans used it extensively. Arab traders
transplanted the poppy to the more hospitable soils and climates of
Persia and India, and then to China, where its use is recorded as
early as the eighth century after Christ.

"For almost all of recorded history, no particular opprobrium was


attached to consuming opium as a painkiller, relaxant, work aid, and
social lubricant. The Dutch in Indonesia were the first to smoke
opium, in the early 1600s, when they began adding a few grains to a
recent New World import, tobacco. The Chinese probably acquired the
practice from the Dutch base in Formosa, whence the opium pipe rapidly
spread to the mainland. As early as 1512, [the Portuguese apothecary]
Tome Pires observed opium commerce in Malacca (south of Singapore),
centuries before the British and Dutch became involved in the trade.
This indicates that the drug was a high-volume item in Indian Ocean
emporium commerce well before the English came to dominate it.
"Nineteenth-century Europeans swallowed enormous amounts of opium,
whereas the Chinese smoked theirs. Since inhaled opium is more
addictive than opium taken orally, it was considered much more
dangerous in China than in the nations of the West. In England,
horticultural organizations awarded prizes for particularly potent
domestically grown poppies (although most opium used in Britain came
from Turkey), and opium was consumed guiltlessly by both high and low,
most famously by Samuel Taylor Coleridge ('Kubla Khan'), Thomas de
Quincey (Confessions of an Opium Eater), and Arthur Conan Doyle's
Sherlock Holmes. The drug could be purchased freely in England until
the Pharmacy Act of 1868; other Western nations did not restrict its
use until around 1900. ...
"At the end of the eighteenth century ... [Britain's East India
Company (EIC)] sold its high-end branded opium to private traders, who
shipped it to China's mountainous Pearl River estuary island of
Lintin, where they based themselves on easily defended floating hulks
just off its shore. Local smugglers brought the contraband upriver and
slipped it past Canton's customs inspectors. The smugglers paid
Chinese silver to the English private traders, who then exchanged it
at EIC offices for silver bills drawable by them on Company accounts
in Calcutta and London. The EIC, in turn, used the silver obtained
from the private traders to pay for tea.
"The popular image of an entire Chinese population and its economy
ravaged by opium is a misconception. In the first place, the drug was
quite expensive and largely the province of the mandarin and merchant
elite classes. Second, like alcohol, it was catastrophically addictive
in only a small proportion of its users. Even the infamous opium dens
did not live up to their seedy reputation. ...
"Academic research [bears out that] opium was largely a social drug
that harmed only a tiny percentage of users. One modern scholar
estimates that although as many as half of men and one-fourth of women
were occasional users, in 1879 only about one Chinese person in a
hundred inhaled enough opium to even be at risk of addiction. ...

"The Chinese emperor and the mandarins did express some moral outrage
over the debilitation caused by opium, but they were far more
concerned about the drug's damage to their balance of trade. China
subscribed to European-style mercantilism as faithfully as any
seventeenth-century Western monarchy. Before 1800, the tea trade was,
at least in the terms of the mercantilist ideology of the day, grossly
favorable to the Chinese. The EIC's records pinpoint 1806 as the year
when the silver flow reversed. After that date, the value of opium
imports exceeded that of tea exports, and Chinese silver began flowing
out of the Celestial Kingdom for the first time. After 1818, silver
constituted fully one-fifth of the value of Chinese export goods."
William J. Bernstein, A Splendid Exchange: How Trade Shaped the World,
Atlantic Monthly Press, Copyright 2008 by William J. Bernstein, Kindle
Loc. 3616-47.
-------------------------------------------------------------------------------------------------In today's excerpt - dopamine, pleasure, and too much pleasure:
"The importance of dopamine was discovered by accident. In 1954, James
Olds and Peter Milner, two neuroscientists at McGill University,
decided to implant an electrode deep into the center of a rat's brain.
The precise placement of the electrode was largely happenstance; at
the time, the geography of the mind remained a mystery. But Olds and
Milner got lucky. They inserted the needle right next to the nucleus
accumbens (NAcc), a part of the brain that generates pleasurable
feelings. Whenever you eat a piece of chocolate cake, or listen to a
favorite pop song, or watch your favorite team win the World Series,
it is your NAcc that helps you feel so happy.
"But Olds and Milner quickly discovered that too much pleasure can be
fatal. They placed the electrodes in several rodents' brains and then
ran a small current into each wire, making the NAccs continually
excited. The scientists noticed that the rodents lost interest in
everything. They stopped eating and drinking. All courtship behavior
ceased. The rats would just huddle in the corners of their cages,
transfixed by their bliss. Within days, all of the animals had
perished. They died of thirst.
"It took several decades of painstaking research, but neuroscientists
eventually discovered that the rats had been suffering from an excess
of dopamine. The stimulation of the NAcc triggered a massive release
of the neurotransmitter, which overwhelmed the rodents with ecstasy.
In humans, addictive drugs work the same way: a crack addict who has
just gotten a fix is no different than a rat in an electrical rapture.
The brains of both creatures have been blinded by pleasure. This,
then, became the dopaminergic cliche; it was the chemical explanation
for sex, drugs, and rock and roll.

"But happiness isn't the only feeling that dopamine produces.


Scientists now know that this neurotransmitter helps to regulate all
of our emotions, from the first stirrings of love to the most visceral
forms of disgust. It is the common neural currency of the mind, the
molecule that helps us decide among alternatives. By looking at how
dopamine works inside the brain, we can see why feelings are capable
of providing deep insights. While Plato disparaged emotions as
irrational and untrustworthy - the wild horses of the soul - they
actually reflect an enormous amount of invisible analysis."
Jonah Lehrer, How We Decide, Houghton Mifflin Harcourt, Copyright 2009
by Jonah Lehrer, Kindle Loc. 463-538.
-------------------------------------------------------------------------------------------------In today's excerpt - famous writers and their odd ways of writing:
"Dame Edith Sitwell used to lie in an open coffin for a while before
she began her day's writing. When I mentioned this macabre bit of
gossip to a poet friend, he said acidly, 'If only someone had thought
to shut it.' ...
"Sitwell's coffin trick may sound like a prank, unless you look at how
other writers have gone about courting their muses. ... For example,
the poet Schiller used to keep rotten apples under the lid of his desk
and inhale their pungent bouquet when he needed to find the right
word. Then he would close the drawer, but the fragrance remained in
his head. ...
"Amy Lowell, like George Sand, liked to smoke cigars while writing,
and went so far in 1915 as to buy 10,000 of her favorite Manila
stogies to make sure she could keep her creative fires kindled. ...
Balzac drank more than 50 cups of coffee a day, and actually died from
caffeine poisoning, although colossal amounts of caffeine don't seem
to have bothered W. H. Auden or Dr. Johnson, who was reported to have
drunk 25 cups of tea at one sitting. Victor Hugo, Benjamin Franklin
and many others felt that they did their best work if they wrote while
they were nude. ...
"Colette used to begin her day's writing by first picking fleas from
her cat, and it's not hard to imagine how the methodical stroking and
probing into fur might have focused such a voluptuary's mind. After
all, this was a woman who could never travel light, but insisted on
taking a hamper of such essentials as chocolate, cheese, meats,
flowers and a baguette whenever she made even brief sorties. ...
"Alfred de Musset, George Sand's lover, confided that it piqued him
when she went directly from lovemaking to her writing desk, as she
often did. But surely that was not so direct as Voltaire's actually
using his lover's naked back as a writing desk. Robert Louis
Stevenson, Mark Twain and Truman Capote all used to lie down when they
wrote, with Capote going so far as to declare himself 'a completely
horizontal writer.' ...
"Benjamin Franklin, Edmond Rostand and others wrote while soaking in a
bathtub. In fact, Franklin brought the first bathtub to the United
States in the 1780's, and he loved a good, long, thoughtful
submersion. In water and ideas, I mean. ...
"The Romantics, of course, were fond of opium, and Coleridge freely
admitted to indulging in two grains of it before working. The list of
writers triggered to inspirational highs by alcohol would occupy a
small, damp book. T. S. Eliot's tonic was viral - he preferred writing
when he had a head cold. The rustling of his head, as if full of
petticoats, shattered the usual logical links between things and
allowed his mind to roam."
Diane Ackerman, "O Muse! You Do Make Things Difficult!" The New York
Times, Sunday, November 12, 1989, Section 7, Page 1.
-------------------------------------------------------------------------------------------------In today's excerpt - Dr. Frankenstein's creation, called simply the
Creature in Mary Shelley's groundbreaking 1818 novel, Frankenstein. In
the early 1800's, at the dawn of science as a profession, some
scientists had begun to believe that electricity itself was the life
force that animated the spirit in humans. From this idea, Mary Shelley
crafted the world's first science fiction novel. In contrast to the
mindless grunts of present-day "Frankensteins", the original Creature
was the most articulate character in Shelley's novel - and yearned for
love:
"Many scientific men of the day [were] entranced by the potentialities
of the voltaic battery, and its possible connections with 'animal
magnetism' and human animation. Electricity in a sense became a
metaphor for life itself. ... The most singular literary response to
[these ideas], called Vitalism ... was Mary Shelley's cult novel
Frankenstein, or The Modern Prometheus (1818). In this story ... a
sort of human life is physically created, or rather reconstructed. But
the soul or spirit is irretrievably damaged. ...
"As her novel developed, Mary Shelley began to ask in what sense
Frankenstein's new 'Creature' would be human. Would it have language,
would it have a moral conscience, would it have human feelings and
sympathies, would it have a soul? (It should not be forgotten that
Mary was pregnant with her own baby in 1817.) ...

"[Dr.] Frankenstein's Creature has been constructed as a fully


developed man, from adult body parts, but his mind is that of a
totally undeveloped infant. He has no memory, no language, no
conscience. He starts life as virtually a wild animal, an orangutan or
an ape. ...
"Almost his first conscious act of recognition, when he has escaped
the laboratory into the wood at night, is his sighting of the moon, an
object that fills him with wonder, although he has no name for it.
[The Creature himself then narrates:] 'I started up and beheld a
radiant form rise from among the trees. I gazed with a kind of wonder.
It moved slowly, but it enlightened my path ... It was still cold ...
No distinct ideas occupied my mind; all was confused. I felt light,
and hunger, and thirst, and darkness; innumerable sounds rung in my
ears and on all sides various scents saluted me. ... My mind received,
every day, additional ideas.'
"From this moment the Creature evolves rapidly through all the
primitive stages of man. Mary's account is almost anthropological,
reminiscent of [Sir Joseph] Banks's account of the Tahitians. First he
learns to use fire, to cook, to read. Then he studies European history
and civilization, through the works of Plutarch, Milton and Goethe.
Secretly listening to the cottagers in the woods, he learns conceptual
ideas such as warfare, slavery, tyranny. His conscience is aroused,
and his sense of justice. But above all, he discovers the need for
companionship, sympathy and affection. And this is the one thing he
cannot find, because he is so monstrously ugly: 'The cold stars shone
in their mockery, and the bare trees waved their branches above me,
the sweet voice of a bird burst forth amidst the universal stillness.
All, save I, were at rest ... I, like the arch-fiend, bore a hell
within me, and finding myself unsympathized with, wished to tear up
the trees, spread havoc and destruction around me, and then to have
sat down and enjoyed the ruin.'
"On the bleak Mer de Glace glacier in the French Alps, the Creature
appeals to his creator [Dr.] Frankenstein for sympathy, and for love.
'I am malicious because I am miserable. Am I not shunned and hated by
all mankind? You, my creator, would not call it murder, if you could
precipitate me into one of those ice-rifts ... Oh! My creator, make me
happy! Let me feel gratitude towards you for one benefit! Let me see
that I excite the sympathy of one existing thing. Do not deny me my
request!'
"His terrible corrosive and destructive solitude becomes the central
theme of the second part of Mary Shelley's novel. Goaded by his
misery, the Creature kills and destroys. Yet he also tries to take
stock of his own violent actions and contradictory emotions. He
concludes that his one hope of happiness lies in sexual companionship.
The scene on the Mer de Glace in which he begs Frankenstein to create
a wife for him is central to his search for human identity and
happiness. The clear implication is that a fully human 'soul' can only
be created through friendship and love."
Richard Holmes, The Age of Wonder: How the Romantic Generation
Discovered the Beauty and Terror of Science, Pantheon, Copyright 2008
by Richard Holmes, Kindle Loc. 6789-6800, 7112-54, 7221-60.
-------------------------------------------------------------------------------------------------In today's excerpt - pirates and anarchy:
"From countless films and books we all know that, historically,
pirates were criminally insane, traitorous thieves, torturers and
terrorists. Anarchy was the rule, and the rule of law was nonexistent.
"Not so, dissents George Mason University economist Peter T. Leeson in
his myth-busting book, The Invisible Hook (Princeton University Press,
2009), which shows how the unseen hand of economic exchange produces
social cohesion even among pirates. Piratical mythology can't be true,
in fact, because no community of people could possibly be successful
at anything for any length of time if their society were utterly
anarchistic. Thus, Leeson says, pirate life was 'orderly and honest'
and had to be to meet buccaneers' economic goal of turning a profit.
'To cooperate for mutual gain - indeed, to advance their criminal
organization at all - pirates needed to prevent their outlaw society
from degenerating into bedlam.' 'There is honor among thieves,' as Adam
Smith noted in The Theory of Moral Sentiments: 'Society cannot subsist
among those who are at all times ready to hurt and injure one
another. ... If there is any society among robbers and murderers, they
must at least ... abstain from robbing and murdering one another.'
"Pirate societies, in fact, provide evidence for Smith's theory that
economies are the result of bottom-up spontaneous self-organized order
that naturally arises from social interactions, as opposed to top-down
bureaucratic design. Just as historians have demonstrated that the
'Wild West' of 19th-century America was a relatively ordered society
in which ranchers, farmers and miners concocted their own rules and
institutions for conflict resolution way before the long arm of
federal law reached them, Leeson shows how pirate communities
democratically elected their captains and constructed constitutions.
Those documents commonly outlined rules about drinking, smoking,
gambling, sex (no boys or women allowed onboard), use of fire and
candles, fighting and disorderly conduct, desertion and shirking one's
duties during battle. ... Enforcement was key. Just as civil courts
required witnesses to swear on the Bible, pirate crews had to consent
to the captain's codes before sailing. In the words of one observer:
'All swore to 'em, upon a Hatchet for want of a Bible. Whenever any
enter on board of these Ships voluntarily, they are obliged to sign
all their Articles of Agreement ... to prevent Disputes and Ranglings
afterwards.' Thus, the pirate code 'emerged from piratical
interactions and information sharing, not from a pirate king who
centrally designed and imposed a common code on all current and future
sea bandits.'
"From where, then, did the myth of piratical lawlessness and anarchy
arise? From the pirates themselves, who helped to perpetuate the myth
to minimize losses and maximize profits. Consider the Jolly Roger flag
that displayed the skull and crossbones. Leeson says it was a signal
to merchant ships that they were about to be boarded by a marauding
horde of heartless heathens; the nonviolent surrender of all booty was
therefore perceived as preferable to fighting back. Of course, to
maintain that reputation, pirates occasionally had to engage in
violence, reports of which they provided to newspaper editors, who
duly published them in gory and exaggerated detail. But as 18th-century
English pirate Captain Sam Bellamy explained, 'I scorn to do
any one a Mischief, when it is not for my Advantage.' Leeson
concludes, 'By signaling pirates' identity to potential targets, the
Jolly Roger prevented bloody battle that would needlessly injure or
kill not only pirates, but also innocent merchant seamen.'
"This economic analysis also explains why Somali pirates typically
receive ransom payoffs instead of violent resistance from shipping
crews and their owners. It is in everyone's economic interest to
negotiate the transactions as quickly and peacefully as possible.
Markets operating in a lawless society are more like black markets
than free markets, and because the Somali government has lost control
of its society, Somali pirates are essentially free to take the law
into their own hands. Until Somalia establishes a rule of law and a
lawful free market for its citizens, lawless black market piracy will
remain profitable. Until then, an-arrgh-chy will reign."
Michael Shermer, "Captain Hook Meets Adam Smith," Scientific
American, October 2009, p. 34.
-------------------------------------------------------------------------------------------------In today's encore excerpt - Japan, the leading power in Asia in the
1930s, knew from the experience of World War I that it needed oil in
order to remain a military power. Japan had the same imperial
expansionist desires that European nations had long held, and had
recently invaded both China and Korea. But it had virtually no oil.
Japan's bombing of Pearl Harbor was a flanking maneuver for its primary
objective - the oil of the Dutch East Indies:
"By the late 1930s, Japan produced only about 7 percent of the oil it
consumed. The rest was imported--80 percent from the United States,
and another 10 percent from the Dutch East Indies. ... The [Japanese]
Navy had its sights set on the Dutch East Indies, Malaya, Indochina,
and a number of smaller islands in the Pacific, particularly the prime
and absolutely essential resource - oil.
"Here was the deadly paradox for Japan. It wanted to reduce its
reliance on the United States, especially for most of its oil, much of
which went to fuel its fleet and air force. Japan feared that such
dependence would cripple it in a war. But Tokyo's vision of security
and the steps it took to gain oil autonomy [through a takeover of East
Indies oil] created exactly the conditions that would point to war
with the United States. ...
"On July 24, 1941, ... a dozen Japanese troop transports were on their
way south to effect the occupation of southern Indochina, [a
steppingstone to the East Indies]. On the evening of July 25, the U.S.
government responded by ordering all Japanese financial assets in the
United States to be frozen [and] ... a virtually total oil embargo was
the result. ...
"Tokyo had worked itself into a corner. Japan's petroleum reserves
would, without replenishment, last no more than two years. ... Foreign
minister Teijiro Toyoda wrote on July 31, 1941, 'Our Empire to save
its very life must take measures to secure the raw materials of the
South Seas.' ... [Diplomatic negotiations ensued, but by] November 27
the United States had completely given up on negotiations with
Japan. ...
"The bombs began to fall on the American fleet in Pearl Harbor at 7:55
a.m., Hawaiian time. ... Senior American officials had fully expected
a Japanese attack, and imminently. But they expected it to be in
Southeast Asia. ...
"Pearl Harbor was not the main Japanese target. Hawaii was but one
piece of a massive, far-flung military onslaught. In the same hours as
the attack on the U.S. Pacific Fleet, the Japanese were bombing and
blockading Hong Kong, bombing Singapore, bombing the Philippines,
bombarding the islands of Wake and Guam, taking over Thailand,
invading Malaya on the way to Singapore - and preparing to invade the
East Indies. The operation against Pearl Harbor was meant to protect
the flank - to safeguard the Japanese invasion of the Indies and the
rest of Southeast Asia. ... The primary target of this huge campaign
remained the oil fields of the East Indies."
Daniel Yergin, The Prize, Free Press, Copyright 1991, 1992 by Daniel
Yergin, pp. 307-326.
-------------------------------------------------------------------------------------------------In today's excerpt - Nixon masters TV as part of his successful 1968
presidential campaign, and as part of that his staff invents the
completely staged "impromptu" encounter with voters:
"The idea for Nixon's new approach to television had come of an
appearance the previous autumn on Mike Douglas's afternoon chat show.
As Nixon sat in the Douglas show's makeup chair he chatted
perfunctorily with a young producer about how silly it was that it
took gimmicks like going on daytime talk shows to get elected in
America in 1968. The producer, a twenty-six-year-old named Roger
Ailes, did not come back with the expected deferential chuckle.
Instead he lectured him: if Nixon still thought talk shows were a
gimmick, he'd never become president of the United States. Ailes then
reeled off a litany of Nixon's TV mistakes in 1960, when Ailes had
been in high school - and, before he knew it, had been whisked to New
York and invited to work for the man in charge of the media team,
Frank Shakespeare. ...
"His young confederate Ailes was a TV-producing prodigy, transforming
Douglas from a local Philadelphia fixture into a national icon of
square chic: 'Each weekday more than 6,000,000 housewives in 171
cities set up their ironing boards in front of the TV set to watch
their idol,' said a feature story in Time. Ailes was perfect to
execute the newest Nixon's new idea, the most brazen in the history of
political TV. Ailes, Garment, Shakespeare, Ray Price, and a young
lawyer [named] Tom Evans, met in a CBS screening room. Like football
coaches, they reviewed game film: seven hours of Nixon TV appearances.
When he was on the stump, the medium could make him look like an earnest,
sweaty litigator. He did better on camera in informal settings,
looking a questioner in the eye. They decided that this would be how
they would make sure Nixon was seen - all through 1968.
"But Richard Nixon had enemies. Genuinely impromptu encounters - the
sort that were supposed to be the charm of New Hampshire campaigning - had a chance of turning nasty. Thus the innovation. They would film
impromptu encounters. Only they would be staged.
"Shakespeare brought on board a TV specialist from Bob Haldeman's old
employer, J. Walter Thompson [Advertising]. Harry Treleaven was a TV-obsessed nerd who perennially bored people by rhapsodizing over the
technical details of his craft. Militantly indifferent to ideology,
his last triumph was rewiring the image of George Herbert Walker Bush,
the new congressman from Houston who'd lost a Senate race as a
Goldwater Republican in '64. Men-on-the-street in Houston had thought
George Bush likable, though 'there was a haziness about exactly where
he stood politically,' Treleaven wrote in a postmortem memo. Treleaven
thought that was swell. 'Most national issues today are so
complicated, so difficult to understand,' he said, that they 'bore the
average voter.' Putting 85 percent of Bush's budget into advertising,
almost two-thirds of that into TV, he set to work inventing George
Bush as a casual kind of guy who walked around with his coat slung
over his shoulder (he was actually an aristocrat from Connecticut).

Since the polls had him behind, Treleaven also made him a 'fighting
underdog,' 'a man who's working his heart out to win.' His ideology,
whatever it was, wasn't mentioned.
"Nixon gave this team carte blanche: 'We're going to build this whole
campaign around television. You fellows just tell me what you want me
to do and I'll do it.'
"On February 3, he was slipped out a back door in Concord and spirited
to tiny Hillsborough, where an audience of two dozen townsfolk
handpicked by the local Nixon committee sat waiting in a local
courtroom. Outside were uniformed guards to keep out the press - the
men to whom Richard Nixon had just pledged his most open campaign
ever. Lights, camera, action; citizens asked their questions; cameras
captured their man's answers; then, Treleaven, Ailes, and Garment got
to work chopping the best bits into TV spots. ...
"The reporters threatened mutiny. Ailes offered them a compromise:
from now on they'd be allowed to watch on monitors in a room nearby
and interview the audience after the show. If they didn't like it,
tough. A man who raged at what he could not control, Richard Nixon had
found a way to be in control."
Rick Perlstein, Nixonland, Scribner, Copyright 2008 by Rick Perlstein,
pp. 233-235.
-------------------------------------------------------------------------------------------------In today's excerpt - in the middle ages, a vast portion of what is now
Spain was ruled by Muslims, who were a model of religious tolerance,
and who provided Europe with the knowledge and technology that was one
of the keys to its resurgence in the Renaissance, until they were
finally driven from Spain in 1492 by Ferdinand and Isabella. Their
territory is in part remembered today as Andalusia - "Al Andalus":
"After the Moorish conquest of Spain in the eighth century, the emir
of Al Andalus had been a vassal of the caliphs of Damascus and
Baghdad. But this western outpost of Islam was the first of the Muslim
provinces to break free of its Oriental masters. When the Mongols
destroyed the caliphate in Baghdad in 1258, the independence of Al
Andalus was solidified, and the Spanish Moors began to relate more to
Europe than the Middle East.
"In arts and agriculture, learning and tolerance, Al Andulus was a
beacon of enlightenment to the rest of Europe. In the fertile valleys
of the Guadalquivir and the Guadiana rivers, as well as the terraced
slopes of the Alpujarras, agriculture surpassed anything elsewhere on
the continent. Moorish filigree silver- and leatherwork became famous
throughout the Mediterranean. In engineering, the skill of the Spanish
Moors had no parallel, and the splendor of their architecture was
manifest in the glorious mosque of Cordoba, the Giralda and Alcazar of
Seville, and the Alhambra of Granada. Its excellence in art and
literature, mathematics and science, history and philosophy defined
this brilliant civilization.
"Among its finest achievements was its tolerance. Jews and Christians
were welcomed, if not as equals, then as full-fledged citizens. They
were permitted to practice their faith and their rituals without
interference. This tolerance was in keeping with the principles of the
Koran, which taught that Jews and Christians were to be respected as
'peoples of the Book' or believers in the word of God. Jews and
Christians were assimilated into Islamic culture, and occasionally,
Moorish leaders helped to build Christian houses of worship.
"In 1248, work began on the colossal Alhambra in Granada. With its
thirteen towers and fortified walls above the ravine of the Darro
River, the river of gold, the red palace took shape over the next
hundred years. The extraordinary rooms of its interior - the Courtyard
of the Lions, the Hall of the Two Sisters, the Court of the Myrtles - were finished at the end of the long process under the reign of Yusef
I in the mid-fourteenth century. With their arabesque moldings and
gold ornament and vegetal carvings, these rooms became the wonder of
the world. Most stunning of all was the Courtyard of the Lions, whose
Oriental feel was more reminiscent of Japan than the Middle East and
whose vision was to replicate the Garden of Paradise."
James Reston, Jr., The Dogs of God, Anchor, Copyright 2005 by James
Reston, Jr., pp. 7-8.
-------------------------------------------------------------------------------------------------In today's excerpt - if you happen to work for a bureaucracy, you'll
need to know the subtleties of "officespeak":
"This section deals with the technical aspects of officespeak, such as
passive voice, circular reasoning, and rhetorical questions. These are
the nuts and bolts of the Rube Goldberg contraption that is the
language of the office. Obscurity, vagueness, and a noncommittal
stance on everything define the essence of officespeak. No one wants
to come out and say what they really think. It is much safer for the
company and those up top to constantly cloak their language in order
to hide how much they do know or, just as often, how much they don't
know. ...
Passive voice: The bread and butter of press releases and official
statements. For those who have forgotten their basic grammar, a
sentence in the passive voice does not have an active verb. Thus, no
one can take the blame for 'doing' something, since nothing,
grammatically speaking, has been done by anybody. Using the passive
voice takes the emphasis off yourself (or the company). Here [is an]
example of how the passive voice can render any situation
guiltless:
'Five hundred employees were laid off.' (Not 'The company laid off
five hundred employees,' or even worse, 'I laid off five hundred
employees.' These layoffs occurred in a netherworld of displaced
blame, in which the company and the individual are miraculously absent
from the picture.) ...
Circular reasoning: Another favorite when it comes time to deliver bad
news. In circular reasoning, a problem is posited and a reason is
given. Except that the reason is basically just a rewording of the
problem. Pretty nifty. Here are some examples to better explain the
examples:
'Our profits are down because of [a decrease in revenues].'
'People were laid off because there was a surplus of workers.' ...
Rhetorical questions: The questions that ask for no answers. So why
even ask the question? Because it makes it seem as though the listener
is participating in a true dialogue. When your boss asks, 'Who's
staying late tonight?' you know he really means, 'Anyone who wants to
keep their job will work late.' Still, there's that split second when
you think you have a say in the matter, when you believe your opinion
counts. Only to be reminded, yet again, that no one cares what you
think. ...
Hollow statements: The second cousin of circular reasoning. Hollow
statements make it seem as though something positive is happening
(such as better profits or increased market share), but they lack any
proof to support the claim.
'Our company is performing better than it looks.'
'Once productivity increases, so will profits.' ...
They and them: Pronouns used to refer to the high-level management
that no one has ever met, only heard whispers about. 'They' are
faceless and often nameless. And their decisions render those beneath
them impotent to change anything. 'They' fire people, 'they' freeze
wages, 'they' make your life a living hell. It's not your boss who is
responsible - he would love to reverse all these directives if he
could. But you see, his hands are tied.
'I'd love to give you that raise, you know I would. But they're the
ones in charge.'
'Okay, gang, bad news, no more cargo shorts allowed. Hey, I love the
casual look, but they hate it.' ...

Obfuscation: A tendency to obscure, darken, or stupefy. The primary
goal of the above techniques is, in the end, obfuscation. Whether it's
by means of the methods outlined above or by injecting jargon-heavy
phrases into sentences, corporations want to make their motives and
actions as difficult to comprehend as possible."
D.W. Martin, Officespeak, Simon Spotlight, Copyright 2005 by David
Martin, pp. 11-20.
-------------------------------------------------------------------------------------------------In today's excerpt - the Crusades, Sir Walter Scott, and Ivanhoe. The
Crusades were perhaps the most powerful and long-lasting legacy of the
Middle Ages. Reflecting the power of this legacy, in 1819, Sir Walter
Scott's Ivanhoe exploded onto the European scene, selling millions of
copies, leading to hundreds of staged productions, and reigniting
interest in the imagined chivalry and religious virtue of the two
hundred years of European Crusades in the Middle East:
"In November 1095 Pope Urban II called upon the knights of France to
journey to the Holy Land and liberate the city of Jerusalem and the
Christians of the east from Muslim power. In return they would be
granted an unprecedented spiritual reward - the remission of all their
sins - and thereby escape the torments of Hell, their likely
destination after lives of violence and greed. The response to Urban's
appeal was astounding; over 60,000 people set out to recover the Holy
Land and secure this reward and, in some cases, take the chance to set
up new territories. Almost four years later, in July 1099, the
survivors conquered Jerusalem in an orgy of killing. While most of the
knights returned home, the creation of the Crusader States formed a
permanent Christian (or 'Frankish') presence in the Levant. In 1187,
however, Saladin defeated their forces at the Battle of Hattin and
brought Jerusalem back under Muslim control. The Franks held onto
other lands until 1291 when they were finally driven out by the
Mamluks of Egypt to end Christian rule in the Holy Land.
"Crusading was too deeply established within Catholic Europe to
disappear after the loss of the Holy Land in 1291, [and] the roots
laid down by crusading proved extraordinarily deep, in part because of
the idea's flexibility. In the course of the 12th and 13th centuries
crusades were launched against the Muslims of Spain and other enemies
of the faith such as the pagan tribes of northeastern Europe (the
Baltic Crusades). ... Crusading offered a platform for knights to show
bravery and integrity. The idea of fighting for God, the ultimate
lord, gave service in crusading armies a special attraction, although
at times knights' determination to win fame for themselves could cause
them to put notions of honor ahead of the greater Christian cause. ...
"Perhaps the last crusading battle of note took place at Lepanto in

1571 where a fleet of Spanish,Venetian and Military Order vessels


defeated the Turks. The Knights of St John (the Hospitallers)
preserved control over their island outposts of Rhodes, until 1523,
and then Malta, but otherwise crusading subsided. The advent of
Protestantism brought severe judgments on such a papally-directed
concept. ...
"Yet during the 19th century, crusading, or a mutated form of it,
gained new interest in the West. One reason was the writing of Sir
Walter Scott whose tales of chivalric endeavor in the Holy Land, most
particularly Ivanhoe (1819) and The Talisman (1832), enraptured
audiences across Europe. As a Calvinist, Scott took a restrained view
of 'intolerant zeal', but overall he gave a positive impression of the
crusades. Scott's works were translated into numerous languages and in
France alone he had sold over two million books by 1840. Ivanhoe alone
inspired almost 300 dramas; within a year of its publication, 16
versions of the story were being staged across England. ...
"In tandem with these developments, the 19th century saw a dramatic
expansion of European political power into the Muslim near east,
largely at the expense of the declining Ottoman Empire. France invaded
Algeria in 1830 and soon afterwards Spain and Italy, too, embarked
upon North African adventures. Some looked to the crusades as a
forerunner, especially after France took control of Syria in 1920.
Paul Pic, Professor of Law at the University of Lyon, regarded Syria
as 'a natural extension of France', while in 1929 Jean Longnon wrote
that: 'The name of Frank has remained a symbol of nobility, courage
and generosity ... and if our country has been called on to receive
the protectorate of Syria, it is the result of that influence.' [Syria
remained a French protectorate until after World War II].
Jonathan Phillips, "The Call of the Crusades," History Today, Volume
59, Issue 11.
-------------------------------------------------------------------------------------------------In today's excerpt - under Muslim rule, Spain became the center of
wealth in Europe, and Cordoba was Europe's most glamorous city.
Converting to Islam became fashionable, and one of Christian Europe's
most profitable new businesses was selling slaves to the Muslims, the
most prized of which were eunuchs. The ruling dynasty in that era
was the Umayyads, the ruler himself was known as the Caliph, and his
domain was known as the Caliphate. That part of Spain under the
Caliph's rule was al-Andalus or Andalusia:
"At the start of the tenth century, it has been estimated, the
population of al-Andalus was only one-fifth Muslim; by 976, that
percentage had been reversed. The status of Christians in Islamic
Spain had [become] unfashionable. The Church in al-Andalus had long

been thundering against the passion of its flock for Saracen (Muslim)
chic; but increasingly, whether translating the scriptures into
Arabic, or adopting Muslim names for themselves, or dancing attendance
on the Caliph at his court, even bishops were succumbing to its
allure. ...
"The Caliphate offered, to the ambitious merchant, a free-trade area
like no other in the world. Far eastwards of al-Andalus it extended,
to Persia and beyond, while in the markets of the great cities of
Islam were to be found wonders from even further afield: sandalwood
from India, paper from China, camphor from Borneo. What was Christian
Spain, with her flea-bitten little villages, to compare? Why, unlike
their equivalents in Italy, they were not even good for slaves!
"The Andalusis were now the importers of slaves; and a swarm of
Christian suppliers, with little else to offer which might serve to
tickle Andalusi palates, had competed to corner the market [in slaves]
no less eagerly than their Muslim competitors. ... In Arabic, as in
most European languages, the word 'Slavs' was becoming, by the tenth
century, increasingly synonymous with human cattle.
"Nothing, indeed, in the fractured Europe of the time, was more
authentically multicultural than the business of enslaving Slavs. West
Slavs captured in the wars of the Saxon emperors would be sold by
Frankish merchants to Jewish middlemen, who then, under the shocked
gaze of Christian bishops, would drive their shackled stock along the
high roads of Provence and Catalonia, and across the frontier into the
Caliphate.
"Few opportunities were neglected in the struggle to obtain a
competitive edge. In the Frankish town of Verdun, for instance, the
Jewish merchants who had their headquarters there were renowned for
their facility with the gelding knife. A particular specialization was
the supply of 'carzimasia': eunuchs who had been deprived of their
penises as well as their testicles. Even for the most practiced
surgeon, the medical risks attendant on performing a penectomy were
considerable - and yet the wastage served only to increase the
survivors' value. Exclusivity, then as now, was the mark of a luxury
brand.
"And luxury, in al-Andalus, could make for truly fabulous profit. The
productivity of the land; the teeming industry of the cities; the
influx of precious metals from mines in Africa: all had helped to
establish the realm of the Umayyads as Europe's premier showcase for
conspicuous consumption. Cordoba, the capital of al-Andalus, was a
wonder of the age. Just as Otto, emperor [of the Holy Roman Empire]
though he was, lacked a residence that could rival so much as the
gatehouse of the palace of the Caliph, so was there nowhere else in
western Europe a settlement that remotely approached the scale and
splendour of Cordoba. Indeed, in the whole of Christendom, there was
only a single city that could boast of being a more magnificent seat
of empire - and that was Constantinople, the Queen of Cities herself."
Tom Holland, The Forge of Christendom, Doubleday, Copyright 2008 by
Tom Holland, pp. 100-103.
-------------------------------------------------------------------------------------------------In today's excerpt - in the 1970s, the science pages of The New York
Times, Newsweek, and other publications sounded the dire warning of
impending global cooling. Headlines from The New York Times of the
period included "International Team of Specialists Finds No End in
Sight to 30-Year Cooling Trend in Northern Hemisphere," and "New
Studies Point to Ice Age Again." The excerpt below is from a New York
Times article titled "Scientists Ask Why World Climate is Changing;
Major Cooling May Be Ahead." There were, inevitably, dissenting
voices:
"There are specialists who say that a new ice age is on the way - the
inevitable consequence of a natural cyclic process, or as a result of
man-made pollution of the atmosphere. And there are those who say that
such pollution may actually head off an ice age.
"Sooner or later a major cooling of the climate is widely considered
inevitable. Hints that it may already have begun are evident. The drop
in mean temperatures since 1950 in the Northern Hemisphere has been
sufficient, for example, to shorten Britain's growing season for crops
by two weeks. ...
"The first half of this century has apparently been the warmest period
since the 'hot spell' between 5,000 and 7,000 years ago immediately
following the last ice age. That the climate, at least in the Northern
Hemisphere, has been getting cooler since about 1950, is well
established - if one ignores the last two winters. ...
"From the chemical composition of Pacific sediments, from studies of
soil types in Central Europe and from fossil plankton that lived in
the Caribbean it has been shown that in the last million years there
have been considerably more ice ages than previously supposed.
"According to the classic timetable, four great ice ages occurred.
However, the new records of global climate show seven extraordinarily
abrupt changes in the last million years. As noted in the academy
report, they represent transition, in a few centuries 'from full
glacial to full interglacial conditions.' ...
"In a recent issue of the British journal Nature, Drs. Reid A. Bryson
and E. W. Wahl of the Center for Climate Research at the University of
Wisconsin cite records from nine North Atlantic weatherships
indicating that from 1951 to the 1968-1972 period surface water
temperatures dropped steadily. The fall was comparable, they reported,
to a return to the 'Little Ice Age' that existed from 1430 to
1850. ...
"There is general agreement that introducing large amounts of smoke
particles or carbon dioxide into the atmosphere can alter climate. The
same would be true of generating industrial heat comparable to a
substantial fraction of solar energy falling on the earth. The debate
centers on the precise roles of these effects and the levels of
pollution that would cause serious changes. ...
"The observatory atop Mauna Loa, the great Hawaiian volcano, has
recorded a steady rise in the annual mean level of carbon dioxide in
the atmosphere, amounting to 4 per cent between 1958 and 1972. That,
however, was a period of global cooling - not the reverse, as one
would expect from a greenhouse effect. ...
"If worldwide energy consumption continues to increase at its present
rates, catastrophic climate changes have been projected by M. I.
Budyko, a leading Soviet specialist. He says that the critical level
will probably be reached within a century. This, he has written, will
lead to 'a complete destruction of polar ice covers.' Not only would
sea levels rise but, with the Arctic Ocean free of ice, the entire
weather system of the Northern Hemisphere would be altered."
Walter Sullivan, "Scientists Ask Why World Climate is Changing; Major
Cooling May Be Ahead," The New York Times, May 21, 1975.
-------------------------------------------------------------------------------------------------In today's excerpt - the human brain is a "cognitive miser"- it can
employ several approaches to solving a given problem, but almost
always chooses the one that requires the least computational power:
"We tend to be cognitive misers. When approaching a problem, we can
choose from any of several cognitive mechanisms. Some mechanisms have
great computational power, letting us solve many problems with great
accuracy, but they are slow, require much concentration and can
interfere with other cognitive tasks. Others are comparatively low in
computational power, but they are fast, require little concentration
and do not interfere with other ongoing cognition. Humans are
cognitive misers because our basic tendency is to default to the
processing mechanisms that require less computational effort, even if
they are less accurate. Are you a cognitive miser? Consider the
following problem, taken from the work of Hector Levesque, a computer
scientist at the University of Toronto. Try to answer it yourself
before reading the solution.
Problem: Jack is looking at Anne, but Anne is looking at George. Jack
is married, but George is not. Is a married person looking at an
unmarried person?
A) Yes
B) No
C) Cannot be determined
"More than 80 percent of people choose C. But the correct answer is A.
Here is how to think it through logically: Anne is the only person
whose marital status is unknown. You need to consider both
possibilities, either married or unmarried, to determine whether you
have enough information to draw a conclusion. If Anne is married, the
answer is A: she would be the married person who is looking at an
unmarried person (George). If Anne is not married, the answer is still
A: in this case, Jack is the married person, and he is looking at
Anne, the unmarried person. This thought process is called fully
disjunctive reasoning - reasoning that considers all possibilities.
The fact that the problem does not reveal whether Anne is or is not
married suggests to people that they do not have enough information,
and they make the easiest inference (C) without thinking through all
the possibilities. Most people can carry out fully disjunctive
reasoning when they are explicitly told that it is necessary (as when
there is no option like 'cannot be determined' available). But most do
not automatically do so, and the tendency to do so is only weakly
correlated with intelligence.
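[A quick way to check the disjunctive reasoning is simply to enumerate
Anne's two possible states and test the claim in each. The short Python
sketch below is an editorial illustration, not part of Stanovich's text;
the names and data structures are ours:

    # Jack is married, George is not; Jack looks at Anne, Anne looks at George.
    # Anne's status is unknown, so check both possibilities.
    for anne_married in (True, False):
        married = {"Jack": True, "Anne": anne_married, "George": False}
        looking_at = [("Jack", "Anne"), ("Anne", "George")]
        found = any(married[a] and not married[b] for a, b in looking_at)
        print(f"Anne married = {anne_married}: married looking at unmarried? {found}")

Both iterations print True, so the answer is A no matter which state
Anne is in.]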
"Here is another test of cognitive miserliness, as described by Nobel
Prize-winning psychologist Daniel Kahneman and his colleague Shane
Frederick.
"A bat and a ball cost $1.10 in total. The bat costs $1.00 more than
the ball. How much does the ball cost?
"Many people give the first response that comes to mind - 10 cents.
But if they thought a little harder, they would realize that this
cannot be right: the bat would then have to cost $1.10, for a total of
$1.20. IQ is no guarantee against this error. Kahneman and Frederick
found that large numbers of highly select university students at the
Massachusetts Institute of Technology, Princeton and Harvard were
cognitive misers, just like the rest of us, when given this and
similar problems."
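[For readers who want to verify the arithmetic: let x be the price of
the ball, so the bat costs x + 1.00. Then x + (x + 1.00) = 1.10, which
gives 2x = 0.10, or x = 0.05. The ball costs 5 cents and the bat $1.05,
and only then does the bat cost exactly one dollar more than the ball.]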
Keith E. Stanovich, "Rational and Irrational Thought: The Thinking
That IQ Tests Miss," Scientific American, November/December 2009, pp.
35-36.
-------------------------------------------------------------------------------------------------In today's excerpt - murder rates in the United States are the highest
among affluent democracies, and historians and criminologists have
only recently attempted to construct theories to explain these high
levels:
"The United States has the highest homicide rate of any affluent
democracy, nearly four times that of France and the United Kingdom,
and six times that of Germany. Why? Historians haven't often asked
this question. Even historians who like to try to solve cold cases
usually cede to sociologists and other social scientists the study of
what makes murder rates rise and fall, or what might account for why
one country is more murderous than another. Only in the nineteen-seventies did historians begin studying homicide in any systematic
way. In the United States, that effort was led by Eric Monkkonen, who
died in 2005, his promising work unfinished. Monkkonen's research has
been taken up by Randolph Roth, whose book 'American Homicide' offers
a vast investigation of murder, in the aggregate, and over time. ...
"In the archives, murders are easier to count than other crimes. Rapes
go unreported, thefts can be hidden, adultery isn't necessarily
actionable, but murder will nearly always out. Murders enter the
historical record through coroners' inquests, court transcripts,
parish ledgers, and even tombstones. ... The number of uncounted
murders, known as the 'dark figure,' is thought to be quite small.
Given enough archival research, historians can conceivably count, with
fair accuracy, the frequency with which people of earlier eras killed
one another, with this caveat: the farther back you go in time - and
the documentary trail doesn't go back much farther than 1300 - the
more fragmentary the record and the bigger the dark figure....
"In Europe, homicide rates, conventionally represented as the number
of murder victims per hundred thousand people in the population per
year, have been falling for centuries. ... In feuding medieval Europe,
the murder rate hovered around thirty-five. Duels replaced feuds.
Duels are more mannered; they also have a lower body count. By 1500,
the murder rate in Western Europe had fallen to about twenty. Courts
had replaced duels. By 1700, the murder rate had dropped to five.
Today, that rate is generally well below two, where it has held
steady, with minor fluctuations, for the past century.
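[The convention, written out: rate = (murder victims per year /
population) x 100,000. As an illustrative example of ours, a medieval
town of 200,000 people with 70 killings a year would score exactly the
feuding-era figure of thirty-five.]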
"In the United States, the picture could hardly be more different. The
American homicide rate has been higher than Europe's from the start,
and higher at just about every stage since. It has also fluctuated,
sometimes wildly. During the Colonial period, the homicide rate fell,
but in the nineteenth century, while Europe's kept sinking, the U.S.
rate went up and up. In the twentieth century, the rate in the United
States dropped to about five during the years following the Second
World War, but then rose, reaching about eleven in 1991. It has since
fallen once again, to just above five, a rate that is, nevertheless,
twice that of any other affluent democracy. ...
"2.3 million people are currently behind bars in the United States.
That works out to nearly one in every hundred adults, the highest rate
anywhere in the world, and four times the world average. ...
"[Roth theorizes] that four factors correlate with the homicide rate:
faith that government is stable and capable of enforcing just laws;
trust in the integrity of legitimately elected officials; solidarity
among social groups based on race, religion, or political affiliation;
and confidence that the social hierarchy allows for respect to be
earned without recourse to violence. When and where people hold these
sentiments, the homicide rate is low; when and where they don't, it is
high."
Jill Lepore, "Rap Sheet," The New Yorker, November 9, 2009, pp. 79-81.
-------------------------------------------------------------------------------------------------In today's excerpt - U.S. Presidents have had a pronounced tendency to
pressure the Federal Reserve into keeping interest rates low, and
Federal Reserve chairmen are often predisposed to please. But
deferring needed rate increases and imposing price controls always
eventually backfire:
"Maintaining the gold value of the dollar [by increasing interest
rates] conflicted with the Kennedy growth imperative in 1962, although
it was finessed by the foreign security tax ploy. Starting about 1965,
Lyndon Johnson started running big budget deficits to finance the war
in Vietnam and his domestic program [and] the floods of new money were
already generating inflationary pressures. ...
"And once again, the prescribed medicine - raise interest rates and
reduce borrowing - was not on the table, for it conflicted with
Richard Nixon's desire to win a second presidential term.
"The first two years of the Nixon administration were very difficult
economic sailing, to the point where the administration was seriously
worried about the 1972 election. During the five years of Johnson's
presidency, despite the uptick in inflation, the real, or inflation-adjusted, annual rate of growth exceeded 5 percent. But in 1970,
growth plunged to near zero, while inflation was scraping 6 percent - dreadful numbers for a campaign launch. There was little room for
maneuver. The 1970 federal deficit was already as big as any Johnson
had run, so fiscal stimulation was likely to spill over into more
inflation. ...
"But few politicians had Nixon's gift for the bold stroke. In August
1971 he helicoptered his entire economics team to Camp David for a
weekend that Herbert Stein, a member of the Council of Economic
Advisers, predicted 'could be the most important meeting in the
history of economics' since the New Deal. After the meeting, ... Nixon
announced that he would cut taxes, impose wage and price controls
throughout the economy, impose a tax surcharge on all imports, and
rescind the commitment to redeem dollars in gold. ...
"After the final decisions had been taken, Volcker was charged with
drafting Nixon's and [Treasury Secretary John] Connally's speeches
announcing the changes. His draft, he recalled, was 'a typical
devaluation speech' filled with the 'obligatory mea culpas.' None of
it saw the light of day. The Volcker draft was handed over to uber-speechwriter William Safire and emerged as a proclamation of 'a
triumph and a fresh start.'
"Politically, it was a masterstroke. With price controls in place
Nixon and his Federal Reserve chief, Arthur Burns, could gun up the
money supply without worrying about price inflation - both the narrow
and broad measures of money jumped by more than 10 percent in 1971, at
the time the biggest increase ever. Economic growth obediently
revived, and was back up over 5 percent by the 1972 election - just
what the political doctors had ordered.
"Consumers were happy with flat prices, while big business loved the
tax breaks, the import surcharges, and the price controls. All in a
single weekend, Nixon had delivered them from union wage pressures,
supplier price hikes, and foreign competition. ...
"Although Nixon got his landslide, the cracks in the economy were too
big to hide. The 1971 wage-and-price '90-day freeze,' as it was
originally billed, lasted for three years. Controls are always easier
to put on than to take off. The underlying inflation builds to a point
of explosiveness, and the inevitable thicket of rules offers
profitable little crevices for the lucky or the well-connected.
Organized labor stopped cooperating in 1974, but by then Nixon was
deeply ensnared in the coils of Watergate. Congress forced the end of
all controls in the spring, except for price controls on domestic oil.
Removal of controls triggered double-digit inflation, the first since
the 1940s, and the country suffered a nasty recession in 1974 and
1975."
Charles R. Morris, The Sages, Public Affairs, Copyright 2009 by
Charles R. Morris, pp. 124-127.
-------------------------------------------------------------------------------------------------In today's encore excerpt - the American Dust Bowl, which lasted from
1930 to as late as 1940 in some areas. Rated the number one weather
event of the twentieth century, the Dust Bowl covered one hundred
million acres in parts of Nebraska, Kansas, Colorado, New Mexico,
Oklahoma, and Texas, and left thousands dead, diseased and destitute:
"The rains disappeared - not just for a season but for years on end.
With no sod to hold the earth in place, the soil calcified and started
to blow. Dust clouds boiled up, ten thousand feet or more in the sky,
and rolled like moving mountains - a force of their own. When the dust
fell, it penetrated everything: hair, nose, throat, kitchen, bedroom,
water well. A scoop shovel was needed just to clean the house in the
morning. The eeriest thing was the darkness. People tied themselves to
ropes before going to a barn just a few hundred feet away, like a walk
in space, tethered to the life support center. Chickens roosted in
mid-afternoon. ...
"Many in the East did not believe the initial accounts of predatory
dust until a storm in May 1934 carried the windblown shards of the
Great Plains over much of the nation. In Chicago, twelve million tons
of dust fell. New York, Washington - even ships at sea, three hundred
miles off the Atlantic coast - were blanketed in brown. Cattle went
blind and suffocated. When farmers cut them open, they found stomachs
stuffed with fine sand. Horses ran madly against the storms. Children
coughed and gagged, dying of something the doctors called 'dust
pneumonia.' In desperation, some families gave away their children.
The instinctive act of hugging a loved one or shaking someone's hand
could knock two people down, for the static electricity from the
dusters was so strong. ...
"By 1934, the soil was like fine-sifted flour, and the heat made it a
danger to go outside many days. In Vinita, Oklahoma, the temperature
soared above 100 degrees for thirty-five consecutive days. On the
thirty-sixth day, it reached 117. ...
"On the skin, the dust was like a nail file, a grit strong enough to
hurt. People rubbed Vaseline in their nostrils as a filter. The Red
Cross handed out respiratory masks to schools. Families put wet towels
beneath their doors and covered their windows with bed sheets, fresh-dampened nightly. ...
"Black Sunday, April 14, 1935, [was the] day of the worst duster of
them all. The storm carried twice as much dirt as was dug out of the
earth to create the Panama Canal. The canal took seven years to dig;
the storm lasted a single afternoon. More than 300,000 tons of Great
Plains topsoil was airborne that day. ... As the black wall
approached, car radios clicked off, overwhelmed by the static.
Ignitions shorted out. Waves of sand, like ocean water rising over a
ship's prow, swept over roads. Cars went into ditches. A train
derailed."
Timothy Egan, The Worst Hard Time, Mariner, Copyright 2006 by Timothy
Egan, pp. 5-8.
-------------------------------------------------------------------------------------------------In today's excerpt - historically, 85% of the increase in per capita
GDP (gross domestic product or wealth) in the U.S. economy has come
from innovation - the invention of new products and services or the
invention of better ways to make existing products and services. It
follows that any durable and sustainable program to create jobs in an
economy would focus foremost on innovation:
"Since the 1950s, economists have understood that innovation is
critical to economic growth. Our lives are more comfortable and longer
than those of our great-grandparents on many dimensions. To cite just
three improvements: antibiotics cure once-fatal infections, long-distance communications cost far less, and the burden of household
chores is greatly reduced. At the heart of these changes has been the
progress of technology and business.
"Economists have documented the strong connection between
technological progress and economic prosperity, both across nations
and over time. This insight grew out of studies done by the pioneering
student of technological change, Moses Abramovitz. He realized that
there are ultimately only two ways of increasing the output of the
economy: (1) increasing the number of inputs that go into the
productive process (e.g., by having workers stay employed until the
age of sixty-seven, instead of retiring at sixty-two), or (2)
developing new ways to get more output from the same inputs.
Abramovitz measured the growth in the output of the American economy
between 1870 and 1950 - the amount of material goods and services
produced - and then computed the increase in inputs (especially labor
and financial capital) over the same time period. To be sure, this was
an imprecise exercise: he needed to make assumptions about the growth
in the economic impact of these input measures. After undertaking this
analysis, he discovered that growth of inputs between 1870 and 1950
could account only for about 15 percent of the actual growth in the
output of the economy. The remaining 85 percent could not be explained
through the growth of inputs. Instead, the increased economic activity
stemmed from innovations in getting more stuff from the same inputs.
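[In the notation later formalized by Solow, the exercise is a
growth-accounting decomposition: g_Y = a*g_K + (1 - a)*g_L + R, where
g_Y, g_K, and g_L are the growth rates of output, capital, and labor,
a is capital's share of income, and R is the unexplained residual
attributed to innovation. Abramovitz's finding, restated in these
terms, was that R accounted for roughly 85 percent of g_Y between 1870
and 1950. The notation is our gloss, not Lerner's.]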
"Other economists in the late 1950s and 1960s undertook similar
exercises. These studies differed in methodologies, economic sectors,
and time periods, but the results were similar. Most notably, Robert
Solow, who later won a Nobel Prize for this work, identified an almost
identical 'residual' of about 85 percent. The results were so striking
because most economists for the previous 200 years had been building
models in which economic growth was treated as if it was primarily a
matter of adding more inputs: if you just had more people and dollars,
more output would invariably result.
"Instead, these studies suggested, the crucial driver of growth was
changes in the ways inputs were used. The magnitude of this
unexplained growth, and the fact that it was exposed by researchers
using widely divergent methodologies, persuaded most economists that
innovation was a major force in the growth of output.
"In the decades since the 1950s, economists and policymakers have
documented the relationship between innovation - whether new
scientific discoveries or incremental changes in the way that
factories and service businesses work - and increases in economic
prosperity. Beyond identifying an unexplained 'residual,' studies
have documented the positive effects of technological progress in
areas such as information technology. Thus, an essential question for
the economic future of a country is not only what it produces, but how
it goes about producing it.
"This relationship between innovation and growth has been recognized
by many governments. From the European Union - which has targeted
increasing research spending as a key goal in the next few years - to
emerging economies such as China, leaders have embraced the notion
that innovation is critical to growth."
Josh Lerner, Boulevard of Broken Dreams, Princeton, Copyright 2009 by
Princeton University Press, pp. 43-45.
-------------------------------------------------------------------------------------------------In today's excerpt - small firms have an advantage over larger firms
in innovation, and venture capital plays a disproportionately large
role in economic growth:
"Initially, economists generally overlooked the creative power of new
firms: they suspected that the bulk of innovations would stem from
large industrialized concerns. For instance, Joseph Schumpeter
(1883-1950), one of the pioneers of the serious study of
entrepreneurship, posited that large firms had an inherent advantage
in innovation relative to smaller enterprises. ...
"In today's world, Schumpeter's hypothesis of large-firm superiority
does not accord with casual observation. In numerous industries, such
as medical devices, communication technologies, semiconductors, and
software, leadership is in the hands of relatively young firms whose
growth was largely financed by venture capitalists and public equity
markets. ...
"A study by Zoltan Acs and David Audretsch examined which firms
developed some of the most important innovations of the twentieth
century. They documented the central contribution of new and small
firms: these firms contributed almost half the innovations they
examined. ...
"What explains the apparent advantage of smaller firms? Much of it
stems from the difficulty of large firms in fomenting innovation. For
instance, one of Schumpeter's more perceptive contemporaries, John
Jewkes, presciently argued:
" 'It is erroneous to suppose that those techniques of large-scale
operation and administration which have produced such remarkable
results in some branches of industrial manufacture can be applied with
equal success to efforts to foster new ideas. The two kinds of
organization are subject to quite different laws. In the one case the
aim is to achieve smooth, routine, and faultless repetition, in the
other to break through the bonds of routine and of accepted ideas. So
that large research organizations can perhaps more easily become self-stultifying than any other type of large organization, since in a
measure they are trying to organize what is least organizable.'
"But this observation still begs a question: what explains the
difficulties of larger firms in creating true innovations? In
particular, there are at least three reasons why entrepreneurial
ventures are more innovative:
"The first has to do with incentives. Normally, firms provide
incentives to their employees in many roles, from salespeople to
waiters. Yet large firms are notorious for offering employees little
more than a gold watch for major discoveries. ... Whatever the reason,
there is a striking contrast between the very limited incentives at
large corporate labs and the stock-option-heavy compensation packages
at start-ups.
"Second, large firms may simply become ineffective at innovating. A
whole series of authors have argued that incumbent firms frequently
have blind spots, which stem from their single-minded focus on
existing customers. As a result, new entrants can identify and exploit
market opportunities that the established leaders don't see.
"Finally, new firms may choose riskier projects. Economic theorists
suggest that new firms are likely to pursue high-risk strategies,
while established firms rationally choose more traditional approaches.
Hence, while small firms may fail more frequently, they are also
likely to introduce more innovative products. ...
"On average a dollar of venture capital appears to be three to four
times more potent in stimulating patenting than a dollar of
traditional corporate R&D."
Josh Lerner, Boulevard of Broken Dreams, Princeton, Copyright 2009 by
Princeton University Press, pp. 45-49, 62.
-------------------------------------------------------------------------------------------------In today's excerpt - early in the American Revolution, General George
Washington's blunders and misjudgments led to terrible battlefield
losses in New York and in the Brandywine River valley, and put his job
in jeopardy, with General Horatio Gates - whose stirring victory at
Saratoga had been crucial to American prospects - waiting in the wings:
"Washington's string of blunders [in New York] was certain to lead to
recriminations, perhaps even to calls for his removal. Already aware
of the displeasure among some in Congress, he discovered on the last
day of November that even some in the army had lost confidence in him.
On that day, a letter from General Charles Lee arrived for Joseph
Reed, Washington's former secretary, now the army's adjutant general.
As Reed was away from headquarters on a mission, Washington, who was
desperate for information, tore open Lee's letter. What he read was
lacerating. Washington's 'fatal indecision,' Lee had said in his
customarily caustic manner, would doom the American army.
"From things that Lee said, it was also clear that Reed, heretofore
Washington's closest confidant in the army, shared Lee's views. (Reed,
with pitiless honesty, had told Lee that Washington's 'indecisive
Mind' had been among the army's 'greatest Misfortunes,' and he added
that had it not been for Lee, Washington's army would never have
escaped Manhattan.) One of the few men with the backbone to criticize
Washington to his face, Lee had already told the commander that he was
foolish to act on the advice of his generals, most of whom were 'Men
of inferior judgment.' Though Washington was unaware of it, Lee had
also urged General Horatio Gates to hurry to Washington's side to
'save your army,' as 'a certain great Man is most damnably
deficient.' ...
"[The next year after his loss at the Battle of the Brandywine],
Washington was besieged with rumors that a 'Strong Faction' within
Congress wished to remove him and name Gates as the new commander of
the Continental army. Washington did not know precisely what was
occurring behind the curtain in Congress, but he probably knew that
some congressmen believed - as Pennsylvania's Dr. Benjamin Rush, a
signer of the Declaration of Independence, put it - that the army
'under General Gates [was] a well regulated family,' while
'Washington's [was but an] imitation of an Army' that bore the look of
'an unformed mob.' Some proclaimed that Gates had 'executed with vigor
and bravery,' attaining 'the pinnacle of military glory.' Washington's
command, according to the whispers, was characterized by such
'negligence' that it was hardly surprising he had been 'outwitted,'
'outgeneraled and twice beated [sic].' It was unsettling enough to
have congressmen complain about his leadership, but atop that
Washington soon learned that some of his officers had lost confidence
in him.
"Washington did not know the full extent of the disaffection with his
leadership. But he knew enough to become convinced that he was in the
maw of a great crisis."
John Ferling, The Ascent of George Washington, Bloomsbury Press,
Copyright 2009 by John Ferling, pp. 119, 139.
-------------------------------------------------------------------------------------------------In today's excerpt - in 1917, the American public was resistant to
entering World War I, and Woodrow Wilson had just been re-elected on
the promise to keep America out of that war, when Germany made a
colossal diplomatic blunder that drew America in:
"As 1917 began, the war was not going well for Britain. There seemed
to be no end to the slaughter on the Western Front yet there were no
obvious signs of Germany being defeated. Food shortages threatened and
the Asquith government had fallen. Worse, Germany was about to start
unrestricted U-boat warfare in the Atlantic from February 1st with, it
was feared, a substantially larger U-boat fleet. Much depended on
whether America could be brought into the war.
"Unrestricted U-boat warfare meant that every enemy and neutral ship
found near the war zone would be sunk without warning. The Germans
envisaged U-boats sinking 600,000 tons a month, forcing Britain to
capitulate before the next harvest. Admiral von Holtzendorff told the
Kaiser: 'I guarantee that the U-boat will lead to victory ... I
guarantee on my word as a naval officer that no American will set foot
on the Continent.'
"Enter Arthur Zimmermann, the new German Foreign Minister, a blunt
speaker who considered himself an expert on American affairs. He
developed a plan to keep America out of Europe once U-boats started
sinking American ships. He proposed to establish a German-Mexican
alliance, promising the Mexicans that if America entered the war, and
following a German victory, Mexico would have restored to her the
territories of Texas, New Mexico and Arizona. ...
"On January 16th, 1917, [Zimmermann] sent a coded cable via the
American cable channel to his ambassador in Washington, Count
Bernstorff. It contained his overture to Mexico proposing a military
alliance against America. Bernstorff was instructed to pass on the
message to his counterpart Ambassador Eckhardt in Mexico City. ...
"The full text of the Zimmermann telegram read:
"Most Secret: For Your Excellency's personal information and to be
handed on to the Imperial (German) Minister in Mexico.
"We intend to begin un-restricted submarine warfare on the first of
February. We shall endeavour in spite of this to keep the United
States neutral. In the event of this not succeeding, we make Mexico a
proposal of an alliance on the following basis: Make war together,
make peace together, generous financial support, and an understanding
on our part that Mexico is to reconquer the lost territory in Texas,
New Mexico, and Arizona. The settlement detail is left to you.
"You will inform the President [of Mexico] of the above most secretly
as soon as the outbreak of war with the United States is certain and
add the suggestion that he should, on his own initiative, invite Japan
to immediate adherence and at the same time mediate between Japan and
ourselves.
"Please call the President's attention to the fact that the
unrestricted employment of our submarines now offers the prospect of
compelling England to make peace within a few months. Acknowledge
receipt. Zimmermann. ...
"The telegram [intercepted and decoded by the British and] was passed
to Washington with the explanation that the British copy had been
'bought in Mexico'. The contents of the telegram were passed on to the
Associated Press on February 28th. It sparked eight-column headlines
next morning. It caused a sensation in America but at the same time
aroused suspicion among Washington politicians about whether the
telegram was authentic. Some even sniffed a cunning British scheme to
propel America into war.
"Confirmation came from an unexpected source. To Lansing's 'profound
amazement and relief', Zimmermann himself admitted his authorship.
Overnight the mid-Western isolationist press dropped its pacifist
posture. The Chicago Daily Tribune said the United States could no
longer expect to keep out of 'active participation in the present
conflict.'
"On April 6th 1917, America went to war with Germany, as Wilson told a
joint session of Congress: 'The world must be made safe for
democracy.' "
David Nicholas, "Lucky Break," History Today, September 2007, pp.
56-57.
-------------------------------------------------------------------------------------------------In today's excerpt - 39-year-old Teddy Roosevelt leads the charge up
San Juan Hill. Roosevelt, who was Assistant Secretary of the Navy when
the Spanish American War started in 1898, unexpectedly resigned his
position to enlist in the Army, and displayed genuine heroism during a
key battle of the four-month-long war. Roosevelt was decidedly pro-war
at a moment when President McKinley and much of the country were
greatly concerned that the war was unnecessary and would be the
unfortunate commencement of American imperialism - and in fact, the
war resulted in the acquisition of America's first colony - the
Philippines:
"[After the explosion of the USS Maine], President William McKinley
called for 125,000 volunteers to augment the 28,000-man regular army.
Young men from every section of the country rallied to his call. They
were anxious to prove themselves equal to the task and worthy of their
place as Americans. Among the first to volunteer was the man who had
perhaps been the leading advocate for war - Theodore Roosevelt.
Everyone was astonished by this act.
"President McKinley twice attempted to change Roosevelt's mind, to no
avail. 'One of the commonest taunts directed at men like myself is
that we are armchair and parlor jingoes who wish to see others do what
we only advocate doing,' declared Roosevelt. 'I care very little for
such a taunt, except as it affects my usefulness, but I cannot afford
to disregard the fact that my power for good, whatever it may be,
would be gone if I didn't try to live up to the doctrines I have tried
to preach.' ...
"The press dubbed the [first of the three regiments
engaged] 'Roosevelt's Rough Riders' - a name T.R. did not relish
because of its obvious reference to Buffalo Bill's Wild West show and the men were anxious to see their namesake lieutenant colonel.
Many were at first unimpressed with his somewhat comical appearance,
but that quickly changed. Lieutenant Tom Hall sized him up
immediately: 'He is nervous, energetic, virile. He may wear out some
day, but he will never rust out.' ...
"Colonel [Leonard] Wood noted 'that this is the first great expedition
our country has ever sent overseas and marks the commencement of a new
era in our relations with the world.' For the men, however, there was
little thought of world politics, just much card playing and even an
occasional chorus of the Rough Rider's adopted theme song - 'There'll
Be a Hot Time in the Old Town Tonight.' ...
"[In Cuba, during the heat of the battle] an assortment of officers,
foreign observers, and journalists watched [the charge up San Juan
Hill] in amazement. The foreigners were as one in condemning the folly
of the charge. 'It is gallant, but very foolish,' said one officer.
Melancholy New York World reporter Stephen Crane was lost in the glory
of it all. 'Yes, they were going up the hill, up the hill,' Crane
wrote. 'It was the best moment of anybody's life.'
"It was certainly the best moment of Colonel Theodore Roosevelt's
life. He was the only man on horseback, but his life seemed charmed.
'No one who saw Roosevelt take that ride expected him to finish it
alive,' wrote correspondent Richard Harding Davis. 'He wore on his
sombrero a blue polka-dot handkerchief, à la Havelock, which, as he
advanced, floated out straight behind his head, like a guidon.' Like
Crane, Davis was overcome by the sheer emotion of the charge.
'Roosevelt, mounted high on horseback, and charging the rifle-pits at
a gallop and quite alone, made you feel that you would like to cheer,'
he declared."
Paul Andrew Hutton, "Theodore Roosevelt: Leading the Rough Riders
During the Spanish-American War," American History Magazine online.
-------------------------------------------------------------------------------------------------In today's encore excerpt - the calamity of the Great Depression
dwarfs the calamity of 2008, in large part because the Fed turned the
crisis of 1929 into the Great Depression by acting to contract the
money supply, helping cause U.S. output to decline by a third and
unemployment to rise to as high as 33%:
"In perhaps the most important work of American economic history ever
published, Milton Friedman and Anna Schwartz argued that it was the
Federal Reserve System that bore the responsibility for turning the
crisis of 1929 into a Great Depression. They did not blame the Fed for
the bubble itself [and] ... the New York Fed responded effectively to
the October 1929 panic by conducting large-scale (and unauthorized)
market operations (buying bonds from the financial sector) to inject
liquidity into the market. However, after [New York Fed Governor Benjamin] Strong's death from
tuberculosis in October 1928, the Federal Reserve Board in Washington
came to dominate monetary policy, with disastrous results.
"First, too little was done to counteract the credit contraction
caused by banking failures. This problem had already surfaced several
months before the stock market crash, when commercial banks with
deposits of more than $80 million suspended payments. However, it
reached critical mass in November and December 1930, when 608 banks
failed, with deposits totaling $550 million, among them the Bank of
United States, which accounted for more than a third of the total
deposits lost. The failure of merger talks that might have saved the
Bank was a critical moment in the history of the Depression.
"Secondly, ... the Fed made matters worse by reducing the amount of
credit outstanding (December 1930-April 1931). This forced more and
more banks to sell assets in a frantic dash for liquidity, driving
down bond prices and worsening the general position. The next wave of
bank failures, between February and August 1931, saw commercial bank
deposits fall by $2.7 billion, 9 per cent of the total. Thirdly, ...
the Fed raised its discount rate in two steps to 3.5 per cent, ...
driving yet more US banks over the edge: the period August 1931 to
January 1932 saw 1,860 banks fail with deposits of $1.45 billion. ...
"In the United States, output collapsed by a third. Unemployment
reached a quarter of the civilian labour force, closer to a third if a
modern definition is used. World trade shrank by two-thirds as
countries sought vainly to hide behind tariff barriers and import
quotas. ...
"The Fed's inability to avert a total of around 10,000 bank failures
was crucial not just because of the shock to consumers whose deposits
were lost or to shareholders whose equity was lost, but because of the
broader effect on the money supply and the volume of credit [which
saw] a decline in bank deposits of $5.6 billion and a decline in bank
loans of $19.6 billion, equivalent to 19 per cent of 1929 GDP."
Niall Ferguson, The Ascent of Money, Penguin, Copyright 2008 by Niall
Ferguson, pp. 158, 161-163.
-------------------------------------------------------------------------------------------------In today's excerpt - in civil wars and other widespread conflicts
within a country, Dr. Graciana del Castillo, in her landmark work
Rebuilding War-Torn States (which includes case studies of El
Salvador, Kosovo, Afghanistan and Iraq), asserts that no peace process
has ever succeeded without the reintegration of former combatants:
"One of the conditions for successful reconstruction of a country is
the disarmament, demobilization, and reintegration (DDR) of former
combatants, including all militia groups. ...
"No peace process has ever succeeded without the reintegration of
former combatants, as well as other groups affected by the conflict,
taking place in an effective manner. This is because effective
reintegration promotes security by limiting the incentives to these
groups to act as spoilers. Reintegration (such as El Salvador's land-for-arms program), however, is the longest and one of the most
expensive reconstruction activities. [Yet], reintegration is typically
neglected, as major donors shy away from open-ended commitments to the
costly social and economic programs that are often essential for
sustainable peace. Donors should consider that, without effective
reintegration, their military and security expenditure to keep the
peace may be significantly higher.
"This process is critical in supporting national reconciliation and
the promotion of peace. In December 2006, the UN launched new
Integrated Disarmament, Demobilization, and Reintegration Standards,
acknowledging the difficulty of transforming individuals scarred by
conflict into productive members of their societies. In order to
facilitate the transition, the Standards call for measures to provide
psychosocial counseling, job training, educational opportunities, and
mechanisms to promote reconciliation in the communities to which those
individuals return. ...
"Lessons from Mozambique, El Salvador, Guatemala, and many other
countries are conclusive in this respect: short-term reintegration
programs served an important purpose in providing demobilizing
soldiers with a means of survival and an alternative to banditry that
indeed helped maintain the cease-fire. ...
"It is important that the strategy have enough financial and technical
support at each stage, to make reintegration sustainable over time,
since it has proved a sine qua non for peace consolidation. ... There
can be different avenues for reintegration. Reintegration often takes
place through the agricultural sector, micro-enterprises, fellowships
for technical and university training, and even through the
incorporation of former combatants into new police forces, the
national army, or political parties."
Graciana del Castillo, Rebuilding War-Torn States, Oxford, Copyright
2008 by Graciana del Castillo, pp. 256-259.
-------------------------------------------------------------------------------------------------In today's excerpt - prostitutes in the United States a century ago
likely made in excess of $70,000 per year in today's dollars - far more
than today's prostitutes earn - and prostitutes then made up a higher
percentage of the population. The reason? Today's prostitutes face more competition from
women willing to have sex with a man for free. This conclusion stems
from the work of economist Sudhir Venkatesh on the subject of the
Chicago prostitute industry:
"It turns out that the typical street prostitute in Chicago works 13
hours a week, performing 10 sex acts during that period, and earns an
hourly wage of approximately $27. So her weekly take-home pay is
roughly $350. This includes an average of $20 that a prostitute steals
from her customers and acknowledges that some prostitutes accept drugs
in lieu of cash - usually crack cocaine or heroin, and usually at a
discount. Of all the women in Venkatesh's study, 83 percent were drug
addicts.
"Many of these women took on other, non-prostitution work, which
Venkatesh also tracked. Prostitution paid about four times more than
those jobs. But as high as that wage premium may be, it looks pretty
meager when you consider the job's downsides. In a given year, a
typical prostitute in Venkatesh's study experienced a dozen incidents
of violence. At least 3 of the 160 prostitutes who participated died
during the course of [his] study. 'Most of the violence by johns is
when, for some reason, they can't consummate or can't get erect,' says
Venkatesh. 'Then he's shamed - "I'm too manly for you" or "You're too
ugly for me!" Then the john wants his money back, and you definitely
don't want to negotiate with a man who just lost his masculinity.'
"Moreover, the women's wage premium pales in comparison to the one
enjoyed by even the low-rent prostitutes from a hundred years ago.
Compared with them, [the typical street prostitutes] are working for
next to nothing.
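[The comparison in rough numbers, using the excerpt's own figures: $27
an hour for 13 hours is about $350 a week, or roughly $17,500 over an
assumed 50-week year - perhaps a quarter of the $70,000-plus (in
today's dollars) attributed above to a prostitute of a century ago.]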
"Why has the prostitute's wage fallen so far?
"Because demand has fallen dramatically. Not the demand for sex. That
is still robust. But prostitution, like any industry, is vulnerable to
competition.
"Who poses the greatest competition to a prostitute? Simple: any woman
who is willing to have sex with a man for free.
"It is no secret that sexual mores have evolved substantially in
recent decades. The phrase 'casual sex' didn't exist a century ago (to
say nothing of 'friends with benefits'). Sex outside of marriage was
much harder to come by and carried significantly higher penalties than
it does today."
Steven D. Levitt and Stephen J. Dubner, Superfreakonomics, William
Morrow, Copyright 2009 by Steven D. Levitt and Stephen J. Dubner, pp.
29-30.
-------------------------------------------------------------------------------------------------In today's excerpt - by the estimate of journalist Philip Caputo, the
most violent city in the world is not located in Afghanistan, Iraq or
some Sub-Saharan African country, but across a river from the United
States in Juárez, Mexico. And in the almost three years since
President Felipe Calderón launched a war on drug cartels, some 14,000
people have been killed in the country of Mexico, and part of the
country is effectively under martial law:
"The U.S. government estimates that the cultivation and trafficking of
illegal drugs directly employs 450,000 people in Mexico [out of 110
million people]. Unknown numbers of people, possibly in the millions,
are indirectly linked to the drug industry, which has revenues
estimated to be as high as $25 billion a year, exceeded only by
Mexico's annual income from manufacturing and oil exports. Dr. Edgardo
Buscaglia ... concluded in a recent report that 17 of Mexico's 31
states have become virtual narco-republics, where organized crime has
infiltrated government, the courts, and the police so extensively that
there is almost no way they can be cleaned up. The drug gangs have
acquired a 'military capacity' that enables them to confront the army
on an almost equal footing. ...
"Of the many things Mexico lacks these days, clarity is near the top
of the list. It is dangerous to know the truth. Finding it is
frustrating. Statements by U.S. and Mexican government officials,
repeated by a news media that prefers simple story lines, have
fostered the impression in the United States that the conflict in
Mexico is between Calderón's white hats and the crime syndicates'
black hats. The reality is far more complicated, as suggested by this
statistic: out of those 14,000 dead, fewer than 100 have been
soldiers. Presumably, army casualties would be far higher if the war
were as straightforward as it's often made out to be. ...
"The toll includes more than 1,000 police officers, some of whom,
according to Mexican press reports, were executed by soldiers for
suspected links to drug traffickers. Conversely, a number of the
fallen soldiers may have been killed by policemen moonlighting as
cartel hit men, though that cannot be proved. Meanwhile, human-rights
groups have accused the military of unleashing a reign of terror carrying out forced disappearances, illegal detentions, acts of
torture, and assassinations - not only to fight organized crime but
also to suppress dissidents and other political troublemakers. What
began as a war on drug trafficking has evolved into a low-intensity
civil war with more than two sides and no white hats, only shades of
black. The ordinary Mexican citizen - never sure who is on what side,
or who is fighting whom and for what reason - retreats into a private
world where he becomes willfully blind, deaf, and above all, dumb. ...
"[The City of] Jurez's main product now is the corpse. Last year,
drug-related violence there claimed more than 1,600 lives, and the
toll for the first nine months of this year soared beyond 1,800, and
mounts daily. That makes Juárez, population 1.5 million, the most
violent city in the world."
Philip Caputo, "The Border of Madness," The Atlantic, December 2009,
pp. 63-69.
-------------------------------------------------------------------------------------------------In today's excerpt - an economist writes on the demand for kidney
transplants:
"The first successful kidney transplant was performed in 1954. To the
layperson, it looked rather like a miracle: someone who would surely
have died of kidney failure could now live on by having a replacement
organ plunked inside him. Where did this new kidney come from? The
most convenient source was a fresh cadaver, the victim of an
automobile accident perhaps or some other type of death that left
behind healthy organs. The fact that one person's death saved the life
of another only heightened the sense of the miraculous.
"But over time, transplantation became a victim of its own success.
The normal supply of cadavers couldn't keep up with the demand for
organs. In the United States, the rate of traffic fatalities was
declining, which was great news for drivers but bad news for patients
awaiting a lifesaving kidney. ... In Europe, some countries passed
laws of 'presumed consent'; rather than requesting that a person
donate his organs in the event of an accident, the state assumed the
right to harvest his organs unless he or his family specifically opted
out. But even so, there were never enough kidneys to go around.
"Fortunately, cadavers aren't the only source of organs. We are born
with two kidneys but need only one to live. ... Stories abounded of
one spouse giving a kidney to the other, a brother coming through for
his sister, a grown woman for her aging parent, even kidneys donated
between long-ago playground friends. But what if you were dying and
didn't have a friend or relative willing to give you a kidney? One
country, Iran, was so worried about the kidney shortage that it
enacted a program many other nations would consider barbaric. It
sounded like the kind of idea some economist might have dreamed up:
the Iranian government would pay people to give up a kidney, roughly
$1,200, with an additional sum paid by the kidney recipient.
"In the United States, meanwhile, during a 1983 congressional hearing,
an enterprising doctor named Barry Jacobs described his own pay-for-organs plan. His company, International Kidney Exchange, Ltd., would
bring Third World citizens to the United States, remove one of their
kidneys, give them some money, and send them back home. Jacobs was
savaged for even raising the idea. His most vigorous critic was a
young Tennessee congressman named Al Gore, who wondered if these
kidney harvestees 'might be willing to give you a cut-rate price just
for the chance to see the Statue of Liberty or the Capitol or
something.'
"Congress promptly passed the National Organ Transplant Act, which
made it illegal 'for any person to knowingly acquire, receive, or
otherwise transfer any human organ for valuable consideration for use
in human transplantation.' ...
"And what about U.S. organ-donation policy? ... There are currently
80,000 people in the United States on a waiting list for a new kidney,
but only some 16,000 transplants will be performed this year. This gap
grows larger every year. More than 50,000 people on the list have died
over the past twenty years, with at least 13,000 more falling off the
list as they became too ill to have the operation. ... This has led
some people to call for a well-regulated market in human organs ...
but this proposal has so far been greeted with widespread
repugnance. ...
"Recall, meanwhile, that Iran established a similar market nearly
thirty years ago. Although this market has its flaws, anyone in Iran
needing a kidney transplant does not have to go on a waiting list. The
demand for transplantable kidneys is being fully met."
Steven D. Levitt & Stephen J. Dubner, Superfreakonomics, William
Morrow, Copyright 2009 by Steven D. Levitt & Stephen J. Dubner, pp.
124-125.
-------------------------------------------------------------------------------------------------In today's encore excerpt - the football huddle is invented at a
college for the deaf - Gallaudet University in Washington, DC - as a
means of hiding signals from other deaf teams. It is institutionalized
at the University of Chicago as a means of bringing control and
Christian fellowship to the game:
"When Gallaudet played nondeaf clubs or schools, Hubbard merely used
hand signals - American Sign Language - to call a play at the line of
scrimmage, imitating what was done in football from Harvard to
Michigan. Both teams approached the line of scrimmage. The signal
caller - whether it was the left halfback or quarterback - barked out
the plays at the line of scrimmage. Nothing was hidden from the
defense. There was no huddle.
"Hand signals against nondeaf schools gave Gallaudet an advantage. But
other deaf schools could read Hubbard's sign
language. So, beginning in 1894, Hubbard came up with a plan. He
decided to conceal the signals by gathering his offensive players in a
huddle prior to the snap of the ball. ... Hubbard's innovation in 1894
worked brilliantly. 'From that point on, the huddle became a habit
during regular season games,' cites a school history of the football
program. ...
"In 1896, the huddle started showing up on other college campuses,
particularly the University of Georgia and the University of Chicago.
At Chicago, it was Amos Alonzo Stagg, the man credited with nurturing
American football into the modern age and barnstorming across the
country to sell the game, who popularized the use of the huddle and
made the best case for it. ...
"At the time, coaches were not permitted to send in plays from the
sideline. So, while Stagg clearly understood the benefit of concealing
the signals from the opposition, he was more interested in the huddle
as a way of introducing far-reaching reforms to the game. Before
becoming a coach, Stagg wanted to be a minister. At Yale, he was a
divinity student from 1885 to 1889.
Thoughtful, pious, and righteous, Stagg brought innovations to football
as an attempt to bring a Christian fellowship to the game. He wanted
his players to play under control, to control the pace, the course,
and the conduct of what had been a game of mass movement that often
broke out into fisticuffs. Stagg viewed the huddle as a vital aspect
of helping to teach sportsmanship. He viewed the huddle as a kind of
religious congregation on the field, a place where the players could,
if you will, minister to each other, make a plan, and promise to keep
faith in that plan and one another."
Sal Paolantonio, How Football Explains America, Triumph, Copyright 2008
by Sal Paolantonio, pp. 38-41.
-------------------------------------------------------------------------------------------------In today's excerpt - in the late 1990s, even as the major U.S. oil
companies merged to get larger, their influence waned in the face of
foreign national oil companies. Of the world's twenty largest oil
companies, fifteen are state-owned:
"[The need for significantly larger investments in oil exploration and
development] created the imperative for what became known as
restructuring. The majors combined to become supermajors. BP merged
with Amoco to become BPAmoco, and then merged with ARCO, and emerged
as a much bigger BP. Exxon and Mobil - once Standard Oil of New Jersey
and Standard Oil of New York - became ExxonMobil. Chevron and Texaco
came together as Chevron. Conoco combined with Phillips to be
ConocoPhillips. In Europe, what had once been the two separate French
national champions, Total and Elf Aquitaine, plus the Belgian company
Petrofina, combined to emerge as Total. Only Royal Dutch Shell,
already of supermajor status on its own, remained as it was. ... With
all these mergers, the landscape of the international oil industry
changed. ...
"It turned out that the restructuring of the world oil industry that
had started with the emergence of the supermajors at the end of the
1990s was only the beginning. One more merger - of Norway's Statoil
and Norsk Hydro - created StatoilHydro, a new supermajor, although
partly state-owned. But the balance between companies and governments
has shifted dramatically. Altogether, all the oil that the supermajors
produce for their own account is less than 15 percent of total world
supplies. Over 80 percent of world reserves are controlled by
governments and their national oil companies. Of the world's twenty
largest oil companies, fifteen are state-owned. Thus, much of what
happens to oil is the result of decisions of one kind or another made
by governments. And overall, the government-owned national oil
companies have assumed a preeminent role in the world oil
industry. ...
"Saudi Aramco - the successor to Aramco, now state-owned - remains by
far the largest upstream oil company in the world, single-handedly
producing about 10 percent or more of the world's entire oil with a
massive deployment of technology and coordination. The major Persian
Gulf producers for the most part control their production, as do the
traditional state companies in Venezuela, Mexico, Algeria, and many
other countries. The Chinese companies - partly state-owned, partly
owned by shareholders around the world - continue to produce the
majority of oil in China but have also become increasingly active and
visible in the international arena. So have Indian companies. The
Russian industry is led by state-controlled giants Gazprom and Rosneft
and by privately held companies, such as Lukoil and TNK-BP, that are
majors in their own right.
"Petrobras, the Brazilian national oil company, is 68 percent owned by
investors and 32 percent by the Brazilian government, though the
government retains the majority of the voting shares. Petrobras had
already established itself at the forefront in terms of capabilities
in exploring for and developing oil in the challenging deep waters
offshore. Beginning with the Tupi find in 2006, potentially very large
discoveries are being made in what had heretofore been inaccessible
resources in Brazil's deep waters, below salt deposits. These
discoveries could make Petrobras - and Brazil - into a new powerhouse
of world oil. Malaysia's Petronas had turned itself into a significant
international company, operating in 32 countries outside Malaysia.
State companies in other countries in the former Soviet Union - KazMunayGas in Kazakhstan and SOCAR in Azerbaijan - have also emerged
as important players. While Qatar is an oil exporter, its massive
natural gas reserves put it at the forefront of the liquefied natural
gas industry (LNG) and, along with Algeria's Sonatrach and other
exporters, at the center of growing global trade in natural gas."
Daniel Yergin, The Prize, new 2009 epilogue, Free Press, Copyright
1991, 1992, 2009 by Daniel Yergin, pp. 765-770.
-------------------------------------------------------------------------------------------------In today's encore excerpt - the neural and chemical basis of love. Why
doesn't passionate love last? - because we develop a chemical
tolerance:
"Anthropologist Helen Fisher ... has devoted much of her career to
studying the biochemical pathways of love in all its manifestations:
lust, romance, attachment, the way they wax and wane ... [In her
studies] when each subject looked at his or her loved one, the parts
of the brain linked to reward and pleasure - the ventral tegmental
area and the caudate nucleus - lit up. ... Love lights up the caudate
nucleus because it is home to a dense spread of receptors for a
neurotransmitter called dopamine ... which creates intense energy,
exhilaration, focused attention ... [thus] love makes you bold, makes
you bright, makes you run real risks, which you sometimes survive, and
sometimes you don't. ...
"Researchers have long hypothesized that people with obsessive-compulsive
disorder (OCD) have a serotonin 'imbalance.' Drugs like
Prozac seem to alleviate OCD by increasing the amount of this
neurotransmitter available at the juncture between neurons.
[Researchers] compared the lovers' serotonin levels with those from
the OCD group and another group who were free from both passion and
mental illness. Levels of serotonin in both the obsessives' blood and
the lovers' blood were 40 percent lower than those in normal
subjects. ... Translation: Love and mental illness may be difficult to
tell apart. ...
"Why doesn't passionate love last? ... Biologically speaking, the
reasons romantic love fades may be found in the way our brains respond
to the surge and pulse of dopamine ... cocaine users describe the
phenomenon of tolerance: the brain adapts to the excessive input of
the drug ... From a physiological point of view, [couples move] from
the dopamine-drenched state of romantic love to the relative quiet of
the oxytocin-induced attachment. Oxytocin is a hormone that promotes a
feeling of connection, bonding."
Lauren Slater, "Love: The Chemical Reaction," National Geographic,
February 2006, pp. 35-45
-------------------------------------------------------------------------------------------------In today's encore excerpt - three kinds of love: attachment love,
caregiver's love, and sex:
"In the terrain of the human heart, scientists tell us, at least three
independent but interrelated brain systems are at play, all moving us
in their own way. To untangle love's mysteries, neuroscience
distinguishes between neural networks for attachment, for caregiving,
and for sex. Each is fueled by a differing set of brain chemicals and
hormones, and each runs through a disparate neuronal circuit. Each
adds its own chemical spice to the many varieties of love.
"Attachment determines who we turn to for succor; these are the people
we miss the most when they are absent. Caregiving gives us the urge to
nurture the people for whom we feel most concern. When we are
attached, we cling; when we are caregiving we provide. And sex is,
well, sex. ...
"The forces of affection that bind us to each other preceded the rise
of the rational brain. Love's reasons have always been subcortical,
though love's execution may require careful plotting. ... The three
major systems for loving - attachment, caregiving, and sexuality - all
follow their own complex rules. At a given moment any one of these
three can be ascendant - say, as a couple feels a warm togetherness,
or when they cuddle their own baby, or while they make love. When all
three of these love systems are operating, they feed romance at its
richest: a relaxed, affectionate, and sensual connection where rapport
blossoms. ...
"Neuroscientist Jaak Pansepp ... finds a neural corollary between the
dynamics of opiate addiction and the dependence on the people for whom
we feel our strongest attachments. All positive interactions with
people, he proposes, owe [at least] part of their pleasure to the
opioid system, the very circuitry that links with heroin and other
addictive substances. ... Even animals, he finds, prefer to spend time
with those in whose presence they have secreted oxytocin and natural
opioids, which induce a relaxed serenity - suggesting that these brain
chemicals cement our family ties and friendships as well as our love
relationships."
Daniel Goleman, Social Intelligence, Bantam, Copyright 2006 by Daniel
Goleman, pp. 189-193.
-------------------------------------------------------------------------------------------------In today's encore excerpt - in the chivalrous twelfth century,
relationships and sex, viewed as dutiful and dispassionate under the
Church, begin to emerge as rapturous and transcendent. The new age of
courtly love sweeps through the courts of Europe and engenders a new
genre of songs and poems. Aiding in this transformation are Eleanor of
Aquitaine (1122-1204) and the troubadours:
"The [new] game of courtly love is an elaborate blueprint for the
building of desire, as opposed to the quenching of it. The higher it
builds without fulfillment, the more perfect a lover the knight proves
himself to be. ...
"Consummated or not, courtly love is by definition adulterous. The
knight who jousts on horseback, sword in hand, competes against other
knights for a highly desirable lady. But they're not fighting for her
hand in marriage, or even for the privilege of courting her. She
already has a husband. Initially, at least, they're not even fighting
for the privilege of sleeping with her. They're fighting for the
privilege of loving her - synonymous with serving her. ...
"In 1154, Henry, Duke of Normandy, captures the English throne as
Henry II, making his wife Eleanor [of Aquitaine] a queen for the second
time - and [through her] bestowing upon the English court a resident
expert on the rules of the game. From there the ideal of love ... will
be converted into the middle-class ideal of marriage: the melding of
two minds, bodies, and hearts into one. ... Eleanor and her kin would
find it next to unimaginable that the heady quality of adultery would
one day converge with the dutiful, dispassionate quality of marriage
as they experience it.
"Maybe that's what finally enables the convergence: Love enters
marriage through the extramarital back door. As [Christian author]
C.S. Lewis noted in his study of courtly doctrine, The Allegory of Love,
'Any idealization of sexual love, in a society where marriage is
purely utilitarian, must begin by an idealization of adultery.' ...
"What troubadors bring about is the reinvention of love. They make its
pursuit desirable, even admirable. Previously, epic tales of sexual
desire ended in mutually assured destruction for all concerned. ...
[Now], to gamble all you have, even your life, on romantic rapture
becomes the route to transcendence. The most memorable romantic lovers
of courtly literature - Tristan and Isolde, Lancelot and Guinevere,
Troilus and Cressida - meet tragic ends, but noble ones. They martyr
themselves for the glory of the faith. The new religion of love is a
wedge to the future."
Susan Squire, I Don't, Bloomsbury, Copyright 2008 by Susan Squire, pp.
151-159.
-------------------------------------------------------------------------------------------------In today's excerpt - the combined population of Europe, the United
States, and Canada (often referred to simply as "The West") has
declined from 33 percent of the world's population in 1913 to just 17
percent in 2003, as these countries' fertility rates have declined and
many European countries now have shrinking populations:
"At the beginning of the eighteenth century, approximately 20 percent
of the world's inhabitants lived in Europe (including Russia). Then,
with the Industrial Revolution, Europe's population boomed, and
streams of European emigrants set off for the Americas. By the eve of
World War I, Europe's population had more than quadrupled. In 1913,
Europe had more people than China, and the proportion of the world's
population living in Europe and the former European colonies of North
America had risen to over 33 percent.
"But this trend reversed after World War I, as basic health care and
sanitation began to spread to poorer countries. In Asia, Africa, and
Latin America, people began to live longer, and birthrates remained
high or fell only slowly. By 2003, the combined populations of Europe,
the United States, and Canada accounted for just 17 percent of the
global population. In 2050, this figure is expected to be just 12
percent - far less than it was in 1700. (These projections, moreover,
might even understate the reality because they reflect the 'medium
growth' projection of the UN forecasts, which assumes that the
fertility rates of developing countries will decline while those of
developed countries will increase. In fact, many developed countries
show no evidence of increasing fertility rates.) ...
"According to the economic historian Angus Maddison, Europe, the
United States, and Canada together produced about 32 percent of the
world's GDP at the beginning of the nineteenth century. By 1950, that
proportion had increased to a remarkable 68 percent of the world's
total output (adjusted to reflect purchasing power parity). This
trend, too, is headed for a sharp reversal. The proportion of global
GDP produced by Europe, the United States, and Canada fell from 68
percent in 1950 to 47 percent in 2003 and will decline even more
steeply in the future. ...
"The year 2010 will likely be the first time in history that a
majority of the world's people live in cities rather than in the
countryside. Whereas less than 30 percent of the world's population
was urban in 1950, according to UN projections, more than 70 percent
will be by 2050. Lower-income countries in Asia and Africa are
urbanizing especially rapidly, as agriculture becomes less labor
intensive and as employment opportunities shift to the industrial and
service sectors. Already, most of the world's urban agglomerations - Mumbai (population 20.1 million), Mexico City (19.5 million), New
Delhi (17 million), Shanghai (15.8 million), Calcutta (15.6 million),
Karachi (13.1 million), Cairo (12.5 million), Manila (11.7 million),
Lagos (10.6 million), Jakarta (9.7 million) - are found in low-income
countries. Many of these countries have multiple cities with over one
million residents each: Pakistan has eight, Mexico 12, and China more
than 100."
Jack A. Goldstone, "The New Population Bomb," Foreign Affairs, Jan/Feb
2010, pp. 32-33, 38.
-------------------------------------------------------------------------------------------------In today's encore excerpt - early British colonizers of America in the
1600s and 1700s needed laborers for their new colonies:
"They needed a compliant, subservient, preferably free labour force
and since the indigenous peoples of America were difficult to enslave
they turned to their own homeland to provide. They imported Britons
deemed to be 'surplus' people - the rootless, the unemployed, the
criminal and the dissident - and held them in the Americas in various
forms of bondage for anything from three years to life. ... In the
early decades, half of them died in bondage.
"Among the first to be sent were children. Some were dispatched by
impoverished parents seeking a better life for them. But others were
forcibly deported. In 1618, the authorities in London began to sweep
up hundreds of troublesome urchins from the slums and, ignoring
protests from the children and their families, shipped them to
Virginia. ... It was presented as an act of charity: the 'starving
children' were to be given a new start as apprentices in America. In
fact, they were sold to planters to work in the fields and half of
them were dead within a year. Shipments of children continued from
England and then from Ireland for decades. Many of these migrants were
little more than toddlers. In 1661, the wife of a man who imported
four 'Irish boys' into Maryland as servants wondered why her husband
had not brought 'some cradles to have rocked them in' as they were 'so
little.'
"A second group of forced migrants from the mother country were those,
such as vagrants and petty criminals, whom England's rulers wished to
be rid of. The legal ground was prepared for their relocation by a
highwayman turned Lord Chief Justice, who argued for England's jails
to be emptied in America. Thanks to men like him, 50,000 to 70,000
convicts (or maybe more) were transported to Virginia, Maryland,
Barbados and England's other American possessions before 1776. ...
"A third group were the Irish. ... Under Oliver Cromwell's ethniccleansing policy in Ireland, unknown numbers of Catholic men, women
and children were forcibly transported to the colonies. And it did not
end with Cromwell; for at least another hundred years, forced
transportation continued as a fact of life in Ireland. ...
"The other unwilling participants in the colonial labour force were
the kidnapped. Astounding numbers are reported to have been snatched
from the streets and countryside by gangs of kidnappers or 'spirits'
working to satisfy the colonial hunger for labour. Based at every
sizeable port in the British Isles, spirits conned or coerced the
unwary onto ships bound for America. ... According to a contemporary
who campaigned against the black slave trade, kidnappers were
snatching an average of around 10,000 whites a year - doubtless an
exaggeration but one that indicates a problem serious enough to create
its own grip on the popular mind."
Don Jordan and Michael Walsh, White Cargo, New York University Press,
Copyright 2007 by Don Jordan and Michael Walsh, pp. 12-14.
-------------------------------------------------------------------------------------------------In today's excerpt - binge drinking, bars, liquor licenses, new forms
of intoxication, and attempts at prohibition go back hundreds (if not
thousands) of years. In London in 1628, there were 130 churches and
over 3000 alehouses:
"In Britain, concerns over drunkenness go back a long way. The first
Licensing Act, passed in 1552, required alehouse-keepers to acquire a
license from local justices on the grounds that 'intolerable hurts and
troubles' arose from drunkenness in 'common alehouses'. The following
year rules were introduced strictly limiting the number of wine
taverns that could open in any one town. This legislative distinction
between common alehouses and more exclusive wine taverns reflected a
long-standing social stratification of drinks in Britain. Lack of
native viticulture made imported wine an elite drink, while ale, and
later beer (made with hops, which were only widely used from the 15th
century), were associated with more popular drinking cultures. ...
"For the poet and courtly aspirant George Gascoigne, drunkenness was
not an indigenous trait but the result of recent contact with heavy
drinking north Europeans. Furthermore, its spread demonstrated how
fashion could become entrenched as tradition if not swiftly
checked. ...
"Not everyone blamed Continental trends. In his pamphlet Anatomy of
Abuses, published in 1583, the Puritan Phillip Stubbes cited the
legacy of pre-Reformation festive culture, especially the 'church-ales' which had played an important role in late-medieval local
economies. From this perspective, drunkenness was linked to the
corrupting practices of a Catholic past rather than to the new-fangled
fashions of Britain's Continental neighbors. ...
"In 1628 one Richard Rawlidge complained that whereas in London there
were less than 130 churches, there were 'above thirty hundred
alehouses'. ...
"Until the late 17th century the social stratification of alcohol
tended to be expressed in relation to courtly wine, urban beer and
bucolic ale. The liberalization of the gin trade following the
accession of William III [in the early 18th century] threw an entirely
new substance into this mix. Partly designed to promote a domestic
alternative to French brandy, deregulation contributed to a staggering
rise in consumption and triggered impassioned debates on the role of
the state in regulating this new and dangerous commodity. In 1736 it
also produced a disastrous attempt at gin prohibition, which was
repealed seven years later."
James Nicholls, "Drink: The British Disease?," History Today, January
2010, Volume 60, Issue 1, pp. 10-17.
-------------------------------------------------------------------------------------------------In today's excerpt - happy couples. It turns out that how couples
handle good news may matter even more to their relationship than their
ability to support each other under difficult circumstances:
"Numerous studies show that intimate relationships, such as marriages,
are the single most important source of life satisfaction. Although
most couples enter these relationships with the best of intentions,
many break up or stay together but languish. Yet some do stay happily
married and thrive. What is their secret?
"A few clues emerge from the latest research in the nascent field of
positive psychology. Founded in 1998 by psychologist Martin E. P.
Seligman of the University of Pennsylvania, this discipline includes
research into positive emotions, human strengths and what is
meaningful in life. In the past few years positive psychology
researchers have discovered that thriving couples accentuate the
positive in life more than those who stay together unhappily or split
do. They not only cope well during hardship but also celebrate the
happy moments and work to build more bright points into their lives.
"It turns out that how couples handle good news may matter even more
to their relationship than their ability to support each other under
difficult circumstances. Happy pairs also individually experience a
higher ratio of upbeat emotions to negative ones than people in
unsuccessful liaisons do. Certain tactics can boost this ratio and
thus help to strengthen connections with others. Another ingredient
for relationship success: cultivating passion. Learning to become
devoted to your significant other in a healthy way can lead to a more
satisfying union.
"Until recently, studies largely centered on how romantic partners
respond to each other's misfortunes and on how couples manage negative
emotions such as jealousy and anger - an approach in line with
psychology's traditional focus on alleviating deficits. One key to
successful bonds, the studies indicated, is believing that your
partner will be there for you when things go wrong. Then, in 2004,
psychologist Shelly L. Gable, currently at the University of
California, Santa Barbara, and her colleagues found that romantic
couples share positive events with each other surprisingly often,
leading the scientists to surmise that a partner's behavior also
matters when things are going well.
"In a study published in 2006 Gable and her coworkers videotaped
dating men and women in the laboratory while the subjects took turns
discussing a positive and negative event. After each conversation,
members of each pair rated how 'responded to' - how understood,
validated and cared for - they felt by their partner. Meanwhile
observers rated the responses on how active-constructive (engaged and
supportive) they were - as indicated by intense listening, positive
comments and questions, and the like. Low ratings reflected a more
passive, generic response such as 'That's nice, honey.' Separately,
the couples evaluated their commitment to and satisfaction with the
relationship.
"The researchers found that when a partner proffered a supportive
response to cheerful statements, the 'responded to' ratings were
higher than they were after a sympathetic response to negative news,
suggesting that how partners reply to good news may be a stronger
determinant of relationship health than their reaction to unfortunate
incidents. The reason for this finding, Gable surmises, may be that
fixing a problem or dealing with a disappointment - though important
for a relationship - may not make a couple feel joy, the currency of a
happy pairing."
Suzann Pileggi, "The Happy Couple," Scientific American Mind, Jan/Feb
2010, pp. 34-36.
-------------------------------------------------------------------------------------------------In today's excerpt - the vicious wit of Dorothy Parker, who was the
most-quoted of the literary stars who frequented the Algonquin
Round Table:
"Mrs. Parker was born Dorothy Rothschild in West End, New Jersey, on
August 22, 1893. She was educated at Miss Dana's School in Morristown,
New Jersey, and the Sacred Heart Convent in New York, and her first
publishing job came in 1916 when Vogue, to which she had submitted
some poems, offered to hire her as a picture-caption writer at $10 a
week. Two years later, she moved to Vanity Fair, where her co-editors
were the men to whom she occasionally referred as 'the two Bobs,'
Sherwood and Benchley.
"She was fired in 1920 by Vanity Fair when three theatrical producers
protested that her reviews were too tough. ... Sherwood and Benchley
resigned immediately, feeling that the magazine should not have
yielded to the pressure. Sherwood went on to Life, and Mrs. Parker and
Benchley rented and shared an office for a while, trying to survive as
free-lance writers. The office was so tiny that Mrs. Parker said
afterward, 'If we'd had to sit a few inches closer together, we'd have
been guilty of adultery.' (The well-known story that they grew lonely
and tried to lure company by having the word MEN painted on their
office door is, unfortunately, apocryphal.)...
"The dark-haired, pretty writer was well-known for lines like, 'If all
the girls at Smith and Bennington were laid end to end, I wouldn't be
surprised,' and, 'One more drink and I'd have been under the
host.' [Her writing career was aided greatly by columnist Franklin
Pierce Adams, whom she later claimed] 'raised her from a couplet.'...
"Dorothy Parker was a cute girl but hardly lovable; her forte was
criticism which really stung. It was Dorothy Parker who, commenting on
an early and uninspired performance by Katharine Hepburn in a Broadway
play, The Lake, said that the actress 'ran the gamut of emotions from
A to B'; it was also Dorothy Parker who, feeling dislike for Countess
Margot Asquith because the Countess had written a book which seemed
too narcissistic, took care of her by commenting, 'The romance between
Margot Asquith and Margot Asquith will live as one of the great love
affairs of literature,' and adding that the book was 'in four volumes,
suitable for throwing.' She also dealt with a drama called The House
Beautiful by calling it 'the play lousy,' and, during the period in
which she reviewed books under the pseudonym of Constant Reader,
disposed of a book by A. A. Milne, whose cuteness and whimsy she
abhorred, by writing, 'Tonstant Weader fwowed up.' "
Scott Meredith, George S. Kaufman and His Friends, Doubleday,
Copyright 1974 by Playboy, pp. 139, 152-3, 33.
-------------------------------------------------------------------------------------------------In today's excerpt - the Mongols amassed the largest contiguous land-based empire in history, larger than both the Roman and the Muslim
Empires at their height, and exceeded in size only by the British
Empire, which was however not contiguous:
"The Mongols originally were a tribe from the region of Lake Baikal,
to the north of Mongolia in modern-day Russia, but by the [the 13th
century] they had become a multiethnic federation of nomadic tribes
ruled from the high Mongolian plateau: a region of bitterly cold
winters, searingly hot summers, and vast open expanses of desolate
terrain. The tribes had been united at the end of the twelfth century
by a leader originally known as Temujin, who in 1206 was declared
their undisputed leader. Temujin then proceeded to launch an
astonishing series of military campaigns that, by the time of his
death in 1227, put him in control of the largest contiguous land-based
empire in history, one that extended from China in the east to the
Caspian Sea in the west, and from Siberia in the north to northwest
India in the south. Temujin and the Mongols considered their campaigns
of conquest to have been ordained by the supreme sky god, Tenggeri,
and the whole world, they believed, was a Mongol empire-in-the-making.
'Through the power of God,' Great Khan Guyuk would announce to the
West in 1246, 'all empires from sunrise to sunset have been given to
us, and we own them.' Those who refused to submit to Mongol rule were
rebels against the divine plan, and punishment for this refusal was
often the outright slaughter of whole cities and peoples.
"The conquests led by Temujin were legendary, and to celebrate them
the Mongols posthumously bestowed on him the title Fierce Ruler, or
Chingis Khan. Today, thanks to an imperfect Arabic transliteration of
that name, he is widely known as Genghis Khan.
"Chingis Khan recognized that his people, resilient horsemen
accustomed to lives of hardship, deprivation, and perpetual motion,
were natural warriors. They traveled with huge numbers of spare
horses, and by using them in rotation managed to travel up to a
hundred miles a day - a distance far greater than any other army of
the time could travel. As nomads, they knew how to live off the land
and the peoples they conquered, but during times of privation and hard
travel they could sustain themselves by drinking the blood of their
own horses - and, if necessary, by eating them. Such practices,
coupled with the ferocity the Mongols displayed in battle, fed rumors
in Europe of the Mongols as cannibals and savages. 'The men are
inhuman and of the nature of beasts,' [an English monk] reported,
'rather to be called monsters than men, thirsting after and drinking
blood, and tearing and devouring the flesh of dogs and human beings.'
"The Mongols' savagery was calculated. They wanted their reputation to
precede them. Often, on the eve of an invasion, they would send
advance word of their mission of conquest to their adversaries and
would demand submission without a fight. Inevitably, many opponents
would acquiesce, having already heard terrifying rumors about what
would happen to them if they did not - and as a result, when the
promised invasion actually did take place, the Mongols' ranks would
already be swollen with captives. Some would be forced to fight as
foot soldiers on the front ranks. Others would be enlisted as guides,
interpreters, engineers, and spies."
Toby Lester, The Fourth Part of the World, Free Press, Copyright 2009
by Toby Lester, pp. 47-48.
-------------------------------------------------------------------------------------------------In today's excerpt - the effect of birthdate on athletic performance:
"If you visit the locker room of a world-class soccer team early in
the calendar year, you are more likely to interrupt a birthday
celebration than if you arrive later in the year. A recent tally of
the British national youth leagues, for instance, shows that fully
half of the players were born between January and March, with the
other half spread out over the nine remaining months. On a similar
German team, 52 elite players were born between January and March,
with just 4 players born between October and December.
"Why such a severe birthdate bulge? Most elite athletes begin playing
their sports when they are quite young. Since youth sports are
organized by age, the leagues naturally impose a cutoff birthdate. The
youth soccer leagues in Europe, like many such leagues, use December
31 as the cutoff date.
"Imagine now that you coach in a league for seven-year-old boys and
are assessing two players. The first one (his name is Jan) was born on
January 1, while the second one (his name is Tomas) was born 364 days
later, on December 31. So even though they are both technically
seven-year-olds, Jan is a year older than Tomas - which, at this tender
age, confers substantial advantages. Jan is likely to be bigger,
faster, and more mature than Tomas.
"So while you may be seeing maturity rather than raw ability, it
doesn't much matter if your goal is to pick the best players for your
team. It probably isn't in a coach's interest to play the scrawny
younger kid who, if he only had another year of development, might be
a star.
"And thus the cycle begins. Year after year, the bigger boys like Jan
are selected, encouraged, and given feedback and playing time, while
boys like Tomas eventually fall away. This 'relative-age effect,' as
it has come to be known, is so strong in many sports that its
advantages last all the way through to the professional ranks.
"K. Anders Ericsson, an enthusiastic, bearded, and burly Swede, is the
ringleader of a merry band of relative-age scholars scattered across
the globe. He is now a professor of psychology at Florida State
University, where he uses empirical research to learn what share of
talent is 'natural' and how the rest of it is acquired. His
conclusion: the trait we commonly call 'raw talent' is vastly
overrated. 'A lot of people believe there are some inherent limits
they were born with,' he says. 'But there is surprisingly little hard
evidence that anyone could attain any kind of exceptional performance
without spending a lot of time perfecting it.' Or, put another way,
expert performers - whether in soccer or piano playing, surgery or
computer programming - are nearly always made, not born."
Steven D. Levitt & Stephen J. Dubner, Superfreakonomics, William
Morrow, Copyright 2009 by Steven D. Levitt & Stephen J. Dubner, pp.
59-61.
-------------------------------------------------------------------------------------------------In today's excerpt - El Salvador achieved peace after a decade-long
civil war by providing the combatants, called the FMLN, with land
through an "arms-for-land" program and integrating them into the
political process as a new political party, rather than by the
essentially impossible task of eradicating the combatants:
"El Salvador [had] a GDP of about $6 billion in 1991, average income
per capita was around $1,100. ... However, the country's income and
wealth were highly concentrated. It was often heard that about 85
percent of the land belonged to 14 families. Despite an agrarian
reform program that started in 1980, in the early 1990s it was
estimated that there were about 300,000 families of campesinos (small
farmers) who still had no land. ...
"The roots of El Salvador's decade-long civil war extended deep into

the nineteenth century. As E. Torres-Rivas has pointed out, Salvadoran


society systematically generated economic marginalization, social
segregation, and political repression. Land tenure was as much a root
cause of the conflict that raged throughout the 1980s as was the
overbearing power of the armed forces. ...
"After a decade-long war, with over 100,000 estimated dead and serious
damage to human capital and physical infrastructure, a Peace Agreement
between the government of El Salvador and the Frente Farabundo Marti
para la Liberacion Nacional (FMLN), signed on January 16, 1992, in the
Chapultepec Castle in Mexico City, created high expectations. ...
Damage to the country's infrastructure as a result of the civil war
was estimated at $1.5 billion to $2.0 billion (more than 30 percent of
1990 GDP). ...
"After the conclusion of the peace agreements, El Salvador embarked on
a complex war-to-peace transition. ... The land situation in the
conflict zones was very complex. Production had been virtually
paralyzed during the war and infrastructure was seriously damaged. As
landowners abandoned or were forced off their land, landless peasants
had moved in. During the peace negotiations, the FMLN had insisted on
the legalization of the landholders' precarious tenure as a reward for
their crucial support to the FMLN's largely rural-based guerrilla
movement. The landholders were also expected to provide electoral
support for the FMLN's post-conflict political ambitions. Moreover,
the problem had to be addressed in any case, regardless of the FMLN
position, lest it remain as a potential source of instability as
landowners tried to recover their land.
"The objective of the 'arms-for-land' program - which was of central
importance to the maintenance of the ceasefire - was to provide
demobilizing combatants with the means for reintegration into the
productive economy by providing credit to potential beneficiaries to
purchase land. The agreement also contemplated supplementary short-term
programs (agricultural training, distribution of agricultural
tools, basic household goods, and academic instruction) and medium-term
programs (credit for production purposes, housing, and technical
assistance). ...
"The country moved a long way in this transition. Although successive
Salvadoran presidents have continued to be elected from the
[incumbent] Alianza Republicana Nacionalista (ARENA), in the March
2000 elections, the FMLN won 31 of 84 seats in the unicameral
legislative assembly. This was a remarkable achievement for the FMLN,
barely eight years after becoming a political party, allowing it to
block bills requiring a two-thirds majority. This moved the country
further ahead in the political transition. In the municipal elections
of 1997, the opposition had won about 80 percent of the largest
cities, including the capital. ...
"In an evaluation of the 1992-2004 period, the Inter-American
Development Bank (IADB 2005) concluded that Salvadoran society had
made a successful transition to peacetime and had gained considerable
ground in terms of stability, economic modernization, and poverty
reduction."
Graciana del Castillo, Rebuilding War-Torn States, Oxford, Copyright
2008 by Graciana del Castillo, pp. 103-119.
-------------------------------------------------------------------------------------------------In today's excerpt - in the Middle Ages, the leaders of Europe - popes, bishops, and kings - held a firm belief in the existence of a
mysterious Christian king from the Far East named Prester John. This
belief gave them comfort in the years after the Muslims captured
Jerusalem and gained dominance over the Middle East and the
Mediterranean:
"For decades Christians in the area had been nurturing a religious
fantasy, and ... fantasy was this: A wealthy and powerful Christian
priest-king from the East had for some time been making his way toward
the West. A direct descendant of the Magi, he was known as Prester
John, and he was on a mission: to unite the Christians of East and
West, to defeat Muslim armies everywhere he found them, and to take
back Jerusalem for all time. ...
"The first surviving mention of Prester John dates from 1145. It
derives from a Syrian prelate named Hugh, who had traveled to Europe
that year to make an appeal for a new Crusade. Christian forces had
successfully established themselves in the Holy Land at the turn of
the century, but those forces were now quickly losing ground to
advancing Muslim armies, and desperately needed help. Hugh [portrayed
Prester John as a potential ally.] ...
"The story of Prester John would be told repeatedly in the decades
that followed, but it might well have vanished from memory had not a
curious letter - one of the great literary hoaxes of all time - begun
to circulate in Europe in 1165, purportedly written by Prester John
himself ... The letter may have been composed to generate support for
a new Crusade or it may simply have been a joke. Whatever it was, it
seems to have been accepted as genuine, and in the decades and
centuries that followed, it would circulate widely, translated into
many different languages - and would exert a lasting influence on the
ways in which Europeans imagined the East.
"Prester John introduced himself imperiously in the letter. 'I,
Prester John, who reign supreme,' he announced, 'exceed in riches,
virtue, and power all creatures who dwell under heaven.' He ruled over
seventy-two vassal kings, he continued, and twelve archbishops; he fed
thirty thousand soldiers each day at tables made of gold, amethyst,
and emerald; he maintained a great army dedicated to protecting
Christians everywhere, and he had dedicated himself to waging
perpetual war against the enemies of Christ, ...
"Prester John was both specific and vague when it came to describing
where he was from. 'Our magnificence dominates the Three Indias,' he
wrote, 'and extends to Farther India, where the body of St. Thomas the
Apostle rests. It reaches through the desert toward the place of the
rising of the sun.' In the early Middle Ages, Europeans knew India as
little more than a distant and mysterious part of the world somewhere
to the east of the Holy Land, but by the twelfth century they had
begun to divide it into three parts: Nearer India (the northern part
of the Indian subcontinent and points east); Farther India (the
southern part of India, and the spice-producing regions of the Far
East); and Middle India (Ethiopia and other African kingdoms). The
division may seem odd to us today, but substitute 'East' for 'India'
and you get something like our own peculiar way of dividing Asia: the
Near East, the Far East, and the Middle East.
"By the early 1200s the myth of Prester John had taken root in Europe
and the Holy Land. Jerusalem by now was once again in Muslim hands,
and the idea of yet another Crusade, the fifth, was taking shape. One
[bishop] wrote in 1217, 'that there are more Christians than Muslims
living in Islamic countries. The Christians of the Orient, as far away
as the land of Prester John, have many kings, who, when they hear that
the Crusade has arrived, will come to its aid.' "
Toby Lester, The Fourth Part of the World, Free Press, Copyright 2009
by Toby Lester, pp. 49-52.
-------------------------------------------------------------------------------------------------In today's excerpt - George Soros, one of the world's wealthiest and
most successful investors, made his early fortune by betting on the
stock market's irrationality in a world that had long believed in a
rational and efficient market. Investors fell in love with
conglomerates such as LTV Corporation in the 1960s, and that love
translated into unwarranted high valuations for these companies and
soaring stock prices. For Soros, that was a golden opportunity:
"Soros's practical experience as a broker and research analyst
convinced him that the normal market state was, in fact,
disequilibrium. As an investor, he finds this assumption more useful
than one of market rationality, because it is a better pointer to
profit opportunities. [One] of his early investment successes [was]
crucial to the evolution of his thinking.
"[It] was related to the conglomerate movement in the second half of

the 1960s. The flurry of company takeovers, Soros saw, merely


exploited investors' tendency to rate companies by trends in earnings
per share (EPS). Start with modestly sized Company A, and engineer a
debt-financed acquisition of B, a much larger, stodgy company with
stagnant revenues, a modest EPS, and a low market price. Merge B into
A, and retire B's stock, and the resulting combined A/B will have a
much higher debt load, but a much smaller stock base. So long as B's
earnings more than cover the new debt service, the combined A/B will
show a huge jump in per-share revenues and earnings. Uncritical
investors then push up A/B's stock price, which helps finance new
acquisitions. Jim Ling was one of the early exploiters of the
strategy, parlaying a modest Dallas electronics company into a
sprawling giant (LTV) with dozens of companies, spanning everything
from steel to avionics, meatpacking, and golf balls.
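[To make the arithmetic concrete, here is a minimal sketch in Python.
Every figure in it - Company A's and B's earnings, their share counts,
B's share price, and the 7 percent interest rate - is invented for
illustration; none is drawn from the actual LTV case.]

    # Hypothetical illustration of EPS accretion in a debt-financed merger.
    # All figures below are invented, not LTV's actual numbers.
    a_earnings, a_shares = 10_000_000, 5_000_000  # Company A: $10M earnings, 5M shares
    b_earnings = 40_000_000                       # Company B: larger but stodgy
    b_shares, b_price = 40_000_000, 8.0           # B's stock is cheap relative to earnings

    eps_before = a_earnings / a_shares            # A's EPS: $2.00

    debt = b_shares * b_price                     # borrow $320M to buy and retire B's stock
    interest = debt * 0.07                        # assumed 7% annual debt service

    combined_earnings = a_earnings + b_earnings - interest
    eps_after = combined_earnings / a_shares      # share base is still A's 5M shares

    print(f"EPS before the merger: ${eps_before:.2f}")  # $2.00
    print(f"EPS after the merger:  ${eps_after:.2f}")   # $5.52

[Nothing about the underlying businesses has improved; the jump from
$2.00 to $5.52 comes entirely from retiring B's shares while keeping
B's earnings - exactly the pattern that uncritical EPS-watchers
rewarded.]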
"Business schools justified the scam by theorizing that conglomerates
deserved higher share prices because their diversified business mix
would deliver smoother and steadier earnings. It was the kind of dumb
idea that underscores the disconnect between business schools and real
business. If shareholders want earnings diversification, of course,
they can quickly and easily diversify stock portfolios in the market.
Big conglomerates, in fact, probably warrant lower stock prices.
They're hard to manage, are often run by financial operators, and
usually carry outsized debt loads.
"Soros understood that the conglomerate game made no sense, but he
also recognized its strong following among market professionals. 'I
respect the herd,' he told me, not because it's right, but because
'it's like the ocean.' Even the stupidest idea may warrant investment,
in other words, if it has a grip on the market's imagination. So Soros
invested heavily in conglomerates, riding up the stock curve until he
sensed it was nearing a top. Then he took his winnings and switched to
the short side, enjoying a second huge payday on the way down."
Charles R. Morris, The Sages, Public Affairs, Copyright 2009 by
Charles R. Morris, pp. 10-11.
-------------------------------------------------------------------------------------------------In today's excerpt - human brains have almost all of their 100 billion
neurons in place at birth, with some 250,000 being born every minute
during gestation, and these brains are almost identical to all other
human brains, since even a slight variation can be lethal:
The trajectory by which a fusion of human sperm and ovum results, over
nine months' gestation, in some 3-4 kilos of baby, fully equipped with
internal organs, limbs, and a brain with most of its 100 billion
neurons in place, is relatively easy to describe, even when it is hard
to explain.

All humans are alike in very many respects, all are different in some.
(No two individuals, not even monozygotic twins, are entirely
identical, even at birth.) Yet chemically, anatomically and
physiologically there is astonishingly little obvious variation to be
found between brains, even from people from widely different
populations. Barring gross developmental damage, the same structures
and substances repeat in every human brain, from the chemistry of
their neurotransmitters to the wrinkles on the surface of the cerebral
cortex. Humans differ substantially in size and shape, and so do our
brains, but when a correction is made for body size, then our brains
are closely matched in mass and structure, though men's brains are
slightly heavier on average than are women's. So similar are they
though, that imagers using PET (positron emission tomography) and MRI
(magnetic resonance imaging) have been able to develop algorithms by
which they can transform and project the image derived from any
individual into a 'standard' brain. Brains are so finely tuned to
function, so limited by constraints, that anything more than
relatively minor variation is simply lethal.
Of no body organ is the developmental sequence more simultaneously
dramatic and enigmatic than the brain. How to explain the complexity
and apparent precision with which individual neurons are born, migrate
to their appropriate final sites, and make the connections which
ensure that the newborn on its arrival into the outside world has a
nervous system so fully organized that the baby can already see, hear,
feel, voice her needs, and move her limbs? The fact that this is
possible implies that the baby at birth must have most of her
complement of neurons already in place - if not the entire 100
billion, then getting on for that number. If we assume a steady birth
of cells over the whole nine months - although of course in reality
growth is much more uneven, with periodic growth spurts and lags - it
would mean some 250,000 nerve cells being born every minute of every
day over the period. As if this figure is not extraordinary enough,
such is the density of connections between these neurons that we must
imagine up to 30,000 synapses a second being made over the period for
every square centimeter of newborn cortical surface. And to this rapid
rate of production must be added that of the glia, packing the white
matter below the cortex and surrounding the neurons within it - though
admittedly they do not reach their full complement by birth but
continue to be generated throughout life.
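[Rose's per-minute figure is easy to verify with a back-of-the-envelope
calculation, assuming, as he does, a steady rate of cell birth over
roughly nine 30-day months; a quick check in Python:]

    # Sanity-check the ~250,000 neurons-per-minute figure.
    neurons = 100e9                  # ~100 billion neurons in place at birth
    minutes = 9 * 30 * 24 * 60       # nine ~30-day months, in minutes (388,800)
    print(f"{neurons / minutes:,.0f} neurons born per minute")  # ~257,202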
Steven Rose, The Future of the Brain, Oxford, Copyright 2005 by Steven
Rose, pp. 57-63.
-------------------------------------------------------------------------------------------------In today's excerpt - by 1400 A.D., Rome's population had declined from
over a million people at the height of the Empire, to a mere 20,000
people - and Christian tourists shunned Roman artifacts in favor of
such relics as the finger bone of St. Thomas:
"In the early 1400s the Eternal City must have been, in most respects,
a wretchedly uninspiring sight, a parent that the Florentines may well
have wished to disown. A million people had dwelled in Rome during the
height of the Empire, but now the city's population was less than that
of Florence. The Black Death of 1348 had reduced numbers to 20,000,
from which, over the next fifty years, they rose only slightly. Rome
had shrunk into a tiny area inside its ancient walls, retreating from
the seven hills to huddle among a few streets on the bank of the Tiber
across from St. Peter's, whose walls were in danger of collapse. Foxes
and beggars roamed the filthy streets. Livestock grazed in the Forum,
now known as il Campo Vaccino, 'the Field of Cows.'
"Other monuments had suffered even worse fates. The Temple of Jupiter
was a dunghill, and both the Theater of Pompey and the Mausoleum of
Augustus had become quarries from which the ancient masonry was
scavenged, some of it for buildings as far away as Westminster Abbey.
Many ancient statues lay in shards, half buried, while others had been
burned in kilns to make quicklime or else fertilizer for the feeble
crops. Still others were mangers for asses and oxen. The funerary
monument of Agrippina the Elder, the mother of Caligula, had been
turned into a measure for grain and salt.
"Rome was a dangerous and unappealing place. There were earthquakes,
fevers, and endless wars, the latest of which, the War of the Eight
Saints, witnessed English mercenaries laying waste to the city. There
was no trade or industry apart from the pilgrims who arrived from all
over Europe, clutching copies of Mirabilia urbis romae (The wonders of
Rome), which told them which relics to see during their stay. This
guidebook directed them to such holy sights as the finger bone of St.
Thomas in Santa Croce in Gerusalemme, the arm of St. Anne and the head
of the Samaritan woman converted by Christ in San Paolo fuori le Mura,
or the crib of the infant Savior in Santa Maria Maggiore. There was a
hucksterish atmosphere to the city: pardoners sold indulgences from
stalls in the street, and churches advertised confessions that were
supposedly good for a remission of infernal torture for a grand total
of 8,000 years.
"The Mirabilia urbis romae did not direct the attention of the
pilgrims to the Roman remains that surrounded them. To such pious
Christians these ancient ruins were so much heathen idolatry. Worse,
they were stained with the blood of Christian martyrs. The Baths of
Diocletian, for example, were built with the forced labor of early
Christians, many of whom had died during the construction. Antique
images that had survived a millennium of earthquakes, erosion, and
neglect were therefore deliberately trampled underfoot, spat on, or
thrown to the ground and smashed to pieces."
Ross King, Brunelleschi's Dome, Penguin, Copyright 2000 by Ross King,
pp. 22-23.
-------------------------------------------------------------------------------------------------In today's excerpt - the mind 'at rest' is often more active than, and
at the least almost as active as, the mind when it is engaged in a task:
"Many neuroscientists have long assumed that much of the neural
activity inside your head when at rest matches your subdued, somnolent
mood. In this view, the activity in the resting brain represents
nothing more than random noise, akin to the snowy pattern on the
television screen when a station is not broadcasting. But recent
analysis produced by neuroimaging technologies has revealed something
quite remarkable: a great deal of meaningful activity is occurring in
the brain when a person is sitting back and doing nothing at all.
"It turns out that when your mind is at rest - when you are
daydreaming quietly in a chair, say, [or] asleep in a bed or
anesthetized for surgery - dispersed brain areas are chattering away
to one another. And the energy consumed by this ever active messaging,
known as the brain's default mode, is about 20 times that used by the
brain when it responds consciously to an outside stimulus. Indeed,
most things we do consciously, be it sitting down to eat dinner or
making a speech, mark a departure from the baseline activity of the
brain default mode. ...
"Further analyses indicated that performing a particular task
increases the brain's energy consumption by less than 5 percent of the
underlying baseline activity. A large fraction of the overall activity
- from 60 to 80 percent of all energy used by the brain - occurs in
circuits unrelated to any external event. With a nod to our astronomer
colleagues, our group came to call this intrinsic activity the brain's
dark energy, a reference to the unseen energy that also represents the
mass of most of the universe. ...
"In the mid-1990s we noticed quite by accident that, surprisingly,
certain brain regions experienced a decreased level of activity from
the baseline resting state when subjects carried out some task. These
areas - in particular, a section of the medial parietal cortex (a
region near the middle of the brain involved with remembering personal
events in one's life, among other things) - registered this drop when
other areas were engaged in carrying out a defined task such as
reading aloud. Befuddled, we labeled the area showing the most
depression MMPA, for 'medial mystery parietal area.'
"A series of experiments then confirmed that the brain is far from
idling when not engaged in a conscious activity. In fact, the MMPA as
well as most other areas remains constantly active until the brain
focuses on some novel task, at which time some areas of intrinsic
activity decrease. At first, our studies met with some skepticism. In
1998 we even had a paper on such findings rejected because one referee
suggested that the reported decrease in activity was an error in our
data. The circuits, the reviewer asserted, were actually being
switched on at rest and switched off during the task. Other
researchers, however, reproduced our results for both the medial
parietal cortex and the medial prefrontal cortex (involved with
imagining what other people are thinking as well as aspects of our
emotional state). Both areas are now considered major hubs of the
brain's default mode network."
Marcus E. Raichle, "The Brain's Dark Energy," Scientific American,
March 2010, pp. 44-47.
-------------------------------------------------------------------------------------------------In today's encore excerpt - as the twentieth century unfolded,
Virginia Woolf, F. Scott Fitzgerald, James Joyce and other artists
reacted to the unprecedented and accelerating pace of change by
repudiating the past and grasping for something new:
"The heroic daring of [the new] century lay in its conviction of
absolute, unprecedented novelty. This is what the exhilarating notion
of modernity meant: canceling all the accumulated wisdom of our
forebears. ... Valiantly eager for the future, the Bauhaus instructor
Oskar Schlemmer decreed in 1929 that 'One should act as if the world
had just been created.'
"A new-born universe called for fresh tenants. Virginia Woolf
accordingly reported, as if she were pinpointing an actual, verifiable
event, that 'on or about December 1910 human character changed.' Rites
of passage made this enigmatic transformation visible. How do human
beings usually announce an altered identity? By changing the way they
wear their hair. Men who wanted to be ruthlessly modern shaved their
skulls, like the Russian revolutionary poet Vladimir Mayakovsky or
Johannes Itten, an instructor at the Bauhaus in Weimar. In the hirsute
nineteenth century, sages - aspiring to the shagginess of Old
Testament prophets - grew beards. For the glowering, bullet-headed
Mayakovsky, the cranium was a projectile, made more aerodynamic by
being rid of hair. For Itten, shaving announced his priestly
dedication to the new world which the designers at the Bauhaus
intended to build. ...
"Women had their own equivalent to those drastic masculine acts of
self-mutilation. In 1920 F. Scott Fitzgerald wrote a story, 'Bernice
Bobs Her Hair,' about a timid provincial girl for whom bobbing is a
transition between two periods of life and two historical epochs. The
new style ejects her from Madonna-like girlhood, when she was
protectively cocooned in tresses, and announces her sexual maturity.
Bernice fearfully acknowledges the revolutionary antecedents of the
process. Driving downtown to the men's barber-shop where the operation
will be performed, she suffers 'all the sensations of Marie Antoinette
bound for the guillotine in a tumbril;' the barber with his shears is
an executioner. The French revolutionaries sliced off the heads of
bewigged aristocrats in order to destroy an old world. Bernice,
however, has her own hair chopped to fit her for membership of a new
society: bobbing conferred erotic allure on girls who were previously
dismissed as wallflowers. ...
"James Joyce's Ulysses in 1922 testified to the change in human
character announced by Virginia Woolf. Bodies now did things which, at
least according to literature, they had never done before. A man
ponders his own bowel movement, relishing its sweet smell. Later in
the day he surreptitiously masturbates in a public place and takes
part in a pissing contest, proud of the arc his urine describes. A
woman has a noisily affirmative orgasm, or perhaps more than one. The
same people did not think in paragraphs or logical, completed
sentences, like characters in nineteenth-century novels. Their mental
life proceeded in associative jerks and spasms; they mixed up shopping
lists with sexual fantasies, often forgot verbs and (in the woman's
case) scandalously abandoned all punctuation. The modern mind was not
a quiet, tidy cubicle for cogitation. It thronged with as many random
happenings as a city street; it contained scraps and fragments, dots
and dashes, like the incoherent blizzard of marks on a modern canvas
which could only be called an 'impression' because it represented
nothing recognizable."
Peter Conrad, Modern Times, Modern Places, Knopf, Copyright 1998 by
Peter Conrad, pp. 14-15.
-------------------------------------------------------------------------------------------------In today's excerpt - when Fidel Castro took over Cuba, he found that
he needed allies to counterbalance the threat of the U.S.
Astonishingly, this quest led the tiny and poor country of Cuba to
places as far afield as Ethiopia, Yemen, Angola, Guinea-Bissau and
Algeria, adventures which led Cuba to be perceived as a champion of the
Third World, but which ultimately drained the country and ended in
failure:
"Fidel Castro's need for allies accelerated still further after the
crisis of October 1962 when the Soviet premier Khrushchev withdrew the
nuclear missiles he had installed in Cuba without even consulting
Castro, whose faith in the Soviets was badly shaken.
"Latin America seemed to offer hope. ... During 1962 Cuba sent
expeditions to lead or support guerrilla movements in Latin America.
The most important went to Che's native Argentina in June 1963. The
rebels planned to establish a foco which Che himself would join. After
nine harrowing months they were wiped out by the Argentine army. This
was a blow for [Castro's second-in-command Ernesto] 'Che' Guevara.
Discouraged by their Latin experience, the Cubans turned to a
continent ripe for revolution: Africa.
"Cuba's first friend in Africa was Algeria, whose uprising against the
French, which started in 1954, seemed to offer a parallel to Cuba's
own revolution. ...
"Che [also] found an ideal situation in the turbulent ex-Belgian Congo
(later Zaire, now the Democratic Republic of Congo). ... Hundreds of
young Congolese went to Cuba for free schooling and training and, when
the military revolted against [President] Massamba-Debat, the Cubans
saved him. By 1966 the Cuban force had grown to 1,000, serving
primarily as the presidential guard. Nevertheless, Massamba-Debat
succumbed to a coup in April 1968 and Che's dreams [there] collapsed.
"The Cubans had better luck in Guinea-Bissau where revolutionaries
under Amilcar Cabral were the best organized and disciplined in
Portuguese Africa. ...
"After his failure in the Congo, Che had turned his attention back to
Latin America where Bolivia seemed to offer ideal conditions for
revolution: poverty, instability, remote mountain terrain and borders
with the most important countries of Latin America. Che and his small
force set off in October 1966. ... His campaign was a disaster that
culminated in his capture and execution on October 8th, 1967. Efforts
to stir revolution in Guatemala, Venezuela and Colombia collapsed soon
after.
"Cuba's most ambitious involvement in Africa came [in] Angola, the

richest and most strategically important of the colonies, which was to


become independent on November 11th, 1975. ... By the end of 1975 the
Cuban sea and airlift had transported more than 25,000 troops to
Angola and the Soviets had finally joined in, providing heavy weapons
and coordinating closely with the Cubans. They could do so because the
US Congress had prohibited President Ford from any further
intervention in Angola. Castro was deeply and personally involved in
all this. ... The last Cuban troops withdrew in June 1991 after 15
years in Angola. They left behind some 4,000 dead and suffered another
10,000 wounded.
"During these years Castro was involved in another part of Africa. In
1977, on his way to Angola, he had visited Marxist South Yemen where
he tried to mediate the growing tension between Ethiopia and Somalia
over control of the Ogaden region, which belonged to Ethiopia but was
inhabited by Somalis. ... In January 1978 Raul Castro flew secretly to
Ethiopia and Moscow and worked out plans for a coordinated operation:
16,000 Cubans were transported by the Soviets who provided the heavy
weapons. Cuban troops remained in Ethiopia, though in diminishing
numbers, until 1989. ...
"There are no happy endings to this story. Castro's role as champion
of the Third World never recovered from his support of the Soviet
invasion of Afghanistan in 1979. Ten years later, Cuba's General
Ochoa, hero of the Angolan and Ethiopian campaigns, was accused of
embezzlement and executed after a Soviet-style show trial; his real
offence was criticizing Castro's incessant interference in military
operations. Guinea-Bissau, where the Cubans had been successful,
suffered coups, civil wars and assassinations that left it one of the
poorest countries in the world. Che's friend Laurent Kabila took over
Zaire in 1997 only to be assassinated four years later. By then, an
exceptionally violent civil war consumed the country; it still simmers
in the eastern parts that Che had hoped to liberate. Ethiopia's
Mengistu turned out to be a bloody tyrant and was thrown out in 1991;
the succeeding regime fought a war with Eritrea in 1999-2000. Finally,
Angola: after the Cubans withdrew, the regime sloughed off the thin
skin of Marxism and called relatively free elections [but then] became
a one-party state, a kleptocracy ranked as one of the most corrupt
places on earth, where the elite flourished while the mass of the
population remained mired in poverty."
Clive Foss, "Cuba's African Adventure," History Today, Volume 60,
Issue 3, pp. 10-16.
-------------------------------------------------------------------------------------------------In today's excerpt - expressing anger amplifies aggression:
"People often opine that releasing anger is healthier than bottling it

up. In one survey, 66 percent of university undergraduates agreed that


expressing pent up anger is a good way of tamping down aggression.
This belief dates back at least to Aristotle, who observed that
viewing tragic plays affords the opportunity for catharsis, a
cleansing of anger and other negative emotions.
"Popular media also assure us that anger is a monster we must tame by
'letting off steam,' 'blowing our top' and 'getting things off our
chest.' [That] advice echoes the counsel of many self-help authors.
One suggested that rather than 'holding in poisonous anger,' it is
better to 'punch a pillow or a punching bag. And while you do it, yell
and curse and moan and holler.' Some popular therapies encourage
clients to scream, hit pillows or throw balls against walls when they
get angry. Practitioners of Arthur Janov's 'primal therapy,' popularly
called primal scream therapy, believe that psychologically disturbed
adults must bellow at the top of their lungs or somehow otherwise
release the emotional pain stemming either from the trauma of birth or
from childhood neglect or suffering.
"Yet more than 40 years of research reveals that expressing anger
actually amplifies aggression. In one study, people who pounded nails
after someone insulted them became more critical of that person than
did their counterparts who did not pound nails. Other research shows
that playing aggressive sports, such as football, actually boosts
self-reported hostility. And a review of 35 studies by psychologist
Craig Anderson of Iowa State University and psychologist Brad Bushman
of the University of Michigan at Ann Arbor suggests that playing
violent video games such as Manhunt, in which participants rate
assassinations on a five-point scale, heightens aggression in the
laboratory and in everyday social situations.
"Psychologist Jill Littrell of Georgia State University concludes from
a published review of the literature that expressing anger is helpful
only when accompanied by constructive problem solving or communication
designed to reduce frustration or address the immediate source of the
anger. So if we are upset with our partner for repeatedly ignoring our
feelings, shouting at him or her is unlikely to make us feel better,
let alone improve the situation. But calmly and assertively expressing
our resentment ('I realize you probably aren't being insensitive on
purpose, but when you act that way, I don't feel close to you') can
often take the sting out of anger.
"Why is this myth so popular? People probably attribute the fact that
they feel better after expressing anger to catharsis, rather than to
the anger subsiding on its own, which it almost always does. Odds are,
they would have felt better if they had merely waited out their
anger."
Scott O. Lilienfeld, Steven Jay Lynn, John Ruscio and Barry L.
Beyerstein, "Busting Big Myths in Popular Psychology," Scientific
American Mind, March/April 2010, pp. 44-45.
-------------------------------------------------------------------------------------------------In today's excerpt - the collapse of a long-standing empire has very
often occurred in a very short span of time:
"What is most striking about [Rome's] history is the speed of the
Roman Empire's collapse. In just five decades, the population of Rome
itself fell by three-quarters. Archaeological evidence from the late
fifth century - inferior housing, more primitive pottery, fewer coins,
smaller cattle - shows that the benign influence of Rome diminished
rapidly in the rest of western Europe. What [Oxford historian Brian]
Ward-Perkins calls 'the end of civilization' came within the span of a
single generation.
"Other great empires have suffered comparably swift collapses. The
Ming dynasty in China began in 1368, when the warlord Zhu Yuanzhang
renamed himself Emperor Hongwu, the word hongwu meaning 'vast military
power.' For most of the next three centuries, Ming China was the
world's most sophisticated civilization by almost any measure. Then,
in the mid-seventeenth century, political factionalism, fiscal crisis,
famine, and epidemic disease opened the door to rebellion within and
incursions from without. In 1636, the Manchu leader Huang Taiji
proclaimed the advent of the Qing dynasty. Just eight years later,
Beijing, the magnificent Ming capital, fell to the rebel leader Li
Zicheng, and the last Ming emperor hanged himself out of shame. The
transition from Confucian equipoise to anarchy took little more than a
decade.
"In much the same way, the Bourbon monarchy in France passed from
triumph to terror with astonishing rapidity. French intervention on
the side of the colonial rebels against British rule in North America
in the 1770s seemed like a good idea at the time - a chance for
revenge after Great Britain's victory in the Seven Years' War a decade
earlier - but it served to tip French finances into a critical state.
In May 1789, the summoning of the Estates-General, France's
long-dormant representative assembly, unleashed a political chain
reaction
that led to a swift collapse of royal legitimacy in France. Only four
years later, in January 1793, Louis XVI was decapitated by
guillotine. ...
"The sun set on the British Empire almost as suddenly. In February
1945, Prime Minister Winston Churchill was at Yalta, dividing up the
world with U.S. President Franklin Roosevelt and Soviet Premier Joseph
Stalin. As World War II was ending, he was swept from office in the
July 1945 general election. Within a decade, the United Kingdom had
conceded independence to Bangladesh, Bhutan, Burma, Egypt, Eritrea,
India, Iran, Israel, Jordan, Libya, Madagascar, Pakistan, and Sri

Lanka. The Suez crisis in 1956 proved that the United Kingdom could
not act in defiance of the United States in the Middle East, setting
the seal on the end of empire. Although it took until the 1960s for
independence to reach sub-Saharan Africa and the remnants of colonial
rule east of the Suez, the United Kingdom's [centuries old] age of
hegemony was effectively over less than a dozen years after its
victories over Germany and Japan.
"The most recent and familiar example of precipitous decline is, of
course, the collapse of the Soviet Union. With the benefit of
hindsight, historians have traced all kinds of rot within the Soviet
system back to the Brezhnev era and beyond. Perhaps, as the historian
and political scientist Stephen Kotkin has argued, it was only the
high oil prices of the 1970s that 'averted Armageddon.' But this did
not seem to be the case at the time. In March 1985, when Mikhail
Gorbachev became general secretary of the Soviet Communist Party, the
CIA estimated the Soviet economy to be approximately 60 percent the
size of the U.S. economy. This estimate is now known to have been
wrong, but the Soviet nuclear arsenal was genuinely larger than the
U.S. stockpile. And governments in what was then called the Third
World, from Vietnam to Nicaragua, had been tilting in the Soviets'
favor for most of the previous 20 years. Yet less than five years
after Gorbachev took power, the Soviet imperium in central and Eastern
Europe had fallen apart, followed by the Soviet Union itself in 1991.
If ever an empire fell off a cliff - rather than gently declining - it
was the one founded by Lenin."
Niall Ferguson, "Complexity and Collapse," Foreign Affairs, March/April
2010, pp. 28-30.
-------------------------------------------------------------------------------------------------In today's excerpt - American strategy to combat terrorist groups such
as al Qaeda has centered on finding and removing the leaders of these
groups, a strategy known as "decapitation." A rigorous analysis of all
298 such cases of leadership decapitation in terrorist groups from
1945 to 2004 suggests that this may be an unproductive strategy - that
these leadership gaps are quickly filled and that groups become more
virulent as a result compared to similar groups where this strategy is
not employed:
"Immediately following the killing of Abu Musab al-Zarqawi, President
George W. Bush announced that a 'severe blow' had been dealt to al
Qaeda. Leadership decapitation is not limited to U.S. counterterrorism
efforts. The arrests of the Shining Path's Abimael Guzman and the
Kurdistan Workers' Party's (PKK) Abdullah Ocalan are commonly cited as
examples of successful decapitation. Israel has consistently targeted
the leaders of HAMAS. The arrest of Basque Homeland and Freedom's
(ETA) leader Francisco Mugica Garmendia was seen as likely to result in
ETA's collapse, but authorities determined that the organization was
much more complicated than they had assumed. The recent arrests of two
ETA leaders in May and November of 2008 have been characterized by
Spanish Prime Minister Jose Luis Rodriguez Zapatero as a 'definitive
operation in the fight against ETA.'
"Despite a tremendous amount of optimism toward the success of
decapitation, there is very little evidence on whether and when
removing leaders will result in organizational collapse. Moreover,
there are inconsistencies among current studies of decapitation. A
core problem with the current literature and a primary reason for
discrepancy over the effectiveness of decapitation is a lack of solid
empirical foundations. In order to develop an empirically grounded
assessment of leadership targeting, this study examines variation in
the success of leadership decapitation by developing a comprehensive
dataset of 298 cases of leadership decapitation from 1945-2004. The
overarching goal of this article is to explain whether decapitation is
effective. ...
"Optimism toward the success of decapitation is based primarily on
theories of charismatic leadership. ... Social network analysis, which
is rooted in sociological studies of organizational dynamics, would
predict more variability in the success of decapitation. ...
"A [terrorist] group's age, size, and type are all important
predictors of when decapitation is likely to be effective. The data
indicate that as an organization becomes larger and older,
decapitation is less likely to result in organizational collapse.
Furthermore, religious groups are highly resistant to attacks on their
leadership, while ideological organizations are much easier to
destabilize through decapitation.
"Second, the data also show that decapitation is not an effective
counterterrorism strategy. Decapitation does not increase the
likelihood of organizational collapse beyond a baseline rate of
collapse for groups over time. The marginal utility for decapitation
is actually negative. Groups that have not had their leaders targeted
have a higher rate of decline than groups whose leaders have been
removed. Decapitation is actually
counterproductive, particularly for larger, older, religious, or
separatist organizations.
"Finally, in order to determine whether decapitation hindered the
ability of an organization to carry out terrorist attacks, I looked at
three cases in which decapitation did not result in a group's
collapse. The results were mixed over the extent to which decapitation
has resulted in organizational degradation. While in some cases
decapitation resulted in fewer attacks, in others the attacks became
more lethal in the years immediately following incidents of
decapitation. I argue that these results are largely driven by a
group's size and age.
"Ultimately, these findings indicate that our current
counterterrorism strategies need rethinking. The data show that
independent of other measures, going after the leaders of older,
larger, and religious groups is not only ineffective, it is
counterproductive. Moreover, the decentralized nature of many current
terrorist organizations has proven to be highly resistant to
decapitation and to other counterterrorism measures."
Jenna Jordan, "When Heads Roll: Assessing the Effectiveness of
Leadership Decapitation," Security Studies, 18: 719-755, 2009,
Copyright Taylor and Francis Group, LLC, ISSN: 0963-6412 print/
1556-1852 online.
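
Jordan's "marginal utility" here is simply the collapse rate of
decapitated groups minus the baseline collapse rate of comparable
untargeted groups. A minimal Python sketch of that calculation, using
invented counts rather than figures from the study:

    # Toy illustration of the "marginal utility" of decapitation.
    # All counts below are invented placeholders, not figures from the study.
    decapitated = {"collapsed": 30, "survived": 70}
    untargeted = {"collapsed": 40, "survived": 60}

    def collapse_rate(counts):
        return counts["collapsed"] / (counts["collapsed"] + counts["survived"])

    baseline = collapse_rate(untargeted)   # 0.40
    treated = collapse_rate(decapitated)   # 0.30
    print(f"baseline collapse rate: {baseline:.0%}")
    print(f"post-decapitation rate: {treated:.0%}")
    print(f"marginal utility: {treated - baseline:+.0%}")  # -10%, i.e. counterproductive

With numbers like these, targeted groups collapse less often than the
baseline - which is what a negative marginal utility means.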
-------------------------------------------------------------------------------------------------In today's excerpt - historians argue that the situation in
Afghanistan today, with a central government with little real power
beyond the capital city, and relatively independent (war)lords in the
territories beyond the capital, parallels the situation in most
European countries at around 1600 AD:
"Up until the seventeenth century, the European continent was divided
into many small political units with vague and porous borders. Where
kings reigned, they usually were only titular leaders with little
power outside a capital city. They had little contact with, or even
direct impact on, their supposed subjects. The dominant authority
figures in most people's lives were religious leaders or local
notables, and popular identities were based on religion, locality, or
community rather than anything that could truly be called nationality.
Christian clergy exerted immense social, cultural, and political
influence, and the church carried out many of the functions normally
associated with states today, such as running schools and hospitals or
caring for the poor.
"Responsibility for security, meanwhile, lay chiefly with local or
regional nobility, who maintained private fortresses, arsenals, and
what would now be called militias or paramilitary forces. Political
life in this prestate era was brutal: warfare, banditry, revolts, and
religious and communal conflict were widespread. Even in England,
where authority was centralized earlier and more thoroughly than
elsewhere in Europe, one-fifth of all dukes met unnatural, violent
deaths during the seventeenth century.
"Around 1600, however, many European kings began to centralize
authority. Their efforts were fiercely resisted by those with the most
to lose from the process -- namely, local political and religious
elites. ... Through the sixteenth century, France was essentially a
collection of loosely affiliated communities with independent
institutions, customs, and even languages. It was primarily during the
reigns of Louis XIII (1610-43) and, especially, Louis XIV (1643-1715)
that the monarchy expanded its armed forces, legal authority, and
bureaucracy and took control of the country.
"This process was remarkably conflictual. Its first several decades
were marked by peasant revolts, religious wars, and the obstinate
resistance of provincial authorities, which culminated in the series
of conflicts known as the Fronde (1648-53) and threatened to plunge
the country into complete chaos. Louis XIV eventually defeated the
recalcitrant nobles and local leaders on the battlefield, but the
costs of victory were so high that he decided to complete the process
of centralizing power by co-opting his remaining rivals rather than
crushing them.
"During the second half of the seventeenth century, accordingly, he
and his ministers focused on buying off and winning over key
individuals and social groups that might otherwise obstruct their
state-building efforts. Adapting and expanding a common practice, for
example, they repeatedly sold state offices to the highest bidders; by
the eighteenth century, almost all the posts in the French government
were for sale, including those dealing with the administration of
justice. These offices brought annual incomes, a license to extract
further revenues from the population at large, and exemptions from
various impositions. The system had drawbacks in terms of technocratic
effectiveness, but it also had compensating benefits for the crown:
selling off public posts was an easy way to raise money and helped
turn members of the gentry and the emerging bourgeoisie into
officeholders. Rather than depending on local or personal sources of
revenue, these new officeholders eventually developed new interests
connected to the broader national system. ...
"Cardinals Richelieu and Mazarin, the chief ministers to Louis XIII
and XIV, respectively, studied the relationships that allowed local
elites to control their underlings and reward their supporters and
tried to supplant those with new relationships centered on the king
and his ministers. They selected provincial brokers who had excellent
contacts in far-flung areas but whose loyalties were to Paris and then
gave these brokers money and benefits that could be channeled to
others in turn, thus expanding the reach of the crown throughout the
periphery.
"Another tactic designed to secure the state's authority was the
construction of Louis XIV's glittering palace at Versailles, which was
officially established as the seat of the French court in 1682. The
luxury of the palace was more than merely a celebration of the wealth
and power of the Sun King; it was also a crucial weapon in his battle
to domesticate the obstreperous French nobility. Louis XIV made the
aristocracy's presence at Versailles a key prerequisite for their
obtaining favor, patronage, and power. By assembling many of the most
important local notables at his court, he was able to watch over them
closely while separating them from their local power bases. The
tradeoff was clear: in return for abandoning their local authority and
autonomy, nobles were given handsome material rewards and the
opportunity to participate in the court's luxurious lifestyle."
Sheri Berman, "From the Sun King to Karzai," Foreign Affairs, March/
April 2010.
-------------------------------------------------------------------------------------------------"Take the idea that it is wrong to say If a student comes before I get
there, they can slip their test under my office door, because student
is singular and they is plural. Linguists traditionally observe that
esteemed writers have been using they as a gender-neutral pronoun for
almost a thousand years. As far back as the 1400s, in the Sir Amadace
story, one finds the likes of Iche mon in thayre degree ("Each man in
their degree").
"Maybe when the sentence is as far back as Middle English, there is a
sense that it is a different language on some level than what we speak
- the archaic spelling alone cannot help but look vaguely maladroit.
But Shakespeare is not assumed to have been in his cups when he wrote
in The Comedy of Errors, 'There's not a man I meet but doth salute
me / As I were their well-acquainted friend' (Act IV, Scene iii).
Later, Thackeray in Vanity Fair tosses off 'A person can't help their
birth.' ...
"Or there's the objection to nouns being used as verbs. These days,
impact comes in for especial condemnation: The new rules are impacting
the efficiency of the procedure. People lustily express that they do
not 'like' this, endlessly writing in to language usage columnists
about it. Or one does not 'like' the use of structure as in I
structured the test to be as brief as possible.
"Well, okay--but that means you also don't 'like' the use of view,
silence, worship, copy, outlaw, and countless other words that started
as nouns and are now also verbs. Nor do many people shudder at the use
of fax as a verb....
"Over the years, I have gotten the feeling that there isn't much
linguists can do to cut through this. ... There are always books out
that try to put linguists' point across. Back in 1950, Robert Hall's
Leave Your Language Alone! was all over the place, including a late
edition kicking around in the house I grew up in. Steven Pinker's The
Language Instinct, which includes a dazzling chapter on the grammar
myths, has been one of the most popular books on language ever
written. As I write, the flabbergastingly fecund David Crystal has
just published another book in the tradition, The Fight for English:
How Language Pundits Ate, Shot, and Left. But the air of frustration
in Crystal's title points up how persistent the myths are. ...
"English is shot through with things that don't really follow. I'm the
only one, amn't I? Shouldn't it be amn't after all? Aren't, note, is
'wrong' since are is used with you, we, and they, not I. There's no 'I
are.' Aren't I? is thoroughly illogical - and yet if you decided to
start saying amn't all the time, you would lose most of your friends
and never get promotions. Except, actually, in parts of Scotland and
Ireland where people actually do say amn't - in which case the rest of
us think of them as 'quaint' rather than correct!"
John McWhorter, Our Magnificent Bastard Tongue, Gotham, Copyright 2008
by John McWhorter, pp. 65-69, 80
-------------------------------------------------------------------------------------------------In today's excerpt - new hope for the regeneration of tissue in
humans, based on experiments with mice and similar in type to the
regeneration of tissue and limbs in creatures like newts, flatworms,
and sponges:
"A quest that began over a decade ago with a chance observation has
reached a milestone: the identification of a gene that may regulate
regeneration in mammals. The absence of this single gene, called p21,
confers a healing potential in mice long thought to have been lost
through evolution and reserved for creatures like flatworms, sponges,
and some species of salamander.
"In a report published in the Proceedings of the National Academy of
Sciences, researchers from The Wistar Institute demonstrate that mice
that lack the p21 gene gain the ability to regenerate lost or damaged
tissue.
"Unlike typical mammals, which heal wounds by forming a scar, these
mice begin by forming a blastema, a structure associated with rapid
cell growth and de-differentiation as seen in amphibians. According to
the Wistar researchers, the loss of p21 causes the cells of these mice
to behave more like embryonic stem cells than adult mammalian cells,
and their findings provide solid evidence to link tissue regeneration
to the control of cell division.
" 'Much like a newt that has lost a limb, these mice will replace
missing or damaged tissue with healthy tissue that lacks any sign of
scarring,' said the project's lead scientist Ellen Heber-Katz, Ph.D.,
a professor in Wistar's Molecular and Cellular Oncogenesis program.
'While we are just beginning to understand the repercussions of these
findings, perhaps, one day we'll be able to accelerate healing in
humans by temporarily inactivating the p21 gene.'
"Heber-Katz and her colleagues used a p21 knockout mouse to help solve
a mystery first encountered in 1996 regarding another mouse strain in
her laboratory. MRL mice [a strain of mouse that exhibits remarkable
regenerative abilities for a mammal], which were being tested in an
autoimmunity experiment, had holes pierced in their ears to create a
commonly used life-long identification marker. A few weeks later,
investigators discovered that the earholes had closed without a trace.
While the experiment was ruined, it left the researchers with a new
question: Was the MRL mouse a window into mammalian regeneration? ...
"[Researchers] found that p21, a cell cycle regulator, was
consistently inactive in cells from the MRL mouse ear. P21 expression
is tightly controlled by the tumor suppressor p53, another regulator
of cell division and a known factor in many forms of cancer. The
ultimate experiment was to show that a mouse lacking p21 would
demonstrate a regenerative response similar to that seen in the MRL
mouse. And this indeed was the case. As it turned out, p21 knockout
mice had already been created, were readily available, and widely used
in many studies. What had not been noted was that these mice could
heal their ears.
" 'In normal cells, p21 acts like a brake to block cell cycle
progression in the event of DNA damage, preventing the cells from
dividing and potentially becoming cancerous,' Heber-Katz said. 'In
these mice without p21, we do see the expected increase in DNA damage,
but surprisingly no increase in cancer has been reported.' "
Science Daily, March 16, 2010, based on the article "Lack of p21
expression links cell cycle control and appendage regeneration in
mice," by Khamilia Bedelbaeva, Andrew Snyder, Dmitri Gourevitch, Lise
Clark, Xiang-Ming Zhang, John Leferovich, James M. Cheverud, Paul
Lieberman, and Ellen Heber-Katz, from Proceedings of the National
Academy of Sciences of the United States of America,
doi:10.1073/pnas.1000830107.
-------------------------------------------------------------------------------------------------In today's excerpt - does the U.S. produce too many scientists?
"For years, Americans have heard blue-ribbon commissions and major
industrialists bemoan a shortage of scientists caused by an inadequate
education system. A lack of high-tech talent, these critics warn, so
threatens the nation's continued competitiveness that the U.S. must
drastically upgrade its K-12 science and math education and import
large numbers of technically trained foreigners by promptly raising
the current limit on the number of skilled foreigners allowed to enter
the country to work in private industry. ...
"But many ... prominent labor economists disagree. 'There is no
scientist shortage,' says Harvard University economist Richard
Freeman, a leading expert on the academic labor force. The great lack
in the American scientific labor market, he and other observers argue,
is not top-flight technical talent but attractive career opportunities
for the approximately 30,000 scientists and engineers - about 18,000
of them American citizens - who earn PhDs in the U.S. each year. ...
"The competition for science faculty jobs is so intense that every
advertised opening routinely attracts hundreds of qualified
applicants. Most PhDs hired into faculty-level jobs get so-called
'soft-money' posts, dependent on the renewal of year-to-year funding
rather than the traditional tenure-track positions that offer longterm security. ...
"Despite these realities, ... 'almost no one in Washington' recognizes
the 'glut' of scientists, nor the damage that lack of opportunity is
doing to the incentives that formerly attracted many of America's most
gifted young people to seek scientific and engineering careers, he
says. ...
"One thing that's not in short supply are scientifically talented
American students, whose academic achievements have been increasing
rather than declining in recent years. 'Students emerging from the
oft-criticized K-12 system appear to be studying science and math
subjects more and performing better in them, over time,' said Michael
Teitelbaum, labor economist at the Alfred P. Sloan Foundation in
Congressional testimony in November 2007. 'Nor are [they] lagging far
behind comparable students in economically competitive countries, as
is oft asserted.' The number of Americans earning PhDs in science and
technical fields has risen by 18 percent since 1985, according to the
authoritative Scientific and Engineering Indicators 2008, published by
the National Science Board. ...
"Arguments for the shortage based on the inadequacy of American
education generally begin with the results of standardized tests used
in international comparisons. Average scores for K-12 students in the
U.S. never top those lists in either science or math (although they do
in both reading and civics). On one widely cited assessment, Trends in
International Math and Science Study (TIMSS), which tested American
third and eighth graders between 1995 and 2003 and American 12th
graders in 1995 and 1999, U.S. students ranked between fifth and 12th
in math and science - results bemoaned by many as dangerously
deficient.
"But a detailed study of students' performance on TIMSS as well as on
the Programme for International Student Assessment (PISA), another
widely reported international comparison test, by B. Lindsay Lowell of
Georgetown University's Institute for the Study of International
Migration and Hal Salzman of the Urban Institute in Washington, D.C.,
suggests otherwise. 'Their point is that the average performance of
U.S. students on these comparative international tests is not a
meaningful number,' Teitelbaum says. Far from trailing the developed
world in science education, as some claim, 'on PISA, the U.S. has more
high-scoring kids in science than any other country' and nearly as
many in the top math category as top-scoring Japan and Korea, Salzman
says. ...
"Scientists are not generally recruited from the average students,
[and] raising America's average scores on international comparisons
is, therefore, not a matter of repairing a broken educational system
that performs poorly overall, as many critiques suggest, but rather of
improving the performance of the children at the bottom,
overwhelmingly from low-income families ... This discrepancy, of
course, is a vital national need and responsibility, but it does not
reflect an overall insufficient supply of able science students. Nor
do American students lose interest in science once they reach
college. ...
"The root of the problem, many believe, is [that research] has been
done largely at the nation's universities and paid for through
competitive, temporary grants awarded to individual professors by
federal funding agencies such as the National Institutes of Health
[which now dispenses more than $28 billion a year and is the largest
funder of non-military research on the planet] and the National
Science Foundation, ... 'while other countries have permanent ways of
staffing their labs,' often with PhD staff scientists in career
positions, says Georgia State University economist Paula Stephan, an
authority on the academic labor force."
Beryl Lieff Benderly, "Does the U.S. Produce Too Many Scientists?,"
Scientific American, February 22, 2010.
-------------------------------------------------------------------------------------------------In today's excerpt - a fraction of a second:
"What happens in subsections of seconds? In a tenth of a second, we
find the proverbial 'blink of an eye,' for that's how long the act
takes. In a hundredth of a second, a hummingbird can beat its wings
once. ... A millisecond, 10^-3 seconds, is the time it takes a typical
camera strobe to flash. Five-thousandths of a second is also the time
it takes a Mexican salamander ... to snag its prey.
"In one microsecond, 10-6 seconds, nerves can send a message from that
pain in your neck to your brain. On the same scale, we can illuminate
the vast difference between the speed of light and that of sound: in
one microsecond, a beam of light can barrel down the length of three
of our metric-resistant football fields, while a sound wave can barely
traverse the width of a human hair.
"Yes, time is fleeting, so make every second and every partitioned
second count, including nanoseconds, or billionths of a second, or
10^-9 seconds. Your ordinary computer certainly does. In a nanosecond,
the time it takes you to complete one hundred-millionth of an eye
blink, a standard microprocessor can perform a simple operation:
adding together two numbers. ... The fastest computers perform their
calculations in picoseconds, or trillionths of a second, that is,
10^-12 seconds. ...
"Ephemera, however, are all relative. When physicists, with the aid of
giant particle accelerators, manage to generate traces of a subatomic
splinter called a heavy quark, the particle persists for a picosecond
before it decays adieu. Granted, a trillionth of a second may not
immediately conjure Methuselah or Strom Thurmond to mind, but Dr.
[Robert] Jaffe observed that the quark fully deserves its
classification among physicists as a long-lived, 'stable' particle.
During its picosecond on deck, the quark completes a trillion, or
10^12, extremely tiny orbits. By contrast our seemingly indomitable
Earth has completed a mere 5 x 10^9 orbits around the sun in its 5
billion years of existence, and is expected to tally up only maybe
another 10 billion laps before the solar system crumples and dies. ...
In a very real sense, then, our solar system is far less 'stable' than
particles like the heavy quark. ...
"Scaling down to an even less momentous moment, we greet the
attosecond, a billionth of a billionth of a second, or 10^-18 seconds.
The briefest events that scientists can clock, as opposed to
calculate, are measured in attoseconds. It takes an electron
twenty-four attoseconds to complete a single orbit around a hydrogen
atom - a
voyage that the electron makes about 40,000 trillion times per second.
There are more attoseconds in a single minute than there have been
minutes since the birth of the universe.
"Still, physicists keep coming back to the nicking of time. In the
1990s, they inducted two new temporal units into the official lexicon,
which are worth knowing for their appellations alone: the zeptosecond,
or 10^-21 seconds, and the yoctosecond, or 10^-24 seconds. The briskest
time span recognized to date is the chronon, or Planck time, and it
lasts about 5 x 10^-44 seconds. This is the time it takes light to
travel what could be the shortest possible slice of space, the Planck
length, the size of one of the hypothetical 'strings' that some
physicists say lie at the base of all matter and force in the
universe."
Natalie Angier, The Canon, Houghton Mifflin, Copyright 2007 by Natalie
Angier, pp. 77-78.
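
The excerpt's arithmetic is easy to check. The short Python sketch
below verifies two of its claims, assuming rounded constants (the
universe's age is taken as roughly 13.7 billion years, the speed of
sound in air as about 343 m/s):

    # Rough check of two claims in the excerpt; constants are approximate.
    SPEED_OF_LIGHT = 299_792_458     # m/s
    SPEED_OF_SOUND = 343             # m/s in air, approximate
    UNIVERSE_AGE_YEARS = 13.7e9      # commonly cited estimate
    MICROSECOND, ATTOSECOND = 1e-6, 1e-18

    # More attoseconds in a minute than minutes since the birth of the universe:
    attoseconds_per_minute = 60 / ATTOSECOND                        # ~6.0e19
    minutes_since_big_bang = UNIVERSE_AGE_YEARS * 365.25 * 24 * 60  # ~7.2e15
    assert attoseconds_per_minute > minutes_since_big_bang

    # Light versus sound over one microsecond:
    print(f"light: {SPEED_OF_LIGHT * MICROSECOND:.0f} m")         # ~300 m
    print(f"sound: {SPEED_OF_SOUND * MICROSECOND * 1e3:.2f} mm")  # ~0.34 mm

Light's ~300 meters per microsecond is indeed about three 100-yard
football fields, while sound's ~0.34 millimeters is on the scale of a
few hair-widths.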

-------------------------------------------------------------------------------------------------In today's excerpt - truffles, the revered culinary delicacies, are
symbiotic with trees, and evolved their rich aromas as an enticement
to foraging animals to aid in their spore dispersal:
"Throughout history, truffles have appeared on the menu and in
folklore. The Pharaoh Khufu served them at his royal table. Bedouins,
Kalahari Bushmen and Australian Aborigines have hunted them for
countless generations in deserts. The Romans savored them and thought
they were produced by thunder. Modern epicures prize truffles for
their earthy aroma and flavor and are willing to pay steep prices at
the market - recently more than $3,000 per kilogram for the Italian
white variety. Yet despite humanity's abiding interest in the fungi,
much about their biology has remained veiled in mystery. ...
"Truffles, like mushrooms, are the fruit of fungi. These fleshy organs
are temporary reproductive structures that produce spores, which
eventually germinate and give rise to new offspring. What sets
truffles apart from mushrooms is that their spore-laden fruit forms
below ground rather than above. ...
"All truffles and mush rooms produce networks of filaments, or hyphae,
that grow between plant rootlets to form a shared absorptive organ
known as a mycorrhiza. Thus joined, the fungus provides the plant with
precious nutrients and water, its tiny hyphae able to reach into
pockets of soil inaccessible to the plant's much larger roots. The
plant, in turn, furnishes its consort with sugars and other nutrients
that it generates through photosynthesis - products that the fungus
needs but cannot produce on its own because it does not
photosynthesize. So beneficial is this partnership that nearly all
trees and other woody plants require it for survival, as do the
associated fungi. Most herbaceous plants (those that do not have a
permanent woody stem aboveground) form mycorrhizae too, albeit with
different fungi. ...
"Given that truffles require aboveground dispersal of their spores to
propagate, why would natural selection favor the evolution of species
that hide underground? Consider the reproductive tactic of mushrooms.
Mushrooms ... all have fruiting bodies that can discharge spores
directly into the air. ... It is a highly effective approach.
"The mushroom strategy is not foolproof, how ever. Most mushrooms have
little defense against environmental hazards such as heat, drying
winds, frost and grazing animals. Every day a few spores mature and
are discharged. But if inclement weather dries or freezes a mushroom,
spore production usually grinds to a halt.
"Where such hazards are commonplace, new evolutionary adaptations have

arisen. The most successful alternative has been for the fungus to
fruit underground. Once the soil is wet enough for the subterranean
fruiting body to form, it is insulated from vagaries of weather. The
truffle develops with relative impunity, continuing to produce and
nurture its spores even when aboveground conditions become intolerable
to mushrooms. At first glance, the truffle's solution might seem
facile. The form of a truffle is visibly less complex than that of a
mushroom. No longer does the fungus need to expend the energy required
to push its spore-bearing tissues aboveground. The truffle is but a
lump of spore-bearing tissue, usually enclosed by a protective skin.
"The problem is that the truffles cannot them selves liberate their
spores, trapped as they are in their underground realm. That feat
demands an alternative dispersal system. And therein lies the
complexity of the truffle's scheme. Over millions of years, as
truffles retreated underground, mutations eventually led to the
formation of aromatic compounds attractive to animals. Each truffle
species has its own array of aromatics that are largely absent in
immature specimens but intensify and emerge as the spores mature. ...
When an animal [is attracted by the aroma and] eats a truffle, most of
the flesh is digested, but the spores pass through unharmed and are
defecated on the ground, where they can germinate if the conditions
are right."
James M. Trappe and Andrew W. Claridge, "The Hidden Life of Truffles,"
Scientific American, April 2010, pp. 78-81.
-------------------------------------------------------------------------------------------------In today's excerpt - in the United States, the President is elected by
an Electoral College, the bizarre and contorted invention of the
framers of the Constitution, intended as a compromise between those
who objected to the legislature electing the President and those who
objected to the people electing the President. This unsatisfying
arrangement was partially overcome later, as a party system emerged and
as states began to mandate that their Electors cast their ballots
solely for the candidates who won the most votes in that state:
"The delegates [to the Constitutional Convention in Philadelphia]
spent much of the next week and a half in a puzzled discussion of how
to elect or appoint the single person who would wield the 'executive
power.' There were two obvious possibilities - election by the
legislature or by the people - and one contrived alternative that grew
attractive whenever the defects of the other methods became apparent,
which they quickly did. Election by the legislature had the advantage
of leaving the choice up to the nation's best-informed leaders. But
because the framers were intent on making the president politically
independent of the legislature, the victorious candidate could serve
only a single term, lest he become a toady to a dominant faction -
which would seem to deny the republic the potential benefit of
experience gained in office.
"Popular election posed two major problems. First, it would clearly
favor candidates from northern states, because with a single national
constituency, the enslavement of much of the southern population would
always make the free citizens of the North a majority of the
electorate. More important, the framers worried that voters would
naturally prefer candidates from their own states and ignore
contenders from others, making it difficult for anyone to gain a
majority without some costly cycle of repeated elections.
"In response to these doubts, the framers hit upon the idea of
appointing a select corps of electors, well-informed citizens who
might make a knowledgeable choice without gaining any lasting
political influence of their own. For a moment this notion became
almost a panacea - until the framers started doubting that these
electors 'would be men of the first or
even the second rank.' The delegates finished this round of debate
where they began it, with an executive appointed by the legislature
for a single term of seven years.
"[Weeks later] the curiously named Committee on Postponed Parts ...
created the electoral scheme that came to be known as the Electoral
College - a college that could never meet as one deliberative body,
but could gather only as separate faculties in the individual states,
vote on the same day, and then disband. The electoral scheme combined
the two major decisions on representation, which the framers, their
tempers having cooled, were now more disposed to treat as compromises
than they had been in July. Each state would get a number of electors
equal to its total membership in Congress. The most populous states
would have the advantage in promoting the candidates they favored, or
at least in making front-runners. In the event that no candidate
received a majority of electoral votes - a situation which many
delegates thought would be the rule rather than the exception - the
choice would devolve on the Senate, where the states would vote
equally. Incumbents would also be free to seek reelection.
"But here lay a problem: the Senate and president were now going to
share the treaty-making and appointment powers. How could the
president exercise independent judgment when decisions in these areas
needed the approval of the body that had already elected and would
possibly reelect him? It took three days of debate for Roger Sherman
to hit upon an ingenious scheme: let the House select a president from
the five leading candidates to emerge from the first round of
electoral voting, but require its members to vote as delegations
rather than individuals, so that each state would have one vote. This
allowed the Electoral College to replicate the earlier compromises
over representation while allowing the president to remain politically
independent of the Senate."

Jack N. Rakove, The Annotated U.S. Constitution and Declaration of


Independence, Belknap Harvard, Copyright 2009 by the President and
Fellows of Harvard College, pp. 40-46.
-------------------------------------------------------------------------------------------------In today's excerpt - drug gangs in Mexico, whose drug wars have made
parts of the country among the most dangerous places on Earth today.
Here we find the Zetas, who have made their base of operations the
Barrancas - the forbidding terrain in and around Mexico's Copper Canyon:
"Because the Barrancas are impossible to police, they've become a base
for two rival drug cartels, Los Zetas and the New Bloods. Both were
manned by ex-Army Special Forces and were absolutely ruthless; the
Zetas were notorious for plunging uncooperative cops into burning
barrels of diesel fuel and feeding captured rivals to the gang's
mascot - a Bengal tiger. After the victims stopped screaming, their
scorched and tiger-gnawed heads were carefully harvested as marketing
tools; the cartels liked to mark their territory by, in one case,
impaling the heads of two police officers outside a government
building with a sign in Spanish reading LEARN SOME RESPECT. Later that
same month, five heads were rolled onto the dance floor of a crowded
nightclub. Even way out here on the fringes of the Barrancas, some six
bodies were turning up a week....
"If Mexico's drug gangs hated anything as much as cops, it was singers
and reporters. Not singers in any slang sense of snitches or stool
pigeons; they hated real, guitar-strumming, love-song-singing
crooners. Fifteen singers were executed by drug gangs in just eighteen
months, including the beautiful Zayda Peña, the twenty-eight-year-old
lead singer of Zayda y Los Culpables, who was gunned down after a
concert; she survived, but the hit team tracked her to the hospital
and blasted her to death [in 2007] while she was recovering from
surgery. The young heartthrob Valentin Elizalde was killed [in 2006]
by a barrage of bullets from an AK-47 just across the border from
McAllen, Texas, and Sergio Gomez was killed [in 2007] shortly after he
was nominated for a Grammy; his genitals were torched, then he was
strangled to death and dumped in the street. What doomed them, as far
as anyone could tell, was their fame, good looks, and talent; the
singers challenged the drug lords' sense of their own importance, and
so were marked for death.
"The bizarre fatwa on balladeers was emotional and unpredictable, but
the contract on reporters was all business. News articles about the
cartels got picked up by American papers, which embarrassed American
politicians, which put pressure on the Drug Enforcement Administration
to crack down. Infuriated, the Zetas threw hand grenades into
newsrooms, and even sent killers across the U.S. border to hunt down
meddlesome journalists. After thirty reporters were killed in six
years, the editor of the Villahermosa newspaper found the severed head
of a low-level drug soldier outside his office [in 2008] with a note
reading, 'You're next.' The death toll had gotten so bad, Mexico would
eventually rank second only to Iraq in the number of killed or
kidnapped reporters."
Christopher McDougall, Born to Run, Knopf, Copyright 2009 by
Christopher McDougall, pp. 21-22
-------------------------------------------------------------------------------------------------In today's excerpt - some members of an emerging class of very long
distance runners known as ultrarunners have begun to advocate running
barefoot or in thin-soled shoes:
"Running shoes may be the most destructive force to ever hit the human
foot. ... Consider these words by Dr. Daniel Lieberman, a professor of
biological anthropology at Harvard University: 'A lot of foot and knee
injuries that are currently plaguing us are actually caused by people
running with shoes that actually make our feet weak, cause us to overpronate, give us knee problems. Until 1972, when the modern athletic
shoe was invented by Nike, people ran in very thin-soled shoes, had
strong feet, and had much lower incidence of knee injuries.' ...
"We've shielded our feet from their natural position by providing more
and more support," [Stanford track head coach Vin] Lananna insisted.
That's why he made sure his runners always did part of their workouts
in bare feet on the track's infield. ... 'I think you try to do all
these corrective things with shoes and you overcompensate. You fix
things that don't need fixing. If you strengthen the foot by going
barefoot, I think you reduce the risk of Achilles and knee and plantar
fascia problems.'
" 'Risk' isn't quite the right term; it's more like 'dead certainty.'
Every year, anywhere from 65 to 80 percent of all runners suffer an
injury. That's nearly every runner, every single year. No matter who
you are, no matter how much you run, your odds of getting hurt are the
same. It doesn't matter if you're male or female, fast or slow, pudgy
or ripped as a racehorse, your feet are still in the danger zone.
Maybe you'll beat the odds if you stretch like a swami? Nope. In a
1993 study of Dutch athletes published in The American Journal of
Sports Medicine, one group of runners was taught how to warm up and
stretch while a second group received no 'injury prevention' coaching.
Their injury rates? Identical. Stretching came out even worse in a
follow-up study performed the following year at the University of
Hawaii; it found that runners who stretched were 33 percent more
likely to get hurt. ...
"In fact, there's no evidence that running shoes are any help at all
in injury prevention. ... Runners wearing top-of-the-line shoes are
123 percent more likely to get injured than runners in cheap shoes,
according to a study led by Bernard Marti, M.D., a preventative-medicine specialist at Switzerland's University of Bern. ...
" 'The deconditioned musculature of the foot is the greatest issue
leading to injury, and we've allowed our feet to become badly
deconditioned over the past twenty-five years,' [the Irish physical
therapist] Dr. Gerard Hartmann said. ... 'Putting your feet in shoes
is similar to putting them in a plaster cast,' Dr. Hartmann said. 'If
I put your leg in plaster, we'll find forty to sixty percent atrophy
of the musculature within six weeks. Something similar happens to your
feet when they're encased in shoes.' When shoes are doing the work,
tendons stiffen and muscles shrivel. Feet live for a fight and thrive
under pressure; let them laze around, as [miler] Alan Webb discovered,
and they'll collapse. Work them out, and they'll arc up like a
rainbow. ...
"[The change began in 1962 when Nike co-founder Bill Bowerman created]
the most cushioned running shoe ever created - the Cortez. ...
Bowerman's deftest move was advocating a new style of running that was
only possible in his new style of shoe. The Cortez allowed people to
run in a way no human safely could before: by landing on their bony
heels. Before the invention of a cushioned shoe, runners through the
ages had identical form: Jesse Owens, Roger Bannister, Frank Shorter,
and even Emil Zatopek all ran with backs straight, knees bent, feet
scratching back under their hips. They had no
choice: the only shock absorption came from the compression of their
legs and their thick pad of midfoot fat. ...
"But Bowerman had an idea: maybe you could grab a little extra
distance if you stepped ahead of your center of gravity. Stick a chunk
of rubber under the heel, he mused, and you could straighten your leg,
land on your heel, and lengthen your stride. ... He believed a 'heel-to-toe' stride would be 'the least tiring over long distances.' If
you've got the shoe for it."
Christopher McDougall, Born to Run, Knopf, Copyright 2009 by
Christopher McDougall, pp. 169-181.
-------------------------------------------------------------------------------------------------In today's excerpt - massive volcano eruptions have caused the
temperature of the earth to cool significantly by blocking light from
the sun:
"The connection between volcanoes and climate is hardly a new
idea. ... Benjamin Franklin wrote what seems to be the first
scientific paper on the topic. In 'Meteorological Imaginations and
Conjectures,' published in 1784, Franklin posited that recent volcanic
eruptions in Iceland had caused a particularly harsh winter and a cool
summer with 'constant fog over all Europe, and [a] great part of North
America.' In 1815, the gargantuan eruption of Mount Tambora in
Indonesia produced 'The Year Without a Summer,' a worldwide disaster
that killed crops, prompted widespread starvation and food
riots, and brought snow to New England as late as June.
"As Nathan Myhrvold [of Intellectual Ventures and formerly of
Microsoft] puts it: 'All really big-ass volcanoes have some climate
effects.'
"Volcanoes erupt all the time, all over the world, but truly 'big-ass'
ones are rare. If they weren't - well, we probably wouldn't be around
to worry about global warming. The anthropologist Stanley Ambrose has
argued that a supervolcanic explosion at Lake Toba on Sumatra, roughly
seventy thousand years ago, blocked the sun so badly that it triggered
an ice age that nearly wiped out Homo sapiens. What distinguishes a
big-ass volcano isn't just how much stuff it ejaculates, but where the
ejaculate goes. The typical volcano sends sulfur dioxide into the
troposphere, the atmospheric layer closest to the
earth's surface. This is similar to what a coal-burning power plant
does with its sulfur emissions. In both cases, the gas stays in the
sky only a week or so before falling back to the ground as acid rain,
generally within a few hundred miles of its origin.
"But a big volcano shoots sulfur dioxide far higher, into the
stratosphere. That's the layer that begins at about seven miles above
the earth's surface, or six miles at the poles. Above that threshold
altitude, there is a drastic change in a variety of atmospheric
phenomena. The sulfur dioxide, rather than quickly returning to the
earth's surface, absorbs stratospheric water vapor and forms an
aerosol cloud that circulates rapidly, blanketing most of the globe.
In the stratosphere, sulfur dioxide can linger for a year or more, and
will thereby affect the global climate.
"That's what happened in 1991 when Mount Pinatubo erupted in the
Philippines. Pinatubo made Mount St. Helens look like a hiccup; it put
more sulfur dioxide into the stratosphere than any volcano since
Krakatoa, more than a century earlier. In the period between those two
eruptions, the state of science had progressed considerably. A
worldwide cadre of scientists was on watch at Pinatubo, equipped with
modern technology to capture every measurable piece of data. The
atmospheric aftereffects of Pinatubo were undeniable: a decrease in
ozone, more diffuse sunlight, and, yes, a sustained drop in global
temperature."
Author: Steven D. Levitt and Stephen J. Dubner
Title: Superfreakonomics
Publisher: HarperCollins
Date: Copyright 2009 by Steven D. Levitt and Stephen J. Dubner
Pages: 189-190
-------------------------------------------------------------------------------------------------In today's excerpt - in 1507, inspired by the still-fresh discovery of
the New World, a small band of German humanist scholars in Saint-Die,
Lorraine, decided to make a new world map with accompanying commentary
to be sold and studied throughout the cities and universities of
Europe. They considered the New World to be the "fourth part" of the
world, after Europe, Asia, and Africa, and from their distant outpost
believed that Columbus had merely discovered islands west of the
Canaries, and the true discoverer of this massive new continent was
Amerigo Vespucci. So they coined a name for this fourth part of the
world and printed it on their map - America:
"[Matthias Ringmann and Martin Waldseemuller and their colleagues]
decided to produce a geographical package consisting of three parts: a
huge new map of the whole world, dedicated to Maximilian I (the Holy
Roman Emperor and thus the symbolic head of the Germanic people), that
would sum up ancient and modern geographical learning; a tiny version
of that map, printed as a series of globe gores that could be pasted
onto a small ball, creating the world's first mass-produced globe; and
a sort of users' guide to those two maps, titled Introduction to
Cosmography. ... It was a profound moment in the history of
cartography - and in the larger history of ideas. ...
"The bulk of the work - the design of the map and the globe, and the
writing of the Introduction to Cosmography - fell to Waldseemuller and
Ringmann. Ringmann took the lead in writing the book. Libraries today
credit Waldseemuller as the author, but the book actually names no
author, and Ringmann's fingerprints appear all over it. ... Ringmann
the writer, Waldseemuller the mapmaker. ...
"Why dwell on this question of authorship? Because whoever wrote the
Introduction to Cosmography almost certainly coined the name America
(which would have been pronounced 'Amer-eeka'). Here, too, the balance
tilts in Ringmann's favor. Consider the famous passage in which the
author steps forward to explain and justify the use of the name.
" 'These parts have in fact now been more widely explored, and a
fourth part has been discovered by Amerigo Vespucci (as will be heard
in what follows). Since both Asia and Africa received their names from
women, I do not see why anyone should rightly prevent this [new part]
from being called Amerigen - the land of Amerigo, as it were - or
America, after its discoverer, Americus, a man of perceptive
character.'
"This sounds a lot like Ringmann, who is known to have spent time
mulling over the reasons that concepts and places so often had the
names of women. 'Why are all the virtues, the intellectual qualities,
and the sciences always symbolized as if they belonged to the feminine
sex?' he would write in a 1511 essay on the Muses. 'Where does this
custom spring from - a usage common not only to the pagan writers but
also to the scholars of the church? It originated from the belief that
knowledge is destined to be fertile of good works. ... Even the three
parts of the old world received the name of women.' The naming-of-America passage reveals Ringmann's hand in other ways, too. In his
poetry and prose Ringmann regularly amused himself by making up words,
by punning in different languages, and by investing his writing with
hidden meanings for his literary friends to find and savor. The
passage is rich in just this sort of wordplay, much of which requires
a familiarity with Greek, a language Waldseemuller didn't know.
"The key to the passage, almost always ignored or overlooked, is the
curious name Amerigen - a coinage that involves just the kind of
multifaceted, multilingual punning that Ringmann frequently indulged
in. The word combines Amerigo with gen, a form of the Greek word for
'earth,' creating the meaning that the author goes on to propose 'the land of Amerigo.' But the word yields other meanings, too. Gen
can also mean 'born' in Greek, and the word ameros can mean 'new,'
making it possible to read Amerigen as not only 'land of Amerigo' but
also 'born new' - a double entendre that would have delighted
Ringmann, and one that very nicely complements the idea of fertility
that he associated with female names. The name may also contain a play
on meros, a Greek word that can sometimes be translated as 'place.'
Here Amerigen becomes A-meri-gen, or 'No-place-land': not a bad way to
describe a previously unnamed continent whose geography is still
uncertain."
Author: Toby Lester
Title: The Fourth Part of the World
Publisher: Free Press
Date: Copyright 2009 by Toby Lester
Pages: 355-357
-------------------------------------------------------------------------------------------------In today's excerpt - though most historians have praised Justinian I,
emperor of the Eastern Roman Empire (483 - 565 AD), because of the
great monuments he built, martial victories he claimed, and legal code
he sponsored, and have blamed his failures on the plague, in reality
he ruined the finances of his Empire through his profligate spending
on these wars and monuments:
"Humankind does not live by edifices alone. The constant temptation of
ancient monarchs was to seize grandeur rather than earn it, by
coercing resources from the margins to the center, to invest in
ostentation and display. The Justinian who is remembered for what he
built is not the Justinian of history - or, rather, is an embodiment
of the weakness of that Justinian. To see Hagia Sophia and the great
church Justinian built in Jerusalem as testimonies to his weakness and
shortsightedness is to see them as they really are. The outsize scale
of his buildings shouts aloud the ego and insecurity of their creator.
Justinian and his great empire proved vulnerable to the tiniest of
enemies, the plague bacillus.
"The years in which his military campaigns in the west went bad and he
found himself in his Italian quagmire were dismal ones spent close to
home. So much Justinian scholarship has concentrated on the self-glorifying legal, military, and architectural self-assertion of the
early years that an important recent scholarly work was impishly
called 'The Other Age of Justinian' - precisely to signal the long
years of frustration and decline that formed part of the career of
this grandiose monarch. ...
"Ancient empires kept abundant financial records, but hardly any of
those documents survive. (Palaces and their archives are designed to
be plundered, sooner or later.) A recent scholar has made some sober
estimates of the profligacy of Justinian's expenditures. A summary of
the bad news runs something like this:
Justinian is reported to have begun his reign with 28 million solidi
'in the bank,' reserves that [his predecessors] Anastasius built up
and Justin preserved.
Justinian's wars cost him about 36 million solidi, with some
interesting proportions:
- About 5 million on the eastern front
- About 8 million in Africa, half of it after 'victory' was
achieved in Belisarius's short campaign
- About 21.5 million in Italy, fully half of it in the last
two ruinous years 552-554
By comparison, his annual revenues for a good year of his reign
amounted to about 5 million solidi; when Africa and Italy were added
to his domains, they brought about another ten percent each, or
500,000 solidi each. Most of that revenue was expended locally on
governing those restive provinces. [A rough check of this arithmetic is sketched below.]
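[Editor's note - a minimal balance of the figures above, treating the itemized campaigns as the bulk of the 'about 36 million' total (the remainder presumably covers fronts not listed):

\[
5_{\text{east}} + 8_{\text{Africa}} + 21.5_{\text{Italy}} = 34.5 \approx 36 \text{ million solidi of war spending,}
\]

against reserves of 28 million solidi and a peak annual revenue of about

\[
5 + 0.5 + 0.5 = 6 \text{ million solidi.}
\]

On those numbers, the wars alone consumed the entire inherited reserve plus more than a year of the empire's best gross revenue, before any ordinary cost of government.]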
"When he began to feel the financial pressures of such extravagant
wars, Justinian took the natural action of a martial but improvident
ruler: he plundered his own subjects and attacked his own currency,
progressively thinning out the amount of bronze in the coinage and
profiting handsomely at the treasury as a result. The effects of such
a devaluation were slow but inevitable.
"Justinian's successor inherited (with Italy and Africa) greater
responsibilities than Justinian began with, and had far more
restricted financial capacity to address them. No emperor at
Constantinople after Justinian had the opportunity for both lavish
construction and warfare that Justinian had squandered so unwisely."
Author: James J. O'Donnell
Title: The Ruin of the Roman Empire
Publisher: HarperCollins
Date: Copyright 2008 by James J. O'Donnell
Pages: 285-289
-------------------------------------------------------------------------------------------------In today's excerpt - IQ test results:
"Children develop only as the environment demands development. In
1981, New Zealand-based psychologist James Flynn discovered just how
profoundly true that statement is. Comparing raw IQ scores over nearly
a century, Flynn saw that they kept going up: every few years, the new
batch of IQ test takers seemed to be smarter than the old batch.
Twelve-year-olds in the 1980s performed better than twelve-year-olds
in the 1970s, who performed better than twelve-year-olds in the 1960s,
and so on. This trend wasn't limited to a certain region or culture,
and the differences were not trivial. On average, IQ test takers
improved over their predecessors by three points every ten years - a
staggering difference of eighteen points over two generations.
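[Editor's note - the arithmetic behind 'eighteen points over two generations,' on the assumption that a generation is roughly thirty years (the excerpt does not define the term):

\[
3 \ \tfrac{\text{points}}{\text{decade}} \times 6 \ \text{decades} = 18 \ \text{points over } 2 \times 30 = 60 \ \text{years.}
\]
]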
"The differences were so extreme, they were hard to wrap one's head
around. Using a late-twentieth-century average score of 100, the
comparative score for the year 1900 was calculated to be about 60 leading to the truly absurd conclusion, acknowledged Flynn, 'that a
majority of our ancestors were mentally retarded.' The so-called Flynn
effect raised eyebrows throughout the world of cognitive research.
Obviously, the human race had not evolved into a markedly smarter
species in less than one hundred years. Something else was going on.
"For Flynn, the pivotal clue came in his discovery that the increases
were not uniform across all areas but were concentrated in certain
subtests. Contemporary kids did not do any better than their ancestors
when it came to general knowledge or mathematics. But in the area of
abstract reasoning, reported Flynn, there were 'huge and embarrassing'
improvements. The further back in time he looked, the less test takers
seemed comfortable with hypotheticals and intuitive problem solving.
Why? Because a century ago, in a less complicated world, there was
very little familiarity with what we now consider basic abstract
concepts. '[The intelligence of] our ancestors in 1900 was anchored in
everyday reality,' explains Flynn. 'We differ from them in that we can
use abstractions and logic and the hypothetical ... Since 1950, we
have become more ingenious in going beyond previously learned rules to
solve problems on the spot.'
"Examples of abstract notions that simply didn't exist in the minds of
our nineteenth-century ancestors include the theory of natural
selection (formulated in 1864), and the concepts of control group
(1875) and random sample (1877). A century ago, the scientific method
itself was foreign to most Americans. The general public had simply
not yet been conditioned to think abstractly.
"The catalyst for the dramatic IQ improvements, in other words, was
not some mysterious genetic mutation or magical nutritional supplement
but what Flynn described as 'the [cultural] transition from pre-scientific to post-scientific operational thinking.' Over the course
of the twentieth century, basic principles of science slowly filtered
into public consciousness, transforming the world we live in. That
transition, says Flynn, 'represents nothing less than a liberation of
the human mind.'
"The scientific world-view, with its vocabulary, taxonomies, and
detachment of logic and the hypothetical from concrete referents, has
begun to permeate the minds of post-industrial people. This has paved
the way for mass education on the university level and the emergence
of an intellectual cadre without whom our present civilization would
be inconceivable.
"Perhaps the most striking of Flynn's observations is this: 98 percent
of IQ test takers today score better than the average test taker in
1900. The implications of this realization are extraordinary. It means
that in just one century, improvements in our social discourse and our
schools have dramatically raised the measurable intelligence of almost
everyone.
"So much for the idea of fixed intelligence."
Author: David Shenk
Title: The Genius in All of Us
Publisher: Doubleday
Date: Copyright 2010 by David Shenk
Pages: 35-37
-------------------------------------------------------------------------------------------------In today's excerpt - the Erie Canal. In 1825, Philadelphia was still
the largest city in America, with New York City and Boston close
behind. But then New York opened the Erie Canal, a massive government
project that connected its ports to the Midwest via the Great Lakes.
The canal was scorned derisively as "Clinton's Folly" or "Clinton's
Ditch," after New York Governor and canal proponent DeWitt Clinton, but
when it opened, New York City almost instantly became the greatest
boomtown the world had ever seen:
"In the early nineteenth century, New York was a large town, but it
had a number of peers, including Philadelphia. The key decision that
vaulted New York to prominence was the decision to build the Erie
Canal. In John Steele Gordon's account of America's rise to an 'empire
of wealth,' he noted the importance of that canal.
"The Erie Canal ... turned New York into the greatest boomtown the
world has ever known. Manhattan's population grew to 202,000 in 1830,
313,000 in 1840, 516,000 in 1850, and 814,000 in 1860. ... In 1800
about 9 percent of the country's exports passed through the port of
New York. By 1860 it was 62 percent, as the city became what the
Boston poet and physician Oliver Wendell Holmes (the father of the
Supreme Court justice) rather grumpily described as 'that tongue that
is licking up the cream of commerce and finance of a continent.'
"These figures are for Manhattan - the surrounding parts of what is
now New York City were growing as well. This explosion was all due to
the Erie Canal. Before the canal, it had taken three weeks at a cost
of $120 to move a ton of flour from Buffalo to New York City. After
the canal's construction, it took eight days and cost $6. Gordon
remarked that, before the canal was even completed, 'the Times of
London saw it coming, writing that year [1822] that the canal would
make New York City the 'London of the New World.' The Times was right.
It was the Erie Canal that gave the Empire State its commercial empire
and made New York the nation's imperial city. That was when the
position of New York as an economic powerhouse was first firmly
established, and the title has yet to be relinquished."
Author: Douglas Wilson
Title: Five Cities That Ruled The World
Publisher: Thomas Nelson
Date: Copyright 2009 by Douglas Wilson
Pages: 164-165
-------------------------------------------------------------------------------------------------In today's excerpt - the toilet. Thomas Crapper became very wealthy by
inventing the Marlboro Silent Water Waste Preventer:
"Perhaps no word in English has undergone more transformations in its
lifetime than 'toilet'. Originally, in about 1540, it was a kind of
cloth, a diminutive form of 'toile', a word still used to describe a
type of linen.
"Then it became a cloth for use on dressing tables. Then it became the
items on the dressing table (whence 'toiletries'). Then it became the
dressing table itself, then the act of dressing, then the act of
receiving visitors while dressing, then the dressing room itself, then
any kind of private room near a bedroom, then a room used
lavatorially, and finally the lavatory itself. Which explains why
'toilet water' in English can describe something you would gladly daub
on your face or, simultaneously, 'water in a toilet.' ...
"Most sewage still went into cesspits, but these were commonly
neglected and the contents often seeped into neighbouring water
supplies. In the worst cases they overflowed. The people who cleaned
cesspits were known as nightsoil men, and if there has ever been a
less enviable way to make a living I believe it has yet to be
described. They worked in teams of three or four. One man - the most
junior, we may assume - was lowered into the pit itself to scoop waste
into buckets. A second stood by the pit to raise and lower the
buckets, and the third and fourth carried the buckets to a waiting
cart. Workers ran the risk of asphyxiation and even of explosions
since they worked by the light of a lantern in powerfully gaseous
environments.
"In St Giles, the worst of London"s rookeries - scene of Hogarth's Gin
Lane - 54,000 people were crowded into just a few streets. Such masses
of humanity naturally produced enormous volumes of waste - far more
than any system of cesspits could cope with. In one report, an
inspector recorded visiting two houses in St Giles where the cellars
were filled with human waste to a depth of three feet. The river was a
perpetual 'flood of liquid manure,' as one observer put it. The
streams that fed into the Thames were often even worse than the Thames
itself. The river Fleet was in 1831 'almost motionless with
solidifying filth.'
"Into this morass came something that proved, unexpectedly, to be a
disaster: the flush toilet. Flush toilets of a type had been around for
some time. The very first was built by John Harington, godson to Queen
Elizabeth I. When Harington demonstrated his invention to her in 1597,
she expressed great delight and had it immediately installed in
Richmond Palace. But it was a novelty well ahead of its time and
almost 200 years passed before Joseph Bramah, a cabinet maker and
locksmith, patented the first modern flush toilet in 1778. It caught
on in a modest way. Many others followed. But early toilets often
didn't work well. Sometimes they backfired, filling the room with even
more of what the horrified owner had very much hoped to be rid of.
Until the development of the U-bend and water trap - that little
reservoir of water that returns to the bottom of the bowl after each
flush - every toilet bowl acted as a conduit to the smells of cesspit
and sewer. The backwaft of odors, particularly in hot weather, could
be unbearable.
"This problem was resolved by one of the great and surely most
extraordinarily appropriate names in history, that of Thomas Crapper
(1837-1910), who was born into a poor family in Yorkshire and
reputedly walked to London at the age of 11. There he became an
apprentice plumber in Chelsea. Crapper invented the classic and still
familiar toilet with an elevated cistern activated by the pull of a
chain. Called the Marlboro Silent Water Waste Preventer, it was clean,
leak-proof, odor-free and wonderfully reliable, and its manufacture
made Crapper very rich and so famous that it is often assumed that he
gave his name to the slang term 'crap' and its many derivatives.
"In fact, 'crap' in the lavatorial sense is very ancient and 'crapper'
for a toilet is an Americanism not recorded by the Oxford English
Dictionary before 1922. Crapper's name, it seems, was just a happy
accident."
Author: Bill Bryson
Title: "The history of the toilet," (from the upcoming book At Home: A
Short History of Private Life, Doubleday)
Publisher: guardian.co.uk
Date: May 17, 2010
-------------------------------------------------------------------------------------------------In today's excerpt - the Dutch invent crop rotation in the late 1500s.
For thousands of years, all societies had been subsistence societies,
barely able to feed their inhabitants since low agricultural
productivity meant a permanent scarcity of labor and land. This left
precious few resources available for invention and innovation, but
then came the breakthrough - because of their extreme scarcity of
land, the Dutch were driven to find a better way to use land, freeing
resources and setting the stage for the Industrial Revolution:
"Agriculture throughout the world was woefully unproductive because
cropping drained the land of its fertility. The traditional remedy for
soil exhaustion was allowing land to become fallow to recapture its
fertility, but this took a third or a quarter of acres under tillage
out of production. Farmers could also restore fertility by adding
nitrogen to the soil. Their principal source of this came from animals
that unfortunately had to be fed to stay alive and defecate, taking even
more land away from producing food for the people. Breaking through
this bind of declining soil fertility took a bundle of mutually
enhancing practices. Fortunately Dutch farmers had been experimenting
with possible improvements for many decades.
"Some farmers in the Netherlands realized that they could abandon the
old medieval practice of leaving a third of the land to lie fallow
each year. This move increased the number of tilled acres by a third.
Instead of the fallow rotation, they divided land into four parts,
rotating fields of grain, turnips, hay, and clover each season. Not
only did this increase the number of tilled acres by a third, but the
clover fed livestock after it had enriched the soil with its nitrogen
deposits. The virtuous circle of growth replaced the vicious circle of
decline. When some landlords and farmers responded to the possibility
of becoming more productive, they were taking the first permanent
steps away from the age-old economy of scarcity.
"English farmers copied the Dutch and succeeded in making their
agricultural base feed more and more people with fewer laborers and
less investment. Unlike the Dutch, the English had enough arable land
to grow the grains that fed the people as well as their livestock. The
Dutch could not produce what was needed to get their people through a
year. With their profits from trade, they could store grain, but this
lifesaving program got more and more expensive.
"While some English farmers copied the Dutch four-field rotation,
others adopted up-and-down husbandry. In this routine, a farmer would
crop his best land for three or four years and then put it in pasture
for another five, during which time the animal manure and nitrogen-fixing crops would rebuild the fertility necessary for growing grains
again. As in the Dutch system, land was no longer left fallow but
always growing some crop, whether for animals or humans. Every element
on the farm was put to some use; every hand, given new tasks. These
innovations made urgent a farmer's attentiveness because of their
interlocking qualities. Both the Dutch and English began to flood
meadows to warm the soil in winter and extend the growing season. Over
the course of the century all these improvements raised the seed to
yield ratio, the labor to yield ratio, and the land to yield ratio. Or
more simply, they led to bigger harvests from fewer acres, less labor,
and fewer seeds."
Author: Joyce Appleby
Title: The Relentless Revolution
Publisher: Norton
Date: Copyright 2010 by Joyce Appleby
Pages: 73-74
-------------------------------------------------------------------------------------------------In today's excerpt - very early childhood learning:
"In the mid-1980s, Kansas psychologists Betty Hart and Todd Risley
realized that something was very wrong with Head Start, America's
program for children of the working poor. It manages to keep some low-income kids out of poverty and ultimately away from crime. But for a
program that intervenes at a very young age and is reasonably well run
and generously funded - $7 billion annually - it doesn't do much to
raise kids' academic success. Studies show only 'small to moderate'
positive impacts on three- and four-year-old children in the areas of
literacy and vocabulary, and no impact at all on math skills.
"The problem, Hart and Risley realized, wasn't so much with the
mechanics of the program; it was the timing. Head Start wasn't getting
hold of kids early enough. Somehow, poor kids were getting stuck in an
intellectual rut long before they got to the program - before they
turned three and four years old. Hart and Risley set out to learn why
and how. They wanted to know what was tripping up kids' development at
such an early age. Were they stuck with inferior genes, lousy
environments, or something else?
"They devised a novel (and exhaustive) methodology: for more than
three years, they sampled the actual number of words spoken to young
children from forty-two families at three different socioeconomic
levels: (1) welfare homes, (2) working-class homes, and (3)
professionals' homes. Then they tallied them up.
"The differences were astounding. Children in professionals' homes
were exposed to an average of more than fifteen hundred more spoken
words per hour than children in welfare homes. Over one year, that
amounted to a difference of nearly 8 million words, which, by age
four, amounted to a total
gap of 32 million words. They also found a substantial gap in tone and
in the complexity of words being used.
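[Editor's note - the excerpt does not say how many waking hours per day the researchers assumed; back-solving, roughly fourteen to fifteen hours a day makes the figures cohere:

\[
1{,}500 \ \tfrac{\text{words}}{\text{hour}} \times 14.6 \ \tfrac{\text{hours}}{\text{day}} \times 365 \ \text{days} \approx 8 \ \text{million words per year,}
\]
\[
8 \ \text{million} \times 4 \ \text{years} \approx 32 \ \text{million words by age four.}
\]
]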
"As they crunched the numbers, they discovered a direct correlation
between the intensity of these early verbal experiences and later
achievement. 'We were astonished at the differences the data
revealed,' Hart and Risley wrote in their book Meaningful Differences.
'The most impressive aspects [are] how different individual families
and children are and how much and how important is children's
cumulative experience before age 3.'
"Not surprisingly, the psychological community responded with a
mixture of interest and deep caution. In 1995, an American
Psychological Association task force wrote that 'such correlations may
be mediated by genetic as well as (or instead of) environmental
factors.' Note 'instead of.' In 1995, it was still possible for
leading research psychologists to imagine that better-off kids could
be simply inheriting smarter genes from smarter parents, that spoken
words could be merely a genetic effect and not a cause of anything.
"Now we know better. We know that genetic factors do not operate
'instead of' environmental factors, they interact with them."
Author: David Shenk
Title: The Genius in All of Us
Publisher: Doubleday
Date: Copyright 2010 by David Shenk
Pages: 37-39
--------------------------------------------------------------------------------------------------
In today's excerpt - most of America's oysters come from the Gulf
coast, in spite of what your menu might say. (A potentially troubling
fact for oyster lovers given the Deepwater Horizon disaster):
"The murkiness of the Galveston Bay, or 'turbidity,' as scientists
call it, came from suspended sediments and plankton. 'The Adriatic is
beautiful blue,' Croatian-born Galveston oysterman Misho Ivic told me,
'but there's nothing living in it. It's sterile. Galveston Bay looks
muddy because the water is
full of food. Good for the oysters, good for the crabs.'
"I didn't trust him, of course. East Coast and West Coast oystermen
say that the waters of the Gulf of Mexico are filthy. And maybe they
are. But oysters live in brackish water in freshwater estuaries, not
in the Gulf of Mexico. And the scientists I interviewed said that
Galveston Bay was in pretty good shape.
" 'We always fight the perception that the bay is polluted, but the
reality is that the water quality overall is good,' Scott Jones, water
and sediment quality coordinator for the Galveston Bay Estuary
Program, told me. He said dissolved oxygen levels have gone up
markedly in the last thirty years thanks to a cleanup of wastewater
treatment plants mandated by the Clean Water Act of 1972.
"Misho Ivic and a marine biologist named Dr. Sammy Ray put pollution
into a historical perspective for me by comparing Galveston Bay to
Chesapeake Bay. Chesapeake Bay produced millions of bushels of oysters
in the 1800s, before it was polluted. It now produces about 1 percent
of its historic peak. Conservationists in Maryland and Virginia are
making progress and the oyster harvests are increasing, but since the
surrounding wetlands were long ago destroyed, the long-term prospects
are limited.
"In 1900, Galveston Bay and a couple of other small bays in Texas
produced a record 3.5 million pounds of oyster meat. But modern
harvests regularly exceed that. In 2003, the largest harvest of
oysters ever recorded was taken - 6.8 million pounds, nearly double
what was produced at the turn of the century.
"New York Harbor and Chesapeake Bay lost their oysters due to
industrial pollution. Galveston Bay is at its historic peak
production. In 2003, Misho Ivic's oyster company alone outproduced the
entire Chesapeake Bay. ... Places that were famous for their oysters
one hundred years ago, like Chincoteague Bay, Maryland, and Blue
Point, Long Island, aren't the centers of oyster production anymore.
But people still clamor to buy oysters with famous names, so oystermen
engage in a 'shell game,' if you'll pardon the pun. Texas oysters make
great stunt doubles. They're sold as 'Blue Points' in many oyster bars
across the country. They're also served in Washington, D.C., and
Maryland oyster bars, where people assume they're eating Chesapeake
Bay oysters.
"I once asked the waiter at a Houston chain restaurant called Willie
G's where the oysters came from. Hilariously, he told me the oysters I
was eating were Blue Points from Long Island. I asked him to bring me
the bag tag. By federal law, oysters must be packaged with a tag
stating their place of origin and date of harvest. This allows health
authorities to trace the origin of the oysters in case they cause any
illnesses. Oyster bars aren't required to show the tag to customers,
but if they refuse, it's usually because they're trying to put one
over on you. I bet my tablemate five bucks the oysters came from
Galveston Bay. That was some easy money.
"Check out the statistics on commercial oyster landings and you can
probably win a few wagers yourself: Annual totals for 2003 (pounds of
oyster meat) - Gulf Coast, 27 million; Pacific Coast, 11.5 million;
East Coast, 2.8 million."
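[Editor's note - those landings figures bear out the 'most' in this excerpt's opening line:

\[
\frac{27}{27 + 11.5 + 2.8} = \frac{27}{41.3} \approx 65\% \ \text{of 2003 U.S. oyster landings (by meat weight) came from the Gulf Coast.}
\]
]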
Author: Robb Walsh
Title: Sex, Death & Oysters
Publisher: Counterpoint
Date: Copyright 2009 by Robb Walsh
Pages: 5-6, 9-10
-------------------------------------------------------------------------------------------------In today's excerpt - in Virginia, in the period immediately before the
American Revolution, dissenters from Anglican worship could still be
fined or even imprisoned. John Clay, a Baptist preacher who was the
father of legendary American statesman Henry Clay, was among those
jailed. Although "taxation without representation" was the ostensible
cause of the Revolution, it was deeply felt resentment from
Presbyterians, Baptists and others against this heavy-handed
Anglicanism that provided the Revolution with much of its urgency and
emotional weight, and preachers throughout America railed against King
George, the head of the Anglican Church, as the "Great Satan":
"Around the time of his marriage, [Virginian John Clay] received 'the
call.' Eventually he became the Baptists' chief apostle in Hanover
County, working to change attitudes that were not necessarily
irreligious but did find the Church of England emotionally
unsatisfying and spiritually moribund. After the Great Awakening
[which began in the 1730s] swept its revivalist fervor across the
country, Virginians found the mandatory nature of Anglican worship - dissenters could be fined and even imprisoned - infuriating, and a
simmering discontent over the lack of religious freedom helped stoke
dissatisfaction with other aspects of British rule.
"Presbyterians became the dominant denomination in literate areas as
converts in the Tidewater and Piedmont were matched by Scots-Irish
migrations from Pennsylvania into the Shenandoah Valley. In the region
between - Henrico, Chesterfield, and Hanover counties - the less
literate gravitated to the Baptists, whose services were long on
emotion and short on complicated liturgical teachings.
"Because of this, the number of Baptists markedly increased in the
1760s and 1770s, particularly among lower-class whites and slaves.
Preachers could be unschooled and were always uncompensated, at least
by any hierarchical authority. They came to their pulpits after an
extraordinary religious experience referred to as 'the call.'
"After John Clay received the call, he organized churches in Henrico
and Hanover counties, including a large congregation at Winn's Church
in 1776. Most of his flock comprised a sect known as New Light
Baptists, not exactly economic levelers but noted for simple attire
and the practice of calling each other 'sister' and 'brother'
regardless of social rank or economic status. They were clearly more
democratic than class-conscious Anglicans, and congregations even
allowed slaves to participate in worship services. That eccentric
practice alone caused Anglican planter elites anxiety over the
influence of Baptists, a troubling, troublesome lot who made even
Presbyterians look respectable.
"Baptists took such contempt as a badge of honor. They and the
Presbyterians grew increasingly angry about the power of establishment
Anglicans, in particular evidenced by onerous taxes and reflexive
persecution. At least once John Clay himself felt the weight of
Anglican anger when he was jailed for his dissent. Such experiences,
though, fueled rather than suppressed enthusiasm for religious
liberty. As protests over British taxes became more strident, calls
for spiritual freedom matched them. The drive for independence gained
momentum, and the calls for disestablishing the Church of England
became more vocal."
Author: David S. Heidler and Jeanne T. Heidler
Title: Henry Clay
Publisher: Random House
Date: Copyright 2010 by David Heidler and Jeanne Heidler
Pages: 6-7
-------------------------------------------------------------------------------------------------In today's excerpt - the "beat cop." Community members regularly
lament the demise of the beat cop - the officer who knew everyone in
the neighborhood and who chastised wayward children and settled
disputes between neighbors and family members without ever having to
resort to making an arrest. But the idealized beat cop is a myth that
has never existed in the real world of constrained budgets and backlogs
of unsolved crimes, according to John Timoney, four-star police chief
in the New York City Police Department, later Police Commissioner of
Philadelphia, and then Police Chief of Miami:
"Over the course of my career, [the lament I heard repeatedly from
community members was] 'the only thing I really want is a cop on the
beat, like the guy who patrolled the streets when I was growing up.'
"The first time I heard the lament regarding officers who knew their
community was when I was a young police officer walking a foot beat in
the South Bronx in the early 1970s. The sentiment seemed to make
sense, but as I thought back to when I was a young teenager growing up
in Washington Heights, I didn't remember a police officer walking the
beat. I do remember police officers in police cars who broke our chops
on a daily basis for playing stickball in the street or curveball
underneath Mrs. Lemondrop's window. I concluded that the reason I
didn't remember a specific police officer in my community on his foot
beat was because foot beats must have stopped in the late 1950s and
thus were a thing of the past. Fast-forward twenty years: as a captain
and later as a deputy chief, I continued to hear the same lament from
people who were aged forty or fifty - my age!
"In Philadelphia and then again in Miami, the longing for the days of
the foot beat officer who knew everyone in the neighborhood and who
chastised wayward children and settled disputes between neighbors and
family members without ever having to resort to making an arrest
continued to be voiced at community meetings. I vowed to myself that I
would find this ubiquitous foot beat officer. After much research,
what I did find was that this lament was not of recent vintage. The
case of Police Commissioner Lewis Valentine is illustrative.
"Valentine entered the NYPD as a rookie in 1902. His rise through the
ranks was periodically stalled as he ran afoul of different police
administrations due to his desire to see a corruption-free NYPD.
Eventually, Valentine became the police commissioner under New York's
reform mayor, Fiorello LaGuardia. In his autobiography, Nightstick,
Commissioner Valentine lays out what his priorities were when he
became police commissioner in 1934. First and foremost among his goals
was to return to the days before he first came on 'the job,' about
1903, when the police officer on the beat knew everyone in the
neighborhood, and everybody in the neighborhood knew him. ... You get
my point.
"My research took me to Hollywood, where I think I found our missing
beat officer. His name was Officer McShane. He walked a foot beat in
the 1945 movie A Tree Grows in Brooklyn. Officer McShane knew the
problems of the people on his beat intimately. He was around day and
night, and he looked after the neighbors on his beat, including the
family with the alcoholic father and exasperated wife and two adorable
little girls. Eventually and predictably, the father dies from his
affliction and Officer McShane is there to ease the widow's pain. ...
"Yes, I found the beat officer, or should I say, I found the myth.
There is nothing wrong with this myth. It is really an ideal that most
people have regarding police officers in their communities. Most
people like police officers or want to like police officers. It is the
job of every police officer and every police chief to help make the
myth a reality, or at least make the ideal a goal."
Author: John F. Timoney
Title: Beat Cop to Top Cop
Publisher: University of Pennsylvania Press
Date: Copyright 2010 by University of Pennsylvania Press
Pages: 322-324.
-------------------------------------------------------------------------------------------------In today's excerpt - wealth without resources. In the 1500s and 1600s,
the experience of two countries seemed to defy all that had gone
before. Spain had amassed the largest supply of gold in history thanks
to its New World conquests, but saw inflation and near bankruptcy as a
result. And in Holland, the Dutch were gaining greater wealth than
most any country on earth by trading in fish and other mundane items - in the beginnings of a strange new way that came to be known as a
market economy:
During the seventeenth century, the Dutch extracted tons of herring
from waters that washed on English shores, had the largest merchant
fleet in Europe, drew into their banks Spanish gold, borrowed at the
lowest interest rates, and bested all comers in the commerce of the
Baltic, the Mediterranean, and the West Indies. Dutch prosperity, like
Dutch land, seemed to have been created out of nothing. The inevitable
contrast with Spain, the possessor of gold and silver mines now
teetering on the verge of bankruptcy, only underscored the conundrum
of Dutch success.
The Netherlands represented a kind of anti-fairy tale. The rags-to-riches heroes of medieval folklore invariably found pots of gold or
earned fortunes through acts of valor. Elfin magicians, fairy
godmothers, and subdued giants were the bestowers of great wealth.
Spanish exploits in the New World had been entirely in keeping with
this legendary tradition. The conquistadors had won the fabled mines
of the Incas and Aztecs with their military prowess. Even the less
glamorous triumphs of the Portuguese conformed to the 'treasure' image
of getting wealthy. Venturing into uncharted oceans, they had bravely
blazed a water trail to the riches of the Orient.
The Dutch, on the other hand, had made their money in a most mundane
fashion. No aura of gold and silver, perfumed woods, rare stones,
aromatic spices, or luxurious fabrics attended their initial
successes. Instead their broad-bottomed flyboats plied the waters of
the North Sea in an endless circulation of European staples. From this
inglorious foundation the industrious people of the Low Countries had
turned their cities into the emporiums of the world. The Dutch were
the ones to emulate, but to emulate was not easy, for the market
economy was not a single thing but a complicated mix of human
activities that seemed to sustain itself. ...
In the Dutch example there were challenging contradictions between
appearances and reality with puzzling divergences between expectations
based upon established truths and what actually happened. Without
mines, how did the Dutch come to have plenty of coin? With few natural
resources for export, how could the Dutch engross the production of
other countries? How did the Dutch have low interest rates and high
land values? How were high wages maintained with a burgeoning
population? How could high prices and widespread prosperity exist
simultaneously in the Low Countries (Holland)?
Author: Joyce Appleby
Title: The Relentless Revolution
Publisher: Norton
Date: Copyright 2010 by Joyce Appleby
Pages: 99-100
-------------------------------------------------------------------------------------------------In today's excerpt - the elements. At one point, aluminum was
considered more rare and precious than silver:
"There are ninety-two naturally occurring elements on Earth, plus a
further twenty or so that have been created in labs, but some of these
we can immediately put to one side - as, in fact, chemists themselves
tend to do. Not a few of our earthly chemicals are surprisingly little
known. Astatine, for instance, is practically unstudied. It has a name
and a place on the periodic table (next door to Marie Curie's
polonium), but almost nothing else. The problem isn't scientific
indifference, but rarity. There just isn't much astatine out there.
The most elusive element of all, however, appears to be francium,
which is so rare that it is thought that our entire planet may
contain, at any given moment, fewer than twenty francium atoms.
Altogether only about thirty of the naturally occurring elements are
widespread on Earth, and barely half a dozen are of central importance
to life.
"As you might expect, oxygen is our most abundant element, accounting
for just under 50 percent of the Earth's crust, but after that the
relative abundances are often surprising. Who would guess, for
instance, that silicon is the second most common element on Earth or
that titanium is tenth? Abundance has little to do with their
familiarity or utility to us. Many of the more obscure elements are
actually more common than the better-known ones. There is more cerium
on Earth than copper, more neodymium and lanthanum than cobalt or
nitrogen. Tin barely makes it into the top fifty, eclipsed by such
relative obscurities as praseodymium, samarium, gadolinium, and
dysprosium.
"Abundance also has little to do with ease of detection. Aluminum is
the fourth most common element on Earth, accounting for nearly a tenth
of everything that's underneath your feet, but its existence wasn't
even suspected until it was discovered in the nineteenth century by
Humphry Davy, and for a long time after that it was treated as rare
and precious. Congress nearly put a shiny lining of aluminum foil atop
the Washington Monument to show what a classy and prosperous nation we
had become, and the French imperial family in the same period
discarded the state silver dinner service and replaced it with an
aluminum one. The fashion was cutting edge even if the knives weren't.
"Nor does abundance necessarily relate to importance. Carbon is only
the fifteenth most common element, accounting for a very modest 0.048
percent of Earth's crust, but we would be lost without it. What sets
the carbon atom apart is that it is shamelessly promiscuous. It is the
party animal of the atomic world, latching on to many other atoms
(including itself) and holding tight, forming molecular conga lines of
hearty robustness - the very trick of nature necessary to build
proteins and DNA. As Paul Davies has written: 'If it wasn't for
carbon, life as we know it would be impossible. Probably any sort of
life would be impossible.' Yet carbon is not all that plentiful even
in humans, who so totally depend on it. Of every 200 atoms in your
body, 126 are hydrogen, 51 are oxygen, and just 19 are carbon."
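[Editor's note - converting those counts to shares: 126 + 51 + 19 = 196 of every 200 atoms, which leaves just 4 in 200 for everything else:

\[
\tfrac{126}{200} = 63\% \ \text{hydrogen}, \quad \tfrac{51}{200} = 25.5\% \ \text{oxygen}, \quad \tfrac{19}{200} = 9.5\% \ \text{carbon}, \quad \tfrac{4}{200} = 2\% \ \text{all other elements.}
\]
]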
Author: Bill Bryson
Title: A Short History of Nearly Everything
Publisher: Broadway Books
Date: Copyright 2003 by Bill Bryson
Pages: 250-251
-------------------------------------------------------------------------------------------------In today's excerpt - Ptolemy XII, the father of Cleopatra VII, who was
the famed lover of both Caesar and Mark Antony. How do you lose a
kingdom? Through profligate spending. Ptolemy's reckless spending and
negligent administration led to unrest throughout Egypt, and meant
that he had to turn to Roman bankers to rescue him, making the country
of Egypt itself the collateral for the Roman loan. And when Ptolemy
left the country in disgrace, the Romans had to restore him to power
so he could try and make good on his loan. He never fully did, and as
a result, Rome's takeover of Egypt from his daughter Cleopatra, though
caught in the intrigue of a power struggle between Caesar and Mark
Antony, was less a conventional military conquest and more the
collection of an overdue loan:
"Ptolemy's lavishness cost him dearly, with both internal instability
and Roman concern increasing. There are scattered notices of
disturbances in Egypt all through the 60s B.C. The historian Diodoros,
who visited Egypt about this time, witnessed a riot and lynching that
occurred when someone accidentally committed the sacrilege of killing
a cat, an incident that was notable for the failure of government
officials sent to the scene to intervene. Taxes were increased,
resulting in strikes by farmers in the villages: as was usual in times
of financial excess and overseas adventures, the poor suffered the
most. It was said that money to pay the king's debts was exacted by
force. Even the gold sarcophagus of Alexander the Great was melted
down. Civil disturbances reached such a point that in 63 B.C. Ptolemy
had to issue an order that unauthorized persons could not enter temple
treasuries. His expenditures soon reached a point that he went into
debt, borrowing from the famous Roman banker C. Rabirius Postumus. ...
"Discontent and opposition to his rule, especially his failure to hold
traditional Ptolemaic territory and to keep it from the Romans, as
well as his financial policies, resulted either in his expulsion or,
more likely, voluntary departure from Egypt in the summer of 58 B.C.,
[leaving his wife Cleopatra VI in charge.] ...
"The indebtedness of Ptolemy XII to Roman bankers meant that his
political survival was more than an idle question in Rome, since the
best way to ensure that the debts would be paid would be to implement
his restoration and thus give him renewed access to the Egyptian
treasury. ...
"In Alexandria the death of Cleopatra VI during her husband's exile
created an awkward situation, as the surviving queen Berenike IV had
no husband [so] one had to be found. ... A certain Archelaos was
finally successful. His background is contradictorily described in the
sources, but he claimed to have been a descendant of Mithradates the
Great and happened to be a protege of [triumvirate member] Pompeius.
"The Romans were not finished with Ptolemy XII, however. Although
Archelaos and Berenike seemed firmly in control in Egypt with
Archelaos now accepted as king, the Roman bankers, led by Rabirius
Postumus, knew that restoration of Ptolemy was their only means of
salvation. Yet discussions about whether to restore the king led to
rioting in Rome, less out of concern about Ptolemy's future than the
machinations of those in power, whose interests included the future of
Egypt. Pompeius persuaded Gabinius, the governor of Syria, to bring
about the restoration, and his willingness to comply was eased by
10,000 talents provided by Ptolemy."
Author: Duane W. Roller
Title: Cleopatra
Publisher: Oxford
Date: Copyright 2010 by Oxford University Press, Inc.
Pages: 21-24
-------------------------------------------------------------------------------------------------In today's excerpt - teaching. Through many years of systematic
observation of some of the very best teachers, Doug Lemov has
identified forty-nine key techniques that separate the very best
teachers from merely good ones. One of these forty-nine techniques he
has labeled "Right is Right":
" 'Right Is Right' is about the difference between partially right and
all-the-way right - between pretty good and 100 percent. The job of
the teacher is to set a high standard for correctness: 100 percent.
The likelihood is strong that students will stop striving when they
hear the word right (or yes or some other proxy), so there's a real
risk to naming as right that which is not truly and completely right.
When you sign off and tell a student she is right, she must not be
betrayed into thinking she can do something that she cannot.
"Many teachers respond to almost-correct answers their students give
in class by rounding up. That is, they'll affirm the student's answer
and repeat it, adding some detail of their own to make it fully
correct even though the student didn't provide (and may not recognize)
the differentiating factor. Imagine a student who's asked at the
beginning of Romeo and Juliet how the Capulets and Montagues get
along. 'They don't like each other,' the student might say, in an
answer that most teachers would, I hope, want some elaboration on
before they called it fully correct. 'Right,' the teacher might reply.
'They don't like each other, and they have been feuding for
generations.' But of course the student hadn't included the additional
detail. That's the 'rounding up.' Sometimes the teacher will even give
the student credit for the rounding up as if the student said what he
did not and what she merely wished he'd said, as in, 'Right, what
Kiley said was that they don't like each other and have been feuding.
Good work, Kiley.' Either way, the teacher has set a low standard for
correctness and explicitly told the class that they can be right even
when they are not. Just as important, she has crowded out students'
own thinking, doing cognitive work that students could do themselves
(e.g., 'So, is this a recent thing? A temporary thing? Who can build
on Kiley's answer?').
"When answers are almost correct, it's important to tell students that
they're almost there, that you like what they've done so far, that
they're closing in on the right answer, that they've done some good
work or made a great start. You can repeat a student's answer back to
him so he can listen for what's missing and further correct - for
example, 'You said the Capulets and the Montagues didn't get along.'
Or you can wait or prod or encourage or cajole in other ways to tell
students what still needs doing, ask who can help get the class all
the way there until you get students all the way to a version of right
that's rigorous enough to be college prep: 'Kiley, you said the
Capulets and the Montagues didn't get along. Does that really capture
their relationship? Does that sound like what they'd say about each
other?'
"In holding out for right, you set the expectation that the questions
you ask and their answers truly matter. You show that you believe your
students are capable of getting answers as right as students anywhere
else. You show the difference between the facile and the scholarly.
This faith in the quality of a right answer sends a powerful message to
your students that will guide them long after they have left your
classroom.
"Over the years I've witnessed teachers struggle to defend right
answers. In one visit to a fifth-grade classroom, a teacher asked her
students to define peninsula. One student raised his hand and offered
this definition: 'It's like, where the water indents into the land.'
'Right,' his teacher replied, trying to reinforce participation since
so few hands had gone up. Then she added, 'Well, except that a
peninsula is where land indents into water, which is a little
different.' Her reward to the student for his effort was to provide
him with misinformation. A peninsula, he heard, is pretty much 'where
the water indents into the land' but different on some arcane point he
need not really recall. Meanwhile, it's a safe bet that the students
with whom he will compete for a seat in college are not learning to
conflate bays and peninsulas."
Author: Doug Lemov
Title: Teach Like a Champion
Publisher: Jossey-Bass, A Wiley Imprint
Date: Copyright 2010 by John Wiley & Sons
Pages: 35-37
-------------------------------------------------------------------------------------------------In today's excerpt - slave labor on sugar plantations had to be
replaced every ten to thirteen years:
"Because of its centrality to the sugar trade, the slave trade was the
most hotly contested European venture on the face of the globe. The
numbers themselves shock one into an awareness of its significance.
Between 1501 and 1820 slavers took 8.7 million Africans in chains to
the Western Hemisphere; between 1820 and the final abolition of
slavery in Brazil in 1888, 2.3 million more were sent. A total of 11
million men and women came from Africa to the New World colonies in
comparison with the 2.6 million Europeans who crossed the Atlantic in
the same period. Over one hundred thousand separate voyages brought
this human cargo, 70 percent of them owned by either British or
Portuguese traders.
"Sugar was one of capitalism's first great bonanzas; its successes
also revealed the power of the profit motive to override any cultural
inhibitions to gross exploitation. Slavery was old. Egyptian slaves
had built pyramids; Roman ones, bridges and aqueducts. What capitalism
introduced was sustained and systematic brutality in the making of
goods on a scale never seen before. It's not size alone that
distinguishes modern slavery from its ancient lineage in Greece and
biblical times; it's also race. Slavery then often had an ethnic
component because slaves were taken as the captives of war, but never
a consistently racial one. When the Portuguese brought back captured
Africans to work in depopulated Lisbon starting in the fifteenth
century, the trade didn't differ much from the commerce in slaves that
the Arabs had been conducting for several centuries throughout central
and eastern Africa. A hundred years later, something new had been
added to this commerce in human beings: They were integrated into an
expanding production system. Those sent to the Caribbean were put to
work in gangs planting sugarcanes, chopping weeds, cutting the
harvest, crushing the canes in the mills that turned out molasses and
sugar. The very size of the trade promoted warfare in Africa in order
to meet the new demand for slaves. ...
"Sometimes bands of freelancing armed Africans raided villages and
sold their captives to all comers. Along thirty-five hundred miles of
coastline from Senegambia to Angola, traders gathered slave cargoes
that they sold for European goods. Slave sellers particularly favored
guns with which to capture more men and women. Separated by sex for
the voyage across the Atlantic, the captives were packed into ships,
each person confined to a space of four square feet for a period of
eight to twelve weeks. A typical voyage would carry 150 to 400
persons, 12 to 15 percent of whom usually died en route. Revolts broke
out in about 10 percent of all voyages, almost always in the first
weeks....
"The sugar planters, who invested their capital in plantations, worked
their slaves and land as hard as possible. They accepted the
inevitable decline of the soil in the pursuit of quick returns. So
profitable was the crop and so cruel the plantation owners that they
literally worked their slaves to death. The labor force in the
Caribbean had to be replaced about every ten to thirteen years."
Author: Joyce Appleby
Title: The Relentless Revolution
Publisher: Norton
Date: Copyright 2010 by Joyce Appleby
Pages: 124-125, 129-130
-------------------------------------------------------------------------------------------------In today's excerpt - many contend that the current Deepwater Horizon
oil spill is the worst ecological disaster in U.S. history. Others
disagree. Some of the other contenders are the Dust Bowl, the
Johnstown Flood, and the Lakeview Gusher:
"For sheer disruption to human lives, several [environmental
historians and other experts] could think of no environmental problem
in American history quite equaling the calamity known as the Dust
Bowl. 'The Dust Bowl is arguably one of the worst ecological blunders
in world history,' said Ted Steinberg, a historian at Case Western
Reserve University. Across the High Plains, stretching from the Texas
Panhandle to the Dakotas, poor farming practices in the early part of
the 20th century stripped away the native grasses that held moisture
and soil in place. A drought that began in 1930 exposed the folly.
"Boiling clouds of dust whipped up by harsh winds buried homes and
cars, destroyed crops, choked farm animals to death and sent children
to the hospital with pneumonia. At first the crisis was ignored in
Washington, but then the apocalyptic clouds began to blow all the way
to New York, Buffalo and Chicago. A hearing in Congress on the
disaster was interrupted by the arrival of a dust storm. By the
mid-1930s, people started to give up on the region in droves. The Dust
Bowl refugees joined a larger stream of migrants displaced by
agricultural mechanization, and by 1940 more than two million people
had left the Great Plains States.
"However, the Dust Bowl lasted a decade, and that raises an issue.
What exactly should be defined as an environmental disaster? How long
should an event take to play out, and how many people have to be
harmed before it deserves that epithet? Among sudden events, the
Johnstown Flood might be a candidate for worst environmental disaster.
On May 31, 1889, heavy rains caused a poorly maintained dam to burst
in southwestern Pennsylvania, sending a wall of water 14 miles
downriver to the town of Johnstown. About 2,200 people were killed in
one of the worst tolls in the nation's history. At the time it
happened, that event was understood as a failure of engineering and
maintenance, and that is how it has come down in history. Perhaps a
one-day flood is simply too short-term to count as an environmental
disaster.
"On the other hand, if events that played out over many decades are
included, the field of candidates expands sharply. Perhaps the
destruction of the native forests of North America, which took
hundreds of years, should be counted as the nation's largest
environmental calamity. The slaughtering of millions of bison on the
Great Plains might qualify. Craig E. Colten, a geographer at Louisiana
State University, nominates 'the human overhaul of the Mississippi
River Valley,' which destroyed many thousands of acres of wetlands and
made the region more vulnerable to later events like Hurricane
Katrina. However, those activities were not seen as disasters at the
time, at least by the people who carried them out. They were viewed as
desirable alterations of the landscape. It is only in retrospect that
people have come to understand what was lost, so maybe those do not
belong on a disaster list.
"Oil spills, too, seem to be judged more by their effect on people
than on the environment. Consider the Lakeview Gusher, which was
almost certainly a worse oil spill, by volume, than the one continuing
in the gulf. In the southern end of California's San Joaquin Valley,
an oil rush was on in the early decades of the 20th century. On March
14, 1910, a well halfway between the towns of Taft and Maricopa, in
Kern County, blew out with a mighty roar. It continued spewing huge
quantities of oil for 18 months. The version of events accepted by the
State of California puts the flow rate near 100,000 barrels a day at
times. 'It's the granddaddy of all gushers,' said Pete Gianopulos, an
amateur historian in the area. The ultimate volume spilled was
calculated at 9 million barrels, or 378 million gallons. According to
the highest government estimates, the Deepwater Horizon spill is not
yet half that size.
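[The volume conversion is simple to check - one line of Python,
assuming only the standard 42-gallon U.S. oil barrel:

    # Lakeview Gusher: barrels to gallons (figures from the excerpt).
    print(9_000_000 * 42)   # 378000000, i.e. 378 million gallons

The 378-million-gallon figure follows directly from the 42-gallon
barrel.]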
"The Lakeview oil was penned in immense pools by sandbags and earthen
berms, and nearly half was recovered and refined by the Union Oil
Company. The rest soaked into the ground or evaporated. Today, little
evidence of the spill remains, and outside Kern County, it has been
largely forgotten. That is surely because the area is desert
scrubland, and few people were inconvenienced by the spill."
Author: Justin Gillis
Title: "Where Gulf Spill Might Place on the Roll of Disasters"
Publisher: The New York Times
Date: June 18, 2010
-------------------------------------------------------------------------------------------------In today's excerpt - in the thirteen years after the Declaration of
Independence but before the U.S. Constitution was written and in
place, the state governments of the thirteen states reigned supreme.
Many of these states had constitutions that were bold experiments in
democracy, and some state legislatures had more common people in
office than gentry. The result was often chaos - inflation, liberal
debtor relief, and even rebellion - that created great discomfort for
the founders, who were almost all landed gentry. And so the U.S.
Constitution was designed in no small part to curb this democracy and
the excesses of these state governments:
"The Federal Constitution of 1787 was designed in part to solve the
problems created by the presence in the state legislatures of [common
people]. In addition to correcting the deficiencies of the Articles of
Confederation, the Constitution was intended to restrain the excesses
of democracy and protect minority rights from overbearing majorities
in the state legislatures. But could that be done within a republican
framework? Some thought not. 'You may think it very Extraordinary,'
Joseph Savage of New Jersey told his son in July 1787, 'but the better
sort of people are very desirous of a Monarchical government and are
in daily Expectation of having it recommended by those Gentlemen in
Philadelphia.' Of course, the middling sorts in the states did not
think there was too much democracy in the various legislatures. ...
"Certainly no one described the crisis of American politics in 1787
more acutely than did the thirty-six-year-old Virginian James Madison.
Madison had become a member of the Continental Congress at age twenty-eight and was thoroughly familiar with the Confederation's weaknesses.
Indeed, throughout the middle 1780s he, along with other national
leaders, had wrestled with various schemes for overhauling the
Articles of Confederation. But it was his experience serving in the
Virginia assembly in 1784-1787 that convinced him that the real
problem of American politics lay in the state legislatures. During the
1780s he saw many of his and Jefferson's plans for reform mangled by
factional fighting and majoritarian confusion in the Virginia
assembly. More than any other Founder, Madison questioned the
conventional wisdom of the age concerning majority rule, the proper
size for a republic, and the role of factions in society. His thinking
about the problems of creating republican governments and his writing
of the Virginia Plan in 1787, which became the working model for the
Constitution, constituted one of the most creative moments in the
history of American politics. ...
"The Constitution corrected the deficiencies of the Confederation by
granting the new national government some extraordinary powers, powers
that ambitious state-builders could exploit. The Convention, however,
rejected Madison's impractical plan for a national congressional veto
over all state laws, a rejection that Madison feared would doom the
Constitution to failure. Instead, the Convention in Article 1, Section
10, prohibited the states from exercising a remarkable number of
powers, including levying import or export duties, printing paper
money (which they had done in abundance and had created inflation and
fiscal chaos), and enacting various debtor relief laws and laws
impairing contracts. But if these prohibitions were not enough to
prevent the excesses of localist and interest-ridden democracy in the
states, then the expanded and elevated structure of the federal
government itself was designed to help."
Author: Gordon S. Wood
Title: Empire of Liberty
Publisher: Oxford
Date: Copyright 2009 by Oxford University Press, Inc.
Pages: 31-33
-------------------------------------------------------------------------------------------------In today's excerpt - the prefrontal cortex, which is the part of the
brain associated with emotional maturity, does not fully develop in
humans until they are in their mid-twenties. This may be because the
prefrontal cortex, though it brings emotional balance, focus, planning
and efficient action, restricts a person from the most creative
aspects of learning:
"From an evolutionary perspective, one of the most striking things
about human beings is our long period of immaturity. We have a much
lon ger childhood than any other species. Why make babies so helpless
for so long and thus require adults to put so much work and care into
keep ing their babies alive?
"Across the animal kingdom, the intelligence and flexibility of adults
are correlated with the immaturity of babies. 'Precocial' species such
as chickens rely on highly specific innate capaci ties adapted to one
particular environmental niche, and so they mature quickly.
'Altricial' species (those whose offspring need [long] care and
feeding by parents) rely on learning instead. Crows, for instance, can
take a new object, such as a piece of wire, and work out how to turn
it into a tool, but young crows depend on their parents for much
longer than chickens.
"A learning strategy has many advantages, but until learning takes
place, you are helpless. Evo lution solves this problem with a
division of la bor between babies and adults. Babies get a protected
time to learn about their environment, without having to actually do
anything. When they grow up, they can use what they have learned to be
better at surviving and reproduc ing-and taking care of the next
generation. Fundamentally, babies are designed to learn.
"Neuroscientists have started to understand some of the brain
mechanisms that allow all this learning to occur. Baby brains are more
flexible than adult brains. They have far more connec tions between
neurons, none of them particular ly efficient, but over time they
prune out unused connections and strengthen useful ones. Baby brains
also have a high level of the chemicals that make brains change
connections easily.
"The brain region called the prefrontal cortex is distinctive to
humans and takes an especially long time to mature. The adult
capacities for fo cus, planning and efficient action that are gov

erned by this brain area depend on the long learning that occurs in
childhood. This area's wiring may not be complete until the mid-20s.
"The lack of prefrontal control in young chil dren naturally seems
like a huge handicap, but it may actually be tremendously helpful for
learn ing. The prefrontal area inhibits irrelevant thoughts or
actions. But being uninhibited may help babies and young children to
explore freely. There is a trade-off between the ability to ex plore
creatively and learn flexibly, like a child, and the ability to plan
and act effectively, like an adult. The very qualities needed to act
efficient ly-such as swift automatic processing and a highly pruned
brain network-may be intrinsi cally antithetical to the qualities."
Author: Alison Gopnik
Title: "How Babies Think"
Publisher: Scientific American
Date: July 2010
Page: 81
-------------------------------------------------------------------------------------------------In today's excerpt - China. Most commentators view China as an
inevitable world superpower over the course of the 21st century.
Noted futurist George Friedman disagrees:
"My discussion of the future has to begin with a discussion of China.
One-quarter of the world lives in China, and there has been a great
deal of discussion of China as a future global power. Its economy has
been surging dramatically in the past thirty years, and it is
certainly a significant power. But thirty years of growth does not
mean unending growth. It means that the probability of China
continuing to grow at this rate is diminishing. And in the case of
China, slower growth means substantial social and political problems.
I don't share the view that China is going to be a major world power.
I don't even believe it will hold together as a unified country. But I
do agree that we can't discuss the future without first discussing
China. ...
"The vast majority of China's population lives within one thousand
miles of the coast, populating the eastern third of the country, with
the other two-thirds being quite underpopulated. ... [After Mao], the
coastal regions again became prosperous and closely tied to outside
powers. Inexpensive products and trade produced wealth for the great
coastal cities like Shanghai, but the interior remained impoverished.
Tensions between the coast and the interior increased, but the Chinese
government maintained its balance and Beijing continued to rule,
without losing control of any of the regions and without having to
risk generating revolt by being excessively repressive. ...
"Underlying this is another serious, and more threatening,
problem. ... Between Asian systems of family and social ties and the
communist systems of political relationships, [bank] loans have been
given out for a host of reasons, none of them having much to do with
the merits of the business. As a result, not surprisingly, a
remarkably large number of these loans have gone bad -
'nonperforming,' in the jargon of banking. The amount is estimated at
somewhere between $600 billion and $900 billion, or between a quarter
and a third of China's GDP, a staggering amount....
"Japan's bad debt rate around 1990 was, by my estimate, about 20
percent of GDP. China's, under the most conservative estimate, is
about 25 percent - and I would argue the number is closer to 40
percent. But even 25 percent is staggeringly high. China's economy
appears healthy and vibrant, and if you look only at how fast the
economy is growing, it is breathtaking. Growth is only one factor to
examine, however. The more important question is whether such growth
is profitable. Much of China's growth is very real, and it generates
the money necessary to keep the banks satisfied. But this growth
really does not strengthen the economy. And if and when it slacks off,
for example because of a recession in the United States, the entire
structure could crumble very fast.
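[A rough consistency check on these figures in Python - the implied
GDP is an inference from the excerpt's numbers, not a figure Friedman
states directly:

    # "$600 billion to $900 billion, or between a quarter and a third of GDP"
    print(600e9 / 0.25)   # 2.4e12: implied GDP at the low end
    print(900e9 * 3)      # 2.7e12: implied GDP at the high end

Both bounds put China's GDP at roughly $2.4-2.7 trillion, the base
against which the 25 to 40 percent bad-debt estimates are measured.]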
"This is not a new story in Asia. Japan was a growth engine in the
1980s. Conventional wisdom said it was going to bury the United
States. But in reality, while Japan's economy was growing fast, its
growth rates were unsustainable. When growth slumped, Japan had a
massive banking crisis from which it has not really fully recovered
almost twenty years later. Similarly, when East Asia's economy
imploded in 1997, it came as a surprise to many, since the economies
had been growing so fast. ...
"A Chinese businessman in Shanghai ... makes far more money from
relationships with Los Angeles, New York, and London than he does from
Beijing. As Beijing tries to clamp down on him, not only will he want
to break free of its control, but he will try to draw in foreign
powers to protect his and their interests. In the meantime, the much
poorer people in the interior of the country will be either trying to
move to the coastal cities or pressuring Beijing to tax the coast and
give them money. Beijing, caught in the middle, either weakens and
loses control or clamps down so hard that it moves back to a Maoist-type closure of the country. ...
"A [very real] possibility is that under the stress of an economic
downturn, China fragments along traditional regional lines, while the
central government weakens and becomes less powerful. ... A very real
future for China in 2020 is its old nightmare - a country divided
among competing regional leaders, foreign powers taking advantage of
the situation to create regions where they can define economic rules
to their advantage, and a central government trying to hold it all
together but failing. A second possibility is a neo-Maoist China,
centralized at the cost of economic progress. As always, the least
likely scenario is the continuation of the current situation
indefinitely."
Author: George Friedman
Title: The Next 100 Years
Publisher: Anchor Books
Date: Copyright 2009 by George Friedman
Pages: 88-100
-------------------------------------------------------------------------------------------------In today's excerpt - how memory works:
"Many people wish their memory worked like a video recording. How
handy would that be? Finding your car keys would simply be a matter of
zipping back to the last time you had them and hitting 'play.' You
would never miss an appointment or forget to pay a bill. You would
remember everyone's birthday. You would ace every exam. Or so you
might think. In fact, a memory like that would snare mostly useless
data and mix them willy-nilly with the information you really needed.
It would not let you prioritize or create the links between events
that give them meaning. For the very few people who have true
photographic recall - eidetic memory, in the parlance of the field - it is more burden than blessing.
"For most of us, memory is not like a video recording - or a notebook,
a photograph, a hard drive or any of the other common storage devices
to which it has been compared. It is much more like a web of
connections between people and things. Indeed, recent research has
shown that some people who lose their memory also lose the ability to
connect things to each other in their mind. And it is the connections
that let us understand cause and effect, learn from our mistakes and
anticipate the future. ...
"Learning and memory are not sequestered in their own storage banks
but are distributed across the entire cerebral cortex. ... The
significance of these findings is profound. It means that memory is
dispersed, forming in the regions of the brain responsible for
language, vision, hearing, emotion and other functions. It means that
learning and memory arise from changes in neurons as they connect to
and communicate with other neurons. And it means that a small reminder
can reactivate a network of neurons wired together in the course of
registering an event, allowing you to experience the event anew.
Remembering is reliving. ...
"The hippocampus [is] an essential mediator in [connecting neurons].
In a very small brain, every neuron might be connected to every other
neuron. But a human brain that worked on this model would require that
each of hundreds of billions of neurons be linked to every other
neuron, an impossibly unwieldy configuration. The hippocampus solves
this problem by serving as a kind of neural switchboard, connecting
the distant cortical regions for language, vision and other abilities
as synaptic networks take shape and create memories.
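[The arithmetic behind 'impossibly unwieldy' - a minimal Python
sketch, taking the excerpt's 'hundreds of billions of neurons' as a
round 10^11 (that exact count is an assumption for illustration):

    n = 10**11                # neurons, rough figure from the excerpt
    print(n * (n - 1) // 2)   # ~5e21 links if every neuron wired to every other
    print(n)                  # ~1e11 links if each connects through one hub

Full pairwise wiring grows with the square of the neuron count, while
routing through a central switchboard like the hippocampus grows only
linearly - the point of the analogy.]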
"[People with hippocampus damage] appear to have impairments that go
well beyond the loss of memory creation. They also have severe
difficulty imagining future events, living instead in a fragmented,
disconnected reality. Recent studies show that imagining the future
involves brain processes similar to, but distinct from, those involved
in conjuring the past. We also tend to remember the people and events
that resonate emotionally, which is why forgetting an anniversary is
such an offense: it is fair evidence that the date is not as important
as the ones we do remember. The discovery that memory is all about
connections has revolutionary implications for education. It means
that memory is integral to thought and that nothing we learn can stand
in isolation; we sustain new learning only to the degree we can relate
it to what we already know. ...
"The connections across the brain also help us conceive the future, as
recent imaging studies have shown. Functional magnetic resonance
imaging ... shows that a mosaic of brain areas similar to those
involved in memory is active when participants imagine details of
hypothetical or prospective events. ...
"[This] can sometimes cause us problems by altering our memories
instead of augmenting them. ... Psychologist Elizabeth Loftus [has
shown] how easy it is to create false memories of past events. In one
study, participants watched a film of a car accident. Researchers
asked some subjects how fast they thought the cars were going when
they 'smashed into' each other and asked other subjects how fast the
cars were going when they 'hit' each other. The subjects who heard the
word 'smashed' gave significantly higher estimates of the speed. In
other experiments, subjects were fed incorrect information about an
accident after watching the film; they might, for instance, be asked
repeatedly whether a traffic light had turned yellow before the
collision when in fact the light was green. Many then remembered a
yellow light that never existed - which is why eyewitness testimony
after police interrogation can be so unreliable."
Author: Anthony J. Greene
Title: "Making Connections"
Publisher: Scientific American Mind
Date: July/August 2010
Pages: 22-29
--------------------------------------------------------------------------------------------------

In today's encore excerpt - from the annals of science: in 1917, a
cure was found for impotence and related maladies that involved
transplanting the glands and testicles of goats, monkeys and humans
into the patient. Though ultimately found to be fraudulent, it became
a craze that swept across the U.S. and captured as patients a wide
swath of Americans from movie stars to moguls:
"Ever since man began to walk upright, he had been obsessed when his
penis would not behave likewise and searched for ways to fix the
problem. The world's earliest known medical document, the so-called
Edwin Smith Papyrus of Egypt dating from 1600 B.C., presents a
strikingly sophisticated view of trauma surgery - except on the back,
where one finds 'Incantation for Transforming an Old Man into a Youth
of Twenty.' In ancient Greece an herb called satyrion, recommended by
the philosopher Theophrastus in 320 B.C., was swiftly harvested to
extinction. During the ensuing centuries cloves, ginger, and massaging
one's genitals in ass's milk all had their vogue. In England around
the year 1000, men were devouring 'love bread' (naked maidens romped
in wheat, which was then harvested counterclockwise). The Middle Ages
favored lubrication of the afflicted member with melted fat from camel
humps. ...
"[For doctors experimenting in the 1910s], finding a human donor [of
testicles for transplants to test a 'cure' for impotence] was actually
easy, thanks to the help of Dr. Leo Stanley, chief surgeon at San
Quentin prison in California. Three or four hangings a year offered
the perfect chance to relieve relatively young men of their testicles
without an argument. ... Testicles of these deceased felons were
inserted into other prisoners, usually geezers with no chance of
parole. According to Dr. Stanley's reports, most showed improvement.
Seventy-two-year-old Mark Williams, half-senile at the implant, perked
up within five days. ... Scientific journals, including JAMA, gave
this work wide and respectful coverage. Dr. Stanley himself ...
injected or implanted testicular material, both animal and human, into
643 inmates and 13 physicians. ...
"At the Park Avenue Hospital in Chicago in 1920, Dr. John Brinkley
performed thirty-four goat-gland transplants, pausing often to chat
with reporters. ... He had to say that his own technique, in which the
goat gland 'humanized' in the scrotal sac, was 'far in advance of the
Old World experts.' ... Dr. Brinkley was now averaging fifty operations
a month at $750 apiece, for a take of almost half a million dollars a
year (in 1920s currency). Most patients walked in and lay down without
even asking how the thing worked. 'I suppose a goat gland is a good
deal like a potato,' said seventy-seven-year-old A.B. Pierce of
Nebraska. 'You can cut a potato all in pieces and plant it and every
eye will grow.' "
Author: Pope Brock
Title: Charlatan
Publisher: Crown
Date: Copyright 2008 by Pope Brock
Pages: 32-53
-------------------------------------------------------------------------------------------------In today's encore excerpt, America formally declares its independence
from England. The long-standing British occupation had turned into
war, and Americans had already fought the British well at Bunker Hill,
Dorchester Heights, and Fort Ticonderoga, and in July of 1776, were
days away from a demoralizing loss at the Battle of Brooklyn. But
America had not yet formally declared its independence:
"Congress adopted independence on July 2, 1776. It issued the
Declaration on the fourth...It was only after it was on parchment and
brought back to Congress on August 2 that they formally signed the
document...Congress didn't actually circulate a copy of the document
with signatures until January 1777. Why? Well, this was a confession
of treason. You were putting your head in the noose. And the war was
going very, very poorly in 1776. Only after Trenton and Princeton made
it possible (in December) to believe that Americans might win this war
did they circulate the document with their signatures."
Pauline Maier, from Brian Lamb's Booknotes, Penguin, 2001, p. 13
"In Philadelphia, the same day as the British landing on Staten
Island, July 2, 1776, the Continental Congress, in a momentous
decision, voted to 'dissolve the connection' with Great Britain. The
news reached New York four days later, on July 6th, and at once
spontaneous celebrations broke out...On Tuesday, July 9th, at six in
the evening, on (Washington's) orders, the several brigades in the
city were marched onto the Commons and other parade grounds to hear
the Declaration read aloud."
"The formal readings concluded, a great mob of cheering, shouting
soldiers and townspeople stormed down Broadway to Bowling Green,
where, with ropes and bars, they pulled down the gilded lead statue of
George III on his colossal horse. In their fury the crowd hacked off
the sovereign's head, severed the nose, clipped the laurels that
wreathed his head, and mounted what remained of the head on a spike
outside the tavern."
Author: David McCullough
Title: 1776
Publisher: Simon & Schuster
Date: Copyright 2005 by David McCullough
Pages: 135-137

-------------------------------------------------------------------------------------------------In today's excerpt - the birth of the "greenback," the national paper
money created as a means of helping to pay for the unprecedented cost
- $3.4 billion in 1865 dollars or $50 billion today - of the U.S.
Civil War:
"To support the war effort, Union leaders also resorted to a device
utilized liberally during the Revolution: the printing press. Before
the outbreak of the Civil War, federally issued money consisted of
gold, silver, and copper coins. [Founding Father Alexander] Hamilton
had been clear in warning that 'the stamping of paper is an operation
so much easier than the laying of taxes that a government in the
practice of such paper emissions would rarely fail in any such
emergency to indulge itself too far.' His fear, of course, was that the
overzealous printing of money would lead to inflation, whereas notes
issued by an independent, nationally chartered bank, backed by gold,
would be a credible national currency.
"But there was no central bank in 1861. ... To make matters worse, the
banking system was in chaos. In the years immediately before the Civil
War, roughly sixteen hundred state-chartered banks dotted the American
landscape, each issuing its own notes. Roughly seven thousand
varieties of banknotes were in circulation. Some were issued by
legitimate state-chartered banks, but many were of dubious quality or
simply counterfeit. ...
"Because the notes of state-chartered banks were generally accepted
only in the state of the issuing bank, the government had difficulty
in procuring goods and services for the military, just as it did
during the War of 1812. ... The Union government 'needed to establish
a currency [that would be] uniformly acceptable.'
"In December 1861, the Ways and Means Subcommittee ... drafted a bill
to create a new currency that ... would not be redeemable on demand
for specie [gold]. [Treasury Secretary Salmon] Chase, who believed the
financial system should be rooted in gold, registered his profound
objections. Like Hamilton, he favored a currency consisting of notes
issued by nationally chartered banks that were backed by government
bonds, which were in turn backed by gold.
"The idea immediately drew fire on the House floor. The horrible
precedent of the 'Continentals' [which almost became worthless during
the Revolutionary War] was frequently cited. Congressman George
Pendleton of Ohio ... warned, 'If this bill is passed, prices will be
inflated ... incomes will depreciate; the savings of the poor will
vanish; the hoardings of the widow will melt away; bonds, mortgages,
and notes - everything of fixed value - will lose their value.' Chase
threatened to resign if the legislation was enacted. ... Opponents
also argued that the Constitution gave Congress the power only to
'coin money' and 'regulate the value thereof' - not to print money.
The collapse in value of the Continentals had been much on the minds
of the framers when this provision was written.
"On February 3, 1862, desperate for cash, Chase changed his tune.
'Immediate action is of great importance,' the secretary informed
Congress; 'The Treasury is nearly empty.' ...
"[Many legislators] considered legal tender "of doubtful
constitutionality' and admitted that it 'shocks all my notions of
political, moral and national honor,' but reconciled themselves to it
because 'to leave the government without resources in such a crisis is
not to be thought of.'
"Late in February, Congress passed the Legal Tender Act, authorizing
an initial issue of the new federal currency. ... The new 'legal
tender' was printed with green ink on one side, and the notes were
quickly nicknamed "greenbacks." (Confederate currency, printed with
blue-gray ink, was known as "bluebacks.") Chase, who was planning to
challenge Lincoln for the Republican presidential nomination in 1864,
had his portrait featured on the widely circulated one-dollar bill.
"Despite supporting the printing of greenbacks and having his picture
placed on them, Chase later had second thoughts. After failing to
dislodge Lincoln as the Republican nominee, and then being fired by
him, Chase was appointed by the president to be chief justice of the
United States. In 1870, Chase wrote the Court's majority opinion
striking down the Legal Tender Act, holding that it was a violation of
the Fifth Amendment's prohibition against taking property without due
process because it forced Americans to accept greenbacks in repayment
of private debts that originally had been contracted to be settled in
gold. This decision was reversed in 1871, after President Ulysses S.
Grant deliberately appointed two justices who disagreed with the 1870
decision.
"The $450 million worth of greenbacks that were issued covered nearly
15 percent of the cost of the war. ... As Hamilton had predicted,
overly enthusiastic use of the federal printing press proved to be a
significant contributor to inflation ... and prices rose by nearly 25
percent annually. This hit workers and soldiers on fixed salaries
especially hard, contributing to social unrest.
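[These figures can be checked against the $3.4 billion war cost cited
at the top of this entry - a one-line Python check:

    # Greenbacks issued as a share of the Union's war cost.
    print(450e6 / 3.4e9)   # ~0.132, i.e. about 13 percent

That is a bit under the 'nearly 15 percent' stated, consistent if the
war-cost base the author uses is somewhat lower than $3.4 billion.]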
"The Confederacy fared much worse. Its economy was devastated by a
9,000 percent inflation rate, caused primarily by far greater resort
to the printing press due to a weak tax base and an inability to raise
outside funds."
Author: Robert D. Hormats
Title: The Price of Liberty
Publisher: Times Books
Date: Copyright 2007 by Robert D. Hormats
Pages: 74-78
-------------------------------------------------------------------------------------------------In today's excerpt - comedy movies. The comedies of the writer-directors of the past, like Preston Sturges, Woody Allen, and John
Hughes, have given way to the more improvisational comedies of today's
writer-actor:
"Nowadays in the comedy industry, a 'Bucket Brigade' of actors,
writers, and directors pitches in to punch up one another's films; the
nearly all-male group includes Jay Roach, Ben Stiller, Owen Wilson,
Seth Rogen, Jonah Hill, Nicholas Stoller, Jason Segel, John Hamburg,
Garry Shandling, Sacha Baron Cohen, Robert Smigel, Adam McKay, and
Will Ferrell. Many of the group's members trained at Second City, or
with such newer improv groups as the Groundlings and the Upright
Citizens Brigade. They read one another's drafts, attend one another's
table reads and rough cuts, and give notes. Lots and lots of
notes. ...
"Bucket Brigade movies are usually ensemble affairs in which every
character is funny, as opposed to an Eddie Murphy movie, in which
Eddie Murphy is sometimes funny. As in a sitcom, the banter tends to
be filmed with three cameras at once, which eliminates the technical
problem of 'matching' the action if an actor does a great improv that
you've filmed only in closeup - that is, of having to reshoot the
improv from the other actors' perspectives to maintain the continuity
of the scene. (Shooting with three cameras can compromise the
lighting; comedies, like documentaries and porn, aren't expected to
have great production values.)
"Most directors of unimprovised comedies shoot around five hundred
thousand feet of film and edit it down to the eight thousand feet that
constitutes a ninety-minute film. Roach shot more than nine hundred
thousand feet for [the upcoming Steve Carell movie] 'Dinner for
Schmucks,' Adam McKay shot more than a million for 'Step Brothers,'
and Judd Apatow always shoots at least a million. Apatow often runs
off an entire eleven-minute magazine of film - a thousand feet - on a
take, hollering alts or letting the actors riff. ['Dinner' director
Jay] Roach said, 'It's a sloppy approach. One out of ten moments is
great, and you watch the nine others go by and hope.'
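[The footage arithmetic is easy to verify - a minimal Python sketch,
assuming standard 35 mm sound film, which runs at 24 frames per second
and therefore 90 feet per minute:

    FEET_PER_MINUTE = 90                # 35 mm film at 24 fps
    print(8_000 / FEET_PER_MINUTE)      # ~89 minutes: the ninety-minute film
    print(1_000 / FEET_PER_MINUTE)      # ~11 minutes: one full magazine
    print(500_000 / 8_000)              # 62.5: typical shooting ratio
    print(1_000_000 / 8_000)            # 125.0: an Apatow-scale shoot

A conventional comedy thus shoots roughly sixty feet for every foot
used; the improvisers shoot twice that.]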
"It's all a painstaking set of procedures aimed at maximum creativity,
a huge planning effort to encourage accidents. ... But, even as
members of the Bucket Brigade troll for every last chuckle, they
remain mindful that it's not the comedy in comedies that keeps people
interested; it's the structure. 'Revenge of the Nerds' and 'My Big Fat
Greek Wedding' were nearly devoid of laughs, but they were big hits
simply because of their clockwork plots. The screenwriter Dennis
Klein observed, 'In standup, improv is the ability to be funny at
will, but in movies even Jim Carrey bending over and talking out of
his ass will get cut if the improv doesn't connect to the ongoing
story.' [In 'Austin Powers'], Dr. Evil's 'Sh!' comedy run works so
well because his refusal to listen to [his son] Scott is what will
allow Austin Powers to escape - and because he and Scott hate each
other.
"The rise of improv expands screenwriting into the realm of acting.
The best contemporary improvisers - including Ferrell, Myers, and
Carell - can riff in keeping with the underlying story because they
often wrote the underlying story. Comedies, once the province of
writer-directors like Preston Sturges, Woody Allen, and John Hughes,
now belong to the writer-actor."
Author: Tad Friend
Title: "First Banana"
Publisher: The New Yorker
Date: July 5, 2010
Pages: 54-55
-------------------------------------------------------------------------------------------------In today's excerpt - Sir Arthur Conan Doyle, the creator of Sherlock
Holmes. In later life, Conan Doyle converted completely to
'spiritualism,' rejecting the rational and consulting mediums to visit
with the dead. The backdrop for his spiritualism was his father's own
desperate alcoholism, and the death of his brother and son in the
horror of World War I:
"Arthur Conan Doyle's books strike the reader as part of a (possibly
unconscious) project - a series of attempts to articulate systems of
thought which might make sense of the chaos of life and the human
condition ('this circle of misery and violence and fear', as Holmes
puts it in 'The Cardboard Box'). First comes the ratiocination of
Baker Street, inspired by the techniques of Conan Doyle's old
university teacher Dr Joseph Bell (who, in 1892, reviewed the original
Holmes adventures, calling his former clerk 'a born story-teller'),
then extreme patriotism (in 1899, [George] Bernard Shaw boasted that
he had converted Conan Doyle from 'Christmas-card pacifism to rampant
jingoism') and, finally, the magical world-view of spiritualism, a
philosophy which could render even the slaughter of the Great War
explicable. In 1914, Conan Doyle was praising the 'glorious spectacle'
of mass enlistment and imagining that 'our grandchildren will thrill
as they read of the days that we endure'. Twelve years later,
following the deaths of his brother and his eldest son, he had come to
see the trenches as 'God's first warning to mankind' ('ten million

young men were laid dead upon the ground . . . twice as many were
mutilated'), even claiming to be glad that his son was killed ('am I
not far nearer to my son than if he were alive . . . ?'). Spurning
'Victorian science' for having 'left the world hard and clean and
bare, like a landscape in the moon', the doctor was reduced to arguing
that "'have always held that people insist too much upon direct
proof'. ...
"The figure behind much of this is surely Arthur's father, the artist
Charles Altamont Doyle. A chronic alcoholic who, according to Andrew
Lycett's biography Conan Doyle: The Man who Created Sherlock Holmes
(2007), was sometimes to be found dragging 'himself around the
floor . . . unable to remember his own name', and who, 'when nothing
else was available . . . drank furniture varnish'. He spent the last
twelve years of his life in an asylum, or 'Convalescent Home', as
Arthur disguised it in his autobiography, Memories and Adventures,
where he wrote that the old man's 'thoughts were always in the
clouds . . . he had no appreciation of the realities of life'. Russell
Miller, another recent biographer, adduces Charles's confession to his
doctor that he was 'getting messages from the unseen world' and also,
significantly, his belief in fairies. ...
"When one thinks of Charles Altamont Doyle (who on occasion stripped
off his clothes in the street with the intention of selling them to
buy drink), ... It is as though Conan Doyle began his writing life by
assuming a position which repudiated all of Charles's weaknesses,
associating himself instead with the substitute father of Bell before
gradually - painfully - giving himself over to a worldview that
vindicated his parent's supposed insanity and which reduced Bell's
rationalism to blinkered, pharisaical refusal to accept the truth."
Author: Jonathan Barnes
Title: "Mediumistics"
Publisher: The Times Literary Supplement
Date: June 25, 2010
Page: 4
-------------------------------------------------------------------------------------------------In today's excerpt - George Washington's plans for Washington, DC:
"Because of his [interest in symbols and events that would give cause
to citizens of the new United States for national pride above state
pride], Washington was especially interested in the size and character
of the White House and of the capital city that was to be named after
him. The huge scale and imperial grandeur of the Federal City, as
Washington modestly called it, owe much to him and his backing of the
French-born engineer Pierre Charles L'Enfant as architect.

"L'Enfant had migrated from France in 1777 as one of the many foreign
recruits to the Continental Army. In 1779 he became a captain of
engineers and attracted the attention of Washington for his ability to
stage festivals and design medals, including that of the Society of
the Cincinnati. In 1782 he organized the elaborate celebration in
Philadelphia marking the birth of the French dauphin, and in 1788 he
designed the conversion of New York's City Hall into Federal Hall.
Thus it was natural for L'Enfant to write Washington in 1789 outlining
his plans for 'the Capital of this vast Empire.' L'Enfant proposed a
capital that would 'give an idea of the greatness of the empire as
well as ... engrave in every mind that sense of respect that is due a
place which is the seat of a supreme sovereign.' His plan for the
Federal City, he said, 'should be drawn on such a Scale as to leave
room for that aggrandizement & embellishment which the increase of the
wealth of the Nation will permit it to pursue at any period however
remote.'
"Washington knew the site of the national capital had to be larger
than that of any state capital. 'Philadelphia,' the president pointed
out, 'stood upon an area of three by two miles. ... If the metropolis
of one State occupied so much ground, what ought that of the United
States to occupy?' He wanted the Federal City to become a great
commercial metropolis in the life of the nation and a place that would
eventually rival any city in Europe. The new national capital, he
hoped, would become the energizing and centralizing force that would
dominate local and sectional interests and unify the disparate states.
"L'Enfant designed the capital, as he said, in order to fulfill 'the
President's intentions.' The Frenchman conceived of a system of grand
radial avenues imposed on a grid of streets with great public squares
and circles and with the public buildings - the 'grand edifices' of
the 'Congress House' and the 'President's Palace' - placed so as to
take best advantage of the vistas across the Potomac. Some of the
early plans for the rotunda of the Capitol even included a monumental
tomb that was designed eventually to hold the first president's body - a proposal that made Secretary of State Thomas Jefferson very uneasy.
"Although the final plans for the capital were less impressive than
what Washington originally envisioned, they were still grander than
those others had in mind. If Jefferson had had his way, L'Enfant would
never have kept his job as long as he did, and the capital would have
been smaller and less magnificent - perhaps something on the order of
a college campus, like Jefferson's later University of Virginia.
Opposed as he was to anything that smacked of monarchical Europe,
Jefferson thought that fifteen hundred acres would be enough for the
Federal City."
Author: Gordon S. Wood
Title: Empire of Liberty
Publisher: Oxford
Date: Copyright 2009 by Oxford University Press, Inc.
Pages: 79-80
-------------------------------------------------------------------------------------------------In today's excerpt - in the seemingly inexorable subjugation by
Americans of Native Americans and their march westward from the
original thirteen colonies to California, they were turned back
dramatically only once - in a war with the fiercest tribe of them
all, the Comanches of Texas and the Llano Estacado:
"Six years after the end of the Civil War, the western frontier was an
open and bleeding wound, a smoking ruin littered with corpses and
charred chimneys, a place where anarchy and torture killings had
replaced the rule of law, where Indians and especially Comanches
raided at will. Victorious in war, unchallenged by foreign foes in
North America for the first time in its history, the Union now found
itself unable to deal with the handful of remaining Indian tribes that
had not been destroyed, assimilated, or forced to retreat meekly onto
reservations where they quickly learned the meaning of abject
subjugation and starvation. ... No tribe in the history of the
Spanish, French, Mexican, Texan, and American occupations of this land
had ever caused so much havoc and death [as the Comanches]. None was
even a close second.
"Just how bad things were in 1871 along this razor edge of
civilization could be seen in the numbers of settlers who had
abandoned their lands. The frontier, carried westward with so much
sweat and blood and toil, was now rolling backward, retreating.
Colonel Randolph Marcy, who accompanied [Civil War hero and general in
chief of the army William Tecumseh] Sherman on a western tour in the
spring, and who had known the country intimately for decades, had been
shocked to find that in many places there were fewer people than
eighteen years before. 'If the Indian marauders are not punished,' he
wrote, 'the whole country seems in a fair way of becoming totally
depopulated.' This phenomenon was not entirely unknown in the history
of the New World. The Comanches had also stopped cold the northward
advance of the Spanish empire in the eighteenth century - an empire
that had, up to that point, easily subdued and killed millions of
Indians in Mexico and moved at will through the continent.
"Now, after more than a century of relentless westward movement, they
were rolling back [European] civilization's advance again, only on a
much larger scale. Whole areas of the borderlands were simply emptying
out, melting back eastward toward the safety of the forests. One
[Texas] county - Wise - had seen its population drop from 3,160 in the
year 1860 to 1,450 in 1870. In some places the line of settlements had
been driven back a hundred miles. If General Sherman wondered about
the cause - as he once did - his tour with Marcy relieved him of his

doubts. That spring they had narrowly missed being killed themselves
by a party of raiding Indians. The Indians, mostly Kiowas, passed them
over because of a shaman's superstitions and had instead attacked a
nearby wagon train. What happened was typical of the savage, revenge-driven attacks by Comanches and Kiowas in Texas in the postwar years.
What was not typical was Sherman's proximity and his own very personal
and mortal sense that he might have been a victim, too. Because of
that the raid became famous, known to history as the Salt Creek
Massacre.
"Seven men were killed in the raid, though that does not begin to
describe the horror of what [was] found at the scene. According to
Captain Robert G. Carter, who witnessed its aftermath, the victims
were stripped, scalped, and mutilated. Some had been beheaded and
others had their brains scooped out. 'Their fingers, toes and private
parts had been cut off and stuck in their mouths,' wrote Carter, 'and
their bodies, now lying in several inches of water and swollen or
bloated beyond all chance of recognition, were filled full of arrows,
which made them resemble porcupines.' They had clearly been tortured,
too. 'Upon each exposed abdomen had been placed a mass of live
coals. ... One wretched man, Samuel Elliott, who, fighting hard to the
last, had evidently been wounded, was found chained between two wagon
wheels and, a fire having been made from the wagon pole, he had been
slowly roasted to death - 'burnt to a crisp.' "
Author: S.C. Gwynne
Title: Empire of the Summer Moon
Publisher: Scribner
Date: Copyright 2010 by S.C. Gwynne
Pages: 3-5
-------------------------------------------------------------------------------------------------In today's excerpt - practice. Rather than being the result of
genetics or inherent genius, truly outstanding skill in any domain is
rarely achieved with less than ten thousand hours of practice over ten
years' time:
"For those on their way to greatness [in intellectual or physical
endeavors], several themes regarding practice consistently come to
light:
1. Practice changes your body. Researchers have recorded a
constellation of physical changes (occurring in direct response to
practice) in the muscles, nerves, hearts, lungs, and brains of those
showing profound increases in skill level in any domain.
2. Skills are specific. Individuals becoming great at one particular
skill do not serendipitously become great at other skills. Chess
champions can remember hundreds of intricate chess positions in
sequence but can have a perfectly ordinary memory for everything else.
Physical and intellectual changes are ultraspecific responses to
particular skill requirements.
3. The brain drives the brawn. Even among athletes, changes in the
brain are arguably the most profound, with a vast increase in precise
task knowledge, a shift from conscious analysis to intuitive thinking
(saving time and energy), and elaborate self-monitoring mechanisms
that allow for constant adjustments in real time.
4. Practice style is crucial. Ordinary practice, where your current
skill level is simply being reinforced, is not enough to get better.
It takes a special kind of practice to force your mind and body into
the kind of change necessary to improve.
5. Short-term intensity cannot replace long-term commitment. Many
crucial changes take place over long periods of time. Physiologically,
it's impossible to become great overnight.
"Across the board, these last two variables - practice style and
practice time - emerged as universal and critical. From Scrabble
players to dart players to soccer players to violin players, it was
observed that the uppermost achievers not only spent significantly
more time in solitary study and drills, but also exhibited a
consistent (and persistent) style of preparation that K. Anders
Ericsson came to call 'deliberate practice.' First introduced in a
1993 Psychological Review article, the notion of deliberate practice
went far beyond the simple idea of hard work. It conveyed a method of
continual skill improvement. 'Deliberate practice is a very special
form of activity that differs from mere experience and mindless
drill,' explains Ericsson. 'Unlike playful engagement with peers,
deliberate practice is not inherently enjoyable. It ... does not
involve a mere execution or repetition of already attained skills but
repeated attempts to reach beyond one's current level which is
associated with frequent failures.' ...
"In other words, it is practice that doesn't take no for an answer;
practice that perseveres; the type of practice where the individual
keeps raising the bar of what he or she considers success. ...
"[Take] Eleanor Maguire's 1999 brain scans of London cabbies, which
revealed greatly enlarged representation in the brain region that
controls spatial awareness. The same holds for any specific task being
honed; the relevant brain regions adapt accordingly. ...
"[This type of practice] requires a constant self-critique, a
pathological restlessness, a passion to aim consistently just beyond
one's capability so that daily disappointment and failure is actually
desired, and a never-ending resolve to dust oneself off and try again
and again and again. ...
"The physiology of this process also requires extraordinary amounts of
elapsed time - not just hours and hours of deliberate practice each
day, Ericsson found, but also thousands of hours over the course of
many years. Interestingly, a number of separate studies have turned up
the same common number, concluding that truly outstanding skill in any
domain is rarely achieved in less than ten thousand hours of practice
over ten years' time (which comes to an average of three hours per
day). From sublime pianists to unusually profound physicists,
researchers have been very hard-pressed to find any examples of truly
extraordinary performers in any field who reached the top of their
game before that ten-thousand-hour mark."
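[A quick check of that per-day arithmetic - a minimal Python sketch; the ten-thousand-hour and ten-year figures come from the excerpt, and the 365-day year is an assumption - ed.:
hours, years, days_per_year = 10_000, 10, 365  # 365-day year is an assumption
hours_per_day = hours / (years * days_per_year)
print(f"{hours_per_day:.2f} hours per day")    # prints 2.74, i.e. roughly three
]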
Author: David Shenk
Title: The Genius in All of Us
Publisher: Doubleday
Date: Copyright 2010 by David Shenk
Pages: 53-57
-------------------------------------------------------------------------------------------------In today's excerpt - the curse of abundant oil resources in developing
countries - in this example, Venezuela. Developing countries with oil
grow only one-fourth as fast as those without, and are far more likely
to be militarized and devolve into civil war. In fact, oil- and
mineral-exporting countries have a 23 percent likelihood of civil war
within five years, compared to less than 1 percent for nondependent
countries:
"[With its oil wealth], Venezuela began to import more and more and
produce less, a typical symptom of Dutch disease, where resource-rich
countries see other parts of their economies wither. (Venezuela
actually had Dutch disease before the Dutch, but that term wouldn't be
invented until the natural gas boom in the Netherlands in the 1960s
torpedoed the country's economy. The condition should be called the
Caracas cramp.)
"[After the discovery of oil in Venezuela in 1921], nobody paid taxes.
If you're an oil state, it's far more efficient to ask oil buyers for
more money than to collect taxes from your population, which requires
a vast network of tax collectors, a bureaucracy, laws that are fair,
and a justice system to administer them. Collecting oil money, by
contrast, requires a small cadre of intellectuals to set policy and
diplomats to make it happen. ... The political, economic, and
psychological ramifications of this ... are profound.
" 'Systematically the government went after oil money rather than
raising taxes,' says economist Francisco Monaldi. 'There is no
taxation and therefore no representation here. The state here is
extremely autonomous.' Whether it's a dictatorship, a democracy, or
something in between, the state's only patron is the oil industry, and
all of its attention is focused outward. What's more, the state owes
nothing more than promises to the people of Venezuela, because they
have so little leverage on the state's income.
"When a state develops the ability to collect taxes, the bureaucracy
and mechanisms it creates are expensive. They perpetuate their
existence by diligently collecting as much money as possible and
encouraging the growth of a private economy to collect taxes from. A
strong private economy, so the thinking goes, creates a strong civil
society, fostering other centers of power that keep the state in
check. Like other intellectuals I talk with in other oil states,
Monaldi finds taxes more interesting and more useful than abstract
ideas about democracy and ballot boxes. Taxes aren't democracy, but
they seem to connect taxpayers and government in a way that has
democratizing effects. Studies by Michael L. Ross at UCLA found that
taxes alone don't foster accountability, but the relationship of taxes
to government services creates a struggle for value between the state
and citizens, which is some kind of accountability. ...
"Abdoulaye Djonouma, president of Chad's Chamber of Commerce, says oil
brought about economic and agricultural collapse in Nigeria and Gabon.
For Chad, which has fewer resources, he fears worse: militarization.
He ticks off all the former French colonies that have become
militarized. Virtually all. (One study found that oil-exporting
countries spend between two and ten times more on their militaries
than other developing countries.) ...
"At Stanford, Terry Lynn Karl's analysis of Venezuela's economy during
the 1970s and '80s shows that countries whose economy is dominated by
oil exports tend to experience shrinking standards of living - something that Chad can hardly afford. Oil has opportunity costs: A
study by Jeffrey Sachs and Andrew Warner showed that of ninety-seven
developing countries, those without oil grew four times as much as
those with oil. At UCLA, Michael L. Ross did regression studies
showing that governments that export oil tend to become less
democratic over time. At Oxford, Paul Collier's regression studies
show that oil- and mineral-exporting countries have a 23 percent
likelihood of civil war within five years, compared to less than 1
percent for nondependent countries."
Author: Lisa Margonelli
Title: Oil on the Brain
Publisher: Nan A. Talese/Doubleday
Date: Copyright 2007 by Lisa Margonelli
Pages: 146-147, 174-176
-------------------------------------------------------------------------------------------------In today's excerpt - bacteria:
"It's probably not a good idea to take too personal an interest in
your microbes. Louis Pasteur, the great French chemist and
bacteriologist, became so preoccupied with them that he took to
peering critically at every dish placed before him with a magnifying
glass, a habit that presumably did not win him many repeat invitations
to dinner.
"In fact, there is no point in trying to hide from your bacteria, for
they are on and around you always, in numbers you can't conceive. If
you are in good health and averagely diligent about hygiene, you will
have a herd of about one trillion bacteria grazing on your fleshy plains
- about a hundred thousand of them on every square centimeter of skin.
They are there to dine off the ten billion or so flakes of skin you
shed every day, plus all the tasty oils and fortifying minerals that
seep out from every pore and fissure. You are for them the ultimate
food court, with the convenience of warmth and constant mobility
thrown in. By way of thanks, they give you B.O.
"And those are just the bacteria that inhabit your skin. There are
trillions more tucked away in your gut and nasal passages, clinging to
your hair and eyelashes, swimming over the surface of your eyes,
drilling through the enamel of your teeth. Your digestive system alone
is host to more than a hundred trillion microbes, of at least four
hundred types. Some deal with sugars, some with starches, some attack
other bacteria. A surprising number, like the ubiquitous intestinal
spirochetes, have no detectable function at all. They just seem to
like to be with you. Every human body consists of about 10 quadrillion
cells, but about 100 quadrillion bacterial cells. They are, in short,
a big part of us. From the bacteria's point of view, of course, we are
a rather small part of them.
"Because we humans are big and clever enough to produce and utilize
antibiotics and disinfectants, it is easy to convince ourselves that
we have banished bacteria to the fringes of existence. Don't you
believe it. Bacteria may not build cities or have interesting social
lives, but they will be here when the Sun explodes. This is their
planet, and we are on it only because they allow us to be.
"Bacteria, never forget, got along for billions of years without us.
We couldn't survive a day without them. ... And they are amazingly
prolific. The more frantic among them can yield a new generation in
less than ten minutes; Clostridium perfringens, the disagreeable
little organism that causes gangrene, can reproduce in nine minutes.
At such a rate, a single bacterium could theoretically produce more
offspring in two days than there are protons in the universe. 'Given
an adequate supply of nutrients, a single bacterial cell can generate
280,000 billion individuals in a single day,' according to the Belgian
biochemist and Nobel laureate Christian de Duve. In the same period, a
human cell can just about manage a single division."
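[The doubling arithmetic behind those figures can be checked directly - a minimal Python sketch; the ten-minute generation time and two-day window come from the excerpt, while the 30-minute rate behind de Duve's one-day figure and the ~10^80 proton estimate are assumptions - ed.:
# Unconstrained binary fission: after n doublings, one cell has become 2**n cells.
gens_one_day = 24 * 60 // 30          # 48 generations at an assumed 30 minutes each
print(f"{2 ** gens_one_day:.3e}")     # 2.815e+14, i.e. roughly 280,000 billion
gens_two_days = 2 * 24 * 60 // 10     # 288 generations at the quoted ten minutes
print(2 ** gens_two_days > 10 ** 80)  # True: ~5e86 offspring exceeds the proton estimate
]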
Author: Bill Bryson
Title: A Short History of Nearly Everything
Publisher: Broadway
Date: Copyright 2003 by Bill Bryson
Pages: 302-304
-------------------------------------------------------------------------------------------------In today's excerpt - Anglo-centric American historians have typically
featured Jamestown (founded in 1607) and Plymouth (founded in 1620) to
tell the founding story of America, at the expense of the Dutch colony
centered in Manhattan (founded in 1614 as Fort Amsterdam and later
designated New Amsterdam), which was more economically and culturally
dominant in the earliest years of American history, and which is
arguably more representative of America today:
"We are used to thinking of American beginnings as involving thirteen
English colonies - to thinking of American history as an English root
onto which, over time, the cultures of many other nations were grafted
to create a new species of society that has become a multiethnic model
for progressive societies around the world. But that isn't true. To
talk of the thirteen original English colonies is to ignore another
European colony, the one centered on Manhattan, which predated New
York and whose history was all but erased when the English took it
over (in 1664).
"The settlement in question occupied the area between the newly
forming English territories of Virginia and New England. It extended
roughly from present-day Albany, New York, in the north to Delaware
Bay in the south, comprising all or parts of what became New York, New
Jersey, Connecticut, Pennsylvania, and Delaware. It was founded by the
Dutch, who called it New Netherland, but half of its residents were
from elsewhere. Its capital was a tiny collection of rough buildings
perched on the edge of a limitless wilderness, but its muddy lanes and
waterfront were prowled by a Babel of peoples - Norwegians, Germans,
Italians, Jews, Africans (slaves and free), Walloons, Bohemians,
Munsees, Montauks, Mohawks, and many others - all living on the rim of
empire, struggling to find a way of being together, searching for a
balance between chaos and order, liberty and oppression. Pirates,
prostitutes, smugglers, and business sharks held sway in it. It was
Manhattan, in other words, right from the start: a place unlike any
other, either in the North American colonies or anywhere else.
"Because of its geography, its population, and the fact that it was
under the control of the Dutch (even then its parent city, Amsterdam,
was the most liberal in Europe), this island city would become the
first multiethnic, upwardly mobile society on America's shores, a
prototype of the kind of society that would be duplicated throughout
the country and around the world. ... If what made America great was
its ingenious openness to different cultures, then the small triangle
of land at the southern tip of Manhattan Island is the New World
birthplace of that idea, the spot where it first took shape. Many
people - whether they live in the heartland or on Fifth Avenue - like
to think of New York City as so wild and extreme in its cultural
fusion that it's an anomaly in the United States, almost a foreign
entity. This book offers an alternative view: that beneath the level
of myth and politics and high ideals, down where real people live and
interact, Manhattan is where America began.
"The original European colony centered on Manhattan came to an end
when England took it over in 1664, renaming it New York after James,
the Duke of York, brother of King Charles II, and folding it into its
other American colonies. As far as the earliest American historians
were concerned, that date marked the true beginning of the history of
the region. The Dutch-led colony was almost immediately considered
inconsequential. When the time came to memorialize national origins,
the English Pilgrims and Puritans of New England provided a better
model. The Pilgrims' story was simpler, less messy, and had fewer
pirates and prostitutes to explain away. It was easy enough to
overlook the fact that the Puritans' flight to American shores to
escape religious persecution led them, once established, to institute
a brutally intolerant regime, a grim theocratic monoculture about as
far removed as one can imagine from what the country was to become."
Author: Russell Shorto
Title: The Island at the Center of the World
Publisher: Doubleday
Date: Copyright 2004 by Russell Shorto
Pages: 2-3
-------------------------------------------------------------------------------------------------In today's excerpt - the 1922 destruction of Smyrna, a beautiful city
located on the Aegean coast of what is now Turkey with twice the Greek
population of Athens itself. In a century of global ethnic cleansing,
the razing of Smyrna was on a scale that the world had never before
seen - and was a harbinger of much that came after. Perhaps the most
cosmopolitan and ethnically tolerant city in the world in the early
twentieth century, it fell victim to the nascent Turkish nationalist
movement after misguided foreign policy moves - some say the blunders
of British Prime Minister David Lloyd George - inflamed the centuries-old enmity between Turkey and Greece. Essentially all of its 700,000
inhabitants were killed or captured, or fled as refugees before the
Turkish National Army:
"The city [of Smyrna] was one in which fig-laden camels nudged their
way past the latest Newton Bennett motor car; in which the strange new
vogue of the cinema was embraced as early as 1908. There were
seventeen companies dealing exclusively in imported Parisian luxuries.
And if [a person] cared to read a daily newspaper, he had quite a
choice: eleven Greek, seven Turkish, five Armenian, four French and
five Hebrew, not to mention the ones shipped in from every capital
city in Europe. ...
"Amidst the grandeur there was intense human activity. Hawkers and
street traders peddled their wares along the mile-long quayside. Water
sellers jangled their brass bowls; hodjas - Muslim holy men - mumbled
prayers in the hope of earning a copper or two. And impecunious legal
clerks, often Italian, would proffer language lessons at knock-down
prices. 'You saw all sorts . . .' recalled the French journalist,
Gaston Deschamps. 'Swiss hoteliers, German traders, Austrian tailors,
English mill owners, Dutch fig merchants, Italian brokers, Hungarian
bureaucrats, Armenian agents and Greek bankers.'
"The waterfront was lined with lively bars, brasseries and shaded cafe
gardens, each of which tempted the palate with a series of enticing
scents. The odour of roasted cinnamon would herald an Armenian
patisserie; apple smoke spilled forth from hookahs in the Turkish
cafes. Coffee and olives, crushed mint and armagnac: each smell was
distinctive and revealed the presence of more than three dozen
culinary traditions. Caucasian pastries, boeuf a la mode, Greek game
pies and Yorkshire pudding could all be found in the quayside
restaurants of Smyrna. ...
"What happened over the two weeks [following September 9, 1922] must
surely rank as one of the most compelling human dramas of the
twentieth century. Innocent civilians - men, women and children from
scores of different nationalities - were caught in a humanitarian
disaster on a scale that the world had never before seen. The entire
population of the city became the victim of a reckless foreign policy
that had gone hopelessly, disastrously wrong. ...
"The total death toll is hard to compute with any certainty. According
to Edward Hale Bierstadt - executive of the United States Emergency
Committee - approximately 100,000 people were killed and another
160,000 deported into the interior. 'It is a picture too large and too
fearful to be painted,' he wrote in his 1924 study of the disaster,
The Great Betrayal, although he did his best, interviewing numerous
eyewitnesses and collecting their testimonies. Other estimates were
more conservative, claiming that 190,000 souls were unaccounted for by
the end of September. It is unclear how many of these had been killed
and how many deported, although Greek sources suggest that at least
100,000 Christians were marched into the interior of the country. Most
of these were never seen again. ...
"The exodus from Asia Minor was on a [massive] scale and it was to
continue for many months. To [rescue worker] Esther Lovejoy's eyes, it
was 'the greatest migration in the history of mankind.' The migration
was eventually enshrined in law in 1923, when [Turkish leader]
Mustafa Kemal put his signature to the Treaty of Lausanne. All of
Turkey's remaining 1.2 million Orthodox Christians were to be uprooted
from their ancestral homes and moved to Greece. And the 400,000
Muslims living in Greece were to be removed from their houses and
transported to Turkey. It was ethnic cleansing without parallel."
Author: Giles Milton
Title: Paradise Lost
Publisher: Sceptre
Date: Copyright 2008 by Giles Milton
Pages: 6-8, 372, 382
-------------------------------------------------------------------------------------------------In today's encore excerpt - Detroit and Motown:
"Motown would become the first successful black-owned record company
and eventually the nation's largest black-owned enterprise of any
sort.
"Without Detroit, there could have been no Motown. The company was an
outgrowth of the car industry, specifically of the black immigration
spurred by the industry's swift rise and seemingly endless demand for
labor. Industrial migration swelled the black population of most
northern cities, but none more quickly than Detroit. Between 1910 and
1930, the number of African Americans nearly tripled in Philadelphia
and New York, and quintupled in Chicago. But in Detroit it went up by
more than twenty times, from just under 6,000 to over 120,000.
"The main draw was Henry Ford's factory, which, in 1914, put out the
word that it was paying assembly-line workers five dollars a day. In
response, blacks moved from the South to Detroit at the rate of 1,000
per month; by 1922, the figure rose to 3,500 per month. If Ford's
lines were full, a strong worker could find a job at some other
factory. By 1925, there were three thousand major manufacturing plants
in Detroit; thirty-seven of them were building cars. At the start of
World War II, Detroit became the arsenal of America's military
machine, and the demand for workers soared further. A half million
more migrated to Detroit in the war's first two years.
"There was little assimilation of the black families that poured into
Detroit in the early waves of this migration; most of them were
crammed into dilapidated tenements on the city's south side. During
the even larger influx brought on by World War II, things got ugly.
Most of these new migrants were black, as before, but there were also
many Polish immigrants and white Appalachians, all competing for the
same jobs. A quarter of the city's 185 war plants refused to hire
blacks. Many car factories, even after all these years, would not mix
black and white workers on the same assembly line. In 1943, the NAACP
and the United Auto Workers staged an 'equal opportunity' rally, with
over ten thousand black men attending. When, as a result, three black
workers were promoted to skilled slots at a Packard plant, twenty-six
thousand white workers walked out.
"That summer, race riots erupted. President Franklin D. Roosevelt had
to send in six thousand federal troops to quell the violence, which
left thirty-four people dead (twenty-five of them black), hundreds
injured, and $2 million worth of property damage. The city planners
responded with the Detroit Plan, which demolished hundreds of
buildings and displaced thousands of black families, inspiring the
observation that 'urban renewal' was a euphemism for 'Negro removal.'
"Amid this de facto segregation, the African Americans in Detroit
created their own culture and institutions - and, because of the
decent-paying jobs at Ford and other factories, they had enough money
to sustain the effort."
Author: Fred Kaplan
Title: 1959
Publisher: Wiley
Date: Copyright 2009 by Fred Kaplan
Pages: 213-215
-------------------------------------------------------------------------------------------------In today's excerpt - a controversial suggestion regarding children and
lying:
"Researchers have found that the ability to tell fibs at the age of
two is a sign of a fast-developing brain and means they are more
likely to have successful lives. They found that the more plausible
the lie, the more quick-witted they will be in later years and the
better their ability to think on their feet. It also means that they
have developed 'executive function' - the ability to invent a
convincing lie by keeping the truth at the back of their mind.
" 'Parents should not be alarmed if their child tells a fib,' said Dr
Kang Lee, director of the Institute of Child Study at Toronto
University who carried out the research. 'Almost all children lie.
Those who have better cognitive development lie better because they
can cover up their tracks. They may make bankers in later life.' Lying
involves multiple brain processes, such as integrating sources of
information and manipulating the data to their advantage. It is linked
to the development of brain regions that allow 'executive functioning'
and use higher order thinking and reasoning.
"Dr Lee and his team tested 1,200 children aged two to 16 years old. A
majority of the volunteers told lies but it is the children with
better cognitive abilities who can tell the best lies. At the age of
two, 20 per cent of children will lie. This rises to 50 per cent by
three and almost 90 per cent at four. The most deceitful age, they
discovered, was 12, when almost every child tells lies. The tendency
starts to fall away by the age of 16, when it is 70 per cent. As
adulthood approaches, young people learn instead to use the less
harmful 'white lies' that everyone tells to avoid hurting people's
feelings.
"Researchers say there is no link between telling fibs in childhood
and any tendency to cheat in exams or to become a fraudster later in
life. Nor does strict parenting or a religious upbringing have any
impact. Dr Lee said that catching your children lying was not a bad
thing but should be exploited as a 'teachable moment'. 'You shouldn't
smack or scream at your child but you should talk about the importance
of honesty and the negativity of lying,' he told the Sunday Times.
'After the age of eight the opportunities are going to be very rare.'
"The research team invited younger children - one at a time - to sit
in a room with hidden cameras. A soft toy was placed behind them. When
the researcher briefly left the room, the children were told not to
look. In nine out of ten cases cameras caught them peeking. But when
asked if they had looked, they almost always said no. They tripped
themselves up when asked what they thought the toy might be. One
little girl asked to place her hand underneath a blanket that was over
the toy before she answered the question. After feeling the toy but
not seeing it, she said: 'It feels purple so it must be Barney.' Dr
Lee, who caught his son Nathan, three, looking at the toy, said: 'We
even had cameras trained on their knees because we thought their legs
would fidget if they were telling a lie, but it isn't true.'
"Older children were set a test paper but were told they must not look
at the answers printed on the back. Some of the questions were easy,
such as who lives in the White House. But the children who looked at
the back gave the printed answer 'Presidius Akeman' to the bogus
question 'Who discovered Tunisia?' When asked how they knew this, some
said they learned it in a history class."
Author: Richard Alleyne
Title: "Lying children will grow up to be successful citizens"
Publication: Telegraph.co.uk
Date: August 3, 2010
Pages: Home Section, Science
-------------------------------------------------------------------------------------------------In today's excerpt - is sex really necessary? After all, if the
evolutionary goal of each individual is to get as many genes into the
next generation as possible, wouldn't it have been simpler and easier
for the early organisms to have just replicated or made clones? The
likeliest explanation is that the genetic shuffling and resulting
variation that occurs from sex-based reproduction keeps enough
diversity in the species to minimize risk from bacteria, viruses and
other parasites. And since we're on the subject, after sex, why does
the male stay involved? From the standpoint of biology, males have
more or less nothing to do after copulation:
"Approximately two billion years ago a pair of single-celled organisms
made a terrible mistake - they had sex. We're still living with the
consequences. Sexual reproduction is the preferred method for an
overwhelming portion of the planet's species, and yet from the
standpoint of evolution it leaves much to be desired. Finding and
wooing a prospective mate takes time and energy that could be better
spent directly on one's offspring. And having sex is not necessarily
the best way for a species to attain Darwinian fitness. If the
evolutionary goal of each individual is to get as many genes into the
next generation as possible, it would be simpler and easier to just
make a clone.
"The truth is, nobody really knows why people - and other animals,
plants and fungi - prefer sex to, say, budding. Stephen C. Stearns, an
evolutionary biologist at Yale University, says scientists now
actively discuss more than 40 different theories on why sex is so
popular. Each has its shortcomings, but the current front-runner seems
to be the Red Queen hypothesis. It gets its name from a race in Lewis
Carroll's Through the Looking Glass. Just as Alice has to keep running
to stay in the same place, organisms have to keep changing their
genetic makeup to stay one step ahead of parasites. Sexual
reproduction allows them to shuffle their genetic deck with each
generation. That's not to say that sex is forever. When it comes to
reproduction, evolution is a two-way street. When resources and mates
are scarce, almost all types of animals have been known to revert to
reproducing asexually. ...
"What persuaded the male hominid to stick around after mating? From
the standpoint of biology, males have nothing to do after copulation.
'It's literally wham-bam thank-you-ma'am,' says Kermyt G. Anderson, an
anthropologist at the University of Oklahoma-Norman and co-author of
Fatherhood: Evolution and Human Paternal Behavior. What made the first
father stick around afterward? He was needed. At some point in the six
million years since the human lineage split from chimpanzees, babies
got to be too expensive, in terms of care, for a single mother to
raise. A chimp can feed itself at age four, but humans come out of the
womb essentially premature and remain dependent on their parents for
many years longer. Hunters in Amazonian tribes cannot survive on their
own until age 18, according to anthropologist Hillard Kaplan of the
University of New Mexico-Albuquerque. Their skills peak in their 30s -
not unlike income profiles of modern men and women.
"Oddly enough, bird families also tend to have stay-at-home dads. In
more than 90 percent of bird species, both parents share the care of
their young. This arrangement probably began, at least for most birds,
when males started staying around the nests to protect helpless babies
from predators. 'A flightless bird sitting on a nest is a very
vulnerable creature,' explains evolutionary biologist Richard O. Prum
of Yale University. Some birds, though, might have inherited their
particular form of fatherhood from dinosaurs. Male theropods, close
relatives of birds, seem to have done all the nest building, just as
male ostriches do today. That doesn't mean everything was on the up
and up. A female ostrich will lay an egg in the nest of her mate, but
usually a different male fertilizes it. 'There's a loose
relationship,' Prum says, 'between paternal care and paternity.' "
Author: Brendan Borrell
Title: "Origins"
Publisher: Scientific American
Date: August 2010
Pages: 47-49
-------------------------------------------------------------------------------------------------In today's excerpt - in the early to mid-1800s, it was not clear
whether the United States or Mexico would emerge as the dominant power
in North America. The U.S. had a sliver of land on the east coast and
a vulnerable, export-dependent economy. Control of New Orleans was key
to a new economic foundation:
"Until the Mexican-American War, it was not clear whether the dominant
power in North America would have its capital in Washington or Mexico
City. Mexico was the older society with a substantially larger
military. The United States, having been founded east of the
Appalachian Mountains, had been a weak and vulnerable country. At its
founding, it lacked strategic depth and adequate north-south
transportation routes. The ability of one colony to support another in
the event of war was limited. More important, the United States had
the most vulnerable of economies: It was heavily dependent on maritime
exports and lacked a navy able to protect its sea-lanes against more
powerful European powers like England and Spain. The War of 1812
showed the deep weakness of the United States. By contrast, Mexico had
greater strategic depth and less dependence on exports.
"The American solution to this strategic weakness was to expand the
United States west of the Appalachians, first into the Northwest
Territory ceded to the United States by the United Kingdom and then
into the Louisiana Purchase, which Thomas Jefferson ordered bought
from France. These two territories gave the United States both
strategic depth and a new economic foundation. The regions could
support agriculture that produced more than the farmers could consume.
Using the Ohio-Missouri-Mississippi river system, products could be
shipped south to New Orleans. New Orleans was the farthest point south
to which flat-bottomed barges from the north could go, and the
farthest inland that oceangoing ships could travel. New Orleans became
the single most strategic point in North America. Whoever controlled
it controlled the agricultural system developing between the
Appalachians and the Rockies. During the War of 1812, the British
tried to seize New Orleans, but forces led by Andrew Jackson defeated
them in a battle fought after the war itself was completed.
"Jackson understood the importance of New Orleans to the United
States. He also understood that the main threat to New Orleans came
from Mexico. The U.S.-Mexican border then stood on the Sabine River,
which divides today's Texas from Louisiana. It was about 200 miles
from that border to New Orleans and, at its narrowest point, a little
more than 100 miles from the Sabine to the Mississippi.
"Mexico therefore represented a fundamental threat to the United
States. In response, Jackson authorized a covert operation under Sam
Houston to foment an uprising among American settlers in the Mexican
department of Texas with the aim of pushing Mexico farther west. With
its larger army, a Mexican thrust to the Mississippi was not
impossible - nor something the Mexicans would necessarily avoid, as
the rising United States threatened Mexican national security.
"Mexico's strategic problem was the geography south of the Rio Grande
(known in Mexico as the Rio Bravo). This territory consisted of desert
and mountains. Settling this area with large populations was
impossible. Moving through it was difficult. As a result, Texas was
very lightly settled with Mexicans, prompting Mexico initially to
encourage Americans to settle there (in part, as a buffer against the
Comanches, ed.). Once a rising was fomented among the Americans, it
took time and enormous effort to send a Mexican army into Texas. When
it arrived, it was weary from the journey and short of supplies. The
insurgents were defeated at the Alamo and Goliad, but as the Mexicans
pushed their line east toward the Mississippi, they were defeated at
San Jacinto, near present-day Houston.
"The creation of an independent Texas served American interests,
relieving the threat to New Orleans and weakening Mexico. The final
blow was delivered under President James K. Polk during the Mexican-American War, which (after the Gadsden Purchase) resulted in the
modern U.S.-Mexican border. That war severely weakened both the
Mexican army and Mexico City, which spent roughly the rest of the
century stabilizing Mexico's original political order."
Author: George Friedman
Title: "Arizona, Borderlands and U.S.-Mexican Relations"
Publisher: Stratfor Global Intelligence
Date: August 3, 2010
-------------------------------------------------------------------------------------------------In today's encore excerpt - dopamine, pleasure, and too much pleasure:
"The importance of dopamine was discovered by accident. In 1954, James
Olds and Peter Milner, two neuroscientists at McGill University,
decided to implant an electrode deep into the center of a rat's brain.
The precise placement of the electrode was largely happenstance; at
the time, the geography of the mind remained a mystery. But Olds and
Milner got lucky. They inserted the needle right next to the nucleus
accumbens (NAcc), a part of the brain that generates pleasurable
feelings. Whenever you eat a piece of chocolate cake, or listen to a
favorite pop song, or watch your favorite team win the World Series,
it is your NAcc that helps you feel so happy.
"But Olds and Milner quickly discovered that too much pleasure can be
fatal. They placed the electrodes in several rodents' brains and then
ran a small current into each wire, making the NAccs continually
excited. The scientists noticed that the rodents lost interest in
everything. They stopped eating and drinking. All courtship behavior
ceased. The rats would just huddle in the corners of their cages,
transfixed by their bliss. Within days, all of the animals had
perished. They died of thirst.
"It took several decades of painstaking research, but neuroscientists
eventually discovered that the rats had been suffering from an excess
of dopamine. The stimulation of the NAcc triggered a massive release
of the neurotransmitter, which overwhelmed the rodents with ecstasy.
In humans, addictive drugs work the same way: a crack addict who has
just gotten a fix is no different than a rat in an electrical rapture.
The brains of both creatures have been blinded by pleasure. This,
then, became the dopaminergic cliche; it was the chemical explanation
for sex, drugs, and rock and roll.
"But happiness isn't the only feeling that dopamine produces.
Scientists now know that this neurotransmitter helps to regulate all
of our emotions, from the first stirrings of love to the most visceral
forms of disgust. It is the common neural currency of the mind, the
molecule that helps us decide among alternatives. By looking at how
dopamine works inside the brain, we can see why feelings are capable
of providing deep insights. While Plato disparaged emotions as
irrational and untrustworthy - the wild horses of the soul - they
actually reflect an enormous amount of invisible analysis."
Author: Jonah Lehrer
Title: How We Decide
Publisher: Houghton Mifflin Harcourt
Date: Copyright 2009 by Jonah Lehrer
Pages: Kindle Loc. 463-538.
-------------------------------------------------------------------------------------------------In today's excerpt - the work of morticians and the indignity of
death. We are biology. We are reminded of this at the beginning and
the end, at birth and at death. In between we do what we can to
forget:
"An eye cap is a simple ten-cent piece of plastic. It is slightly
larger than a contact lens, less flexible, and considerably less
comfortable. The plastic is repeatedly lanced through, so that small,
sharp spurs stick up from its surface. The eyelid will come down over
an eye cap, but, once closed, will not easily open back up. Eye caps
were invented by a mortician to help dead people keep their eyes
shut. ...
"Presiding at the embalming table today are Theo Martinez and Nicole
D'Ambrogio. ... Before the embalming begins, the exterior of the
corpse is cleaned and groomed. Nicole swabs the mouth and eyes with
disinfectant, then rinses both with a jet of water. Though I know the
man to be dead, I expect to see him flinch when the cotton swab hits
his eye, to cough and sputter when the water hits the back of his
throat. His stillness, his deadness, is surreal. The students move
purposefully. Nicole is looking in the man's mouth. Her hand rests
sweetly on his chest. Concerned, she calls Theo over to look. They
talk quietly and then he turns to me. 'There's material sitting in the
mouth,' he says. ... 'What happened is that whatever was in the
stomach found its way into the mouth.' Gases created by bacterial
decay build up and put pressure on the stomach, squeezing its contents
back up the esophagus and into the mouth. The situation appears not to
bother Theo and Nicole, though purge is a relatively infrequent
visitor to the embalming room. Theo explains that he is going to use
an aspirator [to remove it]. ...
"Next Theo coats the face with what I assume to be some sort of
disinfecting lotion, which looks a lot like shaving cream. The reason
that it looks a lot like shaving cream, it turns out, is that it is.
Theo slides a new blade into a razor. 'When you shave a decedent, it's
really different. ... The skin isn't able to heal, so you have to be
really careful about nicks. One shave per razor, and then you throw it
away.' ...
" 'Now we're going to set the features,' says Theo. He lifts one of
the man's eyelids and packs tufts of cotton underneath to fill out the
lid the way the man's eyeballs once did. ... On top of the cotton go a
pair of eye caps. 'People would find it disturbing to find the eyes
open,' explains Theo, and then he slides down the lids. ...
" 'Did you already go in the nose?' Nicole is holding aloft tiny
chrome scissors. Theo says no. She goes in, first to trim the hair,
then with the disinfectant. 'It gives the decedent some dignity,' she
says, plunging wadded cotton into and out of his left nostril.
"The last feature to be posed is the mouth, which will hang open if
not held shut. Theo is narrating for Nicole, who is using a curved
needle and heavy-duty string to suture the jaws together. 'The goal is
to reenter through the same hole and come in behind the teeth,' says
Theo. 'Now she's coming out one of the nostrils, across the septum,
and then she's going to reenter the mouth. There are a variety of ways
of closing the mouth,' he adds, and then he begins talking about
something called a needle injector. ...
"Drops of sweat bead the inside surface of Nicole's splash shield.
We've been here more than an hour. It's almost over. Theo asks, 'Will
we be suturing the anus?' He turns to me. 'Otherwise leakage can wick
into the funeral clothing and it's an awful mess.' I don't mind Theo's
matter-of-factness. Life contains these things: leakage and wickage
and discharge, pus and snot and slime and gleet. We are biology. We
are reminded of this at the beginning and the end, at birth and at
death. In between we do what we can to forget."
Author: Mary Roach
Title: Stiff
Publisher: Norton
Date: Copyright 2003 by Mary Roach
Pages: 72-84
-------------------------------------------------------------------------------------------------In today's excerpt - ancient floods:
"Most cultures ... have stories about a 'great flood' sent by angry
gods to destroy mankind in the distant past. In Western civilization
the most well known example is the story of Noah in the Bible. When
God got fed up with mankind's disobedience and wickedness, he chose
Noah and his family to perform a special mission: to build a huge boat
(an ark) to hold breeding pairs of every animal to repopulate the
world after the deluge.
"In the Sumerian version, the god Enki warns the king of Shuruppak,
Ziusudra, that the gods have decided to destroy the world with a
flood. Enki tells Ziusudra to build a large boat, where the king rides
out the week-long flood. He prays to the gods, makes sacrifices, and
is finally given immortality. According to Sumerian histories, the
first Sumerian dynasty was founded by King Etana of Kish after this
flood.
"Aboard ship take thou the seed of all living things.
That ship thou shalt build;
Her dimensions shall be to measure.
-Sumerian flood myth
"According to the ancient Greeks, the mythical demigod Prometheus
warned his son, Deucalion, that a great flood was coming, and
instructed him to build a giant waterproof chest to hold himself and
his wife, Pyrrha. The rest of humanity was drowned, but Deucalion and
Pyrrha rode out the nine days of rain and flooding in their chest. As
the flood subsided, they washed up on Mount Othrys, in northern
Greece. Zeus told Deucalion and his wife to throw stones over their
shoulders, which became men and women to repopulate the world.
"Finally, Hindu mythology tells of a priest named Manu, who served one
of India's first kings. Washing his hands in a river one day, Manu
saved a tiny fish, who begged him for help. The grateful fish warned
Manu that a giant flood was coming, so Manu built a ship on which he
brought the 'seeds of life' to plant again after the flood. The fish - actually a disguise for the chief god, Vishnu - then towed the vessel
to a mountaintop sticking up above the water. Sound familiar?
"Though it's impossible to know if these stories refer to the same
actual event, a couple of historical events are plausible candidates.
[One potential] explanation is the huge rise in sea levels that
occurred at the end of the last Ice Age, beginning about twelve
thousand years ago (10,000 BCE). The melting of the polar ice caps
raised sea levels almost four hundred feet around the world - which
must have made quite an impression."
Author: Erik Sass and Steve Wiegand with Will Pearson and Mangesh
Hattikudur
Title: The Mental Floss History of the World
Publisher: Harper
Date: Copyright 2008 by Mental Floss LLC
Pages: 21-22
-------------------------------------------------------------------------------------------------In today's excerpt - only the tiniest fraction of all the earth's
water is available to us as fresh liquid water, and control of rivers,
more than oceans or lakes, has been the key to the advance of
civilization:
"Despite Earth's superabundance of total water, nature endowed to
mankind a surprisingly minuscule amount of accessible fresh liquid
water that is indispensable to planetary life and human civilization.
Only 2.5 percent of Earth's water is fresh. But two-thirds of that is
locked away from man's use in ice caps and glaciers. All but a few
drops of the remaining one-third is also inaccessible, or
prohibitively expensive to extract, because it lies in rocky,
underground aquifers - in effect, isolated underground lakes - many a
half mile or more deep inside Earth's bowels. Such aquifers hold up to
an estimated 100 times more liquid freshwater than exists on the
surface. In all, less than three-tenths of 1 percent of total
freshwater is in liquid form on the surface. The remainder is in
permafrost and soil moisture, in the body of plants and animals, and
in the air as vapor.
"One of the most striking facts about the world's freshwater is that
the most widely accessed source by societies throughout history - rivers
and streams - hold just six-thousandths of 1 percent of the total. Some
societies have been built around the edges of lakes, which
cumulatively hold some 40 times more than rivers. Yet lake water has
been a far less useful direct resource to large civilizations because
its accessible perimeters are so much smaller than riversides.
Moreover, many are located in inhospitable frozen regions or mountain
highlands, and three-fourths are concentrated in just three lake
systems: Siberia's remote, deep Lake Baikal, North America's Great
Lakes, and East Africa's mountainous rift lakes, chiefly Tanganyika
and Nyasa. ...
"The minuscule, less than 1 percent total stock of accessible
freshwater, however, is not the actual amount available to mankind
since rivers, lakes, and shallow groundwater are constantly being
replenished through Earth's desalinating water cycle of evaporation
and precipitation - at any given moment in time, four-hundredths of 1
percent of Earth's water is in the process of being recycled through
the atmosphere. Most of the evaporated water comes from the oceans and
falls back into them as rain or snow. But a small, net positive amount
of desalted, cleansed ocean water precipitates over land to renew its
freshwater ecosystems before running off to the sea. Of that amount,
civilizations since the dawn of history have had practical access only
to a fraction, since two-thirds was rapidly lost in floods,
evaporation, and directly in soil absorption, while a lot of the rest
ran off in regions like the tropics or frozen lands too remote from
large populations to be captured and utilized. Indeed, the dispersion
of available freshwater on Earth is strikingly uneven. Globally, one-third of all streamflow occurs in Brazil, Russia, Canada, and the
United States, with a combined one-tenth of the world's population.
Semiarid lands with one-third of world population, by contrast, get
just 8 percent of renewable supply. Due to the extreme difficulty of
managing such a heavy liquid - weighing 8.34 pounds per gallon, or over
20 percent more than oil - societies' fates throughout history have
rested heavily on their capacity to increase supply and command over
their local water resources. ...
"Almost everywhere civilization has taken root, man-made
deforestation, water diversion, and irrigation schemes have produced
greater desiccation, soil erosion, and the ruination of Earth's
natural fertility to sustain plant life.
"How societies respond to the challenges presented by the changing
hydraulic conditions of its environment using the technological and
organizational tools of its times is, quite simply, one of the central
motive forces of history. ... Throughout history, wherever water
resources have been increased and made most manageable, navigable, and
potable, societies have generally been robust and long enduring. ...
In every age, whoever gained control of the world's main sea-lanes or
the watersheds of great rivers commanded the gateways of imperial
power."
Author: Steven Solomon
Title: Water
Publisher: Harper
Date: Copyright 2010 by Steven Solomon
Pages: 12-16
-------------------------------------------------------------------------------------------------In today's excerpt - the bizarre world of psychopaths, and the equally
bizarre world of psychopathy treatment. Some researchers have
estimated that as many as 500,000 psychopaths inhabit the U.S. prison
system, and there may be another 250,000 living freely - perhaps
not committing serious crimes, but still taking advantage of those
around them. Psychopathy is caused in large part by differences in
biology. Images of psychopaths' brains made by Kent A. Kiehl show a
pronounced thinning and underdevelopment of the paralimbic tissue, an
area which includes the orbitofrontal cortex, the amygdala, the
anterior cingulate cortex and the insula:
"Between the two of us [authors], we have interviewed hundreds of
prison inmates to assess their mental health. We are trained in
spotting psychopaths, but even so, coming face to face with the real
article can be electrifying, if also unsettling. One of the most
striking peculiarities of psychopaths is that they lack empathy; they
are able to shake off as mere tinsel the most universal social
obligations. They lie and manipulate yet feel no compunction or
regrets - in fact, they don't feel particularly deeply about anything
at all. ...
"Psychopaths are curiously oblivious to emotional cues. In 2002 James
Blair of the NIMH showed that they are not good at detecting emotions,
especially fear, in another person's voice. They also have trouble
identifying fearful facial expressions. ...
"Psychopaths often cover up their deficiencies with a ready and
engaging charm, so it can take time to realize what you are dealing
with. Kent A. Kiehl used to ask inexperienced graduate students to
interview a particularly appealing inmate before acquainting
themselves with his criminal history. These budding psychologists
would emerge quite certain that such a well-spoken, trustworthy person
must have been wrongly imprisoned. Until, that is, they read his file
- pimping, drug dealing, fraud, robbery, and on and on - and went back
to reinterview him, at which point he would say offhandedly, 'Oh,
yeah, I didn't want to tell you about all that stuff. That's the old
me.' ...
"A man we will call Brad was in prison for a particularly heinous
crime. In an interview he described how he had kidnapped a young
woman, tied her to a tree, [abused] her for two days, then slit her
throat and left her for dead. He told the story, then concluded with
an unforgettable non sequitur. 'Do you have a girl?' he asked.
'Because I think it's really important to practice the three C's - caring, communication and compassion. That's the secret to a good
relationship. I try to practice the three C's in all my
relationships.' He spoke without hesitation, clearly unaware how
bizarre this self-help platitude sounded after his awful confession. ...
"Thanks to technology that captures brain activity in real time,
experts are no longer limited to examining psychopaths' aberrant
behavior. We can investigate what is happening inside them as they
think, make decisions and react to the world around them. And what we
find is that far from being merely selfish, psychopaths suffer from a
serious biological defect. Their brains process information
differently from those of other people. It's as if they have a
learning disability that impairs emotional development. ...
"Kiehl has launched an ambitious multimillion-dollar project to gather
genetic information, brain images and case histories from 1,000
psychopaths and compile it all into a searchable database. ... Between
15 and 35 percent of U.S. prisoners are psychopaths. Psychopaths
offend earlier, more frequently and more violently than others, and
they are four to eight times more likely to commit new crimes on
release. In fact, there is a direct correlation between how high
people score on the 40-point screening test for psychopathy and how
likely they are to violate parole. Kiehl recently estimated that the
expense of prosecuting and incarcerating psychopaths, combined with
the costs of the havoc they wreak in others' lives, totals $250
billion to $400 billion a year. No other mental health problem of this
size is being so willfully ignored.
"Billions of research dollars have been spent on depression; probably
less than a million has been spent to find treatments for
psychopathy. ... There is room for optimism: a new treatment for
intractable juvenile offenders with psychopathic tendencies has had
tremendous success. Michael Caldwell, a psychologist at the Mendota
Juvenile Treatment Center in Madison, Wis., uses intensive one-on-one
therapy known as decompression aimed at ending the vicious cycle in
which punishment for bad behavior inspires more bad behavior, which is
in turn punished. Over time, the incarcerated youths in Caldwell's
program act out less frequently and become able to participate in
standard rehabilitation services. A group of more than 150 youths
treated by Caldwell were 50 percent less likely to engage in violent
crime afterward than a comparable group who were treated at regular
juvenile corrections facilities. The young people in the regular
system killed 16 people in the first four years after their release;
those in Caldwell's program killed no one."
Authors: Kent A. Kiehl and Joshua W. Buckholtz
Title: "Inside the Mind of a Psychopath"
Publisher: Scientific American Mind
Date: September/October 2010
Pages: 22-29
-------------------------------------------------------------------------------------------------In today's encore excerpt - the beloved children's book, French author
Jean de Brunhoff's "The Story of Babar," published in 1931 in the days
of French colonies and the global French Empire:
"[In the book], an elephant, lost in the city, does not trumpet with
rage but rides a department-store elevator up and down, until gently
discouraged by the elevator boy. A Haussmann-style city rises in the
middle of the barbarian jungle. Once seen, Babar the Frenchified
elephant is not forgotten. ...
"Every children's story that works at all begins with a simple
opposition of good and evil, of straightforward innocence and envious
corruption. ...[In this story], Babar's mother, with her little
elephant on her back, is murdered, with casual brutality, by a squat
white hunter. ... (Maurice Sendak, in a lovely appraisal of Babar,
recalls thinking that the act of violence that sets Babar off is not
sufficiently analyzed - that the trauma is left unhealed and even
untreated.) ...
"Babar, [some] interpreters have insisted, is an allegory of French
colonization, as seen by the complacent colonizers: the naked African
natives, represented by the 'good' elephants, are brought to the
imperial capital, acculturated, and then sent back to their homeland
on a civilizing mission. The elephants that have assimilated to the
ways of the metropolis dominate those which have not. The true
condition of the animals - to be naked, on all fours, in the jungle - is
made shameful to them, while to become an imitation human, dressed and
upright, is to be given the right to rule. The animals that resist -

the rhinoceroses - are defeated. The Europeanized elephants are, as in


the colonial mechanism of indirect rule, then made trustees of the
system, consuls for the colonial power. To be made French is to be
made human and to be made superior. ...
"Yet those who would [so interpret] 'Babar' miss the true subject of
the books. The de Brunhoffs' saga is not an unconscious expression of
the French colonial imagination; it is a self-conscious comedy about
the French colonial imagination. ... The gist of the classic early
books of the nineteen-thirties - 'The Story of Babar' and 'Babar the
King,' particularly - is explicit and intelligent: the lure of the
city, of civilization, of style and order and bourgeois living is
real, for elephants as for humans. The costs of those things are real,
too, in the perpetual care, the sobriety of effort, they demand. The
happy effect that Babar has on us, and our imaginations, comes from
this knowledge - from the child's strong sense that, while it is a
very good thing to be an elephant, still, the life of an elephant is
dangerous, wild, and painful. It is therefore a safer thing to be an
elephant in a house near a park. ...
"All children's books take as their subject disorder and order and
their proper relation, beginning in order and ending there, but with
disorder given its due. ... Disorder is the normal mess of life, what
rhinos like. Order is what elephants (that is, Frenchmen) achieve at a
cost and with effort. To stray from built order is to confront the man
with a gun. ... Fables for children work not by pointing to a moral
but by complicating the moral of a point. The child does not dutifully
take in the lesson that salvation lies in civilization, but, in good
Freudian fashion, takes in the lesson that the pleasures of
civilization come with discontent at its constraints: you ride the
elevator, dress up in the green suit, and go to live in Celesteville,
but an animal you remain - the dangerous humans and rhinoceroses are
there to remind you of that - and you delight in being so. There is
allure in escaping from the constraints that button you up and hold
you; there is also allure in the constraints and the buttons. We would
all love to be free, untrammeled elephants, but we long, too, for a
green suit."
Author: Adam Gopnik
Title: "Freeing the Elephants"
Publisher: The New Yorker
Date: September 22, 2008
Pages: 46-50
-------------------------------------------------------------------------------------------------In today's excerpt - Moscow, circa 1685, at the time of the completion
of the palace at Versailles in France and the inception of the Salem
witch trials in America:

"From a distance, Moscow struck one Western traveler as the most


'beautiful city in the world,' an urban feast topped by hundreds of
gold-crusted domes and a sea of glistening crosses that surmounted the
treetops. Unlike the stone and marble of its European counterparts,
Moscow was a city hewn from wood; even the streets themselves were
planked with timber, not trampled down or paved with stone. Also
unlike anything in the Western world was the somber medieval citadel
of Russian power, the Kremlin, which imbued the city with an exotic
mystery. ...
"With its massive red walls jutting from the bank of the Moscow River,
the Kremlin was not a single building but an entire walled city - Kreml literally means 'fortress' in Russian - ringed by two rivers and
a deep moat. Inside this mighty citadel rose gorgeous cathedrals
(three), an astonishing number of chapels (sixteen hundred) and
hundreds of houses, as well as government offices, law courts,
barracks, bakeries, laundries, stables, and a mighty whitewashed-brick
bell tower ... And Moscow had a spiritual dimension rivaled only by
Jerusalem and the Vatican: It was the 'Third Rome,' the center of
Orthodox faith. ...
"The bazaars of Moscow were frequented by Persians, Afghans, Kirghiz,
Indians, and Chinese, while traders and artisans peddled an eclectic
slice of the Asiatic world: silks, brass and copper goods, tooled
leather and bronze, and innumerable objects of hand-carved wood. The
city itself was peopled with tattered, itinerant holy men and bearded
priests, as well as ruddy peasants in cloth leggings and soldiers in
voluminous caftans. ... Russian customs were uncommonly coarse - basic
things like cutlery and toothpicks were unheard of; and drunkenness
was so rampant that on feast days, travelers were stunned to see naked
men, passed out, who had sold their clothing for drink. Dwarfs and
fools, increasingly out of fashion in the West, still amused the tsar
and his retainers. ...
"Muscovites were an intensely religious people, and most of the city,
rich and poor alike, fell under the church's spell. Few had such a
hold on the Russian mind or imagination as did the starets - the man
of God. But the true master who loomed over this ancient land was ultimately
the tsar, the very portrait of absolute monarchy. ... From infancy,
Russians were taught to regard him as a godlike creature ('Only God
and the tsar know,' went one ancient proverb). ... Russian noblemen
did not simply bow, they flattened themselves before the tsar,
touching the ground with their foreheads ('we humbly beseech you, we
your slaves ...')."
Author: Jay Winik
Title: The Great Upheaval
Publisher: Harper Collins
Date: Copyright 2007 by Jay Winik

Pages: 12-14
-------------------------------------------------------------------------------------------------In today's excerpt - Early Americans quickly began to view themselves
as exceptional, different and better than Europeans. Part of that
exceptionalism was equality, which brought with it openness to
strangers. Despite its later reputation for exclusivity, Freemasonry - which was born in the widespread emergence of a middle class, and grew
rapidly to fill the human need to belong in a country where high
mobility was breaking apart traditional social bonds - was an agent
for breaking down class barriers and aiding this new openness.
Freemasonry repudiated the monarchical hierarchy of family and
favoritism and helped create a new republican order that rested on
'real Worth and personal Merit':
"Intense local attachments were common to peasants and backward
peoples, but educated gentlemen were supposed to be at home anywhere
in the world. Indeed, to be free of local prejudices and parochial
ties was what defined a liberally educated person. One's humanity was
measured by one's ability to relate to strangers, and Americans prided
themselves on their hospitality and their treatment of strangers, thus
further contributing to the developing myth of their exceptionalism.
Indeed, as Crevecoeur pointed out, in America the concept of
'stranger' scarcely seemed to exist: 'A traveler in Europe becomes a
stranger as soon as he quits his own kingdom; but it is otherwise
here. We know, properly speaking, no strangers; this is every person's
country; the variety of our soils, situations, climates, governments,
and produce hath something which must please everyone.' 'In what part
of the globe,' asked Benjamin Rush, 'was the 'great family of mankind'
given as a toast before it was given in the republican states of
America?'
"The institution that many Americans believed best embodied these
cosmopolitan ideals of fraternity was Freemasonry. Not only did
Masonry create enduring national icons (like the pyramid and the all-seeing eye of Providence on the Great Seal of the United States), but
it brought people together in new ways and helped fulfill the
republican dream of reorganizing social relationships. It was a major
means by which thousands of Americans could think of themselves as
especially enlightened. ... Many of the Revolutionary leaders,
including Washington, Franklin, Samuel Adams, James Otis, Richard
Henry Lee, and Hamilton, were members of the fraternity.
"Freemasonry was a surrogate religion for enlightened men suspicious
of traditional Christianity. It offered ritual, mystery, and
communality without the enthusiasm and sectarian bigotry of organized
religion. But Masonry was not only an enlightened institution; with
the Revolution, it became a republican one as well. As George
Washington said, it was 'a lodge for the virtues.' The Masonic lodges
had always been places where men who differed in everyday affairs -
politically, socially, even religiously - could 'all meet amicably,
and converse sociably together.' There in the lodges, the Masons told
themselves, 'we discover no estrangement of behavior, nor alienation
of affection.' Masonry had always sought harmony in a society
increasingly diverse and fragmented. ...
"In the decades following the Revolution Masonry exploded in numbers,
fed by hosts of new recruits from middling levels of the society.
There were twenty-one lodges in Massachusetts by 1779; in the next
twenty years fifty new ones were created, reaching out to embrace even
small isolated communities on the frontiers of the state. Everywhere
the same expansion took place. Masonry transformed the social
landscape of the early Republic.
"Masonry began emphasizing its role in spreading republican virtue and
civilization. It was, declared some New York Masons in 1795, designed
to wipe 'away those narrow and contracted Prejudices which are born in
Darkness, and fostered in the Lap of ignorance.' Freemasonry
repudiated the monarchical hierarchy of family and favoritism and
created a new republican order that rested on 'real Worth and personal
Merit' and 'brotherly affection and sincerity.' At the same time,
Masonry offered some measure of familiarity and personal relationships
to a society that was experiencing greater mobility and increasing
numbers of immigrants. It created an 'artificial consanguinity,'
declared DeWitt Clinton of New York in 1793, that operated 'with as
much force and effect, as the natural relationship of blood.'
"Despite its later reputation for exclusivity, Freemasonry became a
way for American males of diverse origins and ranks to be brought
together in republican fraternity, including, at least in Boston, free
blacks."
Author: Gordon S. Wood
Title: Empire of Liberty
Publisher: Oxford
Date: Copyright 2009 by Oxford University Press, Inc.
Pages: 50-52
-------------------------------------------------------------------------------------------------In today's encore excerpt - Columbus's discovery of America brought
new foods, including tomatoes, potatoes, corn, squash, and sugar, that
transformed the European diet:
"Before Columbus, the diet of Europeans had remained basically
unchanged for tens of thousands of years, based mainly on oats,
barley, and wheat. Within a quarter century of his first voyage, the
European diet became richer, more varied, and more nutritious. As
Roger Schlesinger wrote in his book, In the Wake of Columbus: 'As far
as dietary habits are concerned, no other series of events in all
world history brought as much significant change as did [the discovery
of the Americas].' The list of foods that made their way into Europe
is extensive and includes maize, squash, pumpkin, avocado, papaya,
cassava, vanilla, tomatoes, potatoes, sweet potatoes (yams),
strawberries, and beans of almost every variety.
"The potato was one of the first American foods to be transported to
Europe. Valued by the conquistadores, it became a key item in the
diet of their sailors. The potato then spread to England and Scotland,
and to Ireland where it became the staple of the Irish diet.
"It was also the Spanish who discovered the tomato, first distributing
it throughout their Caribbean possessions and then bringing it to
Europe. In both Italy and Great Britain, the tomato was first thought
to be poisonous, and it was not until the 1700s that the fruit became
widely eaten. As was the case with sweet potatoes, which were regarded
by some Europeans as having aphrodisiac-like qualities, the tomato was
also viewed in some circles as having medicinal value. ... Actually,
some of these claims may not have been as farfetched as they seem,
since many Old World ailments were caused by the lack of fresh fruits
and vegetables. ...
"Tapioca, made from cassava root, eventually became
delicacy, as did a drink made from the cocoa plant.
Hernan Cortes and his men witnessed Aztecs drinking
and Central American natives had been consuming the
hundreds of years. ...

a European
By the time that
chocolatl, South
beverage for

"As diet transforming as all these newly introduced foods became,


sugar, perhaps, had the greatest impact of all. As ever-increasing
amounts of sugar were transported from New World plantations to
Europe, the types of foods that were eaten, and just as significantly,
the ways in which they were cooked, were changed forever. Before the
early 1500s, sugar was sold in European apothecary shops where,
because of its scarcity, only the rich could afford it. But as sugar-laden ships arrived in Old World ports, prices tumbled and sugar
became an important foodstuff for the masses. At the time, honey was
both expensive and in short supply, but even if that had not been the
case, most people found sugar to be a much more desirable sweetener.
As a result, tea and coffee drinking gained a popularity that would
never diminish.
"Even more important, the availability of sugar led to the
proliferation of confections and jams that soon graced tables
throughout Europe. ...
"Sugar's impact on the European diet went way beyond jams and

confections and the sweetening of tea, coffee, and other beverages.


Such leftover foods as rice and bread could now be given new life and
a whole new taste when sprinkled with sugar and reheated. Fruits and
vegetables could be inexpensively preserved when immersed in a sugary
syrup. Sugar's popularity also led to the introduction of a host of
new cooking utensils and accoutrements, including new types of
saucepans, pie plates, cookie molds, sugar pots, sugar spoons, and
tongs."
Author: Martin W. Sandler
Title: Atlantic Ocean
Publisher: Sterling
Date: Copyright 2008 by Martin W. Sandler
Pages: 92-100
-------------------------------------------------------------------------------------------------In today's excerpt - in Confucian China, women were little more than
slaves, a status that remained true through the turn of the twentieth
century:
"Nowhere were women treated with greater contempt than in a Confucian
state. Chinese ideographs that include the character for 'woman' mean:
evil, slave, anger, jealousy, avarice, hatred, suspicion, obstruction,
demon, witch, bewitching, fornication, and seduction. Confucius warned
gentlemen against being 'too familiar with the lower orders or with
women.' As the poet Fu Xuan wrote in the third century:
How sad it is to be a woman!
Nothing on earth is held so cheap.
Boys stand leaning at the door
Like gods fallen out of heaven.
"Marriage in China was less of a union between man and woman than a
contract of indentured servitude between a girl and her mother-in-law.
A wedding was arranged by parents in an effort to advance themselves
socially, politically, or financially. In traditional Chinese society
a girl married into her husband's family and gave up all contact with
her own parents. A bride was subservient to everyone in the new
household but especially to her husband's mother, for whom she toiled
without rest. Wife and mother-in-law were jealous rivals for the
affection of the husband/son. Publicly a husband and wife were
indifferent toward each other, never openly acknowledging the
existence of the other. In private the wife would have to struggle to
win her husband's respect, and only through her grown sons did she
have any real hope of security. No wonder she then exhibited little
affection toward her son's bride, and the cycle repeated itself.
"A concubine was a serious and usually permanent member of a

household. She was brought in to bear a son, after the first wife had
failed, then remained as an assistant wife, with all the
responsibilities and few of the privileges. Once the man lost
interest, she was just another servant. In most cases she was
purchased from her parents, so in fact she was a slave, though she
could not be discarded without arriving at a settlement with her
family."
Author: Sterling Seagrave
Title: Dragon Lady
Publisher: Vintage
Date: Copyright 1992 by Scribbler's Ltd
Pages: 29-30
-------------------------------------------------------------------------------------------------In today's encore excerpt - beginning in 1840, the largest human
migration in history brought over 30 million immigrants to America.
This period of massive immigration brought with it unprecedented
prosperity, and by the time it was interrupted in 1914 by World War I,
America stood as the most prosperous nation on earth:
"The reasons for the largest human migration in history had been long
in coming.
"One of the main factors was the enormous increase in the European
population that took place in less than a century - from 140 million
people in 1750 to 250 million in the 1840s. As the numbers increased,
peasant families were constricted into increasingly smaller plots of
land by powerful landlords who were anxious to reap profits by
creating larger farms to feed the growing cities. Soon alarming
numbers of peasants found themselves unable to subsist. They were
joined in their plight by legions of artisans whose special skills - passed on from father to son and mother to daughter for generations - had earned them both a livelihood and a respected place in society.
Now, however, scores of the goods they had so expertly handcrafted
were being produced by the machinery of the Industrial Revolution.
Thousands of these artisans found themselves out of work, forced to
move to the cities and work in factories, where low wages, drudgery,
and the loss of their personal independence resulted in a sadly
diminished quality of life.
"Devastating as they were, none of these problems compared to the
series of famines that, beginning in the 1840s, descended upon various
European nations. Nowhere was the situation more desperate than in
Ireland where, in 1845, a fungus destroyed the potato crop, the single
food staple upon which the poorer classes of the country depended for
survival. By the time the disease began to abate in 1849, more than a
million Irish men, women, and children had starved to death. ...

"It was not only in Ireland that famine struck. ... A quote from the
archives of the Iowa State Historical Society by a Polish youngster
put it more personally: 'We lived through a famine,' he explained,
'[so] we came to America. Mother said she wanted to see a loaf of
bread on the table and then she was ready to die.'
"There were other important reasons for the mass exodus as well.
Despite the notions of liberty and equality that both the American and
French revolutions had spawned, oppressive governments in countries
such as Russia, Germany and Turkey had denied freedom of religion,
freedom of speech, or other rights and had brutally put down
rebellions aimed at bringing about reform. In Russia and Poland,
massacres called pogroms erupted. Designed to eliminate minority
groups who lived within their borders - particularly Jews - some of
these pogroms were carried out by the governments of these two
countries; others were unofficially endorsed by them. ...
"They came in waves; ... more than five million of them arrived
between 1840 and 1880, an influx slightly greater than the entire
population of the United States in 1790. Most emigrated from northern
and western Europe - Scandinavians who settled in the American
Midwest; Germans who established enclaves in New York, Baltimore,
Cincinnati, St. Louis, and Milwaukee; and British and Irish who poured
into Boston, New York, and other northeastern communities.
"Beginning in 1880 a great shift occurred when an even larger flood of
newcomers came from eastern, central, and southern Europe - Russians,
Poles, Austro-Hungarians, Greeks, Ukrainians, and Italians. In 1880
less than twenty percent of the 250,000 Jews living in New York had
come from Eastern Europe. In the next forty years the number grew to
1,400,000. That was one-fourth of the city's entire population. In the
first quarter of the 1900s, more than two million Italians arrived. By
the time the human tide was interrupted in 1914 by World War I, some
thirty-three million people had fled their native lands, risking all
to start life anew across the ocean."
Author: Martin W. Sandler,
Title: Atlantic Ocean
Publisher: Sterling
Date: Copyright 2008 by Martin W. Sandler
Pages: 356-364
-------------------------------------------------------------------------------------------------In today's excerpt - researchers have identified better ways for
students to study, yet they often contradict received wisdom and have
been ignored by the education system:

" 'We have known these principles [for improved study] for some time,
and it's intriguing that schools don't pick them up, or that people
don't learn them by trial and error,' said Robert A. Bjork, a
psychologist at the University of California, Los Angeles. 'Instead,
we walk around with all sorts of unexamined beliefs about what works
that are mistaken.'
"Take the notion that children have specific learning styles, that
some are 'visual learners' and others are auditory; some are "left-brain" students, others "right-brain." In a recent review of the
relevant research, published in the journal Psychological Science in
the Public Interest, a team of psychologists found almost zero support
for such ideas. ...
"Psychologists have discovered that some of the most hallowed advice
on study habits is flat wrong. For instance, many study skills courses
insist that students find a specific place, a study room or a quiet
corner of the library, to take their work. The research finds just the
opposite. In one classic 1978 experiment, psychologists found that
college students who studied a list of 40 vocabulary words in two
different rooms - one windowless and cluttered, the other modern, with
a view on a courtyard - did far better on a test than students who
studied the words twice, in the same room. Later studies have
confirmed the finding, for a variety of topics. ...
"Varying the type of material studied in a single sitting alternating, for example, among vocabulary, reading and speaking in a
new language - seems to leave a deeper impression on the brain than
does concentrating on just one skill at a time. Musicians have known
this for years, and their practice sessions often include a mix of
scales, musical pieces and rhythmic work. Many athletes, too,
routinely mix their workouts with strength, speed and skill
drills. ...
"In a study recently posted online by the journal Applied Cognitive
Psychology, Doug Rohrer and Kelli Taylor of the University of South
Florida taught a group of fourth graders four equations, each to
calculate a different dimension of a prism. Half of the children
learned by studying repeated examples of one equation, say,
calculating the number of prism faces when given the number of sides
at the base, then moving on to the next type of calculation, studying
repeated examples of that. The other half studied mixed problem sets,
which included examples of all four types of calculations grouped
together. Both groups solved sample problems along the way, as they
studied. A day later, the researchers gave all of the students a test
on the material, presenting new problems of the same type. The
children who had studied mixed sets did twice as well as the others,
outscoring them 77 percent to 38 percent. The researchers have found
the same in experiments involving adults and younger children.

"This finding undermines the common assumption that intensive


immersion is the best way to really master a particular genre, or type
of creative work, said Nate Kornell, a psychologist at Williams
College and the lead author of the study. 'What seems to be happening
in this case is that the brain is picking up deeper patterns when
seeing assortments of paintings; it's picking up what's similar and
what's different about them,' often subconsciously.
"Cognitive scientists do not deny that honest-to-goodness cramming can
lead to a better grade on a given exam. But hurriedly jam-packing a
brain is akin to speed-packing a cheap suitcase, as most students
quickly learn - it holds its new load for a while, then most
everything falls out. ... [In contrast] an hour of study tonight, an
hour on the weekend, another session a week from now - so-called
spacing - improves later recall without requiring students to put in
more overall study effort or pay more attention, dozens of studies
have found."
Author: Benedict Carey
Title: "Forget What You Know About Good Study Habits"
Publisher: The New York Times
Date: September 6, 2010
-------------------------------------------------------------------------------------------------In today's excerpt - Cahokia, the largest settlement in North America
until Philadelphia, which stood in Illinois near modern-day St. Louis,
and in which stood Monks Mound, a structure larger than the Great
Pyramid of Giza. Part of a four-thousand-year-old Native American
tradition of mound-building, it was built by people who came to be
known as Mississippians:
"There was one remarkable community north of the Rio Grande, a city
that by 1150 CE had become the largest urban center north of Mexico, a
record that would stand until Philadelphia surpassed it in the late
1700s. It is difficult to imagine a city covering more than six square
miles flourishing in the Mississippi Valley some 350 years before
Columbus reached the New World, a city, which at its zenith in about
1150 contained a population estimated by some experts to have been as
high as thirty thousand, more inhabitants than any contemporary
European city, including London. Its people constructed enormous
pyramid-shaped earthen mounds (the largest, Monks Mound, has a base
larger than that of the Great Pyramid of Giza in Egypt), designed and
built solar observatories, and carried out a far-flung trade. Its name
was Cahokia. ...
"The city of Cahokia was physically dominated by the Monks Mound,
named for French Trappist monks who lived in a monastery nearby in the
early 1800s and gardened on the mound. Cahokia was built in a dozen or
more phases beginning in about 900 CE, a time described by
archeologists as the 'Big Bang,' a period in which, for still unknown
reasons, thousands of Native Americans from surrounding regions poured
into Cahokia and the city experienced as much as a tenfold increase in
its population.
"Covering an area of fourteen acres, making it larger than the Great
Pyramid of Giza, the clay slab that serves as the base of Monks Mound
is about 954 feet long and 774 feet wide. The enormous structure
stretches 100 feet from its base to its top. ...
"Most archaeologists who have worked the site are in agreement that
the temple or palace atop Monks Mound was the focal point from which
Cahokia's rulers carried out various political and religious rituals,
including prayers for favorable weather to nurture the acres of maize
that stretched out from the city as far as the eye could see.
Excavations have also revealed that at some point in the mound's
various phases of construction a low platform was extended out from
one of its sides, creating a stage from which priests could perform
ceremonies in full view of the public.
"What is perhaps most intriguing of all is the question of how Monks
Mound was constructed. Archaeologists calculate that the structure
contains twenty-two million cubic feet of earth, which was dug with
stone tools and carried out in baskets on people's backs to the ever-growing mound.
"Sally A. Kitt Chappell provided a graphic calculation of the enormous
effort that went into building Monks Mound: This pharaonic enterprise
required carrying 14,666,666 baskets, each filled with 1.5 cubic feet
of dirt weighing about fifty-five pounds each, for a total of 22
million cubic feet. For comparison, an average pickup truck holds 96
cubic feet, so it would take 229,166 pickup loads to bring the dirt to
the site. If thirty people each carried eight baskets of earth a day,
the job would take 167 years. ...
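[Editor's note: Chappell's arithmetic checks out. The short Python
sketch below is ours, not Sandler's or Chappell's; it assumes only the
figures quoted above - 1.5 cubic feet per basket, 96 cubic feet per
pickup truck, and thirty carriers at eight baskets a day - and
reproduces her basket, truckload, and year counts:

    # Back-of-the-envelope check of Chappell's calculation (illustrative).
    TOTAL_CUBIC_FEET = 22_000_000               # earth in Monks Mound

    baskets = TOTAL_CUBIC_FEET / 1.5            # ~14,666,667 baskets
    pickup_loads = TOTAL_CUBIC_FEET / 96        # ~229,167 pickup loads
    baskets_per_day = 30 * 8                    # 240 baskets carried per day
    years = baskets / baskets_per_day / 365     # ~167 years

    print(f"{baskets:,.0f} baskets, {pickup_loads:,.0f} pickup loads, "
          f"{years:.0f} years")
]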
"[The complexity implies] the presence of individuals with specialized
knowledge of soils and earthen construction. Despite the instability
of the materials they had on hand and the fact that they built their
enormous structure on a floodplain, these ancient engineers achieved
nothing less than the largest prehistoric construction in the
Americas, and there it has stood for more than one thousand years."
Author: Martin W. Sandler
Title: Lost to Time
Publisher: Sterling
Date: Copyright 2010 by Martin W. Sandler
Pages: 20-27
-------------------------------------------------------------------------------------------------In today's encore excerpt - William the Conqueror, Duke of Normandy,
crossed the English Channel from France in 1066 and conquered the
English, thus starting what historians consider the true beginnings of
modern Britain. William, who had known blood, fear and treachery since
his earliest childhood, built a fighting force considered the fiercest
in Europe, and his horse-mounted army decisively bested the horseless
soldiers of England's King Harold. After the conquest, he tried to
include English lords as partners in his new regime, but their
treachery led to a need to use savage methods to break the entire
kingdom:
"[The child] William was inured to the spectacle of slaughter: two of
his guardians were hacked down in quick succession; his tutor as well;
and a steward, on one particularly alarming occasion, murdered in the
very room in which the young duke lay asleep. Yet even as blood from
the victim's slit throat spilled across the flagstones, William could
feel relief as well as horror: for he at least had been spared. ...
"[Years later], in 1066, there could be no doubting that William
ranked as a truly deadly foe. His apprenticeship was long since over.
Seasoned in all the arts of war and lordship, and with a reputation
fit to intimidate even the princes of Flanders and Anjou, even the
King of France himself, his prime had turned out a fearsome one. So
too had that of his duchy. Quite as greedy for land and spoils as any
Viking sea king, the great lords of Normandy, men who had grown up by
their duke's side and shared all his ambitions, had emerged as an
elite of warriors superior, in both their discipline and training, to
any in Christendom. ...
"[After defeating the England's King Harald at the Battle of
Hastings], what was left of the English turned at last and fled into
the gathering darkness, to be hunted throughout the night by William's
exultant cavalry, and it was the reek of blood and emptied bowels,
together with the moans and sobs of the wounded, that bore prime
witness to the butchery. Come the morning, however, and daylight
unveiled a spectacle of carnage so appalling that even the victors
were moved to pity. 'Far and wide the earth was covered with the
flower of the English nobility and youth, drenched in gore.' ...
"[After the victory] William's coronation oath, which stated that he
would uphold the laws and customs of his new subjects, had been sworn
with all due solemnity - and sure enough, for the first few years of
his reign, he did indeed attempt to include them as partners within
his new regime. But the English earls could never quite forgo a taste
for revolt - with the result that, soon enough, an infuriated William
was brought to abandon the whole experiment. In its place, he
instituted a far more primal and brutal policy. Just as his ancestors
had cleansed what would become Normandy of its Frankish aristocracy,
so now did William set about the systematic elimination from England
of its entire ruling class. ...
"The task of the Norman lords, set as they were amid a sullen and
fractious people, [became] no different in kind to that of the most
upstart castellan in France. In England, however, it was not just
scattered hamlets and villages that needed to be broken, but a whole
kingdom. In the winter of 1069, when the inveterately rebellious
Northumbrians sought to throw off their new king's rule, William's
response was to harry the entire earldom. Methods of devastation
familiar to the peasantry of France were unleashed across the north of
England: granaries were burned, oxen slaughtered, ploughs destroyed.
Rotting corpses were left to litter the road. The scattered survivors
were reduced to selling themselves into slavery, or else, if reports
are to be believed, to cannibalism."
Author: Tom Holland
Title: The Forge of Christendom: The End of Days and the Epic Rise of
the West
Publisher: Doubleday
Date: Copyright 2008 by Tom Holland
Pages: 289, 316, 325, 327-8.
-------------------------------------------------------------------------------------------------In today's excerpt - error is normal, and making mistakes is a
necessary part of learning. In Teach Like a Champion, Doug Lemov's
brilliant distillation of forty-nine techniques for teachers to use to
improve student performance, he writes that teachers should normalize
error and avoid chastening students for getting it wrong. (Lemov's
book has application far beyond the classroom):
"Error followed by correction and instruction is the fundamental
process of schooling. You get it wrong, and then you get it right. If
getting it wrong and then getting it right is normal, teachers should
Normalize Error and respond to both parts of this sequence as if they
were totally and completely normal. After all, they are.
WRONG ANSWERS: DON'T CHASTEN; DON'T EXCUSE
"Avoid chastening wrong answers, for example, 'No, we already talked
about this. You have to flip the sign, Ruben.' And do not make excuses
for students who get answers wrong: 'Oh, that's okay, Charlise. That
was a really hard one.' In fact, if wrong answers are truly a normal
and healthy part of the learning process, they don't need much
narration at all.
"It's better, in fact, to avoid spending a lot of time talking about
wrongness and get down to the work of fixing it as quickly as
possible. Although many teachers feel obligated to name every answer
as right or wrong, spending time making that judgment is usually a
step you can skip entirely before getting to work. For example, you
could respond to a wrong answer by a student named Noah by saying,
'Let's try that again, Noah. What's the first thing we have to do?' or
even, 'What's the first thing we have to do in solving this kind of
problem, Noah?' This second situation is particularly interesting
because it remains ambiguous to Noah and his classmates whether the
answer was right or wrong as they start reworking the problem. There's
a bit of suspense, and they will have to figure it out for themselves.
When and if you do name an answer as wrong, do so quickly and simply
('not quite') and keep moving. Again, since getting it wrong is
normal, you don't have to feel badly about it. In fact, if all
students are getting all questions right, the work you're giving them
isn't hard enough.
RIGHT ANSWERS: DON'T FLATTER; DON'T FUSS
"Praising right answers can have one of two perverse effects on
students. If you make too much of a fuss, you suggest to students - unless it's patently obvious that an answer really is exceptional - that you're surprised that they got the answer right. And as a variety
of social science research has recently documented, praising students
for being 'smart' perversely incents them not to take risks
(apparently they worry about no longer looking smart if they get
things wrong), in contrast to praising students for working hard,
which incents them to take risks and take on challenges.
"Thus, in most cases when a student gets an answer correct,
acknowledge that the student has done the work correctly or has worked
hard; then move on: 'That's right, Noah. Nice work.' Champion teachers
show their students they expect both right and wrong to happen by not
making too big a deal of either. Of course, there will be times when
you want to sprinkle in stronger praise ('Such an insightful answer,
Carla. Awesome'). Just do so carefully so that such praise isn't
diluted by overuse."
[Editor's note: We were reminded of this principle recently when
touring the Franklin Institute's nationally recognized Science
Leadership Academy and finding that the powerful learning mantra of
the engineering department was "fail early, fail often."]
Author: Doug Lemov
Title: Teach Like a Champion
Publisher: Jossey-Bass
Date: Copyright 2010 by John Wiley & Sons, Inc.
Pages: 221-223
-------------------------------------------------------------------------------------------------In today's excerpt - creating and sustaining the world's first city of
one million people took many things, among them control of the entire
Mediterranean, the continual extraction of food from the farthest
regions of its empire, and the invention of concrete used to build the
aqueducts that supplied its water:
"Although not famed for their technological originality, Romans did
use water to make one transformational innovation - concrete - around
200 BC that helped galvanize their rise as a great power. Light,
strong, and waterproof, concrete was derived from a process that
exploited water's catalytic properties at several stages by adding it
to highly heated limestone. When skillfully produced, the end process
yielded a putty adhesive strong enough to bind sand, stone chips,
brick dust, and volcanic ash. Before hardening, inexpensive concrete
could be poured into molds to produce Rome's hallmark giant
construction projects. One peerless application was the extensive
network of aqueducts that enabled Rome to access, convey, and manage
prodigious supplies of wholesome freshwater for drinking, bathing,
cleaning, and sanitation on a scale exceeding anything realized before
in history and without which its giant metropolis would not have been
possible. ...
"Yet nowhere was Rome's public water system more influential than in
Rome itself. Indeed, Rome's rapid growth to a grand, astonishingly
clean imperial metropolis corresponded closely with the building of its
11 aqueducts over five centuries to AD 226, extending 306 miles in
total length and delivering a continuous, abundant flow of fresh
countryside water from as far away as 57 miles. The aqueducts funneled
their mostly spring-fed water through purifying settling and
distribution tanks to sustain an urban water network that included
1,352 fountains and basins for drinking, cooking and cleaning, 11 huge
imperial baths, 856 free or inexpensive public baths plus numerous,
variously priced private ones, and ultimately to underground sewers
that constantly flushed the wastewater into the Tiber. ...
"Sustaining and housing a population of 1 million may not seem like
much of an accomplishment from the vantage point of the twenty-first
century with its megacities. Yet for most of human history cities were
unsanitary human death traps of inadequate sewerage and fetid water
that bred germs and disease-carrying insects. Athens at its peak was
about one-fifth the size of Rome, and heaped with filth and refuse at
its perimeter. In 1800, only six cities in the world had more than
half a million people - London, Paris, Beijing, Tokyo, Istanbul,
Canton. Despite Rome's hygienic shortcomings - incomplete urban waste
disposal, overcrowded and unsanitary tenements, malaria-infested,
surrounding lowlands - the city's provision of copious amounts of
fresh, clean public water washed away so much filth and disease as to
constitute an urban sanitary breakthrough unsurpassed until the
nineteenth century's great sanitary awakening in the industrialized
West.
"Although there are no precise figures in ancient records on how much
freshwater was delivered daily, it is widely believed that Roman water
availability was stunning by ancient standards and even compared
favorably with leading urban centers until modern times - perhaps as
much as an average of 150 to 200 gallons per day for each Roman.
Moreover, the high quality of the water - the Roman countryside
offered some of the best water quality in all Europe, and still does
so today - was an easily overlooked historical factor in explaining
Rome's rise and endurance."
Author: Steven Solomon
Title: Water
Publisher: Harper
Date: Copyright 2010 by Steven Solomon
Pages: 85-88
-------------------------------------------------------------------------------------------------In today's excerpt - pirates. Pirates have been present throughout
history, have long been most numerous in the teeming shipping lanes of
the Far East, but are best remembered in Western culture for the
Caribbean pirates of the early 1700s:
"There has been piracy since the earliest times. There were Greek
pirates and Roman pirates, and centuries of piracy when the Vikings
and Danes were ravaging the coasts of Europe. The southern shores of
England were infested with smugglers and pirates during Tudor times. A
group of Dutch pirates called the Sea Beggars, or Watergeuzen, played
a small but critical role in the history of the Netherlands. ...
"In the Mediterranean, pirates took part in the holy war which was
waged between the Christians and the Muslims for several centuries:
Barbary corsairs intercepted ships traveling through the Strait of
Gibraltar or coming from the trading ports of Alexandria and Venice,
swooping down on the heavily laden merchantmen in their swift galleys
powered by oars and sails. They looted their cargoes, captured their
passengers and crews, and held them to ransom or sold them into
slavery.
"The French played a major part in the history of piracy. Many of the
most successful and most fearsome of the buccaneers who prowled the
Spanish Main came from French seaports. Corsairs based at Dunkirk
menaced the shipping in the English Channel in the mid-seventeenth
century. Their most famous leader was Jean Bart, who was responsible
for the capture of some eighty ships. He later joined the French navy
and was ennobled by King Louis XIV in 1694.

"The Red Sea and the Persian Gulf were always notorious for pirates,
and the Malabar coast on the western shores of India was home to the
Maratha pirates, led by the Angria family, who plundered the ships of
the East India Company during the first half of the eighteenth
century.
"In the Far East there was piracy on a massive scale. The Ilanun
pirates of the Philippines roamed the seas around Borneo and New
Guinea with fleets of large galleys manned by crews of forty to sixty
men, launching savage attacks on shipping and coastal villages until
they were stamped out by a naval expedition in 1862. But the most
formidable of all, in terms of numbers and cruelties, were the pirates
of the South China Sea. Their activities reached a peak in the early
years of the nineteenth century, when a community of around forty
thousand pirates with some four hundred junks dominated the coastal
waters and attacked any merchant vessels which strayed into the area.
From 1807 these pirates were led by a remarkable woman called Mrs.
Cheng, a former prostitute from Canton.
"[But] this book concentrates on the pirates of the Western world, and
particularly on the great age of piracy, which began in the 1650s and
was brought to an abrupt end around 1725, when naval patrols drove the
pirates from their lairs and mass hangings eliminated many of their
leaders. It is this period which has inspired most of the books,
plays, and films about piracy, and has been largely responsible for
the popular image of the pirate in the West today."
Author: David Cordingly
Title: Under the Black Flag
Publisher: Random House
Date: Copyright 1996 by David Cordingly
Pages: xv-xvii
-------------------------------------------------------------------------------------------------In today's encore excerpt - Native Americans systematically used
large-scale fires to transform the American landscape in the centuries
before European dominance of the continent:
"Adriaen van der Donck was a Dutch lawyer who in 1641 transplanted
himself to the Hudson River Valley. ... He spent a lot of time with
the Haudenosaunee [tribe], whose insistence on personal liberty
fascinated him. ... Every fall, he remembered, the Haudenosaunee set
fire to 'the woods, plains, and meadows,' to 'thin out and clear the
woods of all dead substances and grass, which grow better the ensuing
spring.' At first the wildfire had scared him, but over time van der
Donck had come to relish the spectacle of the yearly burning. 'Such a
fire is a splendid sight when one sails on the [Hudson and Mohawk]
rivers at night while the forest is ablaze on both banks,' he
recalled. With the forest burning to the right and the left, the
colonists' boats passed through a channel of fire, their passengers as
goggle-eyed at the blaze as children at a video arcade. 'Fire and
flames are seen everywhere and on all sides...a delightful scene to
look on from afar.' ...
"[From] Hudson's Bay to the Ro Grande, the Haudenosaunee and almost
every other Indian group shaped their environment, at least in part,
by fire. ... For more than ten thousand years, most North American
ecosystems have been dominated by fire. ...
"Fire is a dominating factor in many if not most terrestrial
landscapes. It has two main sources: lightning and Homo sapiens. In
North America, lightning fire is most common in the western mountains.
Elsewhere, though, Indians controlled it - at least until contact, and
in many places long after. In the Northeast, Indians always carried a
deerskin pouch full of flints, Thomas Morton reported in 1637, which
they used 'to set fire of the country in all places where they come.'
The flints ignited torches, which were as important to the hunt as
bows and arrows. Deer in the Northeast; alligators in the Everglades;
buffalo in the prairies; grasshoppers in the Great Basin; rabbits in
California; moose in Alaska: all were pursued by fire. Native
Americans made big rings of flame, Thomas Jefferson wrote, 'by firing
the leaves fallen on the ground, which, gradually forcing animals to
the center, they there slaughter them with arrows, darts, and other
missiles.' Not that Indians always used fire for strictly utilitarian
purposes. At nightfall tribes in the Rocky Mountains entertained the
explorers Meriwether Lewis and William Clark by applying torches to
sap-dripping fir trees, which then exploded like Roman candles. ...
"Indian fire had its greatest impact in the middle of the continent,
which Native Americans transformed into a prodigious game farm. Native
Americans burned the Great Plains and Midwest prairies so much and so
often that they increased their extent; in all probability, a
substantial portion of the giant grassland celebrated by cowboys was
established and maintained by the people who arrived there first.
'When Lewis and Clark headed west from [St. Louis],' wrote ethologist
Dale Lott, 'they were exploring not a wilderness but a vast pasture
managed by and for Native Americans.'
"In 1792 the surveyor Peter Fidler examined the plains of southern
Alberta systematically, the first European to do so. Riding with
several groups of Indians in high fire season, he spent days on end in
a scorched land. 'Grass all burnt this day,' he reported on November
12. 'Not a single pine to be seen three days past.' A day later: 'All
burnt ground this Day.' A day later: 'The grass nearly burnt all along
this Day except near the Lake.' A month later: 'The Grass is now
burning [with] very great fury.' ... 'These fires burning off the
old grass,' he observed, 'in the ensuing Spring & Summer makes
excellent fine sweet feed for the Horses & Buffalo, &c.' ... Captain
John Palliser, traveling through the same lands as Fidler six decades
later, lamented the Indians' 'disastrous habit of setting the prairie
on fire for the most trivial and worse than useless reasons.' ...
"Carrying their flints and torches, Native Americans were living in
balance with Nature - but they had their thumbs on the scale. Shaped
for their comfort and convenience, the American landscape had come to
fit their lives like comfortable clothing. It was a highly successful
and stable system, if 'stable' is the appropriate word for a regime
that involves routinely enshrouding miles of countryside in smoke and
ash."
Author: Charles C. Mann
Title: 1491: New Revelations of the Americas Before Columbus
Publisher: Vintage
Date: Copyright 2005, 2006 by Charles C. Mann
Pages: Kindle Loc. 4587-4681.
-------------------------------------------------------------------------------------------------In today's excerpt - the dark ages:
"The decline and eventual collapse of the Roman Empire in the West
during the fifth century CE plunged the world into centuries of doom
and gloom, wherein humanity became a collection of dull-witted,
superstition-ridden dolts who accomplished next to nothing and waited
around for the Renaissance to begin.
"Or not.
"Actually, the 'Dark Ages' - the term used to describe the first half
of what is traditionally described as the 'Middle Ages' - is something
of a misnomer. So is the 'Middle Ages' for that matter. The idea that
there was a thousand-year period between the end of the Roman Empire
and the beginnings of the Renaissance where nothing much happened was
fostered mainly by intellectuals starting in the fifteenth century,
especially in Italy. These bright lights wanted to believe - and
wanted others to believe - that they had much more in common with the
Classical Age than they did with the centuries that had just preceded
them. By creating, and then denigrating, the Dark, or Middle, Ages,
the 'humanists' also sought to separate themselves from the very real
decline in the quality of life in most of the European continent after
the Roman system fell apart.
"It was a pretty Eurocentric view of things. In reality, there were a
lot of places in the world where mankind was making strides. Centered
on what is now Turkey, the Byzantine Empire was a direct link to the
culture and learning of ancient Greece and Rome. In the deserts of
what is now Saudi Arabia, an empire centered on the new religion of
Islam was spreading with lightning speed, and carrying with it not
only new beliefs but also new ways of looking at medicine, math, and
the stars. In the North Atlantic, Scandinavian ships were exploring
the fringes of a New World, while in the Pacific, the Polynesians were
pushing across even more vast aquatic distances to settle in virtually
every inhabitable island they could find.
"In the jungles of Central America, the Maya were reaching the peak of
a fairly impressive civilization. In the jungles of Southeast Asia,
the Khmer were setting up an equally impressive cultural and trade
center. Even in Europe, which admittedly was pretty much a mess,
devoted monks were doing their best to keep the flame of learning
burning. ... [and] there was progress. Stone and wooden tools were
replaced with metal implements. The water-powered mill became
commonplace. Farmers learned to rotate their crops in order to
rejuvenate soil. And the harness was redesigned so that it fell across
a horse's shoulders rather than its throat, thus increasing its
proficiency in pulling a plow.
"There were astounding feats of human endeavor, such as the
construction of the Grand Canal in China, which stretched more than
1,200 miles and connected the farmlands of the Yangtze Valley with the
markets of Luoyang and Chang-an. ... And there were equally astounding
feats of individual endeavor, such as the founding of a major world
religion by a comfortably fixed middle-aged Arab trader who became
known as the Prophet Muhammad...."
Author: Erik Sass and Steve Wiegand
Title: The Mental Floss History of the World
Publisher: Harper
Date: Copyright 2008 by Mental Floss LLC
Pages: 127-128, 137
-------------------------------------------------------------------------------------------------In today's excerpt - soldiers, demobilizing after the end of a war,
are vulnerable to the lures of crime. They are largely young, have few
marketable skills, are often unemployed, and are trained to fight. Two
of the greatest periods of pirate activity in the Mediterranean and
Caribbean followed the ends of two of England's long wars, and those
periods of piracy died out soon after those generations passed:
"Two of the most dramatic increases in pirate activity took place when
peace was declared after long periods of naval warfare and large
numbers of seamen were out of work.
"[The first such surge began] when fifty years of hostilities between
England and Spain were finally ended in 1603, [and] hundreds of seamen
from the Royal Navy and from privateers were thrown on the streets.
Their only skill was in handling a ship, and many turned to piracy.
For the next thirty years, shipping in the English Channel, the Thames
estuary, and the Mediterranean was ravaged by pirates.
"The second surge in piracy took place in the years following the
Treaty of Utrecht of 1713, which brought peace among England, France,
and Spain. The size of the Royal Navy slumped from 53,785 in 1703 to
13,430 in 1715, putting 40,000 seamen out of work. There is no proof
that these men joined the ranks of the pirates, and Marcus Rediker has
pointed out that most pirates were drawn from the merchant navy, not
the Royal Navy; but many contemporary observers believed that the rise
in pirate attacks in the years after the Peace of Utrecht was due to
the large numbers of unemployed seamen. They particularly blamed the
Spanish for driving the logwood cutters out of the bays of Campeche
and Honduras after the Treaty of Utrecht, and they also blamed the
privateers. Many privateering commissions had been issued in the later
years of the seventeenth century, particularly in the West Indies.
Peace put an end to this, and the Governor of Jamaica warned London of
the likely outcome: 'Since the calling in of our privateers, I find
already a considerable number of seafaring men at the towns of Port
Royal and Kingston that can't find employment, who I am very
apprehensive, for want of occupation in their way, may in a short time
desert us and turn pirates.' "
Author: David Cordingly
Title: Under the Black Flag
Publisher: Random House
Date: Copyright 1996 by David Cordingly
Pages: 192-193
-------------------------------------------------------------------------------------------------In today's excerpt - clouds and the names of clouds:
"The person most frequently identified as the father of modern
meteorology was an English pharmacist named Luke Howard, who came to
prominence at the beginning of the nineteenth century. Howard is
chiefly remembered now for giving cloud types their names in 1803. ...
"Howard divided clouds into three groups: stratus for the layered
clouds, cumulus for the fluffy ones (the word means 'heaped' in
Latin), and cirrus (meaning 'curled') for the high, thin feathery
formations that generally presage colder weather. To these he
subsequently added a fourth term, nimbus (from the Latin for 'cloud'),
for a rain cloud. The beauty of Howard's system was that the basic
components could be freely recombined to describe every shape and size
of passing cloud - stratocumulus, cirrostratus, cumulocongestus, and
so on. It was an immediate hit, and not just in England. The poet
Johann von Goethe in Germany was so taken with the system that he
dedicated four poems to Howard.
"Howard's system has been much added to over the years, so much so
that the encyclopedic if little read International Cloud Atlas runs to
two volumes, but interestingly virtually all the post-Howard cloud
types - mammatus, pileus, nebulosis, spissatus, floccus, and mediocris
are a sampling - have never caught on with anyone outside meteorology
and not terribly much there, I'm told. Incidentally, the first, much
thinner edition of that atlas, produced in 1896, divided clouds into
ten basic types, of which the plumpest and most cushiony-looking was
number nine, cumulonimbus.* That seems to have been the source of the
expression 'to be on cloud nine.'
"For all the heft and fury of the occasional anvil-headed storm cloud,
the average cloud is actually a benign and surprisingly insubstantial
thing. A fluffy summer cumulus several hundred yards to a side may
contain no more than twenty-five or thirty gallons of water - 'about
enough to fill a bathtub,' as James Trefil has noted. You can get some
sense of the immaterial quality of clouds by strolling through fog - which is, after all, nothing more than a cloud that lacks the will to
fly. To quote Trefil again: 'If you walk 100 yards through a typical
fog, you will come into contact with only about half a cubic inch of
water - not enough to give you a decent drink.' In consequence, clouds
are not great reservoirs of water. Only about 0.035 percent of the
Earth's fresh water is floating around above us at any moment.
" *If you have ever been struck by how beautifully crisp and well
defined the edges of cumulus clouds tend to be, while other clouds are
more blurry, the explanation is that in a cumulus cloud there is a
pronounced boundary between the moist interior of the cloud and the
dry air beyond it. Any water molecule that strays beyond the edge of
the cloud is immediately zapped by the dry air beyond, allowing the
cloud to keep its fine edge. Much higher cirrus clouds are composed of
ice, and the zone between the edge of the cloud and the air beyond is
not so clearly delineated, which is why they tend to be blurry at the
edges."
Author: Bill Bryson
Title: A Short History of Nearly Everything
Publisher: Broadway
Date: Copyright 2003 by Bill Bryson
Pages: 263-265
-------------------------------------------------------------------------------------------------In today's excerpt - violence is a recurring problem in America. Five
years after their initial injury, twenty percent of those who have had
a gunshot or stab wound will be dead:

"According to statistics from the Centers for Disease Control and


Prevention (CDC), young black men have a higher rate of both fatal and
nonfatal violence than any other group. National statistics show that
homicide is the leading cause of death for African American men
between the ages of 15 and 34. In 2006, 2,946 black males between the
ages of 15 and 24 were victims of homicide. This means that the
homicide rate for black males aged 15 to 24 was 92 in 100,000. For
white males in the same age range, the homicide rate was 4.7 in
100,000. In other words, the homicide death rate was more than 19
times higher for young black men than young white men.
"Homicide numbers across the nation have decreased over the past
decade, but a closer look at these homicide statistics shows
disturbing trends. Daniel Webster and his colleagues at the Johns
Hopkins School of Public Health have found that although overall
homicide rates have appeared stable since 1999, the homicide rate
among African American men between the ages of 25 and 44 has increased
substantially. It is no wonder, then, that as these homicides are
reported in the news, flashed across television screens, and
recapitulated in films, we would come to associate young black men
with homicide.
"But homicide represents only the tip of the iceberg with regard to
violence. Nonfatal injuries are far more common than fatal injuries.
The CDC estimates that for every homicide, there are more than 94
nonfatal violent incidents. Even with the increasing lethality of the
guns available, the ratio of firearm-related injuries from nonfatal
physical assaults to firearm-related homicides was four to one. In
other words, for every person who gets shot and dies, another four get
shot and survive.
"While it is true that a person is more likely to die of a gunshot
wound than from injuries delivered by other kinds of weapons, many
young people are stabbed or assaulted. The ratios of nonfatal to
fatal injuries for other types of violence show the same pattern. For
those who are stabbed or cut, 64 people survive for each person who
dies. For physical assaults, 3,243 people survive for each person who
dies. In nonfatal injury, just as in homicide, black males are
disproportionately affected. In data from the year 2000, the overall
violent assault rate for black males was 4.6 times higher than the
rate for non-Hispanic white males. Countless others suffer trauma or
near-trauma that never comes to the attention of the health care
system, like being shot at or being grazed by a bullet or beaten up
but not badly enough to seek medical care.
"Studies also show that violence is a recurrent problem. Up to 45
percent of people who have had a penetrating injury - a gunshot or
stab wound - will have another similar injury within five years. More
disturbing is the finding that five years after their initial injury,
20 percent of these individuals are dead."
Author: John A. Rich, M.D., M.P.H.
Title: Wrong Place, Wrong Time
Publisher: Johns Hopkins
Date: Copyright 2009 by Johns Hopkins University Press
Pages: ix-xi
-------------------------------------------------------------------------------------------------In today's excerpt - Joseph Pulitzer (1847-1911), the son of Hungarian
Jews, and his New York World newspaper brought in the age of mass
communications. His circulation battle with his upstart competitor
William Randolph Hearst (1863-1951) is often credited with
precipitating the Spanish-American War:
"[Even in his twilight years, as he traveled the globe, Joseph]
Pulitzer never relaxed his grip on the World, his influential New York
newspaper that had ushered in the modern era of mass communications.
An almost unbroken stream of telegrams, all written in code, flowed
from ports and distant destinations to New York, directing every part
of the paper's operation. The messages even included such details as
the typeface used in an advertisement and the vacation schedule of
editors. Managers shipped back reams of financial data, editorial
reports, and espionage-style accounts of one another's work. Although
he had set foot in his skyscraper headquarters on Park Row only three
times, whenever anyone talked about the newspaper it was always
'Pulitzer's World.'
"And it was talked about. Since Pulitzer took over the moribund
newspaper in 1883 and introduced his brand of journalism to New York,
the World had grown at meteoric speed, becoming, at one point, the
largest circulating newspaper on the globe. Six acres of spruce trees
were felled a day to keep up with its demand for paper, and almost
every day enough lead was melted into type to set an entire Bible into
print.
"Variously credited with having elected presidents, governors, and
mayors; sending politicians to jail; and dictating the public agenda,
the World was a potent instrument of change. As a young man in a hurry
Pulitzer had unabashedly used the paper as a handmaiden of reform, to
raise social consciousness and promote a progressive - almost radical
- political agenda. The changes he had called for, like the outlandish
ideas of taxing inheritances, income, and corporations, had become
widely accepted.
"'The World should be more powerful than the President,' Pulitzer once
said. ...

"The [explosion of the USS Maine], coming at a time of rising tension


between Spain and America, became incendiary kindling in the hands of
battling newspaper editors in New York. William Randolph Hearst, a
young upstart imitator from California armed with an immense family
fortune, had done the unthinkable. In 1898 his paper, the New York
Journal, was closing in on the World's dominance of Park Row. Fighting
down to the last possible reader, each seeking to outdo the other in
its eagerness to lead the nation into war, the two journalistic
behemoths fueled an outburst of jingoistic fever. And when the war
came, they continued their cutthroat competition by marshaling armies
of reporters, illustrators, and photographers to cover every detail of
its promised glory.
"The no-holds-barred attitude of the World and Journalput the
newspapers into a spiraling descent of sensationalism, outright
fabrications, and profligate spending. If left unchecked, it
threatened to bankrupt both their credibility and their
businesses. ... In the end, the two survived this short but intense
circulation war. But their rivalry became almost as famous as the
Spanish-American War itself. Pulitzer was indissolubly linked with
Hearst as a purveyor of vile Yellow Journalism. In fact, some critics
suspected that Pulitzer's current plans to endow a journalism school
at Columbia University and create a national prize for journalists
were thinly veiled attempts to cleanse his legacy before his
approaching death."
Author: James McGrath Morris
Title: Pulitzer
Publisher: Harper
Date: Copyright 2009 by James McGrath Morris
Pages: 2-4
-------------------------------------------------------------------------------------------------In today's encore excerpt - the Renaissance in Europe owed a
tremendous debt to the inventions that Marco Polo (1254-1324), his
father Niccolò and his uncle Maffeo brought back to Venice from their
twenty-four years of travel in China:
"[Upon their return from China], the three Polos received respect from
their fellow citizens, with Marco singled out for special attention.
'All the young men went every day continuously to visit and converse
with Messer Marco,' Giambattista Ramusio claimed, 'who was most
charming and gracious, and to ask of him matters concerning Cathay
(China) and the Great Khan, and he responded with so much kindness
that all felt themselves to be in a certain manner indebted to him.'
"It is easy to understand why Marco attracted notice. The significance
of the inventions that he brought back from China, or which he later
described in his Travels, cannot be overstated. At first, Europeans
regarded these technological marvels with disbelief, but eventually
they adopted them.
"Paper money, virtually unknown in the West until Marco's return,
revolutionized finance and commerce throughout the West.
"Coal, another item that had caught Marco's attention in China,
provided a new and relatively efficient source of heat to an
energy-starved Europe.
"Eyeglasses (in the form of ground lenses), which some accounts say he
brought back with him, became accepted as a remedy for failing
eyesight. In addition, lenses gave rise to the telescope - which in
turn revolutionized naval battles, since it allowed combatants to view
ships at a great distance - and the microscope. Two hundred years
later, Galileo used the telescope - based on the same technology - to
revolutionize science and cosmology by supporting and disseminating
the Copernican theory that Earth and other planets revolved around the
Sun.
"Gunpowder, which the Chinese had employed for at least three
centuries, revolutionized European warfare as armies exchanged their
lances, swords, and crossbows for cannon, portable harquebuses, and
pistols.
"Marco brought back gifts of a more personal nature as well. The
golden paiza, or passport, given to him by Kublai Khan had seen him
through years of travel, war, and hardship. Marco kept it still, and
would to the end of his days. He also brought back a Mongol servant,
whom he named Peter, a living reminder of the status he had once
enjoyed in a far-off land.
"In all, it is difficult to imagine the Renaissance - or, for that
matter, the modern world - without the benefit of Marco Polo's example
of cultural transmission between East and West."
Author: Laurence Bergreen
Title: Marco Polo
Publisher: Knopf
Date: Copyright 2007 by Laurence Bergreen
Pages: 320-321
-------------------------------------------------------------------------------------------------In today's excerpt - the teachings of Siddhatta Gotama (Siddhartha
Gautama), the Buddha (circa 500 BCE), did not include such items as an
explanation of the origin of the universe, because he was only
concerned with those teachings that helped relieve suffering:

"The Buddha had no time for doctrines or creeds; he had no theology to


impart, no theory about the root cause of dukkha (suffering), no tales
of an Original Sin, and no definition of the Ultimate Reality. He saw
no point in such speculations. Buddhism is disconcerting to those who
equate faith with belief in certain inspired religious opinions. A
person's theology was a matter of total indifference to the Buddha. To
accept a doctrine on somebody else's authority was, in his eyes, an
'unskillful' state, which could not lead to enlightenment, because it
was an abdication of personal responsibility. He saw no virtue in
submitting to an official creed. 'Faith' meant trust that Nibbana
(nirvana) existed and a determination to prove it to oneself. The
Buddha always insisted that his disciples test everything he taught
them against their own experience and take nothing on hearsay. A
religious idea could all too easily become a mental idol, one more
thing to cling to, when the purpose of the dhamma (dharma, religious
teachings or truths) was to help people to let go.
" 'Letting go' is one of the keynotes of the Buddha's teaching. The
enlightened person did not grab or hold on to even the most
authoritative instructions. Everything was transient and nothing
lasted. Until his disciples recognized this in every fiber of their
being, they would never reach Nibbana. Even his own teachings must be
jettisoned, once they had done their job. He once compared them to a
raft, telling the story of a traveler who had come to a great expanse
of water and desperately needed to get across. There was no bridge, no
ferry, so he built a raft and rowed himself across the river. But
then, the Buddha would ask his audience, what should the traveler do
with the raft? Should he decide that because it had been so helpful to
him, he should load it onto his back and lug it around with him
wherever he went? Or should he simply moor it and continue his
journey? The answer was obvious. 'In just the same way, bhikkhus
(monks), my teachings are like a raft, to be used to cross the river
and not to be held on to,' the Buddha concluded. 'If you understand
their raft-like nature correctly, you will even give up good
teachings, not to mention bad ones!'
"His Dhamma was wholly pragmatic. Its task was not to issue infallible
definitions or to satisfy a disciple's intellectual curiosity about
metaphysical questions. Its sole purpose was to enable people to get
across the river of pain to the 'further shore.' His job was to
relieve suffering and help his disciples attain the peace of Nibbana.
Anything that did not serve that end was of no importance whatsoever.
"Hence there were no abstruse theories about the creation of the
universe or the existence of a Supreme Being. These matters might be
interesting but they would not give a disciple enlightenment or
release from dukkha. One day, while living in a grove of simsapa trees
in Kosambi, the Buddha plucked a few leaves and pointed out to his
disciples that there were many more still growing in the wood. So too
he had only given them a few teachings and withheld many others. Why?
'Because, my disciples, they will not help you, they are not useful in
the quest for holiness, they do not lead to peace and to the direct
knowledge of Nibbana.' He told one monk, who kept pestering him about
philosophy, that he was like a wounded man who refused to have
treatment until he learned the name of the person who had shot him and
what village he came from: he would die before he got this useless
information. In just the same way, those who refused to live according
to the Buddhist method until they knew about the creation of the world
or the nature of the Absolute would die in misery before they got an
answer to these unknowable questions. What difference did it make if
the world was eternal or created in time? Grief, suffering and misery
would still exist. The Buddha was concerned simply with the cessation
of pain. 'I am preaching a cure for these unhappy conditions here and
now,' the Buddha told the philosophically inclined bhikkhu, 'so always
remember what I have not explained to you and the reason why I have
refused to explain it.' "
Author: Karen Armstrong
Title: Buddha
Publisher: Penguin
Date: Copyright 2001 by Karen Armstrong
Pages: 100-103
-------------------------------------------------------------------------------------------------In today's excerpt - our American narrative says that in 1865 the
slaves of the South were free. But in 1877, and in the decades that
followed, the lot of blacks had scarcely improved. Enforced servitude,
intimidation and murder were routinely carried out and condoned, and
few federal troops were available to travel south and enforce the
new laws of the land. A former slave named Henry Adams kept a list:
"Henry Adams kept a list. It was a long list, and one that kept
growing. Every time whites committed a violent act against blacks in
his northern Louisiana parish of Claiborne, Adams would add a new
entry. There was number 323, Manuel Gregory, who was hanged for
'talking to a white girl,' and number 333, Abe Young, who boasted that
he was going to vote Republican, for which crime he was 'shot by white
men.' Number 453, Jack Shanbress, was whipped and then shot 'because
he was president of a Republican club.' Ben Gardner, number 454, was
'badly beaten by white men' for refusing to work another year on 'Mr.
Gamble's plantation.' Eliza Smith, number 486, was 'badly whipped by
Frank Hall' for 'not being able to work while sick,' while a black man
known only as Jack, number 599, was 'hung dead, by white men,' for
having 'sauced a white man' - talking back after having received
instructions. By the time Henry Adams presented his list to a
committee of the United States Senate in 1878, there were 683 violent
incidents.

"When the committee asked Adams what they could do to help, he


responded that only federal troops proved effective in curbing
violence. The white terrorists, called 'bulldozers' in Louisiana, had
no reason to fear local law enforcement, which they dominated. When
federal troops came into his parish during the 1876 election, the
bulldozers 'stopped killing our people as much as they had been; the
White Leagues stopped raging about with their guns so much.' But only
the governor could request federal assistance, and that office had
fallen into the hands of the terrorists themselves.
"Henry Adams had been a slave for twenty-two years and knew well the
anger of Louisiana's whites. But when he joined the United States
Army, he met whites worthy of his respect, and in 1869 at Fort Jackson
he began attending a school for black soldiers run by a white woman
named Mrs. Bentine. Adams learned to read and write, and felt a new
world opening before him, one that promised greater equality and
opportunity. The following year he voted for the first time and
perceived the potential of democracy, becoming a leader in his
community and a successful businessman. As an organizer for the
Republican Party, he found a number of whites with whom he could work
and whom he esteemed for their honesty and courage, but the majority
of whites belonged to the Democratic Party and sought to silence those
with whom they disagreed.
"Reconstruction gave the former slaves hope for the future, but it
aroused the rage of the defeated Confederates who despised the new
order. ... Those who did not share a conviction in the inherent right
of white men to rule totally were to be silenced and the South would
speak with a single voice. These whites, so often the same people who
had supported slavery and secession in the past, did not hesitate to
use violence to attain their ends - thus Adams's list.
"Joe Johnson was another name on that list, but also Adams's friend.
He had been elected constable in East Feliciana Parish on the
Republican ticket in November 1876. When Adams went to visit his
friend in the first days of 1877, he found a grieving widow standing
by the smoldering ruin of Johnson's house. She told him how more than
fifty white men had come to their house and killed her husband
'because he refused to resign his office as constable.' They set fire
to the house with Johnson inside, leaving him for dead. But Johnson
crawled from the house into a pool of water, even though 'all the skin
was burnt off of him.' The terrorists saw Johnson and shot him several
times, though he lingered on for several days before dying. Adams had
to admit to Johnson's widow that there was little chance for justice,
'as I knew that white men had been killing our race so long, and they
had not been stopped yet.' Standing with her children, Mrs. Johnson
wept, 'O, Lord God of Hosts, help us to get out of this country and
get somewhere where we can live.' "

Author: Michael A. Bellesiles
Title: 1877
Publisher: The New Press
Date: Copyright 2010 by Michael A. Bellesiles
Pages: 21-22
-------------------------------------------------------------------------------------------------In today's excerpt - the cause of suicide terrorism:
"Suicide terrorism is rising around the world, but there is great
confusion as to why. Since many such attacks - including, of course,
those of September 11, 2001 - have been perpetrated by Muslim
terrorists professing religious motives, it might seem obvious that
Islamic fundamentalism is the central cause. This presumption has
fueled the belief that future 9/11's can be avoided only by a
wholesale transformation of Muslim societies, a core reason for broad
public support in the United States for the recent conquest of Iraq.
"However, the presumed connection between suicide terrorism and
Islamic fundamentalism is misleading and may be encouraging domestic
and foreign policies likely to worsen America's situation and to harm
many Muslims needlessly.
"I have compiled a database of every suicide bombing and attack around
the globe from 1980 through 2003 - 315 attacks in all. It includes
every attack in which at least one terrorist killed himself or herself
while attempting to kill others; it excludes attacks authorized by a
national government, for example by North Korea against the South.
This database is the first complete universe of suicide terrorist
attacks worldwide. I have amassed and independently verified all the
relevant information that could be found in English and other
languages (for example, Arabic, Hebrew, Russian, and Tamil) in print
and on-line. The information is drawn from suicide terrorist groups
themselves, from the main organizations that collect such data in
target countries, and from news media around the world. More than a
'list of lists,' this database probably represents the most
comprehensive and reliable survey of suicide terrorist attacks that is
now available.
"The data show that there is little connection between suicide
terrorism and Islamic fundamentalism, or any one of the world's
religions. In fact, the leading instigators of suicide attacks are the
Tamil Tigers in Sri Lanka, a Marxist-Leninist group whose members are
from Hindu families but who are adamantly opposed to religion. This
group committed 76 of the 315 incidents, more suicide attacks than
Hamas.
"Rather, what nearly all suicide terrorist attacks have in common is a

specific secular and strategic goal: to compel modern democracies to


withdraw military forces from territory that the terrorists consider
to be their homeland. Religion is rarely the root cause, although it
is often used as a tool by terrorist organizations in recruiting and
in other efforts in service of the broader strategic objective.
"Three general patterns in the data support my conclusions. First,
nearly all suicide terrorist attacks occur as part of organized
campaigns, not as isolated or random incidents. Of the 315 separate
attacks in the period I studied, 301 could have their roots traced to
large, coherent political or military campaigns.
"Second, democratic states are uniquely vulnerable to suicide
terrorists. The United States, France, India, Israel, Russia, Sri
Lanka, and Turkey have been the targets of almost every suicide attack
of the past two decades, and each country has been a democracy at the
time of the incidents.
"Third, suicide terrorist campaigns are directed toward a strategic
objective. From Lebanon to Israel to Sri Lanka to Kashmir to Chechnya,
the sponsors of every campaign have been terrorist groups trying to
establish or maintain political self-determination by compelling a
democratic power to withdraw from the territories they claim. Even
al-Qaeda fits this pattern: although Saudi Arabia is not under American
military occupation per se, a principal objective of Osama bin Laden
is the expulsion of American troops from the Persian Gulf and the
reduction of Washington's power and influence in the region."
Author: Robert A. Pape
Title: Dying to Win
Publisher: Random House
Date: Copyright 2005 by Robert A. Pape
Pages: 3-4
-------------------------------------------------------------------------------------------------In today's excerpt - the extraordinary entrepreneurial culture of
Israel:
"[Israel boasts] the highest density of start-ups in the world (a
total of 3,850 start-ups, one for every 1,844 Israelis), [and] more
Israeli companies are listed on the NASDAQ exchange than all companies
from the entire European continent. ...
"In 2008, per capita venture capital investments in Israel were 2.5
times greater than in the United States, more than 30 times greater
than in Europe, 80 times greater than in China, and 350 times greater
than in India. Comparing absolute numbers, Israel - a country of just
7.1 million people - attracted close to $2 billion in venture capital,
as much as flowed to the United Kingdom's 61 million citizens or to
the 145 million people living in Germany and France combined. And
Israel is the only country to experience a meaningful increase in
venture capital from 2007 to 2008 [in the face of a global financial
crisis.]
"After the United States, Israel has more companies listed on the
NASDAQ than any other country in the world, including India, China,
Korea, Singapore, and Ireland. And Israel is the world leader in the
percentage of the economy that is spent on research and development.
Israel's economy has also grown faster than the average for the
developed economies of the world in most years since 1995.
"Even the wars Israel has repeatedly fought have not slowed the
country down. During the six years following 2000, Israel was hit not
just by the bursting of the global tech bubble but by the most intense
period of terrorist attacks in its history and by the second Lebanon
war. Yet Israel's share of the global venture capital market did not
drop - it doubled, from 15 percent to 31 percent. And the Tel Aviv
stock exchange was higher on the last day of the Lebanon war than on
the first, as it was after the three-week military operation in the
Gaza Strip in 2009.
"The Israeli economic story becomes even more curious when one
considers the nation's dire state just a little over a half century
ago.
"[The importance of start-ups and venture capital in any country is
hard to overstate.] According to the pioneering work of Nobel Prize
winner Robert Solow, technological innovation is the ultimate source
of productivity and growth. It's the only proven way for economies to
consistently get ahead - especially innovation born by start-up
companies. Recent Census Bureau data show that most of the net
employment gains in the United States between 1980 and 2005 came from
firms younger than five years old. Without start-ups, the average
annual net employment growth rate would actually have been negative."
Author: Dan Senor and Saul Singer
Title: Start-Up Nation
Publisher: Twelve/Hachette
Date: Copyright 2009 by Dan Senor and Saul Singer
Pages: 11-19
-------------------------------------------------------------------------------------------------In today's excerpt - the deadlocked presidential election of 1876,
during the nation's centennial, pitted New York Democrat Samuel Tilden
against Ohio Republican Rutherford B. Hayes. At stake was enough
autonomy for Southern states to disenfranchise blacks - and massive
voting fraud in states like South Carolina, Florida and Louisiana gave
Tilden the electoral edge. President Grant armed Washington against
rumored attacks, and the crisis was not resolved until March of 1877
in a deal that gave Hayes the presidency in trade for the tacit
authority these Southern states sought:
"As the new year of 1877 dawned, the nation appeared hopelessly
deadlocked. Officially Tilden had 184 electoral votes and Hayes 165,
leaving 20 votes up for grabs. Hayes needed them all; Tilden required
only a single vote to be president. The framers of the Constitution
had not considered such a situation, simply stating that the electoral
votes should be 'directed to the President of the Senate,' typically
the vice president of the United States, who 'shall, in the presence
of the Senate and House of Representatives, open all the Certificates
and the votes shall then be counted.' But who decided which votes to
open and read if there were two [different sets of votes] - or, as
with Florida, three sets? ...
"Congress struggled to find a solution, remaining in continuous
session into March. In January, each house appointed a committee to
investigate the election. The House committee, dominated by Democrats,
discovered that corruption in the three questionable states meant that
all three should go to Tilden; the Senate committee, dominated by
Republicans, concluded that fraud and voter suppression in the three
states meant that all should go to Hayes. This was not helpful. The
House Judiciary Committee then suggested the appointment of a joint
special commission, which, after some very careful negotiation, led to
a commission of five House members, five senators, and five Supreme
Court justices. Originally the five justices were to be drawn from a
hat, but Tilden killed that plan with the bon mot, 'I may lose the
Presidency, but I will not raffle for it.' While Tilden and many other
political leaders doubted the constitutionality of the commission, a
consensus emerged that there were so many recipes for disaster that
some resolution was required as quickly as possible, no matter how
tenuous the legality of the process. Hayes and Tilden reluctantly
accepted the commission in order to avoid a civil war. When one of
Tilden's advisers suggested publicly opposing the commission, Tilden
shot back, 'What is left but war?'
"Tilden's fears found validation in the increasing calls for violence
circulating through the country. It was a time of rumors, disturbing
and bizarre - and occasionally true - as well as loud demands for
violence. Reportedly, President Grant was planning a coup, while
Confederate general Joseph Shelby supposedly announced in St. Louis
that he would lead an army on Washington to put Tilden in the White
House. Hearing this latter story, Confederate hero Colonel John S.
Mosby, the 'Gray Ghost,' went to the White House and offered Grant his
services to help ensure Hayes's inauguration. ...
"Troubled by the professed willingness of his fellow Americans to take

up arms so soon after their devastating Civil War, President Grant


prepared to defend the capital. Grant could call on only 25,000 unpaid
troops, most of them in the West, and had to tread lightly. He could
not afford to alienate the Democrats, but they gave every indication
of deliberately weakening the ability of the federal government to
protect its democratic institutions. Grant adroitly maneuvered his
available units to send a message of resolve while not appearing
aggressive, ordering artillery companies placed on all the entrances
to Washington, D.C., the streets of which, as the New York Herald
reported, 'presented a martial appearance.' Grant ordered the
man-of-war Wyoming to anchor in the Potomac River by the Navy Yard, where its
guns could cover both the Anacostia Bridge from Maryland and the Long
Bridge from Virginia. Meanwhile, a company of Marines took up position
on the Chain Bridge. General Sherman told the press, 'We must protect
the public property, . . . particularly the arsenals.' There was no
way Sherman was going to let white Southerners get their hands on
federal arms without a fight, and his clever placement of a few units
helped to forestall possible coups in Columbia and New Orleans." ...
"Members of Congress began bringing pistols to the Capitol, and in
Columbus, Ohio, a bullet was shot through a window of the Hayes home
while the family was at dinner."
Author: Michael A. Bellesiles
Title: 1877
Publisher: The New Press
Date: Copyright 2010 by Michael A. Bellesiles
Pages: 38-41
-------------------------------------------------------------------------------------------------In today's excerpt - flies, elephants, cities, and ideas:
"Scientists and animal lovers had long observed that as life gets
bigger, it slows down. Flies live for hours or days; elephants live
for half-centuries. The hearts of birds and small mammals pump blood
much faster than those of giraffes and blue whales. But the
relationship between size and speed didn't seem to be a linear one. A
horse might be five hundred times heavier than a rabbit, yet its pulse
certainly wasn't five hundred times slower than the rabbit's. After a
formidable series of measurements in his Davis lab, [Swiss scientist
Max] Kleiber discovered that this scaling phenomenon stuck to an
unvarying mathematical script called 'negative quarter-power scaling.'
If you plotted mass versus metabolism on a logarithmic grid, the
result was a perfectly straight line that led from rats and pigeons
all the way up to bulls and hippopotami. ...
"The more species Kleiber and his peers analyzed, the clearer the
equation became: metabolism scales to mass to the negative quarter
power. The math is simple enough: you take the square root of 1,000,
which is (approximately) 31, and then take the square root of 31,
which is (again, approximately) 5.5. This means that a cow, which is
roughly a thousand times heavier than a woodchuck, will, on average,
live 5.5 times longer, and have a heart rate that is 5.5 times slower
than the woodchuck's. As the science writer George Johnson once
observed, one lovely consequence of Kleiber's law is that the number
of heartbeats per lifetime tends to be stable from species to species.
Bigger animals just take longer to use up their quota. ...
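[Editor's note: the quarter-power arithmetic above is easy to check. A
minimal Python sketch follows; the function name and the sample mass
ratios are illustrative only, not drawn from Kleiber or the excerpt.]

    # Quarter-power scaling: lifespan (or heart period) grows as mass ** 0.25.
    def quarter_power_ratio(mass_ratio):
        """Lifespan or heart-period ratio implied by quarter-power scaling."""
        return mass_ratio ** 0.25

    # The 'square root of a square root' shortcut from the text:
    # sqrt(1000) is about 31 and sqrt(31) is about 5.5; computed in one
    # step, 1000 ** 0.25 is about 5.62.
    print(quarter_power_ratio(1000))

    # Why heartbeats per lifetime stay roughly constant across species:
    # heart rate scales as mass ** -0.25 and lifespan as mass ** 0.25,
    # so their product (beats per lifetime) does not depend on mass.
    for mass_ratio in (1, 10, 1000):
        print(mass_ratio, (mass_ratio ** -0.25) * (mass_ratio ** 0.25))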
"Several years ago, the theoretical physicist Geoffrey West decided to
investigate whether Kleiber's law applied to one of life's largest
creations: the superorganisms of human-built cities. Did the
'metabolism' of urban life slow down as cities grew in size? Was there
an underlying pattern to the growth and pace of life of metropolitan
systems? Working out of the legendary Santa Fe Institute, where he
served as president until 2009, West assembled an international team
of researchers and advisers to collect data on dozens of cities around
the world, measuring everything from crime to household electrical
consumption, from new patents to gasoline sales.
"When they finally crunched the numbers, West and his team were
delighted to discover that Kleiber's negative quarter-power scaling
governed the energy and transportation growth of city living. The
number of gasoline stations, gasoline sales, road surface area, the
length of electrical cables: all these factors follow the exact same
power law that governs the speed with which energy is expended in
biological organisms. If an elephant was just a scaled-up mouse, then,
from an energy perspective, a city was just a scaled-up elephant.
"But the most fascinating discovery in West's research came from the
data that didn't turn out to obey Kleiber's law. West and his team
discovered another power law lurking in their immense database of
urban statistics. Every datapoint that involved creativity and
innovation - patents, R&D budgets, 'supercreative' professions,
inventors - also followed a quarter-power law, in a way that was every
bit as predictable as Kleiber's law. But there was one fundamental
difference: the quarter-power law governing innovation was positive,
not negative. A city that was ten times larger than its neighbor
wasn't ten times more innovative; it was seventeen times more
innovative. A metropolis fifty times bigger than a town was 130 times
more innovative.
"Kleiber's law proved that as life gets bigger, it slows down. But
West's model demonstrated one crucial way in which human-built cities
broke from the patterns of biological life: as cities get bigger, they
generate ideas at a faster clip. This is what we call 'superlinear
scaling': if creativity scaled with size in a straight, linear
fashion, you would of course find more patents and inventions in a
larger city, but the number of patents and inventions per capita would
be stable. West's power laws suggested something far more provocative:
that despite all the noise and crowding and distraction, the average
resident of a metropolis with a population of five million people was
almost three times more creative than the average resident of a town
of a hundred thousand."
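[Editor's note: the superlinear figures quoted above are consistent with
a positive quarter-power law, i.e. totals growing as population ** 1.25
and per-capita output as population ** 0.25. A minimal Python check,
with illustrative names only:]

    # Superlinear scaling of totals: a city k times larger shows
    # roughly k ** 1.25 times the total innovation.
    def total_innovation_ratio(size_ratio, exponent=1.25):
        """Total-innovation ratio for cities differing in size by size_ratio."""
        return size_ratio ** exponent

    print(total_innovation_ratio(10))   # ~17.8: 'seventeen times more innovative'
    print(total_innovation_ratio(50))   # ~133: '130 times more innovative'

    # Per capita the exponent drops by 1, to 0.25: a metropolis of
    # 5,000,000 versus a town of 100,000 gives about 2.66, the 'almost
    # three times more creative' figure in the text.
    print((5000000 / 100000) ** 0.25)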
Author: Steven Johnson
Title: Where Good Ideas Come From
Publisher: Riverhead
Date: Copyright 2010 by Steven Johnson
Pages: 8-11
-------------------------------------------------------------------------------------------------In today's encore excerpt - if you happen to work for a bureaucracy,
you'll need to know the subtleties of "officespeak":
"This section deals with the technical aspects of officespeak, such as
passive voice, circular reasoning, and rhetorical questions. These are
the nuts and bolts of the Rube Goldberg contraption that is the
language of the office. Obscurity, vagueness, and a noncommittal
stance on everything define the essence of officespeak. No one wants
to come out and say what they really think. It is much safer for the
company and those up top to constantly cloak their language in order
to hide how much they do know or, just as often, how much they don't
know. ...
Passive voice: The bread and butter of press releases and official
statements. For those who have forgotten their basic grammar, a
sentence in the passive voice does not have an active verb. Thus, no
one can take the blame for 'doing' something, since nothing,
grammatically speaking, has been done by anybody. Using the passive
voice takes the emphasis off yourself (or the company). Here [is an]
example of how the passive voice can render any situation
guiltless:
'Five hundred employees were laid off.' (Not 'The company laid off
five hundred employees,' or even worse, 'I laid off five hundred
employees.' These layoffs occurred in a netherworld of displaced
blame, in which the company and the individual are miraculously absent
from the picture.) ...
Circular reasoning: Another favorite when it comes time to deliver bad
news. In circular reasoning, a problem is posited and a reason is
given. Except that the reason is basically just a rewording of the
problem. Pretty nifty. Here are some examples to better explain the
examples:
'Our profits are down because of [a decrease in revenues].'

'People were laid off because there was a surplus of workers.' ...
Rhetorical questions: The questions that ask for no answers. So why
even ask the question? Because it makes it seem as though the listener
is participating in a true dialogue. When your boss asks, 'Who's
staying late tonight?' you know he really means, 'Anyone who wants to
keep their job will work late.' Still, there's that split second when
you think you have a say in the matter, when you believe your opinion
counts. Only to be reminded, yet again, that no one cares what you
think. ...
Hollow statements: The second cousin of circular reasoning. Hollow
statements make it seem as though something positive is happening
(such as better profits or increased market share), but they lack any
proof to support the claim.
'Our company is performing better than it looks.'
'Once productivity increases, so will profits.' ...
They and them: Pronouns used to refer to the high-level management
that no one has ever met, only heard whispers about. 'They' are
faceless and often nameless. And their decisions render those beneath
them impotent to change anything. 'They' fire people, 'they' freeze
wages, 'they' make your life a living hell. It's not your boss who is
responsible - he would love to reverse all these directives if he
could. But you see, his hands are tied.
'I'd love to give you that raise, you know I would. But they're the
ones in charge.'
'Okay, gang, bad news, no more cargo shorts allowed. Hey, I love the
casual look, but they hate it.' ...
Obfuscation: A tendency to obscure, darken, or stupefy. The primary
goal of the above techniques is, in the end, obfuscation. Whether it's
by means of the methods outlined above or by injecting jargon-heavy
phrases into sentences, corporations want to make their motives and
actions as difficult to comprehend as possible."
Author: D.W. Martin
Title: Officespeak
Publisher: Simon Spotlight
Date: Copyright 2005 by David Martin
Pages: 11-20
-------------------------------------------------------------------------------------------------In today's excerpt - twenty-eight year old Abraham Lincoln's speech
to the Young Men's Lyceum of Springfield, Illinois. Titled "The
Perpetuation of our Political Institutions," Lincoln's 1838 comments
addressed the rampant lynchings that followed the Emancipation Act of
1833, and his belief that America's greatest dangers came not from
abroad but from within:
"We find ourselves in the peaceful possession of the fairest portion
of the earth, as regards extent of territory, fertility of soil, and
salubrity of climate. ... At what point shall we expect the approach
of danger? By what means shall we fortify against it? Shall we
expect some transatlantic military giant, to step the Ocean, and crush
us at a blow? Never! All the armies of Europe, Asia and Africa
combined, with all the treasure of the earth (our own excepted) in
their military chest; with a Buonaparte for a commander, could not by
force, take a drink from the Ohio, or make a track on the Blue Ridge,
in a trial of a thousand years.
"At what point then is the approach of danger to be expected? I
answer, if it ever reach us, it must spring up amongst us. It cannot
come from abroad. If destruction be our lot, we must ourselves be its
author and finisher. ...
"I hope I am over wary; but if I am not, there is, even now, something
of ill-omen amongst us. I mean the increasing disregard for law which
pervades the country; the growing disposition to substitute the wild
and furious passions, in lieu of the sober judgment of Courts; and the
worse than savage mobs, for the executive minister of justice. ...
Accounts of outrages committed by mobs form the everyday news of the
times. ...
"When men take it in their heads to day, to hang gamblers, or burn
murderers, they should recollect, that, in the confusion usually
attending such transactions, they will be as likely to hang or burn
someone, who is neither a gambler nor a murderer as one who is; and
that, acting upon the example they set, the mob of to-morrow, may, and
probably will, hang or burn some of them, by the very same mistake.
And not only so; the innocent, those who have ever set their faces
against violation of law in every shape, alike with the guilty, fall
victims to the ravages of mob law; and thus it goes on, step by step,
till all the walls erected for the defense of the persons and property
of individuals, are trodden down, and disregarded. But all this even,
is not the full extent of the evil. By such examples, by instances of
the perpetrators of such acts going unpunished, the lawless in spirit,
are encouraged to become lawless in practice; and having been used to
no restraint, but dread of punishment, they thus become, absolutely
unrestrained. ... Thus, then, by the operation of this mobocratic
spirit, which all must admit is now abroad in the land, the strongest
bulwark of any Government, and particularly of those constituted like
ours, may effectually be broken down and destroyed ... [and] this
Government cannot last. ...
"The question recurs, 'how shall we fortify against it?' The answer is

simple. Let every American, every lover of liberty, every well wisher
to his posterity, swear by the blood of the Revolution, never to
violate in the least particular, the laws of the country; and never to
tolerate their violation by others. As the patriots of seventy-six
did to the support of the Declaration of Independence, so to the
support of the Constitution and Laws, let every man remember that to
violate the law, is to trample on the blood of his father, and to tear
the charter of his own, and his children's liberty. ... In short,
let it become the political religion of the nation; and let the old
and the young, the rich and the poor, the brave and the gay, of all
sexes and tongues, and colors and conditions, sacrifice unceasingly
upon its altars.
"The scenes of the revolution are not now or ever will be entirely
forgotten; but that like everything else, they must fade upon the
memory of the world, and grow more and more dim by the lapse of
time. ... They were the pillars of the temple of liberty; and now,
that they have crumbled away, that temple must fall, unless we, their
descendants, supply their places with other pillars, hewn from the
solid quarry of sober reason. Passion has helped us; but can do so no
more. It will in future be our enemy. Reason, cold, calculating,
unimpassioned reason must furnish all the materials for our future
support and defense. Let those materials be molded into general
intelligence, sound morality and, in particular, a reverence for the
constitution and laws. ...
"Upon these let the proud fabric of freedom rest, as the rock of its
basis; and as truly as has been said of the only greater institution,
'the gates of hell shall not prevail against it.' "
Author: Abraham Lincoln
Title: "The Perpetuation of our Political Institutions"
Date: January 1838
-------------------------------------------------------------------------------------------------In today's excerpt - we all know the causes of the American
Revolutionary War and the American Civil War; they are enshrined in
our national consciousness: the Revolutionary War was fought to end
taxation without representation, and the Civil War was fought to end
slavery (or, if you have Southern sympathies, it was fought over the
issue of states' rights). These reasons, though true in part, may be
insufficient to fully capture the causes of these wars. The colonists
had lower tax rates than their English brethren, more independence on
many matters, and an equally high standard of living. Parliament
removed taxes from the colonists quickly after it imposed them, and
left a token tax in place as a face-saving maneuver. And in the 1850s,
the abolitionist movement was a tiny fraction of the U.S. population.

Historians Fred Anderson and Andrew Cayton point out that those two
wars both started roughly twelve years after the acquisition of
control over vast new areas of land, and thus put in play huge
potential shifts in the balance of power within the newly controlling
governments. The Revolutionary War started shortly after the British
and their colonists wrested control of the remainder of the continent
east of the Mississippi away from the French in the French and Indian
War. The Civil War started shortly after the U.S. wrested half of
Mexico's territory away from it in the Mexican-American War, and thus
gained effective control of the continent west of the Mississippi:
"Unlike the three previous wars between Britain and France, the vast
conflict known in Europe as the Seven Years' War (1756-63; its North
American phase, 1754-60, is sometimes called the French and Indian
War) ended in a decisive victory, as a result of which the North
American empire of France ceased to exist and Spain (France's ally in
the final year of the war) was compelled to surrender its imperial
claims east of the Mississippi River. This left Britain (in theory at
least) the proprietor of the eastern half of North America. ...
"The victorious British ... so alienated their colonists by attempted
reforms that just a dozen years after the Peace of Paris that ended
the Seven Years' War, the thirteen North American colonies took up arms
against the empire. In their efforts to mount resistance to a
sovereign king in Parliament in the decade before war broke out,
colonial leaders used arguments that stressed what had usually been
called the rights of Englishmen, stressing the centrality of political
freedom and the protection of property and other rights. Because the
colonists were a chronically divided lot, however, the leaders of the
resistance movement took care to couch their explanations and appeals
in universalistic language: as defenses of natural rights, not merely
the liberties of Englishmen.
"The War for American Independence (1775-83) shattered the British
empire and made those universalized ideas the foundation of American
political identity. It took another dozen years after the end of the
war in 1783, however, to produce the complex of agreements and
understandings we call the Revolutionary Settlement. ...
"Great Britain and the United States ceased to compete militarily after
1815, leaving Mexico, which declared its independence from Spain in
1821, as the last remaining obstacle to the dominion of the United
States in North America. ... The Mexican leaders' fears of revolution
and racial war, along with the rich geographic diversity of their
nation, inhibited the emergence of an American-style revolutionary
settlement and created a fertile field for caudillos, violence, and
local rebellions. One of the latter, on the remote northeastern fringe
of Mexico, created the Republic of Texas in 1836. A decade later, the
United States annexed Texas, provoking a war with Mexico in 1846.
Within two years American soldiers overwhelmed Mexican resistance,
seized the national capital, and forced a peace, the Treaty of
Guadalupe Hidalgo (1848), that deprived Mexico of fully half its
territory.
"As in the aftermath of the Seven Years' War, the accession of vast
amounts of territory created a furious debate that shredded the
political fabric of the victorious empire. Then it had taken twelve
years for the imperial community to collapse in civil war; it now took
thirteen. Adding the lands from the Rockies to the Pacific coast to
what Americans thought of as the empire of liberty made the question
of slavery's expansion into the conquests inescapable. The
Revolutionary Settlement broke down as Northern and Southern Americans
came to see each other as potential tyrants intent on subjugation.
Thus in April 1861, Southerners and Northerners went to war to make
the American empire safe for their own, mutually exclusive, notions of
liberty, convinced that no alternative remained but an appeal to the
god of battles."
Author: Fred Anderson and Andrew Cayton
Title: The Dominion of War
Publisher: Penguin
Date: Copyright 2005 by Fred Anderson and Andrew Cayton
Pages: xvi-xix
-------------------------------------------------------------------------------------------------In today's excerpt - social Darwinism. The greatest U.S. economic
crisis prior to the Great Depression itself was "the Panic of 1873," a
depression that lasted from 1873 to 1879. It brought unprecedented
unemployment to the country, and unloosed a nationwide hysteria known
as "the Tramp Scare." Thousands unemployed "tramps" crossed the
country looking for work that wasn't to be found. Instead of reacting
with aid and compassion, cities and states passed harsh "anti-Tramp"
laws and labeled the unemployed as morally inferior. Conveniently,
Charles Darwin's brand-new theory of evolution was available and could be
freely adapted to the social world to provide justification for this
scorn:
"Impressed with what they took to be the hard, scientific fact of
natural selection, many prominent American intellectuals, politicians,
and businessmen followed [leading intellectual] Herbert Spencer in
wanting to extend Charles Darwin's insights on nature to society. To
those who enjoyed the benefits of American prosperity, unrestrained
capitalism appeared a law of nature, and one that should be obeyed by
all and not altered, as to do so would undermine social progress.
Daniel S. Gregory's popular Christian Ethics argued that 'The Moral
Governor has placed the power of acquisitiveness in man for a good and
noble purpose,' so that interfering with greed was actually a sin. Not
surprisingly, the rich and their acolytes crafted an ideology from
this perception that equated wealth with morality and poverty with a
defective character. No one gave voice to this belief system better
than the Reverend Henry Ward Beecher, the most famous - and highly
paid - minister in America: 'The general truth will stand, that no man
in this land suffers from poverty unless it be more than his fault -
unless it be his sin.'
"Though it had its origins in England with Spencer's writings, social
Darwinism became an obsession among educated Americans in the late
1870s. While scholars place the first use of the phrase 'social
Darwinism' in Europe in 1879, it is telling that the phrase actually
first appears in the public press in the United States in 1877 - and
then in the context of the tramp menace. The Nation converted to
social Darwinism in 1877, its editor, E.L. Godkin, declaring that
nothing of value 'is not the result of successful strife.' Those who
are successful in life deserve their wealth, while trying to lift up
the weak undermines this natural struggle and thus social
progress. ...
"So popular had evolutionary theory become in 1877 that The
Congregationalist complained that too many 'preachers seem to think it
their duty to give their congregations dilutions of John Tyndall and
Thomas Huxley and Herbert Spencer,' the leading promoters of Darwin's
work. They noted with concern that Harvard students were now expected
to read Spencer. Later in the year, Harvard's professor John McCrary,
who held the chair in geology, resigned in opposition to this cult of
Herbert Spencer. But his was a lonely voice, as readings of Spencer
became common at high school exercises throughout the country, even in
Milwaukee. Despite their rejection of evolution, most Protestant
ministers and intellectuals were entranced by social Darwinism. The
Reverend William A. Halliday used Darwin to point out that progress is
certain, but that not everyone advances together; 'the survival of the
fittest is nothing the unfit can cheer about.' ...
"[Yale professor] William Graham Sumner found in Spencer scientific
justification for his extreme version of laissez-faire. Sumner could
thus claim it was a fact, 'fixed in the order of the universe,' that
government intervention threatened to disrupt the workings of natural
selection - from the eight-hour day to public education, protective
tariffs to the post office, they all thwarted progress. Appearing
before the House of Representatives, Sumner was asked, 'Professor,
don't you believe in any government aid to industries?' To which he
emphatically replied, 'No! It's root, hog, or die.' ... Meanwhile,
Henry Ward Beecher turned to Spencer to argue that economic success is
evidence of the working of both God's will and natural selection.
Given that double authority, no one should attempt to ameliorate
economic inequality. Science proved God's will in making certain that
'the poor will be with you always.' "
Author: Michael A. Bellesiles
Title: 1877
Publisher: The New Press
Date: Copyright 2010 by Michael A. Bellesiles
Pages: 127-129
-------------------------------------------------------------------------------------------------In today's excerpt - American culture in the late 1700s and early
1800s was more homogeneous than in European countries. In England,
France and Germany, villagers in one part of the country could not
understand the dialect of those in another part (a condition that was
still true in China in the early 1900s), and the upper class spoke
differently from them all. Americans, however, all spoke a mutually
understandable dialect, and also had fewer religious customs, which
created a more uniform culture, with all the attendant advantages in
commerce and governance:
"Americans thought that they were less superstitious and more rational
than the peoples of Europe. They had actually carried out religious
reforms that European liberals could only dream about. Early Americans
were convinced that their Revolution, in the words of the New York
constitution of 1777, had been designed to end the 'spiritual
oppression and intolerance wherewith the bigotry and ambition of weak
and wicked priests' had 'scourged mankind.' Not only had Americans
achieved true religious liberty, not just the toleration that the
English made so much of, but their blending of the various European
religions and nationalities had made their society much more
homogeneous than those of the Old World.
"The European migrants had been unable to bring all of their various
regional and local cultures with them, and re-creating and sustaining
many of the peculiar customs, craft holidays, and primitive practices
of the Old World proved difficult. Consequently, morris dances,
charivaries, skimmingtons, and other folk practices were much less
common in America than in Britain or Europe. The New England Puritans,
moreover, had banned many of these popular festivals and customs,
including Christmas, and elsewhere the mixing and settling of
different peoples had worn most of them away. ... Since enlightened
elites everywhere in the Western world regarded these plebeian customs
and holidays as remnants of superstition and barbarism, their relative
absence in America was seen as an additional sign of the New World's
precocious enlightenment.
"America had a common language, unlike the European nations, none of
which was linguistically homogeneous. In 1789 the majority of
Frenchmen did not speak French but were divided by a variety of
provincial patois. Englishmen from Yorkshire were incomprehensible to
those from Cornwall and vice versa. By contrast, Americans could
understand one another from Maine to Georgia. It was very obvious why
this should be so, said John Witherspoon, president of Princeton.
Since Americans were 'much more unsettled, and move frequently from
place to place, they are not as liable to local peculiarities, either
in accent or phraseology.' With the Revolution some Americans wished
to carry this uniformity further. They wanted their language 'purged
of its barbaric dross' and made 'as pure, simple, and systematic as
our politics.' It was bound to happen in any case. Republics, said
John Adams, had always attained a greater 'purity, copiousness, and
perfection of language than other forms of government.'
"Americans expected the development of an American English that would
be different from the English of the former mother country, a language
that would reflect the peculiar character of the American people. Noah
Webster, who would eventually become famous for his American
dictionary, thought that language had divided the English people from
one another. The court and the upper ranks of the aristocracy set the
standards of usage and thus put themselves at odds with the language
spoken by the rest of the country. By contrast, America's standard was
fixed by the general practice of the nation, and therefore Americans
had 'the fairest opportunity of establishing a national language, and
of giving it uniformity and perspicuity, in North America, that ever
presented itself to mankind.' Indeed, Webster was convinced that
Americans already 'speak the most pure English now known in the
world.' Within a century and a half, he predicted, North America would
be peopled with a hundred millions of people, 'all speaking the same
language.' Nowhere else in the world would such large numbers of
people 'be able to associate and converse together like children of
the same family.'
"Others had even more grandiose visions for the spread of America's
language. John Adams was among those who suggested that American
English would eventually become 'the next universal language.' In 1789
even a French official agreed; in a moment of giddiness he actually
predicted that American English was destined to replace diplomatic
French as the language of the world. Americans, he said, 'tempered by
misfortune,' were 'more human, more generous, more tolerant, all
qualities that make one want to share the opinions, adopt the customs,
and speak the language of such a people.' "
Author: Gordon S. Wood
Title: Empire of Liberty
Publisher: Oxford
Date: Copyright 2009 by Oxford University Press, Inc.
Pages: 47-49
-------------------------------------------------------------------------------------------------
In today's excerpt - the invention, and reinvention, of incubators for
newborns:

"Sometime in the late 1870s, a Parisian obstetrician named Stephane


Tarnier took a day off from his work at Maternite de Paris, the lyingin hospital for the city's poor women, and paid a visit to the nearby
Paris Zoo. Wandering past the elephants and reptiles and classical
gardens of the zoo's home inside the Jardin des Plantes, Tarnier
stumbled across an exhibit of chicken incubators. Seeing the
hatchlings totter about in the incubator's warm enclosure triggered an
association in his head, and before long he had hired Odile Martin,
the zoo's poultry raiser, to construct a device that would perform a
similar function for human newborns. By modern standards, infant
mortality was staggeringly high in the late nineteenth century, even
in a city as sophisticated as Paris. One in five babies died before
learning to crawl, and the odds were far worse for premature babies
born with low birth weights. Tarnier knew that temperature regulation
was critical for keeping these infants alive, and he knew that the
French medical establishment had a deep-seated obsession with
statistics. And so as soon as his newborn incubator had been installed
at Maternite, the fragile infants warmed by hot water bottles below
the wooden boxes, Tarnier embarked on a quick study of five hundred
babies. The results shocked the Parisian medical establishment: while
66 percent of low-weight babies died within weeks of birth, only 38
percent died if they were housed in Tarnier's incubating box. You
could effectively halve the mortality rate for premature babies simply
by treating them like hatchlings in a zoo. ...
"Modern incubators, supplemented with high-oxygen therapy and other
advances, became standard equipment in all American hospitals after
the end of World War II, triggering a spectacular 75 percent decline
in infant mortality rates between 1950 and 1998. ...
"In the developing world, however, the infant mortality story remains
bleak. Whereas infant deaths are below ten per thousand births
throughout Europe and the United States, over a hundred infants die
per thousand in countries like Liberia and Ethiopia, many of them
premature babies that would have survived with access to incubators.
But modern incubators are complex, expensive things. A standard
incubator in an American hospital might cost more than $40,000. But
the expense is arguably the smaller hurdle to overcome. Complex
equipment breaks, and when it breaks you need the technical expertise
to fix it, and you need replacement parts. In the year that followed
the 2004 Indian Ocean tsunami, the Indonesian city of Meulaboh
received eight incubators from a range of international relief
organizations. By late 2008, when an MIT professor named Timothy
Prestero visited the hospital, all eight were out of order, the
victims of power surges and tropical humidity, along with the hospital
staff's inability to read the English repair manual. ...
"Prestero and his team decided to build an incubator out of parts that
were already abundant in the developing world. The idea had originated

with a Boston doctor named Jonathan Rosen, who had observed that even
the smaller towns of the developing world seemed to be able to keep
automobiles in working order. The towns might have lacked air
conditioning and laptops and cable television, but they managed to
keep their Toyota 4Runners on the road. So Rosen approached Prestero
with an idea: What if you made an incubator out of automobile parts?
"Three years after Rosen suggested the idea, the team introduced a
prototype device called the NeoNurture. From the outside, it looked
like a streamlined modern incubator, but its guts were automotive.
Sealed-beam headlights supplied the crucial warmth; dashboard fans
provided filtered air circulation; door chimes sounded alarms. You
could power the device via an adapted cigarette lighter, or a
standard-issue motorcycle battery. Building the NeoNurture out of car
parts was doubly efficient, because it tapped both the local supply of
parts themselves and the local knowledge of automobile repair. These
were both abundant resources in the developing world context, as Rosen
liked to say. You didn't have to be a trained medical technician to
fix the NeoNurture; you didn't even have to read the manual. You just
needed to know how to replace a broken headlight."
Author: Steven Johnson
Title: Where Good Ideas Come From
Publisher: Riverhead
Date: Copyright 2010 by Steven Johnson
Pages: 25-28
-------------------------------------------------------------------------------------------------
In today's excerpt - when declining temperatures created a climate
crisis in Mongolia, it started Chinggis (Genghis) Khan and his heirs
on campaigns of conquest that ultimately grew to include most of
China, Central Asia, and Eastern Europe - the largest contiguous
land-based empire in history:
"In the late twelfth century [the Mongolian steppe] region was facing
a subsistence crisis because a drop in the mean annual temperature had
reduced the supply of grass for grazing animals. The man who saved the
situation by gaining access to the bounty of the agricultural world
for them was Chinggis (Genghis, c.1162-1227).
"A brilliant and utterly ruthless military genius, Chinggis proudly
asserted that there was no greater joy than massacring one's enemies,
seizing their horses and cattle, and ravishing their women. His career
as a military leader began when he avenged the death of his father, a
tribal chieftain who had been murdered when Chinggis was still a boy.
As he subdued the Tartars, Kereyid, Naiman, Merkid, and other Mongol
and Turkic tribes, Chinggis built up an army of loyal followers. In
1206 the most prominent Mongol nobles gathered at an assembly to name
him their overlord, or great khan.
"He then fully militarized Mongol society, ignoring traditional tribal
affiliations to form an army based on a decimal hierarchy, with 1,000
horsemen in the basic unit. A new military nobility was thus created
of commanders loyal to Chinggis. They could pass their posts to their
sons, but the great khan could remove any commander at will. Chinggis
also created an elite bodyguard of 10,000 sons and brothers of
commanders, which served directly under him. To reduce internal
disorder, he issued simple but draconian laws; the penalty for robbery
and adultery, for instance, was death. He ordered the Uighur script to
be adopted for writing Mongol, seeing the utility of written records
even though he was illiterate himself.
"His organization in place, Chinggis initiated one of world history's
most astonishing campaigns of conquest. He began by subjugating nearby
states. First he would send envoys to demand submission and threaten
destruction. Those who submitted without fighting were treated as
allies and left in power, but those who put up a fight faced the
prospect of total destruction. City-dwellers in particular evoked his
wrath and were often slaughtered en masse or used as human shields in
the next battle. In the Mongol armies' first sweep across the north
China plain in 1212-13, they left ninety-odd cities in rubble. When
they sacked the Jurchen's northern capital at Beijing in 1215, it
burned for more than a month.
"Chinggis's battle-hardened troops were capable of enduring great
privation and crossing vast distances at amazing speed. In 1219 he led
200,000 troops into Central Asia, where the following year they sacked
Bukhara and Samarkand. Before his death in 1227, Chinggis ... ruled
from the Pacific Ocean on the east to the Caspian Sea on the west.
"Chinggis's death created a crisis due to the Mongol tradition of
succession by election rather than descent. In the end the empire was
divided into four sections, each to be governed by one of the lines of
his descendants. Ogodei, Chinggis's third son, got control of
Mongolia. In 1234 he crushed the Jin and became ruler of north China.
By 1236 he had taken all but four of the fifty-eight districts in
Sichuan, previously held by the Song, and had ordered the total
slaughter of the one million plus residents of the city of Chengdu, a
city the Mongols had taken easily with little fighting. Even where
people were not slaughtered, they were frequently seized as booty
along with their grain stores and livestock. Ogodei's troops also
participated in the western campaigns begun in 1237. Representatives
of all four lines ... campaigned into Europe in 1237, taking Moscow
and Kiev in 1238 and striking into Poland and Hungary in 1241 and
1242. Although they looted cities in central Europe on these
campaigns, the Mongols soon retreated to Russia, which they dominated
for over a century."
Author: Patricia Buckley Ebrey
Title: China
Publisher: Cambridge
Date: Copyright 1996 by Cambridge University Press
Pages: 169-171
-------------------------------------------------------------------------------------------------
In today's excerpt - the myth of the mass panic. In disasters, rather
than descending into disorder and a helpless state, people come
together and give one another strength:
"The image of the panicked deeply ingrained in the popular
imagination. Hardly any self-respecting Hollywood disaster movie would
be complete without one scene of people running wildly in all
directions and screaming hysterically. Television newscasters
perpetuate this stereotype with reports that show shoppers competing
for items in what is described as 'panic buying' and traders
gesticulating frantically as 'panic' sweeps through the stock market.
"The idea of mass panic shapes how we plan for, and respond to,
emergency events. In Pennsylvania, for example, the very term is
inscribed in safety regulations known as the state's Fire and Panic
Code. Many public officials assume that ordinary people will become
highly emotional in an emergency, especially in a crowded situation,
and that providing information about the true nature of the danger is
likely to make individuals panic even more. Emergency management plans
and policies often intentionally conceal information: for example,
event marshals may be instructed to inform one another of a fire using
code words, to prevent people from overhearing the news - and
overreacting.
"Mathematicians and engineers who model 'crowd dynamics' often rely on
similar assumptions describing behaviors such as 'herding,' 'flocking'
and, of course, 'panic.' As the late Jonathan Sime (an environmental
psychologist formerly at the University of Surrey in England) pointed
out, efforts to 'design out disaster' have typically treated people as
unthinking or instinctive rather than as rational, social beings.
Therefore, more emphasis is placed on the width of doorways than on
communication technologies that might help people make informed
decisions about their own safety.
"These ideas about crowd behavior permeate the academic world, too.
For many years influential psychology textbooks have illustrated mass
panic by citing supposed examples such as the Iroquois Theater fire of
1903 in Chicago in which some 600 people perished and the Cocoanut
Grove Theater fire of 1942 in Boston in which 492 people died. In the
textbook explanations, theatergoers burned to death as a result of
their foolish overreaction to danger. But Jerome M. Chertkoff and
Russell H. Kushigian of Indiana University, the first social
psychologists to analyze the Cocoanut Grove fire in depth, found that
the nightclub managers had jeopardized public safety in ways that are
shocking today. In a 1999 book on the psychology of emergency egress
and ingress, Chertkoff and Kushigian concluded that physical
obstructions, not mass panic, were responsible for the loss of life in
the infamous fire.
"A more recent example tells a similar story. Kathleen Tierney and her
co-workers at the University of Colorado at Boulder investigated
accusations of panicking, criminality, brutality and mayhem in the
aftermath of Hurricane Katrina. They concluded that these tales were
'disaster myths.' What was branded as 'looting' was actually
collective survival behavior: people took food for their families and
neighbors when store payment systems were not working and rescue
services were nowhere in sight. In fact, the population showed a
surprising ability to self-organize in the absence of authorities,
according to Tierney and her colleagues.
"Such work builds on earlier research by two innovative sociologists
in the 1950s. Enrico Quarantelli - who founded the Disaster Research
Center at Ohio State University in 1963 and later moved with it to the
University of Delaware - examined many instances of emergency
evacuations and concluded that people often flee from dangerous events
such as fires and bombings, because usually that is the sensible thing
to do. A fleeing crowd is not necessarily a panicked, irrational
crowd.
"The second pioneering sociologist, Charles Fritz, was influenced by
his experiences as a soldier in the U.K. during the World War II
bombings known as the Blitz. 'The Blitz spirit' has become a cliché
for communities pulling together in times of adversity. In the 1950s,
as a researcher at the University of Chicago, Fritz made a
comprehensive inventory of 144 peacetime disaster studies that
confirmed the truth of the cliché. He concluded that rather than
descending into disorder and a helpless state, human beings in
disasters come together and give one another strength. Our research
suggests that if there is such a thing as panic, it probably better
describes the fear and helplessness of lone individuals than the
responses of a crowd in the midst of an emergency."
Author: John Drury and Stephen D. Reicher
Title: "Crowd Control"
Publisher: Scientific American Mind
Date: November/December 2010
Pages: 60-61
--------------------------------------------------------------------------------------------------

In today's excerpt - until 1871, the land that is now Germany had been
a loose affiliation of small, sovereign states that had emerged from
the Holy Roman Empire, and only became a single, unified country after
the Franco-Prussian War of 1870. In the nationalistic fervor that
followed, it became the second-leading industrial power in the world,
and sought to build up its capital, Berlin, to world-class status.
Young Max Planck arrived in Berlin in 1877 to pursue the re-emerging
discipline of physics, and soon became one of history's scientific
giants, founding quantum theory and paving the way for Bohr and
Einstein:
"In October 1874, aged sixteen, Max Planck enrolled at Munich
University and opted to study physics because of a burgeoning desire
to understand the workings of nature. In contrast to the
near-militaristic regime of the Gymnasiums (high schools), German
universities allowed their students almost total freedom. With hardly
any academic supervision and no fixed requirements, it was a system
that enabled students to move from one university to another, taking
courses as they pleased. Sooner or later those wishing to pursue an
academic career took the courses given by the pre-eminent professors at the
most prestigious universities. After three years at Munich, where he
was told 'it is hardly worth entering physics anymore' because there
was nothing important left to discover, Planck moved to the leading
university in the German-speaking world, Berlin.
"With the creation of a unified Germany in the wake of the Prussianled victory over France in the war of 1870-71, Berlin became the
capital of a mighty new European nation. Situated at the confluence of
the Havel and the Spree rivers, French war reparations allowed its
rapid redevelopment as it sought to make itself the equal of London
and Paris. A population of 865,000 in 1871 swelled to nearly 2 million
by 1900, making Berlin the third-largest city in Europe. Among the new
arrivals were Jews fleeing persecution in Eastern Europe, especially
the pogroms in Tsarist Russia. Inevitably the cost of housing and
living soared, leaving many homeless and destitute. Manufacturers of
cardboard boxes advertised 'good and cheap boxes for habitation' as
shanty towns sprang up in parts of the city.
"Despite the bleak reality that many found on arriving in Berlin,
Germany was entering a period of unprecedented industrial growth,
technological progress, and economic prosperity. Driven largely by the
abolition of internal tariffs after unification and French war
compensation, by the outbreak of the First World War Germany's
industrial output and economic power would be second only to the
United States. By then it was producing over two-thirds of continental
Europe's steel, half its coal, and was generating more electricity
than Britain, France and Italy combined. Even the recession and
anxiety that affected Europe after the stock market crash of 1873 only
slowed the pace of German development for a few years.

"With unification came the desire to ensure that Berlin, the epitome
of the new Reich, had a university second to none. Germany's most
renowned physicist, Hermann von Helmholtz, was enticed from Heidelberg.
A trained surgeon, Helmholtz was also a celebrated physiologist who
had made fundamental contributions to understanding the workings of
the human eye after his invention of the ophthalmoscope. The
50-year-old polymath knew his worth. Apart from a salary several times the
norm, Helmholtz demanded a magnificent new physics institute. It was
still being built in 1877 when Planck arrived in Berlin and began
attending lectures in the university's main building, a former palace
on Unter den Linden opposite the Opera House."
Author: Manjit Kumar
Title: Quantum
Publisher: Norton
Date: Copyright 2008 by Manjit Kumar
Pages: 8-9
-------------------------------------------------------------------------------------------------
In today's excerpt - taxes reached all-time highs during World War II,
and the spirit of sacrifice for a great cause made that politically
possible. But after the war, some politicians were eager to rescind
those taxes - and the ageless tax war between Republicans and
Democrats was rejoined with the usual insults and acrimony. Then came
the Korean War:
"After World War II, a bipartisan consensus had quickly emerged that
the government no longer needed the high level of revenues it had
required to sustain the war effort. As after World War I, Congress
repealed the excess profits tax as well as a number of excise taxes
and lowered the levy on personal income.
"Republicans, who had won control of Congress in 1946 on a platform
promising tax cuts, proceeded with fervor to deliver on their
promises, in part to stimulate the economy, in part to reward their
wealthy supporters, and in part to deprive Truman of the funds to
carry out his ambitious Fair Deal social reforms. Robert Lee
Doughton's Republican successor as chairman of the Ways and Means
Committee, Harold Knutson of Minnesota, explained their perspective:
'For years we Republicans have been warning that short-haired women
and long-haired men of alien minds in the administrative branch of
government were trying to wreck the American way of life and install a
hybrid oligarchy at Washington through confiscatory taxation.' Now,
Knutson believed, it was the Republicans' chance to end the
'oligarchy,' and he proposed legislation to do so.
"However, the Republican bill ran into stiff resistance from the White
House. Truman vetoed it on grounds that the tax cuts would increase

the deficit, add to inflationary pressures, and provide


disproportionate benefits to the wealthy. 'The time for tax
reduction,' the president asserted, 'will come when inflationary
pressures have ceased.' The motion to override narrowly failed in the
House.
"New tax legislation, drafted in early 1948, found broader support,
including among Democrats. It provided greater benefits to
lower-income Americans than did Knutson's 1947 proposal, reducing individual
taxes, increasing the personal exemption from $500 to $600, permitting
married couples to split their income for tax purposes, and providing
benefits for individuals over sixty-five years of age. Truman vetoed
that bill, too, but was overridden - the second time in U.S. history
this had occurred on a revenue bill.
"Truman's surprise victory over New York's governor, Thomas Dewey, in
the 1948 election was accompanied by a return of Democratic majorities
to the House and Senate. Yet this did not guarantee smooth sailing for
the president with Capitol Hill on tax issues. No major tax
legislation was passed in 1949, and in 1950 the president was
concerned about insufficient revenues. In his January 1950 State of
the Union address, he asked for higher taxes, reporting that 'more
than 70 percent of the government's expenditures are required to meet
the costs of past wars and to work for peace.'
"At the same time, he noted, the government had to 'make substantial
expenditures which are necessary to the growth and expansion of the
domestic economy.' He lamented that 'largely because of the
ill-considered tax reduction of the 80th Congress [the Republican Congress
that had overridden his veto] the Government is not receiving enough
revenue to meet its necessary expenditures.' To address the shortfall,
he announced that the administration intended to hold down spending
and submit legislation that would 'yield a moderate amount of
additional revenue.' During the summer of 1950, Truman pressed
Congress to raise the income tax, but it was an election year and
there was little support. On the contrary, leading legislators were
hoping to lower excise taxes and enlarge tax breaks to enhance their
chances of victory.
"Congress was in the midst of considering tax legislation when the
Korean War broke out."
Author: Robert D. Hormats
Title: The Price of Liberty
Publisher: Times Books
Date: Copyright 2007 by Robert D. Hormats
Pages: 187-188
-------------------------------------------------------------------------------------------------
In today's excerpt - England begins subjugating Ireland in the early
1600s. England, having recently broken away from the Catholic Church,
feared that Catholic Spain would use still-Catholic Ireland as a
stronghold for invading England - and therefore had incentive to
subjugate and "colonize" Ireland. And England could look to the new
European experiences in the New World for examples of how to colonize
and subjugate:
"Ironically and perhaps fatefully early English, conceptions of Indian
life and character became intertwined with the justification of
another colonizing venture. Ireland was nominally under English rule,
but effective control did not extend beyond the small district known
as 'the Pale' centered on Dublin. The rest of the island was home to
'the wild Irish' who were divided into loose collections of warlike
people with a common interest in defying the English. With the Spanish
seemingly set on ruling the world, England awakened to the danger that
Catholic Spain might take over Catholic Ireland as a stronghold for
invading England. Subjugating the Irish became a way of forestalling
Spain. Elizabeth began by parceling out the country to her favorites,
[Sir Walter] Ralegh among them. These English overlords could either
tame their wild Irish tenants or supplant them with a more productive
and tractable population. It was the same problem that Ralegh faced at
Roanoke and the Virginia Company would face at Jamestown, not to say
the problem the United States would face in its long march across
North America.
"[To the English,] the Irish shared with American Indians a profound
deficiency that required correction if they were to make proper
subjects: they were not civil. That word carried hidden meanings and
connotations that would reverberate throughout American history.
Civility was a way of life not easily defined but its results were
visible: substantial housing and ample clothing. Uncivil peoples were
naked and nomadic. Civility required of those who deserved the name a
sustained effort [both] physical and intellectual. It did not require
belief in Christianity - for the ancient Greeks and Romans had it; but
Christianity or at least Protestant Christianity was impossible
without it. The Irish Catholics and those Indians converted by Spanish
or French missionaries were not in the English view either civil or
Christian. The objective of colonization was to bring civility and
Christianity to the uncivil, in that order.
"The objective was threatened, indeed civility itself was threatened,
if lazy colonists coveting the unfettered life of the uncivil went
native - or it might be said went naked. 'Clothes were of tremendous
importance ... because one's whole identity was bound up in the
self-presentation of dress. The Scots and Irish - and soon the American
Indians - could not be civil unless they dressed in English clothes
like civilized people and cut their long hair,' signs of a capacity to
submit to the enlightened government of their superiors.

"England's preferred way of civilizing the Irish was through force of


arms, but after ruthless military expeditions failed to bring
widespread peace and with it civility, the new solution was to plant
the country with people who already rejoiced in that condition.
Refractory natives would learn by example or simply give way, left to a
wretched existence on the margins of a profoundly transformed Ireland.
Not long before the Virginia Company began supplying people to
Jamestown, for much the same purpose the English authorities began
settling far larger numbers across the Irish Sea - an estimated
100,000 by 1641."
Author: Edmund S. Morgan and Marie Morgan
Title: "Our Shaky Beginnings"
Publisher: The New York Review of Books
Date: April 26, 2007
Pages: 21-22
-------------------------------------------------------------------------------------------------
In today's excerpt - unintended consequences and forest fires:
"The U.S. Forest Service in the first decade of the 1900s adopted a
policy of fire suppression (attempting to put out forest fires) for
the obvious reasons that it didn't want valuable timber to go up in
smoke, nor people's homes and lives to be threatened. The Forest
Service's announced goal became, 'Put out every forest fire by 10:00
A.M. on the morning after the day when it is first reported.'
Firefighters became much more successful at achieving that goal after
World War II, thanks to the availability of firefighting planes, an
expanded road system for sending in fire trucks, and improved
firefighting technology. For a few decades after World War II, the
annual acreage burnt decreased by 80 percent.
"That happy situation began to change in the 1980s, due to the
increasing frequency of large forest fires that were essentially
impossible to extinguish unless rain and low winds combined to help.
People began to realize that the U.S. federal government's fire
suppression policy was contributing to those big fires, and that
natural fires caused by lightning had previously played an important
role in maintaining forest structure. That natural role of fire varies
with altitude, tree species, and forest type. To take [Montana's]
Bitterroot low-altitude Ponderosa Pine forest as an example,
historical records, plus counts of annual tree rings and datable fire
scars on tree stumps, demonstrated that a Ponderosa Pine forest
experiences a lightning-lit fire about once a decade under natural
conditions (i.e., before fire suppression began around 1910 and became
effective after 1945). The mature Ponderosa trees have bark two inches
thick and are relatively resistant to fire, which instead burns out

the understory of fire-sensitive Douglas Fir seedlings that have grown


up since the last fire. But after only a decade's growth until the
next fire, those seedlings are still too low for fire to spread from
them into the crowns. Hence the fire remains confined to the ground
and understory. As a result, many natural Ponderosa Pine forests have
a park-like appearance, with low fuel loads, big trees well spaced
apart, and a relatively clear understory.
"Of course, though, loggers concentrated on removing those big, old,
valuable, fire-resistant Ponderosa Pines, while fire suppression for
decades let the understory fill up with Douglas Fir saplings that
would in turn become valuable when full-grown. Tree densities
increased from 30 to 200 trees per acre, the forest's fuel load
increased by a factor of 6, and Congress repeatedly failed to
appropriate money to thin out the saplings. Another human-related
factor, sheep grazing in national forests, may also have played a
major role by reducing understory grasses that would otherwise have
fueled frequent low-intensity fires. When a fire finally does start
in a sapling-choked forest, whether due to lightning or human
carelessness or (regrettably often) intentional arson, the dense tall
saplings may become a ladder that allows the fire to jump into the
crowns. The outcome is sometimes an unstoppable inferno in which
flames shoot 400 feet into the air, leap from crown to crown across
wide gaps, reach temperatures of 2,000 degrees Fahrenheit, kill the
tree seed bank in the soil, and may be followed by mudslides and mass
erosion.
"Foresters now identify the biggest problem in managing western
forests as what to do with those increased fuel loads that built up
during the previous half-century of effective fire suppression. In the
wetter eastern U.S., dead trees rot away more quickly than in the
drier West, where more dead trees persist like giant matchsticks. In
an ideal world, the Forest Service would manage and restore the
forests, thin them out, and remove the dense understory by cutting or
by controlled small fires. But that would cost over a thousand dollars
per acre for the one hundred million acres of western U.S. forests, or
a total of about $100 billion. No politician or voter wants to spend
that kind of money. Even if the cost were lower, much of the public
would be suspicious of such a proposal as just an excuse for resuming
logging of their beautiful forest. Instead of a regular program of
expenditures for maintaining our western forests in a less
fire-susceptible condition, the federal government tolerates flammable
forests and is forced to spend money unpredictably whenever a
firefighting emergency arises: e.g., about $1.6 billion to fight the
summer 2000 forest fires that burned 10,000 square miles."
Author: Jared Diamond
Title: Collapse
Publisher: Penguin
Date: Copyright 2005 by Jared Diamond

Pages: 44-46
-------------------------------------------------------------------------------------------------
In today's excerpt - the many inventors of the telephone and the
importance of inventors:
"[In 1876], Alexander Bell was in his laboratory in the attic of a
machine shop in Boston, trying once more to coax a voice out of a
wire. His efforts had proved mostly futile, and the Bell Company was
little more than a typical hopeless start-up. Bell was a professor and
an amateur inventor, with little taste for business: his expertise and
his day job was teaching the deaf. His main investor and the president
of the Bell Company was Gardiner Green Hubbard, a patent attorney and
prominent critic of the telegraph monopoly Western Union. It was
Hubbard who was responsible for Bell's most valuable asset: its
telephone patent, filed even before Bell had a working prototype.
Besides Hubbard, the company had one employee, Bell's assistant,
Thomas Watson. That was it. ...
"On the very day that Alexander Bell was registering his invention,
another man, Elisha Gray, was also at the patent office filing for the
very same breakthrough. The coincidence takes some of the luster off
Bell's 'eureka.' And the more you examine the history, the worse it
looks. In 1861, sixteen years before Bell, a German man named Johann
Philip Reis presented a primitive telephone to the Physical Society of
Frankfurt. ... Germany has long considered Reis the telephone's
inventor. Another man, a small-town Pennsylvania electrician named
Daniel Drawbaugh, later claimed that by 1869 he had a working
telephone in his house. He produced prototypes and seventy witnesses
who testified that they had seen or heard his invention at that time.
In litigation before the Supreme Court in 1888, three justices
concluded that 'overwhelming evidence' proved that 'Drawbaugh produced
and exhibited in his shop, as early as 1869, an electrical instrument
by which he transmitted speech.' ...
"There was, it is fair to say, no single inventor of the telephone.
And this reality suggests that what we call invention, while not easy,
is simply what happens once a technology's development reaches the
point where the next step becomes available to many people. By Bell's
time, others had invented wires and the telegraph, had discovered
electricity and the basic principles of acoustics. It fell to Bell to
assemble the pieces: no mean feat, but not a superhuman one. In this
sense, inventors are often more like craftsmen than miracle workers.
"Indeed, the history of science is full of examples of what the writer
Malcolm Gladwell terms 'simultaneous discovery' - so full that the
phenomenon represents the norm rather than the exception. Few today
know the name Alfred Russel Wallace, yet he wrote an article proposing
the theory of natural selection in 1858, a year before Charles Darwin
published The Origin of Species. Leibniz and Newton developed
calculus simultaneously. And in 1610 four others made the same lunar
observations as Galileo. ...
"Is the loner and outsider inventor, then, merely a figment of so much
hype, with no particular significance? No, I would argue his
significance is enormous; but not for the reasons usually imagined.
The inventors we remember are significant not so much as inventors,
but as founders of 'disruptive' industries, ones that shake up the
technological status quo. Through circumstance or luck, they are
exactly at the right distance both to imagine the future and to create
an independent industry to exploit. ...
"The importance of the outsider here owes to his being at the right
remove from the prevailing currents of thought about the problem at
hand. That distance affords a perspective close enough to understand
the problem, yet far enough for greater freedom of thought, freedom
from, as it were, the cognitive distortion of what is as opposed to
what could be. This innovative distance explains why so many of those
who turn an industry upside down are outsiders, even outcasts. ...
"Another advantage of the outside inventor is less a matter of the
imagination than of his being a disinterested party. Distance creates
a freedom to develop inventions that might challenge or even destroy
the business model of the dominant industry. The outsider is often the
only one who can afford to scuttle a perfectly sound ship, to propose
an industry that might challenge the business establishment or suggest
a whole new business model. Those closer to - often at the trough of -
existing industries face a remarkably constant pressure not to invent
things that will ruin their employer. The outsider has nothing to
lose.
"But to be clear, it is not mere distance, but the right distance that
matters; there is such a thing as being too far away. It may be that
Daniel Drawbaugh actually did invent the telephone seven years before
Bell. We may never know; but even if he did, it doesn't really matter,
because he didn't do anything with it. He was doomed to remain an
inventor, not a founder, for he was just too far away from the action
to found a disruptive industry. In this sense, Bell's alliance with
Hubbard, a sworn enemy of Western Union, the dominant monopolist, was
all-important. For it was Hubbard who made Bell's invention into an
effort to unseat Western Union."
Author: Tim Wu
Title: The Master Switch
Publisher: Knopf
Date: Copyright 2010 by Tim Wu
Pages: 17-20
-------------------------------------------------------------------------------------------------
In today's excerpt - in the spring of 1917, an astonishing event took
place in the middle of the carnage of World War I - half of the French
Army went on strike. If the German Army had known, they could have
taken Paris and history would have been dramatically rewritten. The
soldiers went on strike because they understood what the generals did
not, that not only was this new type of warfare grossly ineffective,
it made for a slaughterhouse. In all, an unprecedented ten million
people died in the war, including one out of every twenty men in
France, most of them in the period leading up to the strike:
"Almost immediately after the failure of the offensive of 16 April,
there began what its commanders would admit to be 'acts of collective
indiscipline' and what historians have called 'the mutinies of 1917.'
Neither form of words exactly defines the nature of the breakdown,
which is better identified as a sort of military strike.
'Indiscipline' implies a collapse of order. 'Mutiny' usually entails
violence against superiors. Yet order, in the larger sense, remained
intact and there was no violence by the 'mutineers' against their
officers. ...
"The general mood of those involved - and they comprised soldiers in
fifty-four divisions, almost half the army - was one of reluctance, if
not refusal, to take part in fresh attacks but also of patriotic
willingness to hold the line against attacks by the enemy. There were
also specific demands: more leave, better food, better treatment for
soldiers' families, an end to 'injustice' and 'butchery.' ... The
demands were often linked to those of participants in civilian
strikes, [where French citizens] complained that 'While the people
have to work themselves to death to scrape a living, the bosses and
the big industrialists are growing fat.'
"As the crisis deepened - and five phases have been identified, from
scattered outbreaks in April to mass meetings in May, and hostile
encounters in June, followed by an attenuation of dissent during the rest
of the year - [General Philippe Petain] set in train a series of
measures designed to contain it and return the army to moral
well-being. He promised ampler and more regular leave. He also implicitly
promised an end, for a time at least, to attacks, not in so many
words, for that would have spelled an end to the status of France as a
war-waging power, but by emphasising that the troops would be rested
and retrained. ...
"While the front was being reorganised for these new tactics, the
army's officers, with Petain's approval, were attempting to win back
the men's obedience by argument and encouragement. 'No rigorous
measures must be taken,' wrote the commander of the 5th Division's
infantry. 'We must do our best to dilute the movement by persuasion,
by calm and by the authority of the officers known by the men, and
acting above all on the good ones to bring the strikers to the best
sentiments.' His divisional commander agreed: 'we cannot think of
reducing the movement by rigour, which would certainly bring about the
irreparable.'
"Nevertheless, the 'movement' - indiscipline, strike or mutiny - was
not put down without resort to force. Both high command and
government, obsessed by a belief that there had been 'subversion' of
the army by civilian anti-war agitators, devoted a great deal of
effort to identifying ringleaders, to bringing them to trial and to
punishing them. There were 3,427 courts-martial, by which 554 soldiers
were condemned to death and forty-nine actually shot. Hundreds of
others, though reprieved, were sentenced to life imprisonment. A
particular feature of the legal process was that those sent for trial
were selected by their own officers and NCOs, with the implicit
consent of the rank and file.
"Superficially, order was restored within the French army with
relative speed. ... In general, however, the objects of the mutinies
had been achieved. The French army did not attack anywhere on the
Western Front, of which it held two-thirds, between June 1917 and July
1918, nor did it conduct an 'active' defence of its sectors. The
Germans, who had inexplicably failed to detect the crisis of
discipline on the other side of no man's land, were content to accept
their enemy's passivity, having business of their own elsewhere, in
Russia, in Italy and against the British."
Author: John Keegan
Title: The First World War
Publisher: Knopf
Date: Copyright 1998 by John Keegan
Pages: 329-331
-------------------------------------------------------------------------------------------------
In today's encore excerpt - with the opening of the Erie Canal in
1825, New York vaulted past Philadelphia as the largest city and
busiest port in America. The economic importance of canals had been
amply demonstrated in England in such projects as the Bridgewater
Canal in 1761, and numerous major canals had been proposed in the U.S.
However, the scale of a canal from the Hudson to Lake Erie was
unprecedented. President Thomas Jefferson, calling it "a little short
of madness," thought the proposal for such a canal was ridiculous and
rejected it. It was the entrepreneur Jesse Hawley who managed to
interest the governor DeWitt Clinton and the plan went ahead. Due to
the overwhelming perception that the plan was absurd, the project
became known as "Clinton's Folly" or "Clinton's Ditch." In 1817
Clinton was successful in convincing the New York State legislature to
authorize the funds for building the canal:
"The first section of the [Erie Canal] opened in 1819. And the entire
project (including eighty-three locks enabling the rise of some 568
feet from the Hudson to Lake Erie) was destined to be finished by the
end of October 1825.
"Once the Erie Canal opened, the entire logic of trade into, out of,
and through the port of New York would be changed forever. As Roy
Finch of the New York State Engineer and Surveyor Bureau observed on
the occasion of the hundredth anniversary of the canal's opening:
'After the building of the original canal the city of New York grew by
leaps and bounds. Before the canal was built, Philadelphia had been
the nation's chief seaport, but New York soon took the lead and too
late Philadelphia made heroic but futile efforts to regain supremacy.'
Finch added that Massachusetts 'had been another rival having been
about on par with New York State in exports.' Nevertheless, a mere
sixteen years after the opening of the canal, Boston's exports were
only one-third those moving through New York. 'In that period, too,
the value of real estate in New York increased more rapidly than the
population while personal property was nearly four times its former
value and manufacturing three times as great. There were five times as
many people following commercial pursuits in New York as there were
before the completion of the Erie Canal.'
"Men of vision saw this boom coming. Indeed the boom was being counted
upon to help pay off the massive $7 million investment it had taken to
accomplish the terrific feat of engineering. Almost more important
than the straightforward logistical advantage of the Erie Canal,
however, was the sheer heft and grandeur of the project, which
captured the imaginations of average Americans and made them feel
inspired. Great things were possible; terrific accomplishment was
indeed achievable - especially in the United States, a country with a
brief past and a wide-open future. Not until the late 1860s would
another such project, the Transcontinental Railroad, seize the public
mind so totally and offer a similar promise for changing the economic
map."
Author: Edward J. Renehan, Jr.
Title: Commodore
Publisher: Basic
Date: Copyright 2007 by Edward J. Renehan, Jr.
Pages: 96-97