The scene
was the Suntory Museum, Osaka, Japan, in an auditorium so
postmodern it made your teeth vibrate. In the audience were
hundreds of Japanese art students. The occasion was the opening of
a show of the work of four of the greatest American illustrators of
the twentieth century: Seymour Chwast, Paul Davis, Milton Glaser,
and James McMullan, the core of New York’s fabled Pushpin Studio.
The show was titled Pushpin and Beyond: The
Celebrated Studio That Transformed Graphic Design. Up on the
stage, aglow with global fame, the Americans had every reason to
feel terrific about themselves.
Seated facing them was an interpreter.
The Suntory’s director began his introduction in Japanese, then
paused for the interpreter’s English translation:
“Our guests today are a group of
American artists from the Manual Age.”
Now the director was speaking again,
but his American guests were no longer listening. They were too
busy trying to process his opening line. The Manual
Age … the Manual Age … The phrase ricocheted about inside
their skulls … bounced off their pyramids of Betz, whistled through
their corpora callosa, and lodged in the Broca’s and Wernicke’s
areas of their brains.
All at once they got it. The hundreds
of young Japanese staring at them from the auditorium seats saw
them not as visionaries on the cutting edge … but as woolly old
mammoths who had somehow wandered into the Suntory Museum from out
of the mists of a Pliocene past … a lineup of relics unaccountably
still living, still breathing, left over from … the
Manual Age!
Marvelous. I wish I had known Japanese
and could have talked to all those students as they scrutinized the
primeval spectacle before them. They were children of the dawn
of—need one spell it out?—the Digital Age. Manual, “freehand”
illustrations? How brave of those old men to have persevered,
having so little to work with. Here and now in the Digital Age
illustrators used—what else?—the digital computer. Creating images
from scratch? What a quaint old term, “from scratch,” and what a
quaint old notion … In the Digital Age, illustrators “morphed”
existing pictures into altered forms on the digital screen. The
very concept of postmodernity was based on the universal use of the
digital computer … whether one was morphing illustrations or
synthesizing music or sending rocket probes into space or
achieving, on the Internet, instantaneous communication and
information retrieval among people all over the globe. The world
had shrunk, shrink-wrapped in an electronic membrane. No person on
earth was more than six mouse clicks away from any other. The
Digital Age was fast rendering national boundaries and city limits
and other old geographical notions obsolete. Likewise, regional
markets, labor pools, and industries. The world was now unified …
online. There remained only one “region,” and its name was the
Digital Universe.
Out of that fond belief has come the
concept of convergence.
Or perhaps I should say out of that
faith, since the origin of the concept is
religious; Roman Catholic, to be specific. The term itself,
“convergence,” as used here in the Digital Age, was coined by a
Jesuit priest, Pierre Teilhard de Chardin. Another ardent Roman
Catholic, Marshall McLuhan, broadcast the message throughout the
intellectual world and gave the Digital Universe its first and most
memorable name: “the global village.” Thousands of dot-com dreamers
are now busy amplifying the message without the faintest idea where
it came from.
Teilhard de Chardin—usually referred to
by the first part of his last name, Teilhard, pronounced
TAY-yar—was one of those geniuses who, in Nietzsche’s phrase (and
as in Nietzsche’s case), were doomed to be understood only after
their deaths. Teilhard died in 1955. It has taken the current Web
mania, nearly half a century later, for this romantic figure’s
theories to catch fire. Born in 1881, he was the second son among
eleven children in the family of one of the richest landowners in
France’s Auvergne region. As a young man he experienced three
passionate callings: the priesthood, science, and Paris. He was the
sort of worldly priest European hostesses at the turn of the
century died for: tall, dark, and handsome, and aristocratic on top
of that, with beautifully tailored black clerical suits and
masculinity to burn. His athletic body and ruddy complexion he came
by honestly, from the outdoor life he led as a paleontologist in
archaeological digs all over the world. And the way that hard,
lean, weathered face of his would break into a confidential smile
when he met a pretty woman—by all accounts, every other woman in
le monde swore she would be the one to
separate this glamorous Jesuit from his vows.
For Teilhard also had glamour to burn,
three kinds of it. At the age of thirty-two he had been the French
star of the most sensational archaeological find of all time, the
Piltdown man, the so-called missing link in the evolution of ape to
man, in a dig near Lewes, England, led by the Englishman Charles
Dawson. One year later, when World War I broke out, Teilhard
refused the chance to serve as a chaplain in favor of going to the
front as a stretcher bearer rescuing the wounded in the midst of
combat. He was decorated for bravery in that
worst-of-all-infantry-wars’ bloodiest battles: Ypres, Artois,
Verdun, Villers-Cotterêts, and the Marne. Meantime, in the lulls
between battles he had begun writing the treatise with which he
hoped to unify all of science and all of religion, all of matter
and all of spirit, heralding God’s plan to turn all the world, from
inert rock to humankind, into a single sublime Holy
Spirit.
“With the evolution of Man,” he wrote,
“a new law of Nature has come into force—that of convergence.”
Biological evolution had created step one, “expansive convergence.”
Now, in the twentieth century, by means of technology, God was
creating “compressive convergence.” Thanks to technology, “the
hitherto scattered” species Homo sapiens was
being united by a single “nervous system for humanity,” a “living
membrane,” a single “stupendous thinking machine,” a unified
consciousness that would cover the earth like “a thinking skin,” a
“noösphere,” to use Teilhard’s favorite neologism. And just what
technology was going to bring about this convergence, this
noösphere? On this point, in later years, Teilhard was quite
specific: radio, television, the telephone, and “those astonishing
electronic computers, pulsating with signals at the rate of
hundreds of thousands a second.”
One can think whatever one wants about
Teilhard’s theology, but no one can deny his stunning prescience.
When he died in 1955, television was in its infancy and there was
no such thing as a computer you could buy ready-made. Computers
were huge, hellishly expensive, made-to-order machines as big as a
suburban living room and bristling with vacuum tubes that gave off
an unbearable heat. Since the microchip and the microprocessor had
not yet been invented, no one was even speculating about a personal
computer in every home, much less about combining the personal
computer with the telephone to create an entirely new medium of
communication. Half a century ago, only Teilhard foresaw what is
now known as the Internet.
What Teilhard’s superiors in the
Society of Jesus and the Church hierarchy thought about it all in
the 1920s, however, was not much. The plain fact was that Teilhard
accepted the Darwinian theory of evolution. He argued that
biological evolution had been nothing more than God’s first step in
an infinitely grander design. Nevertheless, he accepted it. When
Teilhard had first felt his call to the priesthood, it had been
during the intellectually liberal papacy of Leo XIII. But by the
1920s the pendulum had swung back within the Church, and
evolutionism was not acceptable in any guise. At this point began
the central dilemma, the great sorrow—the tragedy, I am tempted to
say—of this remarkable man’s life. A priest was not allowed to put
anything into public print without his superiors’ approval.
Teilhard’s dilemma was precisely the fact that science and religion
were not unified. As a scientist, he could not bear to disregard
scientific truth; and in his opinion, as a man who had devoted
decades to paleontology, the theory of evolution was indisputably
correct. At the same time he could not envision a life lived
outside the Church.
God knew there were plenty of women who
were busy envisioning it for him. Teilhard’s longest, closest,
tenderest relationship was with an American sculptress named Lucile
Swan. Lovely little Mrs. Swan was in her late thirties and had
arrived in Peking in 1929 on the China leg of a world tour aimed at
diluting the bitterness of her recent breakup with her husband.
Teilhard was in town officially to engage in some major
archaeological digs in China and had only recently played a part in
discovering the second great “missing link,” the Peking man. In
fact, the Church had exiled him from Europe for fear he would ply
his evolutionism among priests and other intellectuals. Lucile Swan
couldn’t get over him. He was the right age, forty-eight, a
celebrated scientist, a war hero, and the most gorgeous white man
in Peking. The crowning touch of glamour was his brave, doomed
relationship with his own church. She had him over to her house
daily “for tea.” In addition to her charms, which were many, she
seems also to have offered an argument aimed at teasing him out of
the shell of celibacy. In effect, the Church was forsaking him
because he had founded his own new religion. Correct? Since it was
his religion, couldn’t he have his priests do anything he wanted
them to do? When she was away, he wrote her letters of great
tenderness and longing. “For the very reason that you are such a
treasure to me, dear Lucile,” he wrote at one point, “I ask you not
to build too much of your life on me … Remember, whatever sweetness
I force myself not to give you, I do in order to be worthy of
you.”
The final three decades of his life
played out with the same unvarying frustration. He completed half a
dozen books, including his great work, The
Phenomenon of Man. The Church allowed him to publish none of
it and kept him in perpetual exile from Europe and his beloved
Paris. His only pleasure and ease came from the generosity of
women, who remained attracted to him even in his old age. In 1953,
two years before his death, he suffered one especially cruel blow.
It was discovered that the Piltdown man had been, in fact, a
colossal hoax pulled off by Charles Dawson, who had hidden various
doctored ape and human bones like Easter eggs for Teilhard and
others to find. He was in an acute state of depression when he died
of a cerebral hemorrhage at the age of seventy-four, still in
exile. His final abode was a dim little room in the Hotel Fourteen
on East Sixtieth Street in Manhattan, with a single window looking
out on a filthy air shaft composed, in part, of a blank exterior
wall of the Copacabana nightclub.
Not a word of his great masterwork had ever been published, and yet Teilhard had enjoyed a certain shady eminence for years. Some of his manuscripts had circulated among his fellow Jesuits, sub rosa, sotto voce, in a Jesuit samizdat. In Canada he was a frequent topic of conversation at St. Michael’s, the Roman Catholic college of the University of Toronto. Immediately following his death, his Paris secretary, Jeanne Mortier, to whom he had left his papers, began publishing his writings in a steady stream, including The Phenomenon of Man. No one paid closer attention to this gusher of Teilhardiana than a forty-four-year-old St. Michael’s teaching fellow named Marshall McLuhan, who taught English literature. McLuhan was already something of a campus star at the University of Toronto when Teilhard died. He had dreamed up an extracurricular seminar on popular culture and was drawing packed houses as he held forth on topics such as the use of sex in advertising, a discourse that had led to his first book, The Mechanical Bride, in 1951. He was a tall, slender man, handsome in a lairdly Scottish way, who played the droll don to a T, popping off deadpan three-liners—not one-liners but three-liners—people couldn’t forget.
One time I asked him how it was that
Pierre Trudeau managed to stay in power as Prime Minister through
all the twists and turns of Canadian politics. Without even the
twitch of a smile McLuhan responded, “It’s simple. He has a French
name, he thinks like an Englishman, and he looks like an Indian. We
all feel very guilty about the Indians here in
Canada.”
Another time I was in San Francisco
doing stories on both McLuhan and topless restaurants, each of
which was a new phenomenon. So I got the bright idea of taking the
great communications theorist to a topless restaurant called the
Off Broadway. Neither of us had ever seen such a thing. Here were
scores of businessmen in drab suits skulking at tables in the dark
as spotlights followed the waitresses, each of whom had astounding
silicone-enlarged breasts and wore nothing but high heels, a
G-string, and the rouge on her nipples. Frankly, I was shocked and
speechless. Not McLuhan.
“Very interesting,” he
said.
“What is, Marshall?”
He nodded at the waitresses. “They’re
wearing … us.”
“What do you mean,
Marshall?”
He said it very slowly, to make sure I
got it:
“They’re … putting … us …
on.”
But the three-liners and the pop
culture seminar were nothing compared to what came next, in the
wake of Teilhard’s death: namely, McLuhanism.
McLuhanism was Marshall’s synthesis of
the ideas of two men. One was his fellow Canadian, the economic
historian Harold Innis, who had written two books arguing that new
technologies were primal, fundamental forces steering human
history. The other was Teilhard. McLuhan was scrupulous about
crediting scholars who had influenced him, so much so that he
described his first book of communications theory, The Gutenberg Galaxy, as “a footnote to the work of
Harold Innis.” In the case of Teilhard, however, he was caught in a
bind. McLuhan’s “global village” was nothing other than Teilhard’s
“noösphere,” but the Church had declared Teilhard’s work heterodox,
and McLuhan was not merely a Roman Catholic, he was a convert. He
had been raised as a Baptist but had converted to Catholicism while
in England studying at Cambridge during the 1930s, the palmy days
of England’s great Catholic literary intellectuals, G. K.
Chesterton and Hilaire Belloc. Like most converts, he was highly
devout. So in his own writings he mentioned neither Teilhard nor
the two-step theory of evolution that was the foundation of
Teilhard’s worldview. Only a single reference, a mere obiter dictum, attached any religious significance
whatsoever to the global village: “The Christian concept of the
mystical body—all men as members of the body of Christ—this becomes
technologically a fact under electronic conditions.”
I don’t have the slightest doubt that
what fascinated him about television was the possibility it might
help make real Teilhard’s dream of the Christian unity of all souls
on earth. At the same time, he was well aware that he was
publishing his major works, The Gutenberg
Galaxy (1962) and Understanding Media
(1964), at a moment when even the slightest whiff of religiosity
was taboo, if he cared to command the stage in the intellectual
community. And that, I assure you, he did care to do. His father
had been an obscure insurance and real estate salesman, but his
mother, Elsie, had been an actress who toured Canada giving
dramatic readings, and he had inherited her love of the limelight.
So he presented his theory in entirely secular terms, arguing that
a new, dominant medium such as television altered human
consciousness by literally changing what he called the central
nervous system’s “sensory balance.” For reasons that were never
clear to me—although I did question him on the subject—McLuhan
regarded television as not a visual but an “aural and tactile”
medium that was thrusting the new television generation back into
what he termed a “tribal” frame of mind. These are matters that
today fall under the purview of neuroscience, the study of the
brain and the central nervous system. Neuroscience has made
spectacular progress over the past twenty-five years and is now the
hottest field in science and, for that matter, in all of academia.
But neuroscientists are not even remotely close to being able to
determine something such as the effect of television upon one
individual, much less an entire generation.
That didn’t hold back McLuhan, or the
spread of McLuhanism, for a second. He successfully established the
concept that new media such as television have the power to alter
the human mind and thereby history itself. He died in 1980 at the
age of sixty-nine after a series of strokes, more than a decade
before the creation of the Internet. Dear God—if only he were alive
today! What heaven the present moment would have been for him! How
he would have loved the Web! What a shimmering Oz he would have
turned his global village into!
But by 1980 he had spawned swarms of
believers who were ready to take over where he left off. It is
they, entirely secular souls, who dream up our fin de siècle
notions of convergence for the Digital Age, never realizing for a
moment that their ideas are founded upon Teilhard’s and McLuhan’s
faith in the power of electronic technology to alter the human mind
and unite all souls in a seamless Christian web, the All-in-One.
Today you can pick up any organ of the digital press, those
magazines for dot-com lizards that have been spawned thick as shad
since 1993, and close your eyes and riffle through the pages and
stab your forefinger and come across evangelical prose that sounds
like a hallelujah! for the ideas of Teilhard or McLuhan or
both.
I did just that, and in Wired magazine my finger landed on the name Danny
Hillis, the man credited with pioneering the concept of massively
parallel computers, who writes: “Telephony, computers, and CD-ROMs
are all specialized mechanisms we’ve built to bind us together. Now
evolution takes place in microseconds … We’re taking off. We’re at
that point analogous to when single-celled organisms were turning
into multicelled organisms. We are amoebas and we can’t figure out
what the hell this thing is that we’re creating … We are not
evolution’s ultimate product. There’s something coming after us,
and I imagine it is something wonderful. But we may never be able
to comprehend it, any more than a caterpillar can comprehend
turning into a butterfly.”
Teilhard seemed to think the phase-two
technological evolution of man might take a century or more. But
you will note that Hillis has it reduced to microseconds. Compared
to Hillis, Bill Gates of Microsoft seems positively tentative and
cautious as he rhapsodizes in The Road
Ahead: “We are watching something historic happen, and it
will affect the world seismically.” He’s “thrilled” by “squinting
into the future and catching that first revealing hint of
revolutionary possibilities.” He feels “incredibly lucky” to be
playing a part “in the beginning of an epochal change
…”
We can only appreciate Gates’s
self-restraint when we take a stab at the pages of the September
1998 issue of Upside magazine and come
across its editor in chief, Richard L. Brandt, revealing just how
epochally revolutionary Gates’s Microsoft really is: “I expect to
see the overthrow of the U.S. government in my lifetime. But it
won’t come from revolutionaries or armed conflict. It won’t be a
quick-and-bloody coup; it will be a gradual takeover … Microsoft is
gradually taking over everything. But I’m not suggesting that
Microsoft will be the upstart that will gradually make the U.S.
government obsolete. The culprit is more obvious. It’s the
Internet, damn it. The Internet is a global phenomenon on a scale
we’ve never witnessed.”
In less able hands such speculations
quickly degenerate into what all who follow the digital press have
become accustomed to: Digibabble. All of our digifuturists, even
the best, suffer from what the philosopher Joseph Levine calls “the
explanatory gap.” There is never an explanation of just why or how
such vast changes, such evolutionary and revolutionary great leaps
forward, are going to take place. McLuhan at least recognized the
problem and went to the trouble of offering a neuroscientific
hypothesis, his theory of how various media alter the human nervous
system by changing the “sensory balance.” Everyone after him has
succumbed to what is known as the “Web-mind fallacy,” the purely
magical assumption that as the Web, the Internet, spreads over the
globe, the human mind expands with it. Magical beliefs are leaps of
logic based on proximity or resemblance. Many primitive tribes have
associated the waving of the crops or tall grass in the wind with
the rain that follows. During a drought the tribesmen get together
and create harmonic waves with their bodies in the belief that it
is the waving that brings on the rain. Anthropologists have posited
these tribal hulas as the origin of dance. Similarly, we have the
current magical Web euphoria. A computer is a computer, and the
human brain is a computer. Therefore, a computer is a brain, too,
and if we get a sufficient number of them, millions, billions,
operating all over the world, in a single seamless Web, we will
have a superbrain that converges on a plane far above such
old-fashioned concerns as nationalism and racial and ethnic
competition.
I hate to be the one who brings this
news to the tribe, to the magic Digikingdom, but the simple truth
is that the Web, the Internet, does one thing. It speeds up the
retrieval and dissemination of information, partially eliminating
such chores as going outdoors to the mailbox or the adult
bookstore, or having to pick up the phone to get hold of your
stockbroker or some buddies to shoot the breeze with. That one
thing the Internet does, and only that. All the rest is
Digibabble.
May I log on to the past for a moment?
Ever since the 1830s, people in the Western Hemisphere have been
told that technology was making the world smaller, the assumption
being that only good could come of the shrinkage. When the railroad
locomotive first came into use, in the 1830s, people marveled and
said it made the world smaller by bringing widely separated
populations closer together. When the telephone was invented, and
the transoceanic cable and the telegraph and the radio and the
automobile and the airplane and the television and the fax, people
marveled and said it all over again, many times. But if these
inventions, remarkable as they surely are, have improved the human
mind or reduced the human beast’s zeal for banding together with
his blood brethren against other human beasts, it has escaped my
notice. One hundred and seventy years after the introduction of the
locomotive, the Balkans today are a cluster of virulent spores more
bloody-minded than ever. The former Soviet Union is now fifteen
nations split up along ethnic bloodlines. The very Zeitgeist of the
twenty-first century is summed up in the cry “Back to blood!” The
thin crust of nationhoods the British established in Asia and
Africa at the zenith of their imperial might has vanished, and it
is the tribes of old that rule. What has made national boundaries
obsolete in so much of Eastern Europe, Africa, and Asia? Not the
Internet but the tribes. What have the breathtaking advances in
communications technology done for the human mind? Beats me. SAT
scores among the top tenth of high-school students in the United
States, that fraction who are prime candidates for higher education
in any period, are lower today than they were in the early 1960s.
Believe, if you wish, that computers and the Internet in the
classroom will change all that, but I assure you, it is sheer
Digibabble.
Since so many theories of convergence were magical assumptions about the human mind in the Digital Age, notions that had no neuroscientific foundation whatsoever, I wondered what was going on in neuroscience that might bear upon the subject. This quickly led me to neuroscience’s most extraordinary figure, Edward O. Wilson.
Wilson’s own life is a good argument
for his thesis, which is that among humans, no less than among
racehorses, inbred traits will trump upbringing and environment
every time. In its bare outlines his childhood biography reads like
a case history for the sort of boy who today winds up as the
subject of a tabloid headline: DISSED DORK SNIPERS JOCKS. He was
born in Alabama to a farmer’s daughter and a railroad engineer’s
son who became an accountant and an alcoholic. His parents
separated when Wilson was seven years old, and he was sent off to
the Gulf Coast Military Academy. A chaotic childhood was to follow.
His father worked for the federal Rural Electrification
Administration, which kept reassigning him to different locations,
from the Deep South to Washington, D.C., and back again, so that in
eleven years Wilson attended fourteen different public schools. He
grew up shy and introverted and liked the company only of other
loners, preferably those who shared his enthusiasm for collecting
insects. For years he was a skinny runt, and then for years after
that he was a beanpole. But no matter what ectomorphic shape he
took and no matter what school he went to, his life had one great
center of gravity: He could be stuck anywhere on God’s green earth
and he would always be the smartest person in his class. That
remained true after he graduated with a bachelor’s degree and a
master’s in biology from the University of Alabama and became a
doctoral candidate and then a teacher of biology at Harvard for the
next half century. He remained the best in his class every inch of
the way. Seething Harvard savant after seething Harvard savant,
including one Nobel laureate, has seen his reputation eclipsed by
this terribly reserved, terribly polite Alabamian, Edward O.
Wilson.
Wilson’s field within the discipline of
biology was zoology; and within zoology, entomology, the study of
insects; and within entomology, myrmecology, the study of ants.
Year after year he studied his ants, from Massachusetts to the
wilds of Suriname. He made major discoveries about ants,
concerning, for example, their system of communicating via the
scent of sticky chemical substances known as pheromones—all this to
great applause in the world of myrmecology, considerable applause
in the world of entomology, fair-to-middling applause in the world
of zoology, and polite applause in the vast world of biology
generally. The consensus was that quiet Ed Wilson was doing
precisely what quiet Ed Wilson had been born to do, namely, study
ants, and God bless him. Apparently none of them realized that
Wilson had experienced that moment of blazing revelation all
scientists dream of having. It is known as the “Aha!”
phenomenon.
In 1971 Wilson began publishing his
now-famous sociobiology trilogy. Volume I, The
Insect Societies, was a grand picture of the complex social
structure of insect colonies in general, starring the ants, of
course. The applause was well nigh universal, even among Harvard
faculty members, who kept their envy and resentment on a hair
trigger. So far Ed Wilson had not tipped his hand.
The Insect
Societies spelled out in great detail just how
extraordinarily diverse and finely calibrated the career paths and
social rankings of insects were. A single ant queen gave birth to a
million offspring in an astonishing variety of sizes, with each ant
fated for a particular career. Forager ants went out to find and
bring back food. Big army ants went forth as marauders, “the Huns
and Tartars of the insect world,” slaughtering other ant colonies,
eating their dead victims, and even bringing back captured ant
larvae to feed the colony. Still other ants went forth as herdsmen,
going up tree trunks and capturing mealybugs and caterpillars,
milking them for the viscous ooze they egested (more food), and
driving them down into the underground colony for the night, i.e.,
to the stables. Livestock!
But what steered the bugs into their
various, highly specialized callings? Nobody trained them, and they
did not learn by observation. They were born, and they went to
work. The answer, as every entomologist knew, was genetics, the
codes imprinted (or hardwired, to use another metaphor) at birth.
So what, if anything, did this have to do with humans, who in
advanced societies typically spent twelve or thirteen years, and
often much longer, going to school, taking aptitude tests, talking
to job counselors, before deciding upon a career?
The answer, Wilson knew, was to be
found in the jungles of a Caribbean island. Fifteen years earlier,
in 1956, he had been a freshly minted Harvard biology instructor
accompanying his first graduate student, Stuart Altmann, to Cayo
Santiago, known among zoologists as “monkey island,” off the coast
of Puerto Rico. Altmann was studying rhesus macaque monkeys in
their own habitat. This was four years before Jane Goodall began
studying chimpanzees in the wild in East Africa. Wilson, as he put
it later in his autobiography, was bowled over by the monkeys’
“sophisticated and often brutal world of dominance orders,
alliances, kinship bonds, territorial disputes, threats and
displays, and unnerving intrigues.” In the evenings, teacher and
student, both in their twenties, talked about the possibility of
finding common characteristics among social animals, even among
those as outwardly different as ants and rhesus macaques. They
decided they would have to ignore glib surface comparisons and find
deep principles, statistically demonstrable principles. Altmann
already had a name for such a discipline, “sociobiology,” which
would cover all animals that lived within social orders, from
insects to primates. Wilson thought about that—
Aha!
—human beings were primates, too. It
took him nineteen years and excursions into such esoteric and
highly statistical disciplines as population biology and allometry
(“relative growth of a part in relation to an entire organism”) to
work it out to the point of a compelling synthesis grounded in
detailed observation, in the wild and in the laboratory, and set
forth in terms of precise measurements. The Insect
Societies had been merely the groundwork. In 1975 he
published the central thesis itself: Sociobiology: The New Synthesis.
Not, as everyone in the world of biology noticed, A new synthesis
but The new synthesis. The with a capital T.
In the book’s final chapter, the now
famous Chapter 27, he announced that man and all of man’s works
were the products of deep patterns running throughout the story of
evolution, from ants one-tenth of an inch long to the species
Homo sapiens. Among Homo
sapiens, the division of roles and work assignments between
men and women, the division of labor between the rulers and the
ruled, between the great pioneers and the lifelong drudges, could
not be explained by such superficial, external approaches as
history, economics, sociology, or anthropology. Only sociobiology,
firmly grounded in genetics and the Darwinian theory of evolution,
could do the job.
During the furor that followed, Wilson
compressed his theory into a single sentence in an interview. Every
human brain, he said, is born not as a blank slate waiting to be
filled in by experience but as “an exposed negative waiting to be
slipped into developer fluid.” The negative might be developed well
or it might be developed poorly, but all you were going to get was
what was already on the negative at birth.
In one of the most remarkable displays
of wounded Marxist chauvinism in American academic history (and
there have been many), two of Wilson’s well-known colleagues at
Harvard’s Museum of Comparative Zoology, paleontologist Stephen Jay
Gould and geneticist Richard Lewontin, joined a group of radical
activists called Science for the People to form what can only be
called an “antiseptic squad.” The goal, judging by their public
statements, was to demonize Wilson as a reactionary eugenicist, a
Nazi in embryo, and exterminate sociobiology as an approach to the
study of human behavior. After three months of organizing, the
cadre opened its campaign with a letter, signed by fifteen faculty
members and students in the Boston area, to the leading American
organ of intellectual etiquette and deviation sniffing,
The New York Review of Books. Theories like
Wilson’s, they charged, “tend to provide a genetic justification of
the status quo and of existing privileges for certain groups
according to class, race, or sex.” In the past, vile Wilson-like
intellectual poisons had “provided an important basis for the
enactment of sterilization laws … and also for the eugenics
policies which led to the establishment of gas chambers in Nazi
Germany.” The campaign went on for years. Protesters picketed
Wilson’s sociobiology class at Harvard (and the university and the
faculty kept mum and did nothing). Members of INCAR, the
International Committee Against Racism, a group known for its
violent confrontations, stormed the annual meeting of the American
Association for the Advancement of Science in Washington and
commandeered the podium just before Wilson was supposed to speak.
One goony seized the microphone and delivered a diatribe against
Wilson while the others jeered and held up signs with
swastikas—whereupon a woman positioned behind Wilson poured a
carafe of ice water, cubes and all, over his head, and the entire
antiseptic squad joined in the chorus: “You’re all wet! You’re all
wet! You’re all wet!”
The long smear campaign against Edward
O. Wilson was one of the most sickening episodes in American
academic history—and it could not have backfired more completely.
As Freud once said, “Many enemies, much honor.” Overnight, Ed
Wilson became the most famous biologist in the United States. He
was soon adorned with the usual ribbons of celebrity: appearances
on the Today show, the Dick
Cavett Show, Good Morning America, and the covers of
Time and The New York Times
Magazine … while Gould and Lewontin seethed … and seethed …
and contemplated their likely place in the history of science in
the twentieth century: a footnote or two down in the ibid. thickets of the biographies of Edward Osborne
Wilson.
In 1977 Wilson won the National Medal
of Science. In 1979 he won the Pulitzer Prize for nonfiction for
the third volume of his sociobiology trilogy, On
Human Nature. Eleven years later he and his fellow
myrmecologist, Bert Hölldobler, published a massive (7½ pounds),
highly technical work, The Ants, meant as
the last word on these industrious creatures who had played such a
big part in Wilson’s career. The book won the two men a Pulitzer
Prize. It was Wilson’s second.
His smashing success revived Darwinism
in a big way. Sociobiology had presented evolution as the ultimate
theory, the convergence of all knowledge. Darwinists had been with
us always, of course, ever since the days of the great man himself.
But in the twentieth century the Darwinist story of human
life—natural selection, sexual selection, survival of the fittest,
and the rest of it—had been overshadowed by the Freudian and
Marxist stories. Marx said social class determined a human being’s
destiny; Freud said it was the Oedipal drama within the family.
Both were forces external to the newborn infant. Darwinists, Wilson
foremost among them, turned all that upside down and proclaimed
that the genes the infant was born with determined his
destiny.
A field called evolutionary psychology
became all the rage, attracting many young biologists and
philosophers who enjoyed the naughty and delicious thrill of being
Darwinian fundamentalists. The influence of genes was absolute.
Free will among humans, no less than among ants, was an illusion.
The “soul” and the “mind” were illusions, too, and so was the very
notion of a “self.” The quotation marks began spreading like
dermatitis over all the commonsense beliefs about human nature. The
new breed, the fundamentalists, hesitated to use Wilson’s term,
“sociobiology,” because there was always the danger that the
antiseptic squads, the Goulds and the Lewontins and the INCAR
goonies and goonettes, might come gitchoo. But all the bright new
fundamentalists were Ed Wilson’s offspring,
nevertheless.
They soon ran into a problem that
Wilson had largely finessed by offering only the broadest strokes.
Darwin’s theory provided a wonderfully elegant story of how the
human beast evolved from a single cell in the primordial ooze and
became the fittest beast on earth—but offered precious little to
account for what man had created once he reached the level of the
wheel, the shoe, and the toothbrush. Somehow the story of man’s
evolution from the apes had not set the stage for what came next.
Religions, ideologies, scholarly disciplines, aesthetic experiences
such as art, music, literature, and the movies, technological
wonders such as the Brooklyn Bridge and breaking the bonds of
Earth’s gravity with spaceships, not to mention the ability to
create words and grammars and record such extraordinary
accomplishments—there was nothing even remotely homologous to be
found among gorillas, chimpanzees, or any other beasts. So was it
really just Darwinian evolution? Anthropologists had always chalked
such things up to culture. But it had to be Darwinian evolution!
Genetics had to be the answer! Otherwise, fundamentalism did not
mean much.
In 1976, a year after Wilson had lit up
the sky with Sociobiology: The New Synthesis, a British zoologist and Darwinian
fundamentalist, Richard Dawkins, published a book called
The Selfish Gene in which he announced the
discovery of memes. Memes were viruses in the form of ideas,
slogans, tunes, styles, images, doctrines, anything with
sufficient attractiveness or catchiness to infect the
brain—“infect,” like “virus,” became part of the subject’s earnest,
wannabe-scientific terminology—after which they operated like
genes, passing along what had been naïvely thought of as the
creations of culture.
Dawkins’s memes definitely infected the
fundamentalists, in any event. The literature of Memeland began
pouring out: Daniel C. Dennett’s Darwin’s Dangerous
Idea, William H. Calvin’s How Brains
Think, Steven Pinker’s How the Mind
Works, Robert Wright’s The Moral Animal, The
Meme Machine by Susan Blackmore (with a foreword by Richard
Dawkins), and on and on. Dawkins has many devout followers
precisely because his memes are seen as the missing link in
Darwinism as a theory, a theoretical discovery every bit as
important as the skull of the Peking man. One of Bill Gates’s
epigones at Microsoft, Charles Simonyi, was so impressed with
Dawkins and his memes and their historic place on the scientific
frontier, he endowed a chair at Oxford University titled the
Charles Simonyi Professor of the Public Understanding of Science
and installed Dawkins in it. This makes Dawkins the postmodern
equivalent of the Archbishop of Canterbury. Dawkins is now
Archbishop of Darwinian Fundamentalism and Hierophant of the
Memes.
There turns out to be one serious
problem with memes, however. They don’t exist. A neurophysiologist
can use the most powerful and sophisticated brain imaging now
available—and still not find a meme. The Darwinian fundamentalists,
like fundamentalists in any area, are ready for such an obvious
objection. They will explain that memes operate in a way analogous
to genes, i.e., through natural selection and survival of the
fittest memes. But in science, unfortunately, “analogous to” just
won’t do. The tribal hula is analogous to the waving of a wheat
field in the wind before the rain, too. Here the explanatory gap
becomes enormous. Even though some of the fundamentalists have
scientific credentials, not one even hazards a guess as to how, in
physiological, neural terms, the meme “infection” is supposed to
take place. Although no scientist, McLuhan at least offered a
neuroscientific hypothesis for McLuhanism.
So our fundamentalists find themselves
in the awkward position of being like those Englishmen in the year
1000 who believed quite literally in the little people, the
fairies, trolls, and elves. To them, Jack Frost was not merely a
twee personification of winter weather. Jack Frost was one of the
little people, an elf who made your fingers cold, froze the tip of
your nose like an icicle, and left the ground too hard to plow. You
couldn’t see him, but he was there. Thus also with memes. Memes are
little people who sprinkle fairy dust on genes to enable them to
pass along so-called cultural information to succeeding generations
in a proper Darwinian way.
Wilson, who has a lot to answer for,
transmitted more than fairy dust to his progeny, however. He gave
them the urge to be popular. After all, he was a serious scientist
who had become a celebrity. Not only that, he had made the
bestseller lists. As they say in scholarly circles, much of his
work has been really quite accessible. But there is accessible …
and there is cute. The fundamentalists have developed the habit of
cozying up to the reader or, as they are likely to put it, “cozying
up.” When they are courting the book-buying public, they use
quotation marks as friendly winks. They are quick to use the
second-person singular in order to make you (“you”) feel right at
home (“right at home”) and italicized words to make sure you
get it and lots of conversational
contractions so you won’t feel intimidated
by a lot of big words such as “algorithms,” which you’re not likely to tolerate unless there’s some way to bring you closer to your wise
friend, the author, by a just-between-us-pals approach. Simple,
I’d say! One fundamentalist book begins with
the statement that “intelligence is what you use when you don’t
know what to do (an apt description of my present predicament as I
attempt to write about intelligence). If you’re good at finding the
one right answer to life’s multiple-choice questions, you’re
smart. But there’s more to being
intelligent—a creative aspect, whereby you
invent something new ‘on the fly’” (How Brains
Think by William H. Calvin, who also came up with a
marvelously loopy synonym for fairy dust: “Darwinian
soft-wiring”).
Meantime, as far as Darwin II himself is concerned, he has nice things to say about Dawkins and his Neuro Pop brood, and he wishes them well in their study of the little people, the memes, but he is far too savvy to buy the idea himself. He theorizes about something called “culturgens,” which sound suspiciously like memes, but then goes on to speak of the possibility of a “gene-culture coevolution.” I am convinced that in his heart Edward O. Wilson believes just as strongly as Dawkins in Darwinian fundamentalism. I am sure he believes just as absolutely in the idea that human beings, for all their extraordinary works, consist solely of matter and water, of strings of molecules containing DNA that are connected to a chemical analog computer known as the brain, a mechanism that creates such illusions as “free will” and … “me.” But Darwin II is patient, and he is a scientist. He is not going to engage in any such sci-fi as meme theory. To test meme theory it would be necessary first to fill in two vast Saharas in the field of brain research: memory and consciousness itself. Memory has largely defied detailed neural analysis, and consciousness has proven totally baffling. No one can even define it. Anaesthesiologists who administer drugs and gases to turn their patients’ consciousness off before surgery have no idea why they work. Until memory and consciousness are understood, meme theory will remain what it is today: amateur night.
But Wilson is convinced that in time
the entire physics and chemistry, the entire neurobiology of the
brain and the central nervous system will be known, just as the
100,000-or-so genes are now being identified and located one by one
in the Human Genome Project. When the process is completed, he
believes, then all knowledge of living things will converge … under
the umbrella of biology. All mental activity, from using allometry
to enjoying music, will be understood in biological
terms.
He actually said as much a quarter of a
century ago in the opening paragraph of Sociobiology’s incendiary Chapter 27. The humanities and
social sciences, he said, would “shrink to specialized branches of
biology.” Such venerable genres as history, biography, and the
novel would become “the research protocols,” i.e., preliminary
reports of the study of human evolution. Anthropology and sociology
would disappear as separate disciplines and be subsumed by “the
sociobiology of a single primate species,” Homo
sapiens. There was so much else in Chapter 27 to outrage the
conventional wisdom of the Goulds and the Lewontins of the academic
world that they didn’t pay much attention to this convergence of
all human disciplines and literary pursuits.
But in 1998 Wilson spelled it out at
length and so clearly that no one inside or outside of academia
could fail to get the point. He published an entire book on the
subject, Consilience, which immediately
became a bestseller despite the theoretical nature of the material.
The term “consilience” was an obsolete word referring to the coming
together, the confluence, of different branches of
knowledge.
The ruckus Consilience kicked up spread far beyond the fields of
biology and evolutionism. Consilience was a
stick in the eye of every novelist, every historian, every
biographer, every social scientist—every intellectual of any
stripe, come to think of it. They were all about to be downsized,
if not terminated, in a vast intellectual merger. The counterattack
began. Jeremy Bernstein, writing in Commentary, drew first blood with a review titled “E. O.
Wilson’s Theory of Everything.” It began: “It is not uncommon for
people approaching the outer shores of middle age to go slightly
dotty.” Oh Lord, another theory of everything from the dotty
professor. This became an intellectual drumbeat—“just another
theory of everything”—and Wilson saw himself tried and hanged on a
charge of hubris.
As for me, despite the prospect of
becoming a mere research protocol drudge for evolutionism, I am
willing to wait for the evidence. I am skeptical, but like Wilson,
I am willing to wait. If Wilson is right, what interests me is not
so much what happens when all knowledge flows together as what
people will do with it once every nanometer and every action and
reaction of the human brain has been calibrated and made manifest
in predictable statistical formulas. I can’t help thinking of our
children of the dawn, the art students we last saw in the Suntory
Museum, Osaka, Japan. Not only will they be able to morph
illustrations on the digital computer, they will also be able to
predict, with breathtaking accuracy, the effect that certain types
of illustrations will have on certain types of brains. But, of
course, the illustrators’ targets will be able to dial up the same
formulas and information and diagnose the effect that any
illustration, any commercial, any speech, any flirtation, any bill,
any coo has been crafted to produce. Life will become one
incessant, colossal round of the match game or liar’s poker or
one-finger-two-finger or rock-paper-scissors.
Something tells me, mere research
protocol drudge though I may be, that I will love it all, cherish
it, press it to my bosom. For I already have a working title,
The Human Comedy, and I promise you, you
will laugh your head off … your head and that damnable,
unfathomable chemical analog computer inside of it,
too.