Saturday, March 29, 2008
Are our brains wired for math?
http://www.newyorker.com/reporting/2008/03/03/080303fa_fact_holt
by Jim Holt March 3, 2008
According to Stanislas Dehaene, humans have an inbuilt “number sense” capable of some basic calculations and estimates. The problems start when we learn mathematics and have to perform procedures that are anything but instinctive.
One morning in September, 1989, a former sales representative in his mid-forties entered an examination room with Stanislas Dehaene, a young neuroscientist based in Paris. Three years earlier, the man, whom researchers came to refer to as Mr. N, had sustained a brain hemorrhage that left him with an enormous lesion in the rear half of his left hemisphere. He suffered from severe handicaps: his right arm was in a sling; he couldn’t read; and his speech was painfully slow. He had once been married, with two daughters, but was now incapable of leading an independent life and lived with his elderly parents. Dehaene had been invited to see him because his impairments included severe acalculia, a general term for any one of several deficits in number processing. When asked to add 2 and 2, he answered “three.” He could still count and recite a sequence like 2, 4, 6, 8, but he was incapable of counting downward from 9, differentiating odd and even numbers, or recognizing the numeral 5 when it was flashed in front of him.
To Dehaene, these impairments were less interesting than the fragmentary capabilities Mr. N had managed to retain. When he was shown the numeral 5 for a few seconds, he knew it was a numeral rather than a letter and, by counting up from 1 until he got to the right integer, he eventually identified it as a 5. He did the same thing when asked the age of his seven-year-old daughter. In the 1997 book “The Number Sense,” Dehaene wrote, “He appears to know right from the start what quantities he wishes to express, but reciting the number series seems to be his only means of retrieving the corresponding word.”
Dehaene also noticed that although Mr. N could no longer read, he sometimes had an approximate sense of words that were flashed in front of him; when he was shown the word “ham,” he said, “It’s some kind of meat.” Dehaene decided to see if Mr. N still had a similar sense of number. He showed him the numerals 7 and 8. Mr. N was able to answer quickly that 8 was the larger number—far more quickly than if he had had to identify them by counting up to the right quantities. He could also judge whether various numbers were bigger or smaller than 55, slipping up only when they were very close to 55. Dehaene dubbed Mr. N “the Approximate Man.” The Approximate Man lived in a world where a year comprised “about 350 days” and an hour “about fifty minutes,” where there were five seasons, and where a dozen eggs amounted to “six or ten.” Dehaene asked him to add 2 and 2 several times and received answers ranging from three to five. But, he noted, “he never offers a result as absurd as 9.”
In cognitive science, incidents of brain damage are nature’s experiments. If a lesion knocks out one ability but leaves another intact, it is evidence that they are wired into different neural circuits. In this instance, Dehaene theorized that our ability to learn sophisticated mathematical procedures resided in an entirely different part of the brain from a rougher quantitative sense. Over the decades, evidence concerning cognitive deficits in brain-damaged patients has accumulated, and researchers have concluded that we have a sense of number that is independent of language, memory, and reasoning in general. Within neuroscience, numerical cognition has emerged as a vibrant field, and Dehaene, now in his early forties, has become one of its foremost researchers. His work is “completely pioneering,” Susan Carey, a psychology professor at Harvard who has studied numerical cognition, told me. “If you want to make sure the math that children are learning is meaningful, you have to know something about how the brain represents number at the kind of level that Stan is trying to understand.”
Dehaene has spent most of his career plotting the contours of our number sense and puzzling over which aspects of our mathematical ability are innate and which are learned, and how the two systems overlap and affect each other. He has approached the problem from every imaginable angle. Working with colleagues both in France and in the United States, he has carried out experiments that probe the way numbers are coded in our minds. He has studied the numerical abilities of animals, of Amazon tribespeople, of top French mathematics students. He has used brain-scanning technology to investigate precisely where in the folds and crevices of the cerebral cortex our numerical faculties are nestled. And he has weighed the extent to which some languages make numbers more difficult than others. His work raises crucial issues about the way mathematics is taught. In Dehaene’s view, we are all born with an evolutionarily ancient mathematical instinct. To become numerate, children must capitalize on this instinct, but they must also unlearn certain tendencies that were helpful to our primate ancestors but that clash with skills needed today. And some societies are evidently better than others at getting kids to do this. In both France and the United States, mathematics education is often felt to be in a state of crisis. The math skills of American children fare poorly in comparison with those of their peers in countries like Singapore, South Korea, and Japan. Fixing this state of affairs means grappling with the question that has taken up much of Dehaene’s career: What is it about the brain that makes numbers sometimes so easy and sometimes so hard?
Dehaene’s own gifts as a mathematician are considerable. Born in 1965, he grew up in Roubaix, a medium-sized industrial city near France’s border with Belgium. (His surname is Flemish.) His father, a pediatrician, was among the first to study fetal alcohol syndrome. As a teen-ager, Dehaene developed what he calls a “passion” for mathematics, and he attended the École Normale Supérieure in Paris, the training ground for France’s scholarly élite. Dehaene’s own interests tended toward computer modelling and artificial intelligence. He was drawn to brain science after reading, at the age of eighteen, the 1983 book “Neuronal Man,” by Jean-Pierre Changeux, France’s most distinguished neurobiologist. Changeux’s approach to the brain held out the tantalizing possibility of reconciling psychology with neuroscience. Dehaene met Changeux and began to work with him on abstract models of thinking and memory. He also linked up with the cognitive scientist Jacques Mehler. It was in Mehler’s lab that he met his future wife, Ghislaine Lambertz, a researcher in infant cognitive psychology.
By “pure luck,” Dehaene recalls, Mehler happened to be doing research on how numbers are understood. This led to Dehaene’s first encounter with what he came to characterize as “the number sense.” Dehaene’s work centered on an apparently simple question: How do we know whether numbers are bigger or smaller than one another? If you are asked to choose which of a pair of Arabic numerals—4 and 7, say—stands for the bigger number, you respond “seven” in a split second, and one might think that any two digits could be compared in the same very brief period of time. Yet in Dehaene’s experiments, while subjects answered quickly and accurately when the digits were far apart, like 2 and 9, they slowed down when the digits were closer together, like 5 and 6. Performance also got worse as the digits grew larger: 2 and 3 were much easier to compare than 7 and 8. When Dehaene tested some of the best mathematics students at the École Normale, the students were amazed to find themselves slowing down and making errors when asked whether 8 or 9 was the larger number.
Dehaene conjectured that, when we see numerals or hear number words, our brains automatically map them onto a number line that grows increasingly fuzzy above 3 or 4. He found that no amount of training can change this. “It is a basic structural property of how our brains represent number, not just a lack of facility,” he told me.
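One common way such a conjecture is formalized is a logarithmically compressed number line read out with fixed noise. The sketch below is a minimal illustration of that idea, not Dehaene's actual model; the noise level, the timing-free error-rate proxy, and the choice of a log scale are assumptions made only for the example.

```python
import math
import random

NOISE = 0.15        # assumed fuzziness of the internal code (illustrative)
TRIALS = 10_000

def internal_code(n: int) -> float:
    """Map a number onto a compressed (logarithmic) mental number line,
    with Gaussian noise standing in for the fuzziness of the representation."""
    return random.gauss(math.log(n), NOISE)

def error_rate(smaller: int, larger: int) -> float:
    """Fraction of trials in which the noisy code for the smaller number
    comes out above the code for the larger one, a proxy for the slower,
    more error-prone comparisons seen in the experiments."""
    wrong = sum(internal_code(smaller) > internal_code(larger) for _ in range(TRIALS))
    return wrong / TRIALS

# Distance effect: 2 vs. 9 is trivial, 5 vs. 6 is not.
print("2 vs 9:", error_rate(2, 9))
print("5 vs 6:", error_rate(5, 6))
# Size effect: 2 vs. 3 is easier than 7 vs. 8, even though the distance is the same.
print("2 vs 3:", error_rate(2, 3))
print("7 vs 8:", error_rate(7, 8))
```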
In 1987, while Dehaene was still a student in Paris, the American cognitive psychologist Michael Posner and colleagues at Washington University in St. Louis published a pioneering paper in the journal Nature. Using a scanning technique that can track the flow of blood in the brain, Posner’s team had detailed how different areas became active in language processing. Their research was a revelation for Dehaene. “I remember very well sitting and reading this paper, and then debating it with Jacques Mehler, my Ph.D. adviser,” he told me. Mehler, whose focus was on determining the abstract organization of cognitive functions, didn’t see the point of trying to locate precisely where in the brain things happened, but Dehaene wanted to “bridge the gap,” as he put it, between psychology and neurobiology, to find out exactly how the functions of the mind—thought, perception, feeling, will—are realized in the gelatinous three-pound lump of matter in our skulls. Now, thanks to new technologies, it was finally possible to create pictures, however crude, of the brain in the act of thinking. So, after receiving his doctorate, he spent two years studying brain scanning with Posner, who was by then at the University of Oregon, in Eugene. “It was very strange to find that some of the most exciting results of the budding cognitive-neuroscience field were coming out of this small place—the only place where I ever saw sixty-year-old hippies sitting around in tie-dyed shirts!” he said.
Dehaene is a compact, attractive, and genial man; he dresses casually, wears fashionable glasses, and has a glabrous dome of a head, which he protects from the elements with a chapeau de cowboy. When I visited him recently, he had just moved into a new laboratory, known as NeuroSpin, on the campus of a national center for nuclear-energy research, a dozen or so miles southwest of Paris. The building, which was completed a year ago, is a modernist composition in glass and metal filled with the ambient hums and whirs and whooshes of brain-scanning equipment, much of which was still being assembled. A series of arches ran along one wall in the form of a giant sine wave; behind each was a concrete vault built to house a liquid-helium-cooled superconducting electromagnet. (In brain imaging, the more powerful the magnetic field, the sharper the picture.) The new brain scanners are expected to show the human cerebral anatomy at a level of detail never before seen, and may reveal subtle anomalies in the brains of people with dyslexia and with dyscalculia, a crippling deficiency in dealing with numbers which, researchers suspect, may be as widespread as dyslexia. One of the scanners was already up and running. “You don’t wear a pacemaker or anything, do you?” Dehaene asked me as we entered a room where two researchers were fiddling with controls. Although the scanner was built to accommodate humans, inside, I could see from the monitor, was a brown rat. Researchers were looking at how its brain reacted to various odors, which were puffed in every so often. Then Dehaene led me upstairs to a spacious gallery where the brain scientists working at NeuroSpin are expected to congregate and share ideas. At the moment, it was empty. “We’re hoping for a coffee machine,” he said.
Dehaene has become a scanning virtuoso. On returning to France after his time with Posner, he pressed on with the use of imaging technologies to study how the mind processes numbers. The existence of an evolved number ability had long been hypothesized, based on research with animals and infants, and evidence from brain-damaged patients gave clues to where in the brain it might be found. Dehaene set about localizing this facility more precisely and describing its architecture. “In one experiment I particularly liked,” he recalled, “we tried to map the whole parietal lobe in a half hour, by having the subject perform functions like moving the eyes and hands, pointing with fingers, grasping an object, engaging in various language tasks, and, of course, making small calculations, like thirteen minus four. We found there was a beautiful geometrical organization to the areas that were activated. The eye movements were at the back, the hand movements were in the middle, grasping was in the front, and so on. And right in the middle, we were able to confirm, was an area that cared about number.”
The number area lies deep within a fold in the parietal lobe called the intraparietal sulcus (just behind the crown of the head). But it isn’t easy to tell what the neurons there are actually doing. Brain imaging, for all the sophistication of its technology, yields a fairly crude picture of what’s going on inside the skull, and the same spot in the brain might light up for two tasks even though different neurons are involved. “Some people believe that psychology is just being replaced by brain imaging, but I don’t think that’s the case at all,” Dehaene said. “We need psychology to refine our idea of what the imagery is going to show us. That’s why we do behavioral experiments, see patients. It’s the confrontation of all these different methods that creates knowledge.”
Dehaene has been able to bring together the experimental and the theoretical sides of his quest, and, on at least one occasion, he has even theorized the existence of a neurological feature whose presence was later confirmed by other researchers. In the early nineteen-nineties, working with Jean-Pierre Changeux, he set out to create a computer model to simulate the way humans and some animals estimate at a glance the number of objects in their environment. In the case of very small numbers, this estimate can be made with almost perfect accuracy, an ability known as “subitizing” (from the Latin word subitus, meaning “sudden”). Some psychologists think that subitizing is merely rapid, unconscious counting, but others, Dehaene included, believe that our minds perceive up to three or four objects all at once, without having to mentally “spotlight” them one by one. Getting the computer model to subitize the way humans and animals did was possible, he found, only if he built in “number neurons” tuned to fire with maximum intensity in response to a specific number of objects. His model had, for example, a special four neuron that got particularly excited when the computer was presented with four objects. The model’s number neurons were pure theory, but almost a decade later two teams of researchers discovered what seemed to be the real item, in the brains of macaque monkeys that had been trained to do number tasks. The number neurons fired precisely the way Dehaene’s model predicted—a vindication of theoretical psychology. “Basically, we can derive the behavioral properties of these neurons from first principles,” he told me. “Psychology has become a little more like physics.”
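The sketch below illustrates the general idea of number-tuned units with a handful of Gaussian tuning curves laid on a logarithmic axis. It is a toy rendering of the concept only, with tuning widths and the log scale chosen for the example; it is not a reconstruction of the Dehaene-Changeux model or of the macaque recordings.

```python
import math

def number_neuron(preferred: int, presented: int, width: float = 0.25) -> float:
    """Response (0..1) of a unit tuned to `preferred`, assuming a Gaussian
    tuning curve on a logarithmic axis, so tuning broadens for larger numbers."""
    d = math.log(presented) - math.log(preferred)
    return math.exp(-d * d / (2 * width * width))

# A tiny population of units tuned to the numbers 1 through 5.
for preferred in range(1, 6):
    responses = [round(number_neuron(preferred, shown), 2) for shown in range(1, 6)]
    print(f"unit tuned to {preferred}: responses to 1..5 -> {responses}")
# The "four neuron" fires hardest when four objects are shown, and still fires,
# more weakly, for three or five: graded, approximate coding of quantity.
```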
But the brain is the product of evolution—a messy, random process—and though the number sense may be lodged in a particular bit of the cerebral cortex, its circuitry seems to be intermingled with the wiring for other mental functions. A few years ago, while analyzing an experiment on number comparisons, Dehaene noticed that subjects performed better with large numbers if they held the response key in their right hand but did better with small numbers if they held the response key in their left hand. Strangely, if the subjects were made to cross their hands, the effect was reversed. The actual hand used to make the response was, it seemed, irrelevant; it was space itself that the subjects unconsciously associated with larger or smaller numbers. Dehaene hypothesizes that the neural circuitry for number and the circuitry for location overlap. He even suspects that this may be why travellers get disoriented entering Terminal 2 of Paris’s Charles de Gaulle Airport, where small-numbered gates are on the right and large-numbered gates are on the left. “It’s become a whole industry now to see how we associate number to space and space to number,” Dehaene said. “And we’re finding the association goes very, very deep in the brain.”
Last winter, I saw Dehaene in the ornate setting of the Institut de France, across the Seine from the Louvre. There he accepted a prize of a quarter of a million euros from Liliane Bettencourt, whose father created the cosmetics group L’Oréal. In a salon hung with tapestries, Dehaene described his research to a small audience that included a former Prime Minister of France. New techniques of neuroimaging, he explained, promise to reveal how a thought process like calculation unfolds in the brain. This isn’t just a matter of pure knowledge, he added. Since the brain’s architecture determines the sort of abilities that come naturally to us, a detailed understanding of that architecture should lead to better ways of teaching children mathematics and may help close the educational gap that separates children in the West from those in several Asian countries. The fundamental problem with learning mathematics is that while the number sense may be genetic, exact calculation requires cultural tools—symbols and algorithms—that have been around for only a few thousand years and must therefore be absorbed by areas of the brain that evolved for other purposes. The process is made easier when what we are learning harmonizes with built-in circuitry. If we can’t change the architecture of our brains, we can at least adapt our teaching methods to the constraints it imposes.
For nearly two decades, American educators have pushed “reform math,” in which children are encouraged to explore their own ways of solving problems. Before reform math, there was the “new math,” now widely thought to have been an educational disaster. (In France, it was called les maths modernes, and is similarly despised.) The new math was grounded in the theories of the influential Swiss psychologist Jean Piaget, who believed that children are born without any sense of number and only gradually build up the concept in a series of developmental stages. Piaget thought that children, until the age of four or five, cannot grasp the simple principle that moving objects around does not affect how many of them there are, and that there was therefore no point in trying to teach them arithmetic before the age of six or seven.
Piaget’s view had become standard by the nineteen-fifties, but psychologists have since come to believe that he underrated the arithmetic competence of small children. Six-month-old babies, exposed simultaneously to images of common objects and sequences of drumbeats, consistently gaze longer at the collection of objects that matches the number of drumbeats. By now, it is generally agreed that infants come equipped with a rudimentary ability to perceive and represent number. (The same appears to be true for many kinds of animals, including salamanders, pigeons, raccoons, dolphins, parrots, and monkeys.) And if evolution has equipped us with one way of representing number, embodied in the primitive number sense, culture furnishes two more: numerals and number words. These three modes of thinking about number, Dehaene believes, correspond to distinct areas of the brain. The number sense is lodged in the parietal lobe, the part of the brain that relates to space and location; numerals are dealt with by the visual areas; and number words are processed by the language areas.
Nowhere in all this elaborate brain circuitry, alas, is there the equivalent of the chip found in a five-dollar calculator. This deficiency can make learning that terrible quartet—“Ambition, Distraction, Uglification, and Derision,” as Lewis Carroll burlesqued them—a chore. It’s not so bad at first. Our number sense endows us with a crude feel for addition, so that, even before schooling, children can find simple recipes for adding numbers. If asked to compute 2 + 4, for example, a child might start with the first number and then count upward by the second number: “two, three is one, four is two, five is three, six is four, six.” But multiplication is another matter. It is an “unnatural practice,” Dehaene is fond of saying, and the reason is that our brains are wired the wrong way. Neither intuition nor counting is of much use, and multiplication facts must be stored in the brain verbally, as strings of words. The list of arithmetical facts to be memorized may be short, but it is fiendishly tricky: the same numbers occur over and over, in different orders, with partial overlaps and irrelevant rhymes. (Bilinguals, it has been found, revert to the language they used in school when doing multiplication.) The human memory, unlike that of a computer, has evolved to be associative, which makes it ill-suited to arithmetic, where bits of knowledge must be kept from interfering with one another: if you’re trying to retrieve the result of multiplying 7 X 6, the reflex activation of 7 + 6 and 7 X 5 can be disastrous. So multiplication is a double terror: not only is it remote from our intuitive sense of number; it has to be internalized in a form that clashes with the evolved organization of our memory. The result is that when adults multiply single-digit numbers they make mistakes ten to fifteen per cent of the time. For the hardest problems, like 7 X 8, the error rate can exceed twenty-five per cent.
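The counting recipe in the child's example above can be spelled out in a few lines. The sketch below does only that; the function name and structure are mine.

```python
def add_by_counting_on(first: int, second: int) -> int:
    """Add two small numbers the way the child in the example does: start from
    the first number and count upward once for each unit of the second."""
    total = first
    for _ in range(second):
        total += 1          # "three is one, four is two, five is three, six is four"
    return total

print(add_by_counting_on(2, 4))   # -> 6
```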
Our inbuilt ineptness when it comes to more complex mathematical processes has led Dehaene to question why we insist on drilling procedures like long division into our children at all. There is, after all, an alternative: the electronic calculator. “Give a calculator to a five-year-old, and you will teach him how to make friends with numbers instead of despising them,” he has written. By removing the need to spend hundreds of hours memorizing boring procedures, he says, calculators can free children to concentrate on the meaning of these procedures, which is neglected under the educational status quo. This attitude might make Dehaene sound like a natural ally of educators who advocate reform math, and a natural foe of parents who want their children’s math teachers to go “back to basics.” But when I asked him about reform math he wasn’t especially sympathetic. “The idea that all children are different, and that they need to discover things their own way—I don’t buy it at all,” he said. “I believe there is one brain organization. We see it in babies, we see it in adults. Basically, with a few variations, we’re all travelling on the same road.” He admires the mathematics curricula of Asian countries like China and Japan, which provide children with a highly structured experience, anticipating the kind of responses they make at each stage and presenting them with challenges designed to minimize the number of errors. “That’s what we’re trying to get back to in France,” he said. Working with his colleague Anna Wilson, Dehaene has developed a computer game called “The Number Race” to help dyscalculic children. The software is adaptive, detecting the number tasks where the child is shaky and adjusting the level of difficulty to maintain an encouraging success rate of seventy-five per cent.
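The article describes "The Number Race" only as adaptive software that holds the child's success rate near seventy-five per cent. One generic way to achieve that target is a weighted up/down staircase, sketched below; this illustrates the technique under assumed parameters, and is not the game's actual algorithm.

```python
import math
import random

def chance_of_success(difficulty: float, skill: float) -> float:
    """Simulated child: the odds of a correct answer fall off as the
    difficulty of the number task climbs past the child's skill level."""
    return 1.0 / (1.0 + math.exp(difficulty - skill))

def adaptive_session(skill: float = 2.0, trials: int = 2000) -> float:
    """Weighted up/down staircase: a small step harder after each success and a
    three-times-larger step easier after each failure, which settles where
    successes are three times as common as failures, i.e. about 75 per cent."""
    difficulty, wins = 0.0, 0
    for _ in range(trials):
        if random.random() < chance_of_success(difficulty, skill):
            wins += 1
            difficulty += 0.1    # nudge the tasks a bit harder
        else:
            difficulty -= 0.3    # back off more sharply after an error
    return wins / trials

print(f"observed success rate: {adaptive_session():.2f}")   # roughly 0.75
```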
Despite our shared brain organization, cultural differences in how we handle numbers persist, and they are not confined to the classroom. Evolution may have endowed us with an approximate number line, but it takes a system of symbols to make numbers precise—to “crystallize” them, in Dehaene’s metaphor. The Mundurukú, an Amazon tribe that Dehaene and colleagues, notably the linguist Pierre Pica, have studied recently, have words for numbers only up to five. (Their word for five literally means “one hand.”) Even these words seem to be merely approximate labels for them: a Mundurukú who is shown three objects will sometimes say there are three, sometimes four. Nevertheless, the Mundurukú have a good numerical intuition. “They know, for example, that fifty plus thirty is going to be larger than sixty,” Dehaene said. “Of course, they do not know this verbally and have no way of talking about it. But when we showed them the relevant sets and transformations they immediately got it.”
The Mundurukú, it seems, have developed few cultural tools to augment the inborn number sense. Interestingly, the very symbols with which we write down the counting numbers bear the trace of a similar stage. The first three Roman numerals, I, II, and III, were formed by using the symbol for one as many times as necessary; the symbol for four, IV, is not so transparent. The same principle applies to Chinese numerals: the first three consist of one, two, and three horizontal bars, but the fourth takes a different form. Even Arabic numerals follow this logic: 1 is a single vertical bar; 2 and 3 began as two and three horizontal bars tied together for ease of writing. (“That’s a beautiful little fact, but I don’t think it’s coded in our brains any longer,” Dehaene observed.)
Today, Arabic numerals are in use pretty much around the world, while the words with which we name numbers naturally differ from language to language. And, as Dehaene and others have noted, these differences are far from trivial. English is cumbersome. There are special words for the numbers from 11 to 19, and for the decades from 20 to 90. This makes counting a challenge for English-speaking children, who are prone to such errors as “twenty-eight, twenty-nine, twenty-ten, twenty-eleven.” French is just as bad, with vestigial base-twenty monstrosities, like quatre-vingt-dix-neuf (“four twenty ten nine”) for 99. Chinese, by contrast, is simplicity itself; its number syntax perfectly mirrors the base-ten form of Arabic numerals, with a minimum of terms. Consequently, the average Chinese four-year-old can count up to forty, whereas American children of the same age struggle to get to fifteen. And the advantages extend to adults. Because Chinese number words are so brief—they take less than a quarter of a second to say, on average, compared with a third of a second for English—the average Chinese speaker has a memory span of nine digits, versus seven digits for English speakers. (Speakers of the marvellously efficient Cantonese dialect, common in Hong Kong, can juggle ten digits in active memory.)
In 2005, Dehaene was elected to the chair in experimental cognitive psychology at the Collège de France, a highly prestigious institution founded by Francis I in 1530. The faculty consists of just fifty-two scholars, and Dehaene is the youngest member. In his inaugural lecture, Dehaene marvelled at the fact that mathematics is simultaneously a product of the human mind and a powerful instrument for discovering the laws by which the human mind operates. He spoke of the confrontation between new technologies like brain imaging and ancient philosophical questions concerning number, space, and time. And he pronounced himself lucky to be living in an era when advances in psychology and neuroimaging are combining to “render visible” the hitherto invisible realm of thought.
For Dehaene, numerical thought is only the beginning of this quest. Recently, he has been pondering how the philosophical problem of consciousness might be approached by the methods of empirical science. Experiments involving subliminal “number priming” show that much of what our mind does with numbers is unconscious, a finding that has led Dehaene to wonder why some mental activity crosses the threshold of awareness and some doesn’t. Collaborating with a couple of colleagues, Dehaene has explored the neural basis of what is known as the “global workspace” theory of consciousness, which has elicited keen interest among philosophers. In his version of the theory, information becomes conscious when certain “workspace” neurons broadcast it to many areas of the brain at once, making it simultaneously available for, say, language, memory, perceptual categorization, action-planning, and so on. In other words, consciousness is “cerebral celebrity,” as the philosopher Daniel Dennett has described it, or “fame in the brain.”
In his office at NeuroSpin, Dehaene described to me how certain extremely long workspace neurons might link far-flung areas of the human brain together into a single pulsating circuit of consciousness. To show me where these areas were, he reached into a closet and pulled out an irregularly shaped baby-blue plaster object, about the size of a softball. “This is my brain!” he announced with evident pleasure. The model that he was holding had been fabricated, he explained, by a rapid-prototyping machine (a sort of three-dimensional printer) from computer data obtained from one of the many MRI scans that he has undergone. He pointed to the little furrow where the number sense was supposed to be situated, and observed that his had a somewhat uncommon shape. Curiously, the computer software had identified Dehaene’s brain as an “outlier,” so dissimilar are its activation patterns from the human norm. Cradling the pastel-colored lump in his hands, a model of his mind devised by his own mental efforts, Dehaene paused for a moment. Then he smiled and said, “So, I kind of like my brain.”
Thursday, March 20, 2008
How Running Made Us Human: Endurance Running Let Us Evolve To Look The Way We Do
http://www.sciencedaily.com/releases/2004/11/041123163757.htm
ScienceDaily (Nov. 24, 2004) — Humans evolved from ape-like ancestors because they needed to run long distances – perhaps to hunt animals or scavenge carcasses on Africa's vast savannah – and the ability to run shaped our anatomy, making us look like we do today.
That is the conclusion of a study published in the Nov. 18 issue of the journal Nature by University of Utah biologist Dennis Bramble and Harvard University anthropologist Daniel Lieberman. The study is featured on Nature's cover.
Bramble and Lieberman argue that our genus, Homo, evolved from more ape-like human ancestors, Australopithecus, 2 million or more years ago because natural selection favored the survival of australopithecines that could run and, over time, favored the perpetuation of human anatomical features that made long-distance running possible.
"We are very confident that strong selection for running – which came at the expense of the historical ability to live in trees – was instrumental in the origin of the modern human body form," says Bramble, a professor of biology. "Running has substantially shaped human evolution. Running made us human – at least in an anatomical sense. We think running is one of the most transforming events in human history. We are arguing the emergence of humans is tied to the evolution of running."
That conclusion is contrary to the conventional theory that running was simply a byproduct of the human ability to walk. Bipedalism – the ability to walk upright on two legs – evolved in the ape-like Australopithecus at least 4.5 million years ago while they also retained the ability to travel through the trees. Yet Homo with its "radically transformed body" did not evolve for another 3 million or more years – Homo habilis, Homo erectus and, finally, our species, Homo sapiens – so the ability to walk cannot explain the anatomy of the modern human body, Bramble says.
"There were 2.5 million to 3 million years of bipedal walking [by australopithecines] without ever looking like a human, so is walking going to be what suddenly transforms the hominid body?" he asks. "We're saying, no, walking won't do that, but running will."
Walking cannot explain most of the changes in body form that distinguish Homo from Australopithecus, which – when compared with Homo – had short legs, long forearms, high permanently "shrugged" shoulders, ankles that were not visibly apparent and more muscles connecting the shoulders to the head and neck, Bramble says. If natural selection had not favored running, "we would still look a lot like apes," he adds.
I Run, Therefore I Am
Bramble and Lieberman examined 26 traits of the human body – many also seen in fossils of Homo erectus and some in Homo habilis – that enhanced the ability to run. Only some of them were needed for walking. Traits that aided running include leg and foot tendons and ligaments that act like springs, foot and toe structure that allows efficient use of the feet to push off, shoulders that rotate independently of the head and neck to allow better balance, and skeletal and muscle features that make the human body stronger, more stable and able to run more efficiently without overheating.
"We explain the simultaneous emergence of a whole bunch of anatomical features, literally from head to toe," Bramble says. "We have a hypothesis that gives a functional explanation for how these features are linked to the unique mechanical demands of running, how they work together and why they emerged at the same time."
Humans are poor sprinters compared with other running animals, which is partly why many scientists have dismissed running as a factor in human evolution. Human endurance running ability has been inadequately appreciated because of a failure to recognize that "high speed is not always important," Bramble says. "What is important is combining reasonable speed with exceptional endurance."
Another reason is that "scientists are in developed societies that are highly dependent on technology and artificial means of transport," he adds. "But if those scientists had been embedded in a hunter-gatherer society, they'd have a different view of human locomotor abilities, including running."
Why Did Humans Start Running?
The researchers do not know why natural selection favored human ancestors who could run long distances. For one possibility, they cite previous research by University of Utah biologist David Carrier, who hypothesized that endurance running evolved in human ancestors so they could pursue prey long before the development of bows, arrows, nets and spear-throwers reduced the need to run long distances.
Another possibility is that early humans and their immediate ancestors ran to scavenge carcasses of dead animals – maybe so they could beat hyenas or other scavengers to dinner, or maybe to "get to the leftovers soon enough," Bramble says.
Scavenging "is a more reliable source of food" than hunting, he adds. "If you are out in the African savannah and see a column of vultures on the horizon, the chance of there being a fresh carcass underneath the vultures is about 100 percent. If you are going to hunt down something in the heat, that's a lot more work and the payoffs are less reliable" because the animal you are hunting often is "faster than you are."
Anatomical Features that Help Humans Run
Here are anatomical characteristics that are unique to humans and that play a role in helping people run, according to the study:
# Skull features that help prevent overheating during running. As sweat evaporates from the scalp, forehead and face, the evaporation cools blood draining from the head. Veins carrying that cooled blood pass near the carotid arteries, thus helping cool blood flowing through the carotids to the brain.
# A more balanced head with a flatter face, smaller teeth and short snout, compared with australopithecines. That "shifts the center of mass back so it's easier to balance your head when you are bobbing up and down running," Bramble says.
# A ligament that runs from the back of the skull and neck down to the thoracic vertebrae, and acts as a shock absorber and helps the arms and shoulders counterbalance the head during running.
# Unlike apes and australopithecines, the shoulders in early humans were "decoupled" from the head and neck, allowing the body to rotate while the head aims forward during running.
# The tall human body – with a narrow trunk, waist and pelvis – creates more skin surface for our size, permitting greater cooling during running. It also lets the upper and lower body move independently, "which allows you to use your upper body to counteract the twisting forces from your swinging legs," Bramble says.
# Shorter forearms in humans make it easier for the upper body to counterbalance the lower body during running. They also reduce the amount of muscle power needed to keep the arms flexed when running.
# Human vertebrae and disks are larger in diameter relative to body mass than are those in apes or australopithecines. "This is related to shock absorption," says Bramble. "It allows the back to take bigger loads when human runners hit the ground."
# The connection between the pelvis and spine is stronger and larger relative to body size in humans than in their ancestors, providing more stability and shock absorption during running.
# Human buttocks "are huge," says Bramble. "Have you ever looked at an ape? They have no buns." He says human buttocks "are muscles critical for stabilization in running" because they connect the femur – the large bone in each upper leg – to the trunk. Because people lean forward at the hip during running, the buttocks "keep you from pitching over on your nose each time a foot hits the ground."
# Long legs, which chimps and australopithecines lack, let humans take huge strides when running, Bramble says. So do ligaments and tendons – including the long Achilles tendon – which act like springs that store and release mechanical energy during running. The tendons and ligaments also make human lower legs less muscular and lighter, requiring less energy to move them during running.
# Larger surface areas in the hip, knee and ankle joints, for improved shock absorption during running by spreading out the forces.
# The arrangement of bones in the human foot creates a stable or stiff arch that makes the whole foot more rigid, so the human runner can push off the ground more efficiently and utilize ligaments on the bottom of the feet as springs.
# Humans also evolved with an enlarged heel bone for better shock absorption, as well as shorter toes and a big toe that is fully drawn in toward the other toes for better pushing off during running.
The study by Bramble and Lieberman concludes: "Today, endurance running is primarily a form of exercise and recreation, but its roots may be as ancient as the origin of the human genus, and its demands a major contributing factor to the human body form."
Adapted from materials provided by University Of Utah.
Tuesday, March 18, 2008
Carbon Footprints (New Yorker article)
EXCERPTS
Each glass of orange juice, for example, contains the equivalent of two glasses of petrol once the transport costs are included. Worse still are highly perishable fresh foods that have been flown in from far away—green beans from Kenya or lettuce from the U.S. They may be worth several times their weight in jet fuel once the transport costs are factored in.
...
Agricultural researchers at the University of Iowa have reported that the food miles attached to items that one buys in a grocery store are twenty-seven times higher than those for goods bought from local sources. American produce travels an average of nearly fifteen hundred miles before we eat it. Roughly forty per cent of our fruit comes from overseas and, even though broccoli is a vigorous plant grown throughout the country, the broccoli we buy in a supermarket is likely to have been shipped eighteen hundred miles in a refrigerated truck. Although there are vast herds of cattle in the U.S., we import ten per cent of our red meat, often from as far away as Australia or New Zealand.
...
Sea-freight emissions are less than a sixtieth of those associated with airplanes, and you don’t have to build highways to berth a ship. Last year, a study of the carbon cost of the global wine trade found that it is actually more “green” for New Yorkers to drink wine from Bordeaux, which is shipped by sea, than wine from California, sent by truck.
...
lamb raised in New Zealand and shipped eleven thousand miles by boat to England produced six hundred and eighty-eight kilograms of carbon-dioxide emissions per ton, about a fourth the amount produced by British lamb. In part, that is because pastures in New Zealand need far less fertilizer than most grazing land in Britain (or in many parts of the United States). Similarly, importing beans from Uganda or Kenya—where the farms are small, tractor use is limited, and the fertilizer is almost always manure—tends to be more efficient than growing beans in Europe, with its reliance on energy-dependent irrigation systems.
...
Nonetheless, the carbon footprint of the roses from Holland—which are almost always grown in a heated greenhouse—was six times the footprint of those shipped from Kenya.
...
airplanes at high altitudes release at least ten times as many greenhouse gases per mile as trains do.
...
"Detroit will fall apart. I think Ford”—a company that Elkington has advised for years—“will fall apart. They have just made too many bets on the wrong things. A bunch of the institutions that we rely on currently will, to some degree, decompose. I believe that much of what we count as democratic politics today will fall apart, because we are simply not going to be able to deal with the scale of change that we are about to face. It will profoundly disable much of the current political class.”
WHOLE ARTICLE
------
http://www.newyorker.com/reporting/2008/02/25/080225fa_fact_specter
Big Foot
In measuring carbon emissions, it’s easy to confuse morality and science.
by Michael Specter February 25, 2008
A little more than a year ago, Sir Terry Leahy, who is the chief executive of the Tesco chain of supermarkets, Britain’s largest retailer, delivered a speech to a group called the Forum for the Future, about the implications of climate change. Leahy had never before addressed the issue in public, but his remarks left little doubt that he recognized the magnitude of the problem. “I am not a scientist,” he said. “But I listen when the scientists say that, if we fail to mitigate climate change, the environmental, social, and economic consequences will be stark and severe. . . . There comes a moment when it is clear what you must do. I am determined that Tesco should be a leader in helping to create a low-carbon economy. In saying this, I do not underestimate the task. It is to take an economy where human comfort, activity, and growth are inextricably linked with emitting carbon and to transform it into one which can only thrive without depending on carbon. This is a monumental challenge. It requires a revolution in technology and a revolution in thinking. We are going to have to rethink the way we live and work.”
Tesco sells nearly a quarter of the groceries bought in the United Kingdom, it possesses a growing share of the markets in Asia and Europe, and late last year the chain opened its first stores in America. Few corporations could have a more visible—or forceful—impact on the lives of their customers. In his speech, Leahy, who is fifty-two, laid out a series of measures that he hoped would ignite “a revolution in green consumption.” He announced that Tesco would cut its energy use in half by 2010, drastically limit the number of products it transports by air, and place airplane symbols on the packaging of those which it does. More important, in an effort to help consumers understand the environmental impact of the choices they make every day, he told the forum that Tesco would develop a system of carbon labels and put them on each of its seventy thousand products. “Customers want us to develop ways to take complicated carbon calculations and present them simply,” he said. “We will therefore begin the search for a universally accepted and commonly understood measure of the carbon footprint of every product we sell—looking at its complete life cycle, from production through distribution to consumption. It will enable us to label all our products so that customers can compare their carbon footprint as easily as they can currently compare their price or their nutritional profile.”
Leahy’s sincerity was evident, but so was his need to placate his customers. Studies have consistently demonstrated that, given a choice, people prefer to buy products that are environmentally benign. That choice, however, is almost never easy. “A carbon label will put the power in the hands of consumers to choose how they want to be green,” Tom Delay, the head of the British government’s Carbon Trust, said. “It will empower us all to make informed choices and in turn drive a market for low-carbon products.” Tesco was not alone in telling people what it would do to address the collective burden of our greenhouse-gas emissions. Compelled by economic necessity as much as by ecological awareness, many corporations now seem to compete as vigorously to display their environmental credentials as they do to sell their products.
In Britain, Marks & Spencer has set a goal of recycling all its waste, and intends to become carbon-neutral by 2012—the equivalent, it claims, of taking a hundred thousand cars off the road every year. Kraft Foods recently began to power part of a New York plant with methane produced by adding bacteria to whey, a by-product of cream cheese. Not to be outdone, Sara Lee will deploy solar panels to run one of its bakeries, in New Mexico. Many airlines now sell “offsets,” which offer passengers a way to invest in projects that reduce CO2 emissions. In theory, that would compensate for the greenhouse gas caused by their flights. This year’s Super Bowl was fuelled by wind turbines. There are carbon-neutral investment banks, carbon-neutral real-estate brokerages, carbon-neutral taxi fleets, and carbon-neutral dental practices. Detroit, arguably America’s most vivid symbol of environmental excess, has also staked its claim. (“Our designers know green is the new black,” Ford declares on its home page. General Motors makes available hundreds of green pictures, green stories, and green videos to anyone who wants them.)
Possessing an excessive carbon footprint is rapidly becoming the modern equivalent of wearing a scarlet letter. Because neither the goals nor acceptable emissions limits are clear, however, morality is often mistaken for science. A recent article in New Scientist suggested that the biggest problem arising from the epidemic of obesity is the additional carbon burden that fat people—who tend to eat a lot of meat and travel mostly in cars—place on the environment. Australia briefly debated imposing a carbon tax on families with more than two children; the environmental benefits of abortion have been discussed widely (and simplistically). Bishops of the Church of England have just launched a “carbon fast,” suggesting that during Lent parishioners, rather than giving up chocolate, forgo carbon. (Britons generate an average of a little less than ten tons of carbon per person each year; in the United States, the number is about twice that.)
Greenhouse-gas emissions have risen rapidly in the past two centuries, and levels today are higher than at any time in at least the past six hundred and fifty thousand years. In 1995, each of the six billion people on earth was responsible, on average, for one ton of carbon emissions. Oceans and forests can absorb about half that amount. Although specific estimates vary, scientists and policy officials increasingly agree that allowing emissions to continue at the current rate would induce dramatic changes in the global climate system. To avoid the most catastrophic effects of those changes, we will have to hold emissions steady in the next decade, then reduce them by at least sixty to eighty per cent by the middle of the century. (A delay of just ten years in stopping the increase would require double the reductions.) Yet, even if all carbon emissions stopped today, the earth would continue to warm for at least another century. Facts like these have transformed carbon dioxide into a strange but powerful new currency, difficult to evaluate yet impossible to ignore.
A person’s carbon footprint is simply a measure of his contribution to global warming. (CO2 is the best known of the gases that trap heat in the atmosphere, but others—including water vapor, methane, and nitrous oxide—also play a role.) Virtually every human activity—from watching television to buying a quart of milk—has some carbon cost associated with it. We all consume electricity generated by burning fossil fuels; most people rely on petroleum for transportation and heat. Emissions from those activities are not hard to quantify. Watching a plasma television for three hours every day contributes two hundred and fifty kilograms of carbon to the atmosphere each year; an LCD television is responsible for less than half that number. Yet the calculations required to assess the full environmental impact of how we live can be dazzlingly complex. To sum them up on a label will not be easy. Should the carbon label on a jar of peanut butter include the emissions caused by the fertilizer, calcium, and potassium applied to the original crop of peanuts? What about the energy used to boil the peanuts once they have been harvested, or to mold the jar and print the labels? Seen this way, carbon costs multiply rapidly. A few months ago, scientists at the Stockholm Environment Institute reported that the carbon footprint of Christmas—including food, travel, lighting, and gifts—was six hundred and fifty kilograms per person. That is as much, they estimated, as the weight of “one thousand Christmas puddings” for every resident of England.
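To make the arithmetic behind such figures concrete, here is a toy version of the television comparison. The wattages and the grid carbon intensity below are assumed round numbers chosen for illustration; the article supplies only the final annual figures.

```python
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365
GRID_KG_CO2_PER_KWH = 0.5      # assumed grid carbon intensity (illustrative)

def annual_tv_footprint_kg(power_watts: float) -> float:
    """Kilograms of CO2 per year from running a television of the given power
    for three hours a day: energy in kWh times the grid's carbon intensity."""
    kwh_per_year = (power_watts / 1000) * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * GRID_KG_CO2_PER_KWH

# Assumed wattages: a plasma set drawing a bit more than twice the power of an
# LCD reproduces the article's "less than half" relationship, not its exact figures.
print(f"plasma, 450 W: {annual_tv_footprint_kg(450):.0f} kg CO2 per year")
print(f"LCD,    200 W: {annual_tv_footprint_kg(200):.0f} kg CO2 per year")
```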
As a source of global warming, the food we eat—and how we eat it—is no more significant than the way we make clothes or travel or heat our homes and offices. It certainly doesn’t compare to the impact made by tens of thousands of factories scattered throughout the world. Yet food carries enormous symbolic power, so the concept of “food miles”—the distance a product travels from the farm to your home—is often used as a kind of shorthand to talk about climate change in general. “We have to remember our goal: reduce emissions of greenhouse gases,” John Murlis told me not long ago when we met in London. “That should be the world’s biggest priority.” Murlis is the chief scientific adviser to the Carbon Neutral Company, which helps corporations adopt policies to reduce their carbon footprint as well as those of the products they sell. He has also served as the director of strategy and chief scientist for Britain’s Environment Agency. Murlis worries that in our collective rush to make choices that display personal virtue we may be losing sight of the larger problem. “Would a carbon label on every product help us?” he asked. “I wonder. You can feel very good about the organic potatoes you buy from a farm near your home, but half the emissions—and half the footprint—from those potatoes could come from the energy you use to cook them. If you leave the lid off, boil them at a high heat, and then mash your potatoes, from a carbon standpoint you might as well drive to McDonald’s and spend your money buying an order of French fries.”
One particularly gray morning last December, I visited a Tesco store on Warwick Way, in the Pimlico section of London. Several food companies have promised to label their products with the amount of carbon-dioxide emissions associated with making and transporting them. Last spring, Walkers crisps (potato chips) became the first of them to reach British stores, and they are still the only product on the shelves there with a carbon label. I walked over to the crisp aisle, where a young couple had just tossed three bags of Walkers Prawn Cocktail crisps into their shopping cart. The man was wearing fashionable jeans and sneakers without laces. His wife was toting a huge Armani Exchange bag on one arm and dragging their four-year-old daughter with the other. I asked if they paid attention to labels. “Of course,” the man said, looking a bit insulted. He was aware that Walkers had placed a carbon label on the back of its crisp packages; he thought it was a good idea. He just wasn’t sure what to make of the information.
Few people are. In order to develop the label for Walkers, researchers had to calculate the amount of energy required to plant seeds for the ingredients (sunflower oil and potatoes), as well as to make the fertilizers and pesticides used on those potatoes. Next, they factored in the energy required for diesel tractors to collect the potatoes, then the effects of chopping, cleaning, storing, and bagging them. The packaging and printing processes also emit carbon dioxide and other greenhouse gases, as does the petroleum used to deliver those crisps to stores. Finally, the research team assessed the impact of throwing the empty bags in the trash, collecting the garbage in a truck, driving to a landfill, and burying them. In the end, the researchers—from the Carbon Trust—found that seventy-five grams of greenhouse gases are expended in the production of every individual-size bag of potato chips.
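The number on the label is, in effect, a sum over those life-cycle stages. The sketch below shows that bookkeeping with placeholder per-stage values chosen only so the total matches the reported seventy-five grams; the Carbon Trust's real breakdown is not given in the article.

```python
# Grams of CO2-equivalent per individual-size bag, by life-cycle stage. The stage
# names follow the article; the per-stage figures are invented placeholders that
# merely sum to the reported 75 g total.
LIFE_CYCLE_STAGES_G = {
    "growing the potatoes and sunflowers (seed, fertilizer, pesticide)": 30,
    "harvesting with diesel tractors": 5,
    "chopping, cleaning, storing, and bagging": 25,
    "packaging and printing": 8,
    "delivering the crisps to stores": 5,
    "collecting and landfilling the empty bags": 2,
}

for stage, grams in LIFE_CYCLE_STAGES_G.items():
    print(f"{grams:>3} g  {stage}")
print(f"{sum(LIFE_CYCLE_STAGES_G.values()):>3} g  total per bag")
```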
“Crisps are easy,” Murlis had told me. “They have only one important ingredient, and the potatoes are often harvested near the factory.” We were sitting in a deserted hotel lounge in Central London, and Murlis stirred his tea slowly, then frowned. “Let’s just assume every mother cares about the environment—what then?” he asked. “Should the carbon content matter more to her than the fat content or the calories in the products she buys?”
I put that question to the next shopper who walked by, Chantal Levi, a Frenchwoman who has lived in London for thirty-two years. I watched her grab a large bag of Doritos and then, shaking her head, return it to the shelf. “Too many carbohydrates,” she said. “I try to watch that, but between the carbs and the fat and the protein it can get to be a bit complicated. I try to buy locally grown, organic food,” she continued. “It tastes better, and it’s far less harmful to the environment.” I asked if she was willing to pay more for products that carried carbon labels. “Of course,” she said. “I care about that. I don’t want my food flown across the world when I can get it close to home. What a waste.”
It is a logical and widely held assumption that the ecological impacts of transporting food—particularly on airplanes over great distances—are far more significant than if that food were grown locally. There are countless books, articles, Web sites, and organizations that promote the idea. There is even a “100-Mile Diet,” which encourages participants to think about “local eating for global change.” Eating locally produced food has become such a phenomenon, in fact, that the word “locavore” was just named the 2007 word of the year by the New Oxford American Dictionary.
Paying attention to the emissions associated with what we eat makes obvious sense. It is certainly hard to justify importing bottled water from France, Finland, or Fiji to a place like New York, which has perhaps the cleanest tap water of any major American city. Yet, according to one recent study, factories throughout the world are burning eighteen million barrels of oil and consuming forty-one billion gallons of fresh water every day, solely to make bottled water that most people in the U.S. don’t need.
“Have a quick rifle through your cupboards and fridge and jot down a note of the countries of origin for each food product,” Mark Lynas wrote in his popular handbook “Carbon Counter,” published last year by HarperCollins. “The further the distance it has travelled, the bigger the carbon penalty. Each glass of orange juice, for example, contains the equivalent of two glasses of petrol once the transport costs are included. Worse still are highly perishable fresh foods that have been flown in from far away—green beans from Kenya or lettuce from the U.S. They may be worth several times their weight in jet fuel once the transport costs are factored in.”
Agricultural researchers at the University of Iowa have reported that the food miles attached to items that one buys in a grocery store are twenty-seven times higher than those for goods bought from local sources. American produce travels an average of nearly fifteen hundred miles before we eat it. Roughly forty per cent of our fruit comes from overseas and, even though broccoli is a vigorous plant grown throughout the country, the broccoli we buy in a supermarket is likely to have been shipped eighteen hundred miles in a refrigerated truck. Although there are vast herds of cattle in the U.S., we import ten per cent of our red meat, often from as far away as Australia or New Zealand.
In his speech last year, Sir Terry Leahy promised to limit the products that Tesco imports by air to less than one per cent. In the United States, many similar efforts are under way. Yet the relationship between food miles and carbon footprints is not nearly as clear as it might seem. That is often true even when the environmental impact of shipping goods by air is taken into consideration. “People should stop talking about food miles,” Adrian Williams told me. “It’s a foolish concept: provincial, damaging, and simplistic.” Williams is an agricultural researcher in the Natural Resources Department of Cranfield University, in England. He has been commissioned by the British government to analyze the relative environmental impacts of a number of foods. “The idea that a product travels a certain distance and is therefore worse than one you raised nearby—well, it’s just idiotic,” he said. “It doesn’t take into consideration the land use, the type of transportation, the weather, or even the season. Potatoes you buy in winter, of course, have a far higher environmental ticket than if you were to buy them in August.” Williams pointed out that when people talk about global warming they usually speak only about carbon dioxide. Making milk or meat contributes less CO2 to the atmosphere than building a house or making a washing machine. But the animals produce methane and nitrous oxide, and those are greenhouse gases, too. “This is not an equation like the number of calories or even the cost of a product,” he said. “There is no one number that works.”
Many factors influence the carbon footprint of a product: water use, cultivation and harvesting methods, quantity and type of fertilizer, even the type of fuel used to make the package. Sea-freight emissions are less than a sixtieth of those associated with airplanes, and you don’t have to build highways to berth a ship. Last year, a study of the carbon cost of the global wine trade found that it is actually more “green” for New Yorkers to drink wine from Bordeaux, which is shipped by sea, than wine from California, sent by truck. That is largely because shipping wine is mostly shipping glass. The study found that “the efficiencies of shipping drive a ‘green line’ all the way to Columbus, Ohio, the point where a wine from Bordeaux and Napa has the same carbon intensity.”
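The arithmetic behind that wine finding is easy to sketch. What follows is a minimal, purely illustrative calculation, with editorial placeholder emission factors for sea and road freight and a hypothetical case weight; it is meant only to show why a long sea leg can beat a shorter truck leg, and does not reproduce the study’s own data.

# Back-of-envelope freight comparison; all numbers are illustrative assumptions.
SEA_KG_PER_TON_KM = 0.01     # assumed kg of CO2 to move one ton one km by ship
TRUCK_KG_PER_TON_KM = 0.10   # assumed kg of CO2 per ton-km by long-haul truck
CASE_WEIGHT_TONS = 0.015     # assumed weight of a twelve-bottle case, glass included

def freight_co2(distance_km, factor):
    """CO2, in kilograms, to move one case the given distance."""
    return CASE_WEIGHT_TONS * distance_km * factor

# Bordeaux to New York: mostly a sea crossing (assumed ~6,000 km by ship,
# plus short truck legs at either end).
bordeaux = freight_co2(6000, SEA_KG_PER_TON_KM) + freight_co2(300, TRUCK_KG_PER_TON_KM)
# Napa to New York: trucked across the country (assumed ~4,700 km).
napa = freight_co2(4700, TRUCK_KG_PER_TON_KM)

print(f"Bordeaux case, mostly by sea: {bordeaux:.1f} kg CO2")
print(f"Napa case, by truck:          {napa:.1f} kg CO2")

Under these assumptions the transatlantic case comes out several times lighter in carbon than the cross-country one; slide the truck route’s starting point eastward until the two totals meet and you have, in effect, the study’s ‘green line.’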
The environmental burden imposed by importing apples from New Zealand to Northern Europe or New York can be lower than if the apples were raised fifty miles away. “In New Zealand, they have more sunshine than in the U.K., which helps productivity,” Williams explained. That means the yield of New Zealand apples far exceeds the yield of those grown in northern climates, so the energy required for farmers to grow the crop is correspondingly lower. It also helps that the electricity in New Zealand is mostly generated by renewable sources, none of which emit large amounts of CO2. Researchers at Lincoln University, in Christchurch, found that lamb raised in New Zealand and shipped eleven thousand miles by boat to England produced six hundred and eighty-eight kilograms of carbon-dioxide emissions per ton, about a fourth the amount produced by British lamb. In part, that is because pastures in New Zealand need far less fertilizer than most grazing land in Britain (or in many parts of the United States). Similarly, importing beans from Uganda or Kenya—where the farms are small, tractor use is limited, and the fertilizer is almost always manure—tends to be more efficient than growing beans in Europe, with its reliance on energy-dependent irrigation systems.
Williams and his colleagues recently completed a study that examined the environmental costs of buying roses shipped to England from Holland and of those exported (and sent by air) from Kenya. In each case, the team made a complete life-cycle analysis of twelve thousand rose stems for sale in February—in which all the variables, from seeds to store, were taken into consideration. They even multiplied the CO2 emissions for the air-freighted Kenyan roses by a factor of nearly three, to account for the increased effect of burning fuel at a high altitude. Nonetheless, the carbon footprint of the roses from Holland—which are almost always grown in a heated greenhouse—was six times the footprint of those shipped from Kenya. Even Williams was surprised by the magnitude of the difference. “Everyone always wants to make ethical choices about the food they eat and the things they buy,” he told me. “And they should. It’s just that what seems obvious often is not. And we need to make sure people understand that before they make decisions on how they ought to live.”
How do we alter human behavior significantly enough to limit global warming? Personal choices, no matter how virtuous, cannot do enough. It will also take laws and money. For decades, American utilities built tall smokestacks, hoping to keep the pollutants they emitted away from people who lived nearby. As emissions are forced into the atmosphere, however, they react with water molecules and then are often blown great distances by prevailing winds, which in the United States tend to move from west to east. Those emissions—principally sulfur dioxide produced by coal-burning power plants—are the primary source of acid rain, and by the nineteen-seventies it had become clear that they were causing grave damage to the environment, and to the health of many Americans. Adirondack Park, in upstate New York, suffered more than anywhere else: hundreds of streams, ponds, and lakes there became so acidic that they could no longer support plant life or fish. Members of Congress tried repeatedly to introduce legislation to reduce sulfur-dioxide levels, but the Reagan Administration (as well as many elected officials, both Democratic and Republican, from regions where sulfur-rich coal is mined) opposed any controls, fearing that they would harm the economy. When the cost of polluting is negligible, so are the incentives to reduce emissions.
“We had a complete disaster on our hands,” Richard Sandor told me recently, when I met with him at his office at the Chicago Climate Exchange. Sandor, a dapper sixty-six-year-old man in a tan cable-knit cardigan and round, horn-rimmed glasses, is the exchange’s chairman and C.E.O. In most respects, the exchange operates like any other market. Instead of pork-belly futures or gold, however, CCX members buy and sell the right to pollute. Each makes a voluntary (but legally binding) commitment to reduce emissions of greenhouse gases—including carbon dioxide, methane, and nitrous oxide—and hydrofluorocarbons. Four hundred corporations now belong to the exchange, including a growing percentage of America’s largest manufacturers. The members agree to reduce their emissions by a certain amount every year, a system commonly known as cap and trade. A baseline target, or cap, is established, and companies whose emissions fall below that cap receive allowances, which they can sell (or save to use later). Companies whose emissions exceed the limit are essentially fined and forced to buy credits to compensate for their excess.
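The accounting Sandor describes is simple enough to write down. Here is a minimal sketch of the cap-and-trade ledger, with invented member names and tonnages; it is not the CCX’s actual rulebook, only the allowance logic described above, priced at the futures bid quoted below.

# Minimal cap-and-trade bookkeeping; member names and tonnages are invented.
CARBON_PRICE = 2.15  # dollars per ton, the 2008 futures bid on the CCX screen

members = {
    # name: (cap in tons, actual emissions in tons) -- hypothetical
    "Utility A":      (1_000_000, 900_000),
    "Manufacturer B": (  500_000, 560_000),
}

for name, (cap, emitted) in members.items():
    surplus = cap - emitted  # positive: allowances to sell or bank; negative: must buy
    if surplus >= 0:
        print(f"{name}: {surplus:,} tons under its cap, "
              f"worth about ${surplus * CARBON_PRICE:,.0f} in allowances")
    else:
        print(f"{name}: {-surplus:,} tons over its cap, "
              f"must buy about ${-surplus * CARBON_PRICE:,.0f} in allowances")

The design choice that matters is that the cap, not the price, is fixed: the market discovers what a ton of avoided emissions is worth, and one member’s overage becomes another member’s income.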
Sandor led me to the “trading floor,” which, like most others these days, is a virtual market populated solely by computers. “John, can you get the carbon futures up on the big screen?” Sandor yelled to one of his colleagues. Suddenly, a string of blue numbers slid across the monitor. “There is our 2008 price,” Sandor said. Somebody had just bid two dollars and fifteen cents per ton for carbon futures.
A former Berkeley economics professor and chief economist at the Chicago Board of Trade, Sandor is known as the “father of financial futures.” In the nineteen-seventies, he devised a market in interest rates which, when they started to fluctuate, turned into an immense source of previously untapped wealth. His office is just north of the Board of Trade, where he served for two years as vice-chairman. The walls are filled with interest-rate arcana and mortgage memorabilia; his desk is surrounded by monitors that permit him to track everything from catastrophic-risk portfolios to the price of pollution.
Sandor invents markets to create value for investors where none existed before. He sees himself as “a guy from the sixties”—but one who believes that free markets can make inequality disappear. So, he wondered, why not offer people the right to buy and sell shares in the value of reduced emissions? “At first, people laughed when I suggested the whole futures idea,” he said. “They didn’t see the point of hedging on something like interest rates, and when it came to pollution rights many people just thought it was wrong to take a business approach to environmental protection.”
For Sandor, personal factors like food choices and driving habits are small facets of a far larger issue: making pollution so costly that our only rational choice is to stop. When he started, though, the idea behind a sulfur-dioxide-emissions market was radical. It also seemed distasteful; opponents argued that codifying the right to pollute would only remove the stigma from an unacceptable activity. You can’t trade something unless you own it; to grant a company the right to trade in emissions is also to give it a property right over the atmosphere. (This effect was noted most prominently when the Reagan Administration deregulated airport landing rights, in 1986. Airlines that already owned the rights to land got to keep those rights, while others had to buy slots at auction; in many cases, that meant that the country’s richest airlines were presented with gifts worth millions of dollars.)
Sandor acknowledges the potential for abuse, but he remains convinced that emissions will never fall unless there is a price tag attached to them. “You are really faced with a couple of possibilities when you want to control something,’’ he told me. “You can say, ‘Hey, we will allow you to use only x amount of these pollutants.’ That is the command approach. Or you can make a market.”
In the late nineteen-eighties, Sandor was asked by an Ohio public-interest group if he thought it would be possible to turn air into a commodity. He wrote an essay advocating the creation of an exchange for sulfur-dioxide emissions. The idea attracted a surprising number of environmentalists, because it called for large and specific reductions; conservatives who usually oppose regulation approved of the market-driven solution.
When Congress passed the Clean Air Act Amendments, in 1990, the law included a section that mandated a reduction in annual sulfur-dioxide emissions of ten million tons below 1980 levels. Each large smokestack was fitted with a device to measure sulfur-dioxide emissions. As a way to help meet the goals, the act enabled the creation of the market. “Industry lobbyists said it would cost ten billion dollars in electricity increases a year. It cost one billion,” Sandor told me. It soon became less expensive to reduce emissions than to pollute, and companies throughout the country discovered the value of investing millions of dollars in scrubbers, which capture and sequester sulfur dioxide before it can reach the atmosphere.
Sandor still enjoys describing his first sulfur trade. Representatives of a small Midwestern town were seeking a loan to build a scrubber. “They were prepared to borrow millions of dollars and leverage the city to do it,” he told me. “We said, ‘We have a better idea.’ ” Sandor arranged to have the scrubber installed with no initial cost, and the apparatus helped the city fall rapidly below its required emissions cap. He then calculated the price of thirty years’ worth of that municipality’s SO2 emissions and helped arrange a loan for the town. “We gave it to them at a significantly lower rate than any bank would have done,” Sandor said. “It was a fifty-million-dollar deal and they saved seven hundred and fifty thousand dollars a year—and never had to pay a balloon mortgage at the end. I mention this because trading that way not only allows you to comply with the law, but it provides creative financing tools to help structure the way investments are made. It encourages people to comply at lower costs, because then they will make money.”
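A quick check of the figures in that anecdote, using only the numbers Sandor gives, shows roughly what the emissions-backed financing was worth to the town.

# Sanity check on the scrubber-financing anecdote; figures are from the text.
deal_size = 50_000_000      # dollars
annual_savings = 750_000    # dollars per year
print(f"Annual savings as a share of the deal: {annual_savings / deal_size:.1%}")
# -> 1.5% a year, i.e. the loan undercut conventional bank financing by roughly
#    a point and a half, before counting the avoided balloon payment at the end.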
The program has been an undisputed success. Medical and environmental savings associated with reduced levels of lung disease and other conditions have been enormous—more than a hundred billion dollars a year, according to the E.P.A. “When is the last time you heard somebody even talking about acid rain?” Sandor asked. “It was going to ravage the world. Now it is not even mentioned in the popular press. We have reduced emissions from eighteen million tons to nine million, and we are going to halve it again by 2010. That is as good a social policy as you are ever likely to see.”
No effort to control greenhouse-gas emissions or to lower the carbon footprint—of an individual, a nation, or even the planet—can succeed unless those emissions are priced properly. There are several ways to do that: they can be taxed heavily, like cigarettes, or regulated, which is the way many countries have established mileage-per-gallon standards for automobiles. Cap and trade is another major approach—although CO2 emissions are a far more significant problem for the world than those which cause acid rain, and any genuine solution will have to be global.
Higher prices make conservation appealing—and help spark investment in clean technologies. When it costs money to use carbon, people begin to seek profits from selling fuel-efficient products like long-lasting light bulbs, appliances that save energy, hybrid cars, even factories powered by the sun. One need only look at the passage of the Clean Water Act, in 1972, to see that a strategy that combines legal limits with realistic pricing can succeed. Water had always essentially been free in America, and when something is free people don’t value it. The act established penalties that made it expensive for factories to continue to pollute water. Industry responded at once, and today the United States (and much of the developed world) manufactures more products with less water than it did fifty years ago. Still, whether you buy a plane ticket, an overcoat, a Happy Meal, a bottle of wine imported from Argentina, or a gallon of gasoline, the value of the carbon used to make those products is not reflected by their prices.
In 2006, Sir Nicholas Stern, a former chief economist of the World Bank, who is now the head of Britain’s Economic Service, issued a comprehensive analysis of the implications of global warming, in which he famously referred to climate change as “the greatest market failure the world has ever seen.” Sir Nicholas suggested that the carbon emissions embedded in almost every product ought, if priced realistically, to cost about eighty dollars a ton.
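To give that figure some texture, it helps to translate it to the pump. Burning a gallon of gasoline releases roughly nine kilograms of CO2 (a standard approximation, not a number from the Stern report), so the calculation below is a rough editorial illustration rather than anything Stern published.

# What an $80-per-ton carbon price would mean for a gallon of gasoline.
CO2_PER_GALLON_KG = 8.9   # approximate kg of CO2 from burning one gallon
PRICE_PER_TON = 80.0      # dollars per metric ton, Stern's suggested price

surcharge = PRICE_PER_TON * (CO2_PER_GALLON_KG / 1000)
print(f"Implied carbon surcharge: about ${surcharge:.2f} per gallon")
# -> roughly seventy cents a gallon, a cost no pump price currently reflects.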
Trading schemes have many opponents, some of whom suggest that attaching an acceptable price to carbon will open the door to a new form of colonialism. After all, since 1850, North America and Europe have accounted for seventy per cent of all greenhouse-gas emissions, a trend that is not improving. Stephen Pacala, the director of Princeton University’s Environmental Institute, recently estimated that half of the world’s carbon-dioxide emissions come from just seven hundred million people, about ten per cent of the population.
If prices were the same for everyone, however, rich countries could adapt more easily than countries in the developing world. “This market driven mechanism subjects the planet’s atmosphere to the legal emission of greenhouse gases,” the anthropologist Heidi Bachram has written. “The arrangement parcels up the atmosphere and establishes the routinized buying and selling of ‘permits to pollute’ as though they were like any other international commodity.” She and others have concluded that such an approach would be a recipe for social injustice.
No one I spoke to for this story believes that climate change can be successfully addressed solely by creating a market. Most agreed that many approaches—legal, technological, and financial—will be necessary to lower our carbon emissions by at least sixty per cent over the next fifty years. “We will have to do it all and more,” Simon Thomas told me. He is the chief executive officer of Trucost, a consulting firm that helps gauge the full burden of greenhouse-gas emissions and advises clients on how to address them. Thomas takes a utilitarian approach to the problem, attempting to convince corporations, pension funds, and other investors that the price of continuing to ignore the impact of greenhouse-gas emissions will soon greatly exceed the cost of reducing them.
Thomas thinks that people finally are beginning to get the message. Apple certainly has. Two years ago, Greenpeace began a “Green my Apple” campaign, attacking the company for its “iWaste.” Then, last spring, not long before Apple launched the iPhone, Greenpeace issued a guide to electronics which ranked major corporations on their tracking, reporting, and reduction of toxic chemicals and electronic waste. Apple came in last. The group’s findings were widely reported, and stockholders took notice. (A company that sells itself as one of America’s most innovative brands cannot afford to ignore the environmental consequences of its manufacturing processes.) Within a month, Steve Jobs, the company’s C.E.O., posted a letter on the Apple Web site promising a “greener Apple.” He committed the company to ending the use of arsenic and mercury in monitors and said that the company would shift rapidly to more environmentally friendly LCD displays.
“The success of approaches such as ours relies on the idea that even if polluters are not paying properly now there is some reasonable prospect that they will have to pay in the future,’’ Thomas told me. “If that is true, then we know the likely costs and they are of significant value. If polluters never have to pay, then our approach will fail.
“You have to make it happen, though,” he went on. “And that is the job of government. It has to set a level playing field so that a market economy can deliver what it’s capable of delivering.” Thomas, a former investment banker, started Trucost nearly a decade ago. He mentioned the free-market economist Friedrich von Hayek, who won the Nobel Prize in Economics in 1974. “There is a remarkable essay in which he shows how an explosion, say, in a South American tin mine could work its way through the global supply chain to increase the price of canned goods in Europe,” Thomas said. I wondered what the price of tin could have to do with the cost of global warming.
“It is very much to the point,” Thomas answered. “Tin became more expensive and the market responded. In London, people bought fewer canned goods. The information travelled all the way from that mine across the world without any person in that supply chain even knowing the reasons for the increase. But there was less tin available and the market responded as you would have hoped it would.” To Thomas, the message was simple: “If something is priced accurately, its value will soon be reflected in every area of the economy.”
Without legislation, it is hard to imagine that a pricing plan could succeed. (The next Administration is far more likely to act than the Bush Administration has been. The best-known climate-change bill now before Congress, which would mandate a cap on carbon emissions, was written by Senator Joseph Lieberman. Hillary Clinton, Barack Obama, and John McCain are co-sponsors. Most industrial leaders, whatever their ideological reservations, would prefer a national scheme to a system of rules that vary from state to state.) Even at today’s anemic rates, however, the market has begun to function. “We have a price of carbon that ranges from two to five dollars a ton,” Sandor told me. “And everyone says that is too cheap. Of course, they are right. But it’s not too cheap for people to make money.
“I got a call from a scientist a while ago”—Isaac Berzin, a researcher at M.I.T. “He said, ‘Richard, I have a process where I can put an algae farm next to a power plant. I throw some algae in and it becomes a super photosynthesis machine and sucks the carbon dioxide out of the air like a sponge. Then I gather the algae, dry it out, and use it as renewable energy.’ ” Berzin asked Sandor whether, if he was able to take fifty million tons of carbon dioxide out of the atmosphere in this way, he could make a hundred million dollars.
“I said, ‘Sure,’ ” Sandor recalled, laughing. “Two dollars a ton, why not? So he sends me a term paper. Not a prospectus, even.” Sandor was skeptical, but it didn’t take Berzin long to raise twenty million dollars from investors, and he is now working with the Arizona Public Service utility to turn the algae into fuel. Sandor shook his head. “This is at two dollars a ton,” he said. “The lesson is important: price stimulates inventive activity. Even if you think the price is too low or ridiculous. Carbon has to be rationed, like water and clean air. But I absolutely promise that if you design a law and a trading scheme properly you are going to find everyone from professors at M.I.T. to the guys in Silicon Valley coming out of the woodwork. That is what we need, and we need it now.”
In 1977, Jimmy Carter told the American people that they would have to balance the nation’s demand for energy with its “rapidly shrinking resources” or the result “may be a national catastrophe.” It was a problem, the President said, “that we will not solve in the next few years, and it is likely to get progressively worse through the rest of this century. We must not be selfish or timid if we hope to have a decent world for our children and grandchildren.” Carter referred to the difficult effort as the “moral equivalent of war,” a phrase that was widely ridiculed (along with Carter himself, who wore a cardigan while delivering his speech, to underscore the need to turn down the thermostat).
Carter was prescient. We are going to have to reduce our carbon footprint rapidly, and we can do that only by limiting the amount of fossil fuels released into the atmosphere. But what is the most effective—and least painful—way to achieve that goal? Each time we drive a car, use electricity generated by a coal-fired plant, or heat our homes with gas or oil, carbon dioxide and other heat-trapping gases escape into the air. We can use longer-lasting light bulbs, lower the thermostat (and the air-conditioning), drive less, and buy more fuel-efficient cars. That will help, and so will switching to cleaner sources of energy. Flying has also emerged as a major carbon don’t—with some reason, since airplanes at high altitudes release at least ten times as many greenhouse gases per mile as trains do. Yet neither transportation—which accounts for fifteen per cent of greenhouse gases—nor industrial activity (another fifteen per cent) presents the most efficient way to shrink the carbon footprint of the globe.
Just two countries—Indonesia and Brazil—account for about ten per cent of the greenhouse gases released into the atmosphere. Neither possesses the type of heavy industry that can be found in the West, or for that matter in Russia or India. Still, only the United States and China are responsible for greater levels of emissions. That is because tropical forests in Indonesia and Brazil are disappearing with incredible speed. “It’s really very simple,” John O. Niles told me. Niles, the chief science and policy officer for the environmental group Carbon Conservation, argues that spending five billion dollars a year to prevent deforestation in countries like Indonesia would be one of the best investments the world could ever make. “The value of that land is seen as consisting only of the value of its lumber,” he said. “A logging company comes along and offers to strip the forest to make some trivial wooden product, or a palm-oil plantation. The governments in these places have no cash. They are sitting on this resource that is doing nothing for their economy. So when a guy says, ‘I will give you a few hundred dollars if you let me cut down these trees,’ it’s not easy to turn your nose up at that. Those are dollars people can spend on schools and hospitals.”
The ecological impacts of decisions like that are devastating. Decaying trees contribute greatly to increases in the levels of greenhouse gases. Plant life absorbs CO2. But when forests disappear, the earth loses one of its two essential carbon sponges (the other is the ocean). The results are visible even from space. Satellite photographs taken over Indonesia and Brazil show thick plumes of smoke rising from the forest. According to the latest figures, deforestation pushes nearly six billion tons of CO2 into the atmosphere every year, as nearly thirty million acres of forest—an area half the size of the United Kingdom—are chopped down. Put another way, according to one recent calculation, during the next twenty-four hours the effect of losing forests in Brazil and Indonesia will be the same as if eight million people boarded airplanes at Heathrow Airport and flew en masse to New York.
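The Heathrow comparison can be roughly checked. The per-passenger figure below is an editorial assumption (on the order of a ton of CO2 for a one-way London-to-New York flight), not a number from the article.

# Rough check of the deforestation-versus-flights comparison.
ANNUAL_DEFORESTATION_CO2_TONS = 6_000_000_000  # the article's ~6 billion tons a year
daily_tons = ANNUAL_DEFORESTATION_CO2_TONS / 365

PER_PASSENGER_TONS = 1.0  # assumed CO2 per passenger, London to New York, one way

print(f"Worldwide deforestation CO2 per day: about {daily_tons / 1e6:.0f} million tons")
print(f"Equivalent one-way flights: about {daily_tons / PER_PASSENGER_TONS / 1e6:.0f} million")
# Roughly sixteen million flights' worth a day worldwide; the share attributable
# to Brazil and Indonesia lands in the neighborhood of the article's eight million.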
“This is the greatest remaining opportunity we have to help address global warming,” Niles told me. “It’s a no-brainer. People are paying money to go in and destroy those forests. We just have to pay more to prevent that from happening.” Niles’s group has proposed a trade: “If you save your forest and we can independently audit and verify it, we will calculate the emissions you have saved and pay you for that.” The easiest way to finance such a plan, he is convinced, would be to use carbon-trading allowances. Anything that prevents carbon dioxide from entering the atmosphere would have value that could be quantified and traded. Since an undisturbed forest has the same effect as not emitting carbon dioxide at all, people could create allowances by leaving their forests untouched or by planting new trees. (Rain forests are essential to planetary vitality in other ways, too, of course. More than a third of all terrestrial species live in forest canopies. Rising levels of CO2 there alter the way that forests function, threatening to increase flooding and droughts and epidemics of plant disease. Elevated CO2 in the forest atmosphere also reduces the quality of the wood in the trees, and that in turn has an impact on the reproduction of flowers, as well as that of birds, bees, and anything else that relies on that ecosystem.)
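Set against the carbon prices quoted elsewhere in the piece, the economics Niles describes are striking. The comparison below is a rough editorial back-of-envelope that assumes, for simplicity, that the full six billion tons of deforestation emissions could be kept out of the atmosphere; it uses only figures that appear in the article.

# Rough value of avoided-deforestation credits versus the cost of prevention.
avoided_tons_per_year = 6_000_000_000  # deforestation CO2 the article cites
prevention_cost = 5_000_000_000        # dollars a year Niles suggests spending

for price in (2, 5, 80):  # Sandor's current range, then Stern's $80 a ton
    value = avoided_tons_per_year * price
    print(f"At ${price}/ton: credits worth ${value / 1e9:,.0f} billion a year, "
          f"versus ${prevention_cost / 1e9:.0f} billion to prevent the cutting")

Even at today’s two-to-five-dollar prices the notional value of the avoided emissions is several times what Niles says prevention would cost, which is his ‘no-brainer’ in numerical form.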
From both a political and an economic perspective, it would be easier and cheaper to reduce the rate of deforestation than to cut back significantly on air travel. It would also have a far greater impact on climate change and on social welfare in the developing world. Possessing rights to carbon would grant new power to farmers who, for the first time, would be paid to preserve their forests rather than destroy them. Unfortunately, such plans are seen by many people as morally unattractive. “The whole issue is tied up with the misconceived notion of ‘carbon colonialism,’ ” Niles told me. “Some activists do not want the Third World to have to alter their behavior, because the problem was largely caused by us in the West.”
Environmental organizations like Carbon Trade Watch say that reducing our carbon footprint will require restructuring our lives, and that before we in the West start urging the developing world to do that we ought to make some sacrifices; anything else would be the modern equivalent of the medieval practice of buying indulgences as a way of expiating one’s sins. “You have to realize that, in the end, people are trying to buy their way out of bad behavior,” Tony Juniper, the director of Friends of the Earth, told me. “Are we really a society that wants to pay rich people not to fly on private jets or countries not to cut down their trees? Is that what, ultimately, is morally right and equitable?”
Sandor dismisses the question. “Frankly, this debate just makes me want to scream,” he told me. “The clock is moving. They are slashing and burning and cutting the forests of the world. It may be a quarter of global warming and we can get the rate to two per cent simply by inventing a preservation credit and making that forest have value in other ways. Who loses when we do that?
“People tell me, well, these are bad guys, and corporate guys who just want to buy the right to pollute are bad, too, and we should not be giving them incentives to stop. But we need to address the problems that exist, not drown in fear or lose ourselves in morality. Behavior changes when you offer incentives. If you want to punish people for being bad corporate citizens, you should go to your local church or synagogue and tell God to punish them. Because that is not our problem. Our problem is global warming, and my job is to reduce greenhouse gases at the lowest possible cost. I say solve the problem and deal with the bad guys somewhere else.”
The Tesco corporate headquarters are spread across two low-slung, featureless buildings in an unusually dismal part of Hertfordshire, about half an hour north of London. Having inspired many of the discussions about the meaning of our carbon footprint, the company has been criticized by those who question the emphasis on food. As Adrian Williams, the Cranfield agricultural researcher, put it, the company has been “a little bit shocked” by the discovery that its original goal, to label everything, was naïve.
The process has indeed been arduous. Tesco has undertaken a vast—and at times lonely—attempt to think about global warming in an entirely new way, and the company shows little sign of pulling back. “We are spending more than a hundred million pounds a year trying to increase our energy efficiency and reduce CO2 emissions,” Katherine Symonds told me. A charismatic woman with an abiding belief that global warming can be addressed rationally, Symonds is the corporation’s climate-change manager. “We are trying to find a way to help consumers make choices they really want to make—choices that mean something to them. This is not all about food. We just happen to be in the food business.
“One of our real responsibilities is to say to our customers, ‘The most important thing you can do to affect climate change is insulate your house properly,’ ” she went on. “ ‘Next would be to get double-glazed windows,’ ” which prevent heat from escaping in the winter. “ ‘Third, everyone should get a new boiler.’ We are trying to put this into context, not to say, ‘Buy English potatoes.’ ” Consumers are unlikely to stop shopping. Economies won’t stand still, either; those of China and India are expanding so speedily that people often ask whether sacrifices anywhere else can even matter.
“We have to be careful not to rush from denial to despair,” John Elkington told me, when I visited him not long ago at his offices at SustainAbility, the London-based environmental consulting firm he helped found more than two decades ago. He believes there is a danger that people will feel engulfed by the challenge, and ultimately helpless to address it.
“We are in an era of creative destruction,” he said. A thin, easygoing man with the look of an Oxford don, Elkington has long been one of the most articulate of those who seek to marry economic prosperity with environmental protection. “What happens when you go into one of these periods is that before you get to the point of reconstruction things have to fall apart. Detroit will fall apart. I think Ford”—a company that Elkington has advised for years—“will fall apart. They have just made too many bets on the wrong things. A bunch of the institutions that we rely on currently will, to some degree, decompose. I believe that much of what we count as democratic politics today will fall apart, because we are simply not going to be able to deal with the scale of change that we are about to face. It will profoundly disable much of the current political class.”
He sat back and smiled softly. He didn’t look worried. “I wrote my first report on climate change in 1978, for Herman Kahn, at the Hudson Institute,” he explained. “He did not at all like what I was saying, and he told me, ‘The trouble with you environmentalists is that you see a problem coming and you slam your foot on the brakes and try and steer away from the chasm. The problem is that it often doesn’t work. Maybe the thing to do is jam your foot on the pedal and see if you can just jump across.’ At the time, I thought he was crazy, but as I get older I realize what he was talking about. The whole green movement in technology is in that space. It is an attempt to jump across the chasm.” ♦