Tuesday, May 27, 2008

The Smart is less fuel efficient than the Prius?

According to the EPA, the Smart is less fuel efficient than the 5-seat Toyota Prius: the Prius achieves 48 MPG city and 45 highway, while the Smart achieves 33 city and 41 highway. Additionally, the much larger Chevrolet Cobalt, with its 148 hp engine, achieves a combined 31 MPG, compared with the Smart's combined 36 MPG from its 71 hp engine.
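For context, the EPA's combined figure is a 55% city / 45% highway weighted harmonic mean of the two ratings. A minimal sketch of that calculation using the article's numbers (the weighting is the EPA's standard formula, not something stated above):

```python
# EPA combined MPG: harmonic mean, weighted 55% city / 45% highway driving.
def combined_mpg(city_mpg, highway_mpg):
    return 1 / (0.55 / city_mpg + 0.45 / highway_mpg)

print(round(combined_mpg(33, 41)))  # Smart Fortwo -> 36, matching the figure above
print(round(combined_mpg(48, 45)))  # Prius -> ~47 (the EPA rounds from unrounded
                                    # test values, so published numbers can differ)
```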

-----

Forbes Article
http://www.forbes.com/2006/07/12/unsolicited-advice-advertising-meb_0712smart.html

Tiniest car in U.S. earns top marks in crash testing


8-foot, 8-inch Smart Fortwo 'good' in front, side impacts

Ken Thomas, Associated Press

Wednesday, May 14, 2008

(05-14) 04:00 PDT Washington --

The 2008 Smart Fortwo micro car, the smallest car for sale in the U.S. market, has earned top scores in crash tests conducted by the insurance industry.

The 8-foot, 8-inch vehicle received the highest rating of "good" in front-end and side-impact testing by the Insurance Institute for Highway Safety, helping address some concerns that consumers may be more vulnerable in the tiny two-seater.

The test results, released today, show how well vehicles stack up against others of similar size and weight. The institute noted that the front-end test scores can't be compared across weight classes, meaning a small car that earns a good rating isn't considered safer than a large car that did not earn the highest rating.

Adrian Lund, the institute's president, said a small car may be more practical in congested urban areas where serious, high-speed crashes are less likely. The institute conducted the crash test to help guide consumers who want a small car that can give them good protection.

"All things being equal in safety, bigger and heavier is always better. But among the smallest cars, the engineers of the Smart did their homework and designed a high level of safety into a very small package," Lund said.

The institute's frontal-crash test simulates a 40 mph crash with a similar vehicle. The side crash simulates what would happen if the vehicle was struck in the side by a sport utility vehicle at 31 mph.

In a test that assessed the vehicle's protection in rear crashes, the Fortwo received the second-highest rating of acceptable.

Smart, a division of Daimler AG's Mercedes-Benz brand, is arriving in U.S. showrooms this year as consumers deal with rising fuel prices. The automaker has received more than 30,000 reservations for the vehicle - which has a base price of more than $12,000 with destination charges included and more than $17,000 for a fully loaded Smart Passion convertible. Customers are putting down $99 to reserve a car.

The vehicle, which had sold 6,159 units through the end of April, gets 33 miles per gallon in the city and 41 mpg on the highway. The Fortwo is more than 3 feet shorter and nearly 700 pounds lighter than a Mini Cooper.

http://sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/05/14/BU9U10LJLM.DTL

This article appeared on page C - 3 of the San Francisco Chronicle

Friday, May 16, 2008

If you're richer, you're happier

From The Times
May 14, 2008
People tell you that wealth does not lead to happiness. New research shows they're wrong
Daniel Finkelstein

To mark the first anniversary of Paddington Bear residing with the Browns, a small party is held at which Paddington performs conjuring tricks. Carefully reading from his conjuring book, the bear places Mr Curry's watch inside a handkerchief and smashes it with a hammer.

Unfortunately, Paddington has turned two pages at once. They were stuck together with marmalade. So he misses the words that follow the advice to bring down the hammer on the handkerchief - “having first removed the watch”.

I have been patient. For my entire adult life, I have been looking out to see Paddington's trick performed for real. But now I have. As my mother always told me: “Everything comes to he who waits.”

The production of literature about happiness has become an industry. Earlier this week a cross-party group of Christian MPs produced a report on the topic and were able to begin with a long list of books on the subject published in the last two years. Perhaps the most successful are Happiness: Lessons from a New Science by Richard Layard and Affluenza by Oliver James but there has been a host of others.

The starting point for this work is something called the Easterlin Paradox. In a 1974 paper, the economist Richard Easterlin presented empirical evidence on income and happiness that was pretty puzzling. Using surveys of how happy people say that they are, the paper seemed to show that within countries, the richer people are, the happier they are, but that between countries the same didn't hold.

What this suggests is that being relatively rich compared to your fellow countrymen makes you happier, but that your absolute wealth doesn't matter. Once a minimum income level is reached, an amount necessary for a country's residents to subsist, all that extra economic growth doesn't appear to be improving life satisfaction.
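To make the claim concrete, here is a toy simulation (my illustration, not Easterlin's data): if happiness tracked nothing but income rank within one's own country, income and happiness would correlate strongly inside every country, yet average happiness would come out identical in poor and rich countries. The dispute described below is over whether real-world data actually look like this.

```python
import random
import statistics

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Three hypothetical countries with very different average incomes.
countries = {name: [random.lognormvariate(mu, 0.5) for _ in range(1000)]
             for name, mu in [("poor", 8), ("middle", 9), ("rich", 10)]}

for name, incomes in countries.items():
    ranked = sorted(incomes)
    # Happiness here depends only on *relative* position at home.
    happiness = [ranked.index(x) / len(ranked) for x in incomes]
    print(name,
          round(correlation(incomes, happiness), 2),  # within-country: strongly positive
          round(statistics.mean(happiness), 2))       # national average: ~0.5 everywhere
```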

The implications of Easterlin's discovery are pretty strong. It suggests that all this consumption is doing us no good. That is what the Christian MPs suggest, questioning whether we haven't sacrificed family life on the altar of capitalism. Indeed, some authors go further and suggest that the very act of shopping is actually making us unhappy. The Easterlin Paradox certainly means that we shouldn't be organising our economies to maximise economic growth. Happiness, not income, should be our guide.

The leading happiness authors suggest that we should concentrate on reducing inequality. This might lead to lower national income, but who cares about that? It isn't making us happier. Increased equality would stop us all worrying about our relative positions and thus remove a source of unhappiness.

And all this stuff has caught the mood. It's the intellectual vogue topic. David Cameron is talking about improving General Wellbeing not just Gross National Product. It's everywhere.

There is just one teeny, tiny problem. It seems as if Easterlin wasn't correct.

It appears that before picking up their hammers to smash down on the handkerchief of economic growth, the happiness authors had an accident with the marmalade. They turned over two sticky pages at once and missed the reassessment of Easterlin's work that has been taking place.

Easterlin's original paper was based on fairly limited data. Betsey Stevenson and Justin Wolfers, of the Wharton School, University of Pennsylvania, have been looking at the vast amount of data that has become available since then. And guess what? The two economists show that there is “a clear and positive link between average levels of subjective wellbeing and GDP per capita across countries, with no evidence of a satiation point beyond which wealthier countries have no further increases in subjective wellbeing”.

In other words they show that it's not just relative wealth that matters, it is absolute wealth too - on average, the richer you are, the happier you are. And this isn't true just for the first slug of income, just until we can subsist, it is true all the way up and as economies keep growing.

The Nobel prize-winning economist Daniel Kahneman is sufficiently important in this area of economics that Richard Layard dedicated his happiness book to him. Kahneman now believes the new evidence from the Wharton academics is “quite compelling” and adds that “there is just a vast amount of accumulating evidence that the Easterlin Paradox may not exist”.

Now this doesn't, of course, prove by itself that higher income causes greater happiness. Let's not make that mistake. In the academic literature on psychology you will find plenty of reason to believe, for instance, that the relationship might be the other way round - that greater happiness might cause higher income.

At the very least, however, it shows that higher income is consistent with greater happiness and isn't actually making us unhappy. It also means, again at the very least, that if the happiness authors want to advance the faintly counterintuitive idea that more income doesn't increase life satisfaction, they have a lot of work to do finding an entirely new way of making their point.

The Easterlin Paradox seemed to offer a way out for those unhappy with capitalism. After spending decades advancing methods of increasing growth that didn't work, much of the Left has moved on. Now they are arguing that growth doesn't matter or might actually be harmful. And the happiness literature helped make this point.

So what will happen now with these critics? Will they ignore the data? Will they walk away from the happiness idea and forget they ever mentioned it? Or will they turn their work on its head and use the new evidence to start arguing that capitalism might be the route to happiness after all?

I don't somehow think they'll choose this last option. Do you?

daniel.finkelstein@thetimes.co.uk

Thursday, May 15, 2008

Going Underground -- Paul Stamets On The Vast, Intelligent Network Beneath Our Feet

The Sun Interview February 2008 | issue 386
by Derrick Jensen

For several years people from different places and backgrounds kept recommending the same oddly titled book to me: Paul Stamets’s Mycelium Running: How Mushrooms Can Help Save the World (Ten Speed Press). Everyone told me it was one of the most mind-bending texts they’d ever read. With so many recommendations, I perversely hesitated to pick the book up, and when I finally did, I prepared myself to be disappointed.

I wasn’t. Stamets fundamentally changed my view of nature — in particular, fungi: yeasts, mushrooms, molds, the whole lot of them.

When we think of fungi, most of us picture mushrooms, those slightly mysterious, potentially poisonous denizens of dark, damp places. But a mushroom is just the fruit of the mycelium, which is an underground network of rootlike fibers that can stretch for miles. Stamets calls mycelia the “grand disassemblers of nature” because they break down complex substances into simpler components. For example, some fungi can take apart the hydrogen-carbon bonds that hold petroleum products together. Others have shown the potential to clean up nerve-gas agents, dioxins, and plastics. They may even be skilled enough to undo the ecological damage pollution has wrought.

Since reading Mycelium Running, I’ve begun to consider the possibility that mycelia know something we don’t. Stamets believes they have not just the ability to protect the environment but the intelligence to do so on purpose. His theory stems in part from the fact that mycelia transmit information across their huge networks using the same neurotransmitters that our brains do: the chemicals that allow us to think. In fact, recent discoveries suggest that humans are more closely related to fungi than we are to plants.

Almost since life began on earth, mycelia have performed important ecological roles: nourishing ecosystems, repairing them, and sometimes even helping create them. The fungi’s exquisitely fine filaments absorb nutrients from the soil and then trade them with the roots of plants for some of the energy that the plants produce through photosynthesis. No plant community could exist without mycelia. I’ve long been a resident and defender of forests, but Stamets helped me understand that I’ve been misperceiving my home. I thought a forest was made up entirely of trees, but now I know that the foundation lies below ground, in the fungi.

Stamets became interested in biology in kindergarten, when he planted a sunflower seed in a paper cup and watched it sprout and lift itself toward the light. Somewhere along the way, he developed a fascination with life forms that grow not toward the sun but away from it. In the late seventies he got a Drug Enforcement Administration permit to research hallucinogenic psilocybin mushrooms at Evergreen State College in Washington. Stamets is now fifty-two and has studied mycelia for more than thirty years, naming five new species and authoring or coauthoring six books, including Growing Gourmet and Medicinal Mushrooms (Ten Speed Press) and The Mushroom Cultivator (Agarikon Press). He’s the founder and director of Fungi Perfecti (www.fungi.com), a company based outside Olympia, Washington, that provides mushroom research, information, classes, and spawn — the mushroom farmer’s equivalent of seed. Much of the company’s profits go to help protect endangered strains of fungi in the old-growth forests of the Pacific Northwest. I interviewed Stamets in June 2007.

Jensen: How many different types of mushrooms are there?

Stamets: There are an estimated one to two million species of fungi, of which about 150,000 form mushrooms. A mushroom is the fruit body — the reproductive structure — of the mycelium, which is the network of thin, cobweblike cells that infuses all soil. The spores in the mushroom are somewhat analogous to seeds. Because mushrooms are fleshy, succulent, fragrant, and rich in nutrients, they attract animals — including humans — who eat them and thereby participate in spreading the spores through their feces.

Our knowledge of fungi is far exceeded by our ignorance. To date, we’ve identified approximately 14,000 of the 150,000 species of mushroom-forming fungi estimated to exist, which means that more than 90 percent have not yet been identified. Fungi are essential for ecological health, and losing any of these species would be like losing rivets in an airplane. Flying squirrels and voles, for example, are dependent upon truffles, and in old-growth forests, the main predator of flying squirrels and voles is the spotted owl. This means that killing off truffles would kill off flying squirrels and voles, which would kill off spotted owls.

That’s just one food chain that we can identify; there are many thousands more we cannot. Biological systems are so complex that they far exceed our cognitive abilities and our linear logic. We are essentially children when it comes to our understanding of the natural world.

Jensen: In your book you say that animals are more closely related to fungi than they are to plants or protozoa or bacteria.

Stamets: Yes. For example, we inhale oxygen and exhale carbon dioxide; so do fungi. One of the big differences between animals and fungi is that animals have their stomachs on the inside. About 600 million years ago, the branch of fungi leading to animals evolved to capture nutrients by surrounding their food with cellular sacs — essentially primitive stomachs. As these organisms evolved, they developed outer layers of cells — skins, basically — to prevent moisture loss and as a barrier against infection. Their stomachs were confined within the skin. These were the earliest animals.

Mycelia took a different evolutionary path, going underground and forming a network of interwoven chains of cells, a vast food web upon which life flourished. These fungi paved the way for plants and animals. They munched rocks, producing enzymes and acids that could pull out calcium, magnesium, iron, and other minerals. In the process they converted rocks into usable foods for other species. And they still do this, of course.

Fungi are fundamental to life on earth. They are ancient, they are widespread, and they have formed partnerships with many other species. We know from the fossil record that evolution on this planet has largely been steered by two cataclysmic asteroid impacts. The first was 250 million years ago. The earth became shrouded in dust. Sunlight was cut off, and in the darkness, massive plant communities died. More than 90 percent of species disappeared. And fungi inherited the earth. Organisms that paired with fungi through natural selection were rewarded. Then the skies cleared, and light came back, and evolution continued on its course until 65 million years ago, bam! It happened again. We were hit by another asteroid, and there were more massive extinctions. That’s when the dinosaurs died out. Again, organisms that paired with fungi were rewarded. So these asteroid impacts steered life toward symbiosis with fungi: not just plants and animals, but bacteria and viruses, as well.

Jensen: Can you give some examples of these partnerships?

Stamets: A familiar one is lichens, which are actually a fungus and an alga growing symbiotically together. Another is “sleepy grass”: Mesoamerican ranchers realized that when their horses ate a certain type of grass, the horses basically got stoned. When scientists studied sleepy grass, they found that it wasn’t the grass at all that was causing the horses to get stoned, but an endophytic fungus, meaning one that grows within a plant, in the stems and leaves.

Here’s another example: At Yellowstone’s hot springs and Lassen Volcanic Park, people noticed that some grasses could survive contact with scalding hot water — up to 160 degrees. Scientists cultured these grasses in a laboratory and saw a fungus growing on them. They thought it was a contaminant, so they separated the fungus from the grass cells and tried to regrow the grass. But without the fungus the grass died at around 110 degrees. So they reintroduced this fungus and regrew the grass, and once again it survived to 160 degrees. That particular fungus, of the genus Curvularia, conveyed heat tolerance to the grass. Scientists are now looking at the possibility of getting this Curvularia to convey heat tolerance to corn, rice, and wheat, so that these grasses could be grown under drought conditions or in extremely arid environments, expanding the grain-growing regions of the world.

Other researchers took a Curvularia fungus from cold storage at a culture bank and joined it with tomatoes, expecting that it would confer heat tolerance. But the tomatoes all died at 105 degrees. They discovered that the cold storage had killed a virus that wild Curvularia fungus carries within it — which was odd, since you’d think cold storage would keep the virus alive. When they reintroduced the virus back into the Curvularia cultures and then reassociated the fungus with tomato plants, the plants survived the heat. So this is a symbiosis of three organisms: a plant, a fungus, and a virus. Only together could they survive extreme conditions.

These examples are just the tip of the iceberg. They show the intelligence of nature, how these different entities form partnerships to the benefit of all.

Jensen: Of course this raises the question of boundaries: Is that tomato-fungus-virus one entity or three? Where does one organism stop and the other begin?

Stamets: Well, humans aren’t just one organism. We are composites. Scientists label species as separate so we can communicate easily about the variety we see in nature. We need to be able to look at a tree and say it’s a Douglas fir and look at a mammal and say it’s a harbor seal. But, indeed, I speak to you as a unified composite of microbes. I guess you could say I am the “elected voice” of a microbial community. This is the way of life on our planet. It is all based on complex symbiotic relationships.

A mycelial “mat,” which scientists think of as one entity, can be thousands of acres in size. The largest organism in the world is a mycelial mat in eastern Oregon that covers 2,200 acres and is more than two thousand years old. Its survival strategy is somewhat mysterious. We have five or six layers of skin to protect us from infection; the mycelium has one cell wall. How is it that this vast mycelial network, which is surrounded by hundreds of millions of microbes all trying to eat it, is protected by one cell wall? I believe it’s because the mycelium is in constant biochemical communication with its ecosystem.

I think these mycelial mats are neurological networks. They're sentient, they're aware, and they're highly evolved. They have external stomachs, which produce enzymes and acids to digest nutrients outside the mycelium, and then bring in those compounds that they need for nutrition. As you walk through a forest, you break twigs underneath your feet, and the mycelium surges upward to capture those newly available nutrients as quickly as possible. I say they have “lungs,” because they are inhaling oxygen and exhaling carbon dioxide, just like we are. I say they are sentient, because they produce pharmacological compounds — which can activate receptor sites in our neurons — and also serotonin-like compounds, including psilocybin, the hallucinogen found in some mushrooms. This speaks to the fact that there is an evolutionary common denominator between fungi and humans. We evolved from fungi. We took an overground route. The fungi took the route of producing these underground networks that are highly resilient and extremely adaptive: if you disturb a mycelial network, it just regrows. It might even benefit from the disturbance.

I have long proposed that mycelia are the earth’s “natural Internet.” I’ve gotten some flak for this, but recently scientists in Great Britain have published papers about the “architecture” of a mycelium — how it’s organized. They focused on the nodes of crossing, which are the branchings that allow the mycelium, when there is a breakage or an infection, to choose an alternate route and regrow. There’s no one specific point on the network that can shut the whole operation down. These nodes of crossing, those scientists found, conform to the same mathematical optimization curves that computer scientists have developed to optimize the Internet. Or, rather, I should say that the Internet conforms to the same optimization curves as the mycelium, since the mycelium came first.
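What Stamets describes — a network with no single point whose failure shuts the whole thing down — is the property routing algorithms are built around. A toy sketch of that redundancy (my illustration with a made-up six-node mesh, not anything from the interview): knock out a node, and a breadth-first search still finds an alternate route.

```python
from collections import deque

# A small mesh with multiple crossings between A and F.
mesh = {
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D", "E"],
    "D": ["B", "C", "F"], "E": ["C", "F"], "F": ["D", "E"],
}

def route(graph, start, goal, dead=frozenset()):
    """Breadth-first search for a path avoiding 'dead' nodes; None if cut off."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen and nxt not in dead:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route(mesh, "A", "F"))              # ['A', 'B', 'D', 'F']
print(route(mesh, "A", "F", dead={"D"}))  # reroutes: ['A', 'C', 'E', 'F']
```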

Hunter-gatherers -- Noble or savage?

Dec 19th 2007
From The Economist print edition


The era of the hunter-gatherer was not the social and environmental Eden that some suggest



HUMAN beings have spent most of their time on the planet as hunter-gatherers. From at least 85,000 years ago to the birth of agriculture around 73,000 years later, they combined hunted meat with gathered veg. Some people, such as those on North Sentinel Island in the Andaman Sea, still do. The Sentinelese are the only hunter-gatherers who still resist contact with the outside world. Fine-looking specimens—strong, slim, fit, black and stark naked except for a small plant-fibre belt round the waist—they are the very model of the noble savage. Genetics suggests that indigenous Andaman islanders have been isolated since the very first expansion out of Africa more than 60,000 years ago.

About 12,000 years ago people embarked on an experiment called agriculture and some say that they, and their planet, have never recovered. Farming brought a population explosion, protein and vitamin deficiency, new diseases and deforestation. Human height actually shrank by nearly six inches after the first adoption of crops in the Near East. So was agriculture “the worst mistake in the history of the human race”, as Jared Diamond, evolutionary biologist and professor of geography at the University of California, Los Angeles, once called it?

Take a snapshot of the old world 15,000 years ago. Except for bits of Siberia, it was full of a new and clever kind of people who had originated in Africa and had colonised first their own continent, then Asia, Australia and Europe, and were on the brink of populating the Americas. They had spear throwers, boats, needles, adzes, nets. They painted pictures, decorated their bodies and believed in spirits. They traded foods, shells, raw materials and ideas. They sang songs, told stories and prepared herbal medicines.

They were “hunter-gatherers”. On the whole the men hunted and the women gathered: a sexual division of labour is still universal among non-farming people and was probably not shared by their Homo erectus predecessors. This enabled them to eat both meat and veg, a clever trick because it combines quality with reliability.

Why change? In the late 1970s Mark Cohen, an archaeologist, first suggested that agriculture was born of desperation, rather than inspiration. Evidence from the Fertile Crescent seems to support him. Rising human population density, combined perhaps with a cooling, drying climate, left the Natufian hunter-gatherers of the region short of acorns, gazelles and wild grass seeds. Somebody started trying to preserve and enhance a field of chickpeas or wheat-grass and soon planting, weeding, reaping and threshing were born.

Quite independently, people took the same step in at least six other parts of the world over the next few thousand years: the Yangzi valley, the central valley of New Guinea, Mexico, the Andes, West Africa and the Amazon basin. And it seems that Eden came to an end. Not only had hunter-gatherers enjoyed plenty of protein, not much fat and ample vitamins in their diet, but it also seems they did not have to work very hard. The Hadza of Tanzania “work” about 14 hours a week, the !Kung of Botswana not much more.

The first farmers were less healthy than the hunter-gatherers had been in their heyday. Aside from their shorter stature, they had more skeletal wear and tear from the hard work, their teeth rotted more, they were short of protein and vitamins and they caught diseases from domesticated animals: measles from cattle, flu from ducks, plague from rats and worms from using their own excrement as fertiliser.

They also got a bad attack of inequality for the first time. Hunter-gatherers' dependence on sharing each other's hunting and gathering luck makes them remarkably egalitarian. A successful farmer, however, can afford to buy the labour of others, and that makes him more successful still, until eventually—especially in an irrigated river valley, where he controls the water—he can become an emperor imposing his despotic whim upon subjects. Friedrich Engels was probably right to identify agriculture with a loss of political innocence.

Agriculture also stands accused of exacerbating sexual inequality. In many peasant farming communities, men make women do much of the hard work. Among hunter-gathering folk, men usually bring fewer calories than women, and have a tiresome tendency to prefer catching big and infrequent prey so they can show off, rather than small and frequent catches that do not rot before they are eaten. But the men do at least contribute.

Recently, though, anthropologists have subtly revised the view that the invention of agriculture was a fall from grace. They have found the serpent in hunter-gatherer Eden, the savage in the noble savage. Maybe it was not an 80,000-year camping holiday after all.

In 2006 two Indian fishermen, in a drunken sleep aboard their little boat, drifted over the reef and fetched up on the shore of North Sentinel Island. They were promptly killed by the inhabitants. Their bodies are still there: the helicopter that went to collect them was driven away by a hail of arrows and spears. The Sentinelese do not welcome trespassers. Only very occasionally have they been lured down to the beach of their tiny island home by gifts of coconuts and only once or twice have they taken these gifts without sending a shower of arrows in return.

Several archaeologists and anthropologists now argue that violence was much more pervasive in hunter-gatherer society than in more recent eras. From the !Kung in the Kalahari to the Inuit in the Arctic and the aborigines in Australia, two-thirds of modern hunter-gatherers are in a state of almost constant tribal warfare, and nearly 90% go to war at least once a year. War is a big word for dawn raids, skirmishes and lots of posturing, but death rates are high—usually around 25-30% of adult males die from homicide. The warfare death rate of 0.5% of the population per year that Lawrence Keeley of the University of Illinois calculates as typical of hunter-gatherer societies would equate to 2 billion people dying during the 20th century.
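Keeley's equivalence is simple arithmetic; a back-of-envelope check, assuming (my assumption, not the article's) an average 20th-century world population of roughly 4 billion:

```python
# 0.5% of the population per year, sustained over a 100-year century.
rate_per_year = 0.005
avg_population = 4e9   # rough assumption; the century ran from ~1.6 to ~6 billion
deaths = rate_per_year * avg_population * 100
print(f"{deaths:.1e}")  # 2.0e+09 -> about 2 billion, as the article says
```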

At first, anthropologists were inclined to think this a modern pathology. But it is increasingly looking as if it is the natural state. Richard Wrangham of Harvard University says that chimpanzees and human beings are the only animals in which males engage in co-operative and systematic homicidal raids. The death rate is similar in the two species. Steven LeBlanc, also of Harvard, says Rousseauian wishful thinking has led academics to overlook evidence of constant violence.


Not so many women as men die in warfare, it is true. But that is because they are often the object of the fighting. To be abducted as a sexual prize was almost certainly a common female fate in hunter-gatherer society. Forget the Garden of Eden; think Mad Max.

Constant warfare was necessary to keep population density down to one person per square mile. Farmers can live at 100 times that density. Hunter-gatherers may have been so lithe and healthy because the weak were dead. The invention of agriculture and the advent of settled society merely swapped high mortality for high morbidity, allowing people some relief from chronic warfare so they could at least grind out an existence, rather than being ground out of existence altogether.

Notice a close parallel with the industrial revolution. When rural peasants swapped their hovels for the textile mills of Lancashire, did it feel like an improvement? The Dickensian view is that factories replaced a rural idyll with urban misery, poverty, pollution and illness. Factories were indeed miserable and the urban poor were overworked and underfed. But they had flocked to take the jobs in factories often to get away from the cold, muddy, starving rural hell of their birth.


Eighteenth-century rural England was a place where people starved each spring as the winter stores ran out, where in bad years and poor districts long hours of agricultural labour—if it could be got—barely paid enough to keep body and soul together, and a place where the “putting-out” system of textile manufacture at home drove workers harder for lower pay than even the factories would. (Ask Zambians today why they take ill-paid jobs in Chinese-managed mines, or Vietnamese why they sew shirts in multinational-owned factories.) The industrial revolution caused a population explosion because it enabled more babies to survive—malnourished, perhaps, but at least alive.

Returning to hunter-gatherers, Mr LeBlanc argues (in his book “Constant Battles”) that all was not well in ecological terms, either. Homo sapiens wrought havoc on many ecosystems as Homo erectus had not. There is no longer much doubt that people were the cause of the extinction of the megafauna in North America 11,000 years ago and Australia 30,000 years before that. The mammoths and giant kangaroos never stood a chance against co-ordinated ambush with stone-tipped spears and relentless pursuit by endurance runners.

This was also true in Eurasia. The earliest of the great cave painters, working at Chauvet in southern France, 32,000 years ago, was obsessed with rhinoceroses. A later artist, working at Lascaux 15,000 years later, depicted mostly bison, bulls and horses—rhinoceroses must have been driven close to extinction by then. At first, modern human beings around the Mediterranean relied almost entirely on large mammals for meat. They ate small game only if it was slow moving—tortoises and limpets were popular. Then, gradually and inexorably, starting in the Middle East, they switched their attention to smaller animals, and especially to warm-blooded, fast-breeding species, such as rabbits, hares, partridges and smaller gazelles. The archaeological record tells this same story at sites in Israel, Turkey and Italy.

Another fine environmental mess we've got ourselves into

The reason for this shift, say Mary Stiner and Steven Kuhn of the University of Arizona, was that human population densities were growing too high for the slower-reproducing prey such as tortoises, horses and rhinos. Only the fast-breeding rabbits, hares and partridges, and for a while gazelles, could cope with such hunting pressure. This trend accelerated about 15,000 years ago as large game and tortoises disappeared from the Mediterranean diet altogether—driven to the brink of extinction by human predation.

In times of prey scarcity, Homo erectus, like other predators, had simply suffered local extinction; these new people could innovate their way out of trouble—they could shift their niche. In response to demographic pressure, they developed better weapons which enabled them to catch smaller, faster prey, which in turn enabled them to survive at high densities, though at the expense of extinguishing many larger and slower-breeding prey. Under this theory, the atlatl or spear-throwing stick was invented 18,000 years ago as a response to a Malthusian crisis, not just because it seemed like a good idea.


What's more, the famously “affluent society” of hunter-gatherers, with plenty of time to gossip by the fire between hunts and gathers, turns out to be a bit of a myth, or at least an artefact of modern life. The measurements of time spent getting food by the !Kung omitted food-processing time and travel time, partly because the anthropologists gave their subjects lifts in their vehicles and lent them metal knives to process food.

Agriculture was presumably just another response to demographic pressure. A new threat of starvation—probably during the millennium-long dry, cold “snap” known as the Younger Dryas about 13,000 years ago—prompted some hunter-gatherers in the Levant to turn much more vegetarian. Soon collecting wild grass seeds evolved into planting and reaping crops, which reduced people's intake of proteins and vitamins, but brought ample calories, survival and fertility.

The fact that something similar happened six more times in human history over the next few thousand years—in Asia, New Guinea, at least three places in the Americas and one in Africa—supports the notion of invention as a response to demographic pressure. In each case the early farmers, though they might be short, sick and subjugated, could at least survive and breed, enabling them eventually to overwhelm the remaining hunter-gatherers of their respective continents.

It is irrelevant to ask whether we would have been better off to stay as hunter-gatherers. Being a niche-shifting species, we could not help moving on. Willingly or not, humanity had embarked 50,000 years ago on the road called “progress” with constant change in habits driven by invention mothered by necessity. Even 40,000 years ago, technology and lifestyle were in a state of continuous change, especially in western Eurasia. By 34,000 years ago people were making bone points for spears, and by 26,000 years ago they were making needles. Harpoons and other fishing tackle appear at 18,000 years ago, as do bone spear throwers, or atlatls. String was almost certainly in use then—how do you catch rabbits except in nets and snares?

Nor was this virtuosity confined to practicalities. A horse, carved from mammoth-ivory and worn smooth by being used as a pendant, dates from 32,000 years ago in Germany. By the time of Sungir, an open-air settlement from 28,000 years ago at a spot near the city of Vladimir, north-east of Moscow, people were being buried with thousands of laboriously carved ivory beads and even little wheel-shaped bone ornaments.

Incessant innovation is a characteristic of human beings. Agriculture, the domestication of animals and plants, must be seen in the context of this progressive change. It was just another step: hunter-gatherers may have been using fire to encourage the growth of root plants in southern Africa 80,000 years ago. At 15,000 years ago people first domesticated another species—the wolf (though it was probably the wolves that took the initiative). After 12,000 years ago came crops. The internet and the mobile phone were in some vague sense almost predestined 50,000 years ago to appear eventually.

There is a modern moral in this story. We have been creating ecological crises for ourselves and our habitats for tens of thousands of years. We have been solving them, too. Pessimists will point out that each solution only brings us face to face with the next crisis, optimists that no crisis has proved insoluble yet. Just as we rebounded from the extinction of the megafauna and became even more numerous by eating first rabbits then grass seeds, so in the early 20th century we faced starvation for lack of fertiliser when the population was a billion people, but can now look forward with confidence to feeding 10 billion on less land using synthetic nitrogen, genetically high-yield crops and tractors. When we eventually reverse the build-up in carbon dioxide, there will be another issue waiting for us.

Mothers who eat breakfast and bananas have more boys

By Lindsey Tanner, The Associated Press
CHICAGO (AP) — Snips and snails and puppydog tails ... and cereal and bananas?

That could be what little boys are made of, according to surprising new research suggesting that what a woman eats before pregnancy influences the gender of her baby.

Having a hearty appetite, eating potassium-rich foods including bananas, and not skipping breakfast all seemed to raise the odds of having a boy.

The British research is billed as the first in humans to show a link between a woman's diet and whether she has a boy or girl.

It is not proof, but it fits with evidence from test tube fertilization that male embryos thrive best with longer exposure to nutrient-rich lab cultures, said Dr. Tarun Jain. He is a fertility specialist at University of Illinois at Chicago who wasn't involved in the study.

It just might be that it takes more nutrients to build boys than girls, he said.

University of Exeter researcher Fiona Mathews, the study's lead author, said the findings also fit with fertility research showing that male embryos aren't likely to survive in lab cultures with low sugar levels. Skipping meals can result in low blood sugar levels.

Jain said he was skeptical when he first heard about the research. But he said the study was well-done and merits follow-up study to see if the theory proves true.

It's not necessarily as far-fetched as it sounds. While men's sperm determine a baby's gender, it could be that certain nutrients or eating patterns make women's bodies more hospitable to sperm carrying the male chromosome, Jain said.

“It's an interesting question. I'm not aware of anyone else looking at it in this manner,” he said.

The study was published Wednesday in the Proceedings of the Royal Society B, a British scientific journal.

The research involved about 700 first-time pregnant women in the United Kingdom who didn't know the sex of their fetuses. They were asked about their eating habits in the year before getting pregnant.

Among women with the highest calorie intake before pregnancy (but still within a normal, healthy range), 56 percent had boys, versus 45 percent of the women with the lowest calorie intake.

Women who ate at least one bowl of breakfast cereal daily were 87 percent more likely to have boys than those who ate no more than one bowlful per week. Cereal is a typical breakfast in Britain and in the study, eating very little cereal was considered a possible sign of skipping breakfast, Mathews said.
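One note on reading that statistic: in studies like this, "87 percent more likely" usually reports an odds ratio of 1.87 rather than a difference of 87 percentage points. A hedged sketch of how such a ratio shifts the chance of a boy, using an assumed baseline that is not from the paper:

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    odds = p_baseline / (1 - p_baseline)  # probability -> odds
    new_odds = odds * odds_ratio          # scale by the reported ratio
    return new_odds / (1 + new_odds)      # odds -> probability

p_low_cereal_boy = 0.45  # illustrative share of boys among low-cereal eaters
print(round(apply_odds_ratio(p_low_cereal_boy, 1.87), 2))  # ~0.6
```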

Compared with the women who had girls, those who had boys ate an additional 300 milligrams of potassium daily on average, “which links quite nicely with the old wives' tale that if you eat bananas you'll have a boy,” Mathews said.

Women who had boys also ate about 400 calories more daily than those who had girls, on average, she said.

Still, no one's recommending pigging out if you really want a boy or starving yourself if you'd prefer a girl.

Neither style of eating is healthy, and besides all the health risks linked with excess weight, other research suggests obese women have a harder time getting pregnant.

The study results reflect women at opposite ends of a normal eating pattern, not those with extreme habits, Mathews said.

Professor Stuart West of the University of Edinburgh said the results echo research in some animals.

And Dr. Michael Lu, an associate professor of obstetrics, gynecology and public health at the University of California at Los Angeles, said the results “are certainly plausible from an evolutionary biology perspective.” In other words, since boys tend to be bigger, it would make sense that it would take more calories to create them, Lu said.

Still, Lu said a woman's diet before pregnancy may be a marker for other factors in her life that could influence her baby's gender, including timing of intercourse.

“The bottom line is, we still don't know how to advise patients in how to make boys,” he said.