Sunday, February 1, 2015

[tt] CHE 62n20: Timothy Messer-Kruse: Who Knows?

CHE 62n20: Timothy Messer-Kruse: Who Knows?
January 26, 2015

Timothy Messer-Kruse is a professor in the School of Cultural and
Critical Studies at Bowling Green State University.

Wikipedia U: Knowledge, Authority, and Liberal Education in the Digital Age
By Thomas Leitch
Johns Hopkins University Press

Near the top of the prompt I give my students when I assign research
papers is a prohibition: "Due to the uneven quality and reliability
of its entries, Wikipedia may not be cited as a source in your
paper." Such limitations on students' use of the world's largest
encyclopedia are common in academe and supported by a growing number
of scholarly books and articles that highlight the contradictions
inherent in this vast collection of specialized entries written
principally by nonspecialists and edited by no one and everyone.

Thomas Leitch, director of film studies at the University of
Delaware, observes in Wikipedia U that more than any other
information source, Wikipedia has been the target of attacks on its
veracity, which, he points out, are really questions about the
nature of scholarly authority. Those questions expose not only
problems of popular collaborative sources like Wikipedia, but, more
revealingly, similar paradoxes inherent in academe.

Authority is something academe seems to have long mastered, with its
elaborate systems of citation, credentialing, institutional
reputations, peer review, and professional hierarchies. By relying
on anonymous collective authorship (which is pretty close to no
authorship at all), Wikipedia has been whacked with abandon by its
academic critics. While educators express legitimate concerns about
Wikipedia's structural susceptibility to slipshod scholarship or
plagiarism, many of those criticisms are fundamentally anxieties
about how Wikipedia threatens academic authority.

Leitch's innovation is to turn the tables in both directions: He uses
the values of higher education to expose the contradictions of
Wikipedia, but he just as deftly employs Wikipedia's ethos to expose
the paradoxes of liberal education's own claims to authority. That
authority is built partly on the rigid hierarchies that valorize and
preserve certain forms of knowledge, but equally on the
institutional encouragement to undermine expertise with new research
and interpretations. Academic authority appears solidly rooted in
sources, but such authority largely depends on the frequency with
which they are cited, which is a social arrangement.

Wikipedia touts itself less as an invention than as a communal,
democratic power whose mission, according to its co-founder, Jimmy
Wales, is to afford "free access to the sum of all human knowledge."
But Wikipedia's democratic approach to authority raises the problem
of all knowledge being flattened into a vast plain where no
landmarks rise above the terrain. That danger is compounded by one
of Wikipedia's advantages over printed compendiums--its unrestricted
depth and reach. The physical limits of pages and volumes enforce
the testing of importance and relevance. On Wikipedia, Constantine
the emperor, the Constantine comic books, and Constantine, Mich.,
can be equally noteworthy, discussed at similar length and depth.

Over time, Wikipedia has apparently perceived that reality as a
potential weakness and has slouched toward hierarchy. Entries are
now designated as "featured" or "good" and graded as A through C on
down to the lowly "stub." Not all contributors are of equal
standing, and autoconfirmed editors, "sysops," "bureaucrats," and
"stewards" have been given different editing powers.

Academe also imperfectly balances the competing values of settled
authority and critical questioning. The Encyclopaedia Britannica is
biased toward stability; its authority rests on expert reliability,
not on self-questioning. Wikipedia embraces the opposite, staking
its authority on a principle of relentless expansion, which
theoretically carries with it a unique ability to rapidly adapt new
knowledge and correct mistakes.

While encouraging broad democratic participation, Wikipedia actually
has a more institutionally conservative approach to debate, having
many structures in place to overcome and resolve controversy but few
to encourage it. While much academic debate lives in the shadows,
with peer review and credentialing--from dissertation defenses to
tenure decisions--transpiring in secret, Wikipedia, through its
"Talk" pages, where editorial deliberations and "revert wars" rage,
hangs its laundry proudly out in public.

But Wikipedia's greatest disjuncture with liberal education--and
what academics find so threatening--is its substitution of the
wisdom of the crowd for expertise. Unlike other academic critics of
Wikipedia, who are mostly silent about the philosophical problems of
expert authority while bashing Wikipedia for its celebration of
amateurism, Leitch underlines the contradictions and paradoxes
involved in liberal education's reliance on credentialed experts.
Academic authority is based on mastery of a field, but the continual
expansion of fields renders mastery impossible except in
ever-narrower specialties.

In this sense, Wikipedia and academe are opposites, the one
prohibiting original research (on the principle that only secondary
sources are generally verifiable) and the other following the rule
of publish or perish, an imperative to research that leads to the
paradox that students are taught a body of conventional knowledge by
instructors who are working to undermine it.

Wikipedia's framework rests not only on democratic ethics but also
on the principle of crowdsourcing, or the wisdom of the crowd--the
phenomenon, first described by Francis Galton, that the median of a
large number of independent estimations of a fact closely
approximates reality. That effect, in part, is what makes the
relatively large number of users and contributors to Wikipedia an
argument in favor of its authority, in spite of the fact that,
according to Wikipedia itself, a quarter of its contributors are
under the age of 21, and more than a third do not hold college
degrees.

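Galton's effect is easy to demonstrate in a few lines. The sketch
below is a minimal simulation, not anything from Leitch's book: the
ox weight of 1,198 lb is from Galton's 1907 account, while the
Gaussian noise model and the crowd sizes are illustrative assumptions.

```python
import random
import statistics

def crowd_estimate(true_value, n, noise_sd, rng):
    """Return the median of n independent noisy guesses of true_value."""
    guesses = [true_value + rng.gauss(0, noise_sd) for _ in range(n)]
    return statistics.median(guesses)

rng = random.Random(42)   # fixed seed so the run is repeatable
true_weight = 1198        # Galton's ox reportedly weighed 1,198 lb

small_crowd = crowd_estimate(true_weight, 10, 100, rng)
large_crowd = crowd_estimate(true_weight, 10_000, 100, rng)
print(round(small_crowd), round(large_crowd))
```

With 10,000 simulated guessers the median typically lands within a
pound or two of the true weight, while a crowd of ten can miss by
tens of pounds. That tightening with crowd size is the statistical
core of the crowdsourcing argument.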
Though focusing on Wikipedia for most of his book, at the end Leitch
observes that Wikipedia is just one example of the rapidly shifting
nature of authority in our social-media-saturated society. Compared
with the increasing Balkanization of a hashtagging, tweeting,
self-referential world, Wikipedia seems almost hidebound. In fact,
Leitch urges liberal education to embrace Wikipedia as a counter to
the accelerating atomization and flattening of knowledge and
authority overall.

There is no ultimate solution to these problems of epistemic
authority, only more or less appropriate grounds upon which to base
a critique of new forms of knowledge. Leitch persuasively argues
that academics have plenty to correct in their own backyard before
criticizing their Wiki neighbors: "Given the surprising gaps between
the principles the academy invokes to justify its strictures against
Wikipedia and the principles implicit in its own practices, it seems
unwise, perhaps impossible, for academics to make a case against
Wikipedia based on their own institutional principles, some of which
Wikipedia shares."

So what is an expert to do? Leitch falls back on pedagogy as a
guide. Rather than treat Wikipedia as a useful pariah, Leitch calls
for a policy of "playful" engagement. Wikipedia, because of its open
architecture, the transparency of its internal debates, and the
obviousness of its strengths and weaknesses, is an excellent
pedagogical tool for teaching about the nature of authority and
knowledge. Directing students to participate in writing or editing
Wikipedia entries develops several online-literacy skills, Leitch
says, although he is on the fence as to whether "crap detection" is
one of them.
tt mailing list

Saturday, January 31, 2015

[tt] NYT: Speck of Interstellar Dust Obscures Glimpse of Big Bang

Speck of Interstellar Dust Obscures Glimpse of Big Bang


Scientists will have to wait a while longer to find out what kicked
off the Big Bang.

Last spring, a team of astronomers who go by the name of Bicep
announced that they had detected ripples in space-time, or
gravitational waves, reverberating from the first trillionth of a
trillionth of a trillionth of a second of time--long-sought
evidence that the expansion of the universe had started out with a
giant whoosh called inflation.

The discovery was heralded as potentially the greatest of the new
century, but after months of spirited debate, the group conceded
that the result could have been caused by interstellar dust, a
notion buttressed by subsequent measurements by the European Space
Agency's Planck satellite that the part of the sky Bicep examined
was in fact dusty.

Now a new analysis, undertaken jointly by the Bicep group and the
Planck group, has confirmed that the Bicep signal was mostly, if not
all, stardust, and that there is no convincing evidence of the
gravitational waves. No evidence of inflation.

"This analysis shows that the amount of gravitational waves can
probably be no more than about half the observed signal," Clem Pryke
of the University of Minnesota said Friday in an interview.

"We can't say with any certainty whether any gravity wave signals
remain," Dr. Pryke added. "Obviously, we're not exactly thrilled,
but we are scientists and our job is to try and uncover the truth.
In the scientific process, the truth will emerge."

When the galactic dust is correctly subtracted, the scientists said,
there was indeed a small excess signal--a glimmer of hope for
inflation fans?--but it was too small to tell if it was because of
gravitational waves or just experimental noise.

The Bicep/Planck analysis was led by Dr. Pryke, one of the four
Bicep principal investigators. Brendan Crill, of the California
Institute of Technology and a member of Planck, acted as a liaison
between the groups. They had planned to post their paper Monday, but
the data was posted early, apparently by accident. It was soon taken
down, but not before it set off an outburst of Twitter messages and
hasty news releases.

A paper is to be posted to the Bicep website and has been submitted
to the journal Physical Review Letters.

But it will be far from the final word. A flotilla of experiments
devoted to the cause is underway, studying a thin haze of
microwaves, known as cosmic background radiation, left from the Big
Bang, when the cosmos was about 380,000 years old. Among them is a
sister experiment to Bicep called Spider, led by Bill Jones of
Princeton and involving a balloon-borne telescope that just
completed a trip around Antarctica, as well as Bicep's own Keck
Array and the recently installed Bicep3.

At stake is an idea that has galvanized cosmologists since Alan Guth
of the Massachusetts Institute of Technology invented it in 1979.
Inflation theory holds that the universe had a violent and brief
surge of expansion in the earliest moments, driven by a mysterious
force field that exerted negative gravity. It would explain such
things as why the universe looks so uniform and where galaxies come
from--quantum dents in the inflating cosmos.

Such an explosion would have left faint corkscrew swirls, known
technically as B-modes, in the pattern of polarization of the
microwaves. So, however, does interstellar dust.

The Bicep group--its name is an acronym for Background Imaging of
Cosmic Extragalactic Polarization--is led by John M. Kovac of the
Harvard-Smithsonian Center for Astrophysics; Jamie Bock of Caltech;
Dr. Pryke; and Chao-Lin Kuo of Stanford. They have deployed a series
of radio telescopes at the South Pole in search of the swirl
pattern.

Their second scope, Bicep2, detected a signal whose strength was in
the sweet spot for some of the most popular models of inflation,
leading to a sensational news conference attended by Dr. Guth and
Andrei Linde, two of the founding fathers of inflation.

But that was before critics raised the dust question. Moreover, that
result was contrary to a previous limit on the strength of
gravitational waves obtained by the Planck satellite, which has
scanned the entire microwave sky in search of the Big Bang's
afterglow.

Planck observed the microwaves in nine frequencies, making it easy
to distinguish dust. Bicep2 had only one frequency and lacked access
to Planck's data until last fall, when the two groups agreed to work
together.

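The point about frequencies can be made concrete with a toy
component-separation sketch. This is not the actual Bicep/Planck
likelihood analysis: assume, purely for illustration, that the CMB
contribution is the same in both bands while dust is three times
brighter in the higher-frequency band. Two band measurements then
determine both components as a 2x2 linear system, whereas a single
band cannot distinguish them.

```python
# Toy two-band component separation (illustrative only).
# Assumptions: the CMB term is identical in both bands, and dust is
# dust_ratio times brighter in the high-frequency band.
def separate(obs_low, obs_high, dust_ratio=3.0):
    """Solve obs_low = cmb + dust, obs_high = cmb + dust_ratio * dust."""
    dust = (obs_high - obs_low) / (dust_ratio - 1.0)
    cmb = obs_low - dust
    return cmb, dust

# Two bands pin down both components:
cmb, dust = separate(obs_low=5.0, obs_high=11.0)   # -> (2.0, 3.0)

# A single band cannot: any split with cmb + dust = 5.0 matches the
# low-band observation equally well.
```

That one-band degeneracy is essentially why Bicep2, on its own, could
not rule out dust as the source of its signal.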
Dr. Bock of Caltech, in an interview at the end of what he called a
long, stressful day, characterized the result as "no detectable
gravitational waves."

"I'm not discouraged," he went on. "We're going to have to have
better data to get a definitive answer."

In an email, Paul J. Steinhardt, a Princeton cosmologist who was a
founder of inflation but turned against it in favor of his own
theory of a cyclic bouncing universe, said the new results left
cosmologists back where they were before Bicep.

But Dr. Linde noted that there was evidence in the new analysis for
a gravitational wave signal, albeit at a level significantly lower
than Bicep had reported. "This is what all of us realized almost a
year ago, and it did not change," he said in an email.

The earlier Planck result limiting gravitational waves, he said, had
inspired a firestorm of theorizing, in which he and others produced
a whole new class of theories relating not just to inflation, but to
dark energy as well.

"So yes, we are very excited, and no, the theory did not become more
contrived," he said.

Max Tegmark, an M.I.T. expert on the cosmic microwaves, said, "It's
important to remember that inflation is still alive and well, and
that many of the simplest models predict signals just below this new
limit." The next few years will be interesting, he said.

Michael Turner, a cosmologist at the University of Chicago, said he
could appreciate the frustration of his colleagues, who have been
wandering in the wilderness for nearly four decades looking for
clues to the Beginning.

"Inflation is the most important idea in cosmology since the hot Big
Bang," he said. "It is our Helen of Troy, launching a thousand
ships."

[tt] (c-punks) AWO text available (fwd)

----- Forwarded message from brian carroll <> -----

Date: Tue, 27 Jan 2015 19:55:00 -0600
From: brian carroll <>
To: cypherpunks <>
Subject: AWO text available

Apple Watch Observations. 123pp, btc
no copyright. redistribution & mirroring ok (see disclaimer)

Critique of the Apple Watch concept in a good-bad-&-ugly scenario,
accounting for app mania & the "apps apps apps" mantra preceding
creation of the App Watch, standing in for the watch of the future.
Thus proprietary apps, OS & ecosystem approach as a conceptual
and ideological limit, though also in a context of hidden politics,
exploitation of technology and rapidly developing police state.
In this way dual-use blackbox devices, antagonistic electronics
operating with competing value systems, human and antihuman,
truth absent within computer data models, no structural basis
for knowledge, the bit itself becomes 'the sign of truth', where
$=bit (on=money, off=no-money) in terms of evaluating data,
leading to slavery dynamics, censoring data by denying access
via authoritarian control over culture, on behalf of tyrannical
power politics of Big Daddy as state oppressor, not Big Brother.
In this way - as an optimistic view - establishing new rights to
data access via wearable technology, micro-data & -payments
in surrounding pervasive wireless infrastructure, beyond the
detached aristocratic mindset of Silicon Valley utopianism,
developing tools for humanity, to support & secure freedom.

/note: the gist for cypherpunks & crypto could involve an otherwise
off-the-books threat model & related security issues in a real-world
non-politically-correct analysis conveyed via plainspoken viewpoint/.

0.1 unveiling, cognitive dissonance, counterpoint, time = $$$
1.1 device name, seitiroirp, technocracy, culture & fitness profiles
1.2 watch of the future, Achilles heel, risk and reward, App Watch
1.3 timing, SoC as movement, GPS, XU, OS-determinism, sans AI
1.4 apps, middle-mgmt, ecosystem, ideology, iPhone, autonomy
2.1 aesthetics, inside/outside, gap, data-model, utility & futility
2.2 moonshot, wrist computer, security threat, proprietization
3.1 consumerism, QC and QA, revolution, ideas and processing
3.2 calligraphy v. helveticization, interface as facade, core-rot
3.3 forced perspective, junk|joke, mirroring, mundanity of evil
4.1 watches {mechanical, electronic, computer} w/A&D display
4.2 lessons unlearned, display-battery issues, value, economy
4.3 snsrs, smartwatch {digital watch}; set {subset}, subset {set}
4.4 the heart, biorhythmic movement, fitness, ID maze, dystopia
4.5 jumping spider, true innovation, seed, wristwatch data orrery
4.6 aesthetic substance, 1984, inhuman technology, Dark Ages 2.0
5.1 space-time-place device, brain|mind, consciousness, AI, QvA
5.2 time signals, delay, accuracy, GPS & atomic clocks, Project Apollo
5.3 context+sensors+datalogging+networking < user case studies
6.1 malproduction, surveillance, antagonistic electronics, crazy ones
6.2 more: dual-use, weaponization, unreality, hidden politics, errors
6.3 blackbox, eg. flashdrive, A/B, corruption, NSA, police state, trap
6.4 control. transparency, covert infrastructure, Taylorism, insecurity
6.5 flds+sgnls, diagnostics, forbidden fruit, Abuse, sexuality & power
6.6 fear, donuts, aristocracy, slavery, OS plantation, clockwork-apple
6.7 fortune telling, $=bit, wysinwyg, invisible states, watching spies
6.8 ideals, Big Daddy,, IoT, bubble, conceptual shareholders
7.1 data infrastructure, the grid, 1950s, security,
7.2 lifespan, jewelry.ext, adaptive+assistive, mission, snake eyes
7.3 hybrid [touch'e, hands]. OS highway & App store, data.access
7.4 variation, design styling, extroversion,, tiers, gold
7.5 radar, wi-fi infrastructure, micro-data development, freedom
7.6 direction, cultural RD&D, calculator v. computer, human future

xtre: proposal for public email list to discuss & debate ideas...

It is wondered if anyone would be interested in establishing
an email discussion list related to general AWO themes as
a beginning point for any such artifact/infrastructure/system
issues, observations or analyses as the general framework.
A proposed list name, '', is keyed from the text,
simultaneously referencing a potential Wi-Fi based wristwatch
with public access to pervasive data services, though also the
reality of being watched in surveillance-based global society,
as this also relates to watching the watchers, everyone an eye
as observations empirically correlate, find ground or fail to unify
as this increases isolation, alienation, silence, self-censoring.
The main concern with an open discussion (whether public or
private) is institutional and-or legal support or backing to limit
internal/external subversion or interference with list operation
so that people can feel safe communicating in "free society."
My vote would be for a public list connected to a University,
where a lot of people are discussing the same realm of ideas,
not limited of course to AWO or authorial perspective, though
involving dynamics mentioned, hypotheses, many viewpoints.

Any feedback or ideas on this appreciated. ~cc.nettime-l

{The Internet Emporium}: related project

----- End forwarded message -----

[tt] NS 3005: Google Glass: dead or quietly evolving?

I wondered why I stopped hearing about them.

NS 3005: Google Glass: dead or quietly evolving?
* 21 January 2015

Glass shut up shop last week. Google said its high-profile wearable
computing project was "graduating" from its original home in the
company's secretive X lab, and that it is still working on new
versions of the device. The firm also said it was closing its
Explorer scheme for developers to get their hands on Glass. Many
interpreted this as Glass being discreetly eased out of the
picture.

It wouldn't be a huge surprise. The device was unwieldy and had bad
battery life, a terrible display and limited input options. Its
unappealing aesthetic, combined with the potential for surreptitious
video recording, earned its wearers the nickname Glassholes.

But it's unlikely that Glass is indeed dead. Consumers may have
rejected its high price tag, but business and industry were kinder.
Surgeons, who value hands-free access to patient data, were
particularly keen.

And although Google's initial foray into spicing up our field of
view may be over, a future in which computers augment our senses is
still very much on the cards.

"Our focus is on creating a global communications system that would
be larger than anything that has been talked about to date," Elon
Musk tells Bloomberg about his latest grand venture: hundreds of
satellites providing global high-speed internet access.

A picture can be a vital sign of one's love
Every breath you take, I'll be watching you. A high-tech photo frame
billows its surface in time with the breaths of the person pictured,
picking up their breathing pattern from a sensor-studded belt they
wear. The gadget - developed by Jina Kim and colleagues at the Korea
Advanced Institute of Science and Technology in Daejeon, South Korea
- aims to help couples feel closer when they're apart. It was
presented last week at a conference at Stanford University in
California, along with positive feedback from eight couples who
tried it.



[tt] NS 3005: Shoes vs barefoot: The myth of the normal foot

What about the myth of normal brains?

NS 3005: Shoes vs barefoot: The myth of the normal foot
* 26 January 2015 by Laura Spinney

The average Western foot is deformed by shoes. If you ditch them,
will your feet bounce back or are you simply asking for trouble?

MY RUNNING shoes have a thick sole and cushioned heel. I bought them
five years ago, before the "barefoot" craze for minimalist shoes
that would allow people to better emulate how our ancestors ran.
Soon after that, reports began appearing of injuries sustained by
runners who had adopted these shoes, and lawsuits were filed against
some manufacturers. Now the maximally cushioned or "fat" shoe is
back in vogue, and suddenly my old shoes look high-tech again.

Is all this simply a matter of fashion, I wonder, or is it telling
us something more profound? Surprisingly, we are only beginning to
discover what a normal human foot looks like, how it should move,
and the role that shoes play. Recent research, sparked in part by
the fallout from barefoot running, reveals enormous diversity in
healthy feet. What's more, the average Western foot turns out to be
an outlier, deformed with respect to our ancestors' feet and those
of our barefoot contemporaries. Much of this is down to shoes, which
have taken over some of the work our feet had to do to allow us to
become bipedal. "We assume that the people around us are normal, but
from an evolutionary perspective, they're not," says evolutionary
biologist Daniel Lieberman at Harvard University.

The anatomy of the human foot is no mystery. It is a complex
structure, containing 26 bones and over 100 muscles, tendons and
ligaments. It is also malleable, as will be obvious to anyone who
has seen photos of young women's feet bound according to a gruesome
old Chinese custom, ostensibly to make them dainty. Some victims
wound up with feet that looked as if they had inbuilt high heels.

Foot shape is the product of gene-environment interactions, but how
do they play out? Until recently, the few studies there were had
focused almost exclusively on Westerners - which, in practice, meant
people who had worn shoes since they could walk. Lieberman and his
colleagues were among the first to cast their net more widely. In a
study published in 2010, they found that Kenyan endurance
runners who had grown up without shoes landed more often on their
toes than on their heels as 80 per cent of shoe-wearing distance
runners do. The work helped to trigger the barefoot running craze,
but Lieberman points out that the sample size was small and that the
results didn't support many of the claims later made for barefoot
running, such as the idea that it reduces the risk of injury.
However, the hint that wearing shoes could have such a big impact on
how we use our feet was intriguing, and Lieberman and others have
pursued its implications.

A team led by biological anthropologist Kristiaan d'Août, then at
the University of Antwerp, Belgium, also did pioneering work in this
area. In 2009, they measured the feet of 70 Indians who didn't wear
shoes and compared them with those of 137 Indian and 48 Belgian
shoe-wearers. They also asked all three groups to walk on a
pressure-sensing treadmill, which generated dynamic pressure maps of
the foot as it hit the ground.

The barefoot walkers tended to have relatively wide feet, with
pressure fairly evenly distributed over the parts touching the
ground when walking. The shoe-using Indians had narrower feet and a
less even pressure distribution. But the Belgians, who wore more
constricting shoes more often than the shoe-wearing Indians did, had
very different feet: relatively short and slender, with pressure
hotspots at the heel, big toe and midfoot region of the metatarsals
(see diagram).

Floppy feet

The researchers concluded that shoe-wearing is one of the most
powerful environmental factors influencing the shape of our feet
(Footwear Science, vol 1, p 81). It can also have a big impact
on the way we walk, as anthropologist Jeremy DeSilva and gait expert
Simone Gill, both at Boston University, discovered. They persuaded
nearly 400 adult visitors to the Boston Museum of Science to walk
barefoot over a 6-metre-long "gait carpet", which measured speed and
stride length as well as building pressure maps. This revealed
something remarkable. Around 1 in 13 people were extraordinarily
flat-footed: they had a pressure hotspot resulting from their
midfoot moulding to the ground as they walked. "Their feet were as
flexible as chimps'," says DeSilva (American Journal of Physical
Anthropology, vol 151, p 495).

As humans evolved to be bipedal, our feet developed longitudinal and
transverse arches. These created rigidity in the central part of the
outside of the foot, to help propel us forward when we lift our heel
and push down on the ball of the foot. In other words, a rigid
midfoot is a signature of bipedality. Chimps lack this rigidity,
their feet being floppier in the middle to allow them to grip a
branch. In technical terms, they have a "midtarsal break", and it's
this that DeSilva and Gill observed in some museum visitors. Since
publishing their finding in 2013, they have ruled out the
possibility that the midtarsal break runs in families. In other
words, it isn't strongly heritable, although a predisposition to it
could be. Instead, DeSilva suspects that it is mainly a result of
wearing shoes. "The shoe provides the rigidity, in a way, so the
foot doesn't have to," he says.

Two studies published by Lieberman and colleagues last year seem to
back this conclusion. In one, they looked at the feet of Tarahumara
Native Americans in Mexico - famed endurance runners whose
traditional sandals inspired minimalist running shoes - and found
that those who ran in sandals had stiffer arches than those who ran
in conventional shoes (Journal of Sport and Health Science, vol
3, page 86). The other study showed just how quickly feet can
adapt. After 12 weeks of regular running in minimalist shoes,
Western runners developed significantly stiffer arches.

What goes on within our feet as we walk is still a bit of a mystery.
The pressure map method can only give an indirect measure of the
mechanics involved. But a novel technique pioneered by Paul Lundgren
at the Karolinska Institute in Stockholm, Sweden, and colleagues,
takes things a step further. They surgically implanted metal pins
into nine bones in the feet of six volunteers, and capped the
protruding ends with reflective markers that could be tracked using
motion-capture cameras. The technique revealed that all the joints
in the foot and ankle contribute to the way we walk, the movement of
each joint being dependent on the others (Gait & Posture, vol
28, p 93). It also showed great diversity among individuals in the
range of movement of each joint - especially in the midfoot.

A team at the University of Liverpool, UK, led by Karl Bates, has
replicated that finding in a group of 45 volunteers, using pressure
maps. Their study also included bonobos and orangutans, revealing
the pressure of human footfalls to be as diverse as those
measured in these most arboreal of apes. "What the bone-pin study
showed is that everybody is different," says Bates. "For some people
the foot is stiff, but for others there is actually a surprising
amount of movement."

This natural variation raises important questions. First, if
"normal" covers such a wide range, what is an abnormal foot? In the
past, foot disorders have been defined as much by social concerns as
by medical ones. For example, flat feet were regarded as a sign of
moral flabbiness in the American character, according to medical
historian Beth Linker of the University of Pennsylvania,
Philadelphia (Social History of Medicine, vol 20, p 91). During
the first world war, a soldier could be invalided out of the US army
for flat feet - but not for shell shock - and flat-foot camps,
designed to rehabilitate the afflicted, spread across the country.

Doctors also have misconceived ideas about feet. "The human foot is
supposed to be very stiff, and if it's not then often a clinical
problem is diagnosed," says Bates. But he and others have shown that
flat-footedness isn't necessarily associated with pain or any
radical restriction of function. None of the flexi-footed visitors
to the Boston Museum of Science complained of pain. And although
DeSilva suspects that people with mobile midfeet may not figure
among the fastest runners, because they have less elastic recoil
when they push off the ground, they pay no obvious price otherwise.

Bates believes the new findings should also change the way we
interpret hominin fossils, because the bones of one individual may
tell us little about how its foot worked, let alone how other
members of the species walked. Take Lucy, the famous 3.2
million-year-old australopithecine unearthed in Ethiopia, who
carries all the hallmarks of bipedalism. When DeSilva compared her
ankle bones with X-rays of modern human feet, he concluded that
she was probably flat-footed in a non-pathological way. It's
hard to say how typical of her kind she was, though. "There would
have been variation in her species as in ours, but perhaps around a
different norm," he says.

We still have much to discover about what normal means when it comes
to feet but one thing is clear. Although going barefoot was normal
for most of human evolution, our relatively short period of footwear
use - about 40,000 years, according to the archaeological record -
has left its mark. That's largely because the human foot turns out
to be so plastic. This finding, in turn, holds hope for anyone
wanting to turn back the clock. We may be able to run more like our
ancestors if we take it gradually, realising that in donning
minimalist shoes we load our bodies differently, and that the
surfaces we run on are quite different to what they coped with.
Nevertheless, the jury is still out as to whether barefoot shoes
bring better performance or fewer injuries. Until it delivers its
verdict, I'll be hanging on to my old running shoes.

Cinderella's legacy

"Things started to go wrong in the 16th century," says Marquita
Volken, a shoe archaeologist who runs the Shoe Museum in Lausanne,
Switzerland. It was then that European streets began to be paved and
the soles of shoes began to get thicker to cushion urban feet.
Influenced by the vagaries of fashion, heels rose and both men and
women were soon tottering on platforms up to half a metre high.
These were the peacock's tail of footwear, a showy badge of social
superiority, says Kristiaan d'Août of the University of Liverpool in
the UK - since there was no way the wearer could work in them.

The French Revolution brought everyone back down to earth, and when
heels started rising again the trend only affected women's shoes -
probably, d'Août suggests, because they exaggerated the female
aspects of gait. A recent study hints this could have benefits. It
showed that men's (but not women's) helpfulness towards a woman was
correlated with the height of her heels ([29]Archives of Sexual
Behavior, DOI: 10.1007/s10508-014-0422-z).

High heels are not good for feet, however, especially when shoes
also constrict the toes. Studies of premodern European skeletons
suggest that hallux valgus - the condition commonly known as the
bunion - started to become prevalent in the 16th century, and has
never been more common in women than it is now. A 1993 survey of
American women showed that 88 per cent wore shoes that were too
small for them, 80 per cent reported pain, and 76 per cent had some
sort of foot deformity, bunions being the most common (Foot & Ankle,
vol 14, p 78). "Shoe design is cyclical," says Volken, whose new
book [30]Archaeological Footwear chronicles the development of shoes
from prehistory to the 1600s. "We're currently in an unhealthy

Laura Spinney is based in Lausanne, Switzerland



[tt] NS 3005: Where heroes come from - and how to become one

What about heroes that criticize right thinking?

NS 3005: Where heroes come from - and how to become one
* 22 January 2015 by [10]Michael Bond

"I did it without thinking," people often say after saving a
stranger's life. The truth is, heroism develops over a lifetime -
and it's never too late to learn

IT TOOK Michael McNally about 10 seconds from hearing the crash to
run from his house in the Cape Cod village of Marstons Mills to the
road outside. When he got there, the car was already burning, its
front end bent around a tree. Things were exploding in the engine
compartment. He looked inside and saw a young woman in the passenger
seat. She was about the same age as his daughter. It was clear that
if she stayed there another minute she would die.

McNally, 51, reached in through the passenger window and tried to
pull her out. He lost his grip, so he repositioned himself through
the back seat window and pulled her through by her ankles. "The poor
girl was on fire," he says. "Her skin was coming off. It was a
horrible thing to see." She was severely burned, but survived.

If you want to know why anyone would risk their life to save a
stranger, the last people you should ask are the heroes themselves.
Whether running to a burning car or sheltering someone from secret
police, usually the protagonists cannot explain why they acted the
way they did. "I don't know why I did it," says McNally. "I only
know that I did. I just had to act."

It's a familiar story to Walter Rutkowski, president of the
[16]Carnegie Hero Fund Commission, which awards medals to American
and Canadian civilians who risk their lives to save others - McNally
was given one last year. The usual explanation, he says, is that
there is no explanation. "We have more or less settled into the
knowledge first expressed by our founder Andrew Carnegie in 1904:
heroic action is impulsive."

Compelling stories

This impulsiveness, this apparent unpredictability, is the mystery
of heroism. Like all good mysteries it has inspired a host of
investigators. For many, their interest in the subject is as much
personal as academic: their own stories are often as compelling as
those of their subjects.

One such investigator is Samuel Oliner, a sociologist at Humboldt
State University in Arcata, California. In June 1942, when he was
12, the Nazis ordered his family to move from their home in the
village of Bielanka, southern Poland, to a Jewish ghetto in a nearby
town. Early one morning two months later, Nazi soldiers entered the
ghetto and ordered everyone into the street. Oliner's stepmother,
sensing what was about to happen, pleaded with him to run. So he hid
on the roof while the soldiers herded his family and their
neighbours into trucks, drove them into a nearby forest, and killed them.

Oliner eventually left his hiding place and headed into the
countryside. After three nights sleeping rough he knocked on the
door of a Catholic woman, Balwina Piecuch, who had known his family
before the war. At great risk to herself and her family, she took
him in, helped him create a false identity and hid him from the
Gestapo. Oliner is fond of saying that her act of kindness not only
saved his life, it also shaped his life. After the war he emigrated
to the US, entered academia and dedicated his career to
understanding the selfless motivation of people like her.

Altruism has long been an evolutionary mystery. Why would anyone
choose to help somebody not related to them, with no promise of
reward? The usual answer is that [17]such behaviour is an
adaptation: for example, groups in which it emerged would have been
more cohesive, and hence more successful. But what about acts of
extreme altruism? Can we ever understand why some people risk - and
sometimes lose - their lives for a stranger?

To try to answer this question, Oliner and his wife Pearl set up the
Altruistic Personality and Prosocial Behavior Institute at Humboldt
State University in 1982. In one of their first studies, still the
largest of its kind, they interviewed and psychologically assessed
406 people who had risked their lives to rescue Jews in
Nazi-occupied Europe, along with 72 people who had lived in occupied
areas but had done nothing out of the ordinary. A number of things
became clear. The rescuers were much more empathic than the
non-rescuers, and they also espoused values of fairness, compassion
and personal responsibility towards strangers that they said they
had learned from their parents.

What's more, they were unusually tolerant: the people they
identified as their "in group" consisted of the whole of humanity,
not just their own kind. As [18]Kristen Monroe at the University of
California, Irvine, who has studied the psychology of Holocaust
rescuers, puts it: "Where the rest of us see a stranger, an altruist
sees a fellow human being."

Samuel Oliner says this finding has held up in all their subsequent
studies. It has also been replicated by psychologist Eva Fogelman,
whose father, too, owed his wartime survival to the generosity of
Polish peasants. Fogelman has spent much of her career studying the
psychological effects of the Holocaust on survivors and their
families. In her book [19]Conscience and Courage, she recalls her
conversations with about 300 rescuers of Jews: "I began after a
while to wait for the recital of one or more of those well-known
passages: a nurturing, loving home; an altruistic parent or beloved
caretaker who served as a role model for altruistic behavior; a
tolerance for people who were different."

Further research has added weight to the idea that some people are
more predisposed than others to help. In one [20]recent study, David
Rand of Yale University and his colleagues got volunteers to play a
series of cooperation and punishment games often used in
experimental economics [21](see diagram). They found that people who
cooperate in one game tend to cooperate in all, and also help out
for real when offered a chance to do so, even when there is nothing
in it for them. "The basic motivators that make you want to help
people apply across a lot of different domains," says Rand.

Where do these motivators come from? In keeping with the Oliners'
findings, cooperators are more likely to hold egalitarian values and
be strongly influenced by their parents' altruism. A series of
[22]recent studies also suggest that altruistic behaviour is seeded
in young children's early social interactions with adults.

There also appears to be a biological component - although whether
this is inherited or acquired is not known. Neuroscientists led by
Abigail Marsh at Georgetown University in Washington DC [23]found
that people who had volunteered to donate a kidney to a stranger had
larger and more responsive right amygdalae than normal. This area of
the brain helps us recognise fearful facial expressions, something
altruists and those high in empathy are adept at. Many studies have
shown that people who are better at recognising fear in others are
more likely to help them. The right amygdala is also notably reduced
in psychopaths, who are spectacularly bad at recognising or
responding to fear.

All this points to what Samuel Oliner calls an "altruistic
personality" - a set of stable, lifelong traits that consistently
orientate some people towards altruistic behaviour.

Rand is now grappling with the million-dollar question: does having
an altruistic personality make someone more likely to risk their
life to save a stranger? He thinks impulsive heroes are indeed
motivated by their personality, though this is hard to test because
people rarely get the opportunity to be heroic more than once. This
infrequency is a major barrier to understanding extreme altruism.
Since heroes are heroes perhaps once in a lifetime, and their
heroism figures prominently in the story of their lives thereafter,
it is tempting for them to rewrite their personal narratives -
particularly when questioned by researchers. "If you put your life
on the line for someone you don't know, most people would want a
narrative to help them make sense of the massive risk they've
taken," says Frank Farley, who studies risk-taking and heroism at
Temple University in Philadelphia.

One thing seems clear, however: heroism is intuitive. It couldn't be
any other way, says Rand, because most people who find themselves in
high-stakes situations are completely unprepared. This is borne out
by another of Rand's studies. He examined the testimonies of 51
Carnegie heroes to try to understand how they decided to risk their
lives. In line with Carnegie's hunch, he [24]found overwhelmingly
that their actions were intuitive rather than deliberative. Even
when they had time to reflect on what they were about to do, they
did not.

Take this typical response, from 60-year-old lawyer Kermit Kubitz,
who in 2007 intervened to protect a 15-year-old girl from a knife
attack and ended up being stabbed himself: "I think it was just
instinct." And recall McNally's 10-second dash to the burning car.
"I didn't really have time to think," he said. Or perhaps he chose
not to.

Default setting

Rand believes our reaction at such times reflects the way we usually
behave in more familiar, low-stakes scenarios. So if someone is
accustomed to acting altruistically on a daily basis, they are more
likely to do so when the risks are high, because this is their
default behaviour. Extreme altruism, then, is just that: an extreme
form of "ordinary" altruism. But, Rand says, this doesn't mean it is
adaptive. Instead, it is a misapplication of an impulse to be
generally helpful to others.

He also acknowledges that although altruism is necessary for
heroism, it isn't sufficient. "If you were to put that same person
in the same situation, we don't know how often they would risk their
life. They need to be in the right internal state."

Farley agrees that an altruistic personality alone does not make
heroes. You also need a propensity for risk-taking or
thrill-seeking, known as a Type T personality, he says. "If you are
a T Type without altruism, you may take a pass. If you are
altruistic but risk-averse, you may also take a pass."

How often these personality traits come together is not known, but
heroic acts are not all that unusual. In its 110 years the US
Carnegie fund has awarded almost 10,000 medals, around 90 per cent
of them to men, and 20 per cent posthumously to people who died in
their act of heroism. Its sister organisations in Europe have
awarded thousands more.

Further understanding has come from studying war heroes. Heroism in
battle is a little different from extreme altruism because soldiers'
heroic acts are almost always inspired by loyalty to comrades rather
than compassion towards strangers. Perhaps not surprisingly, then,
war heroes don't seem to share personality traits in the same way
that extreme altruists do. A [25]study of 283 Israeli soldiers
awarded medals for bravery during the 1973 Yom Kippur war found no
personality traits that set them apart from other soldiers.

However, it is possible that they share traits or histories that
have not been picked up due to a lack of biographical data. Didy
Grahame, who as secretary of the Victoria Cross and George Cross
Association in London is familiar with the histories of hundreds of
decorated heroes, says that a disproportionate number are older
siblings from a large family, sons of widowed mothers, or had other
early life experiences that gave them a habit of caring and taking responsibility.

This has not been confirmed by academic research, but if it is
correct, war heroes have much in common with civilian ones. McNally,
reflecting on his rescue mission, revealed that he had been a helper
all his life, caring for his mother after the death of his father.

Despite the indications that altruistic behaviour comes more
naturally to some people than others, many researchers in the field
are convinced that it can be taught. "While the biological
predisposition should not be neglected, there is no doubt in my mind
that people can be primed to change their position from bystander to
helper," says Samuel Oliner. He says the best chance is during
childhood, and that school curricula should include programmes aimed
at instilling altruistic values.

It is also possible in adulthood, says Ervin Staub, a psychologist
at the University of Massachusetts at Amherst, who has spent much of
his career trying to achieve just that. Like so many others, the
seeds of his academic mission were planted during the second world
war. His family was among thousands of Hungarian Jews who were
sheltered from the Nazis by the Swedish diplomat Raoul Wallenberg.
For three decades, Staub has been testing what he calls "active
bystandership" - the capacity of people who may not be naturally
heroic to help those in distress.

Staub worked with California's department of justice following the
beating of Rodney King by Los Angeles police in March 1991,
encouraging officers to break ranks and speak out. Since 1998, he
has been promoting reconciliation in Rwanda. One of his successes
there is an educational radio drama designed to teach people about the
causes of conflict and how to resolve it, which he says has
engendered more positive relations between ethnic and social groups
and led to a greater appetite for reconciliation. He has also
established a programme in Massachusetts to help schoolchildren
challenge bullies, which can require considerable courage.

Two years before Carnegie established the hero award, his friend
Silas Weir Mitchell, one of the founders of neurology, wrote a
magazine article called Heroism in Every-Day Life. "Men are in
emergencies the puppets of their past, which of a sudden pulls the
unseen wires and determines action," he wrote. "The gun was loaded
long ago: occasion pulls the trigger."

That article may have inspired Carnegie to set up his fund.
Fortunately for humanity, we now know that neither man got it
entirely right. Heroes do not act entirely out of the blue, and it
is never too late to load that gun.

Michael Bond is a New Scientist consultant based in London. This
article is based on a chapter in his book [26]The Power of Others


tt mailing list

[tt] NS 3005: Let them eat steak: How to eat meat the healthy way

How many years are taken off one's life span by worrying about one's

NS 3005: Let them eat steak: How to eat meat the healthy way
* 21 January 2015 by [11]Linda Geddes

Linked to all manner of illness and an eco-villain too - meat has an
image problem. But the evidence says that smart diners can welcome
it back to the menu

BACON causes breast cancer; chops clog your arteries. The headlines
are clear - if you care about your health, you shouldn't be eating
meat. Once considered the star attraction of a balanced, healthy
plate of food, meat is now linked to obesity, heart disease and
cancer. Add the environmental concerns over a growing global
appetite for meat, and it seems meat should now be an occasional
guilty pleasure rather than a daily staple, or so we are told.

Yet the evidence isn't quite as clear-cut as the headlines suggest,
and not everyone is convinced of the perils of tucking into a juicy
steak. A growing body of research - which is, perhaps
unsurprisingly, being championed by the meat industry - suggests
that recommendations to cut down on or give up meat altogether are
too restrictive and could even be doing us more harm than good. Who
should we believe, and are the dire warnings about the health risks
of eating meat justified?

The first hints that meat isn't all it's cut out to be came in the
1970s, says [18]Denis Corpet, who studies the role of diet in cancer
at the University of Toulouse in France. "Surveys started to show
that countries that eat a lot of meat see more colorectal cancer
than countries where people eat very little."

That link to cancer was more firmly established in 2007, with a
World Cancer Research Fund (WCRF) report which pulled together the
results of 14 studies, [19]concluding that red and processed meats
were "convincing causes of colorectal cancer". It suggested cutting
out processed meat altogether and eating no more than 500 grams of
red meat per week, prompting newspaper headlines such as "a sausage
a day can increase bowel cancer risk". For most other cancers, the
evidence is less convincing, says epidemiologist Teresa Norat at
Imperial College London. "The evidence is really for colorectal, and
probably stomach cancer."

Of course, meat has gained its unhealthy reputation for other
reasons as well. Two large studies published in 2012 found that
[20]the risk of dying from all causes - including bowel cancer and
heart disease - during the study follow-up period was 13 per cent
higher for people eating 85 grams of red meat per day, and 20 per
cent higher for those eating 85 grams of processed meat. That would
translate to roughly [21]a year off life expectancy for a
40-year-old man who eats a burger a day.

If these studies are to be believed, that's a lot of lives
potentially being shortened by meat-eating. [22]UK dietary surveys
show that 4 in 10 men and 1 in 10 women eat more than 90 grams of
red and processed meat a day on average.

But matters are complicated by the fact that studying exactly what
people put in their mouths is notoriously tricky. For the most part
researchers have had to go on what people say they eat, which can be
unreliable. And diet is intricately linked to other lifestyle
factors that affect health, not to mention the fact that studies
vary in the way they are carried out: many don't make a distinction
between different kinds of meat, for example.

Some of the most recent, large-scale research that does take these
factors into account has found little or no connection between meat
consumption and cancer or heart disease. In 2013, results emerged
from two such studies. One was the EPIC trial, which followed half a
million people in 10 European countries over 12 years, and as well
as distinguishing between consumption of red meat, white meat and
processed meat, it also controlled for factors such as smoking,
fitness, body mass index and education levels, all of which might be
correlated with high meat consumption.

Red alert

The study found no association at all between fresh red meat and ill
health, but the link with processed meat remained. It found that for
every 50 grams of processed meat people consumed each day, [23]their
risk of early death from all causes increased by 18 per cent (see
also "[24]The raw facts"). And a US study of almost 18,000 people
taking part in the National Health and Nutrition Examination Survey
(NHANES) found [25]no association between deaths from cancer or
cardiovascular disease and the consumption of meat - even processed meat.

The NHANES findings were surprising, says [26]Sabine Rohrmann of the
University of Zurich, Switzerland, who was involved in both NHANES
and EPIC. "It was an outlier, because most studies have shown an
association." One explanation could be that the dietary
questionnaire used in NHANES was too crude. It didn't ask people
about portion sizes, simply how often they consumed red meat, so
people who said they frequently ate meat might only have been eating
small amounts.

On the other hand, there could genuinely be no association between
meat consumption and deaths in this population. "At this stage, I
don't think we have enough evidence to say that people should avoid
meat," says Rohrmann. "It's an important food, it contains B
vitamins, iron, zinc and other minerals and micronutrients. But meat
consumption shouldn't be too high."

Contrary to the advice being dished out by the WCRF, based on her
findings she wouldn't advocate abstaining from processed meats, at
least until more data is available: "My recommendation would also be
to limit it."

Even those singing the praises of meat agree with the idea of
cutting down on the processed forms. But for fresh meat, they also
point to the [27]turning tide of evidence around saturated fat, once
viewed as public enemy number one. Its supposed heart-harming effect
was one of the reasons people were told to cut meat consumption in
the 1970s. But recent studies hint that saturated fats aren't as bad
for the heart as previously thought. There are numerous benefits
from eating fresh meat too, they say, not least as the [28]most
readily available source of dietary iron.

Besides, over the last few decades, cuts of beef have become much
leaner. More than 60 per cent of beef cuts now meet the US
government guidelines for lean meat, says Shalene McNeill, a
nutritionist at the [29]National Cattlemen's Beef Association in
Denver, Colorado.

Ironically, though, it's the iron-rich component in unprocessed red
meat, rather than its fat content, which is now generating concern.
For a long time, Corpet had been trying to understand why in his
studies it was only red meat that seemed to induce pre-cancerous
changes in the bowels of mice; poultry didn't, and fish even seemed
to be protective. Then he realised the thing that makes red meat
stand out from the rest: haem.

Haem is the iron-rich, non-protein component of haemoglobin - the
substance that carries oxygen around in blood - and it is what gives
meat its red colour. To test whether haem could be the missing link,
Corpet added powdered haemoglobin to rats' food. "It had the same
effect as feeding them beefsteak - it promoted tumour growth," he
says. [30]Chicken, which contains very little haem, did not.

Haem seems to produce carcinogenic molecules by oxidising fats it
comes into contact with - both in the meat, and in vegetable oils.
"Even if I eat a very lean red meat like liver, the haem will
oxidise whatever fat I have in my salad dressing, for example," says Corpet.

Other problems could arise not from the meat itself, but how it
reacts with microbes in the gut to produce potentially
artery-clogging compounds (see "[31]The raw facts"). The way we cook
meat could also make a difference. [32]Barbecuing and frying it
could contribute to ill health, since charring produces carcinogenic
compounds, and some people might be more susceptible than others.
For instance, [33]smokers with certain genetic mutations are at
greater risk of colorectal cancer if they eat a lot of well-cooked
meat compared with non-smokers eating the same amount.

So if even fresh, lean meat might be risky, is there any reason to
eat the stuff, besides it being tasty?

The nutritional components of meat can certainly be obtained from
other sources, even if it's more of a challenge. For example,
essential amino acids are found in small quantities in foods such as
peas and rice. Even so, the evidence goes against cutting out meat
altogether. Perhaps the most surprising finding from the EPIC study
was that those who ate no meat at all had a higher risk of early
death from any cause than those who ate a small amount of red meat.
"What we see from studies is that people who eat small amounts of
meat are as healthy, or maybe healthier, than vegetarians," says

Cold potato

Why is that? For a start, vegetarians don't always make healthy food
choices. And it's true that because meat has a high protein content
and contains all the essential amino acids, you need to eat less of
it than plant-based foods to get your quota. "In order to get 25
grams of protein from beef you would need to eat around 150
calories' worth," says McNeill. "You'd have to eat about 550
calories of peanut butter to get the same amount of protein. Even
beans, you'd have to eat double the calories." Reducing, rather than
removing, meat from your diet works from an environmental
perspective too (see "[34]Red meat can be green").
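McNeill's comparison boils down to a simple protein-per-calorie calculation. A minimal sketch of that arithmetic follows; the portion figures are just her quoted estimates (with "double the calories" for beans taken literally), not nutritional reference data:

```python
# Calories of a food needed to hit a protein target, given how much
# protein a portion of `kcal` calories supplies.
def calories_for_protein(target_g, protein_g, kcal):
    return kcal * target_g / protein_g

# (grams of protein, calories) per quoted portion - McNeill's figures
foods = {
    "beef": (25, 150),           # ~25 g protein in ~150 kcal
    "peanut butter": (25, 550),  # same protein needs ~550 kcal
    "beans": (25, 300),          # "double the calories" of beef
}

for name, (protein_g, kcal) in foods.items():
    needed = calories_for_protein(25, protein_g, kcal)
    density = 100 * protein_g / kcal  # g protein per 100 kcal
    print(f"{name}: {needed:.0f} kcal for 25 g protein "
          f"({density:.1f} g protein per 100 kcal)")
```

On these figures beef delivers roughly 17 g of protein per 100 kcal against about 4.5 g for peanut butter, which is the gap the article is pointing at.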

Indeed, for those trying to lose weight or reduce cholesterol,
incorporating a little lean red meat can help you stick to your
guns: you're perhaps more likely to keep to your diet because meat
is tasty, and the high protein content also makes you feel fuller.

All this goes against the accusation that meat must be fuelling the
rise in obesity. What's more, studies have shown that [35]you can
reduce cholesterol levels even if you eat lean red meat every day.

There may also be simple ways to minimise the risks. The EPIC trial
found that the early death risk for meat eaters who reported
consuming lots of fibre was lower than for those who ate very little
meat. Similarly, last year, a study found that when people ate cold
potatoes with their meat, a certain kind of starch called
butyrylated resistant starch, which is produced when potatoes are
cooked and then left to cool, seemed to [36]protect them against the
DNA damage to gut cells that is associated with colorectal cancer.

Such culinary tweaks could help, but they shouldn't detract from the
fact that there do seem to be genuine risks associated with red meat
- particularly the processed variety - at least when it is consumed
in abundance. "Our recommendation is that you should not eat more
than 70 grams of red meat per day - which is something like eating a
portion two or three times per week," says Norat. Whether it's
better to eat a little meat each day or to save up your credits for
a weekend steak splurge remains unclear.

You might try introducing meat-free Mondays into your week, pledging
not to eat any meat or dairy food after 6 pm; or trying to use meat
just for flavouring, rather than as a key ingredient in meals. As
for how you eat it, it seems we had it right all along: go for fresh
meat and two veg, just make sure it's not chargrilled. And while
you're at it, don't forget potatoes are a dish best served cold.

Brave new meat

It's probably time to cut down on preservative-laden processed meats
like cured sausages (see main story). But you could soon be tucking
in to a safer hot dog, one in which cancer-causing preservatives are
replaced by new, plant-derived antioxidants. They have already been
shown to prevent microbes from growing in meat. What's more, the
produce had a shelf life acceptable to meat producers, with the
right colour and texture. It will be a while before these
phytochemical sausages hit the shops, though, as they need to be

In the meantime, how about heading out for a cricket burger? [38]The
first edible insect farm opened in the US last year and the critters
are protein-rich and easy on the environment. They can be reared in
a fraction of the space needed for farmyard animals, their waste
contains less polluting ammonia, and they emit fewer greenhouse gases.

There is still the yuck factor to overcome, of course, and for now,
buying insects that taste nice costs far more than buying the
equivalent amount of steak.

Others would rather do away with whole animals, pinning their hopes
instead on [39]lab-grown cuts, which would require less than 1 per
cent of the land, consume about 4 per cent of the water and about
half the energy of the same amount of farmed beef. But many doubt
whether lab-grown meat will ever be cheap enough to produce
commercially. Plus, unlike meat from an animal, the lab-grown stuff
has no in-built immune system, so contamination is a potential
issue. Lab-produced meat also requires a product of cattle slaughter
- fetal calf serum - to grow.

Red meat can be green

Make no bones about it, current global meat consumption is a
disaster for the environment, and still consumption is rising in
many developing nations. As much as 32 per cent of greenhouse gas
emissions come from rearing livestock, a third of the world's
cultivated land is used to grow animal feed, and [40]it takes 15,500
litres of water (a small swimming pool) to produce 1 kilogram of
beef. But eliminating meat - or substituting chicken or pork for
beef - isn't necessarily the greenest option.

"There's this view that meat is vile from an environmental
perspective, but there's lots of pastureland around the world that
can't be used to grow crops, and if it's grazed properly it could be
grazed forever. We can't digest that cellulose, but cows and sheep
can," says [41]Vaclav Smil of the University of Manitoba in Winnipeg,
Canada, author of [42]Should We Eat Meat? The same goes for crop
residues, such as the straw and bran from grain. Smil calculates
that if we used only sustainable grazing and fed livestock on crop
residues, we could still raise about two-thirds of the meat we do now.

Grazing cattle and sheep also contribute to biological diversity and
are often vital components of rural livelihoods and communities,
says [43]Vicki Hird, senior campaigner for land, food and water at
Friends of the Earth in London. Chicken and pork produce fewer
greenhouse gases, but these animals eat grain and other sources of
protein that could be eaten by people instead. "The evidence makes
clear that we really just need to eat less meat, and better," Hird says.

The raw facts

Daily staple or public enemy?

Meat is a one-stop-shop for essential amino acids - the ones the
body needs to build proteins but can't make on its own. It is also a
rich source of vitamin B12, iron and protein, all of which are often
lacking in plant-based foods.

But the types of meat we eat, and how much, matter. We are now
eating meat in unprecedented quantities, and demand is growing,
especially in developing nations.

The kinds of meat we consume are also changing. In the UK, we are
buying less fresh meat and more meat in the form of pre-prepared
meals, which might contain added sugar, fat, salt and preservatives
[46](see graph). While there's little indication that white meats
like poultry, or fish, are a health concern, the evidence for red
processed meats like bacon, salami and ham is not encouraging (see
"Processed versus fresh").

All this raises concerns for our health and the environment.
However, eating the right kinds of meat can be beneficial for both
(see "[48]Red meat can be green").

As well as vitamins and the like, meat contains a lot of protein for
its calorie content, so although
other foods give us protein too, meat is the most efficient source.
Avoiding it could make it harder to get a healthy, balanced diet.

Linda Geddes is a consultant for New Scientist based in Bristol, UK

