Wednesday, August 20, 2014
It was a beautiful May afternoon when Donnel Gomes took his week-old
silver Mercedes for a spin into the city. He turned onto Broad
Street, a main thoroughfare downtown, and ... kaboom!
The car fell into a huge pothole, blowing its right tire, ripping
the front axle and knocking out the air-bag system. Cost: $3,800.
"It was a wreck," said the 48-year-old electrician, although he
reckoned he got off easy compared with a motorcyclist whom Gomes saw
thrown into the air after hitting a crater on another downtown
street. "A damn minefield," he said of traversing many of
Rhode Island has an unusually large share of shoddy highways,
streets and bridges, but it's not much better in the rest of the country.
The House and Senate have put forth plans to temporarily fill the
Highway Trust Fund, which is used to pay for road, bridge and public
transit construction but is expected to go broke this summer. Though
the details differ in the two proposals, the result is the same:
Congress would buy another... (The Times editorial board)
America's transportation infrastructure, once an engine of mobility
and productivity, has fallen into such disrepair that it's become an economic drag.
Consumers shell out billions of dollars for extra car repairs every
year. Insufficient and poorly maintained roads mean costly
bottlenecks for businesses, which discourage expansion and hobble
American companies competing in the global economy.
Congestion on major urban highways costs the economy more than $100
billion a year in fuel and lost work time, estimates the American
Society of Civil Engineers.
And, says Casey Dinges, the engineering group's senior managing
director: "It's become a white-knuckle experience for many motorists."
Age is a key factor. Most of the major highways were built decades ago.
America's transportation structures look all the more frayed next to
those in advanced economies in Europe and Japan, or in China, which
has been busily constructing high-speed rail and new airports.
U.S. spending for transportation and other infrastructure accounts
for 2.4% of its economy versus about 12% for China, says economist
David Dollar, a former China director for the World Bank. Europe's
infrastructure spending is about 5%.
Dollar, now with the Brookings Institution, says visiting Chinese
officials and business leaders frequently remark how surprised they
are at America's declining infrastructure, sometimes asking whether
they can help finance improvements.
American politicians, from President Obama down to small-town
mayors, decry the deplorable condition of infrastructure, but many
are reluctant to raise taxes or boost tolls and user fees.
Between the federal government and local entities, government
spending for highways runs less than $90 billion a year, which is
barely enough to maintain the status quo, let alone improve roadway
conditions and performance.
That's partly why the share of congested highways in U.S. cities has
risen from 25% in the early 1980s to more than 40% today, according
to the Transportation Department. Roads with "acceptable ride
quality" fell from 87% in 1995 to 82% in 2010.
It's especially tough for states such as Rhode Island that have been
lagging economically and depend heavily on Uncle Sam for
transportation funds. The federal highway program is funded by an
18.4-cents-per-gallon gas tax, but that hasn't budged since 1993.
Now, the fund is on the verge of insolvency. Congress came through
last week with a last-minute replenishment of money, but it'll only
last until May.
The upshot is that states and localities make do with what they can.
A quarter of the country's 147,870 bridges are deficient or
obsolete, according to a July report by the White House on
infrastructure investment. Rhode Island's are in the worst shape in
the nation, with 57% of its bridges falling into those categories.
One of them, Middle Road Bridge in East Greenwich, south of
Providence, runs over a key freeway and leads to the town's high
school. Buses filled with students have been rumbling over the
235-foot bridge for years. When Dan Paolino and his crew began
maintenance work on it this summer, he found that some of the
supporting beams underneath were so beaten down by time and
rainwater that they were "paper-thin."
"Yeah, pretty scary," says Paolino, a superintendent at Cardi Corp.
construction, as he bends below a beam and scrapes off a layer of
rusted steel with his bare fingers.
New Englanders and those in tough-climate cities like Chicago are
used to swerving around potholes, depressions and other unsightly
road hazards caused by nature's freezing-and-thawing cycle and man's
salting of the streets.
Largely because of the brutal winter, New Jersey got nearly 10,000
reports of potholes in the year that ended June 30, more than double
the previous 12 months. Rhode Island has already paid drivers nearly
triple what it did a year earlier for pothole damage from last winter
and spring--and the state is still processing claims.
Some of the worst roads, however, are in sunny California. All told,
about a third of the state's public roads are in bad shape, compared
with 14% nationally.
TRIP, a transportation research group, says that works out to an
extra $832 in maintenance costs for the typical motorist in Los
Angeles, more than double the national average of $377.
Daniel Sperling, director of the Institute of Transportation Studies
at UC Davis, says California's road problem is largely twofold:
heavy use and deferred maintenance. "The amount of vehicles and
usage have gone up, but the amount of revenue has been frozen or cut."
Some localities, including Los Angeles, have resorted to an increase
in sales tax to pay for transportation needs. But that's nowhere
near enough, Sperling says, adding that with greater fuel efficiency
requirements on cars, funding from gasoline taxes is likely to dwindle further.
Leonard Lardaro, an economics professor at the University of Rhode
Island who has been tracking Rhode Island's economy for two decades,
says decaying infrastructure is a burden on the state.
Once a hub of textile production and the nation's center for costume
jewelry production, Rhode Island has shifted largely to a service
economy, as has the rest of the country. Its prosperity is tied to
healthcare, tourism and business services.
But the state has persistently lagged behind others in the recovery,
with higher unemployment and slower job growth. Lardaro says that's partly because of its decaying infrastructure.
Infrastructure improvements, he says, "are investment-oriented
activities. They generate future growth."
tt mailing list
* 18 August 2014 by Douglas Heaven
* Book information
* The Dark Net: Inside the digital underworld by Jamie Bartlett
* Published by: William Heinemann/Random House
* Price: £20
Jamie Bartlett's encounters with the characters behind subversive
currencies and online erotica make fascinating reading, but The Dark
Net is really about us
WHAT is the internet for? Meeting new people. Bullying them
mercilessly. Sharing a video of yourself playing guitar. Sharing a
video of yourself masturbating. Buying books. Buying drugs.
Registering a vote. Requesting a murder.
For the first users of the Arpanet - a tiny network of computers set
up to link a handful of academic institutions in the 1970s - there
was a much simpler answer. The sole point of connecting computers
was to allow the easy sharing of data. In one sense, that's still
all the internet is for. But the mess of humanity has since put its
spin on things.
There is nothing you can think of, however extreme, sordid,
outlandish or just plain weird, that someone else hasn't already put
online. Take the Assassination Market. Here you can add a name to a
list or add money to a pot attached to a name already there. If
enough people chip in and the pot for a particular name grows, or so
the idea goes, then someone, somewhere, will be motivated to claim
it. Would-be assassins just have to make an anonymous prediction of
the exact time and place of a named individual's death. If they make
it come true, they bag the fee.
The Assassination Market is Jamie Bartlett's first port of call in
The Dark Net, a travelogue around the dark side of the internet.
It gets worse. At least the Assassination Market has the whiff of a
stunt about it - to date no one has been knocked off the list. The
online forums of far-right groups like the English Defence League,
on the other hand, are much more successful at engendering the kind
of hatred that is then vented on the streets. And since Bartlett is
head of the Violence and Extremism Programme and the Centre for the
Analysis of Social Media at the think-tank Demos, you would expect
him to be a confident and well-informed guide.
He certainly finds his way to sites that won't pop up in a typical
search engine, but most of his material is gathered from the shadowy
corners of the internet's most familiar places.
Wherever you go online, nastiness is never far away. Facebook and
Twitter are as riddled with obnoxious bullies, hate-mongering
extremists and illegal activity as the off-piste terrain reachable
only with specialist browsers. "I came to realise that the unspoken
truth about the dark net... is that everything is close to the
surface," Bartlett writes. "Hidden encrypted websites and mysterious
underground drugs markets sound like they exist far beneath the
surface web of Google and Facebook. But cyberspace doesn't have..."
It's easy to forget that because most of us seldom break out of our
habitual browsing patterns. But we need only look at the Facebook
and Twitter updates posted by ISIS since it swept across northern
Iraq, for example, to get a jarring reminder.
But Bartlett has a trick that enables him to lift the book above
others that riff off the allure of a secret internet, making it far
more than a potpourri of pornography and pot purchasing. He goes out
of his way to meet the people behind the online personas. Writing a
book about the internet can be done without leaving the house;
writing a good one really requires the writer to get to know the people behind it.
Each of the book's nine chapters introduces us to a different set of
characters. So, for example, Bartlett visits Calafou, a libertarian
commune in Spain, to meet Amir Taaki. Taaki is a genius coder and
architect of Dark Wallet, an anonymous Bitcoin payment system that
he thinks just might save the world, one overthrown government at a time.
Then Bartlett is off again, to meet Paul, a polite and charming
fascist, and later Michael, a family man convicted for downloading a
hard-drive's worth of child pornography. Michael blames the internet
for his activities: "I cannot believe there is so much out there!
Why on earth was it so easy for me to find it?"
Bartlett gets invited by "cam-girl" Vex to witness the business end
of her live three-girl webcam show. Like any good reporter, he sits
perched next to her bed, just off camera, notepad on his knees.
By meeting the people behind the online activity, Bartlett humanises
it. And the internet is nothing if not an enabler of everything
human. It brings together people with similar interests, the nasty
and extreme just as much as the harmlessly niche.
But the internet also encourages us to think of our online selves as
less real. Sitting alone in front of a screen can make it hard to
appreciate the "reality" of your online behaviour - it's easy to do
things we would never dream of doing in the offline world.
Technology isn't neutral, Bartlett writes. "Technology extends the
power and freedom of those that use it." There have always been two
sides to this story: technology will make everything better and
technology will ruin us all. Seen as an extension of ourselves, it's
likely to do neither. We shouldn't look for salvation in any
technology, especially a blank canvas like the internet. There will
always be a dark side.
This article appeared in print under the headline "Down and very
* 18 August 2014 by Bob Holmes
Bob Holmes is a consultant for New Scientist
The Y chromosome, which makes men male, has been shrinking for 180
million years. But there's more to this rotting husk than anyone suspected.
THERE'S nothing very macho about the Y chromosome. Even though it's
what makes men male, the human Y, like its counterparts in almost
all mammals, is tiny compared with its partner, the X chromosome.
It's lost hundreds of genes - and if the Y continues to lose them,
it could someday wink out of existence entirely.
Claims of its impending demise are starting to look premature,
however. Far from being a rotting husk, the modern Y, tiny though it
is, is turning out to be a highly evolved and surprisingly important
part of men's wider genetic endowment, responsible for far more than maleness alone.
It is easy to see why some biologists thought the Y was destined for
oblivion: it is all on its own. There are two copies of all other
chromosomes, which are basically containers for holding genes. Each
copy acts as a backup for the other. The pairs line up and swap bits
when organisms reproduce. Some offspring get landed with chromosomes
full of damaged genes and are eliminated by natural selection,
whereas others inherit undamaged copies and survive to reproduce.
Way back in the evolutionary past, there was no Y, just a regular
pair of chromosomes. Sex was determined by environmental factors
such as temperature. But then a gene on a single chromosome mutated
in a way that made any individual that inherited it male. At first
this proto-Y could still swap genes with its partner, the proto-X
chromosome. About 180 million years ago - in the line of mammals
that branched away from the platypus and echidna - a section of it
containing the gene variant for maleness got flipped back to front.
This section no longer lined up properly with the corresponding part
of the proto-X, so damaged genes in this section could no longer be
swapped for good ones.
Further inversions put more and more of the Y beyond repair. The X
was fine because females inherit two copies that can swap parts. The
Y, however, started to lose bits because men have just one copy. The
human version now has just 78 genes, far fewer than its original 600
or so. At this rate of decay the Y ought to disappear altogether
within 5 million years, as famously predicted a few years ago by
Jenny Graves at La Trobe University in Melbourne, Australia.
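As a rough sanity check on these numbers, here is a naive constant-rate extrapolation, a sketch using only the figures quoted in this article (600 original genes, 78 remaining, 180 million years), not Graves's actual model, which assumes a faster recent rate of loss:

```python
# Naive constant-rate extrapolation of Y-chromosome gene loss,
# using only the figures quoted in the article.

original_genes = 600   # approximate original gene count
current_genes = 78     # genes on the human Y today
elapsed_myr = 180      # million years since the first inversion

loss_rate = (original_genes - current_genes) / elapsed_myr  # genes per Myr
myr_remaining = current_genes / loss_rate

print(f"average loss rate: {loss_rate:.1f} genes per million years")
print(f"naive time to zero: {myr_remaining:.0f} million years")
```

The long-run average of about 2.9 genes per million years would put the Y's disappearance tens of millions of years away; much shorter predictions depend on assuming a faster recent rate, which makes them sensitive to exactly how the loss is distributed over time.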
But there is growing reason to believe that what's left of the Y is
here to stay. For one thing, even though it has lost almost all of
its original set of genes, it has gained others: we now know that 61
of the human Y's 78 genes were not present before the first
inversion took place. Almost all the new genes play a role in sperm
production, making the Y a perfect home for them. There are often
several copies of these genes, too, so there are backups.
An even stronger reason to think the Y chromosome has a bright
future comes from the discovery, by Daniel Bellott at the Whitehead
Institute in Boston, that its decay seems to have ground to a halt.
His team compared the Y chromosomes of eight mammals - human, chimp,
rhesus macaque, marmoset, mouse, rat, bull and opossum - to trace
its evolutionary history (Nature, vol 508, p 494). They found bursts
of gene loss directly after each inversion, followed by long
periods of stability. In fact, not a single gene has been lost from
the oldest part of the human Y in the past 44 million years.
The remaining genes may simply be too essential to lose. A team led
by Henrik Kaessmann at the University of Lausanne, Switzerland,
surveyed the Y-chromosomes of 15 different mammal species and one
bird. They found that a chromosome linked with maleness evolved
three distinct times - once in birds, once in the ancestor of the
platypus and echidna, and a third time in the ancestor of all other
mammals. The ancestors of the three Ys each started with different
kinds of genes, but to Kaessmann's surprise, all ended up with a
stable set of the same sorts of genes, which is what Bellott's team
also found. "You play this evolutionary game with different sets of
genes, and you get the same kinds of genes retained in each case,"
he says. "It's always the regulatory genes that remain."
Why? When a gene is lost from the Y, males are left with one copy of
the gene, on their single X chromosome. That means less of the
protein the gene codes for gets made - roughly half the usual
dosage. Evolution can fix this in males by ramping up production
from the single X, but then their female descendants get a double
dose from their two Xs. To keep gene output the same in the two
sexes despite this difference, females have evolved to inactivate
one of their two copies of most genes on the X. Perhaps the amount
of protein produced by the regulatory genes retained on the Y had to
be so precisely calibrated that organisms couldn't survive the
awkward intermediate stage when this workaround did not yet function
perfectly, suggests James Turner at the MRC National Institute for
Medical Research in London. Regulatory genes are particularly vital
because they control many other genes.
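The dosage argument above can be made concrete with a toy calculation (illustrative units only; real expression levels vary by gene):

```python
# Toy model of gene dosage after a gene is lost from the Y.
# One "unit" = the output of a single unmodified gene copy.

baseline = 2.0                 # ancestral output: one X copy + one Y copy

male_after_loss = 1.0          # lone X copy once the Y copy is gone: half dose
male_upregulated = male_after_loss * 2    # X output ramped up to compensate

female_naive = 2 * 2.0         # two ramped-up X copies: double dose
female_inactivated = female_naive / 2     # silencing one X restores balance

print(male_upregulated, female_inactivated)
```

Both sexes end up back at the baseline dose, but only after passing through the awkward intermediate stages the text describes, which is the proposed reason dose-sensitive regulatory genes could not afford to leave the Y.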
So important are the Y genes, in fact, that even during a man's
lifetime, losing the Y in some tissues takes a toll. Chromosomes can
be lost when cells divide, and men who lose the Y chromosome in
their bone marrow - which happens in about 8 per cent of elderly men
- have a higher risk of cancer and die an average of 5.5 years
younger than other men.
All these findings seem to point in one direction. "The
rotting-Y-chromosome theory is dead," says Turner. Most people
agree. Graves remains a holdout, however, noting that many of the
genes Bellott and Kaessmann single out as essential on the human Y
have disappeared from the Y of some other mammals, suggesting they
are not so essential after all.
Nor is Graves reassured by the finding that the human Y has been
stable. "Just the fact that we've had pretty much the same Y
chromosome for millions of years doesn't mean it won't disappear
tomorrow," she says, pointing out that it already has in a few
rodents, insects and other organisms.
Whatever the human Y chromosome's ultimate fate, the new findings
are raising interesting issues. Since the Y chromosome no longer
interacts with the X, even the genes they still have in common have
been evolving separately for tens of millions of years. So might
they now play subtly different roles in the body? A Y-chromosome
copy of a regulatory gene that turns on a slightly different set of
genes in a slightly different group of cells than the X copy could
make male and female cells act quite differently from each other.
"This could have important consequences for differences in disease
prevalence between males and females," says Bellott.
Kaessmann agrees. "It's going to be very interesting to find out
what the Y genes might regulate, compared to the X copies," he says.
So not only is the Y chromosome no longer shrinking, it may be
growing in importance - at least in the minds of biologists.
This article appeared in print under the headline "Why oh Y?"
* 14 August 2014 by Naomi Lubick
Naomi Lubick is a freelance writer based in Stockholm, Sweden
Water grids are ruinously expensive to build and maintain. Treating
wastewater in the home is a practical alternative - if we can get
over the yuck factor
IN A garden shed called Stanley on the bank of a muddy pond, Darren
Reynolds is about to have a drink. The pond's less-than-limpid
waters would normally flow through the surrounding reed beds to a
drainage channel. Reynolds, however, is pouring himself a perfectly drinkable glass of it.
What's in his shed gives him reasonable confidence in what he is
doing: it contains a mini treatment plant that can produce drinkable
water on demand. It was conceived for places with no fixed water
infrastructure, such as refugee camps, but Reynolds thinks that with a
little tweaking it could be just the thing for more suburban settings too.
A professor of health and the environment at the University of the
West of England in Bristol, Reynolds is one of a band of researchers
advocating a fundamental shift from the way we pipe water now. Just
as the future of electricity is seen by some to lie with a
decentralised network of small-scale producers, the way to lessen
water woes in countries across the globe could be for each of us to
take charge of our water treatment ourselves. It is a bold vision -
but can it work?
Clean water is the most basic of necessities: in 2010 the United
Nations declared it a fundamental human right. Yet the World Health
Organization estimates that over 1 billion people are without clean
drinking water, while more than one-third of the world's 7 billion
people lack basic sanitation.
In nations such as the UK and the US, meanwhile, access to safe
drinking water may generally be as easy as turning on a tap, but what
lies behind that is often forgotten. "At the turn of the last
century, putting water treatment in was how a city proved it had
made itself," says Michael Beach of the US Centers for Disease
Control and Prevention (CDC) in Atlanta, Georgia. "Go to
Philadelphia, go to Baltimore - the water treatment facilities look
like Greek temples, because it was probably one of the greatest
public health achievements we had put in place."
But since then water infrastructure has tended to lie out of sight
and out of mind - and it is crumbling. Last year alone, Thames
Water, which provides water for London and parts of south-east
England, lost 646 million litres every day to leaks from its 31,000
kilometres of pipes, enough to meet the needs of 3 million people.
The company puts the cost of replacing its entire network at upwards
of £12 billion.
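The leak figures quoted above imply a simple per-person rate, a back-of-envelope division and nothing more:

```python
# Per-person daily volume implied by the Thames Water leak figures
# quoted in the article.

leaked_litres_per_day = 646e6   # 646 million litres lost daily
people_equivalent = 3e6         # "enough to meet the needs of 3 million people"

litres_per_person = leaked_litres_per_day / people_equivalent
print(f"{litres_per_person:.0f} litres per person per day")
```

That works out to roughly 215 litres per person per day, the same order of magnitude as the 150 litres of daily wastewater per person cited later in the article.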
In the US an estimated 240,000 mains water leaks occur each year. In
a particularly spectacular incident last month, a ruptured pipe in
Los Angeles sent some 90 million litres of water gushing down Sunset
Boulevard and through a campus of the University of California. In a
report to Congress last year, the US Environmental Protection Agency
estimated that $384 billion of infrastructure investment would be
needed to deliver drinking water across the US for the next two
decades. Some say the figures should be even higher, factoring in
the costs of protecting supplies from the more frequent extreme
weather events predicted for the future, courtesy of climate change.
Water market researcher Brent Giles of international firm Lux
Research is sceptical of these gloomiest predictions. "People who
say it will take trillions of dollars of infrastructure investment
are typically civil engineers," he says - and they love their grand
designs. Nevertheless, Giles concedes the problem is undoubtedly real.
Leaky pipes are expensive to fix, but they are also a health hazard. "Once
the water pressure is below a certain point, that's when things can
move in," says Beach. It's a problem often compounded by the
tendency to lay clean-water and wastewater pipes close to one
another. A Norwegian study in 2007 implicated leaky water pipes in
many cases of gastrointestinal illness there. In 2009/10, the most
recent years for which figures are available, the CDC tracked more
than 30 US disease outbreaks back to contaminated public water
supplies. "It's a crumbling infrastructure problem, not a country
problem," says Beach.
The fixes we have at the moment are not ideal. To combat hazardous
microbes, utility companies in the US and UK add tiny amounts of
chlorine to their treated water. Chlorine treatment in general has
raised concerns, as chemical by-products of the process have been
tied to health problems including bladder cancer and genetic damage.
To stem leaks more quickly, companies such as IBM are developing
smarter water management systems that can monitor the rate of flow
and respond quickly to pressure drops. Quest Inspar, a company based
in Houston, Texas, has developed robots that travel through pipes to
find leaks and coat any cracks with sealant.
For Reynolds, such solutions are papering over the cracks. What is
needed, he says, is a rethink of the underlying paradigm. "We dirty
water, treat water, and then pump water, and then we do it again -
at great cost," he says. Most of our water infrastructure is in
place to deal with the 150 or so litres of wastewater each person
produces every day. According to figures from industry body Water
UK, this water is sullied by just 60 grams of organic matter - less
than 0.1 per cent of the total flushed.
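The "less than 0.1 per cent" figure checks out, treating a litre of water as roughly one kilogram:

```python
# Fraction of daily wastewater that is actually organic matter,
# using the Water UK figures quoted above (1 litre of water ~ 1 kg).

organic_g = 60.0
wastewater_g = 150 * 1000.0   # 150 litres ~ 150 kg ~ 150,000 g

fraction_pct = 100 * organic_g / wastewater_g
print(f"{fraction_pct:.2f}% organic matter")
```

So the entire wastewater network exists to carry material that makes up about 0.04 per cent of what flows through it.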
If we could treat this waste closer to home, recycling dirty water
and bringing it up to drinking standard, that could reduce costs and
minimise the health risks of leakages in the wider pipe network.
Such an approach might also provide a cheaper way to ensure clean
water and sanitation for poorer countries with little established infrastructure.
Local water purification systems have existed for years in places
off the water grid - from remote homesteads to the International
Space Station. They come in a range of shapes and sizes, including
sand filters for rain barrels that physically trap microbes and
other contaminants, solar-powered rooftop purifiers that use light
to break them down, and pots lined with nanoscale particles of
silver that kill microbes on contact.
But such systems tend to be very small-scale, providing no more than
the minimum water requirement for a family. In his shed, Reynolds is
working on something a little bigger. He takes water from a pond on
the university campus and passes it through a series of filters and
membranes to remove impurities. Along the way, a small amount of
brine solution is added that has been zapped with a current,
releasing hydroxyl radicals, hypochlorous acid, chlorine ions and
other related substances that act as disinfectants.
This current can be created using energy collected by a solar cell.
When the current is switched off again, the disinfectant reverts to
brine, sidestepping the environmentally troublesome residues that
can be left behind by direct chlorination. Reynolds's tests have
shown that the concentration of common pathogens dropped below
detection limits with just 10 to 20 seconds of treatment. And what
works for pond water should also work for wastewater from the home.
"It could almost certainly be used for treating 'grey water', the
stuff we are flushing down the toilet or sink," he says.
He's not the only one moving in this direction. Water researcher
Jörg Drewes at the Technical University of Munich, Germany, and his
team have developed a system that attaches to a building's pipes
before water flows to the taps. It combines filtration membranes
with treatments using oxidation and ultraviolet light to break down
contaminants, and produces drinking-standard water from a variety of sources.
Commercial products are becoming available too. Puralytics, a firm
based in Beaverton, Oregon, markets "Solar Bags" that use sunlight
to render contaminants harmless, but has also developed larger
systems that deal with higher volumes of water and use LEDs for
places where sunlight is not available. Others, such as General
Electric and the Australian company Aquacell, are developing systems
based on filtration membranes covered in a film of helpful microbes
- a common technology in large-scale wastewater plants - to "mine"
wastewater of harmful substances and treat it for reuse.
Ben Grumbles of the pressure group US Water Alliance believes we now
have the technology to make decentralised water treatment work.
"There are competing visions, to centralise or not to centralise,"
he says. "Should we have a continued reliance on modern centralised
wastewater and drinking utilities, or do we increasingly move to the
'spaceship' model, where homes and businesses are off the grid in water terms?"
The most obvious benefit would be to people in those developing
countries in which there isn't much water infrastructure (see
"Sludge power"). But as far as wealthy countries are concerned,
Giles for one is less convinced that off-grid water will wash.
He thinks it makes sense to encourage local water recycling, using
lightly, locally treated recycled water for watering lawns, washing
cars and the like. But he worries that if homeowners or building
managers, rather than local authorities, must produce their own
drinking water, vulnerable populations such as the elderly and the
poor may end up with unclean water.
Beach agrees that any transition to off-grid water would need
stringent regulatory and legal frameworks. The lesson of what
happens when homeowners are responsible for their own filtration and
pipe systems comes from the 16 million households in the US that get
their water from private wells, he says. The rules say they have to
prove their wells are clean, but in practice there is little
oversight. "What that means is nothing gets done."
Drewes suggests that a switch to local water treatment would require
companies and experts - call them reverse plumbers - who either come
in to check decentralised water purification systems or use smart
meters to monitor them from afar. He and his team have recently
started investigating the scales at which each of these models would
be economical with the local purification systems they have developed.
Other problems would need to be solved, too - not least a powerful
yuck factor when it comes to perceptions of "toilet-to-tap" water
treatment. For that reason and others, Drewes believes the most
likely move in the developed world is not a wholesale switch to
off-grid water, but some form of hybrid solution. He envisages old,
patched-up pipes continuing to distribute water in the volumes we
demand, but treated to a lower standard. At local treatment
facilities, it would undergo further treatment to bring it up to
par. Grumbles also sees the future in a hybrid model of "centralised
systems delivering water, but not fit to drink".
Reynolds is still working on his own solution. Since January, he has
been operating a larger purification system in his shed that can
pump out 3000 litres an hour, enough for 1500 people's most basic
needs. Working together with Portsmouth Aviation, he has developed a
lorry-sized version capable of treating 18 to 20 cubic metres an
hour. It is being trialled in Romania, providing water for
communities that normally depend on boreholes polluted by
contaminants including agricultural run-off. The by-products
extracted by treatment are so rich in nitrogen that they can
themselves be used as fertiliser.
It's early days for off-grid water, Reynolds admits, but the costs
of maintaining and extending existing water infrastructure will
sooner or later force a rethink, he says. "In the world of the
future, it is not going to pull us all through."
This article appeared in print under the headline "Pipe dream"
"There are 2.6 billion people in this world who don't have access to
sanitation," says Orianna Bretschger. More people have access to a
mobile phone than a toilet."
The researcher at the J. Craig Venter Institute in La Jolla,
California, is developing a small solution to that large problem.
Her microbe-based water-treatment systems pull pathogens from water
- while producing fertiliser as a by-product and generating power in the process.
Large wastewater utilities in wealthy nations already use "activated
sludge" containing microbes to break down organic matter. But where
oxygen is readily available, nasties such as E. coli can also
thrive. The treatment must be combined with extra processes, such as
expensive reverse osmosis or large-scale chlorination, which are
impractical in the developing world.
Bretschger and her colleagues have instead concocted a cocktail of
sewage-treating bugs that thrive without oxygen. This mix of
microbes could break down water-borne organic matter while
outcompeting the nasty bugs that need oxygen. They would pass
electrons to surrounding metal surfaces to "breathe", creating a
current that could also be used to power bathroom facilities. That
would make for a lasting and low-maintenance solution in developing
countries with little water infrastructure, Bretschger says.
Last year the philanthropic Roddenberry Foundation gifted her $5
million to continue developing the system and bring its output up to
drinking standard. Bretschger's prototype is only about the size of
a coffee cup. "We need to demonstrate consistency and reliability in a
tank the size of a building," she says.
This month, her team will be assembling the components to build a
full-scale version with students in a nearby high school. That will
serve a dual purpose, she hopes: to prove the system works, and to
start to change mindsets. "Public perception has a lot to do with
the success of these 'toilet-to-tap' projects," she says. "I like to
say 'showers to flowers': all of our water is recycled, it's just a
matter of how we do it and the time it takes to get that water back."
* 18:00 10 August 2014 by Hal Hodson
Try not to faint from shock. The controversial Keystone XL pipeline,
which would carry Canadian oil through the US, will make climate
change worse. It will boost global emissions of carbon dioxide by up
to 110 million tonnes per year. The finding will step up the
pressure on US president Barack Obama to stop the pipeline being built.
That extra CO2 is not a huge amount on a global scale. "But it is
a step in the wrong direction," says Jerry Schnoor of the University
of Iowa in Iowa City, who was not involved in the new analysis. "It
is an investment that will lock us into an untenable environmental
situation. It's a pipeline to nowhere, economically speaking."
Keystone XL, proposed by the Canadian energy company TransCanada, is
intended to run from Alberta, Canada, to Steele City, Nebraska. There
it would link to existing pipes, to carry oil from Canada's tar
sands to refineries on the US's Gulf of Mexico coast.
It has proved to be enormously controversial. Its supporters argue
it will boost the economy, while environmentalists say the toxic oil
could be spilled and that it encourages the use of tar sands, which
produce more greenhouse gases than normal oil.
Extra carbon dioxide
Barack Obama must decide whether to allow its construction. On 25
June 2013, he mentioned Keystone XL in a speech about climate
change. Obama said that the pipeline could be built only if it "does
not significantly exacerbate the problem of carbon pollution". Now
it seems it will.
The new study comes from Peter Erickson and Michael Lazarus of the
Stockholm Environment Institute in Seattle, Washington. They
estimated how much building Keystone XL would affect oil prices. For
every barrel of extra oil obtained from tar sands as a result of the
pipeline, global oil consumption would increase by 0.6 barrels,
because the extra oil would lower oil prices and encourage people to use more.
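To see roughly where a figure like 110 million tonnes comes from, here is a back-of-envelope version of that supply-and-demand logic. The pipeline capacity and per-barrel combustion figure below are commonly published values I have supplied for illustration, not numbers taken from Erickson and Lazarus's model:

```python
# Rough sketch of the incremental-emissions arithmetic (illustrative).
CAPACITY_BBL_PER_DAY = 830_000   # widely reported Keystone XL capacity
NET_BARRELS_PER_BARREL = 0.6     # article: each extra tar-sands barrel
                                 # raises global consumption by 0.6 bbl
TONNES_CO2_PER_BARREL = 0.43     # approximate combustion emissions
                                 # per barrel of oil (assumed value)

extra_bbl_per_year = CAPACITY_BBL_PER_DAY * 365 * NET_BARRELS_PER_BARREL
extra_mt_co2 = extra_bbl_per_year * TONNES_CO2_PER_BARREL / 1e6
print(round(extra_mt_co2))  # 78 Mt/yr from combustion alone; tar sands'
# higher production emissions push the total toward the study's
# upper estimate of 110 Mt
```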
"The maths works out. The model is simple and straightforward," says
Nico Bauer of the Potsdam Institute for Climate Impact Research in Potsdam, Germany.
It makes sense, says Schnoor. "Common sense holds that the Keystone
XL pipeline will increase supply from the Alberta oil sands region,"
he says. That's because the bottleneck in the whole system is
getting the oil to the refineries. "Canada has much more oil sands
that could be brought into production if they had infrastructure to
get it refined and to market."
Supply and demand
Yet Erickson and Lazarus's study flies in the face of the official
assessment by the US Department of State, the final version of which
was published in January this year. The DoS argued that Alberta's
oil sands would be exploited, and their carbon released, regardless
of whether or not Keystone is built.
"Their sum conclusion is that Keystone doesn't unlock the oil sands
at all," says Erickson. "They just wave their hands and say zero.
You end up scratching your head."
The US Environmental Protection Agency slammed several components of
the DoS study in April last year, when it was still a draft. The EPA
said the DoS had not accounted for the market effects of increased
oil supply through Keystone XL. "We note that the discussion...
regarding energy markets, while informative, is not based on an
updated energy-economic modelling effort," the EPA wrote.
Erickson and Lazarus's analysis confirms the EPA's suspicions. They
say the DoS study failed to account for the effect of a flood of tar
sands oil hitting the market through Keystone XL. Essentially, the
DoS ignored the fundamental economic principle of supply and demand.
When Erickson and Lazarus took this into account, it turned out
Keystone XL would be about four times more carbon intensive than the State Department estimated.
It's not clear the emissions would be quite as big as that, says
Deepak Rajagopal of the University of California, Los Angeles. "If
this pipeline was not built, the materials and energy and labour
would [be] allocated to some other project," he says. But he says
that should not affect the overall conclusion.
Time to cut emissions
Obama is trying to cut the US's greenhouse gas emissions, for
instance by clamping down on emissions from power stations. Erickson
says not building Keystone offers instant emissions cuts, of a
magnitude that the government is retooling entire industries over
many years to achieve elsewhere. "[It's] a carbon saving policy that
the US has at its fingertips," says Erickson.
"The combined effects of the standards for industrial boilers and
cement kilns is just 20 to 60 million tonnes of CO a year,"
Erickson says. "Even new power plants to be built by 2020 are
expected to save 160 to 575 million tonnes annually."
"When do we begin to stop?" asks Schnoor. "If not now, when? If one
accepts that climate change is a very serious problem, and I do, one
concludes that investing in infrastructure that will last 50 years
or more is simply not prudent."
Journal reference: Nature Climate Change, DOI: 10.1038/nclimate2335
tt mailing list
Tuesday, August 19, 2014
Date: Tue, 19 Aug 2014 06:25:30 -0400
From: Dave Farber via ip <firstname.lastname@example.org>
To: ip <email@example.com>
Subject: [IP] Computer Eyesight Gets a Lot More Accurate
---------- Forwarded message ----------
From: *Dewayne Hendricks* <firstname.lastname@example.org>
Date: Tuesday, August 19, 2014
Subject: [Dewayne-Net] Computer Eyesight Gets a Lot More Accurate
To: Multiple recipients of Dewayne-Net <email@example.com>
Computer Eyesight Gets a Lot More Accurate
By JOHN MARKOFF
Aug 18 2014
Just as the Big Bad Wolf promised Little Red Riding Hood that his bigger
eyes were "the better to see you with," a machine's ability to see the
world around it is benefiting from bigger computers and more accurate software.
The improvement was visible in contest results released Monday evening by
computer scientists and companies that sponsor an annual challenge to
measure improvements in the state of machine vision technology.
Started in 2010 by Stanford, Princeton and Columbia University scientists,
the Large Scale Visual Recognition Challenge this year drew 38 entrants
from 13 countries. The groups use advanced software, in most cases modeled
loosely on biological vision systems, to detect, locate and classify a
huge set of images taken from Internet sources like Twitter. The contest
was sponsored this year by Google, Stanford, Facebook and the University of
Contestants run their recognition programs on high-performance computers
based in many cases on specialized processors called G.P.U.s, for
graphics processing units.
This year there were six categories based on object detection, locating
objects and classifying them. Winners included the National University of
Singapore, the University of Oxford, Adobe Systems, the Center for Intelligent
Perception and Computing at the Chinese Academy of Sciences, as well as
Google in two separate categories.
Accuracy almost doubled in the 2014 competition and error rates were cut in
half, according to the conference organizers.
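The headline number in challenges like this is typically the "top-5" classification error: a prediction counts as correct if the true label appears among a model's five highest-scoring guesses. A minimal sketch of that metric (my own illustration, not the official evaluation code):

```python
# Minimal top-5 classification error, as commonly used to rank entries.
def top5_error(predictions, labels):
    """predictions: one list of class scores per image (higher = better);
    labels: the true class index for each image."""
    wrong = 0
    for scores, label in zip(predictions, labels):
        # indices of the five highest-scoring classes
        top5 = sorted(range(len(scores)), key=lambda i: scores[i])[-5:]
        if label not in top5:
            wrong += 1
    return wrong / len(labels)

# Two toy images over 10 classes: the first prediction is right
# (class 9 scores highest), the second misses class 7 entirely.
preds = [[0.0] * 9 + [1.0],
         [1.0, 0.9, 0.8, 0.7, 0.6, 0.0, 0.0, 0.0, 0.0, 0.0]]
print(top5_error(preds, [9, 7]))  # 0.5
```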
"This year is really what I consider a historical year for the challenge,"
said Fei-Fei Li, the director of the Stanford Artificial Intelligence
Laboratory and one of the creators of a vast set of labeled digital images
that is the basis for the contest. "What really excites us is that
performance has taken a huge leap."
Though the contest is based on pattern recognition software that
can be "trained" to recognize objects in digital images, the contest itself
is made possible by the Imagenet database, an immense collection of more
than 14 million images that have been identified by humans. The Imagenet
database is publicly available to researchers at http://image-net.org/.
In the five years that the contest has been held, the organizers have
twice, once in 2012 and again this year, seen striking improvements in
accuracy, accompanied by more sophisticated algorithms and larger and faster computers.
In 2012 the contest was won by Geoffrey E. Hinton, a cognitive scientist at
the University of Toronto, and two of his students. Mr. Hinton is a pioneer
in the field of artificial neural networks, and in 2013 he joined Google
with his students Alex Krizhevsky and Ilya Sutskever.
This year the entrants had the option of either disclosing the details of
their algorithms or keeping them proprietary, and all of the winning groups
chose to share details of their technical innovations. That was
significant, according to Dr. Li, because it makes it possible to move
quickly from research to commercial applications.
Machine vision has countless applications, including computer gaming,
medical diagnosis, factory robotics and automotive safety systems. Recently
a number of carmakers have added the ability to recognize pedestrians and
bicyclists and stop automatically without driver intervention.
Monday, August 18, 2014
As E-Book Subscription Services Grow Their Catalogs, the Age-Old Institution
By GEOFFREY A. FOWLER
Updated Aug. 12, 2014 1:46 p.m. ET
Amazon, Scribd and Oyster now sell all-you-can-eat ebook subscriptions. But
Personal Tech columnist Geoffrey A. Fowler found a free alternative with a
much better digital selection: Your local public library.
A growing stack of companies would like you to pay a monthly fee to read
e-books, just like you subscribe to Netflix to binge on movies
and TV shows.
Don't bother. Go sign up for a public library card instead.
Really, the public library? Amazon.com recently launched Kindle
Unlimited, a $10-per-month service offering loans of 600,000 e-books.
Startups called Oyster and Scribd offer something similar. It isn't very
often that a musty old institution can hold its own against tech disrupters.
But it turns out librarians haven't just been sitting around shushing people
while the Internet drove them into irrelevance. More than 90% of American
public libraries have amassed e-book collections you can read on your iPad,
and often even on a Kindle. You don't have to walk into a branch or risk an
overdue fine. And they're totally free.
Though you still have to deal with due dates, hold lists and occasionally
clumsy software, libraries, at least for now, have one killer feature that
the others don't: e-books you actually want to read.
To compare, I dug up best-seller lists, as well as best-of lists compiled by
authors and critics. Then I searched for those e-books in Kindle Unlimited,
Oyster and Scribd alongside my local San Francisco Public Library. To rule
out big-city bias, I also checked the much smaller library where I grew up
in Richland County, S.C.
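That comparison boils down to intersecting each best-of list with each service's catalog and counting the matches. A toy version of the method (the titles are invented placeholders, not the Journal's actual lists):

```python
# Availability comparison as set intersection (illustrative data only).
bestsellers = {"Title A", "Title B", "Title C", "Title D"}

catalogs = {
    "Kindle Unlimited": {"Title D"},
    "SF Public Library": {"Title A", "Title B", "Title C"},
}

for service, titles in catalogs.items():
    hits = len(bestsellers & titles)  # titles on both lists
    print(f"{service}: {hits} of {len(bestsellers)}")
```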
Of the Journal's 20 most recent best-selling e-books in fiction and
nonfiction, Amazon's Kindle Unlimited has none--no "Fifty Shades of Grey,"
no "The Fault in Our Stars." Scribd and Oyster each have a paltry three. But
the San Francisco library has 15, and my South Carolina library has 11.
From Amazon's own top-20 Kindle best-seller lists from 2013, 2012 and 2011,
Kindle Unlimited has no more than five titles a year, while the San
Francisco library has at least 16.
Of Stephen King's 2012 list of his all-time 10 favorite books, Amazon and
the other subscription services have four, including classics such as George
Orwell's "1984" and Charles Dickens's "Bleak House." But the San Francisco
library has all of those, as well as Salman Rushdie's "The Satanic Verses"
and William Golding's "Lord of the Flies" for a total of eight. (My South
Carolina library also only has four.)
You will certainly find things to read on all of these paid services. For $9
per month, Scribd offers a slightly superior browsing experience and
collection, especially if you're the kind of reader who goes deep into a
genre. Of one critic's list of the "10 best vampire novels no one has read,"
Scribd has four, while Oyster has three and Amazon has none. My South
Carolina library has two, but the San Francisco library just has one.
But who needs to pay for a "Netflix for books" subscription? I am a little
awed by binge book readers. Still, not everyone is so voracious that they
could guarantee reading $120 worth of e-books in a year.
How They Stack Up
E-book subscription services don't always have the big-name e-books
available at some public libraries. Below, a comparison of the availability
of books on three services--Oyster, Kindle Unlimited and Scribd--with the
public libraries in San Francisco and Richland County, S.C. We compared
Amazon's top 20 best-sellers on Kindle from 2013, as well as a more esoteric
list of author Stephen King's 10 favorites.
Another argument against shelling out for Kindle Unlimited comes from Amazon
itself: If you own a Kindle device and subscribe to Amazon Prime, you
already get one e-book loan a month as part of the service.
The subscription companies say their services are designed to let you try
more books without the barrier of committing to buy. But none of these
services yet feel as complete as Spotify, the $10-per-month all-you-can-eat
music streaming service I used to explore the latest Miley Cyrus album
without the regret of buying it.
How did library e-book collections get such a leg up? Amazon is locked in a
hate-hate relationship with many publishers, so none of the five largest
will sell their whole collection to Amazon for its subscription service.
(Amazon bought a few big titles like the Harry Potter and Hunger Games
series, has 500 books already in the public domain and pads out the rest
with back-catalog and self-published books to reach the 600,000 tally it
touts.) And so far only two of the big publishers will sell even part of
their collections to startups Oyster and Scribd.
Over at the library, the situation is different. All of the big five
publishers sell their e-book collections for loans, usually on the same day
they're available for consumers to purchase. They haven't always been so
friendly with libraries, and still charge them a lot for e-books. Some
library e-books are only allowed a set number of loans before "expiring."
Publishers have come to see libraries not only as a source of income, but
also as a marketing vehicle. Since the Internet has killed off so many
bookstores, libraries have become de facto showrooms for discovering books.
I have a soft spot for public libraries. I grew up reading at the one where
my mom, now retired, worked. Like many, I hadn't used my library card much
since I started reading books on screens. But in the past few weeks,
discovering my library's e-book collection helped me reconnect with the
power of the library card I felt when I was young.
It's easier than you might think. At the typical public library, you need
only log in with your card number and a PIN to its e-book collection, then
search through the online catalog.
During online checkout, many will give you the choice to zap your borrowed
book directly to a Kindle reader, a tablet or phone app, or a computer
screen. When it's time to "return" the e-book, it just disappears.
In exchange for this free access, you have to accept a bit more hassle. Your
loans may expire after 21 days or less, but you can check them out again. Some
libraries have multiple e-book collections that you have to search and learn
to use. Most aren't as pretty as Scribd or Oyster, which let you scroll
through large images of book covers to find something that suits your fancy.
The most legitimate argument against libraries is the wait list: About half
the e-books I surveyed were checked out. This required placing a "hold" that
could last a few weeks, or sometimes even months. The smaller your library,
the less likely it could afford extra digital copies. (San Francisco's
library has some tech books and travel guides where publishers allow
unlimited simultaneous checkouts.)
What's available will depend on your community tax dollars--many libraries
took a hit in the last recession. Some communities have banded together to
create larger e-libraries; for example, all Colorado residents can use the
Denver Public Library.
Libraries' current collection advantage, born of those publisher contracts,
isn't likely to last forever. Publishers may resolve their squabbles with
Amazon or come to see paid subscriptions as a lucrative new market. What
would happen to libraries if they truly had to go head-to-head with Amazon?
Ultimately, the winner will be the service that offers the most convenient
access to the best array of e-books.
But libraries serve nobler purposes than just amassing vampire romances.
They provide equal access to knowledge, from employment services to computer
training. And in an age where getting things "free" usually means
surrendering some privacy, libraries have long been careful about protecting it.
The rise of paid subscription services is proof that there's demand for what
libraries can offer in our Internet era.
Write to Geoffrey A. Fowler at firstname.lastname@example.org or on Twitter at