[Rachel’s introduction: Since the dawn of the Industrial Revolution, we
have assumed that economic activity tends to create a net benefit for
society as a whole, even if some people are harmed and the environment
is degraded. But now there is evidence that the entire planet and all
its inhabitants are imperiled by the total size of the human
enterprise. As a result, the precautionary principle has arisen as an
alternative way to balance our priorities. Two overarching decision
rules are competing for supremacy: “trust in economic growth” vs.
“precaution.” Europe is edging toward precaution. The U.S. is, so far,
sticking with “trust in economic growth.”]

By Joseph H. Guth

[Joseph H. Guth, JD, PhD, is Legal Director of the Science and Environmental Health Network.
He holds a PhD in biochemistry from University of Wisconsin (Madison),
and a law degree from New York University. His work has appeared
previously in Rachel’s #846, #861, #892, #901 and #905.]

Everyone knows the role of law is to control and guide the economy.
From law, not economics, springs freedom from slavery, child labor and
unreasonable working conditions. Law, reflecting the values we hold
dear, governs our economy’s infliction of damage to the environment.

Our law contains what might be called an overarching environmental
decision rule that implements our social choices. The structure of this
decision rule is an intensely political issue, for the people of our
democracy must support its far-reaching consequences. Today we (all of
us) are rethinking our current environmental decision rule, which our
society adopted in the course of the Industrial Revolution.

The “trust in economic growth” environmental decision rule

Our overarching environmental decision rule (which is also prevalent in
much of the rest of the world) constitutes a rarely-stated balance of
social values that is hard to discern even though it pervades every
aspect of our society.

This decision rule relies extensively on cost-benefit analysis and risk
assessment, but the decision rule itself is even broader in scope. The
foundation of the rule is the assumption that economic activity usually
provides a net benefit to society, even when it causes some damage to
human health and the environment. (This conception of “net benefit”
refers to the effect on society as a whole, and does not trouble itself
too much with the unequal distribution of costs and benefits.)

From this assumption, it follows that we should allow all economic
activities, except those for which someone can prove the costs outweigh
the benefits. This, then, is the prevailing environmental decision rule
of our entire legal system: economic activity is presumed to provide a
net social benefit even if it causes some environmental damage, and
government may regulate (or a plaintiff may sue) only if it can carry
its burden of proof to demonstrate that the costs outweigh the benefits
in a particular case. Let’s call this the “trust in economic growth”
decision rule.

The “precautionary principle” environmental decision rule

The “precautionary principle” is equal in scope to the “trust in
economic growth” decision rule, but incorporates a profoundly different
judgment about how to balance environment and economic activity when
they come into conflict. Under this principle, damage to the
environment should be avoided, even if scientific uncertainties remain.
This rule implements a judgment that we should presumptively avoid
environmental damage, rather than presumptively accept it as we do
under the “trust in economic growth” rule.

The role of “cost-benefit” analysis

“Cost-benefit” analysis is a subsidiary tool of both decision rules.
However, it is used in very different contexts under the two rules. It
can sometimes be employed under the “precautionary” decision rule as a
way to compare and decide among alternatives. But under the “trust in
economic growth” decision rule, cost-benefit analysis appears as the
only major issue and often masquerades as the decision rule itself.
This is because most of our laws implicitly incorporate the unstated
presumption that economic activities should be allowed even if they
cause environmental damage. By and large, our laws silently bypass that
unstated presumption and start out at the point of instructing
government to develop only regulations that pass a cost-benefit test.

Thus, the foundational presumption of the “trust in economic growth”
decision rule is simply accepted as a received truth and is rarely
examined or even identified as supplying our law’s overarching context
for cost-benefit calculations. Almost all economists probably agree
with it (except those few who are concerned with the global human
footprint and are trying to do full cost accounting for the economy as
a whole).

The role of “sound science”

How does science, particularly “sound science,” relate to all this?
Science supplies a critical factual input used by governments and
courts in implementing environmental decision rules. Science is
employed differently by the two decision rules, but science does not
constitute or supply a decision rule itself. Like cost-benefit
analysis, science is a subsidiary tool of the decision rules and so
cannot properly be placed in “opposition” to either decision rule. A
claim that the precautionary principle, as an overarching environmental
decision rule implementing a complex balancing of social values, is in
“opposition” to science is a senseless claim.

The phrase “sound science” represents the proposition that a scientific
fact should not be accepted by the legal system unless there is a high
degree of scientific certainty about it. It is a term used by industry
in regulatory and legal contexts and is not commonly used by scientists
while doing scientific research. However, it resonates within much of
the scientific community because it is a call to be careful and
rigorous.

“Sound science” also represents a brake on the legal system’s
acceptance of emerging science, of science that cuts across
disciplines, and of science that diverges from the established
traditions and methodologies that characterize many specific
disciplines of science. “Sound science” encourages scientists who are
concerned about the world to remain in their silos, to avoid looking at
the world from a holistic viewpoint, and to avoid professional risks.

But why does it work for industry? The call for government and law to
rely only on “sound science” when they regulate is a call for them to
narrow the universe of scientific findings that they will consider to
those that have a high degree of certainty.

This serves industrial interests under our prevailing “trust in
economic growth” decision rule because it restricts the harms to human
health and the environment that can be considered by government and law
to those that are sufficiently well established to constitute “sound
science.”

Because the burden of proof is on government, requiring government to
rely only on facts established by “sound science” reduces the scope of
possible regulatory activity by making it harder for government to
carry its burden to show that the benefits of regulation (avoidance of
damage to health and environment) outweigh the costs to industry.
Exactly the same dynamic is at play when plaintiffs try to carry their
burden of proof to establish legal liability for environmental damage.

Shifting the burden of proof would shift the effect of “sound science”

“Sound science” can help industrial interests under a precautionary
decision rule, but it also contains a seed of disaster for them.

Precaution is triggered when a threat to the environment is identified,
so that the more evidence required to identify a threat as such, the
fewer triggers will be pulled. While the precautionary principle is
designed to encourage environmental protection even in the face of
uncertainty, those opposed to environmental protection urge that the
threshold for identification of threats should require as much
certainty as possible, and preferably be based only on “sound science.”

The seed of disaster for industrial interests is this: the burden of
proof can be switched under the precautionary principle (so that when a
threat to the environment is identified the proponent of an activity
must prove it is safe — just as a pharmaceutical company must prove
that a drug is safe and effective before it can be marketed). When that
happens, a call for “sound science” actually would cut against such
proponents rather than for them. This is because proponents of an
activity would have to provide proof of safety under a “sound science”
standard. In other words, the call for “sound science” creates higher
burdens on those bearing the burden of proof. In fact, while debates
about “sound science” masquerade as debates about the quality of
science, the positions that different actors take are actually driven
entirely by the underlying legal assignment of the burden of proof.

Why precaution? Because of cumulative impacts.

One of the reasons for adopting the precautionary principle, rather
than the “trust in economic growth” decision rule, is “cumulative
impacts.”

The foundational assumption of the “trust in economic growth” rule
(that economic activity is generally to the net benefit of society,
even if it causes environmental damage) is further assumed to be true
no matter how large our economy becomes. To implement the “trust in
economic growth” rule, all we do is eliminate any activity without a
net benefit, and in doing this we examine each activity independently.
The surviving economic activities, and the accompanying cost-benefit-
justified damage to the environment, are both thought to be able to
grow forever.

Not only is there no limit to how large our economy can become, there
is no limit as to how large justified environmental damage can become
either. The “trust in economic growth” decision rule contains no
independent constraint on the total damage we do to Earth — indeed the
core structure of this decision rule assumes that we do not need any
such constraint. People who think this way see no need for the
precautionary principle precisely because they see no need for the
preferential avoidance of damage to the environment that it embodies.

But, as we now know, there is in fact a need for a limit to the damage
we do to Earth. Unfortunately, the human enterprise has now grown so
large that we are running up against the limits of the Earth — if we
are not careful, we can destroy our only home. (Examples abound: global
warming, thinning of Earth’s ozone shield, depletion of ocean
fisheries, shortages of fresh water, accelerated loss of species, and
so on.)

And it is the cumulative impact of all we are doing that creates this
problem. One can liken it to the famous “straw that broke the camel’s
back.” At some point “the last straw” is added to the camel’s load and
its carrying capacity is exceeded. Just as it would miss the larger picture to
assume that since one or a few straws do not hurt the camel, straw
after straw can be piled on without concern, so the “trust in economic
growth” decision rule misses the larger picture by assuming that
cost-benefit-justified environmental damage can grow forever.

Thus, it is the total size of our cumulative impacts that is prompting
us to revisit our prevailing decision rule. This is why we now need a
decision rule that leads us to contain the damage we do. It is why we
now must work preferentially to avoid damage to the Earth, even if we
forego some activities that would provide a net benefit if we lived in
an “open” or “empty” world whose limits were not being exceeded. We can
still develop economically, but we must live within the constraints
imposed by Earth itself.

Ultimately, the conclusion that we must learn to live within the
capacity of a fragile Earth to provide for us, painful as it is, is
thrust upon us by the best science that we have — the science that
looks at the whole biosphere, senses the deep interconnections between
all its parts, places us as an element of its ecology, recognizes the
time scale involved in its creation and our own evolution within it,
and reveals, forever incompletely, the manifold and mounting impacts
that we are having upon it and ourselves.


From: Environment News service

July 27, 2007


[Rachel’s introduction: Here we have new evidence that the timing of a
toxic exposure can be as important as the magnitude of the exposure.
This finding is undermining the traditional mantra of toxicology, “The
dose makes the poison.” Evidently, it is the dose and the timing of exposure that makes the poison.]

Geneva, Switzerland — An increased risk of cancer, heart and lung
disease in adults can result from exposures to certain environmental
chemicals during childhood, the World Health Organization said today.
This finding is part of the first report
ever issued by the agency focusing on children’s special susceptibility
to harmful chemical exposures at different stages of their growth.

Air and water contaminants, pesticides in food, lead in soil, as well as
many other environmental threats which alter the delicate organism of a
growing child may cause or worsen disease and induce developmental
problems, said the World Health Organization, WHO, releasing the report
at its Geneva headquarters.

The peer-reviewed report highlights the fact that in children, the
stage in their development when exposure to a threat occurs may be just
as important as the magnitude of the exposure.

“Children are not just small adults,” said Dr. Terri Damstra, team
leader for WHO’s Interregional Research Unit. “Children are especially
vulnerable and respond differently from adults when exposed to
environmental factors — and this response may differ according to the
different periods of development they are going through.”

“For example, their lungs are not fully developed at birth, or even at
the age of eight, and lung maturation may be altered by air pollutants
that induce acute respiratory effects in childhood and may be the
origin of chronic respiratory disease later in life,” Dr. Damstra said.

Over 30 percent of the global burden of disease in children can be attributed to environmental factors, the WHO study found.

The global health body said this report is the most comprehensive work
yet undertaken on the scientific principles to be considered in
assessing health risks in children.

The work was undertaken by an advisory group of 24 scientific experts,
representing 18 countries. They were convened by WHO to provide
insight, expertise, and guidance, and to ensure scientific accuracy and
objectivity. Once the text was finalized, it was sent to more than 100
contact points throughout the world for review and comment, and also
made available on WHO’s International Programme of Chemical Safety
website for external review and comment.

The central focus of the study is on the child from embryo through
adolescence and on the need to have a good understanding of the
interactions between exposure, biological susceptibility, and
socioeconomic and nutritional factors at each stage of a child’s
development.

The scientific principles proposed in the document for evaluating
environmental health risks in children will help the health sector,
researchers and policy makers to protect children of all ages through
improved risk assessments, appropriate interventions and focused
research, so that today’s children can become healthy adults.

Children have different susceptibilities during different life stages,
due to their dynamic growth and developmental processes, the authors
note.

Health effects resulting from developmental exposures prenatally and at
birth may include miscarriage, stillbirth, low birth weight and birth
defects.

Young children may die or develop asthma, neurobehavioral and immune
impairment. Adolescents may experience precocious or delayed puberty.

The vulnerability of children is increased in degraded and poor
environments, the report confirms. Neglected and malnourished children
suffer the most. These children often live in unhealthy housing, lack
clean water and sanitation services, and have limited access to health
care and education. For example, lead is known to be more toxic to
children whose diets are deficient in calories, iron and calcium.

WHO warns, “One in five children in the poorest parts of the world will
not live longer than their fifth birthday — mainly because of
environment-related diseases.”

This new volume of the Environmental Health Criteria series, Principles
for Evaluating Health Risks in Children Associated with Exposure to
Chemicals, is available online.

Copyright Environment News Service (ENS) 2007


From: Voice of America

August 8, 2007


[Rachel’s introduction: Scientists are concerned about the health risks
of an ingredient in many of the plastic containers used for food and
drinks. Bisphenol A — or BPA — is a common chemical used to make
tough, shatter-resistant plastic. Now there is a growing concern among
scientists about the health risks of BPA, especially for children.]

By Melinda Smith

Bisphenol A is everywhere. BPA, as it is also called, can be found in
the plastic milk bottle used to feed your baby, the cola bottle or food
container you pick up for a fast food meal, the kidney dialysis machine
patients need to keep them alive, and the dental sealant used to help
prevent tooth decay.

A recent study
in the journal Reproductive Toxicology raises concerns about adverse
effects of BPA in fetal mice, at levels even lower than U.S. government
standards permit.

The scientists say widespread exposure through food and liquid
containers occurs when the chemical bonds that hold bisphenol A in the
plastic degrade. BPA then leaches into the contents of the containers.

Frederick vom Saal, a professor of biological sciences at the
University of Missouri, says he is concerned about what we absorb and
then transmit to our infants. “Very low doses of this — below the
amounts that are present in humans — particularly when exposure occurs
in fetuses and newborns, you end up with those babies eventually
developing prostate cancer, breast cancer.”

Kimberly Lisack is a mother who wants more information about what she
feeds her baby. “I get concerned looking at a lot of the packaged baby
foods. It’s a lot of chemicals and things I don’t know what they are.”

Bisphenol A has been produced in polycarbonate plastic for decades. A
statement released by the American Chemistry Council says the report is
at odds with other international studies that say BPA levels pose no
health risks to consumers.


From: OnEarth

August 1, 2007


[Rachel’s introduction: Discoveries about the impact of the environment
on our genes could revolutionize our concept of illness.]

by Laura Wright

Martha Herbert, a pediatric neurologist at Boston’s Massachusetts
General Hospital, studies brain images of children with autism. She was
seeing patients one day a few years ago when a 3-year-old girl walked
in with more than the usual cognitive and behavioral problems. She was
lactose intolerant, and foods containing gluten always seemed to upset
her stomach. Autistic children suffer profoundly, and not just in their
difficulty forming emotional bonds with family members, making friends,
or tolerating minor deviations from their daily routines. Herbert has
seen many young children who’ve had a dozen or more ear infections by
the time they made their way through her door, and many others — “gut
kids” — with chronic diarrhea and other gastrointestinal problems,
including severe food allergies. Such symptoms don’t fit with the
traditional explanation of autism as a genetic disorder rooted in the
brain, and that was precisely what was on Herbert’s mind that day.
She’s seen too many kids whose entire systems have gone haywire.

During the course of the little girl’s appointment, Herbert learned
that the child’s father was a computer scientist — a bioinformatist no
less, someone trained to crunch biological data and pick out patterns
of interest. She shared with him her belief that autism research was
overly focused on examining genes that play a role in brain development
and function, to the exclusion of other factors — namely, children’s
susceptibility to environmental insults, such as exposure to chemicals
and toxic substances. Inspired by their conversation, Herbert left the
office that day with a plan: She and the girl’s father, John Russo,
head of computer science at the Wentworth Institute of Technology,
would cobble together a team of geneticists and bioinformatists to root
through the scientific literature looking for genes that might be
involved in autism without necessarily being related to brain
development or the nervous system.

The group scanned databases of genes already known to respond to
chemicals in the environment, selecting those that lie within sequences
of DNA with suspected ties to autism. They came up with more than a
hundred matches, reinforcing Herbert’s belief that such chemicals
interact with specific genes to make certain children susceptible to
autism.

Although some diseases are inherited through a single genetic mutation
— cystic fibrosis and sickle cell anemia are examples — the classic
“one gene, one disease” model doesn’t adequately explain the complex
interplay between an individual’s unique genetic code and his or her
personal history of environmental exposures. That fragile web of
interactions, when pulled out of alignment, is probably what causes
many chronic diseases: cancer, obesity, asthma, heart disease, autism,
and Alzheimer’s, to name just a few. To unravel the underlying
biological mechanisms of these seemingly intractable ailments requires
that scientists understand the precise molecular dialogue that occurs
between our genes and the environment — where we live and work, what
we eat, drink, breathe, and put on our skin. Herbert’s literature scan
was a nod in this direction, but actually teasing out the answers in a
laboratory has been well beyond her or anyone else’s reach — until now.

Consider for a moment that humans have some 30,000 genes, which
interact in any number of ways with one or more of the 85,000
synthetic, commercially produced chemicals, as well as heavy metals,
foods, drugs, myriad pollutants in the air and water, and anything else
our bodies absorb from the environment. The completion of the Human
Genome Project in 2003 armed scientists with a basic road map of every
gene in the human body, allowing them to probe more deeply into the
ways our DNA controls who we are and why we get sick, in part by
broadening our understanding of how genes respond to external factors.
In the years leading up to the project’s completion, scientists began
developing powerful new tools for studying our genes. One is something
called a gene chip, or DNA microarray, which came about through the
marriage of molecular biology and computer science. The earliest
prototype was devised about a decade ago; since then these tiny
devices, as well as other molecular investigative tools, have grown
exponentially in their sophistication, pushing medical science toward a
new frontier.

Gene chips are small, often no larger than your typical domino or glass
laboratory slide, yet they can hold many thousands of genes at a time.
Human genes are synthesized and bound to the surface of the chip such
that a single copy of each gene — up to every gene in an organism’s
entire genome — is affixed in a grid pattern. The DNA microarray
allows scientists to take a molecular snapshot of the activity of every
gene in a cell at a given moment in time.

The process works this way: Every cell in your body contains the same
DNA, but DNA activity — or expression — is different in a liver cell,
say, than it is in a lung, brain, or immune cell. Suppose a scientist
wishes to analyze the effect of a particular pesticide on gene activity
in liver cells. (This makes sense, since it is the liver that processes
and purges many toxins from the body.) A researcher would first expose
a liver cell culture in a test tube to a precise dose of the chemical.
A gene’s activity is observed through the action of its RNA, molecules
that convey the chemical messages issued by DNA. RNA is extracted from
the test tube, suspended in a solution, then poured over the gene chip.
Any given RNA molecule will latch on only to the specific gene that
generated it. The genes on the chip with the most RNA stuck to them are
the ones that were most active in the liver cells, or most “highly
expressed.” The genes that don’t have any RNA stuck to them are said to
be “turned off” in those cells. Scientists use the microarray to
compare the exposed cells to non-exposed, control cells (see sidebar).
Those genes that show activity in the exposed cells but not in the
control cells, or vice versa, are the ones that may have been most
affected by the pesticide exposure.
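The exposed-versus-control comparison described above can be sketched in a few lines of Python. This is only a toy illustration: the gene names, signal values, and the two-fold change threshold are invented, not drawn from the article.

```python
# Toy model of a microarray readout: each "chip" maps a gene name to
# the amount of RNA that hybridized to it (its expression signal).

def differentially_expressed(exposed, control, fold_change=2.0):
    """Return genes whose exposed/control signal ratio crosses the threshold."""
    flagged = []
    for gene, e in exposed.items():
        c = control.get(gene, 0.0)
        if c == 0.0:
            # a gene "turned off" in controls but active when exposed
            ratio = float("inf") if e > 0 else 1.0
        else:
            ratio = e / c
        if ratio >= fold_change or ratio <= 1.0 / fold_change:
            flagged.append(gene)
    return flagged

# Hypothetical signals for three genes in pesticide-exposed liver cells:
exposed_chip = {"il6": 9.0, "cyp1a1": 8.0, "actb": 5.0}
control_chip = {"il6": 1.5, "cyp1a1": 2.0, "actb": 5.1}
print(differentially_expressed(exposed_chip, control_chip))  # ['il6', 'cyp1a1']
```

Real analyses use replicate chips and statistical tests rather than a bare fold-change cutoff, but the core logic is the same: compare each gene's activity across the two conditions and flag the outliers.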

DNA microarrays open the door to an entirely new way of safety-testing
synthetic chemicals: Each chemical alters the pattern of gene activity
in specific ways, and thus possesses a unique genetic fingerprint. If a
chemical’s genetic fingerprint closely matches that of another
substance already known to be toxic, there is good reason to suspect
that that chemical can also do us harm. Ultimately, government agencies
charged with regulating chemicals and protecting our health could use
this method, one aspect of a field called toxicogenomics, to wade
through the thousands of untested or inadequately studied chemicals
that circulate in our environment. In other words, these agencies could
make our world safer by identifying — and, one hopes, banning —
hazardous substances.
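The fingerprint-matching idea can be made concrete with a short sketch: treat each chemical's gene-activity pattern as a vector and score similarity with a Pearson correlation. The profiles and the 0.9 cutoff below are invented for illustration; real toxicogenomic databases use far larger profiles and more sophisticated similarity measures.

```python
from math import sqrt

def correlation(a, b):
    """Pearson correlation between two equal-length gene-activity profiles."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / sqrt(var_a * var_b)

# Hypothetical activity levels of four genes under three chemicals:
known_toxicant = [2.0, 8.0, 1.0, 7.0]
candidate = [2.2, 7.5, 1.1, 6.8]   # tracks the toxicant closely
unrelated = [5.0, 1.0, 8.0, 2.0]   # a very different pattern

print(correlation(known_toxicant, candidate) > 0.9)   # True: suspect fingerprint
print(correlation(known_toxicant, unrelated) > 0.9)   # False
```

A high correlation does not prove the candidate is toxic; it only flags the chemical as deserving the closer scrutiny the article describes.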

For such a young field, toxicogenomics has already begun to challenge
some fundamental assumptions about the origins of disease and the
mechanisms through which chemicals and various environmental exposures
affect our bodies. Consider the case of mercury, which was identified
as poisonous many centuries ago. Its potential to wreak havoc on the
human nervous system was most tragically demonstrated in the mass
poisoning of the Japanese fishing village of Minamata in the 1950s.
More recently, scientists have begun to amass evidence suggesting that
mercury also harms the immune system. In 2001, Jennifer Sass, a
neurotoxicologist and senior scientist at the Natural Resources Defense
Council (NRDC), who was then a postdoctoral researcher at the
University of Maryland, designed an experiment that included the use of
microarrays and other molecular tools to figure out how, exactly,
mercury was interfering with both our nervous and immune systems. She
grew cells in test tubes — one set for mouse brain cells, another for
mouse liver cells — and exposed them to various doses of mercury so
that she could see which genes were being switched on and off in the
presence of the toxic metal. In the brain and the liver cells, she
noticed unusual activity in the gene interleukin-6, which both responds
to infection and directs the development of neurons.

“We thought we had mercury figured out,” says Ellen Silbergeld, a
professor of environmental health sciences at Johns Hopkins University,
who collaborated with Sass on the study. Genomic tools may identify
effects of other chemicals by allowing scientists to “go fishing,” as
Silbergeld puts it, for things they didn’t know to look for.

The findings of Sass, Silbergeld, and others indicate that mercury
might play a role in the development of diseases involving immune
system dysfunction. These diseases perhaps include autism — think of
Herbert’s patients with their inexplicable collection of infections and
allergies — but also the spate of autoimmune disorders that we can’t
fully explain, from Graves’ disease and rheumatoid arthritis to
multiple sclerosis and lupus.

“Do we need to reevaluate our fish advisories?” Silbergeld asks. “Are
our regulations actually protecting the most sensitive people?” We
target pregnant women and children because we’ve presumed that
mercury’s neurotoxic effects are most damaging to those whose brains
are still developing. Sass and Silbergeld’s findings don’t contradict
that assumption, but they do suggest that there might be other adults
who are far more vulnerable than we’d realized — who simply can’t
tolerate the more subtle effect the metal has on their immune system
because of a peculiarity in their genetic makeup. Designing fish
advisories for those people, whose sensitivities are coded in their
DNA, is a challenge we’ve never tackled before.

Translating new findings about how chemicals affect gene activity into
something of broader public health value will require that we
understand precisely the tiny genetic differences among us that make
one person or group of people more vulnerable than others to certain
environmental exposures. One way to do that is by slightly modifying
the gene chip to allow researchers to scan up to a million common
genetic variants — alternate spellings of genes, so to speak, that
differ by just a single letter — to look for small differences that
might make some people more likely to get sick from a toxic exposure.
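The variant-scanning studies mentioned above rest on a simple comparison: is a given single-letter variant more common among the people who got sick? A minimal sketch, with an invented variant and invented counts:

```python
def risk_ratio(cases_with, cases_total, controls_with, controls_total):
    """How many times more frequent the variant is in cases than controls."""
    return (cases_with / cases_total) / (controls_with / controls_total)

# Hypothetical: a variant carried by 60 of 200 affected children but
# only 30 of 400 unaffected ones.
print(round(risk_ratio(60, 200, 30, 400), 1))  # 4.0
```

Genuine association studies add significance testing and corrections for scanning up to a million variants at once, since some apparent associations will arise by chance alone.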

Our attempts to identify those who are most genetically susceptible to
developing a particular disease as a result of environmental exposures
have already yielded important insights. Patricia Buffler, dean emerita
of the School of Public Health at the University of California,
Berkeley, has found that children with a certain genetic variant may be
susceptible to developing leukemia in high-traffic areas, where they’re
likely to be exposed to benzene compounds in auto exhaust. Other
studies have found that a particular genetic variation in some women
who drink chlorinated municipal water leads to an increased likelihood
that they’ll give birth to underweight babies. Still others have found
that a specific version of an immune gene, HLA-DP, renders people
vulnerable to the toxic effects of the metal beryllium, which causes a
chronic lung condition in the genetically sensitive population. This
particular vulnerability raises some sticky workplace issues. Toxic
exposure to beryllium occurs almost exclusively in industrial settings
where welders and other machinists come in contact with the metal while
making defense industry equipment, computers, and other electronics.
Should employers test their workers for genetic variants that may put
them at risk for developing a disease? Could that information be used
to bar someone from a job? Such ethical considerations, and their legal
and public policy ramifications, will only multiply as we learn more.

But first, a more fundamental question: Do we even understand what
today’s chronic diseases are? It is beginning to appear that what we
call autism may in fact be many illnesses that we’ve lumped together
because those who are afflicted seem to behave similarly. Doctors base
their diagnosis on behavioral symptoms, not on what caused those
symptoms. Some scientists now refer to the condition as “autisms,”
acknowledging that we’ve yet to find a single, unifying biological
mechanism, despite the identification, in some studies, of a handful of
genes that may confer increased vulnerability. But then, genes or
environmental exposures that appear to be important causal factors in
one study may not show up at all in another. This leaves scientists to
wonder whether the disease is so diverse in its biological origins that
only a truly massive study — involving many thousands of patients —
would have the statistical power to tease apart the various factors
involved.

The same difficulty probably holds true for many chronic diseases,
explains Linda Greer, a toxicologist and director of the health program
at NRDC. “What we think of as asthma, for example, is probably not one
condition at all. It’s probably many different diseases that today we
simply call asthma.” Seemingly contradictory explanations for the
epidemic could all turn out to be true. Until we are able to sift out
what makes one asthmatic child different from the next — how and why
their respective molecular makeups differ — treatments or preventive
measures that work for one child will continue to fail for another.

At the Centers for Disease Control and Prevention, Muin Khoury, the
director of the National Office of Public Health Genomics, has created
theoretical models to try to figure out just how many different factors
may be involved in most chronic diseases. His findings suggest that
some combination of 10 to 20 genes plus a similar number of
environmental influences could explain most of the complex chronic
diseases that plague the population. But to analyze how even a dozen
genes interact with a dozen environmental exposures across large
populations requires vast studies: immense numbers of people and huge
volumes of data — everything from precise measurements of gene
activity inside cells to exact recordkeeping of subjects’ exposure to
environmental hazards. Microarrays and other molecular tools now make
such studies possible.

In 2003, Columbia University and the Norwegian government together
launched the Autism Birth Cohort, one of the largest autism
investigations in history. The study will track 100,000 Norwegian
mothers and children — from before birth through age 18 — collecting
clinical data, blood, urine, and other biological materials. It will
also collect RNA in order to analyze gene activity. Though initial
results are due in 2011, it will take decades to complete this study,
and RNA samples will have to be painstakingly archived while the
investigators await additional funding. Although the current study is
not focused on environmental health per se, researchers plan to measure
a variety of biological exposures — including infection, environmental
toxins, and dietary deficiencies — in each mother and child. As the
children grow up, and as some among them develop disease, scientists
will have complete records to analyze for key commonalities and
differences. Which genes do the sick children have in common? Which
chemical exposures were most meaningful? The answers may provide clues
not only to the origins of autism, but to many other disorders, from
cerebral palsy to asthma to diabetes. Other archiving projects are even
more ambitious, such as the U.K. Biobank project, which has begun to
enroll 500,000 people to create the world’s largest resource for
studying the role of the environment and genetics in health and disease.

As vital to our understanding of human disease as such studies may
prove to be, a 50-year-old taking part in the U.K. Biobank project
isn’t likely to reap the rewards. “It will take a long time to make
sense of the data,” says Paul Nurse, a 2001 Nobel laureate in medicine
and the president of Rockefeller University. According to Nurse, it may
well be that most of the researchers starting these studies today won’t
see the final results — the data will be analyzed by their children.
In his estimation, that’s all the more reason “to get on with it.”

In response to concern that environmental exposures were affecting
children’s health, the Clinton administration in 2000 launched the
National Children’s Study, the largest such undertaking in the United
States, under the auspices of the National Institutes of Health. The
goal was to enroll 100,000 children; a genetic biobanking component has
since been added. Investigators have not yet recruited participants, in
part because of financial uncertainties. The Bush administration’s 2007
budget proposal completely eliminated money for the study, though
Congress reinstated funding in February.

The irony is that cutting funding for such projects may be the most
expensive option of all. Even if we successfully address campaign-
dominating political issues like skyrocketing medical costs and the
growing ranks of the uninsured, our failure to consider the fundamental
mechanisms of disease — the interplay between our genes and the
environment — could still bankrupt us, socially if not financially.
Until we’re able to interrupt the slide toward disease much earlier,
based on our developing knowledge of how genes and the environment
interact, medicine will remain the practice of “putting people back
together after they’ve been hit by the train,” says Wayne Matson, a
colleague of Martha Herbert’s who studies Huntington’s and other
neurodegenerative diseases at the Edith Nourse Rogers Memorial Veterans
Hospital in Bedford, Massachusetts. “It would be a lot better if we
knew how to pull that person off the tracks in the first place.”

Copyright 2007 by the Natural Resources Defense Council


From: Reuters

August 9, 2007


[Rachel’s introduction: A new study in Science magazine predicts that global warming will accelerate after 2009.]

Washington (Reuters) — Global warming is forecast to set in with a
vengeance after 2009, with at least half of the five following years
expected to be hotter than 1998, the warmest year on record, scientists
reported on Thursday.

Climate experts have long predicted a general warming trend over the
21st century spurred by the greenhouse effect, but this new study gets
more specific about what is likely to happen in the decade that started
in 2005.

To make this kind of prediction, researchers at Britain’s Met Office —
which deals with meteorology — made a computer model that takes into
account such natural phenomena as the El Nino pattern in the Pacific
Ocean and other fluctuations in ocean circulation and heat content.

A forecast of the next decade is particularly useful because climate
over this period could be dominated by these natural changes, rather
than by human-caused global warming, study author Douglas Smith said.

In research published in the journal Science,
Smith and his colleagues predicted that the next three or four years
would show little warming despite an overall forecast that saw warming
over the decade.

“There is… particular interest in the coming decade, which represents
a key planning horizon for infrastructure upgrades, insurance, energy
policy and business development,” Smith and his co-authors noted.

The real heat will start after 2009, they said.

Until then, the natural forces will offset the expected warming caused
by human activities, such as the burning of fossil fuels, which
releases the greenhouse gas carbon dioxide.


To check their models, the scientists used a series of “hindcasts” —
forecasts that look back in time — going back to 1982, and compared
what their models predicted with what actually occurred.

Factoring in the natural variability of ocean currents and temperature
fluctuations yielded an accurate picture, the researchers found. This
differed from other models, which mainly considered human-caused
climate change.

“Over the 100-year timescale, the main change is going to come from
greenhouse gases that will dominate natural variability, but in the
coming 10 years the natural internal variability is comparable,” Smith
said.

In another climate change article in the online journal Science
Express, U.S. researchers reported that soot from industry and forest
fires had a dramatic impact on the Arctic climate, starting around the
time of the Industrial Revolution.

Industrial pollution brought a seven-fold increase in soot — also
known as black carbon — in Arctic snow during the late 19th and early
20th centuries, scientists at the Desert Research Institute found.

Soot, mostly from burning coal, reduces the reflectivity of snow and
ice, letting Earth’s surface absorb more solar energy and possibly
resulting in earlier snow melts and exposure of much darker underlying
soil, rock and sea ice. This in turn led to warming across much of the
Arctic region.

At its height from 1906 to 1910, estimated warming from soot on Arctic
snow was eight times that of the pre-industrial era, the researchers
said.

Copyright 2007 Reuters Ltd


From: News & Observer (Raleigh, N.C.)

August 9, 2007


[Rachel’s introduction: A decade-long study finds that planting trees
is not a real solution to the problem of global warming. (Still, there
are many good reasons to plant trees.)]

By Margaret Lillard, The Associated Press

RALEIGH — A decade-long experiment led by Duke University scientists
indicates that trees provide little help in offsetting increased levels
of the greenhouse gas carbon dioxide.

Although the trees grew more, only those that got the most water and
nutrients could store significant amounts of carbon.

“The responses are very variable according to how available other
resources are — nutrients and water — that are necessary for tree
growth,” said Heather McCarthy, a former graduate student at the
private university in Durham who spent 6 1/2 years on the project.
“It’s really not anywhere near the magnitude that we would really need
to offset emissions.”

McCarthy, now a postdoctoral fellow at the University of California at
Irvine, presented the findings this week at a national meeting of the
Ecological Society of America in San Jose, Calif. Researchers from the
U.S. Forest Service, Boston University and the University of Charleston
also contributed to the report.

All helped in the Free Air Carbon Enrichment experiment, in which pine
trees in Duke Forest were exposed to higher-than-normal levels of
carbon dioxide.

The scientists also gathered data on whether the forest could grow fast
enough to help control predicted increases in the level of carbon
dioxide.

The loblolly pines grew more tissue, but only those that got the most
water and nutrients were able to store enough carbon to have any impact
on global warming, the scientists discovered.

“These trees are storing carbon,” McCarthy said Wednesday. “It’s just not such a dramatic quantity more.”

That means proposals to use trees to bank increasing amounts of carbon
dioxide emitted by humans may depend too heavily on the weather and
large-scale fertilization to be feasible.

“It would be an attractive solution, for sure,” McCarthy said. “I don’t
know how realistic people thought it was, but I think there were
certainly high hopes.”

Scientists blame the worldwide buildup of carbon dioxide — due largely
to the burning of fossil fuel — for global warming. The United States
is second only to China in the level of greenhouse gases it emits.

The experiment site, funded by the U.S. Department of Energy, holds
four forest plots dominated by loblolly pines in Duke Forest, in
north-central North Carolina.

The trees are exposed to extra levels of carbon dioxide from computer-
controlled valves that are mounted on rings of towers above the
treetops. The valves can be adjusted to account for wind speed and
direction, and sensors throughout the plot monitor carbon dioxide
levels, McCarthy said. Four more plots received no extra gas.

Trees exposed to the gas produced about 20 percent more biomass — wood
and vegetation — than untreated trees. But the researchers said the
amounts of available water and nitrogen nutrients varied substantially
between plots.

Ram Oren, the project director and a professor of ecology at Duke’s
Nicholas School of the Environment and Earth Sciences, said in a
written statement that replicating the tree growth in the real world
would be virtually impossible.

“In order to actually have an effect on the atmospheric concentration
of CO2, the results suggest a future need to fertilize vast areas,”
Oren said. “And the impact on water quality of fertilizing large areas
will be intolerable to society.”

Copyright 2007, The News & Observer Publishing Company


From: International Herald Tribune

August 7, 2007


[Rachel’s introduction: One of the predicted consequences of global
warming is greater extremes of weather — more droughts, floods,
tornadoes, hurricanes, and so on. The prediction seems to be coming
true.]

By Reuters and The Associated Press

GENEVA: Much of the world has experienced record-breaking weather
events this year, from flooding in Asia to heat waves in Europe and
snowfall in South Africa, the United Nations weather agency said.

The World Meteorological Organization said global land surface
temperatures in January and April were the warmest since such data
began to be recorded in 1880, at more than one degree Celsius higher
than average for those months.

There have also been severe monsoon floods across South Asia;
abnormally heavy rains in northern Europe, China, Sudan, Mozambique and
Uruguay; extreme heat waves in southeastern Europe and Russia; and
unusual snowfall in South Africa and South America this year, the
meteorological agency said.

“The start of the year 2007 was a very active period in terms of
extreme weather events,” Omar Baddour of the agency’s World Climate
Program said in Geneva.

While most scientists believe extreme weather events will be more
frequent as heat-trapping carbon dioxide emissions cause global
temperatures to rise, Baddour said it was impossible to say with
certainty what the second half of 2007 would bring. “It is very
difficult to make projections for the rest of the year,” he said.

The Intergovernmental Panel on Climate Change, a UN group of hundreds
of experts, has noted an increasing trend in extreme weather events
over the past 50 years and said irregular patterns will probably
continue.

South Asia’s worst monsoon flooding in recent memory has affected 30
million people in India, Bangladesh and Nepal, destroying croplands,
livestock and property and raising fears of new health crises in the
densely populated region.

Heavy rains also hit southern China in June, with nearly 14 million
people affected by floods and landslides that killed 120 people, the
World Meteorological Organization said.

England and Wales this year had their wettest May and June since
records began in 1766, resulting in extensive flooding and more than $6
billion in damage, as well as at least nine deaths. Germany swung from
its driest April since country-wide observations started in 1901 to its
wettest May on record. And torrential rains have followed weeks of
severe drought in northern Bulgaria — officials said Tuesday that at
least seven people have been killed in floods.

Mozambique suffered its worst floods in six years in February, followed
by a tropical cyclone the same month. Flooding of the Nile River in
June caused damage in Sudan.

In May, Uruguay had its worst flooding since 1959.

In June, the Arabian Sea had its first documented cyclone, which touched Oman and Iran.

Temperatures also strayed from the expected this year. Records were
broken in southeastern Europe in June and July, and in western and
central Russia in May. In many European countries, April was the
warmest ever recorded.

Argentina and Chile saw unusually cold winter temperatures in July
while South Africa had its first significant snowfall since 1981.

The World Meteorological Organization and its 188 member states are
working to set up an early warning system for extreme weather events.
The agency also wants to improve monitoring of the impacts of climate
change, particularly in poorer countries that are expected to bear the
brunt of floods.

As exceptionally heavy rains continued to cut a wide swath of ruin
across northern India, a top UN official warned Tuesday that climate
change could destroy vast swaths of farmland in the country, ultimately
affecting food production and adding to the problems of already
desperate peasants, The New York Times reported from New Delhi.

Even a small increase in temperatures, said Jacques Diouf, head of the
Food and Agricultural Organization, could push down crop yields in the
world’s southern regions, even as agricultural productivity goes up in
the north. A greater frequency of droughts and floods, one of the
hallmarks of climate change, the agency added, could be particularly
bad for agriculture.

“Rain-fed agriculture in marginal areas in semi-arid and sub-humid
regions is mostly at risk,” Diouf said. “India could lose 125 million
tons of its rain-fed cereal production — equivalent to 18 percent of
its total production.” That is a sign of the steep human and economic
impact of extreme weather in India.

Copyright International Herald Tribune


Rachel’s Democracy & Health News (formerly Rachel’s
Environment & Health News) highlights the connections between
issues that are often considered separately or not at all.

The natural world is deteriorating and human health is declining
because those who make the important decisions aren’t the ones who bear
the brunt. Our purpose is to connect the dots between human health, the
destruction of nature, the decline of community, the rise of economic
insecurity and inequalities, growing stress among workers and families,
and the crippling legacies of patriarchy, intolerance, and racial
injustice that allow us to be divided and therefore ruled by the few.

In a democracy, there are no more fundamental questions than, “Who gets
to decide?” And, “How DO the few control the many, and what might be
done about it?”

Rachel’s Democracy and Health News is published as often as necessary
to provide readers with up-to-date coverage of the subject.


Peter Montague –

Tim Montague –


To start your own free Email subscription to
Rachel’s Democracy & Health News send a blank Email to:

In response, you will receive an Email asking you to confirm that you want to subscribe.


Environmental Research Foundation

P.O. Box 160, New Brunswick, N.J. 08903