Life Finds a Way

“Broadly speaking, the ability of the park is to control the spread of life forms.  Because the history of evolution is that life escapes all barriers.  Life breaks free.  Life expands to new territories.  Painfully, perhaps even dangerously.  But life finds a way.”

I am among those who have both admired the works of Michael Crichton and been concerned that he has at times been overly alarmist.  I am thinking in particular of his novel Prey, in which he describes the evolution of predatory swarms of self-replicating, homicidal nanobots.  It was an entertaining-enough novel, but unrealistic in its portrayal of the dangers of nanotechnology.  Such is the prerogative of fiction.  I found his book Jurassic Park, from which the above quotation is taken, to be more measured in its cautions.  Interestingly, Jurassic Park was written in 1990, a full decade before a real-life occurrence of just what he was describing.  In this case, it was not dinosaurs, of course, but corn.

One of the first so-called “plant pesticides” was StarLink corn, which was genetically engineered to incorporate genes from the bacterium Bacillus thuringiensis, long known to produce insecticidal toxins.  When the Environmental Protection Agency registered StarLink in 1998, it did so with the restriction that the corn be used only in animal feed and industrial products, not consumed by humans as food.  But, as Michael Crichton had pointed out years earlier, life finds a way.

In September 2000, a group of environmental and food-safety groups known as Genetically Engineered Food Alert announced that it had discovered StarLink corn in Taco Bell taco shells, prompting the first recall of food derived from a genetically modified organism.  Things quickly escalated, with some 300 kinds of food items ultimately being recalled because of concerns about the presence of StarLink corn.  Corn farmers protested.  Consumers of corn protested.  And the machinery of government was set in motion through the Food and Drug Administration and the Department of Agriculture to cooperate with the producer of StarLink in containing its spread.

The story of StarLink is a cautionary one that highlights the difficulty of trying to constrain the will of Nature, and it has relevance for the increasing use of various forms of nanotechnology.  Materials that fall under the very broad umbrella of “nanotechnology” are now used in more than 1000 household products, ranging from cosmetics to tennis racquets.  Perhaps even more interesting, though, are the more recent uses of nanoparticles in bone-replacement composites and chemotherapy delivery systems.

The amazing potential of these technologies can be readily appreciated just by considering the delivery of chemotherapy to cancer patients.  There are substances known to kill tumors effectively in many cases, but current delivery systems distribute them in a way that raises toxicity throughout a patient’s entire body — essentially trying to find the level of toxicity that will kill the tumor but not the patient, who becomes incapacitated by effects that include nausea, hair loss, bleeding, and diarrhea, among many others.  Using nanoparticles to deliver the substances directly to the tumors has the potential both to increase the effectiveness of the treatment and to reduce dramatically its impact on the rest of the patient’s body.

This week, I had the privilege of discussing legal aspects of nanotechnology with Dr. Hildegarde Staninger on her broadcast at One Cell One Light Radio.  A copy of the broadcast can be found here.  During our discussion, we touched on the capacity of nanoparticles, by virtue of their extraordinarily small size, to intrude unexpectedly into the environment.  There are known health risks associated with nanoparticles, such as the triggering of autophagic cell death in human lungs caused by polyamidoamine dendrimers, and there are surely unknown health risks as well.  We also discussed government regulation of nanotechnology, specifically how the very breadth of applications for nanotechnology makes that process difficult and how instead efforts have been made to incorporate nanotechnology into the existing regulatory framework.

Interestingly, this week saw one of the first attempts to deviate from that approach.  At the Nanodiagnostics and Nanotherapeutics meeting held at the University of Minnesota, an invited panel discussed draft guidelines, developed with the support of the National Institutes of Health, to provide regulatory oversight of medical applications of nanotechnology.  The final recommendations will not be available for some time, and the usual administrative rulemaking procedures allowing for public comment will need to be completed.  But the draft recommendations provide insight into how a nanotechnology-specific regulatory framework might develop.  Copies of papers the group published earlier this year can be found here and here (subscriptions required), and the (free) report on the conference recommendations by the journal Nature can be found here.

Briefly, the group appears to be converging on a recommendation for the creation of two additional bodies within the Department of Health and Human Services — an interagency group that consolidates information from other government agencies in evaluating risks and an advisory body that includes expert members of the public.  These strike me as good recommendations, and there is no doubt that the group considering them has weighed the merits and disadvantages of developing an oversight framework specific to the concerns presented by nanotechnology.

As I mentioned to Dr. Staninger during our discussion, it is very much my belief that dialogues that educate the public about the real risks of nanotechnology — not fictional psychopathic nanobot swarms — are needed in developing appropriate and effective regulation.  There are risks to nanotechnology, just as there are with every technology of such enormous benefit, and realistic management of those risks is part of the process of exploiting the technology to our benefit.

Tending Towards Savagery

I am no particular fan of the monarchy, but Prince Charles was given a bad rap in 2003 when he called for the Royal Society to consider the environmental and social risks of nanotechnology.  “My first gentle attempt to draw the subject to wider attention resulted in ‘Prince fears grey goo nightmare’ headlines,” he lamented in 2004.  Indeed, though somewhat misguided, the Prince’s efforts to draw attention to these issues were genuine, and they were not far from the mainstream perception that scientists sometimes become so absorbed with their discoveries that they pursue them without sober regard for the potential consequences.  A copy of his article can be read here, in which he claims never to have used the expression “grey goo,” and in which he makes a reasonable plea to “consider seriously those features that concern non-specialists and not just dismiss those concerns as ill-informed or Luddite.”

It is unfortunate that the term “grey goo” has become as inextricably linked with nanotechnology as the term “frankenfood” has become associated with food derived from genetically modified organisms.  The term has its origins in K. Eric Drexler’s 1986 book Engines of Creation:

[A]ssembler-based replicators will therefore be able to do all that life can, and more.  From an evolutionary point of view, this poses an obvious threat to otters, people, cacti, and ferns — to the rich fabric of the biosphere and all that we prize…. 

“Plants” with “leaves” no more efficient than today’s solar cells could out-compete real plants, crowding the biosphere with an inedible foliage.  Tough, omnivorous “bacteria” could out-compete real bacteria:  they could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days.  Dangerous replicators could easily be too tough, small, and rapidly spreading to stop…. 

Among the cognoscenti of nanotechnology, this threat has become known as the “gray goo problem.”

Even at the time, most scientists largely dismissed Drexler’s description as unrealistic, fanciful, and needlessly alarmist.  The debate most famously culminated in a series of exchanges in 2003 in Chemical and Engineering News between Drexler and Nobel laureate Richard Smalley, whose forceful admonition was applauded by many: 

You and people around you have scared our children.  I don’t expect you to stop, but I hope others in the chemical community will join with me in turning on the light and showing our children that, while our future in the real world will be challenging and there are real risks, there will be no such monster as the self-replicating mechanical nanobot of your dreams.

 Drexler did, in the end, recant, conceding in 2004 that “[t]he popular version of the grey-goo idea seems to be that nanotechnology is dangerous because it means building tiny self-replicating robots that could accidentally run away, multiply, and eat the world.  But there’s no need to build anything remotely resembling a runaway replicator, which would be a pointless and difficult engineering task….  This makes fears of accidental runaway replication … quite obsolete.”  But too many others have failed to take note, as sadly highlighted by this month’s bombing of two Mexican professors who work on nanotechnology research. 

Responsibility for the most recent bombings, as well as for other bombings in April and May, has been claimed by “Individualidades tendiendo a lo Salvaje” (roughly translated into English as “Individuals Tending Towards Savagery”), an antitechnology group that candidly claims Unabomber Ted Kaczynski as its inspiration.  The group even has its own manifesto.  It is not as long as the Unabomber’s, but it is equally contorted in attempting to justify the use of violence as a means of opposing technological progress.  A copy of the original manifesto can be read here and an English translation can be found here.

The manifesto references Drexler in setting out the absurd rationale for the group’s violence:

[Drexler] has mentioned … the possible spread of a grey goo caused by billions of nanoparticles self-replicating themselves voluntarily and uncontrollably throughout the world, destroying the biosphere and completely eliminating all animal, plant, and human life on this planet.  The conclusion of technological advancement will be pathetic, Earth and all those on it will have become a large gray mass, where intelligent nanomachines reign.

No clear-thinking person supports the group’s use of violence.  But at the same time, there are many nonscientists who are suspicious of the motivations that underlie much of scientific research.  One need only look at the recent news to understand the source of that distrust:  just this week, the Presidential Commission for the Study of Bioethical Issues released a report detailing atrocities committed by American scientists in the 1940s that involved the nonconsensual infection of some 1300 Guatemalans with syphilis, gonorrhea, or chancroid.  There are many other examples of scientists engaging in questionable practices with a secrecy that is counter to the very precepts of scientific investigation.

“Nanotechnology” is a wonderful set of technologies that have already found their way into more than 1000 commercial products sold in the electronics, medical, cosmetics, and other markets.  But even as the use of nanotechnology spreads, many remain concerned that allowing it is unwise, even if they would not go so far as to bomb the scientists working on the technology.  Here I find myself sympathetic with the real message that Prince Charles was attempting to spread — namely, that the concerns of the nonscientist public need to be addressed, even if those concerns seem ill-conceived.

Is Your Scientific Malpractice Insurance Paid Up?

“Thanks to his intuition as a brilliant physicist and by relying on different arguments, Galileo, who practically invented the experimental method, understood why only the sun could function as the centre of the world, as it was then known, that is to say, as a planetary system.  The error of the theologians of the time, when they maintained the centrality of the Earth, was to think that our understanding of the physical world’s structure was, in some way, imposed by the literal sense of Sacred Scripture.”

 Pope John Paul II, November 4, 1992

 

Pope John Paul II did for the Catholic Church in 1992 what scientists do every single day in their professional lives:  admit to a mistake in understanding the nature of the universe.  Scientists do it because it is a fundamental part of the scientific method to acknowledge the failings in our understanding of the world, and because of our collective commitment to improving that understanding by refusing to become doctrinaire.  A scientist gains no higher respect from his peers than when he tells them he was mistaken and goes on to share what he has learned from that mistake so that they may continue the advance of knowledge.  It is this fundamental pillar of the scientific method that, more than anything else, has been responsible for its tremendous and astonishing successes.

As the pope noted in his statement, Galileo was one of those responsible for demanding such an uncompromising commitment to the evidence of our own eyes and ears in drawing conclusions about the world.  For this, he was condemned by the Church, sentenced to live under house arrest at his farmhouse in Arcetri, where he would have little to do other than grow blind and die.  It would not be until 1835 — more than 200 years after his conviction — that the Vatican would remove his Dialogue Concerning the Two Chief World Systems from its list of banned books, and not until 1992 — more than 350 years after his conviction — that it would formally admit that it was wrong and Galileo was right.  (Some additional commentary that I have previously made about Galileo can be found here.)

I find it unfortunate that it is again in Italy that a ridiculous persecution of scientists is taking place.  It is not the Church this time, but rather the Italian state that is trying to hold scientists to a standard that fails to recognize the fundamental character of the scientific method.  On April 6, 2009, an earthquake struck Italy’s Abruzzo region, killing more than 300 people and damaging thousands of buildings.  About 65,000 people lost their homes, and most of them were forced to live for weeks in makeshift “tendopoli” — tent cities — erected to house the quake refugees, a sad circumstance that Prime Minister Silvio Berlusconi thoughtlessly suggested was an opportunity for them to enjoy a “camping weekend.”

The region had been experiencing earth tremors for more than ten weeks in advance of the earthquake, and on March 30 a magnitude-4.0 earthquake struck the region.  There was concern among the public that a larger earthquake would follow, as indeed it did a week later.  A meeting of the Major Risks Committee, which advises the Italian Civil Protection Agency on the risks of natural disasters, was held on March 31.  Minutes from the meeting show that the following statements were made about the possibility of a major earthquake in Abruzzo:  “A major earthquake in the area is unlikely but cannot be ruled out”; “in recent times some recent earthquakes have been preceded by minor shocks days or weeks beforehand, but on the other hand many seismic swarms did not result in a major event”; “because L’Aquila is in a high-risk zone it is impossible to say with certainty that there will be no large earthquake”; “there is no reason to believe that a swarm of minor events is a sure predictor of a major shock” — all the sorts of cautious statements made by scientists trying to place their understanding of the real risk in the context of what they know about seismology and what they do not.

But at a press conference later held by Bernardo De Bernardinis, a government official who was the deputy technical head of the Civil Protection Agency, reporters were told that “the scientific community tells us there is no danger, because there is an ongoing discharge of energy.”  The idea that small seismic events “release energy,” like letting a bit of steam out of a pressure cooker, is one that is soundly rejected by seismologists; the Earth does not function that way. 

In a bizarre aftermath, manslaughter charges have been brought against De Bernardinis and six seismologist members of the Major Risks Committee for failing to properly warn the public of the danger.  The charges were brought almost a year ago, but a preliminary hearing was not held until last week because of delays resulting from requests by dozens of those harmed by the earthquake to receive civil compensation from the accused scientists.  Astonishingly, the result of the hearing was not an outright dismissal of the charges, but a decision to proceed with a trial that will begin on September 20.

To my mind, this case is an absurd attack on scientists, demanding an infallibility from them that they never claim.  As one of the indicted seismologists noted, there are hundreds of seismic shocks in Italy every year:  “If we were to alert the population every time, we would probably be indicted for unjustified alarm.”  These scientists face not only potential incarceration for twelve years if they are convicted of manslaughter, but also potential civil liability for property damage resulting from the earthquake.  The fact that this possibility is even being entertained is alarming:  It is likely to have a detrimental effect on the kinds of information scientists are willing to share with the public.  And if there is a realistic potential for civil liability arising from the kinds of statements that scientists routinely make, it may indeed make sense for scientists to seek malpractice insurance.  The very idea, though, that scientific research should be haunted by the threat of legal liability in the way that medicine is already, is more than troubling.

No Nation Was Ever Ruined By Trade

“Canada is a country whose main exports are hockey players and cold fronts.  Our main imports are baseball players and acid rain.”

 Pierre Elliott Trudeau

 

One of the accusations frequently leveled at environmentalists is that they are, much like meteorologists, hopelessly fickle.  People remember widespread reports in the 1970s about the possibility of global cooling and the potentially imminent onset of another ice age, when now all the talk is of global warming.  Or they recall something of Paul Ehrlich’s dire predictions that agricultural production would be incapable of supporting the world’s population, which they have since watched more than double even as an obesity epidemic developed.  Or they remember how the controversy over acid rain became such an issue between the United States and Canada, so jeopardizing the Canada – U.S. Free Trade Agreement that Prime Minister Brian Mulroney cynically wondered whether it would be necessary to go to war with the United States over the issue.

No one talks about acid rain these days, at least not the way they used to.  But what changed? 

The impression that many in the public seem to have is that acid rain became an issue in the early 1980s, when images of dying forests and lakes were widely circulated, and then withered away as climatologists shifted their focus to other issues.  The reality is, of course, very different.  The effects of acidity in precipitation have been noted ever since the dawn of the Industrial Revolution, with the term “acid rain” coined by Robert Angus Smith in 1872.  Acid rain is associated with the emission of sulfur-, nitrogen-, and carbon-containing gases as byproducts of industrial processes, gases that form acidic compounds when they react with water in the atmosphere.  And the reason it is not discussed as widely as it once was is not that the issue mysteriously vanished or that climatologists are opportunistically fickle, but that actions were taken to reduce its impact.

It was George H. W. Bush who had pledged to become the “environmental president” and who in 1990 supported what was then an innovative approach to reducing targeted emissions.  The approach had been studied theoretically by economists and sought to adapt market mechanisms as an indirect form of regulation.  Rather than dictate through strict regulation how emissions should be reduced, the Clean Air Act was amended to put those market mechanisms in place by establishing what has since become known as a “cap and trade” system.  The basic idea was to limit the aggregate sulfur dioxide emissions from different sources, but to permit allowances to be traded so that the market would determine which sources were permitted to produce emissions within the limits, and at what levels.  There were many criticisms of the approach, most notably from environmentalists who fretted that it allowed large polluters to flex their economic muscle in buying permission to pollute.
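To make the mechanism concrete, here is a minimal sketch, with entirely hypothetical firms and invented numbers, of why capping aggregate emissions and letting allowances trade tends to meet the cap at a lower cost than ordering every source to cut by the same amount:

```python
# Minimal cap-and-trade sketch with hypothetical firms and numbers.
# Each firm has an uncontrolled emission level (tons) and a marginal
# cost of abatement ($/ton).  Under a hard cap, the cheapest way to
# comply is to abate at the lowest-cost sources first -- which is the
# allocation that allowance trading tends to discover on its own.

firms = {
    "A": {"emissions": 100, "abatement_cost": 200},   # $/ton to abate
    "B": {"emissions": 100, "abatement_cost": 800},
    "C": {"emissions": 100, "abatement_cost": 1500},
}

CAP = 180  # aggregate allowed emissions, in tons

total = sum(f["emissions"] for f in firms.values())
needed = total - CAP  # tons that must be abated somewhere

# Abate at the cheapest sources first: firms with cheap abatement cut
# deeply and sell their spare allowances to firms whose abatement is
# expensive.
cost = 0
remaining = needed
for name, f in sorted(firms.items(), key=lambda kv: kv[1]["abatement_cost"]):
    cut = min(remaining, f["emissions"])
    cost += cut * f["abatement_cost"]
    remaining -= cut
    if remaining == 0:
        break

# Compare with a uniform command-and-control rule that makes every
# firm cut the same fraction, regardless of its costs.
uniform_cost = sum(f["emissions"] * (needed / total) * f["abatement_cost"]
                   for f in firms.values())

print(f"cap met by trading allowances: ${cost:,.0f}")
print(f"cap met by uniform mandate:    ${uniform_cost:,.0f}")
```

The numbers are invented, but the qualitative result, the same cap met at a fraction of the cost of a uniform mandate, is the outcome the sulfur dioxide program bore out.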

But the program is largely acknowledged to have been a success, not only achieving full compliance in reducing sulfur dioxide emissions but actually resulting in emissions that were 22% lower than mandated levels during the first phase of the program.  This was also achieved at a significantly lower cost than had been estimated, with actual costs now determined to be about 20 – 30% of what had been forecast.  The annual cost of having companies figure out for themselves how to reduce acid-rain emissions has been estimated at about $3 billion, contrasted with an estimated benefit of about $122 billion in avoided death and illness, and healthier forests and lakes.  

The success of the acid-rain program is naturally being considered as a model for addressing the carbon emissions associated with global climate change.  Thus far, the United States has rejected a national implementation of cap-and-trade for carbon emissions, leading California to implement one itself under its Assembly Bill 32, a copy of which can be found here.  Signed by Governor Schwarzenegger in 2006, the bill requires California to reduce the state’s carbon emissions to 1990 levels by 2020.  A copy of California’s plan to do so using an implementation of cap-and-trade can be found here.

Part of what California seeks to do is improve on a generally failed cap-and-trade program that began in Europe in 2005.  One of the more significant problems with the European implementation was that governments began the program with an inadequate understanding of the level of carbon emissions in their countries.  Too many allowances were issued, and market forces quickly drove the price of carbon to zero by 2007.  In addition, a number of tax-fraud schemes and a recent theft of carbon credits stored in the Czech Republic registry have resulted in justifiable concern about the program, concern that some worry will spill over to the California program.

It is no surprise that the California program has been the subject of litigation, and last week a ruling was issued by the Superior Court in San Francisco agreeing that alternatives to a carbon-market program had not been sufficiently analyzed.  A copy of the ruling can be read here, and a copy of the (much more informative) earlier Statement of Decision can be read here.

There is considerable interest in the California program.  It is decidedly more ambitious than the more limited program implemented by ten states in the northeastern region of the United States and is being considered by some Canadian provinces as well as by some South American countries.  While last week’s decision certainly derails implementation of cap-and-trade in California temporarily, it is difficult to imagine that it will not ultimately be implemented after deficiencies in the studies have been addressed.  There is too much interest in it as a regulatory scheme that can have less adverse economic impact than other forms of regulation even while achieving the same overall objectives.

Of Hares and Lions

Alarmed at sound of fallen fruit
A hare once ran away
The other beasts all followed suit
Moved by that hare’s dismay.

They hastened not to view the scene,
But lent a willing ear
To idle gossip, and were clean
Distraught with foolish fear.

The quotation is a translation from The Jataka, a body of Indian literature relating the previous births of the Buddha, which dates to somewhere around the third or fourth century BC.  The story is perhaps the origin of the more modern story of Chicken Little, and it tells the parable of a hare that lived at the base of a vilva tree.  While the hare was idly wondering what would become of him should the earth be destroyed, a vilva fruit fell onto a palm leaf with a thud, leading him to conclude that the earth was indeed collapsing.  He spread his worry to other hares, then to larger mammals, all of them fleeing in panic that the earth was coming to an end.

When radiation is discussed in the news, I often think of the story of the hare and the vilva tree.  It seems that we are perpetually confronted with calls to overregulate radiation based on irrational fears of its effects on the human body.  Here I mentioned the persistent fears about radiation from cell phones and the passage of legislation in San Francisco last year requiring retailers to display radiation-level information when selling such devices, but there are many others that appear repeatedly in the news:  radiation from power lines, from computer screens, from cell-phone towers, and from any number of common household devices has at times been alleged to be responsible for causing cancer.  These allegations have been uniformly discredited in thousands of scientific publications over recent decades because none of these sources produces radiation at energies sufficient to break chemical bonds, the critical step in the mechanism by which radiation causes cancer.
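The physics behind that last point can be checked with a one-line calculation using Planck’s relation E = hf.  Here is a minimal sketch; the frequencies are typical values for each kind of source, and the roughly 3 eV bond energy is an order-of-magnitude figure rather than a property of any particular molecule:

```python
# Rough comparison of photon energies with the ~few eV needed to break
# a chemical bond (order-of-magnitude figures only).
PLANCK = 6.626e-34      # Planck constant, J*s
EV = 1.602e-19          # joules per electron-volt

def photon_energy_ev(frequency_hz):
    """Energy of a single photon, E = h*f, expressed in eV."""
    return PLANCK * frequency_hz / EV

sources = {
    "power line (60 Hz)":        60.0,
    "cell phone (~1.9 GHz)":     1.9e9,
    "microwave oven (2.45 GHz)": 2.45e9,
    "ultraviolet (~1e15 Hz)":    1.0e15,
}

BOND_ENERGY_EV = 3.0  # typical chemical bond, order of magnitude

for name, f in sources.items():
    e = photon_energy_ev(f)
    print(f"{name:28s} {e:10.3e} eV  "
          f"({e / BOND_ENERGY_EV:.1e} of a typical bond)")
```

A cell-phone photon comes out at a few millionths of an electron-volt, roughly a million times too small to break a bond, while an ultraviolet photon carries several electron-volts, which is why it is ionizing radiation, and not emissions from phones or power lines, that matters for cancer risk.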

Most recently, of course, there have been widespread reports of radiation being emitted from the damaged reactors at the Fukushima Dai-ichi power plant.  There have been demands for governments to suspend the use of nuclear reactors in generating power, and at least the German government appears to be acceding, with Chancellor Angela Merkel using absurd hyperbole in characterizing what is happening in Fukushima as a “catastrophe of apocalyptic dimensions.”

There are legitimate concerns about radiation.  It does cause cancer in human beings.  Regulating our exposure to harmful radiation is an important and necessary role for governments.  But at the same time, rationality must surely prevail based on accurate scientific understanding of the mechanisms by which radiation causes cancer. 

There are a number of simple facts relevant to the debate. 

First, human beings have evolved in an environment in which we are continually exposed to radiation — from cosmic rays, the Sun, and the ground all around us — and our bodies are adapted to exist with certain levels of radiation.  Indeed, the biological mechanisms by which we evolved as human beings are intimately related to that radiation exposure.  A wonderful graphic produced by Randall Munroe at xkcd illustrates different exposure levels and can be seen here.  One amusing fact shown in the chart is that eating a banana exposes a person to more radiation than living within fifty miles of a nuclear power plant for a year — and that both doses are dramatically smaller than the exposure from the natural potassium that exists in the human body.

Second, there are only a limited number of viable options for producing energy at the levels demanded by modern society:  basically, coal power generation and nuclear power generation.  Other “clean” technologies frequently pointed to, such as wind and solar power generation, are wonderful technologies that are worth pursuing and that may one day be efficient enough to replace coal and nuclear methods.  But that day is not here yet.  Those technologies are simply incapable of producing energy at the levels needed to support modern society and, in any event, have environmental concerns of their own that need to be considered and addressed.  I commented, for example, on some environmental concerns associated with wind power generation here.

Third, with the particular safety mechanisms that are in place, nuclear power generation is safer than coal power generation, its only realistically viable alternative.  The xkcd chart cited above notes, for instance, that there is greater radiation exposure from living within 50 miles of a coal power plant for a year than from living within the same distance of a nuclear power plant for a year.  The Clean Air Task Force, an organization that has been monitoring the health effects of energy sources since 1996, released a report last year finding that pollution from existing coal plants is expected to cause about 13,200 deaths per year, in addition to about 9700 hospitalizations and 20,000 heart attacks.  A copy of the report can be read here.  Nuclear power generation is also more environmentally responsible in that it does not release climate-changing greenhouse gases the way coal power generation does.  When considering policy to reduce or eliminate the use of nuclear power — a push driven largely by the fact that the smaller number of deaths from nuclear power result from isolated, high-profile events, while the greater number of deaths from coal power result from persistent low-level effects — it is important to take these comparisons into account.
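For a sense of scale, here is a small sketch of the kinds of comparisons the chart makes.  The dose figures are rounded, approximate values of the sort it collects; they are illustrative orders of magnitude, not authoritative dosimetry:

```python
# Approximate effective doses, in microsieverts (uSv), of the kind
# collected in the xkcd radiation chart.  Rounded, illustrative values
# only -- not authoritative dosimetry.
doses_usv = {
    "eating one banana":                                     0.1,
    "living within 50 miles of a nuclear plant for a year":  0.09,
    "living within 50 miles of a coal plant for a year":     0.3,
    "one chest x-ray":                                       20.0,
    "natural potassium in your own body, per year":          390.0,
}

banana = doses_usv["eating one banana"]
for source, dose in sorted(doses_usv.items(), key=lambda kv: kv[1]):
    print(f"{source:55s} {dose:8.2f} uSv  (~{dose / banana:6.1f} bananas)")
```

On these rough figures, a year spent near a nuclear plant is less than a banana’s worth of exposure, a year near a coal plant is a few bananas, and the potassium we all carry in our own bodies dwarfs both.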

I want to offer a final comment about radiation hormesis, which was recently raised most prominently by political commentator Ann Coulter in the context of the Fukushima event, although it has also been raised by other, more scientifically reliable, sources.  For example, Bob Park, a respected physicist who comments regularly on science and government policy, raised it in the context of a recent study that he interprets as showing that “Chernobyl survivors today suffer cancer at about the same rate as others their age [and that t]he same is true of Hiroshima survivors.”  His remarks can be found here.  Radiation hormesis is a hypothesized effect in which low-level exposure to radiation produces beneficial effects, and it has apparently been observed in laboratory settings.  It has not been convincingly confirmed in human beings, and the exposures said to produce the effect are, in any event, low.  My own view is that pointing to radiation hormesis as a positive argument for the use of nuclear power is counterproductive.  The effect is too speculative, and there are too many other — stronger — arguments in its favor.

The Jataka tells us that when the panicked masses led by the timid hare met a lion, it was he who brought them back to their senses, looking coldly at the facts and determining that the earth was not breaking apart; it was only the misunderstood sound of a falling vilva fruit.  Let’s be lions, not hares.