Cheaper Than Chimpanzees

It was World War II.  Soldiers were being deployed to areas where disease, notably tropical diseases such as malaria, hampered military efforts.  Physicians were working to find cures or treatments, but there was one small difficulty:  experiments on human subjects were needed to test the various drugs.  But it was only a small difficulty.  There were, after all, a host of prisoners on whom the treatments could be tested, and those prisoners were conveniently housed in a highly controlled environment that would aid the scientific studies. 

The above paragraph could easily describe the work of Dr. Klaus Schilling, who performed medical experiments on roughly 1000 prisoners at the Dachau concentration camp.  He was working to find a treatment for malaria that was infecting German soldiers fighting in North Africa.  After the war concluded, he was tried as part of the so-called “Dachau Trials” in 1946, convicted, and executed by hanging because of his experiments. 

But the paragraph could just as easily describe the work done at the Stateville Penitentiary in Illinois during the 1940’s.  Physicians from the Department of Medicine at the University of Chicago conducted experiments on more than 400 prisoners incarcerated at the prison.  They were testing antimalarial drugs to evaluate their use in the Pacific by American soldiers fighting the Japanese. 

Indeed, the similarity between the two courses of research was not lost on those defending themselves at the Nuremberg “Doctors’ Trial,” held before U.S. military courts.  Defense attorneys presented the argument that there was no meaningful difference between what the State of Illinois had done at Stateville and what the Nazis had done at Dachau, much the same way as I presented the scenarios above.  But the comparison is, in fact, an unfair one, and people instinctively recognize that there is something not quite right about it.  The Nazis conducted their experiments without consent, on people whose very imprisonment was unjustified; the experiments at Stateville were conducted on prisoner volunteers who had been convicted of crimes through due process of law. 

But the issue of performing medical research on prisoners is not a simple one.  When Columbia University suspended research at a brain-imaging lab last week because subjects had been injected with chemicals containing unsafe levels of impurities, concerns were raised because the subjects suffered from mental illnesses.  It was noted that the regulations governing federally funded research include special provisions for research conducted on women, children, and prisoners, but no such special provisions for research conducted on the mentally ill.  And one still hears opponents of vivisection express the view that research currently conducted on mammals would be more fairly conducted on prisoners instead. 

Those federal regulations (45 C.F.R., Part 46) have their origin in the very argument that was presented at the Doctors’ Trial at Nuremberg, which led to the Nuremberg Code.  The Nuremberg Code sets forth a set of ethical principles to be applied to human experimentation, embracing particularly the concepts of informed consent and the absence of coercion, but also recognizing the need for the research to be beneficial to society and conducted by trained scientists. 

The creation of the Nuremberg Code had an ironic effect in the United States.  Rather than prompting adherence to its directives, the higher moral position the United States had taken at Nuremberg created an environment of license at home in which the guidelines of the Nuremberg Code were often ignored.  In a huge postwar expansion of experimentation on prisoners in the United States, numerous gruesome studies were performed.  There was the famous Sing Sing “blood cleaning” experiment, in which the blood of an 8-year-old girl was circulated through the body of a volunteer prisoner for 24 hours in an effort to cleanse her of cancer.  There was the injection of live cancer cells into more than 100 inmates in the prisons of Ohio.  And Holmesburg Prison in Philadelphia became a haven for studying the effects of everything from detergents to chemical warfare agents. 

Prisoners were convenient subjects for human experimentation and they were, as summarized by Jessica Mitford in a 1973 article, “cheaper than chimpanzees.”  It would be only a short time afterwards that the essential provisions of the Nuremberg Code would finally be codified as part of United States law, changing significantly the role of prisoners in research.  But has it been enough?  The Institute of Medicine of the highly prestigious National Academies of the United States released a report in 2007 evaluating the ethics of that legislation.  Its answer to the question whether the legislation had achieved an appropriate balance between scientific knowledge and prisoner vulnerability was “an emphatic ‘no.’ ”  The report recommended a number of changes in the system of ethics that governs research conducted on prisoners, notably including the use of more modern ethical constructs.  In reaching its conclusions, the Institute weighed the realities of prison life, and the implicit coercion inherent in it, against its knowledge of medical ethics in a thoughtful and thorough way. 

At this time, those recommendations have not yet been implemented by Congress.  It may be true that prisoners are no longer cheap substitutes for chimpanzees.  But when those among us who have the greatest knowledge about medical ethics tell us that the protections that do exist are “emphatically” insufficient, it is time to pay attention and rethink them.

Even Hell Hath Its Peculiar Laws

The legend undoubtedly has origins that predate the short German chapbook published in 1587 that tells the story of Johann Fausten.  Dr. Fausten, a scholar at Wittenberg, has an unquenchable thirst for knowledge.  When he feels that he has learned all that he possibly can, he turns to magic and, seduced by the power of the black arts, summons Mephistopheles as the representative of Lucifer.  The two negotiate and hammer out an agreement, to be signed in blood, in which Dr. Fausten will sell his soul in exchange for twenty-four years of power and Mephistopheles as a servant to his every whim.  Dr. Fausten indeed enjoys a time of great power, but in the end it is seen as a pittance when measured against the loss of his soul to eternal damnation.

The lesson of the legend of Faust, which recurs time and again in the course of human events, is that there are bargains we make that have consequences far worse than we imagine when we are seduced by the power that we can temporarily achieve with the bargain.

Physicians, confronted with increasing resistance to antibiotics, are beginning to flirt with a seductive possibility.  The story of antibiotics is well known.  It was 1928, and Dr. Alexander Fleming was working as a bacteriologist at St. Mary’s Hospital in London.  One day, he returned to a plate culture of staphylococcus to find it contaminated with a blue-green mold.  Finding the contamination was probably a moment of frustration, but there was something interesting about it:  bacterial colonies next to the mold (Penicillium) were being dissolved.  Fleming grew the mold in pure culture, leading to the discovery of penicillin, a substance produced by the mold that had antibiotic properties.  Penicillin itself was not chemically isolated until World War II, earning Howard Florey and Ernst Chain — together with Fleming — the 1945 Nobel Prize in Physiology or Medicine.

The discovery of antibiotics revolutionized the treatment of infectious disease.  Physicians were given a miraculous weapon for battling bacterial infections, and through the last half of the twentieth century it became commonplace for those who fell ill to rely on the availability of a simple pill to cure many diseases.

But the promise of antibiotics was in some respects short-lived.  When bacteria were exposed to antibiotics, the mechanisms of natural selection operated on their genetic structure:  resistant mutants survived and multiplied into resistant strains.  Already in the 1950’s — little more than ten years after the isolation of penicillin — it was apparent that tuberculosis bacteria had undergone mutations making them resistant to streptomycin.  Over time, things have worsened.  Widespread use of antibiotics has produced so many resistant bacterial strains that many antibiotics are all but useless.  And so enters the possible Faustian bargain.
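The selection dynamics just described can be made concrete with a toy simulation.  Everything here — growth rates, kill rates, starting populations — is an invented illustration, not real microbiology, but it shows how even a tiny resistant minority comes to dominate under repeated antibiotic exposure:

```python
# Toy model of antibiotic selection. A population starts with a tiny
# resistant minority; each "generation" the antibiotic kills most
# susceptible cells but few resistant ones, and survivors regrow.
# All rates are illustrative assumptions.

def simulate(generations=20, kill_susceptible=0.9, kill_resistant=0.1,
             growth=2.0, susceptible=1e9, resistant=1e3):
    for _ in range(generations):
        # antibiotic exposure: heavy losses for susceptible cells only
        susceptible *= (1 - kill_susceptible)
        resistant *= (1 - kill_resistant)
        # survivors of both strains regrow at the same rate
        susceptible *= growth
        resistant *= growth
    return susceptible, resistant

s, r = simulate()
print(f"resistant fraction after 20 generations: {r / (s + r):.4f}")
```

Starting from a population that is 99.9999% susceptible, the resistant strain ends up as essentially the entire population — the antibiotic itself does the selecting.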

As more and more antibiotics were discovered, some were found to have high toxicity.  One example was chloramphenicol, which can produce aplastic anemia in some patients.  There is no known way to predict who will suffer this side effect, and it is generally fatal.  Chloramphenicol can also cause inner-ear damage that produces tinnitus and balance problems, and it may be linked to chronic lymphocytic leukemia.  But even so, many physicians are now looking to chloramphenicol as a viable antibiotic for treating bacterial strains that have become resistant to even the most powerful — and safer — antibiotics currently in use.  Many feel they have little choice.  Few new antibiotics are being developed, and these old ones have the advantage that they have been used so little that bacteria have not had a chance to develop resistance.

The legal issues related to the use of antibiotics center on regulatory approval processes.  This week, Lannett Co. in Philadelphia announced that it has formulated capsules of chloramphenicol with ingredients from a Spanish supplier, with the intention of seeking approval from the Food and Drug Administration for oral use of chloramphenicol “as a drug of last resort.”

Is this a compromise of the type that Faust made with Mephistopheles, a bargain for greater power now over the treatment of infectious disease that has consequences we do not fully appreciate?  Only time will tell, but what is certain is that after less than a century, physicians are already seeing the need to compromise as the wonderful promise of Alexander Fleming’s discovery appears to be reaching its limits.

Sore Fear Upon Men

For the last month, the attention of sports fans around the world has been focused on South Africa as the soccer teams of 32 nations competed for the World Cup.  And yesterday it concluded:  Spain is the victor, thanks to an extra-time goal by Andres Iniesta, and massive celebrations continue today not only in the large cities of Madrid and Barcelona, but also in the small hamlets and villages of rural Spain. 

It was an exciting game, and one that even the cosmos seemed intent on recognizing.  Fourteen minutes before the final game began, the umbra of a solar eclipse touched down on the Earth and lifted off just around the time the game concluded.  It was as though, for a few short hours, the universe itself took an interest in the affairs of men and settled in to enjoy the game. 

There is perhaps no astronomical event as portentous as a solar eclipse.  And yesterday’s was special not only because it coincided with the World Cup final, but also because of the remote path of the umbra.  Travelling over the south Pacific, the eclipse was visible from very few land masses, all of them remote.  Indeed, tiny and mystical Easter Island, famous for its haunting stone moai and described frequently as the most remote inhabited place on the planet, was a particular focus for the eclipse.  Thousands of people made their way there just to experience being within the eerie totality of a solar eclipse while standing among those ancient stone statues. 

The last time a total solar eclipse was visible from Easter Island was almost 14 centuries ago, probably before the island was even settled by Hotu Matu’a.  I am unable to say exactly when Polynesian explorers settled on Easter Island — radiocarbon dating of the Tahai complex suggests a date of 690 ± 130 AD, although it is possible Tahai was not constructed until a couple of hundred years after settlement of the island. 

But I can say with certainty when that previous solar eclipse occurred on Easter Island.  It was on September 24, 656 AD.  And those thousands of people who arrived on Easter Island took confidence in the fact that the eclipse would begin at 12:41 PM local time.  Such is the level of understanding that astronomers have about the movements of the Earth, Sun, and Moon.  We almost take for granted these days that we know the precise times the Sun or Moon will rise or set — for any day of any year and for any location on the planet.  But what an amazing feat it is to have such predictive capabilities. 
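Predictions of that kind rest on nothing more mysterious than geometry and careful bookkeeping.  As a vastly simpler cousin of eclipse prediction, here is a sketch that computes the length of daylight for any latitude and day of year, using a common textbook approximation for the Sun's declination (good to a fraction of a degree — nowhere near eclipse-grade precision, but enough to show how position follows from geometry):

```python
import math

def day_length_hours(latitude_deg, day_of_year):
    """Approximate hours of daylight at a given latitude and day of year,
    using a standard approximation for solar declination."""
    # approximate solar declination, in degrees
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # hour angle of sunrise/sunset from the sunrise equation
    x = -math.tan(math.radians(latitude_deg)) * math.tan(math.radians(decl))
    x = max(-1.0, min(1.0, x))  # clamp to handle polar day and polar night
    omega = math.degrees(math.acos(x))
    return 2.0 * omega / 15.0  # the Earth rotates 15 degrees per hour

print(day_length_hours(40.0, 172))  # near the June solstice at 40 N: ~15 hours
```

At the equator the formula gives twelve hours year-round, and above the Arctic Circle in June it saturates at twenty-four — exactly the behavior one expects, falling out of a few lines of trigonometry.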

The precision with which astronomers can provide information about such positions makes for courtroom evidence of indisputable reliability.  Such information is exact, and it has been used in numerous cases to sway a factfinder to a conclusion about a crime. 

It was famously used by Abraham Lincoln when he was an attorney representing Duff Armstrong in a murder case.  Although Lincoln was mostly a civil attorney, he did take a small number of criminal cases, and had taken this one — pro bono — after being implored by Duff’s mother, the recently widowed wife of his old friend Jack. 

Duff and a second man named Norris had been accused of killing James Metzker during a drunken brawl sometime in August, 1857.  The prosecution’s case seemed solid and unbeatable.  There was, after all, an eyewitness to the event, a man by the name of Charles Allen.  One can easily imagine the tension in the courtroom as the man who would later issue the Emancipation Proclamation, helping to end slavery, stood and cross-examined the eyewitness. 

Lincoln proceeded with a series of questions designed to elicit testimony about how clearly Allen could see the altercation.  How bright was the moon?  Where was the moon?  Are you absolutely certain there was enough moonlight that you could see clearly?  These are perhaps the sorts of questions he asked.  When Lincoln then introduced an almanac showing that the moon could not possibly have been overhead as Allen testified, a “roar of laughter” arose among the spectators and some of the jurors.  The key witness in the trial had been discredited, and after deliberations that lasted less than an hour, the jury unanimously voted to acquit Duff.

Today, astronomers are still called as expert witnesses from time to time.  There are companies formed by astronomers that will calculate the position of the sun or moon at a particular location on a particular day for use as evidence.  Could the driver really have had the sun in his eyes when he was driving west down Elm Street at 4:38 PM on October 12?  In a case in the late 1970’s, an astronomer analyzed a photograph of a woman and her dog to establish from the dog’s shadow that it could not have been taken when she claimed, resulting in her conviction for perjury.  In any matter where the state of natural lighting is relevant, astronomers are able to provide absolute and inarguable evidence of the positions of these celestial bodies. 

Recently, such “forensic astronomers” have turned their attention to artistic endeavors, looking at paintings, photographs, and poetry in which celestial objects appear.  Ansel Adams’s photograph titled Autumn Moon, we now know, was taken at 4:14 PM on December 28, 1960.  Van Gogh’s Moonrise must have been painted on July 13, 1889.  And just last month, astronomer Don Olson settled a debate about what celestial object Walt Whitman was referring to in a poem that appears in his collection Leaves of Grass — it was not the 1833 Leonid meteor shower as many supposed, and it was not an 1859 fireball that others had thought, but was instead the Great Comet of 1860. 

I am personally disappointed that Spain won the World Cup since I was rooting for The Netherlands, but I do hope the cosmos enjoyed its visit and the game.

The Missing Link

Eoanthropus dawsoni.

This is one of the most famous hominids ever discovered. But certainly for the wrong reasons.

It was 1912. Very few hominid fossils had so far been found. Neanderthal Man in 1856; Cro Magnon Man in 1869; Java Man in 1890; Peking Man in 1903; Heidelberg Man in 1908. Each of these discoveries had added a small piece to the great puzzle of modern man’s origins, but there was still no clear species that represented a clear link in the evolution between ape and man.

In a way, that changed with the 1912 discovery of the fossil remains of a hominid — identified as Eoanthropus dawsoni after the man who discovered them — that clearly showed a mix of features between ape and man. Found in quarries in Sussex, England, the skull was similar to that of modern man, but would hold only a brain about two-thirds the size of a modern brain. And the jawbone was decidedly more apelike. The combination supported the theory that the evolution from ape to man began with the brain, so that the skull would evolve into a form closer to its modern one before the jaw did.

There were skeptics, mostly among French and American paleontologists. But there were also those who saw the discovery as a critically important one, particularly British paleontologists in a fit of nationalist pride. But as time went on and further hominid fossils were discovered, it became increasingly difficult to fit Eoanthropus dawsoni into the developing framework of human evolution. Experts puzzled over it at times, recognizing it as anomalous.

It would be forty years before the fossils — found in the quarries of Piltdown — were exposed as a fraud. It was an elaborate hoax perpetrated by someone whose identity remains a mystery today, even though there are various theories about who it might have been. Certainly the fossils had been carefully prepared to make them look far older than they actually were, and it was with the use of much more modern dating techniques that the fraud was ultimately exposed. The preparations clearly required knowledge of the techniques that paleontologists would use in analyzing the fossils, making the hoax seem elaborate and almost sinister.

The four-decade episode of Piltdown Man is an instructive one, exposing the limits that may exist with scientific analyses and the ability of some to exploit those limits to mislead.

The more recent episode of “Climategate” is an attempt to suggest that a similar hoax is being perpetrated by some climate scientists today. The episode began in November of last year, when a variety of emails and documents were hacked from the University of East Anglia’s Climatic Research Unit computers. Through a rather selective citation of isolated phrases, taken out of context from stolen documents, a scandal was orchestrated, accusing the scientists who were quoted of colluding in a campaign to withhold scientific information, to manipulate data, and to interfere with the peer-review process in order to perpetuate a hoax of increasing global temperatures.

One particular focus of the allegations has been language in an internal email related to the famous “hockey stick” graph that shows sharply increasing global temperatures in modern times. It is a private email written between two scientists and needs to be understood in that context. It does indeed refer to a “trick” used to “hide the decline.” But scientists use the word “trick” to describe something clever or insightful for dealing with a difficult issue — not something deceptive. And the decline referred to is well known in dendroclimatology, in which the properties of annual growth rings of trees are used to infer temperature changes.

The fact is that since about 1960, tree-ring data have tended to suggest a decline in global temperatures at a time when direct instrumental measurements show that temperatures have clearly increased. Before 1960, tree-ring proxy measurements are consistent with other proxies for temperature change at least back to about 1600 AD. Something is obviously amiss with tree-ring proxy measurements after 1960, although the reason for the divergence is not understood. What the Climategate scientists were referring to in their exchange was a known way of dealing with this inconsistency. It is important to recognize that they were not “fooling” anyone — the anomaly in tree-ring data is well known among climate experts, as is the statistical “trick” legitimately used to reconcile the different types of data.
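What such a reconciliation amounts to can be sketched with invented numbers — none of the values below come from the actual CRU series; they only illustrate the shape of the operation. The proxy record is trusted up to the point of divergence, and the instrumental record is used thereafter:

```python
# Illustrative splice of two temperature-anomaly series (synthetic data).
# The proxy (tree-ring) values diverge after 1960; the reconstruction
# keeps the proxy before the cutoff and the instrumental record after.

proxy = {1940: 0.00, 1950: 0.05, 1960: 0.10, 1970: 0.02, 1980: -0.05}
instrumental = {1950: 0.05, 1960: 0.10, 1970: 0.20, 1980: 0.35}

def reconstruct(proxy, instrumental, cutoff=1960):
    combined = {}
    for year, value in proxy.items():
        if year <= cutoff:          # proxy trusted up to the divergence
            combined[year] = value
    for year, value in instrumental.items():
        if year > cutoff:           # instrumental record used afterwards
            combined[year] = value
    return combined

series = reconstruct(proxy, instrumental)
print(sorted(series.items()))
```

Nothing is hidden from anyone who knows the data: the divergent proxy values are set aside openly, for a period in which direct thermometer readings exist and are more reliable.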

Yesterday, a British panel exonerated the scientists involved in Climategate, even as it criticized them for some reluctance to release computer files supporting their work. “[W]e find that their rigour and honesty as scientists are not in doubt,” the review states. This is the third review to clear the scientists of allegations of fraud and the vice-chancellor of the university has now expressed his hope that this will “finally lay to rest the conspiracy theories, untruths and misunderstandings that have circulated.”

The history of Piltdown Man reminds us that we need to be on guard against scientific misconduct and fraud. But we also need to be on guard against unwarranted allegations of such misconduct when there is no credible evidence to support them.

We’re All Going to Die

We knew the world would not be the same. A few people laughed. A few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the prince that he should do his duty and, to impress him, takes on his multiarmed form and says “Now I am become Death, the Destroyer of Worlds.” I suppose we all thought that, one way or another.

These were the words that J. Robert Oppenheimer used to describe the reactions of those who were present at the Trinity test site in New Mexico on the morning of July 16, 1945 when the first nuclear-fission bomb was detonated. One of the issues that had been considered when that bomb was being developed was whether there was any potential for catastrophic results, far beyond the level of devastation that was in fact witnessed when similar fission bombs were deployed a month later in Japan. Edward Teller, who would go on to promote the development of a fusion bomb, had raised the possibility that the temperatures produced by the explosion of a fission bomb would be sufficient to ignite the atmosphere, and thereby result in global catastrophe. The scientists involved in the Manhattan Project to develop the bomb concluded that such an occurrence was, in fact, impossible, and proceeded with the development of a device that still in many ways haunts the political environment of the planet.

More recently, a similar kind of concern has been raised with a different scientific project and in a different context. The Large Hadron Collider (LHC), recently built by the European Organization for Nuclear Research (CERN) at a cost of some US$9 billion, is the most powerful particle accelerator ever constructed. Expectations are high for its potential to shed light on some very fundamental questions in physics about the nature of matter, questions of much interest in their own right. In particular, scientists hope that the LHC may eventually resolve one of the most important of them — how mass is generated through the Higgs mechanism — by producing and detecting the so-called Higgs boson. Experimental identification of the Higgs boson would go a long way toward confirming the Standard Model of elementary particle physics.

But despite its scientific importance — or perhaps because of that importance — there have been a number of legal challenges to operation of the LHC, based mostly on the suggestion that the energies involved might produce small black holes that could swallow up the Earth. It is the stuff of science-fiction thrillers, and while those at CERN are hopeful that the accelerator could produce mini black holes, the danger has been dismissed by particle physicists because any such black holes would evaporate within a tiny fraction of a second.
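The physicists' reasoning can be checked on the back of an envelope. In the semiclassical picture, a black hole of mass M evaporates by Hawking radiation in a time t ≈ 5120πG²M³/(ħc⁴). Extrapolating that formula down to collider energy scales takes it well outside its strict domain of validity, but it conveys the magnitudes involved; the mass used below is an assumption, roughly the total energy of a 7 TeV collision:

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant
HBAR = 1.055e-34   # reduced Planck constant
C = 2.998e8        # speed of light

def hawking_lifetime(mass_kg):
    """Semiclassical Hawking evaporation time, in seconds."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# Assumed mass: ~7 TeV of collision energy expressed in kilograms
# (1 TeV/c^2 is about 1.783e-24 kg)
m = 7 * 1.783e-24
lifetime = hawking_lifetime(m)
print(f"evaporation time ~ {lifetime:.1e} s")
```

The result is an almost inconceivably small number — scores of orders of magnitude below a second — which is the quantitative content behind the physicists' dismissal of the doomsday scenario.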

The fact is that the various lawsuits have uniformly failed and on March 30, 2010, the LHC achieved record energy levels in colliding protons together without a hint of catastrophic results. This week, on June 28, 2010, CERN announced that the LHC had doubled the previous record for particle-beam collisions — it was previously held by the Tevatron at Fermilab in Illinois. The LHC is still running at only half the energy it was designed for, but it is hoped it will run at its full energy by sometime in 2013.

What does this say about the intersection of law and science? Scientists complain all the time that lawyers and judges lack the technical expertise to make decisions about which view of the science is correct. Indeed, they feel that far too often, legal decisions are made on the basis of an incorrect, biased, and alarmist view of the science. At the same time, the public is often distrustful of scientists because it believes that scientists get too caught up in the intellectual interest of a project, diminishing legitimate public concerns because they dislike interference in what they are doing. Robert Oppenheimer himself acknowledged this fervor in talking about the Manhattan Project: “When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb.”

So what are judges who are confronted with such issues to do? To handcuff scientists in response to ignorant and irrational fears is clearly too drastic, especially given the historic benefits that science has brought to Mankind. But it is equally too drastic to give scientists an unfettered license to investigate whatever they wish in the name of advancing knowledge when the risks are real and legitimate.

The best answer at the moment is to adhere to the centuries-old principles that have developed in deciding cases. Require that the challengers demonstrate the legitimacy of their concerns and apply balancing tests that evaluate the real level of risk — knowing that what we are talking about is risk and not certainty — against the potential benefit. This is precisely what the judges in the various lawsuits against the LHC have done, finding that the risk identified by the challengers is remote enough and the benefits provided by the LHC are great enough that it would be a mistake to shut it down.