To the End of Reckoning

It is not a shining moment for science or for law. 

Indeed, the story reads more like a gripping thriller — one that the reader finally sets down at the side of the bed at 3 AM, thinking that it was an exciting enough tale to forgo sleep even if it was totally unbelievable.  Imagine the plot summary.  A physician, receiving handsome payments from an attorney who wants to bring a class-action suit against a vaccine manufacturer, invents a fictional disorder and fabricates data to establish a fraudulent link with the vaccine.  The stuff of Ian Fleming or John le Carré. 

In 1998, Andrew Wakefield and twelve coauthors published a paper in The Lancet, a prestigious medical journal, implying a link between the measles, mumps, and rubella (“MMR”) vaccine and a new syndrome of autism and bowel disease.  Many scientists were skeptical, pointing to any number of scientific weaknesses in what was reported.  But large segments of the public relied on the paper when its findings were reported in the general press.  Vaccination rates plummeted as parents determined that the risk of autism to their children was too great.  The inoculation rate dropped most dramatically in Britain, from about 92% to below 80%, but many children in other parts of the world also failed to receive the MMR vaccine as a direct consequence of the Wakefield paper.  Measles, which had been declared “under control” in Britain in the mid-1990s, was declared “endemic” there only ten years after the paper was published.  The U.S. Centers for Disease Control notes that more cases of measles were reported in the United States in 2008 than in any year since before publication, with more than 90% of those infected having not been vaccinated (or having an unknown vaccination status). 

A series of articles exposing the fraud surrounding the paper began appearing in the British Medical Journal last week; their author, Brian Deer, likens the affair to the Piltdown Man scandal, an elaborate hoax that disrupted the natural course of paleontology research and took decades to uncover.  I briefly discussed Piltdown Man some months ago here.  Deer describes how Wakefield was retained by attorney Richard Barr two years before the paper was published and was paid more than £435,000, six times his annual salary as a physician.  The children in the “study” he conducted on MMR were targeted and preselected to have the desired symptoms; he reinterpreted clinical records to suit his contrived syndrome and “chiseled” histories to reach unsupported clinical diagnoses.  While admonishing many who failed to exercise sufficient diligence — coauthors, fellow scientists, hospital managers, journal editors — Deer reports that the evidence shows it was Wakefield alone who perpetrated the scandal, and that not even the attorney who paid him knew what he was doing. 

When Deer first reported on irregularities surrounding the study in 2004, ten of Wakefield’s coauthors retracted the interpretation.  In January 2010, a five-member tribunal of the British General Medical Council found dozens of allegations of misconduct proved, including four counts of dishonesty and twelve involving the abuse of developmentally challenged children.  A copy of the results of the Fitness to Practise Hearing, which was the longest ever conducted by the Council, can be found here.  

The consequences of the misconduct are truly staggering.  Epidemiological studies conducted at great public expense were unable to confirm any link between autism and the MMR vaccine.  Research funds and personnel were diverted from more legitimate avenues to understand the actual causes of autism and to help those affected by it.  Many children who might otherwise have been vaccinated suffered an illness that we have the technology to prevent, and some may be among the small number who died as a result of having contracted measles.  At the moment, the effect on mumps contraction remains unclear because its peak prevalence is in older adolescents. 

In its editorial, the British Medical Journal pulls no punches.  It asks whether it is possible that Wakefield was wrong but not dishonest.  After all, scientists are allowed to be wrong.  Indeed, it is a strength of the scientific method that all honest ideas should be considered so that they can be scrutinized and dismissed if they are incorrect. 

The answer it gives is a simple one:  “No.” 

Not a single one of the case studies was reported accurately.  The pattern of the misreporting shows an intent to mislead.  Actions are already being taken to examine Wakefield’s other publications, with an awareness born of experience that misconduct is rarely an isolated event. 

When I write about various issues on this blog, one of the things that I am reminded of repeatedly is how similar the ethical requirements are that attorneys and scientists are expected to adhere to.  Those ethical requirements exist because of the importance of the work that they do, and the trust that the public necessarily places in them. 

Attorneys suffer a great deal from distrust by the public, and I have always thought that that is one of the prices to be paid for the system.  Attorneys deal with controversies between parties that have opposing interests and operate in a system that requires that the best arguments be put forward on behalf of their clients, most especially for those clients who seem unlikeable and potentially subject to victimization if the state or other opposing party is not held to the strictest standards of proof. 

Scientists have generally enjoyed a more favorable reputation with the public, but misconduct of this scale has repercussions that extend far beyond even vaccines and autism.  Such acts of misconduct erode public confidence in the legitimate conclusions of science, putting members of the public in a circumstance where they do not know who or what to believe.  They are inconsistent with the openness that is the very bedrock of science, which seeks to communicate not only what we do know but also what we don’t.  Because if people are to make decisions in their lives based on the results that science achieves, they deserve to know both.

By Any Other Name, It Would Taste as Sweet

It’s almost amusing.  Actually, it is amusing.  If Larry Page and Sergey Brin had not decided in 1997 to rename their now-famous search engine at Stanford University, it is entirely possible that much of our dialogue about the Internet would be filled with strange innuendo.  Imagine a friend is curious about some topic or other.  But for that change in name, you might be telling him, “Oh, I don’t know.  Just backrub it.” 

Which name is the better one, Google or Backrub?  “Backrub” was descriptive in its way.  The innovation of the search engine that we now call Google was that it ranked the importance of search results according to the number of backlinks that a web page had.  It was a different approach from the one most other search engines used at the time. 
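For readers curious about the mechanics, here is a minimal sketch in Python of the idea as simplified above: rank pages by how many other pages link to them.  The page names and link graph are entirely hypothetical, and the real algorithm behind Google (PageRank) goes further, weighting each link by the importance of the page it comes from rather than merely counting links.

    from collections import Counter

    # A hypothetical link graph: each page maps to the pages it links to.
    links = {
        "page_a": ["page_b", "page_c"],
        "page_b": ["page_c"],
        "page_c": ["page_a"],
        "page_d": ["page_a", "page_c"],
    }

    # Count backlinks: how many pages link to each page.
    backlinks = Counter(target for targets in links.values() for target in targets)

    # Rank "search results" by backlink count, most-linked first.
    ranked = sorted(links, key=lambda page: backlinks[page], reverse=True)
    print(ranked)  # ['page_c', 'page_a', 'page_b', 'page_d']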

We are probably no worse off today because of the name change.  Indeed, we are probably better off, with Google being considered by some to be one of the best company names.  It takes only a quick backrub of the Internet — perhaps that does not sound so awful after all — to come up with many companies that have changed their names to avoid negative connotations.  Yahoo! is surely much better than the far-too-parochial Jerry’s Guide to the World Wide Web.  And people are still not entirely sure whether KFC adopted that name in 1991 to avoid connections with “Kentucky” (the state by that name trademarked its name and introduced substantial licensing fees), “Fried” (the company did not want its food products to seem totally unhealthy), or “Chicken” (government regulators were pressuring the company about its livestock practices). 

But when does renaming something cross the line into becoming deceptive?  In the last several years, high fructose corn syrup has been increasingly disparaged.  It is found in many processed foods, most notably in soft drinks but also in soups, lunch meats, breads, cereals, condiments, and many others.  It has been blamed for the high levels of obesity in the United States and cited as a contributor to a number of other health issues.

The name “high fructose corn syrup” is essentially an accurate description.  Derived from corn, this sweetener comes in a number of varieties, all of which contain a mixture of fructose and glucose.  The most widely used variety, HFCS 55, has about 55% fructose and 42% glucose, as compared with roughly equal amounts of fructose and glucose in sucrose (table sugar).  Critics point to the different metabolic pathways of fructose and glucose in the body, namely that fructose consumption does not trigger the body’s production of leptin, a hormone important in signaling the brain to suppress hunger.  It is fair to say, though, that there is a lack of consensus on the full impact of consuming sucrose versus high fructose corn syrup. 

The Corn Refiners Association takes the view that both high fructose corn syrup and sucrose have “similar” glucose-to-fructose ratios, and that those ratios are similar to the ones found in natural fruits and vegetables.  Responding to advice from many quarters to reduce consumption of high fructose corn syrup, the Association petitioned the Food and Drug Administration (“FDA”) in September to change the name.  “Corn sugar” is the name it prefers, one it believes remains free of an unjustified stigma.  Its views can be found here.

There can be no doubt that in petitioning for a name change the corn industry is responding to criticisms about consumption levels of high fructose corn syrup.  What is really at issue is whether the new name is intended to give the public accurate information or to deceive it. 

One name change that the Association points to in making its case to the FDA is the FDA’s approval in 2000 to rename prunes as “dried plums.”  That action was initiated by the California Prune Board, which asserted that the public associated negative imagery with the name “prunes.”  Indeed, after approval by the FDA, the Board itself was renamed and is now the California Dried Plum Board, still active in promoting the value of prunes, no matter what they call them. 

The comparison that the corn industry wishes to make strikes me as a weak one, though.  While the word “prune” does sound decidedly less palatable than “dried plum,” allegations of unhealthiness of the product were not part of the motivation for seeking a name change.  And the purchase of prunes is made in a very different way than the purchase of high fructose corn syrup.  Someone purchasing prunes knows exactly what they are buying, but no one really goes to the grocery store to pick up some high fructose corn syrup — it’s just there, so ubiquitously present that those who wish to avoid consuming it need to expend some fair effort in finding products that do not contain it. 

Consumers deserve to know what they are buying and deserve to have useful information so they can decide for themselves what they wish to consume.  This is true even if the scientific research about the health effects of products is unclear.  It is even true if people are being irrational about what they choose to consume.  It is, after all, their bodies and their responsibility to inform themselves about the health effects of what they eat.  The precise contours of the corn industry’s motivation in seeking a name change thus need to be an essential part of the FDA’s deliberations. 

It will be some time before the FDA reaches a decision.  In the meantime, documents filed in connection with the petition, including comments from the public, can be found here.  It is worth emphasizing the importance of these public comments, which very often have a real impact on the decisions reached by government agencies.

Have a Couple Beers and Call Me in the Morning

Catherine O’Leary’s cow.  Gavrilo Princip.  The hayfork of a drunk hired hand. 

Sometimes things have consequences all out of proportion to what we expect.  In my list, Catherine O’Leary’s cow is reputed to have kicked over a kerosene lantern in 1871, thereby causing the Great Chicago Fire, which killed some 300 people and destroyed about a third of the city’s value.  Gavrilo Princip was the 19-year-old who assassinated Archduke Franz Ferdinand of Austria, setting off the series of events that resulted in the Great War of 1914-1918. 

And the poor hired hand?  The story goes that one day when Wayne Wheeler was a boy working on his family’s farm, his leg was poked by the hayfork of a drunken worker.  The incident traumatized him, and he spent much of his life devoted to the abolition of alcohol, becoming the de facto leader of the Anti-Saloon League.  His efforts, organizing churches and other small temperance groups to implement a particularly effective form of pressure politics, were instrumental in bringing about passage of the Eighteenth Amendment to the U.S. Constitution prohibiting “the manufacture, sale, or transportation of intoxicating liquors” within the U.S.  He claimed to have substantially written the National Prohibition Act (a claim that the Act’s official author, Andrew Volstead, repeatedly denied), which established executive power to enforce the Eighteenth Amendment.  The full text of the Volstead Act can be found here.  

During the 13 years that prohibition was in effect in the United States, alcohol consumption dropped by an estimated 30 percent as legal avenues for access disappeared and costs through illegal avenues rose higher than many people could afford.  It is generally accepted that the consequences of this experiment were mostly a flagrant contempt for the law and the rampant creation of illegal alcohol-distribution mechanisms. 

What is sometimes forgotten is the role that physicians played in maintaining access to alcohol.  They provided one of the few legal ways to obtain it, since the Volstead Act carved out an exception allowing people to “use liquor for medicinal purposes when prescribed by a physician.”  The alcohol prescription pads of the time look almost quaint today, with alcohol consumption as much a part of modern society as it was in the time before prohibition.  The idea of drugstores maintaining shelves of government-produced whiskey to dispense to those with prescriptions strikes our current mindsets as almost amusing.  In hindsight, trying to suppress a product so pervasively a part of the prevailing culture seems to have been an obvious exercise in futility. 

The experiment in marijuana prohibition may similarly be coming to an end, although that experiment has lasted far longer than the prohibition of alcohol.  In the early 1900s, marijuana was almost unknown in the United States, but it started to become more popular as Mexican immigrants arrived in the U.S., prompting efforts to ban it.  In these efforts, California has always been key, leading the way as other states progressively followed. 

California was the first state to ban marijuana in 1913, with many other states following in due course thereafter.  California was also the first state to soften its stance on marijuana consumption, allowing medical uses in 1996.  Again, many other states followed in due course.  And in next week’s election, California may again lead the way in fully relegalizing marijuana with its Proposition 19.  A copy of the text of the proposition may be found here.  It is worth noting that efforts to fully legalize marijuana in Alaska have failed, even though a 1975 case found personal use of marijuana at home protected by an unusually strong privacy provision in the Alaska Constitution.

In many ways, the current availability of marijuana for medical purposes in a number of states mirrors what occurred during the era of alcohol prohibition, although marijuana is probably recognized for more legitimate medical purposes than alcohol ever was.  During the period of alcohol prohibition, the availability of medical prescriptions made a mockery of prohibition efforts.  To be sure, there were many who believed in the therapeutic value of alcohol, and there are genuine medical treatments to be had with it, but most prescriptions were filled simply so that people could experience its intoxicating effects.  Today, many believe that medical prescriptions for marijuana make a similar mockery of efforts to prohibit its use more generally.  Again, there are legitimate therapeutic uses for marijuana, but an honest assessment of medical-marijuana laws is that they have been used as a wedge to gain legal access to its psychoactive properties. 

Even if California relegalizes marijuana in next week’s vote, there is still the matter of federal drug-regulation statutes, which prohibit the use of marijuana throughout the United States.  In addition to the banning of marijuana use by several states, the federal government began to regulate its use nationally with passage of the Marijuana Tax Act in 1937.  A copy of the Act’s text may be found here.   What is perhaps interesting about the Marijuana Tax Act is that it included a specific exception for medical uses prescribed by physicians.

The current Controlled Substances Act, which was passed in 1970, shortly before President Nixon declared his “war on drugs,” and which supplanted the Marijuana Tax Act, includes no such exception.  A conflict thus exists between federal and state law in the several states where medical uses of marijuana are permitted under state law.  The conflict was tested after California approved such uses in 1996, in the case of Gonzales v. Raich.  A copy of the full opinion can be read here.  While the case confirmed the authority of the federal government to ban the use of marijuana even in states that have approved it for medical purposes, the policy of the Obama administration has been not to enforce the federal ban against medical users.  

The ultimate impact of California’s decision next week will be interesting and may well take years to understand fully.  But one thing is certain:  if the proposition passes, similar proposals will be tried in other states, and additional pressure will be brought to bear on the federal government to follow the lesson of prohibition to its ultimate conclusion.

Poisoned Needles

One of the most impressive stories of global human cooperation began in 1796 with a young dairymaid named Sarah Nelms.  The stories had long seemed apocryphal, mere legends whose truth was suspect:  dairymaids did not get smallpox. 

Smallpox, of course, was one of the great scourges mankind has faced.  Evidence of the disease has been found in Egyptian mummies of people who died at least three thousand years ago.  Human history is rife with descriptions of smallpox decimating local populations when it was introduced to areas where it was previously unknown.  Smallpox epidemics in North America after introduction of the disease by settlers at Plymouth in 1633 are estimated to have had 80% fatality rates among the native population.  Similar results occurred later among the Aboriginal population of Australia.  Numerous isolated island settlements in both the Pacific and Atlantic had almost all of their native populations wiped out by the disease. 

The fact that dairymaids seemed to have a peculiar immunity to smallpox was remarkable, a consequence of the fact that they had frequently suffered from cowpox, a much milder disease.  It was Edward Jenner who recognized that deliberate infection with cowpox could serve as a way to protect against smallpox.  On May 14, 1796, he obtained matter from fresh cowpox lesions on the hands and arms of Sarah Nelms and used it to infect James Phipps, an eight-year-old boy.  James developed fever, loss of appetite, and discomfort in his armpits — symptoms of cowpox — and recovered about a week and a half after being infected.  A couple of months later, Jenner infected him again, but this time with matter obtained from smallpox lesions, and no disease developed. 

It would take Jenner some years to persuade scientific colleagues that “vaccination” in this way — the word being derived from the Latin “vacca” for cow and “vaccinia” for cowpox — could prevent the spread of smallpox.  He was ultimately vindicated.  The end result of his campaign came on December 9, 1979, with a certification, endorsed by the World Health Assembly on May 8, 1980, that smallpox had been eradicated from the planet except for some small stores maintained for research purposes. 

On October 12, the Supreme Court of the United States will hear oral arguments in what is likely to be a pivotal vaccination case.  At issue is a portion of the National Childhood Vaccine Injury Act, originally passed in 1986 (42 U.S.C. §§ 300aa-1 – 300aa-34).  Aside from its independent relevance, the Vaccine Act is interesting because it can be seen as a case study for current efforts to introduce broader tort reform, prompted by the concern, popular in some circles, that compensation awarded in tort cases is excessive and that litigation is generally an ineffective mechanism for providing it. 

Vaccine cases have traditionally been handled under state law, particularly conventional products-liability law.  The Vaccine Act was intended to provide a substitute mechanism for compensating those who may have been injured by vaccines they received.  At the time, vaccine manufacturers protested actively, complaining that the costs associated with the threat of lawsuits were too high and threatening to cease production because the economic risk was too great.  At the same time, those who were injured by vaccines were generally dissatisfied — litigating claims and negotiating settlements took a long time, and the entire process was costly, still sometimes resulting in no compensation at all.  These are the same arguments now made by advocates of broader tort reform. 

Congress created an administrative program that focuses on compensation to the victims of vaccine-related harms.  If a victim can demonstrate receiving a vaccination of a particular type listed in a special Vaccine Injury Table, he or she is awarded compensation, without regard to either fault or causation.  This takes place in the Office of Special Masters of the U.S. Court of Federal Claims, commonly called the “Vaccine Court.”  If the compensation is unsatisfactory or if no award is made, the possibility still exists to bring a tort claim.  The program thus attempts to strike a balance, preserving the ability to bring claims in the traditional manner while offering an alternative under which claims are paid faster and more predictably.  How much the program is used therefore offers an interesting perspective on the success of such a tort alternative. 

In exchange for the essentially automatic awarding of claims, the Vaccine Act preempts tort claims arising from “unavoidable” injuries: 

No vaccine manufacturer shall be liable in a civil action for damages arising from a vaccine-related injury or death associated with the administration of a vaccine … if the injury or death resulted from side effects that were unavoidable …

It is this provision that is at issue in the case being heard by the Supreme Court.

The fact that an injury is “unavoidable” from use of a product does not normally insulate the manufacturer of the product from having to pay damages when it injures someone.  Indeed, “design defects” of products frequently give rise to damages — it does not matter that the product was flawlessly produced in precise accordance with specifications and with extreme care; if it has a faulty design that injures people, there is still liability for the injuries. 

In 2008, the Georgia Supreme Court held that the Vaccine Act does not preempt all design-defect claims — only those where the injurious side effects were unpreventable.  That decision can be read here.  In 2009, a federal appeals court ruled that all design-defect claims are preempted, i.e., even if it were possible to prepare a safer form of the vaccine, there is still no permissible state tort claim arising from use of the more harmful design.  That opinion can be read here.

While the issue the Supreme Court will consider appears at some level to be a narrow one, in that it is a matter of resolving a disagreement over statutory interpretation, it has broader importance.  In the quarter century since the Act was passed, about two-thirds of those applying for federal injury compensation have been turned away empty-handed.  The program was sold as one in which the adversarial nature of litigation was to be avoided, but critics suggest that proceedings before the Vaccine Court have turned out to be nearly as time-consuming, expensive, and contentious as traditional litigation.  Many thus consider this experiment in tort reform to have been a failure, and a decision on one of the Vaccine Act’s more important provisions by the nation’s highest court will help determine whether such a view is warranted.

WARNING: THIS BLOG MAY BE ADDICTIVE

Well, I have to confess that there is a part of me that hopes so! But my title today is, at least to me, very obviously a spoof.

People need to be careful about spoofs though. Sometimes they take on a life of their own.

Consider Ivan Goldberg, a respected physician who specializes in treating individuals with mood disorders. He coined the term “Internet Addiction Disorder” in 1995. It was intended to be satirical, and he patterned his list of diagnostic criteria for the disorder after the entry for pathological gambling in the Diagnostic and Statistical Manual of Mental Disorders (“DSM”) — the authoritative work on psychiatric disorders published by the American Psychiatric Association. A copy of Goldberg’s parody can be found here.

Perhaps it was the technical way it was written, or perhaps it was because it struck a nerve among those who sometimes feel widowed as a result of significant Internet use by loved ones, but the parody caught on. Some thought it was real — even a few serious and respected publications were fooled — and many people began to accept that Internet Addiction Disorder was legitimately recognized.

In the fifteen years since Goldberg’s spoof, there has been increasing research on the effects of a variety of modern technologies on the human brain. One of the more interesting examples is the Starcraft study. Starcraft is a military science-fiction strategy game that has received enthusiastic accolades in the video-game industry, where it is widely praised as one of the best video games ever made. Its multiplayer version has gained particular popularity in South Korea.

Because it is so good, a lot of people want to play it. And some of them spend a lot of time playing it. Of the eleven participants in the study, six had dropped out of school for two months because of the amount of time they were playing Starcraft, and two were divorced by their spouses as a direct result of the time they put into the game. Psychiatrists at Chung Ang University in South Korea tested a treatment on this group, prescribing the antidepressant bupropion for six weeks, which reduced their playing time by about a third. MRI studies of their brains by the Brain Institute at the University of Utah showed increased activity in three brain areas when the participants were shown images from the game, activity that was not present in a control group.

Many people remain dismissive of the idea that there can be such a disorder as Internet Addiction, and suggest that the treatment with an antidepressant works because those individuals were depressed — under that view, the excessive time playing Starcraft is a symptom of depression, not of a new type of disorder. They perhaps spend so much time playing in an effort to escape the sadness that their depression causes. Known psychiatric disorders of obsession and compulsion are other likely candidates for disorders that are sometimes manifested by excessive Internet usage and game playing.

The game Lineage II is another multiplayer online game, also extremely popular in South Korea. Instead of the science-fiction theme of Starcraft, though, Lineage II has a fantasy theme. The makers of the game, NCsoft Corporation and NC Interactive, Inc., were sued about a year ago by Craig Smallwood, a 51-year-old resident of Hawaii who claims that he became addicted to the game. His complaint asserts that over the period of 2004-2009, he played the game some 20,000 hours — that comes out to an average of 11 hours a day, every day of the year, for five years. When his access to the game was cut off, he claims to have “suffered extreme and serious emotional distress and depression,… been unable to function independently,… suffered psychological trauma,… was hospitalized, and … requires treatment and therapy three times a week.”
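That figure is easy enough to check; using the five-year framing above, the back-of-the-envelope arithmetic runs as follows:

    # 20,000 hours of play spread over five years, per the figures in the complaint.
    total_hours = 20_000
    days = 5 * 365
    print(round(total_hours / days, 1))  # prints 11.0, i.e., about 11 hours per day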

In a decision in August of this year, the judge in the case, Alan C. Kay, dismissed some of the causes of action but declined to dismiss them all. The causes of action that remain, and which Smallwood presumably will now attempt to prove, include defamation, negligence, gross negligence, and negligent infliction of emotional distress. The full opinion (which considers a number of procedural issues at some length) can be read here.

Although some commentators have ridiculed the decision, finding that it stretches credulity to allow a cause of action to proceed on the basis of Internet addiction, it is worth noting that the ruling comes at a very early stage of the litigation. It was a ruling on a motion brought by the defendants, meaning that the plaintiff’s assertions were necessarily considered in their most favorable light and assumed to be true. Those claims that were dismissed — misrepresentation, unfair trade practices, intentional infliction of emotional distress, and assessment of punitive damages — were found by the court to have no merit even if all of the assertions were true and construed in that most favorable way.

Smallwood now faces an arduous, uphill battle to prove his assertions and to prevail on the surviving causes of action. Internet addiction is not currently a disorder recognized by the DSM, and this is likely to be an important factor in the remainder of the litigation. It is also true, though, that some serious researchers advocate including Internet addiction in the next edition of the DSM, currently scheduled for release in 2013. This advocacy has been formal, appearing in prestigious peer-reviewed journals such as the American Journal of Psychiatry, but those in the psychiatric-research community who favor inclusion appear to remain in the minority.

The ultimate decision of the American Psychiatric Association will likely have a strong impact on litigation. Courts will take notice of such an expert assessment, whichever way it goes. If the next edition of the DSM includes Internet addiction as a disorder, expect to see many more lawsuits like the one brought by Smallwood, and expect them to have a greater chance of success than currently seems likely. Also expect the producers of Internet content to respond and seek ways to avoid any liability.

I’m putting myself out ahead of the curve. You’ve been warned.