Eyes Closed Forever in the Sleeping Death

The name of Henry Labouchere is certainly unfamiliar to most, and yet he played a role in one of the great travesties of the 20th century.  I realize that it can be unfair to judge acts of history through the lens of modern morality, but I am going to do so anyway.  In this instance, it is justified. 

When the United Kingdom was amending its criminal laws in 1885, the major thrust of the revisions was to expand the protection of women in an era when they lacked power in any number of respects.  It was an era in which women were not permitted to vote and in which the age of consent for sex was a mere thirteen, with the law providing only misdemeanor penalties for those who had sex with a girl between the ages of ten and twelve.  The amendments did any number of things that most today would recognize as good and responsible:  they raised the age of consent to sixteen and introduced a number of provisions designed to curb the practice of abducting or otherwise procuring young, impoverished girls for prostitution.  The Labouchere Amendment, added quietly to the bill at the last minute, did something quite unrelated:  It criminalized almost all homosexual behavior between men. 

The Labouchere Amendment:  Any male person who, in public or private, commits, or is a party to the commission of, or procures, or attempts to procure the commission by any male person of, any act of gross indecency with another male person, shall be guilty of a misdemeanour, and being convicted thereof shall be liable at the discretion of the Court to be imprisoned for any term not exceeding two years, with or without hard labour.

Sixty-seven years later, while reporting to police a burglary of his home committed by a friend of his lover, a man confessed that he was having a sexual relationship with another man.  He was convicted under the Labouchere Amendment and given the choice of a year’s imprisonment or probation on the condition that he undergo chemical castration.  He was to take female-hormone injections every week for a year, resulting in a humiliating feminization of his body.  “They’ve given me breasts,” he complained to a friend.  He had once run a marathon in a time only 21 minutes shy of the world record, and he had one of the most brilliant scientific minds of the 20th century.

The man, of course, was Alan Turing, whose work as a cryptographer at Bletchley Park during World War II was, in no small measure, instrumental to the ultimate victory of the Allied forces.  His most important contribution was certainly the initial design of the “bombe,” an electromechanical device that allowed the British to determine the settings of the German Enigma machines, opening the flow of vital naval intelligence.  While some generously consider the bombe to have been the world’s first computer, it was not programmable in the way we normally think of computers.  But Turing made his mark there as well, expanding on theoretical ideas he had developed before the war to build some of the earliest programmable computers.  He is often called “the father of the computer,” and his “Turing test” for evaluating the apparent intelligence of machines remains of fundamental importance in the field of artificial intelligence.

There is no question that Turing was eccentric and socially different from others.  But when his country owed him a debt of gratitude for his part in changing the course of a war and for his role in establishing one of the pillars of modern society, it instead convicted him for his personal and private activities, shaming him with a horrific demasculinization of his body.  It revoked his security clearance and barred him from continuing his consulting work with the British intelligence agencies. 

One of Turing’s eccentricities was his peculiar fascination with the tale of Snow White.  When he was found dead at the age of 41, it was beside an apple that had been dipped in cyanide and from which several bites had been taken.  Few doubt that Turing, sickened by what his government had done to his body, deliberately poisoned the apple and ate it as his method of committing suicide.

Dip the apple in the brew,
Let the sleeping death seep through,
Look at the skin,
A symbol of what lies within,
Now turn red to tempt Snow White,
To make her hunger for a bite,
(It’s not for you, it’s for Snow White)
When she breaks the tender peel,
To taste the apple from my hand,
Her breath will still, her blood congeal,
Then I’ll be the Fairest in the Land.

                                               Snow White and the Seven Dwarfs (Disney, 1937)

June 23 is the 99th anniversary of Alan Turing’s birth.  A year from now, I expect that there will be any number of articles written about him and about the achievements of his tragically shortened life.  We can only speculate about the further accomplishments that awaited him, accomplishments the world was denied.  In my own way, I want to recognize his eccentricity by commemorating him a year early.

It is a sad testament to our humanity that we have misused, continue to misuse, and undoubtedly will go on misusing the power of the law to punish others for the simple crime of failing to conform.  But how much of the vision of men like Turing, the ability to see things the rest of us are blind to, is owed to that very nonconformity?  Why don’t we celebrate the gifts of that diversity instead?

A Mean Act of Revenge Upon Lifeless Clay

Jack Kevorkian died today, and many are commenting on his role in the “right to die” movement.  While I am a supporter of the movement generally, I did not find Kevorkian to be a courageous man.  His actions significantly set back the efforts of others to find ways for physicians to help the terminally ill end their lives on their own terms and with dignity. 

Consider for a moment the case of Diane, and imagine the circumstances in which she found herself.  Raised in an alcoholic family, she had suffered a great number of torments in her life, including vaginal cancer as a young woman, clinical depression, and her own alcoholism.  When her physician diagnosed her with myelomonocytic leukemia, she was presented with her options:  She could forgo treatment and survive for a few weeks, or perhaps even a few months if she was lucky, but the last days of her life would surely be spent in pain and without dignity; it was not how she wanted her friends and family to remember her.  Or she could accept the treatment her doctor had described, which offered a 25% chance of long-term survival, but the treatment itself (chemotherapy, bone marrow transplantation, irradiation) would rob her of much of what she valued about life and would likely bring as much pain as doing nothing.  For her, the 25% chance that such treatment would succeed was not worth it.  Others might have differed in their assessment, but this was hers. 

Neither option presented to her, letting the disease run its course or accepting a treatment whose costs she judged too high, was acceptable, and so she considered the unspoken alternative.  Diane’s physician told her of the Hemlock Society, even knowing that he could be subject to criminal prosecution and professional review, potentially losing his license to practice medicine.  But by having a physician who knew her involved in her decision, her mental state could be assessed to ensure that the choice was well considered and not the product of overwhelming despair.  Her physician could explain how to use the drugs he prescribed, ostensibly to help her sleep, so that until the time came she could live with the confidence that she controlled when her life would end.  She could enjoy the short time she had remaining without being haunted by fears that the attempt would fail or bring consequences she did not want.  In the end, Diane died alone, without her husband or her son at her side, and without her physician there.  She did it alone to protect all of them, but she died in the way that she herself chose. 

The story of Diane is one that her physician, Dr. Timothy Quill, published in the New England Journal of Medicine in 1991.  A copy of it can be found here.  It was one of the first public accounts by a physician acknowledging that he had helped a patient take her own life.  It prompted a debate about the role of physicians at the end of life, and a subsequent study published in the same journal in 1996 found that about 20% of physicians in the United States had knowingly and intentionally prescribed medication to hasten their patients’ deaths. 

But the quiet, thoughtful, and sober approach that Quill and many other physicians took to the issue of physician-assisted suicide was very much derailed by the grandstanding antics of Kevorkian.  His theatrical flouting of the law prompted law-enforcement agencies to focus on making an example of him rather than seriously considering the merits of his views, and it was counterproductive to the medical debate. 

Kevorkian’s fascination with death was a long-standing part of his life.  He was not, as many believe, christened with the nickname “Dr. Death” because of his efforts promoting physician-assisted suicide.  That happened long before, during the 1950s, shortly after he received his medical degree.  While a resident at the University of Michigan hospital, he photographed the eyes of terminally ill patients, ostensibly to pinpoint the actual moment of death as a diagnostic method, but more truly “because it was interesting [and] a taboo subject.”  Later, he presented a paper to the American Association for the Advancement of Science advocating “terminal human experimentation” on condemned convicts before they were executed.  Another of his proposals was to euthanize death-row inmates so that their organs could be harvested for transplantation. 

His views have politely been described as “controversial,” but they are perhaps more accurately described as gruesome and bizarre; consider his experiments aimed at transfusing blood from corpses into injured soldiers when other sources of blood were unavailable.  His various investigations did him considerable professional damage, causing him to resign from or be dismissed by a number of medical centers and hospitals.  His own clinic failed as a business.  For all his current notoriety, Kevorkian was throughout his career considered very much an outsider by the mainstream medical-science community. 

In considering the legacy of Kevorkian, it is important to recognize the long history of the debate over physician-assisted suicide, which dates back at least to ancient Greece and Rome.  The modern debate in the United States has its origins in the development of modern anaesthesia.  The first surgeon to use ether as an anaesthetic, J.C. Warren, suggested that it could be used “in mitigating the agonies of death.”  In 1870, the nonphysician Samuel D. Williams proposed using chloroform and other medications not just to relieve the pain of dying but to spare a patient that pain entirely by ending his life.  Although the proposal came from a relatively obscure figure, it attracted attention, being quoted and discussed in prominent journals and prompting significant debate within the medical profession.  These discussions culminated in a formal attempt to legalize physician-assisted suicide in Ohio in 1906, although the bill was rejected by the legislature in a vote of 79 to 23. 

Today, three states have legalized the practice of physician-assisted suicide:  Oregon, Washington, and Montana.  The history of how the practice came to be legal in those states, and the various court challenges that have been raised, is fascinating in its own right.  For now, suffice it to say that my own view is that those states legalized the practice because of the courageous efforts of physicians who are largely unknown, not because of the actions of Kevorkian.  Indeed, their courage is all the greater for having achieved as much as they did despite his activities.