Introduction:

Human subjects research has transitioned from a period of unguided individual inquiry to a highly regulated global standard. This journey toward modern ethics was not a straight line of progress; rather, it was shaped by pivotal moments of crisis where scientific urgency collided with the physical safety and fundamental rights of human subjects. This blog series serves as a deep dive into that complicated history, examining the experiments, individuals, and regulations that would eventually create the Institutional Review Board (IRB) and the Belmont Report.


Pre-20th Century:

Human subjects research is not a modern invention; rather, it has been an integral part of scientific inquiry for centuries - long before the development of the ethical framework we recognize and follow today. Throughout the 18th and 19th centuries, the medical field was characterized by rapid discoveries and the formalization of clinical practices. However, this era lacked a unified moral compass, and researchers often relied on the moral and ethical principles of their individual cultures, making the standardization of practices difficult. During this time, formal oversight was virtually non-existent, and the concept of "participant rights" had yet to emerge, meaning there was nothing in place to ensure the protection of human subjects.


Early 20th Century:

One of the earliest and most significant turning points regarding the ethics of informed consent occurred at the dawn of the 20th century with Major Walter Reed's Yellow Fever experiments. In the aftermath of the Spanish-American War, Yellow Fever remained a terrifying and lethal threat, with no known cure or effective treatment in sight. Driven by a sense of urgent necessity, the U.S. Army Surgeon General commissioned Reed and a select team of physicians to travel to Cuba to investigate the disease's cause and mode of transmission. To test their theories, Reed's team took the radical step of intentionally exposing human participants, including several members of the research team itself, to the deadly virus through the bites of infected mosquitoes. This marked a profound moment in medical history, where the quest for a life-saving breakthrough directly collided with the physical safety of human subjects.


Phase I

The initial phase of the investigation was characterized by a lack of oversight and a desperate, improvised experimental design. Under the direction of Dr. Jesse Lazear, while Major Walter Reed was away in the United States, the team began exposing a small group of soldiers to mosquitoes believed to be carrying yellow fever. At the time, this "mosquito theory" was largely dismissed by the medical establishment, making the experiments appear both scientifically radical and ethically reckless. Because there were no sophisticated protocols for participant selection or safety, the boundaries of the study were dangerously blurred. Lazear and his colleagues, including Dr. James Carroll, decided to inoculate themselves to prove the theory's validity and, perhaps, to silence critics of human experimentation. The results were devastating. Carroll contracted a severe, near-fatal case of the virus. Shortly thereafter, Lazear was exposed again; this time he contracted the disease himself and died just one week later. Historians often debate Lazear's motives, speculating that his self-sacrifice may have been driven by a combination of professional pressure, the desire for scientific fame, and a profound sense of guilt over Carroll's suffering - all exacerbated by a total lack of safety guidelines.

Phase II

The tragedy of Lazear’s death served as a grim catalyst for change. Upon his urgent return to Cuba, Major Reed recognized that the experiments could not continue without a more rigorous structure. In the second phase of the study, Reed and his team developed what are now considered some of the first documented instances of informed consent in medical history.

For the first time, the team drafted formal guidelines to govern volunteer selection and to clearly define the risks involved. To incentivize participation in what was still a life-threatening endeavor, volunteers were offered significant financial compensation: $200 for participating, plus an additional $300 if they actually contracted the disease, bringing the potential total to $500. In today's economy, those payments would be roughly equivalent to $8,000 and $20,000, respectively - a sum so high it would likely be considered "undue inducement" by modern IRB standards.
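For readers curious how those modern equivalents line up, the arithmetic is a simple sketch, assuming a consumer-price inflation multiplier of roughly 40x between 1901 and the 2020s (the exact factor depends on the price index and target year chosen):

\[
\$200 \times 40 \approx \$8{,}000 \qquad \$500 \times 40 \approx \$20{,}000
\]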

By this stage, the undeniable evidence of the doctors' own infections had shifted public opinion. The mosquito theory was no longer a fringe idea; it was a widely accepted reality among the researchers and the volunteers alike, setting the stage for one of the most significant breakthroughs in public health history.

Phase III

By the third phase, the scale and setting of the research shifted. Volunteers were transferred from military encampments to a hospital in Havana, where the protocols grew increasingly invasive. In addition to mosquito exposure, the team began "blood-injection" experiments - transferring blood from infected patients into healthy volunteers - while simultaneously hunting for a viable antiserum or cure. While these later phases succeeded in proving the mosquito theory to a skeptical global medical community, that success did not come without a grave human cost. Several more participants died, including Clara Maass, an American nurse and the study's only female volunteer. Ultimately, the successes of the third phase laid the groundwork for the eradication of Yellow Fever in the Panama Canal Zone and beyond, but the deaths of Maass and others remain a somber reminder of a time when scientific progress was often prioritized over the lives of the individuals making it possible.


Conclusion:

When we analyze these experiments through a modern lens, the ethical failures are stark. While Major Reed's introduction of a written consent document was a revolutionary "step in the right direction," it fell far short of the standard of informed consent we uphold today:

  • Ambiguity of Risk: Especially in the early stages, participants had no way of gauging the true danger because the researchers themselves didn't fully understand the disease’s progression.
  • Lack of Autonomy: The protocols lacked a "right to withdraw." Once a subject was enrolled and the experiment began, they had no clear mechanism to terminate their participation.
  • The Problem of Inducement: The massive financial incentives, equivalent to nearly a year's salary for some, likely clouded the volunteers' judgment. In modern ethics, this is known as "undue inducement," where a payment is so high it encourages participants to take risks they otherwise would not consider.
  • The Knowledge Gap: Many early volunteers were skeptical of the mosquito theory, leading them to believe the experiments were safer than they actually were. They weren't just consenting to a study; they were betting their safety on the theory being wrong.

These experiments served as a bittersweet milestone: they provided the breakthrough needed to conquer Yellow Fever, but they did so by navigating a landscape where the rights of the individual were still largely experimental.


Citations:

Lederer, S. E. (2008). Walter Reed and the yellow fever experiments. In E. J. Emanuel, C. Grady, R. A. Crouch, R. K. Lie, F. G. Miller, & D. Wendler (Eds.), The Oxford textbook of clinical research ethics (pp. 9–17). Oxford University Press.

McCarthy, M. (2001). A century of the US Army yellow fever research. The Lancet, 357(9270), 1772. https://doi.org/10.1016/S0140-6736(00)04908-1

University of Virginia Health Sciences Library. (2004). Philip S. Hench Walter Reed yellow fever collection: Dr. Jesse Lazear and his contribution to the conquest of yellow fever. http://yellowfever.lib.virginia.edu/reed/collection.html