In 1793, only 10 years after the Revolutionary War ended, ships arriving from the Caribbean introduced devastating yellow fever epidemics that caused mortality of up to 10 percent in major U.S. port cities. Historians contend that these epidemics, and the fear they engendered, forever stamped the American character. With an extensive coastline receiving ships from four continents, the young trading nation needed to support and enforce quarantine, then under local control, and to hospitalize ill sailors (6). Quarantine, based upon ancient ideas of separating the ill from the well and formalized in the 14th century to prevent pandemic shipborne spread of plague, was such an important civil activity that some state laws made breaking it punishable by death.

On 16 July 1798, President John Adams (1735–1826) signed into law a bill, modeled on British practices, providing for the care of ill sailors, including hospital isolation (1, 2); at this point, however, the law had little to do with quarantine beyond providing for the isolation of patients with communicable diseases in the context of clinical care. Over several decades, this system, which began as a medical insurance program, grew into the United States Marine-Hospital Service (MHS). In 1870, the MHS was reorganized and given a Supervising Surgeon (a post later renamed Surgeon General). The first to hold the post was John Maynard Woodworth (1837–1879), the medical hero of Union General William Tecumseh Sherman's (1820–1891) Civil War "March to the Sea." The National Quarantine Act of 1878 gave the MHS substantial quarantine authority and an epidemic disease surveillance system to coordinate with state and local quarantine operations. A number of supplementary acts followed, including an 1890 act authorizing interstate disease control and an 1893 act extending MHS authority over all infectious diseases (1, 2).

Because the causes of epidemics were unknown in the early 19th century, preventing disease introduction was a national priority. The dawn of the microbial era late in the century dramatically changed medical practice, public health, and quarantine. Since the 1830s, microscopists had been examining plant and animal disease specimens, suggesting links between certain fungi and protozoa and human skin conditions. Others had been producing diseases by animal inoculation. Casimir Davaine (1812–1882) spent 25 years researching anthrax, and in 1863 he applied the term "bacteria" ("bactéries," from a general morphological term of Ehrenberg, 1838) to the living organisms he associated with that disease (7). Building upon Davaine's work, Robert Koch's (1843–1910) 1876 description of the life cycle of Bacillus anthracis made anthrax the first human infectious disease whose cause was established by the new microbiological criteria ("Koch's postulates") (7–9). Most physicians and scientists did not appreciate the implications of these breakthroughs until 1882, when Koch identified the cause of tuberculosis (10). (The prominent American physician Austin Flint, Sr. [1812–1886] rushed excitedly into instructor William Henry Welch's [1850–1934] bedroom shouting, "I knew it! I knew it!" as he waved the newspaper account of Koch's discovery.) The next three decades, among the most groundbreaking in medical history, revealed the long-obscure etiologies of the most important epidemic diseases: malaria and typhoid fever (1880); tuberculosis (1882); cholera, diphtheria, tetanus, and pneumococcal pneumonia (1884); and botulism, plague, and the first "filter-passing agents," including viruses (1894). Vaccines, passive immunotherapy, and specific antimicrobial therapy would be developed, and clinical diagnosis, the epidemiology of communicable diseases, and other public health activities, such as quarantine, would be placed upon a solid biological basis.
It was the most sweeping revolution medicine has ever seen.
