
Disease and world history from 1750

MARK HARRISON

When we think about disease in world history, we are drawn instinctively to the movement of pathogens and peoples. Disease has followed trade, exploration, and conflict, and has magnified their consequences.1 Some historians even write of the “unification” of the globe by disease, as if its distribution provides an index of human connectivity.[185] [186] But the history of disease is as much about divergence as about convergence, and the last two and a half centuries provide ample evidence of both.

The period started quietly enough, but the coming century experienced an exchange of pathogens unequaled in variety and scope. Wave after wave of pandemic human and livestock disease circled the globe, leaving enormous misery in its wake. And yet, the same period saw tremendous improvements in human health. Many common infectious diseases were banished from the developed world, widening the gap that already existed between the mortality profiles of rich and poor countries. These inequalities are still glaringly apparent, but the affluent classes in Africa and Asia now experience disease in much the same way as those in the West, with rising rates of cardiovascular and degenerative conditions. But as recent outbreaks of SARS and “swine flu” have reminded us, our globalized future may yet be a turbulent one. It is perhaps time, therefore, to take stock of our epidemiological past.

The retreat of plague

The middle of the eighteenth century saw few great shifts in patterns of disease but the advent of what would become a near global conflict between the European powers - the Seven Years' War (1754-1763) - brought heavy mortality to the affected regions. Typhus raged in the besieged cities of Europe; malaria and yellow fever in the Caribbean; and dysentery and fevers in the East Indies.

Other communicable diseases - including sexually transmitted diseases - were beginning to spread into new parts of North America and the Pacific Ocean as these regions were penetrated by European explorers and settlers. However, in Europe itself, the disease which had caused the most dreadful depredations in previous centuries - plague - was in retreat. With the exception of an epidemic in Marseilles and its vicinity in 1720-1721, Western and Central Europe had not been troubled by the disease since the 1660s.

At the time, many thought that plague had been contained by the practice of quarantine. The Austro-Hungarian Empire maintained a 1,600 kilometer sanitary cordon along its eastern border, and many European countries strengthened their maritime quarantines in the wake of the epidemic in Marseilles. However, those who disliked quarantine began to question its efficacy, claiming, with some medical support, that plague was not particularly contagious and that it flourished only in unsanitary and unfavorable climatic conditions. These debates have continued among historians but it seems likely that quarantine provides at least part of the explanation for the disappearance of plague. Nothing else seems capable of explaining why it continued to wreak havoc in the Ottoman provinces of Moldavia and Wallachia to the southeast of the Austro-Hungarian Empire and in Russia to the northeast. Climatic and sanitary conditions on the Habsburg side of the border were not so very different from those to the east.

But while much of Europe prided itself on having banished plague, there was growing unease about the propensity of civilization to breed new ills. There were concerns that the abundance generated by colonial ventures - the sugar colonies of the West Indies and the exotic trades of the various East India companies - was corrupting the minds and bodies of prosperous consumers. The agonizing complaint of gout - widely associated with high living - became almost a hallmark of social status, while a host of nervous affections threatened to enfeeble the wealthy denizens of countries such as France and Britain.

These problems seemed to be most evident in cities, and over the coming century successive generations of writers came to lament the infirmities of urban life. While existence in the countryside was harsh and often unsanitary, some saw it as preferable to the overcrowding, alienation, and crime that characterized the modern metropolis. Swollen by a rootless population of economic migrants, booming port cities like London and new industrial towns like Manchester were filthy, alienating, and dangerous. Economic progress, it seemed, came at a cost, and concerned citizens began to band together in enlightened self-interest to construct fever hospitals and other such establishments for the poor.

Atlantic connections

Mounting anxiety about home-grown fevers was accompanied by the prospect of invasion from without. The first indication of this new threat came in the Caribbean, in 1793, when the island of Grenada was struck by a severe epidemic of yellow fever. Some physicians attributed the outbreak to an infection brought on ships returning from the west coast of Africa; from Grenada, the disease spread to the French colony of Saint-Domingue where slaves were in revolt, causing refugees to leave the island for the Eastern seaboard of America.[187] Soon after their arrival in Philadelphia - the capital of the recently proclaimed American republic - yellow fever made its appearance, causing thousands of deaths. By 1801, the disease had crossed the Atlantic, where it intermittently ravaged the Mediterranean coast of Spain for two decades, severely affecting cities such as Cadiz and Barcelona.

Yellow fever occurred relatively frequently in tropical Africa and on the western side of the Atlantic but this was probably the first time that it had affected Europe. The most likely reason was the advent of war between Britain and revolutionary France. During the 1790s, tens of thousands of soldiers died from yellow fever in the Caribbean but thousands of infected men also managed to make it back across the Atlantic, together with the mosquitoes which spread the disease.

However, the fever continued to break out periodically in the Mediterranean long after the war ended in 1815. After a mysterious lull in the 1830s and 1840s, it returned once more to Europe, badly affecting Lisbon in 1857, with smaller epidemics in St Nazaire and Swansea in the early 1860s.

The expansion of yellow fever thus appears to be something more than a consequence of war. Most of the outbreaks which occurred in Europe after 1815 were attributed to vessels involved in various forms of trade - delivering shipments of sugar, guano, and mineral ores, for example - or to the arrival of the mail ships which traversed the oceans in increasing numbers. What many of these vessels had in common was that they were powered by steam rather than simply by sail. Their number grew steadily through the century, allowing the Atlantic crossing to be made in less than a week. As a result, persons infected with yellow fever were not always detected before their symptoms appeared. The disease therefore had the potential to disrupt the Atlantic economy, as did the measures designed to contain it. Merchants on both sides of the ocean complained bitterly of the effects of quarantine on their business, and their vexation was shared by persons returning on leave from the European colonies. This led countries such as France and Britain to dismantle or reduce quarantine in their ports. Many North American ports, too, took a relatively relaxed attitude to the disease in the first few decades of the century, but its recrudescence in the 1850s, with major outbreaks at New Orleans, Rio de Janeiro, and Lisbon, forced them to reconsider.[188]

It is unclear why yellow fever erupted at this time, but the most likely explanation is that the growing volume of shipping coincided with favorable environmental conditions, for the mosquito vector of yellow fever is extremely sensitive to changes in climate. But the response to the resurgence of yellow fever - and in particular to the threat posed by steam navigation - set the tone for sanitary measures over the coming decades. It became a test case for how to deal with disease in an increasingly integrated economy. Old-style quarantine - in which ships, their cargoes, and their crews were impounded (sometimes together) - disrupted commerce and was increasingly regarded as inhumane. Alternatives had to be found, and from the early 1870s, following a severe epidemic in Buenos Aires, more emphasis was placed on sanitary reform. Yellow fever was said to thrive only in humid, ill-ventilated, and unsanitary conditions, which meant that it could be controlled largely through environmental improvement. By the turn of the century, however, it had been proven that the disease was spread by a mosquito (Aedes aegypti), and the focus changed from general sanitary improvement to fumigation and drainage. Through these means, public health officials and engineers from the United States achieved the remarkable feat of ridding the Panama Canal Zone of yellow fever, stemming heavy mortality among construction workers and securing the Canal's smooth operation when it opened in 1914.

Oriental perils

Despite fears that yellow fever might spread eastwards, it remained confined to the continents which surrounded the Atlantic Ocean. In this sense, it was very much an exception among diseases which had the potential to become epidemic. Most of the pandemics which swept the globe during the nineteenth century had their origin in Asia and spread from there to almost every part of the world. Not the least of these was cholera. With its terrifying symptoms and high fatality rate, cholera was universally feared. Its victims suffered severe cramps and profuse diarrhea, leaving their nervous systems exhausted and their bodies depleted. The prospect of an agonizing and undignified death ensured that cholera ranked among the world's most dreaded diseases, but this fear was magnified by uncertainty. Its obscure provenance and unexplained appearances left medical practitioners bewildered for decades.

The origins of cholera as an epidemic disease are generally traced to 1817, when it erupted in severe form in the town of Jessore, in what is now Bangladesh. Shortly afterwards, the disease spread to Calcutta, the capital of British India, and over the next two years it moved north and westwards, following the movement of British troops. By the early 1820s, it had arrived in many parts of Asia through maritime and overland trade. But the question of why cholera emerged as an epidemic at this point in time has yet to be satisfactorily explained. The most likely explanation is the sweeping changes wrought by the British on the economy of Bengal, especially the destruction of local industries and the increasing use of migrant labor.

As cholera spread beyond the Ganges delta to affect other parts of the world, it appeared to show a “preference” for the filthiest abodes and the lowest social classes. It was, predominantly, a disease of the poor, the vanquished, and the desperate. It spread among refugees, economic migrants, and soldiers; all groups which were seen as threatening in their own right. Cholera added to their misery and the stigma which they bore. In view of these associations, it is not surprising that its arrival sharpened social tensions, especially in countries in which political relations were already strained. This was true of many parts of Europe and North America when cholera first reached those continents in the early 1830s. The streets of cities such as Moscow and Paris saw serious civil unrest, not to mention repression by the authorities.[189]

One of the main vectors for the spread of cholera over the coming decades was long-distance migration, especially the passage of indentured laborers from India. These “coolies” were bound to their masters by a contract of long duration, during which time they were often compelled to perform exacting and dangerous work in mines and plantations. This “legalized system of slavery,” as it was termed by critics, replaced the old system of slavery which had previously been the foundation of the Atlantic economy. As Britain and other European powers abolished slavery, many of their colonies faced ruin as their supply of cheap labor dried up. Indentured workers filled the gap but with them came a host of diseases, including cholera. Despite attempts to keep the disease at bay with quarantine, cholera soon became established on plantations in the Caribbean and subsequently in many parts of Asia and Africa.

Although labor migration was among the chief vehicles of cholera, the Western powers were increasingly preoccupied with its spread via the Islamic pilgrimage to Mecca, where 35,000 pilgrims perished during an epidemic in 1865. This was not the first time that cholera had broken out at Mecca, but the pathway between the city and India was now more visible. When cholera had spread on previous occasions, it had done so slowly and by routes which were circuitous and obscure. Now the routes seemed more definite, for steamships and railways - not least, in later years, the Hejaz Railway, which linked the Muslim holy cities northwards to Damascus - allowed the disease to spread rapidly along clearly defined paths. Within months of the outbreak in Mecca, the disease had spread with returning pilgrims to many parts of Asia and Africa, and from there to Europe and the United States.[190]

After the wave of cholera in 1865-1866, Western nations contemplated the terrifying prospect of further pandemics spreading unnoticed from Asia. These fears were magnified by the opening of the Suez Canal in 1869, which increased the likelihood of cholera being delivered directly to Europe from oriental ports. The answer, it seemed, was to create a sanitary barrier to protect the West from sources of infection. Behind this screen, European nations would be able to operate a sanitary regime which interfered relatively little with navigation. But in the sanitary buffer zone the strictest vigilance was to be maintained and this would impose a heavy burden on Middle Eastern states. Nevertheless, polities such as Egypt, Iran, and the Ottoman Empire were willing to undertake this task. The frequent epidemics that had occurred in their territories since 1821 were an impediment to modernization, and quarantine stations were thus erected at Jeddah, Alexandria, and other ports receiving ships from potentially infected places.

It is hard to tell whether or not the imposition of quarantine in the Middle East had any effect on the spread of cholera. Epidemics to the west of this zone certainly became less frequent, but other explanations are equally plausible. Cholera thrived only in those places where filth and overcrowding abounded and where pure water supplies were hard to come by. For much of the century, these conditions were to be found in the squalid habitations of laborers and peasants in every part of the world, but, by the end of the century, a significant gap had emerged. After the outbreaks of 1866, most parts of Europe were never again visited by cholera, whereas it continued to thrive elsewhere. Several decades of sanitary reform, epitomized by the efforts of Edwin Chadwick in Britain and Georges Haussmann in Paris, transformed the urban environment, keeping sewage from contaminating supplies of drinking water. The epidemiological researches of John Snow in London in the 1840s and 1850s provided a stimulus to this process, and the disease was further demystified by the German bacteriologist Robert Koch, who in 1884 isolated the causal organism - a bacterium - from a reservoir in Calcutta. This enabled preventive measures to become more specific and potentially less disruptive. At any rate, after the epidemic which occurred at Hamburg in 1892, cholera never again presented a significant threat west of Suez.

Just as cholera disappeared from the developed world, a new and even more terrifying threat emerged from the Orient. Throughout the nineteenth century, outbreaks of plague had occurred periodically in parts of Asia and the Middle East, but there was little to suggest they might become widespread. Some of these localities were “reservoirs” of infection, in which wild rodents, such as marmots, carried the bacterium causing plague. Occasionally, these animals managed to infect humans and populations of domestic rodents like rats. Environmental change, war, and natural disasters all favored the spread of plague from these isolated pockets to surrounding areas. From the middle of the century, the upland province of Yunnan in southern China was ravaged by a civil war following a Muslim insurrection against the imperial authority of the Qing dynasty. Outbreaks of plague became common and, in 1894, the disease spread to Canton (Guangzhou), a busy and populous port on the Pearl River. This area had been opened up to trade over the previous decades, largely as a result of the traffic in opium, which came into China from India. Canton was only eighty miles up-river from the major commercial hub of Hong Kong, a British colony since 1842. Every year, thousands of economic migrants from southern China flocked to Hong Kong in the hope of gaining a laboring contract which would take them to work in other parts of the British Empire and also, increasingly, in North America.

Plague traveled with these desperate people just as cholera had with previous generations of migrants, and Hong Kong was declared infected in 1894. The world watched with horror in the expectation that plague would radiate outwards and many nations and companies suspended navigation into the port. Unexpectedly, plague remained confined to southern China for two more years, but, in the summer of 1896, it appeared in the great Indian port of Bombay. The disease spread quickly from Bombay but was not reported outside the subcontinent until 1898-1899, when it appeared in Madagascar, Egypt, and Japan. It was now clear that plague posed a threat of global proportions. Despite its ancient lineage, the disease was well suited to modern conditions and spread easily along the sinews of a mature global economy. By the early 1900s it had reached every inhabited continent, with a series of outbreaks in the South Pacific, Australia, and the United States.[191]

In the majority of affected countries, plague remained confined to the major port cities, the great exception being India, where the western and northern parts of the subcontinent were ravaged for decades and millions of lives were lost. From Bombay, the disease spread along railways to largely agricultural areas such as the Punjab and by sea to ports like Karachi. Railways were also the main avenue for the spread of plague in Manchuria in 1910-1911 and 1920-1921, but these outbreaks had a different source from other plague epidemics, including those in southern China. They originated in Mongolia and spread south and east from nearby railway towns with laborers as they returned home for the New Year. The Manchurian plagues were also unusual in that most cases assumed the virulent and highly contagious pneumonic form, capable of spreading easily from person to person.

The so-called Third Plague Pandemic lasted from 1890 to the 1940s, but it is the first, most turbulent, years of the pandemic which have received most attention from historians. Faced with damaging trade embargoes and the stigma of infection with a dreaded disease, most governments took drastic measures, forcibly isolating and hospitalizing plague victims and their contacts, destroying their property, and subjecting people to humiliating searches. Such measures elicited an angry response. In Hong Kong and India there were violent protests, in some cases culminating in the murder of plague officers. Elsewhere, there were numerous attacks on government buildings, strikes, and the mass flight of inhabitants from cities. As with cholera epidemics earlier in the century, many historians have studied these outbreaks in the hope that they would reveal tensions latent within the societies affected. But there is a danger of generalizing too much from what were, by definition, unusual circumstances. We should not assume that hostility to officials and institutions at times of crisis was expressed in the ordinary course of events.

Preoccupation with the early years of the pandemic has also blinded us to the fact that governments learned from their early mistakes and from a few well-managed anti-plague campaigns like those in Egypt in 1900 and Sydney in 1902. In these places, there was an attempt to work with local communities and to minimize disruption to commercial and social life. These examples pointed the way to a new sanitary order. Once it became clear that plague could not be contained, most governments saw the need for a measured and co-ordinated response, and, moreover, a response which recognized the interrelatedness of the global economy. This realization saw the passage of the first binding international sanitary regulations, following conferences at Paris in 1903 and Washington in 1905. While their implementation was far from perfect, they marked a decisive shift towards a light-touch sanitary regime based on epidemic surveillance and public health measures in ports of embarkation.

This transition was gradually eased by advances in scientific knowledge. In 1894, it had been found that plague was a bacterial disease, but this discovery had little immediate impact on the measures which were used to combat it. However, it gradually became apparent that plague, in its usual bubonic form, was not directly contagious and that it seemed to coincide with deaths among rats. Henceforth, sanitary authorities turned their attention from humans to rodents - to their destruction and to methods of securing ships, dwellings, and warehouses against them. From 1906 most scientists also concurred that the disease was spread from rats to humans by the bite of the rat flea. This meant that there was less need for quarantine and similarly disruptive measures, the exception being the plagues in Manchuria, where the Japanese, Chinese, and Russian authorities placed quarantine and isolation of passengers at the center of their plans.

The plague pandemic crystallized a tendency which had been noticeable for some years in the case of diseases such as yellow fever and cholera - a desire for better intelligence about epidemics in order to permit less intrusive and damaging interruptions to the business of the modern world. The sanitary conventions of the 1900s created offices in Paris and Washington for the collection of epidemiological data and its distribution to other states. This trend intensified in the years after the First World War, as the League of Nations Health Organization and its regional offices assisted in the collation and sharing of information. The advent of wireless radio speeded up communications further, allowing ships to notify ports of disease outbreaks before their arrival. Confidence in the accuracy and reliability of this information was enhanced by the maintenance of sanitary surveillance in ports around the world and by the activities of bodies such as the Rockefeller Foundation in the removal of some of the more obvious sanitary threats.

Epidemic diseases such as cholera remained a problem in the most deprived parts of Asia and Africa, particularly at times of famine and unrest. But the incidence of cholera, yellow fever, and plague fell markedly during the twentieth century, largely due to improvements in public health and possibly to acquired immunity among humans and host animals such as rats. Either way, there was growing optimism that disease could be conquered. The outbreak of a world war in 1914 presented the ultimate challenge. In all previous conflicts, including recent ones like the Spanish-American War of 1898 and the South African War of 1899-1902, disease had invariably claimed more lives than injuries inflicted in battle. In many theatres of the First World War - especially the Eastern Front, the Middle East, the Mediterranean, and Africa - diseases such as typhus and cholera once again blighted military operations. Civilian populations also suffered enormously as a result of infection and the destruction of sanitary infrastructure. But from a sanitary perspective, the war marked a turning point. In most armies - with the probable exception of the Ottoman forces - fatalities from disease were slightly fewer than fatalities from battle injuries. This owed something to the destructive potential of modern weapons but it was largely the result of increased attention to sanitation and the use of newly devised preventives such as inoculation against typhoid.[192]

Nevertheless, the war concluded with one of the greatest pandemics of all time. The three waves of influenza which began in 1918 and ended, in most countries, in 1919, claimed the lives of at least 25 million people and probably a good many more. Influenza had already established its potential as a pandemic disease during the so-called Russian Flu of 1889. This disease had caused great alarm because rich and poor were equally likely to fall victim to it. The same was true of the pandemic of 1918-1919, which, unlike most “seasonal” epidemics of influenza, claimed victims disproportionately among young adults. The enormous number of deaths from this disease, coming hard on the heels of a major conflict, decimated an entire generation (Fig. 9.1).

Figure 9.1 Spanish flu epidemic 1918-1919. US school gymnasium converted into a flu ward with patients' beds separated by screens

(Everett Collection Historical/Alamy)

Until recently, relatively little was known about this pandemic and its impact on the societies it affected. One of the reasons for this silence was that governments and the medical profession had been powerless to prevent it. They attacked the disease much as they had cholera and plague, assuming it to be a bacterial infection, but were impotent in the face of an agent - a virus - which was easily transmitted from person to person. There was, in short, no great victory to be celebrated. Moreover, as the war drew to a close, the combatant nations had other pressing concerns. Many aspects of the pandemic therefore remain shrouded in mystery, not least its causes and origins.[193] Some have traced its emergence to the battlefields of Europe, where conditions may have been conducive to the mutation of the virus; others have pointed to army camps in the United States or to northern China. None of these theories is totally convincing, but there is ample evidence linking the early spread of influenza to the passage of Chinese laborers, and an East Asian source of the pandemic seems increasingly likely.

The diseases of animals

The influenza of 1918-1919 marked the end of a century of pandemic disease, but the great upheavals of previous decades affected many species other than humans. From the 1830s, foot-and-mouth disease and bovine pleuropneumonia or “lung sickness” began to spread beyond Eastern and Central Europe to affect Western parts of the continent and ultimately the Americas. Everywhere they went, these diseases created misery and economic hardship, but the ways in which they were managed varied considerably. In some cases, whole herds of cattle were slaughtered in an attempt to eradicate disease, but more often than not, infected cattle were simply isolated from those thought to be healthy. This was especially true of foot-and-mouth disease, which, while it caused great distress to livestock, was rarely fatal. As a result, these diseases became firmly established in many areas from which they had hitherto been absent.

The spread of livestock diseases was one consequence of the long-distance trade in animals; a trade fueled by urbanization. During the nineteenth century, it was becoming harder to supply the burgeoning populations of towns and cities from locally reared stock. Once real wages began to increase - as they did in most industrial nations - people came to demand more meat; a status symbol which they had hitherto been unable to afford. As demand increased, controls in markets and at borders began to slacken and disease passed easily from town to town and country to country. But there was one exception. Most Central European countries maintained a sanitary cordon against a disease known in German as rinderpest and in English as cattle plague. Unlike foot-and-mouth disease, this was a fatal infection and had the capacity to cause great devastation.

Europe had been ravaged by this disease on several occasions in the past; it spread most often at times of war, when large numbers of cattle were moved to supply armies. But from the 1740s, some countries began to establish quarantines and slaughter infected cattle. Prussia, for example, maintained a very strict sanitary cordon against rinderpest, but the majority of cattle entered Europe - from the Russian steppes where they were reared - through the Austro-Hungarian Empire. Livestock from Russia was supposed to be quarantined at stockyards in Hungary, but many unscrupulous dealers smuggled cattle through. By the 1850s, there was also concern that the imminent completion of a railway connecting the stockyards to ports in the north and west of Europe would allow disease to travel quickly and unnoticed. These warnings were prescient, and in 1866 the disease spread rapidly throughout Europe by rail and sea.

Rinderpest bore the hallmarks of a biblical plague, and its ravages - contemporaneous with the spread of cholera - led many to believe that humanity was being punished for its wickedness and lack of religious observance. Some also blamed the disease on the maltreatment of animals. But most European countries again attempted to “stamp out” the disease by slaughtering infected herds, and some, like Great Britain, introduced legislation to legitimize such measures and the imposition of quarantine at ports. These measures were deeply unpopular at first, as farmers initially received no compensation, but they were maintained and extended to cover other diseases such as foot-and-mouth.

In other parts of the world, however, the response was decidedly mixed. North America was fortunate in never experiencing an outbreak of rinderpest, but the disease appeared briefly in Argentina in the 1870s and was quickly stamped out by slaughter. From the 1890s, as Japan increased its influence in northeast Asia, it also maintained strict quarantines against the importation of infected cattle from the mainland, following outbreaks of rinderpest in Korea and Japan in the early 1890s. But in British India, where the disease became far more prevalent during the 1860s, there were no attempts to prevent its spread apart from one rather weak piece of legislation confined to southern India. British officials justified this on the grounds that the slaughter of cattle would upset Hindus, but, in any case, the practicability of such measures was doubtful. India was simply too vast a country and its herds too large for quarantine to work. For much the same reason, the disease spread unchecked through Africa after being introduced to the northeast of the continent in 1889. Within a couple of years it had reached the southern tip, having destroyed many pastoral societies along the way. Had Robert Koch not developed an effective inoculation against the disease at the end of the century, there would have been no workable means of prevention at all. Even then, it was not until the late twentieth century that the disease was finally eliminated from Africa and South Asia; global eradication was not declared until 2011.

Like most of the livestock diseases which spread in the course of the nineteenth century, rinderpest was a Eurasian disease, but there were some exceptions. During the late nineteenth century, there was a good deal of concern about the spread of a cattle disease known as Texas fever (a tick-borne disease), which became more prevalent during the Civil War. It later moved north with the shipment of livestock to industrial cities, and cattle drives through the Midwest provoked violent unrest among local farmers who feared their stock would be infected. Proposals to quarantine cattle in the region were also resisted by ranchers and the meat industry based in northern cities like Chicago, which disliked anything that added to their costs. European countries, too, were wary about the spread of the disease from the United States and became unpopular with American farmers by regularly imposing quarantines. At much the same time, in the 1870s and 1880s, there was much dispute over the disease trichinosis, which appeared to spread to Europe with shipments of pork from the United States. American farmers suspected that the imposition of sanitary embargoes by countries such as Germany was really a way of protecting smaller and less efficient European producers.

Disagreements over animal disease blighted international relations throughout the twentieth century, despite the fact that rules had been devised to regulate the prevention of epidemics. Argentina - a net exporter of cattle - and importing countries such as the United States and Britain were regularly at odds over foot-and-mouth disease, for example. The disease was endemic in many South American countries but had been stamped out in the importing countries, which were anxious to keep their disease-free status. Such disputes were difficult to resolve because producer interests were inclined to justify protectionist measures with dubious sanitary risk assessments. It was hoped that such abuses would diminish following the formation of the World Trade Organization and the drafting of the Agreement on the Application of Sanitary and Phytosanitary Measures in 1995. But, as formal tariffs were dismantled, the resort to sanitary and other "technical" obstacles to trade became more common.

A world divided by disease

Towards the end of the nineteenth century, sanitary measures (some informed by the new science of bacteriology) enabled the most developed nations to control many common infectious diseases. In poor countries, however, the burden of disease changed little or actually increased. This was true not only of diseases considered pestilential, like rinderpest or plague, but also of those which were indigenous and endemic. A good example is smallpox, which was more or less ubiquitous by the turn of the nineteenth century. However, smallpox was also the only disease for which a specific and potentially highly effective method of prevention had been developed. After Edward Jenner established the efficacy of vaccination with cowpox in 1796, the technique was quickly exported to other parts of the world. Its proponents hoped that this procedure - which seemed to be safe as well as effective - would displace the older and more dangerous practice of inoculation with dried crusts from the scabs of smallpox pustules. This practice - often known as variolation - was fairly common in parts of Asia and North Africa and had been introduced to Europe and North America from the 1720s.

Despite the backing of Western medical elites, vaccination encountered opposition wherever it was introduced. Many people regarded it as unnatural that humans should be inoculated with matter from an animal and remained skeptical about its efficacy. In fact, for many years, vaccination was not as efficacious as its proponents claimed: it later became apparent that a second vaccination was required to confer complete protection. In hot climates, too, the lymph was prone to corruption and quickly lost its potency. For well over a hundred years, the operation also entailed cuts to the body with a lancet (rather than injection by syringe), which brought with it the risk of infection and scarring. Attempts to make the practice compulsory thus produced vocal and often violent demonstrations.

Nevertheless, those governments which possessed sufficient resources pressed ahead, and mortality from smallpox was reduced massively as a result. In the European colonies, however, a combination of insufficient funds, bureaucratic inertia, cultural sensitivities, and technical and logistical problems prevented similar progress. In India, for example, some reduction in mortality was achieved by the middle of the twentieth century, but the contrast with Western countries, in which the disease had been virtually eradicated, was stark. Only after the World Health Organization - established in 1948 - made smallpox the subject of a worldwide campaign did developing countries possess the resources to eradicate the disease: an objective which was achieved in 1979.11

Although the campaign against smallpox was ultimately successful, progress against other common infectious diseases was more uneven. Tuberculosis, earlier termed "consumption" or the "white plague," is a case in point. Although the disease is caused by a bacterium which is easily transmissible, not everyone reacts to it in the same way. Many persons exposed to infection never develop any symptoms, but in the nineteenth century those who did usually died. This meant that the meaning of the disease was open to interpretation: people sought an explanation for why some succumbed and others did not. At the beginning of the century, consumption had something of a romantic image, being associated with the deaths of famous poets and musicians, but by the middle of the century it had become a malady of factories and slums. Increasingly, too, it was regarded as an infectious disease. At the beginning of the century, many doctors regarded consumption as hereditary, the result of a tubercular "diathesis" which was activated by lifestyle or environmental conditions. These ideas did not disappear, but many came to believe that there was an infectious quality to tuberculosis and that it spread easily in confined and ill-ventilated spaces. After the bacterium causing the disease was discovered in 1882, this transition was complete, and posters went up around railway stations and other public places to discourage people from spitting, which seemed to be the most obvious way of spreading the disease.

Public health measures and growing natural immunity to infection combined to reduce mortality from tuberculosis in most industrialized countries. This, in turn, had a significant impact on mortality as a whole, for tuberculosis was almost invariably the largest single cause of death in industrial countries. But while the disease was declining in the West, it was spreading rapidly in Africa and Asia. It was most evident in new manufacturing centers like the cotton mills of western India, as well as around mining settlements like those in South Africa. As most of these industries relied heavily on migrant labor, it was not long before the disease spread to villages. Despite the inexorable rise of tuberculosis among the industrial workforce, responses to the disease were less than vigorous. In Europe and North America, there had been a concerted effort to improve public health and to provide treatment in sanatoria, even for the poor, but neither was very apparent elsewhere. Moreover, once an effective preventative inoculation (the BCG vaccine) became available from the 1930s, the gap opened still further, as it took many years before it began to be widely employed outside Europe and America. The same was true of the first effective treatment - the antibiotic streptomycin - which was given routinely in richer nations from the late 1940s. Some resistance to this drug was noted within a few years of its introduction, but it was the irregular use of antibiotics - a product of poverty and the unregulated sale of pharmaceuticals - that allowed new drug-resistant forms of tuberculosis (including XDR-TB) to become established in South Africa, and they are now spreading to other parts of the world.

Some doctors regarded the rise of tuberculosis as inevitable. In their view, it was a "disease of civilization," a rite of passage through which all industrializing societies had to travel. But another endemic disease - malaria - was generally ascribed to the absence of civilization. For this reason, European countries like Italy, which still suffered heavily from malaria, made an enormous effort to get rid of it, aided by a procession of scientific advances, most obviously the discovery of the malaria parasite in 1880 and the mosquito vector in 1898. The latter gave rise to hopes that malaria could be eradicated by destroying the larval or adult forms of the Anopheles mosquito. Either on their own, or in conjunction with the drug quinine - which had been isolated from cinchona bark in 1820 - measures like the drainage of breeding pools and the spraying of insecticide were tried with varying degrees of success.

They were most successful in relatively confined areas, such as Singapore, or where massive resources were marshaled for the purpose, as in fascist Italy. But in the majority of malaria-afflicted countries, resources were woefully insufficient. Moreover, malaria proved to be a much more complex problem than initially imagined. Parasites developed resistance to a succession of chemically synthesized drugs, while mosquitoes acquired resistance to a variety of insecticides, most notably DDT, which had shown such enormous promise during the Second World War. In some areas blighted by malaria, its transmission and mortality have stabilized, but in others new drug-resistant strains have emerged (Fig. 9.2). Malaria-bearing mosquitoes have also found new breeding sites in the pitted ground in and around rapidly developing cities. The same trends, combined with global trade and climate change, have led to increasing outbreaks of other vector-borne diseases such as dengue fever.

These diseases are not confined to the developing world but their burden falls most heavily upon it, especially in the case of malaria. Most of the 250 million cases of malaria which are recorded each year, and the 1 million deaths from this disease, occur in low-income countries. Indeed, the populations of poor and affluent countries experience disease in strikingly different ways. In high-income countries, the principal causes of death are, in descending order, heart disease, cerebrovascular diseases like stroke, cancers of the respiratory tract, and Alzheimer's and other forms of dementia. These are largely diseases of an aging population; or, more precisely, of a population that has been able to grow old because of the effective conquest of most infectious diseases. In low-income countries, by contrast, infectious diseases continue to loom large. The main causes of death are, in turn, lower respiratory infections, diarrhea, HIV/AIDS, heart disease, and malaria. Most of these diseases can be easily prevented and either cured or managed with the use of drugs.

Figure 9.2 A woman looks out of her window next to a banner during a preventive campaign against dengue fever organized by the health ministry in a shantytown in Lima, Peru, in 2012

(© Pilar Olivares/Reuters/Corbis)

The role of socio-economic inequality is clearly evident in these mortality profiles, but we should not ignore the considerable differences which exist within individual countries. Nations such as India, which have developed rapidly in recent years, now possess a large middle class with a mortality profile little different from that of Western countries. In fact, obesity-related diseases such as diabetes mellitus have been growing in many countries as food preferences change (towards highly processed and sugary foods) and habits become more sedentary. The same is true of certain types of cancer. Cancer was once considered a Western disease, but from the mid-twentieth century it became apparent that its incidence was increasing in most countries as they industrialized and urbanized, partly due to pollutants and partly due to diet and lifestyle. Now, there are proportionately more deaths from certain cancers in the developing world because of less consistent public health education and fewer legislative restrictions. Lung cancer provides probably the clearest example: it has fallen markedly in the developed world but is increasing in most developing countries. But in all countries, certain forms of cancer - particularly lung cancer - are more common among persons of lower social class and educational attainment, the chief risk factor being smoking.

Globalization

Although non-infectious diseases are a growing problem globally, we ought not to assume that epidemics are a thing of the past. The emergence of HIV/AIDS in the 1980s has served as a reminder of the potential of Nature to generate new infections and of the man-made world to disseminate them. From the very beginning, AIDS drew attention to the interconnectedness of the modern world. Lurid stories, focusing on the sexual exploits of a supposed "patient zero," awakened fears about the propensity of new infections to pass swiftly over vast distances by air travel; not least because "patient zero" was an air steward. As a consensus began to form around the theory that HIV had mutated from simian forms of retrovirus in tropical Africa, there was also intense speculation about what other diseases might emerge from this "hot-house" of infection. Outbreaks of previously obscure fevers like Ebola reinforced the impression that fresh dangers awaited humanity in the tropics, and that they might now spread easily to other parts of the world.

The threat from these so-called "emerging" diseases led to calls for the "securitization" of health. Disease, it was argued, had the capacity to destabilize regimes and to threaten the newly globalized economy. The language and practices of public health began to change, with more emphasis being placed on surveillance and containment. This tendency became even more apparent after the terrorist attacks of 9/11, after which fears began to circulate about biological warfare. The arrival in 2003 of Severe Acute Respiratory Syndrome (SARS) crystallized this trend, and the war against terror and the war against disease began to coalesce. These specific fears also reflected growing unease about globalization: about economic insecurity, the destruction of established communities, and mass migration. SARS was, above all, a disease of "global" cities such as Hong Kong, Singapore, and Toronto; cities famed for their large diaspora communities and as hubs of global trade.

This defensive mind-set framed the response to some of the major disease threats of the coming decade, most of which involved influenza. During the 2000s, there were numerous outbreaks of a deadly form of influenza, H5N1 or "bird flu," usually beginning in East or Southeast Asia, but in some cases spreading as far as Western Europe. These outbreaks destroyed the livelihood of many farmers and killed some humans who came into close contact with infected poultry. But the disease did not appear to pass from person to person, which was the great fear of most public health officials. When a more transmissible strain of influenza - H1N1 or "swine flu" - was reported in Mexico in 2009, many believed that a serious pandemic was in the offing. Initially, the death rate appeared to be higher than that for ordinary "seasonal" influenza, but while the disease was officially declared a pandemic, the expected high mortality failed to materialize.

What characterized all these outbreaks was the emphasis placed on surveillance, containment, and contingency measures - the stockpiling of vaccines and anti-viral drugs, and emergency planning for large organizations. But influenza specialists such as Robert Webster and organizations such as Compassion in World Farming also drew attention to the origin of such diseases. In particular, they highlighted the intensive production of poultry and pigs, which had increased massively in recent years, especially in Asia. These "factory farms" were said to escalate the risks of a mutation that was both lethal and highly contagious. The Director of the World Health Organization also warned in 2012 that the regular dosing of intensively reared animals with antibiotics was contributing to the growing problem of diseases resistant to treatment. Indeed, the phenomenon of antibiotic resistance may herald a return to an era in which even common infections may once again kill, reversing the gains of over half a century.

As consciousness of the interdependence of human and animal health grows, there have been calls for more holistic public health strategies, and organizations such as the WHO, WTO, and the World Organization for Animal Health have begun to co-ordinate their efforts. But the problem is a difficult one to solve, for it is structural in nature. Factory farming and the associated risk of new and drug-resistant diseases is driven above all by urbanization and the lifestyle changes normally associated with it. Those parts of the world bearing the brunt of rapid urbanization thus face difficult decisions about how to manage development in the interests of health. Their actions will affect not only their own communities but possibly the rest of the world.

Further reading

Bhattacharya, Sanjoy. Expunging Variola: The Control and Eradication of Smallpox in India, 1947-1977. Hyderabad: Orient Longman, 2006.

Echenberg, Myron. Africa in the Time of Cholera: A History of Pandemics from 1817 to the Present. Cambridge University Press, 2011.

Plague Ports: The Global Urban Impact of Bubonic Plague, 1894-1901. New York University Press, 2007.

Farmer, Paul. Infections and Inequalities: The Modern Plagues. Berkeley, CA: University of California Press, 1999.

Fidler, David P. SARS, Governance and the Globalization of Disease. Houndmills: Palgrave Macmillan, 2004.

Hamlin, Christopher. Cholera: The Biography. Oxford University Press, 2009.

Harrison, Mark. Contagion: How Commerce Has Spread Disease. New Haven, CT, and London: Yale University Press, 2012.

The Medical War: British Military Medicine in the First World War. Oxford University Press, 2010.

Ladurie, Emmanuel Le Roy. “A concept: the unification of the globe by disease.” In The Mind and Method of the Historian. Brighton: Harvester, 1981, pp. 28-83.

McNeill, J. R. Mosquito Empires: Ecology and War in the Greater Caribbean, 1620-1914. Cambridge University Press, 2010.

Mishra, Saurabh. Pilgrimage, Politics, and Pestilence: The Haj from the Indian Subcontinent, 1860-1920. New Delhi: Oxford University Press, 2011.

Packard, Randall M. The Making of a Tropical Disease: A Short History of Malaria. Baltimore, MD: Johns Hopkins University Press, 2007.

White Plague, Black Labor: Tuberculosis and the Political Economy of Health and Disease in South Africa. Berkeley, CA: University of California Press, 1989.

Phillips, Howard, and David Killingray, eds. The Spanish Influenza Pandemic of 1918-19: New Perspectives. London: Routledge, 2003.

Price-Smith, Andrew T. Contagion and Chaos: Disease, Ecology, and National Security in the Era of Globalization. Cambridge, MA: MIT Press, 2009.

Webb, James L. A., Jr. Humanity’s Burden: A Global History of Malaria. Cambridge University Press, 2008.

Source: Wiesner-Hanks, Merry E., McNeill, John, and Pomeranz, Kenneth (eds.). The Cambridge World History. Volume 7: Production, Destruction, and Connection, 1750-Present. Part 1: Structures, Spaces, and Boundary Making. Cambridge University Press, 2015. 674 pp.
