The history of world technology, 1750-present
PAUL JOSEPHSON
Technological change accelerated with the Industrial Revolution and extended to processes on every continent, from smelting and mining to power production, transportation, agriculture, housing, and communications.
It has often been connected to military innovation. It involved the effort to replicate, standardize, and mass-produce techniques and processes. Its impact reached materials as well, from non-ferrous metals to building materials and plastics. Technological change reflected growing interconnectedness between processes; innovations in the chemical industry connected with dyes, for example, led to innovations in medicine and in materials such as plastics. The increasingly industrial and large-scale technologies were not merely individual components or machines, but systems - business, financial, construction, and state institutions, as well as armies of laborers whose tasks were routinized as they were de-skilled. This tendency held in the twentieth century across capitalist, socialist, and fascist states. Another feature of the history of technology was the increasingly formal relationship between industry and research and development; industrial firms and state governments supported R&D in new, large-scale laboratories connected with business, public health, and military innovation.

This chapter focuses on the United States, Europe, and the former Soviet Union because these regions have been the major engines of technological change since the 1750s for economic reasons (the rise of the factory and modern corporation and the determination of businesspeople and entrepreneurs to tap technology for increased production and profit); political reasons (the centrality of technology to ideological, public health, and economic programs); military concerns (and not only the development of new weapons based on industrial processes); and the competition between these states for resources and power.
The Industrial Revolution: from human labor to engine power
The Industrial Revolution was not one revolution, but a series of revolutions in production, power generation, and distribution. It occurred first in Britain and on the European continent, then in the Americas and Asia, all roughly in the period 1770-1900. The first industries to be affected were textiles, iron and steel, and mining, especially coal. In production, handicraft gave way to machine production. In textiles, the flying shuttle, the spinning jenny, and water and steam power enabled rapid increases in the production of yarn, while canals and railroads facilitated delivery of raw materials and sale of finished cloth. In steel, the Bessemer process permitted inexpensive mass production through the removal of impurities in iron by oxidation. Production grew cheaper, faster, and more efficient, which put many craftsmen out of work, forcing them to seek employment in mills, and triggered rapid urbanization. A further feature of labor was that mechanization and the expansion of handwork were frequently complementary. Thus, mechanized cotton ginning, spinning, and weaving greatly increased demand for cotton, and more people than ever worked picking it until that task began to mechanize a century and a half later; steam engines pumped water from mines, while for several generations the number of people swinging picks, shoveling, and pushing wheelbarrows to the surface increased. Exploitative child labor was a final feature of the Industrial Revolution.
A crucial aspect of the Industrial Revolution, tied to the others, was the rise of steam power (in the late nineteenth and twentieth centuries augmented by electrical energy, the internal combustion engine, and eventually large-scale hydroelectricity and nuclear power). Millers, farmers, town residents, merchants, and others have tapped water for agriculture, drinking, sewerage, and to power mills for millennia. Now water, converted to steam, powered the engine. The Scotsman James Watt introduced major improvements to the steam engine, including the separate condenser and a double-acting, rotative design; later engineers added high-pressure steam.
The engines served burgeoning factories and mills, and later powered boats, ships, and the railroad (Fig. 5.1). Eventually, with boilers that powered turbines and power lines, businesses, homes, and cities gained electricity, followed by rural regions from the 1930s onward.

In the early nineteenth century the American system of manufacturing developed, based on interchangeable parts and the use of machine tools to produce them. This system grew out of the efforts of Federal armories and their contractors to provide guns more inexpensively. Its principles contributed to
Figure 5.1 Early steam locomotive associated with London & North Western Railway (Science & Society Picture Library/SSPL/Getty Images)
the industrial manufacture of bicycles, sewing machines, typewriters, and eventually automobiles. The American system spread throughout the world in the effort to use semi-skilled labor and machine tools that cut, shaped, extruded, and ground metal to produce standardized, identical interchangeable parts.
Eventually this gave rise to the assembly line with division of labor among a variety of tasks, and by the late twentieth century became routine in cheap-labor production of textiles, shoes, and foods in Asia and Africa. Because of Henry Ford's success in mass-producing automobiles along an assembly line at his Highland Park, Michigan, factory, assembly-line mass production is often called the Fordist system. The assembly line resulted in extensive deskilling of the labor force, since production depended on machines operated by semi-skilled workers. Henry Ford, a patriarchal conservative and anti-Semite, employed a private police force to keep an eye on his workers. At the same time he recognized the need for a market for his mass-produced Model T automobiles; he doubled the wages of many of his workers in 1914 to US$5 per day. This increased purchasing power among the workers, stimulated greater demand for Fords, contributed to the rise of the middle class in America, and led many workers to migrate to Detroit, Michigan, and other northern cities.
But many workers buckled under Ford's paternalistic scrutiny of their private lives and the hard work in the factory (Fig. 5.2).
Figure 5.2 Ford Motor Company in the early 1900s, showing the assembly line (Everett Collection Historical/Alamy)
Social and political concern about technological change
The changes in production methods brought social upheaval. Among economists there is some dispute about when and by how much wages increased, though generally speaking the standard of living of workers improved over the course of the Industrial Revolution. But at least initially, because workers were no longer self-sufficient as they had been on farms, they lived in poverty and feared the noisy, powerful new machines and mills. Wages were low enough that children were forced to join their parents in mills and mines, where they suffered the consequences of poor health, disability, and death. In many countries, average human height declined, suggesting pervasive malnutrition. On the other hand, population grew quite rapidly. This led Thomas Malthus and others to suggest that soon there would be too many mouths to feed as numbers of people grew faster than the food supply. Many wealthy individuals suggested that poor laws and other social programs intended to alleviate poverty in fact encouraged the poor to reproduce, since they did not feel the full brunt of their behavior, and thereby put further strains on government and society. Historians have had their differences over the social consequences of the Industrial Revolution, particularly its impacts upon living standards. The closest they have come to a consensus is the proposition that for the working classes, early stages of industrialization, whether in Britain or elsewhere, typically brought deterioration of living standards, but that after a generation or two that trend changed direction.
Most individuals take for granted that technologies improve the quality of life by achieving efficiencies of production.
However, a number of voices have challenged the view of uncorrupted technology, or technology as somehow commensurate with progress - and the belief that “progress” always and everywhere is good. Many farmers, craftsmen, and other individuals protested against industrialization which endangered their livelihoods. In England in the early nineteenth century, the followers of the perhaps legendary Ned Ludd gathered in mobs with farm implements to destroy mills that had put them out of work. Luddism had a significant literary resonance as well. Persons with an anti-technological world view are called Luddites.

Another group of critics views technology almost as an autonomous force that shapes human institutions. These technological determinists include Jacques Ellul and Langdon Winner. Others criticize large-scale technologies as undemocratic and exploitative, but see small-scale systems and simple technologies as reasonable; they embrace a view that “small is beautiful,” or that there is some kind of alternative technology (for example, Ernst “Fritz” Schumacher or Peter Harper). They prefer composting, drip irrigation using greywater, solar panels, and wind generators to nuclear reactors and gas-guzzling vehicles. These technologies are decentralized, labor- rather than capital-intensive, energy efficient, and locally controlled.
Karl Marx and Friedrich Engels also criticized technology's effects, but not technology per se. Specifically they criticized its concentration as capital in the hands of the owners of the means of production, the bourgeoisie. In their works they railed against the poverty of the proletariat and in particular the alienation of the worker from the products of his labor. While claiming that their historical analysis of class struggle between worker and manager, laborer and owner was “scientific,” they advanced the notion of the inevitable rise of a utopian society in which workers owned the means of production and in which technology - the productive forces (factories, machinery, equipment, tools) - was organized to create a society of plenty where labor was no longer exploitative toil.
In other words, modern technology would be a force of liberation in socialist or communist society. In this way, their views of technology were determinist, since technology served as the engine of history. In any event, such Marxist leaders as Vladimir Lenin and Joseph Stalin in the Soviet Union, Mao Zedong in China, and Kim Il Sung in North Korea sought to create such free societies through technology.

These Marxist leaders underestimated the way in which large-scale technological systems had a significant impact on workers' lives and de-skilled them wherever they toiled - under socialism or capitalism. Stalin, for example, determined to copy the Gary, Indiana, steel mills. Founded in 1906, Gary grew to five mills, twelve blast furnaces, and forty-seven steel furnaces, plus an excavated harbor to facilitate delivery and shipment of ore and steel. Gary attracted 16,000 laborers, many of them immigrants, in its first three years, and had a combined labor force of 100,000 at its peak. Stalin's copy was built at Magnitogorsk in the Urals region of the USSR. The failure of the Magnitogorsk mills to operate as intended, the conflicts between workers and Communist Party officials, and the squalor, cold, and hunger under which the workers toiled indicated that large-scale technological systems would nearly always be exploitative no matter the political and economic system.
Interdependence of technology, science, and industry: chemistry
One of the most important aspects of the history of technology was the growing connection between industry and scientific research and development. In 1856 in England William Henry Perkin, who was trying to synthesize quinine from coal tar, first produced dyes from coal-tar chemicals and built a factory based on his methods. German chemists quickly took the lead in the synthesis and production of a large number of bright-colored dyes, founding, in 1865, for example, BASF, famous especially for its indigo, although the first colors were not colorfast and faded. BASF expanded its chemical research department - and its facilities generally, including housing for workers - and within fifteen years had sales offices abroad from New York to Moscow. The German firms built on a well-established railroad industry, a burgeoning textile industry, and extensive use of bleaches. The expansion of the textile industry increased demand for dyes; synthesis of dyes answered this demand and also triggered development of other chemical processes. An active patent office assisted in the expansion of BASF. Synthesis of dyes also led chemists to synthesize substances that turned out to have medical applications.
The German industry left its counterparts in England and especially in the United States behind; in the latter, industrial chemistry lagged, although by the turn of the twentieth century a series of factories had arisen on the Hudson River in New York state. During the First World War German factories turned to the manufacture of explosives, and a blockade prevented trade in German dyes, which triggered rapid expansion of the US industry. During the 1920s a German chemical industry conglomerate, I. G. Farben, was formed out of those firms that had worked closely together during the war.
In addition, during the nineteenth century, industrial firms recognized the importance of underwriting scientific and engineering research. A number of them established quasi-independent laboratories and employed specialists to develop new processes and products with significant market advantages. They included E. I. du Pont de Nemours and Company, Westinghouse Electric, the Radio Corporation of America, Bell Laboratories of Bell Telephone, Siemens, and I. G. Farben. During the National Socialist (Nazi) period, I. G. Farben directors worked closely with the Nazi leadership and knowingly provided Zyklon B, a cyanide-based pesticide, to the regime, which used it to gas to death millions of Jews in concentration camps during the Holocaust. A number of its executives were tried for war crimes at the Nuremberg Trials, but most of them served only short prison terms.
Perhaps the most far-reaching development was the creation of polymers (very large molecules), or plastics, lightweight materials that could be easily molded and shaped and that replaced ivory, tortoiseshell, and linen. At the 1862 International Exhibition in London, Alexander Parkes revealed a material called Parkesine, an organic material derived from cellulose that could be molded once heated and retained its shape when cooled. Celluloid followed; although it was not strong enough for many applications, it is known as the foundation of the film industry. The most significant invention was Bakelite, developed by Leo Hendrik Baekeland in 1907, which found uses in electrical insulators, casings, kitchenware, jewelry, piping, and other applications because it molded quickly; it was recognized in 1993 by the American Chemical Society as the world's first synthetic plastic (Fig. 5.3).
Most developments in plastics have occurred since 1910. In the 1930s polyvinyl chloride (PVC), low-density polyethylene, and polystyrene were developed. The Second World War stimulated further search for new materials as substitutes for those in short supply (e.g. rubber). These materials competed with wood, paper, metal, glass, and leather. Nylon, a substitute for silk, found application in parachutes, ropes, helmets, and clothing; plexiglass replaced glass; plastic replaced wood in furniture.
Figure 5.3 Bakelite radio (Interfoto/Alamy)
Plastics are encountered everywhere every day now in toys, computers, clothing, furniture and carpets, appliances, building materials (e.g. PVC), and medical applications. A significant drawback is that many plastics, while readily disposable, do not biodegrade, and recycling of these materials has limited effect. The Great Pacific Ocean Garbage Patch - floating, churning pieces of plastic covering hundreds of thousands of square kilometers - indicates the extent of this critical problem. Another problem is that Bisphenol A (BPA) and other additives may leach out of the plastics into our food, water, and bodies and are likely endocrine disrupters; in addition, a number of them may be carcinogenic.
Plate glass - and later float glass - another result of the marriage of chemistry and technology, dates to the early nineteenth century and grew out of industrial processes and innovations connected with automation, metallurgy, and plastics. In 1848 Henry Bessemer built a system of rollers to produce a continuous ribbon of flat glass; by the 1920s, with the addition of polishers and grinders, the process had cut production costs considerably, and by the 1950s it had been fully automated. The Crystal Palace (1851), a cast iron and plate glass building in Hyde Park, London, England, at 564 meters long and 39 meters tall, demonstrated the potential of plate glass. Other important innovations included wired cast glass for extra strength and security.
Such consumer-oriented glass applications as industrial bottling of soda drinks and beer developed in the late nineteenth century. Automobile glass gave great impetus to innovation, too. In the early twentieth century, builders employed glass in horseless carriages to protect drivers from wind and debris, but it was standard glass and did not offer full protection, especially in an accident in which the glass shattered. Chemists developed shatter-resistant, laminated, and then tempered and tinted safety glass that became standard in the 1920s and 1930s. In the 1960s and 1970s automobile manufacturers added a thin layer of polyvinyl butyral between two layers of glass. Since by 2010 there were more than 1 billion automobiles in the world, with 240 million in the United States alone, this is a vast quantity of glass - not to mention the plastics in the interior as seats, molding, insulation, dashboards, and so on.
River engineering and hydroelectricity
Putting rivers to work was important in the early Industrial Revolution. Subsequently, harnessing them more fully became a technological hallmark of modern societies. After the early nineteenth century, under the lead of French, American, and other engineers, the efforts to sculpt river basins for economic, political, and military purposes accelerated and expanded greatly in scale. No longer content to erect dams across rivers, the engineers proposed straightening and dredging them, building huge reservoirs, and storing water for power generation, flood control, and irrigation. The control of rivers across national, state, and provincial boundaries required brute-force political machinations to overcome local opposition to external control of water. In the twentieth century, as dams grew bigger, such projects generated great controversy as millions of people were ousted (“oustees”) from traditional homes and lifestyles in floodplains to make way for large-scale efforts to transform the rivers into powerful machines. The modern dams also affected fisheries and forestry operations - the former by interrupting the migration of anadromous fish, changing water chemistry and temperature, and destroying habitat, and the latter by precluding the spring float of lumber downstream from foresting yards to mills.
In the nineteenth century, engineers advanced projects with hubris and certainty, while in fact many of their efforts moved forward on seat-of-the-pants calculations and, as in all engineering work, trial and error. Because of the scale of the projects, they inevitably destroyed ecosystems - intentionally or otherwise - and often created more problems than existed in the first place. One of the most heavily studied of these efforts - but repeated in Egypt, India, Brazil, the USSR, China, and other settings into the twenty-first century - concerns the engineering of the Mississippi River to prevent flooding by the US Army Corps of Engineers. Huge irrigation and other water-management projects grew along the Ganges and Indus Rivers in India and Pakistan and along the Nile River in Egypt. India began hydroelectric projects in the late 1890s under British rule. By 2010, independent India produced tens of thousands of megawatts of electricity from hydropower stations built since the 1960s - and had ousted millions of people from river basins. Many of the facilities failed to meet their hydroelectric goals because they silted up more rapidly than originally expected, blocked the downstream flow of nutrient-laden silt to river deltas and fisheries, and increased saltwater infiltration of those same deltas. They also have been less effective at flood control than predicted.
For example, engineers were convinced that a “channeled” Mississippi would scour the river bottom, digging a deeper channel, and carry away flood waters safely downstream. Instead, water during floods had nowhere to go, pouring over levees into farmland and towns that had been overbuilt in floodplains, like a funnel overwhelmed by an impatient cook. More efforts at improvement followed as engineers gained bigger budgets, but in the process made floods more frequent. In 1927 the Mississippi flooded massively, killing hundreds of people, inundating ten states, covering 70,000 square kilometers, and reaching 97 kilometers in width below Memphis, Tennessee.
Engineering knowledge spread from American, Soviet, and French specialists, both as part of foreign policy efforts and as an international engineering community promoted technological transfer. Soviet engineers trained Egyptian and Indian specialists and provided foreign aid, for example, for the Aswan Dam in Egypt in the 1960s. Chinese specialists who promoted and completed the Three Gorges Dam on the Yangtze River were trained at the Zhuk Gidroproekt Construction and Design Trust, itself a kind of Soviet Army Corps of Engineers with roots in building dams and canals in the 1930s for Joseph Stalin's gulag system. The project's first adherents had received training from the US Bureau of Reclamation before Mao and the communists seized power. German specialists also contributed to hydroelectricity abroad.
The Three Gorges Dam is the world's largest at 22,500 megawatts, with thirty-two main turbines (Fig. 5.4). Touted for flood control, power generation, and irrigation, the dam destroyed archaeological and cultural sites, damaged local and regional ecology, and ousted 1.3 million people.
Figure 5.4 Three Gorges Dam (Top Photo Corporation/Alamy)
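Capacity figures like those cited for the Three Gorges Dam follow from the standard hydroelectric power relation, P = ηρgQH: efficiency times water density, gravitational acceleration, flow rate, and head. As a minimal sketch - the efficiency, flow, and head values below are illustrative assumptions, not figures from this chapter - the output of a single large turbine can be estimated as follows:

```python
# Hydroelectric power estimate: P = eta * rho * g * Q * H
# All numeric values below are illustrative assumptions, not data from this chapter.
rho = 1000.0   # density of water, kg/m^3
g = 9.81       # gravitational acceleration, m/s^2
eta = 0.90     # assumed combined turbine/generator efficiency
Q = 950.0      # assumed flow through one turbine, m^3/s
H = 80.0       # assumed hydraulic head (height the water falls), m

power_watts = eta * rho * g * Q * H
power_mw = power_watts / 1e6
print(f"Estimated per-turbine output: {power_mw:.0f} MW")
# prints: Estimated per-turbine output: 671 MW
```

With these assumed values the estimate lands in the range of the roughly 700-megawatt units installed at very large dams, which is why dozens of turbines are needed to reach a station capacity in the tens of thousands of megawatts.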
Another massive project, the Itaipu Hydroelectric Power Station shared between Brazil and Paraguay, drowned the Guaira Falls, the world's largest by volume. Composer Philip Glass wrote a cantata, “Itaipu,” in honor of the structure.
Brazilian engineers both developed their own visions of hydroelectric grandeur in the first half of the twentieth century and after 1945 also worked with American engineers from the Tennessee Valley Authority (TVA) to advance larger projects. TVA specialists, who had begun ambitious transformation of the American South in the 1930s to bring electricity, modern appliances, fertilizers, and communications technologies to the poorest Americans in a self-proclaimed effort to promote democracy, exported their expertise to such rivers as the Sao Francisco in Brazil in part to establish an engineering bulwark against the spread of communism.
Another form of hydrological engineering is irrigation. Many irrigation projects date to the colonial era. Australia, the driest inhabited continent, advanced massive irrigation projects at the turn of the twentieth century, around the time of federation, with the assistance of engineers from the United States (California) who had themselves been trained by British engineers. Irrigation, segregation, and later apartheid were joined in South Africa. India's Punjab system, in which almost the entire cultivated area receives irrigation, dates to 1849, although a series of more extensive projects followed independence. French engineers attempted to bring water control to the Mekong Delta in colonial Vietnam. At the turn of the twenty-first century, Vietnam and other Southeast Asian nations began to tame the Mekong River delta with scores of hydroelectricity projects that raise questions of post-colonial oustees and environmental degradation.
Transportation
After 1750 a revolution in transportation changed the face of human interaction, commerce, military thinking, diet, leisure, and much else. It shortened the temporal distances across the globe. Railroads and steamships lowered consumer costs for perishable goods and provided greater variety in diet. Railroads were the crucial transport technology, later augmented by steamships and diesel trucks. In about 100 years the iron horse (and automobile) replaced the horse for many transportation needs. These technologies contributed to social mobility, international travel, and the migration of millions of people through the rapid expansion of passenger service. Transport technologies also facilitated colonial control, military conquest, and getting goods - fibers, minerals, and the like - to markets: for the British in India, the French in West Africa, and the Japanese in Manchuria, they enabled the exploitation of coal, the construction of harbors, warehouses, and mills, and the building of shale oil and chemical plants.
Taken together with advances in refrigeration, for example, transport innovations enabled the international trade of fruits and vegetables that would have spoiled in earlier years. Steamships stimulated the development of banana culture on plantations in the Caribbean and Central America with shipments to the United States; today bananas ship from India, several African countries, and China as well. Overall, long-distance freight rates dropped by over 80 percent in many cases, shipping times fell even more, and volumes increased substantially. In the second half of the twentieth century intermodal freight transport (container shipping involving rail, ship, and/or truck), without any handling of the freight itself when changing modes, reduced handling, damage, and loss, while permitting even higher speeds. Although slower than air, modern sea transport, using containers and other modes, reached 7.4 billion tons of cargo in 2007.
The railroad was a symbol of nineteenth-century technology. Rails and flanged wheels enabled a reduction in friction when carrying heavier loads, but the most important invention was the use of steam engines for locomotion. British and American entrepreneurs led the way. British visionaries saw their country's small land mass as facilitating a national rail system. Rails sped the transport of coal and finished goods like cotton, tying together industrial areas and ports. In 1830 there were 98 total miles of rail in England; by 1860 there were 10,433 miles.
In the United States, between 1810 and 1830 several individuals proposed building locomotives and test tracks; over the next thirty years railroad construction accelerated as they replaced canals as the major mode of transportation. In 1869 a transcontinental link was established, securing the coherence of a sprawling country. As in Britain, in the United States railroads helped build a national market and national consciousness, and served as emblems of national technological prowess. The American rail network was also war-related (during the Civil War and wars against Native Americans) and often heavily state-subsidized.
All major European countries sought both symbolic and practical goals in building national rail networks. In France a state-sponsored centralized system aimed to achieve political and cultural goals; in Germany railroads contributed to the unification of the state politically and militarily in 1871; in Russia rail development lagged far behind the rest of Europe, as the Tsar supported it belatedly and chiefly for military reasons - to compete with Japan in the Far East and Manchuria. His minister of finance, Sergei Witte, pursued the railroad as a tool of modernization. The First World War revealed the failure of the Tsarist government to support the development of transportation as the need to move troops and weapons overwhelmed the system. In 1916 the government funded the construction of a rail link from Murmansk on the Barents Sea to Petrograd (St. Petersburg) using slave labor, in a belated effort to turn the tide in a failing war against Germany.
Governmental regulation of technological risk accompanied these developments in transport. In 1838 the paddle boat Moselle exploded near Cincinnati, killing at least eighty people when four boilers burst; from 1816 until 1848 at least 1,433 people died in steamboat accidents in the United States. The accident led to the passage of the 1838 Steamboat Act, the first federal regulation of private industry, which required licenses and inspections. Railways in Britain also inspired early efforts at government regulation in the interest of safety. Transport technology was not alone in attracting regulatory supervision in the nineteenth century, but as one of the most dynamic, visible, and dangerous technologies going, it came in for more than most. In the years after the Second World War governments became much more active in regulating transport technology, and indeed technology in general, an important part of the general expansion of the role of the modern state. Other state actions - for example, the evolution of tort law and the growth of limited liability corporations - served to limit the risk borne by transport companies. By the late twentieth century corporations were selling “safety” in their conveyances; for example, the automobile with its airbags and increased crashworthiness.
Like the railroad in the nineteenth century, automobiles in the twentieth became a symbol of industrialization and mass production. Demand and supply grew exponentially, especially in the United States, from under 3,000 total in 1901 to 13,000 in 1904, to 130,000 in 1910 and over 1.3 million units in 1916. By the turn of the twenty-first century, manufacturers in the United States turned out over 11 million cars and commercial vehicles annually and other nations of the world produced another 76 million vehicles, with growing markets in India and China. In 2010 Indian production reached 3 million units, while in 2009 China passed the United States in overall automobile production.
Automobiles have become pervasive and affect just about everything. They became a major source of air pollution, even in modern models, and a significant contributor to global warming. They were a social force and cultural icon and, rightly or not, a mark of independence and freedom. The “Trabant” in socialist East Germany, with its inefficient, polluting two-stroke engine, was an icon of failed consumer culture. Almost everywhere the automobile has supplanted public transport, especially after the Second World War, and contributed to suburbanization. Since the construction of the Interstate Highway system in the United States and the rise of the automobile after the Second World War, the railroad has declined significantly in terms of freight miles and passenger miles. Also in the United States, the automobile contributed to racial segregation as primarily white people moved out of cities, lowering the tax base and leading to urban decay.
Building materials from concrete to girders
The rise of the built environment of cities and suburbs accompanied revolutions in transport. Connected with this, building materials underwent significant and rapid change, especially through the application of plastics; concrete and reinforced concrete; float glass; and new steels. While concrete and iron have been used for centuries, their industrial production in factories and improvements in ways to produce, strengthen, test, and cure them, while cutting their cost, led to significant expansion in their use. At first, concrete was used mostly in industrial buildings, since many people considered it aesthetically unpleasant. But use of concrete in buildings became widespread in the second half of the nineteenth century, especially through the use of reinforced concrete and improvements in Portland cement. By the 1920s it was being used in large buildings and by the 1930s in major concrete dams. The Hoover Dam, completed in 1935, consists of 3.3 million cubic yards (2.5 million cubic meters) of concrete that was poured in a series of blocks and columns so that it cured quickly without forming significant stresses and cracks. “Jersey barriers” - highway dividers - became widespread in the 1950s, as did concrete slabs, beams, columns, and floors. Growing experience in building with reinforced concrete gave rise to “thin-shell” techniques for roofs, domes, and arches. Examples of innovative concrete structures include Spanish engineer Eduardo Torroja's design of a low-rise dome for the market at Algeciras; Italian Pier Luigi Nervi's hangars for the Italian Air Force; and Frank Lloyd Wright's Guggenheim Museum in New York City.
Concrete also had local uses. Manufacturers began to sell it in drums and then in bags. With industrial pre-fabrication, they sold inexpensive forms - concrete blocks, pavers, flower pots. Concrete blocks have widespread use in developing countries to raise houses off the ground and as pavers where fulldepth roadways are too expensive. By the end of the twentieth century concrete was a US$35 billion industry that employed more than 2 million people in the United States alone.
Urbanization meant vast, noisy, and disorderly housing, often overcrowded tenements, with garbage, waste, and excrement accumulating next to industrial waste, and inadequate water supply. Typhoid and other water-borne diseases frequently felled city residents in epidemics. In the late nineteenth century public health officials pushed to build sewer and water supply systems and to build safe and clean public housing. From the nineteenth century, extractive industries often provided their own housing, but many of these “company towns” were monopolistic, paid workers in scrip, and enabled the companies to recapture workers' pay through housing and food. Company towns included Le Creusot (mining) in France; Ludwigshafen (chemicals), Wolfsburg (automobiles), and Leverkusen (chemical dyes) in Germany; Kiruna (mining) in Sweden; Kitakyushu (mining) in Japan; and Widzew (textiles) in Poland. Weimar Germany in the 1920s was a pioneer in public housing.
Industrial construction techniques and materials enabled urban housing to be built relatively inexpensively in the name of public health and welfare. After the Second World War this housing expanded rapidly in the United States and Western Europe. Under socialism in Eastern Europe and the Soviet Union, major industries gained responsibility to provide housing for their laborers who thus ended up living in close proximity to noise and pollution. Magnitogorsk in the Ural Mountain region was the first of many such Soviet industrial towns. Under Nikita Khrushchev the Soviet housing program began to make headway in providing decent housing with millions of mass-produced units added annually beginning in the late 1950s.
In most countries of Asia and Africa public housing has lagged considerably and the average household in these countries contained several times more people than in the industrial north and west. Migration to cities accelerated in the second half of the twentieth century. In Vietnam, because of government policy, by 2020 there will be 45 million city dwellers, about 45 percent of the population, which will put more pressure on housing stock, 25 percent of which the government already classifies as substandard or temporary. In Bangladesh, according to the 1991 census, four-fifths of dwellings were made of straw or bamboo. In India, the majority of people occupy 10 square meters of space for living, sleeping, cooking, washing, and toilet needs; 400 million Indians do not have access to a proper toilet and there are extensive slums. In Brazil, similarly, because of rapid urbanization and the government’s late turn to public housing, millions of people lack proper housing, mostly in the southeast and northeast regions, many existing dwellings fail to meet public health requirements, and programs in the early twenty-first century to increase stock by millions of units annually have lagged.
Suburban housing, like urban apartments, was often mass produced in developments that, through excavation and clearing of land, had a significant environmental impact. Mass production greatly lowered the cost per unit and made possible the postwar rise in home ownership in the United States from just under 40 percent (where the level had remained since the early 1900s) to almost 70 percent by 1970 (where it has remained since). Suburbanization involved also mass consumption of such consumer goods as refrigerators and washing machines, and was accompanied by the spread of such distribution technologies as shopping centers and fast-food restaurants, all of which were connected together in the built environment by the automobile.
Communications
Communications technology changed rapidly after 1750 as well: from the postal service to the telephone and email; from pens to typewriters and carbon paper to copy machines, faxes, and electronic mail; from long-play records to CDs and music-sharing services; and from telegraph, radio, film, and television to the computer. All these trajectories rest on increasing miniaturization of components and circuits, which has made communications technology ubiquitous in every country of the world. Even poor people have mobile telephones. While Internet access remains unevenly distributed, it too is spreading.
Communications also reveal one of the major paradoxes of technologies: they may be used for good purposes, for example the spread of democratic institutions, education, and overcoming isolation, or they may be used by oppressive regimes, for example in the USSR or North Korea, to control citizens. In the USSR, the government strictly regulated access to telephones and copy machines, forcing dissident literature underground. Social media and cell phones were crucial to the Arab Spring of 2011.
Postal systems were established in many countries throughout the world in the nineteenth century. They have generally operated with prepaid fees (stamps) and as subsidized government monopolies. This enabled governments to end expensive, confusing private postal systems. In some countries, postal systems distribute pensions, handle passport applications, and provide other services. In the United States, the Continental Congress appointed Benjamin Franklin postmaster general in 1775, expecting him to harmonize the postal service from Maine to Georgia. By 1808 Robert Fulton's “Clermont” steamboat carried mail - at least unofficially - and by 1815 Congress authorized the postmaster general to contract for carriage of mail by steamboat. Regular carriage of mail by rail - and sorting of mail in special wagons - began in the United States in the 1860s. Post offices often established facilities adjacent to major stations throughout the United States and Europe. By the end of the nineteenth century, city and rural free delivery had been established. “Air mail” commenced in 1918. Postal codes were introduced in Great Britain in 1959, in the United States in 1963, and subsequently in most other countries.
The telegraph promised much faster communication than any post office. It uses electrical signals, usually conveyed by wires or radio. By the early nineteenth century, a number of physicists in Denmark, France, and the German states (Hans Christian Ørsted, André-Marie Ampère, Carl Friedrich Gauss, and others) had made progress using wires and magnets to transmit signals that could contain language. In the United States, Samuel Morse and Alfred Vail developed Morse code in the 1830s; in the 1840s the United States Congress funded a telegraph line from Baltimore to Washington, which carried the first message, “What hath God wrought.” A transcontinental telegraph began operating in 1861, with submarine transatlantic cables soon thereafter. By the turn of the twentieth century, the telegraph had dropped in cost and extended throughout much of the world, enabling instantaneous global communication with applications for trade, war, and newspapers (with shared wire services). The Internet and electronic fund transfers, however, put an end to the telegraph by the twenty-first century.
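Morse and Vail's scheme maps each letter to a short pattern of dots and dashes, so that any text can travel as timed pulses of current. A minimal sketch of such encoding follows; it uses the modern International Morse alphabet (the actual 1844 Baltimore-Washington transmission used Morse and Vail's earlier American code, whose patterns differ for several letters):

```python
# International Morse code for the letters of the famous first message only.
MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "T": "-", "U": "..-", "W": ".--",
}

def encode(text: str) -> str:
    """Encode letters as dot-dash groups separated by spaces; '/' separates words."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[c] for c in word) for word in words)

print(encode("What hath God wrought"))
# -> .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -
```

The fixed-length silence between letters and the longer silence between words carried as much information as the pulses themselves, which is why trained operators could "read" a sounder by ear.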
As with many technologies, a number of individuals nearly simultaneously developed the telephone. Since Alexander Graham Bell and his company dominated the US business and his patents were upheld by the US courts, normally he receives credit. The telegraph office remained important to businesses, post offices, railroads, and bureaucracies until the development of the telephone exchange and switchboard. By the early twentieth century the United States led the world in telephone density per person and had 3 million customers. Phones became easier to use, of higher quality, and with better connections with the advent of the single handset model with a rotary dial, automated switching equipment, then push button telephones, and eventually greater versatility as smart phones (with computers, cameras, and other devices) owing to miniaturization through solid-state circuitry.
Many people once worried that the telephone would destroy the family; interrupt dinner, privacy, and sleep; pull impressionable people away to distant contacts; lead to risque behavior and fraudulent practices; and, paradoxically, interfere with conversation. To some extent, it did. On the other hand, the home telephone broke isolation and added to a family's social circle. It aided in reporting accidents and the need for emergency assistance. Work phones cut many of the costs of doing business. If at first only the wealthy could afford the phone and connections were poor, now all persons in all walks of life possess a phone. It has become a necessity, not a luxury, with many governments requiring phone companies to provide life-line service at a discount to ensure all citizens are connected for emergencies.
In 2010 there were approximately 6.8 billion mobile phones in the world out of a population of 7 billion people, and in many countries - Russia, Italy, Brazil, Germany, the United States, Ukraine, Egypt, and Indonesia, among others - the number of phones exceeds the number of residents. Mobile phones contributed to the openness and rapid spread of information that supported the Arab Spring of 2011 and other social movements. In Tunisia, Egypt, Libya, and Yemen, citizens forced leaders from power, and uprisings elsewhere demonstrated citizen dissatisfaction with the status quo. But modern communications technologies, notably mobile phones, also empower governments against their populations. Most governments monitored phone communications, and some (e.g. China, Russia, the United States) did so on a massive scale in the twenty-first century, using computers to track millions of people. This practice had deep roots: early on, operators at switchboards could and did listen in on conversations; many early phones were party lines; and beginning in the 1920s and through the Nixon presidency, arms of the US government wire-tapped without court order. Businesses and governments now routinely track individuals' cell phone use and - in the computer world - which websites they visit. Privacy has become a precious commodity that is difficult to secure amidst modern technologies.
Computing machines
Early computing machines were connected to efforts in cryptography and code breaking. Scientists long sought computing machines, but rapid development of the computer itself was connected with the need to undertake complex calculations for weapons of mass destruction - hydrogen bombs. By the early 1970s, many academic institutions had begun to standardize on mainframe computers. Portable and smaller models developed in the 1980s, with microprocessors facilitating this step. Home computers and mass marketing followed soon thereafter. In 2010 there were likely 1 billion computers in the world, up from 500 million in 2002 and 48,000 in 1977. Beyond rapid calculation, computers have myriad uses, including data storage and manipulation and CAD/CAM (computer-aided design and manufacturing).
But as noted, the most visible use of computing machines is as part of digital communications. By the twenty-first century, through linked phone and computer systems, social networking sites proliferated, allowing people to communicate, share photographs and news, meet more easily, organize, and find individuals with similar interests. The first step was SMS (short message service), which initially grew popular among young people. People in all walks of life now communicate constantly using their phones and computers. In short, modern communications technologies encouraged democratic activities through education, access to information, and mobilization to assemble or organize, yet also served tyranny. Without a doubt the ubiquitous phone has penetrated social, business, and political life, whatever an individual's occupation or habits.
Nuclear technologies: military and peaceful
Many of the technologies and processes discussed grew out of or found military applications. Interchangeable parts contributed to the rise of the American system and the assembly line. New steels contributed to armored vessels, then tanks and modern ships. Computers helped design nuclear weapons and became essential to so-called smart weapons and drones. From the points of view of geopolitics, environmental degradation, and misplaced hubris, nuclear technologies may be the most significant technology of the twentieth century.
When President Dwight Eisenhower gave his “Atoms for Peace” speech at the United Nations in 1953, he sought to slow the Cold War by encouraging the United States, USSR, and other nations to promote peaceful applications through the UN and to defuse public suspicions that the “atom” was only a menace. Scientists around the world welcomed the opportunity to “domesticate” the atom for public consumption and pretend that the benefit of peaceful applications outweighed the dangers of nuclear war. Scientists had long understood that they could use X-rays, and they quickly found isotopes for diagnostics and treatments and for industrial sensors and trackers. They irradiated food to “sterilize” it, increase its shelf life, and disinfest it of insects. Radiation sterilization never became a widespread practice because irradiation facilities are costly to build, while washing food is cheaper and usually just as effective; still, over sixty nations have programs. The impetus for food irradiation came from the military, which sought to provide rations at lower cost over great distances and in inhospitable climates.
The most visible peaceful use of nuclear energy has been in power reactors to generate heat and electricity; the heat has industrial applications or can be used for desalination of water. The first civilian power reactors were modestly sized; units have since grown to a standard of about 1,000 or 1,200 megawatts of electricity. The Soviets built the first station to provide electricity to the civilian grid at Obninsk in 1954, at 5,000 kilowatts. By July 2013 there were 403 reactors operating worldwide, with 100 in the United States, 58 in France, and 33 in Russia, most of them pressurized water reactors. A reactor is basically a giant tea kettle. It generates power through a controlled, sustained chain reaction, usually using fissile 235U or 239Pu as fuel. Fuel, made up of heavy atoms that split when they absorb neutrons, is placed into the reactor vessel (basically a large tank) along with a small neutron source. The neutrons start a chain reaction in which each atom that splits releases more neutrons that cause other atoms to split. Each time an atom splits, it releases a large amount of energy in the form of heat. The heat is carried out of the reactor by coolant, most commonly water. The hot coolant produces steam - directly, or via a secondary loop - that spins a turbine driving a generator or drive shaft.
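The scale of that “tea kettle” can be suggested with a back-of-the-envelope calculation. Each fission of 235U releases roughly 200 MeV, so a 1,000-megawatt-electric unit of the kind described above, at a typical steam-cycle efficiency of about one-third, must sustain on the order of 10^20 fissions per second. The figures below are illustrative assumptions, not data from any particular plant:

```python
# Back-of-the-envelope reactor arithmetic (illustrative figures only).
MEV_TO_J = 1.602e-13          # joules per MeV
ENERGY_PER_FISSION_MEV = 200  # approximate energy released per U-235 fission

electric_power_w = 1.0e9      # a 1,000 MWe unit, as in the text
thermal_efficiency = 0.33     # assumed steam-cycle efficiency

thermal_power_w = electric_power_w / thermal_efficiency
fissions_per_second = thermal_power_w / (ENERGY_PER_FISSION_MEV * MEV_TO_J)

print(f"Thermal power: {thermal_power_w / 1e6:.0f} MWt")
print(f"Fissions per second: {fissions_per_second:.2e}")  # roughly 1e20
```

The point of the sketch is the ratio: a single fission releases an enormous amount of energy by atomic standards (about 3.2 x 10^-11 joules), yet an immense number of fissions per second is still needed to light a city.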
The advantages of nuclear power include the fact that reactors produce no pollutants such as sulfates, nitrates, or greenhouse gases. They are relatively clean compared with fossil-fuel energy. Nuclear power stations, however, require huge takings of land for the station site and exclusion zone, and cooling water released into the environment has a direct and immediate impact on ecosystems, even when it is first cooled substantially. When factoring in mining accidents, transport of fuel, respiratory diseases, and the like, nuclear power has proven safer than fossil fuel. Yet nuclear power carries high capital costs, an ominous presence because of the need for extensive security (including against terrorism), and the risk of catastrophic accidents. To meet those costs, nuclear power has required billions of dollars, pounds, euros, marks, yen, and rubles in subsidies, plus special indemnity insurance that limits the legal liability of nuclear facilities.
Major accidents have also been quite costly: at Three Mile Island in Pennsylvania in 1979, with a partial meltdown; at Chernobyl in 1986, with a massive explosion, destruction of a reactor, release of vast quantities of radioactivity throughout the globe, and likely 50,000 excess deaths; and the destruction of Fukushima Daiichi in 2011 by an earthquake and tsunami, with continuing, extensive radioactive contamination and indeterminable human and environmental costs. Plant managers after all three accidents also lied about the dangers and risks.
Military applications - in the form of atomic bombs - were the icon of military technology and of a state-science-engineering partnership. President Franklin Roosevelt authorized the US project after Leo Szilard and Albert Einstein alerted him to the danger of a Nazi bomb in 1939. This led to a crash program (the “Manhattan Project”) which demonstrated the effectiveness of fully supported military research conducted by quasi-independent civilian scientists. J. Robert Oppenheimer, a theoretical physicist and superb organizer, was selected to run the project. General Leslie Groves headed the military side. Oppenheimer gathered some of the world's great physicists in Los Alamos, New Mexico, and in July 1945 they detonated the first atomic bomb in the nearby desert. On August 6, the United States dropped an atomic bomb on Hiroshima, Japan, and on August 9 dropped a bomb on Nagasaki. In all, the two bombs killed 150,000 directly, and likely another 50,000 people died from illness, radiation poisoning, and other factors within the next year; the decision to use these weapons has remained controversial ever since. Japan agreed to unconditional surrender to the United States on August 15.
The Soviets did not face any similar moral, financial, or political conundrums in racing the United States to build the atomic bomb. Joseph Stalin ordered Secret Police Chief Lavrenty Beria to assemble the necessary manpower, facilities, and other resources to complete the project without delay. He appointed Igor Kurchatov, Oppenheimer's counterpart, to run the scientific aspects of the project. Very quickly, on about the same timescale as in the United States, the Soviets assembled the F-1 test reactor, began producing plutonium, learned how to enrich fissile fuel, and in August 1949 detonated their first atomic bomb. Granted, the Soviets had the benefit of knowing that a bomb could be assembled and had access to information via espionage. But the espionage only confirmed that scientists were on the right track; it was not a bomb blueprint. The Soviet bomb was an indigenous effort. The British, French, and Chinese next built nuclear weapons. (The Nazi effort had failed.) By 2010, India, Pakistan, and North Korea had acquired the bomb; Israel almost certainly has nuclear weapons; and Iran and several other nations have pursued advanced uranium fuel processing, a precondition for enriching uranium sufficiently to build a bomb.
Scientists studied nuclear fission and fusion under the pressures of secrecy, world war, and the Cold War. Over seventy-five years they supervised atmospheric and underground testing of thousands of nuclear weapons that spread dangerous radioisotopes with significant human health consequences (roughly 2,000 tests in all, half by the United States); subjected animals and humans to higher-than-appropriate exposures in “tests”; sent soldiers into “ground zero” immediately after detonation without proper safety and monitoring equipment; and approved the haphazard disposal of low- and high-level radioactive waste that has leached into groundwater and spread through ecosystems. They assured policymakers and citizens that reactors were safe even as they built larger and larger units whose failures are now household names (Three Mile Island, Chernobyl, Fukushima).
The human and environmental costs, and the inherent immorality, of nuclear weapons become clearer when one understands how indigenous people bore the brunt of testing and developing atomic bombs. Much of the uranium ore in the United States was located on Indian territories, so the government coerced tribes into granting concessions at low royalty rates, or used existing laws that favored prospectors, to gain access to the ore. Indians also mined the ore; many Navajo miners have significantly higher
lung cancer rates than other populations. The US government also moved the Bikinians off their homeland atoll in the South Pacific to use it as an atomic bomb testing ground. The government promised to move them back but has been unable to do so, decades later, because of persistent radioactive contamination. Similarly, the Soviet Union and France used imperial power to test nuclear devices, in the former case subjecting Kazakhstan and Nenets lands in the Arctic to extensive pollution, while the French exploited and abused the islands and populations of French Polynesia. Many other individuals suffered the severe, yet often incompletely documented or still “secret,” effects of exposure to radiation from testing: the so-called downwinders in Utah, Nevada, and Kazakhstan. Few of these individuals were able to get compensation, let alone answers, even in the 1990s after the Cold War ended.
Most of the other applications for nuclear technology were military as well. The Soviets built over two hundred reactors for submarines. Because of high power density and elimination of the need to carry conventional fuel, nuclear propulsion left more space for cargo and enabled higher speeds and greater distances without refueling. The US and the Soviets spent billions of dollars and rubles on space applications (rocket engines) and nuclear airplanes. These programs were unsuccessful, although various nuclear-powered satellites were launched and perhaps some thirty space-based reactors orbited the earth in 2012. The Soviets also pursued nuclear icebreakers. But the most expensive and extensive application was nuclear warheads, of which there were 68,000 at the peak in 1985, versus roughly 17,000 in nine countries in 2013.
Agriculture, agronomy, and agribusiness: self-augmenting technology
An industrial paradigm extended to agriculture, with each innovation calling forth another, substituting machine power for human power, lowering prices so that demand increased, and triggering still more efforts to mechanize agriculture to lower production costs and increase output. Research scientists developed crops that were amenable to mechanical tending. By the 1950s, this cycle found full expression in the planting of monocultures of cash crops that could be planted, tended, harvested, and transported to markets by machines. Some of these technologies, mostly biological or chemical, aimed at increasing the amount that could be grown with limited inputs (that is, at raising yields per acre), while others, mostly mechanical, aimed at reducing the cost of the resources (mostly labor) needed for production. These technologies had very different implications for social relations, with some “releasing” labor from agriculture and contributing to urbanization, and others affecting the carrying capacity of the earth - the maximum population size that can be sustained indefinitely.
As a first step, during the Industrial Revolution inventors sought to apply advances in power generation, new materials, and machines to agriculture. Many of the early technological innovations originated in Britain and the United States. Such machines as reapers, rakes, mowers, and hay loaders had a mutually reinforcing effect on supply and demand. For example, the cotton gin quickly separated seeds from cotton, enabling more planting, and mechanized spinning made yarn cheaper - and more in demand - which led to more cotton being grown. Milking machines enabled the industrial production of milk. The internal combustion engine found its place in agriculture in the early twentieth century in the tractor and other machines. No longer would land contours prevent the reshaping of farms, nor would forests, stumps, rocks and boulders, streams and rivers create insurmountable obstacles. The increasingly powerful tractor appeared in large numbers, enabling farms to extend to the horizon and almost requiring the planting of monocultures of corn, wheat, soybeans, and other cash crops. The tractor, according to this “technologically determinist” argument, pushed agricultural production beyond the needs of demand to production for the sake of production. Along with other machines, such as bulldozers for clearing land, planters, and harvesters, tractors could operate seemingly without limits. The result by the second half of the twentieth century was mega-farms or agribusinesses. While the United States was a leader in tractor production, the USSR embraced it with verve, for Soviet leaders saw the tractor as the means by which to convert the peasant to socialism.
By the 1920s industrial terms had penetrated agricultural, fishery, and forestry journals, as engineers extended developments in technology to the living world. Humans have understood for centuries that they could produce better crops and farm animals through hybridization, even if they did not comprehend the mechanisms. Scientists learned, first in the laboratory and then commercially, how to manipulate genes through genetic engineering. Most of the effort to apply genetic technology has been in agriculture, driven by large corporations seeking to produce crops less susceptible to pests and unexpected weather. In addition, genetic modification has allowed earlier harvesting, easier shipment, heavier doses of pesticides and herbicides, and faster-growing livestock that offer more meat, milk, and fat. To date, researchers, governments, and corporations have spent billions of dollars on research, commercial development, and regulation, with the European Union taking a lead in regulating genetically modified organisms (GMOs). In the United States, with its relatively weak regulatory framework, many aspects of GMO regulation are voluntary and depend on companies to consult on safety issues or share data with the government. An adjunct of GMOs is the Concentrated Animal Feeding Operation (CAFO), the industrialization of animal production, raising, and slaughter. The industrialization of animal slaughter goes back to the mid-nineteenth century, when cities in the American Midwest pioneered new slaughterhouse and meat-packing technologies that allowed the swift dispatch of several million cattle and hogs every year. CAFOs, developed especially after 1945, brought similar industrial efficiency to the production of livestock. They came to rely on standard, “mass-produced” animals.
CAFOs confine large numbers of animals in close quarters in which feed, manure, sick animals, and disease share space - often metal buildings and pens that restrict the behavior and movement of animals who rarely have access to sunlight or fresh air. Their environmental costs - the spread of vast offal and manure lagoons, antibiotic-resistant bacteria, erosion, and so on - proved difficult to remediate (Fig. 5.5).
Figure 5.5 Dairy cows feed on grain inside a barn of a modern dairy farm in Loganville,
Wisconsin, United States
(© Paul Damien/National Geographic Society/Corbis)
Technology and the state in the twenty-first century
By the twenty-first century, technology was integrally tied to every economy in the world. It had made the world more interconnected through advances in transport and communication. Industrial forms and attitudes had spread to such areas of human activity as agriculture and the geo-engineering of rivers and forests. The pace of technological innovation appeared to have increased in every field, and, because of miniaturization and deft manipulation, it had had its largest recent impacts on data-management devices and on genetics through genetic engineering and GMOs.
In the twentieth century, technological systems became increasingly large scale and tied to state power, and the major expenditures on technology occurred directly or indirectly in the military sphere. The communist Soviet Union, fascist National Socialist Germany, the People's Republic of China, and the democratic United States all supported big science and technology. In the United States, not only the Manhattan Project, but also NASA (the National Aeronautics and Space Administration, established in 1958) and its predecessors grew rapidly on the united force of the private sector, the military, and universities. In Nazi Germany, highways, research institutes, and military research and development united the country's industrialists with modern technology, as had the Kaiser Wilhelm Society institutes earlier in the twentieth century. In the USSR, the entire technological endeavor rested on state-funded research, development, innovation, and production in research institutes of the Academy of Sciences and industrial ministries. Brazil, China, France, Great Britain, India, South Korea - and dozens of other countries - similarly worked to link state support with technological advance for public health, economic, and military purposes.
Finally, the benefits and risks of technology remained a matter of debate. World industrialization, or globalization as it is sometimes called, involved the establishment of factories in countries with unskilled and poorly paid labor forces that produce goods and services more cheaply than in the industrialized nations. Some individuals believe that globalization will inevitably lead to modernization, which they see as inherently good. Others believe that globalization repeats the social displacement and political changes that accompanied the Industrial Revolution, including poor safety conditions and child labor.
Further reading
Bennett, David. Skyscrapers: Form and Function. New York: Simon & Schuster, 1995.
Bijker, Wiebe. Of Bicycles, Bakelites, and Bulbs: Toward a Theory of Sociotechnical Change. Cambridge, MA: MIT Press, 1995.
Binfield, Kevin. Writings of the Luddites. Baltimore, MD, and London: Johns Hopkins University Press, 2004.
Flink, James. The Automobile Age. Cambridge, MA: MIT Press, 1988.
Henkin, David. The Postal Age: The Emergence of Modern Communications in Nineteenth-Century America. University of Chicago Press, 2006.
Hills, Richard. Power from Steam: A History of the Stationary Steam Engine. Cambridge University Press, 1989.
Hounshell, David. From the American System to Mass Production, 1800-1932: The Development of Manufacturing Technology in the United States. Baltimore, MD: Johns Hopkins University Press, 1984.
IG Farben: von Anilin bis Zwangsarbeit: zur Geschichte von BASF, Bayer, Hoechst und anderen deutschen Chemie-Konzernen. Stuttgart: Schmetterling, 1995.
Josephson, Paul. Industrialized Nature: Brute Force Technology and the Transformation of the Natural World. Washington, DC: Island Press, 2002.
Kallinich, Joachim, and Sylvia de Pasquale, eds. Ein offenes Geheimnis: Post- und Telefonkontrolle in der DDR. Heidelberg: Edition Braus, 2002.
Kansteiner, Wulf. “Nazis, viewers and statistics: television history, television audience research and collective memory in West Germany.” Journal of Contemporary History 39:4, Special Issue: Collective Memory (October 2004), 575-598.
Landes, David. The Unbound Prometheus: Technological Change and Industrial Development in Western Europe from 1750 to the Present. Cambridge University Press, 1969.
Marx, Karl. Capital: Critique of Political Economy, trans. Ben Fowkes, 3 vols. [1867, 1895, 1894]; London: Penguin, 1990-1992.
Medvedev, Zhores. The Legacy of Chernobyl. New York: Norton, 1992.
Nilsen, Alf Gunvald. Dispossession and Resistance in India: The River and the Rage. London and New York: Routledge, 2010.
O'Brien, Patrick. Railways and the Economic Development of Western Europe, 1830-1914. New York: St. Martin's Press, 1983.
Peyret, Henry. Histoire des chemins de fer en France et dans le monde. Paris: Société d'Éditions Françaises et Internationales, 1949.
Pomeranz, Kenneth. The Great Divergence: China, Europe, and the Making of the Modern World Economy. Princeton University Press, 2000.
Rees, Jonathan. Refrigeration Nation: A History of Ice, Appliances, and Enterprise in America. Baltimore, MD: Johns Hopkins University Press, 2013.
Rhodes, Richard. The Making of the Atomic Bomb. New York: Simon & Schuster, 1986.
Seel, Peter. Digital Universe: The Global Telecommunication Revolution. Chichester, West Sussex, and Malden, MA: Wiley-Blackwell, 2012.
Tiwari, R. D. Railways in Modern India. Bombay: New Book Company, 1941.
Tressler, Donald, and Clifford Evers. The Freezing Preservation of Foods, 3rd edn. Westport, CT: Avi Publishing Company, 1957.
Tucker, Barbara M. Samuel Slater and the Origins of the American Textile Industry, 1790-1860. Ithaca, NY: Cornell University Press, 1984.
Wilson, A. C. “A thousand years of postal and telecommunications services in Russia.” New Zealand Slavonic Journal (1989-1990), 135-166.
Yamaguchi, Tomiko, and Fumiaki Suda. “Changing social order and the quest for justification: GMO controversies in Japan.” Science, Technology, & Human Values 35:3 (May 2010), 382-407.
Zemin, Jiang. “Water Law of the People's Republic of China (Order of the President No. 74),” August 29, 2002, www.gov.cn/english/laws/2005-10/09/content_753i3.htm.