Over the last few decades there have been several calls for a ‘big picture’ of the history of science. The gradual fragmentation – or even dismissal – of older grand narratives, accelerated by the cultural turn, is increasingly seen as problematic. There is a general need for a concise overview of the rise of modern science, with a clear structure allowing for a rough division into periods. Here I would like to propose such a scheme, one that is both elementary and comprehensive. It focuses on particular technical artefacts or machines, which mediated between science and society during successive periods of time. Each of these machines was used as a powerful resource for the understanding of both inorganic and organic nature. More specifically, their metaphorical use helped to construct and refine some key concepts that would play a prominent role in such understanding.
The four machines that modeled nature in the modern era were respectively the mechanical clock, the balance or weighing scales, the steam engine and the computer. All of these machines came to play a highly visible role in Western societies, both socially and economically. The concepts they helped to bring to the fore were matter in motion, force, energy and information. In the course of time the referents of these concepts would even be taken as the ultimate stuff that reality is made of. Moreover, the ties between these machines and nature were strengthened by the fact that these machines would eventually also make their physical entry in scientific research itself. This remarkable pattern has repeated itself roughly every hundred years (a highly convenient frequency) up to the very present.
In the 17th century the mechanical clock, developed during the three preceding centuries, became a powerful metaphor for nature. This metaphorical role was instrumental in the gradual mechanization of the worldview. It was applied to the universe at large as well as its smaller parts. It is no coincidence that Robert Boyle, the man who coined the phrase ‘mechanical philosophy’ and defined it as the attempt to explain all natural phenomena in terms of those ‘two grand and most catholick principles of bodies, matter and motion’, often referred to the famous clock of Strasbourg. In the Cartesian view even the living body was essentially an intricate piece of machinery. Both the universe and the body were no longer held to need a soul to account for their activity. Motion, conserved and transmitted by passive material particles through mutual physical contact, as by the wheels in a clock, did the trick.
Meanwhile, in the hands of mathematicians like Galileo, Huygens and Newton, the concept of motion was transformed, refined and eventually subjected to strict rules or ‘laws’. Crucial in this process was the new conception of inertia, which attributed to motion a kind of autonomy. Motion, once put into the world, no longer needed a continuous source for its persistence. In return, the mechanical clock itself was also improved, above all through Huygens’ invention of the pendulum clock. By further increasing its precision the clock was eventually turned into an indispensable astronomical instrument, the chronometer.
Clock towers, which regulated markets, jurisdiction, administration and everyday life, were not the only conspicuous architectural elements in early modern towns. In their wake, as trade increased, many towns, especially those in Northern Europe, were provided with impressive weighing houses, as shrines to the balances kept inside. In the 18th century, the balance, a time-honoured symbol of justice, gradually came to model new views of nature and society alike. Whereas the clock, consisting of passive wheels driven by a single prime mover (be it a weight or a spring), nicely symbolized the blessings of absolutism, the balance suggested the need for a balance of powers, such as that proposed by Montesquieu.
The use of the balance as a model for understanding nature was based upon – and in turn reinforced by – a static view of nature. The metaphorical use of the balance is perhaps best exemplified by the still prevalent notion of the ‘balance of nature’. Although this concept was rooted in antiquity, it became far more popular in the eighteenth century, partly as a result of the flourishing of the new genre of natural theology. In the late eighteenth century this notion also took on more specific and secular forms, such as Laplace’s work on the stability of the solar system, Cuvier’s notion of the functional interdependence of the different parts of the organism, Condorcet’s view of algebra as basically a balancing of quantities, Lavoisier’s view of chemical reactions as a balancing of equivalent weights of chemical substances, and Adam Smith’s balance of supply and demand.
But the model was also applied to nature in a more specific sense, namely that of a balance of powers or ‘forces’. These natural forces, whether physical or vital forces, were increasingly seen as irreducible categories. It was Newton himself who had first put a spoke in the wheel of the mechanical philosophy. His attractive force of universal gravitation simply defied all attempts at mechanical explanation. In its wake non-mechanical forces rapidly multiplied. Late 18th-century philosophers distinguished electric and magnetic forces, as well as several short range attractive and repulsive forces, responsible for cohesion and adhesion, chemical reactions, as well as optical and thermal phenomena. Some philosophers, notably Kant, even suggested that at the deepest level, matter itself was essentially a complex of attractive and repulsive forces, thus making ‘force’ the ultimate building block of nature.
In the late 18th century balances were drastically improved and new kinds of balances introduced, such as those based on springs rather than on counterbalances. Their prominent role as measuring instruments in Lavoisier’s anti-phlogistic revolution has been well documented. In experimental physics another new kind of balance, the torsion balance, enabled Charles-Augustin Coulomb to determine the laws of electric and magnetic forces, and Henry Cavendish to perform no less a feat than the weighing of the entire earth.
As the French and industrial revolutions spelled the end of the static social order of the ancien régime, they also marked the downfall of the static, balanced view of the natural order. During the ‘age of revolutions’, the balance was replaced by the steam engine as the dominant metaphor. In a series of groundbreaking papers, Norton Wise has analysed the transition to a more dynamic world picture in an industrial context, focusing on the mediating role of the steam engine. As he made clear, historicist notions of change and transformation manifested themselves not merely in the earth and life sciences, but also in the physical sciences. Not only did the steam engine produce motion, it did so by transforming the chemical powers of coal into both heat and work. The rational analysis of the steam engine eventually gave rise to the new science of thermodynamics, based upon two fundamental laws, the second of which came to underline the irreversibility inherent in nature’s workings.
Another, even more important offshoot was the new concept of energy, connected to the corresponding conservation law and gradually understood as the capacity to perform work. This capacity was conceptually transferred from machines to nature itself. In the second half of the 19th century, energy became the central concept in the emerging discipline of physics. And as it was applied to a broadening range of phenomena its definition and interpretation were gradually refined. From the very beginning, moreover, energy conservation played a vital role in the attempts of the Berlin physiologists to transform the conception of life, especially through the elimination of vital forces. Like the steam engine, an organism transforms chemical energy (food) into heat and work. As these different forms of energy exactly match one another, the law of energy conservation leaves no room for intervening ‘vital’ forces. Chemistry and physics thus rule the living world as much as that of inorganic nature.
In the late 19th century prominent scientists such as Wilhelm Ostwald and Georg Helm developed a program, ‘energetics’, that aimed to reduce all natural phenomena to transformations of different forms of energy. Neither force, nor matter, but only energy was accepted as the sole ‘real’ substance. At the same time, energy also entered the mental world. The repercussions of energy conservation for the notion of free will were widely discussed. And in the wake of efforts of the Berlin physiologists, one of their former pupils, Sigmund Freud, created a new field of ‘psychodynamics’. He explicitly modeled human consciousness (including the subconscious) on the steam engine, connecting the ‘ego’, the ‘id’ and the ‘superego’ to distinct parts of the machine. Hence our need at times of high pressure to ‘let off steam’.
The steam engine was also housed in an architectural novelty, the modern factory, which radically changed the appearance of the modern industrial town. During the second half of the 19th century similar structures also emerged at unlikely places: institutes of higher education. New laboratories radically changed the nature and role of universities. Academic research now also assumed an almost industrial guise. Scientific research gave rise to the improvement of existing engines and the invention of new ones, such as the electromotor. Around the turn of the century several kinds of engines made their appearance in experimental set-ups, thereby changing the nature of experiment. Kamerlingh Onnes’ low temperature facility with its industrial overtones is a classic example of this development. The rise of industrial laboratories, finally, sealed the relationship between natural science and industrial society.
But the times, they keep on changing. The second half of the 20th century was marked by a gradual shift from an industrial to a post-industrial or information society. By now the fastest growing industries, the Googles, Facebooks and Alibabas, no longer produce material products, but rather information or information networks. The machine which processes such information, the computer, has become the leading metaphor for nature, the living world and the human mind. The study of communication networks and computers during the 1930s and 1940s gave rise to the new concept of information and the emergence of information theory. The Second World War itself did much to intensify research on the communication, coding and decoding of information, as did the subsequent Cold War. The postwar interdisciplinary field of cybernetics explicitly looked for the underlying principles of such complex systems as society, life, the brain and, indeed, the computer, once again linking the technical and the social with the natural. Information was a key notion in this program. It was seen to be fundamental, and irreducible. As the father of cybernetics Norbert Wiener put it: ‘Information is information, not matter or energy’.
In the 1960s biologists gradually came to understand the nature of life in terms of information, encoded in large molecules (DNA), and translated through messenger RNA into the amino acids that make up proteins. The Human Genome Project, the world’s largest collaborative project in the life sciences, resulted in the mapping and identification of the roughly 20,000 genes of the human instruction set. The development of software tools and other computer-related methods for analysing and understanding such biological data, so-called bioinformatics, has now become an integral part of the life sciences. In a similar vein neuroscientists and those working on artificial intelligence struggle to understand the workings of the human mind in terms of exchanges of information between tens of billions of neurons. The Human Brain Project aims to build a supercomputer able to mimic brain processes so as to get a better grasp of who we are.
During the last three decades physicists, following suggestions from John Archibald Wheeler, have increasingly tended to see space, time and the four basic forces as emergent properties arising out of the distributions of information at the so-called Planck scale. In this program information, rather than matter, energy or force, is seen as the stuff that ultimately makes up our world, the universe being pictured, in the words of Gerard ‘t Hooft, as a large machine that processes information. The so-called holographic principle, pioneered by ‘t Hooft, was an important step in this direction. It suggests that our three-dimensional world, like the visible holographic image, emerges from a lower-dimensional reality, as with the holographic plate, in which all the required information is stored. A further step in this direction is Eric Verlinde’s recent theory of gravity as an emergent, entropic force. Here entropy is understood in terms of information.
Finally the last decades have also seen an increasing role of computers in scientific experimentation. As Peter Galison has shown in the case of high energy physics the processing of data has become a part of the experiment itself and the computer has become an indispensable part of the measuring instrument. In many experiments data are no longer first registered and then processed, but they are fed immediately into the computer. This new use of computers has once again changed experimental practices to a considerable extent.
This, then, is the broad picture. If this scheme makes any sense at all, the question remains how we should account for the striking role of these machines in modelling several aspects of nature. As yet, I do not have an adequate answer to this question. But I would like to suggest that among the several cultural resources available to those aiming to deepen our understanding of nature, technological artefacts are more powerful than most other cultural products in one important sense. They are more rapidly embraced by outsiders and therefore they spread much more easily. Knowledge likewise has to be spread in order to become certified knowledge. This latter process may be eased by the mobility of the technology that has inspired and facilitated such knowledge.
So far the best known attempt at a big picture is Ways of Knowing by the late John Pickstone. Pickstone identified several elementary ‘ways of knowing’, such as ‘natural history’, ‘analysis’, ‘synthesis’ and ‘experiment’, which he used to organize his narrative. My scheme differs from that of Pickstone in several ways. It is both bolder and simpler. For one, it is based upon ontological categories rather than methodological ones. It focuses on nature’s fundamental principles and substances as perceived in successive periods, rather than on ‘ways of knowing’. In spite of its ‘philosophical’ nature, it is more firmly embedded in socio-economic realities. Above all, it allows for a convenient periodization.
M.N. Wise (with the collaboration of C. Smith), ‘Work and Waste: Political Economy and Natural Philosophy in Nineteenth Century Britain’, History of Science 27 (1989), pp. 263-301, 391-449; 28 (1990), pp. 221-262.
N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, 2nd ed. (MIT Press: Cambridge, Mass. 1961), p. 132.
J. Pickstone, Ways of Knowing: A New History of Science, Technology and Medicine (University of Chicago Press: Chicago 2001).
Frans van Lunteren studied physics at Utrecht University and wrote a doctoral dissertation on gravitational theories from Newton to Einstein. He now works as a historian of science at both the VU University of Amsterdam and Leiden University.