Science and technology are an integral part of our society. But if you look at the whole of history, it is quite extraordinary that science has come to play such an important role for us. Historian of science Lorraine Daston, director emerita of the Max Planck Institute for the History of Science in Berlin, examines how this development was possible. In 2020, she received the A.H. Heineken Prize for History for her work.
Our society is steeped in science and technology. Thanks to our many years of investment, scientific research is flourishing, and we are reaping plenty of benefits – directly and indirectly. For example, governments make many of their decisions based on scientific evidence. During the pandemic, we saw this more clearly than ever before. The technological marvel in our pocket – the smartphone – is here thanks to science, and many centuries of medical research have brought healthcare to the high level it is at today.
But it could just as easily have been different. If you look at history, you will see that every society has some degree of interest in its natural environment. But a society that structurally invests large amounts of money and manpower in a systematic study of that environment is a historical rarity, as historian of science Lorraine Daston discovered, rare across both historical eras and cultures.
She researches how this rarity came to be: what conditions are needed to create a society in which science flourishes? In her study at the Max Planck Institute for the History of Science in Berlin, all the walls are covered with bookcases from floor to ceiling. ‘I surround myself with books, and read, and read, and read,’ Daston says. ‘I take copious amounts of notes and sort them by theme, a process I sometimes compare to chemical distillation, taking notes upon notes to locate the essential. At a certain moment, patterns reveal themselves to me, as well as a lot of new questions. With those questions, I delve back into the books.’
This led Daston to the discovery that there are two main conditions for richly flourishing science. ‘First, the prevailing view in society must be that science is so important that society is willing to invest heavily in research itself,’ Daston says. ‘But not only that: the career path of a scientist must also have cultural value: it must have a certain glamour factor. In every era, there are certain professions that are highly valued by society. These career paths attract the best and brightest. A second key requirement is that scientists be given sufficient autonomy and freedom. There must be institutionalisation, an organised system that protects researchers from the pressure to be immediately useful and generate economic value. Science must have a certain critical mass to bring about such institutionalisation. This institutionalisation is needed to protect the fragile ecosystem of science.’
Both conditions are currently under pressure. In highly developed countries in particular, fewer and fewer people aspire to careers in science. ‘This is a signal that the cultural standing of science is declining,’ says Daston. ‘Universities are also in danger of being overwhelmed by the demand for immediate usability and economic value. Historically, that’s not a sign of the flourishing of science. It’s a very fragile ecosystem that we need to protect.’
Bedrock layer of science
Daston also delved deeper into this ecosystem and examined the extent to which the way scientific research is done has changed throughout history. She has identified three different timelines, each with its own pace of change. ‘The first is the fastest,’ Daston begins. ‘In musical terms, you could compare the first tempo to allegretto. This is the speed of the latest empirical results, which follow each other in rapid succession. Each new issue of scientific journals like Nature or Science is filled with these hot-off-the-press scientific developments. Then there is a slightly slower timeline – andante in musical terms. This timeline consists of the various frameworks that make sense of these empirical results. An example of such a framework would be the theory of relativity or the theory of evolution. Such frameworks are not perpetual but remain in place for decades or sometimes hundreds of years.’
Daston is most interested in the third, deepest level. ‘This is the basso continuo of science – the slow evolution of fundamental practices and epistemic virtues. Examples of practices include the controlled experiment, which first appeared in the late seventeenth century; statistical surveys, a product of the nineteenth century; or computer simulations, which we have used since the late twentieth century. Once such a method is introduced, it is never lost. So, this “bedrock layer of science” forms a very slow accumulation of new ways of gaining knowledge.’
In general, changes on this timeline take place very slowly, but sometimes a new method appears suddenly. ‘An example is the “volcanic eruption” of probability theory in the mid-seventeenth century,’ says Daston. ‘People have been playing games of chance since time immemorial; there is archaeological evidence that the oldest civilisations were already using some kind of dice. But the idea of a mathematics of probability suddenly emerged in the mid-seventeenth century. And that eruption still forms the basis for the probability theory that we use today.’
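That eruption can be illustrated with the kind of dice question that drove it. A classic mid-seventeenth-century problem, of the sort Pascal and Fermat corresponded about, asks for the chance of throwing at least one six in four throws of a fair die. (The sketch below is an illustration added here, not part of Daston’s account.)

```python
from fractions import Fraction

# Probability of no six in a single throw of a fair die.
p_no_six_once = Fraction(5, 6)

# The four throws are independent, so the 'failure' probabilities multiply.
p_no_six_in_four = p_no_six_once ** 4      # 625/1296

# 'At least one six' is the complement of 'no six at all'.
p_at_least_one_six = 1 - p_no_six_in_four  # 671/1296

print(p_at_least_one_six, float(p_at_least_one_six))
```

The answer, 671/1296 ≈ 0.518, is just slightly better than even odds, the kind of small edge seventeenth-century gamblers cared about; reasoning of exactly this kind, applied systematically, is what crystallised into probability theory.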
If you look purely at the bedrock layer of science, you can see beyond the boundaries of disciplines. For example, statistics began in the social sciences, as a method of describing populations, but then migrated to other sciences, such as the physics of gas molecules. ‘When Scottish physicist James Clerk Maxwell developed the theory of the ideal gas, he explicitly drew the analogy with a population in which individuals make contingent decisions,’ says Daston. ‘If you take those individuals together, you see a normal distribution: the familiar bell-shaped probability distribution. He compared these individual decisions of human beings to the velocities of the molecules in an ideal gas.’ Even today, there is a great deal of migration of new methods between different fields. Techniques from artificial intelligence, for example, can be applied in many different situations, as long as you use the right training sets: scientists use similar techniques to discover patterns in weather data and to study whether plays were written by Shakespeare or someone else.
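The statistical analogy Daston describes can be made concrete with a minimal simulation (an illustrative sketch with invented numbers, not Maxwell’s own derivation): aggregate many independent yes/no ‘decisions’ and the familiar bell-shaped distribution emerges.

```python
import random
import statistics

random.seed(42)  # reproducible demonstration

N_INDIVIDUALS = 1000   # independent yes/no 'decisions' per observation
N_OBSERVATIONS = 2000  # number of aggregates we collect

# Each total counts the 'yes' decisions among 1000 individuals.
totals = [
    sum(random.random() < 0.5 for _ in range(N_INDIVIDUALS))
    for _ in range(N_OBSERVATIONS)
]

mean = statistics.mean(totals)
stdev = statistics.stdev(totals)

# Binomial theory predicts mean = n*p = 500 and
# stdev = sqrt(n*p*(1-p)) ≈ 15.8; the simulated aggregates
# cluster around those values in the familiar bell shape.
print(mean, stdev)
```

The same mathematics describes molecular velocities in an ideal gas, which is precisely why the method could migrate from the social sciences into physics.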
Emergence of photography
‘My research is constantly surprising me,’ says Daston. ‘For example, when I realised that concepts like objectivity or observation have a relatively recent history.’ Daston’s research, conducted with fellow historian of science Peter Galison, found that ‘objectivity’ as a virtue in science did not make an appearance until the nineteenth century. Before that, virtues such as ‘certainty’ and ‘accuracy’ played the major role. The emergence of the pursuit of objectivity coincided, not entirely by chance, with the emergence of photography. Some scientific disciplines saw in the photograph the ultimate way to restrain their subjective projections onto their research subjects. ‘Objectivity is a way of trying to eliminate errors,’ Daston explains. ‘In particular, the errors that result from a scientist projecting his or her fantasies and wishful thinking onto nature.’ In disciplines such as anatomy or embryology, for example, scientists exchanged beautifully detailed colour drawings for blurred black-and-white photographs. Interestingly, here you can see directly that a virtue like objectivity can clash with other virtues, such as ‘accuracy’: a blurred black-and-white photograph may be objective, but for accuracy it cannot compete with a carefully executed colour drawing of an organ or a stage of embryonic development.
Photographs became more and more accurate, but this did not resolve a more fundamental clash: objectivity can also clash with the virtue of ‘truth’. ‘Truth-to-nature’ concerns the underlying truths of nature, not the properties of specific, often variable manifestations. In the eighteenth century, for example, botanists such as Carl Linnaeus depicted in their drawings not one particular flower or plant, male or female, at a particular stage of development, but an idealised version: they were looking for the general characteristics of the species. Botanists worked with photography briefly in the nineteenth century, only to turn away from it very quickly, because it was useless for their purposes. ‘Drawings are still often used in field guides to birds or plants for this reason,’ Daston says. ‘Biologists consider the “truth-to-nature” that a drawing gives more important than the objectivity of a photograph.’
Daston recently completed a study on the history of rules, in the broadest sense of the word. ‘I looked at the history of rules in cookbooks, computer algorithms, manuals for warfare, game rules, traffic rules, dress codes, spelling rules, laws of nature, and so on.’ Daston sought to discover underlying patterns: were there general developments on a large time scale in what a rule meant, or how it was formulated? She discovered that you can distinguish three categories of rules: rules as models, as precisely formulated laws, and as algorithms. We are still familiar with the latter two today. But rules as models were by far the most important until about the year 1800, and then largely disappeared from the scene. Such a model was an example that you could imitate but did not have to copy exactly – in much the way that role models might serve as a guide to conduct. Today, the vast majority of rules are formulated much more precisely, as ‘laws’ or ‘algorithms’.
Daston also looked at the history of exceptions to rules. ‘The interesting thing about exceptions,’ she explains, ‘is whether or not they are part of the rule. Until about 1800, rules were always formulated with a huge number of examples, exceptions, and appeals to experience and context. In a cookbook, for example, the authors tell you what to adjust if you are cooking at high altitude, or if you are using a different kind of oven. Or they make suggestions for how to replace one ingredient with another. In short, they include in the rules the fact that the world is surprising, that there are always unforeseen circumstances. But that flexibility begins to disappear after about 1800, when at least certain oases of stability emerge as a result of standardisation and globalisation. This is a world in which planning can extend months or even years into the future, in which “just-in-time supply chains” work and trains and planes are punctual.’
But how did this change come about? ‘In parts of the world, “pockets” of orderliness emerged. First, in certain cities, with Amsterdam as a shining example. By the eighteenth century, Amsterdam had become the perfect example of an orderly city in the minds of Europeans. Travellers admired its canals, waste disposal system, uniform house façades, and streetlights. Cities throughout Europe copied this model of orderliness. It took a century for this vision of urban order to be more widely realised, but because of this, around 1800, the utopian vision slowly emerged that perhaps all of society could be organised in this way.’ This was also increasingly reflected in the formulation of rules: people dared to leave out all the different circumstances and exceptions.
Knowing the history of science is not only important for historians, but for everyone. We saw this, for example, during the Covid pandemic. ‘It’s very disorienting for citizens, but also for policy makers, to see scientists change their minds,’ says Daston. ‘This is part of the normal and necessary process of science: in the light of new evidence, scientists modify their theories. But because of the limited public understanding of how science works, such changes often confused people and could be used to undermine trust in science. The history of science shows that gathering new evidence and modifying conclusions on that basis is the necessary condition for scientific progress, not a flaw. This is one of the main messages to emerge from my research: the concept of an eternally unchanging truth, which we know from philosophy and religion, is not suitable for science. Science requires a dynamic idea of truth. Otherwise, there would be no scientific progress.’
Lorraine Daston (East Lansing, United States, 1951) studied history and philosophy of science at Cambridge University in the United Kingdom and Harvard University in the United States, where she received her doctorate in the history of science. She subsequently held positions at Princeton University, the Georg-August-Universität in Göttingen, and the University of Chicago, among others. From 1995 until her retirement in 2019, she was director of the Max Planck Institute for the History of Science in Berlin, and she remains affiliated with the University of Chicago. She has written many books about her work, including (with Peter Galison) Objectivity (2007), Against Nature (2019), and Rules: A Short History of What We Live By (2022). In addition to the A.H. Heineken Prize for History, Daston has been awarded the George Sarton Medal from the History of Science Society, the Dan David Prize, and, twice, the Pfizer Prize from the History of Science Society.
Lorraine Daston studies the historical conditions under which science flourishes in a society and the long-term development of forms of scientific rationality, such as probabilistic thinking, systematic observation, natural laws, and epistemic virtues such as truth-to-nature and objectivity. Her work translates abstract-sounding concepts into the concrete practices of doing science: making an image, conducting a statistical test, tabulating observations. She has published on a wide range of topics, from the history of probability to the wonders of nature; her most recent work has been on the moral authority of nature, the history of rules, and the origins of the scientific community.