For my PhD research I will be using Galison’s concept of the “trading zone” to describe digital history projects in which historians collaborate with people from other backgrounds. In his book Image & Logic, Galison developed this concept to describe the development of physics from the 1880s to the 1970s, when physicists of the “image” tradition (taking photographs to discover new particles) and physicists of the “logic” tradition (using statistical counts to discover new particles) ended up working together. What interests me, besides his development of the “trading zone” concept, is that the automatisation of work plays a key role in this development: from the 1940s on, the computer starts playing a prominent role, shaping the field of physics. What becomes apparent from reading this book is that the integration of the computer into physics was by no means a natural inclusion, but a process of debate and negotiation over what it meant to “do” physics and what kind of knowledge could be acquired using computers. In this blogpost I’ll briefly touch upon this debate[1], as described in Galison’s work, and consider parallels with the debates in digital humanities (dh). If dh describes a transition to include computers in humanities work[2], maybe we can describe this transition of physics as “digital physics”.[3]
Of course, the integration of computers into physics did not drop from the sky. Before computers, there is the history of instruments in physics, used in experiments to simulate reality or measure effects. As the experiments grew larger over time, especially during WWII with the radar and Manhattan projects, physicists increasingly had to collaborate with engineers, chemists, and others to be able to do their research. Larger projects simultaneously produced more data, with the “image” side of physics generating tens of thousands of photos to be analysed by women ‘computers’. Already in this phase, I see parallels with dh: in these new collaborations, the early papers are written by engineers and technicians, not by physicists (p346). Technical conferences became necessary evils: although physicists wanted to do physics rather than keep discussing technique, these conferences were needed to establish the reliability of new methods (p228). Do we not likewise see in dh that conferences focus on techniques, and papers focus more on describing technology than its application?
And yet, with ever-increasing amounts of produced data (measurements), further automatisation became necessary; rather than tools remaining in the hands of humans, humans became increasingly integrated into the tools, as nicely shown by the picture below. And then came the computer, which began “as an instrument, a tool like other tools around the laboratory” (p692).
At first, the computer replaced the women ‘computers’ who were going through the thousands and thousands of measurements to find the ones of interest. Data reduction played a vital role here, removing noise or uninteresting parts of the measurements. This was followed by automatic analysis, where the remaining parts of interest were ‘interpreted’ by the computer itself to provide the physicist with charts. Finally, the computer took on the role of simulating reality. This progression from measuring to interpreting to simulating did not go unprotested, however, as can be seen from the following quote:
[T]he burden of photographic plenitude was not simply economic: physicists struggled to define the right relation between scientist and data, between physicist and technician, and between human and machine. At stake, the physicists argued, was not simply a matter of new technologies: each solution to the picture problem immediately joined a continuing argument about the boundary of what it meant to be a physicist, a scientist, and a human being. (p370)
Galison describes this debate through two opposing views: 1) the “interactionists”, who held that machines would aid humans but that only humans are capable of the final interpretive processing of data, and 2) the “segregationists”, who held that humans do the preparation to feed the data to the machine, which ultimately ‘takes over’ (p371). Concerns from the interactionist side were epistemological: could entirely new discoveries be made by computers that remove noise before analysis (p407), or are only humans able to make them? Concerns were also social: is a physicist who only sits behind a computer, rather than in a lab, still doing an “experiment” (p735)? Here again I see parallels with dh: are computers able to perform a valid hermeneutical analysis, or does the exceptional get lost in the average of big data?[4] And is one still a historian when doing research entirely behind a computer, never visiting an archive?
And yet, in the competition between physicists for more and faster research, the computer became indispensable. Rather than losing control over the experiment, as feared, Galison argues physicists were strengthened in their control, since they received immediate feedback on results instead of waiting months or years to find out what came out of an experiment. Finally, with physics conquered by the computer, the physicist Lynch described a set of priorities for computer programs; maybe we in dh can learn from these as well:
First priority: computer programs had to give correct answers, and the programs therefore had to be checked by someone other than the author. This demand for correctness was quickly followed by demands for reliability (the programs could not crash) and intelligibility (documentation both external and internal to the code had to be clear and well directed). A fourth priority was the constraint of efficiency; scarce resources both in core memory and CPU time could be swallowed up by an inefficient piece of code, and off-line work, while it needed less core, would still tie up crucial CPU time. Fifth and finally, “The programs must be easy to use. . . . Since we are a group where people use and depend upon the programs of others, considerable attention should be applied to the ‘human engineering’ aspect.” (p525)
The trading zone
As such, Galison describes three stages of the integration of computers in the field of physics (p770). At first, computers served an auxiliary function, purely for supporting the work the physicist was already doing, a stage we have definitely reached in the humanities.
In the second stage, however, a language was developed to exchange problems, theorems, and programs between researchers from different fields such as mathematics, statistics, and physics. Galison describes this stage as a pidgin: a language sufficient for people from different cultures to coordinate certain practices without leaving behind their original culture. In dh, we already see such a pidgin, with scholars in discussion with developers acquiring vocabulary such as “interface”, “user”, and “iteration”, and developers acquiring vocabulary such as “annotation” and “transcription”.[5] Van Zundert considers it debatable whether this constitutes the forming of a methodological pidgin, as there is no methodological influence beyond the coordination; the words are used merely within the discussion. It seems to me this is exactly what Galison aims at with the concept of a pidgin: a language used only in certain interactions, without influencing the culture beyond this interaction. At the very least, this signifies a technological pidgin in dh projects, even if not the methodological pidgin Van Zundert hoped to find.
Finally, in the third stage, the trading zone around the computer is strong enough to become its own field of research, with graduates, conferences, and journals, and a researcher can identify as a “computer specialist” without needing to choose between the other cultures present in the trading zone. Galison describes this stage as a creole: a complete language strong enough to stand on its own.
DH as a distinct field?
Whether dh forms such a creole is a central theme of my PhD. Do historians change their practices to approach research questions in significantly different ways? And do historians talk differently about their research practices? These are questions I hope to answer by studying digital history as a set of trading zones in which historians interact with people from other backgrounds such as computer science, computational linguistics, and information science. While Galison, as a historian, looked back to reflect on how physics changed over a century, the transition of digital history is still very much ongoing, which gives me an exciting opportunity to observe this coordination and negotiation as it progresses.
Notes
[1] Since Image & Logic is an 850-page book, I can in no way summarise it satisfactorily in a blogpost, but I will do my best.
[2] Zaagsma, G. (2013). On Digital History. BMGN – Low Countries Historical Review, 128(4), 3. http://doi.org/10.18352/bmgn-lchr.9344
[3] Not to be confused with the field of physics that describes the universe in terms of information: https://en.wikipedia.org/wiki/Digital_physics
[4] See for a discussion of this: Hitchcock, T. (2014). Big Data, Small Data and Meaning. http://historyonics.blogspot.co.uk/2014/11/big-data-small-data-and-meaning_9.html
[5] van Zundert, J. (2016). Barely Beyond the Book? In Digital Scholarly Editing: Theories and Practices (pp. 83–106). Open Book Publishers. http://doi.org/10.11647/OBP.0095.05