Writing a conceptual history of QCD is exciting. The materials of the project are not the natural phenomena of nuclear forces, which are inherently meaningless occurrences; nor quasi-independent abstract ideas, in whose causal effectiveness in moving history forward only idealist historians would believe; but human activities, in which meaning changed radically with changes of perspective, and whose long-term significance for physics, metaphysics, and culture in general is to be discerned and interpreted by historians of science.
Historians' interpretations, however, are conditioned and constrained by the wider cultural milieu. Broadly speaking, at the center of contemporary cultural debate on issues related to science, such as objectivity and progress in science or the nature of scientific knowledge and its historical changes, sit two closely related questions. First, can science provide us with objective knowledge of the world? Second, is the evolution of science progressive in nature, in the sense that it involves the accumulation of objective knowledge? The old wisdom that science aims at discovering truths has been seriously challenged by such prominent commentators as Richard Lewontin and Arthur Fine. According to Lewontin (1998), taking science to be a reality-driven enterprise misses the essential point of science as a social activity, namely, its ambiguity and complexity, caused by the socio-historical setting that puts severe constraints on the thought and action of scientists. In his Presidential Address to the Central Division of the American Philosophical Association, Fine (1998) skillfully dissolved the notion of objectivity and reduced it to a democratic procedure, which may contribute to enhancing our trust in the product of scientific endeavor, but has nothing to do with any objective knowledge that product may provide.
This volume is the first part of a larger project that has its origin in conversations with Cecilia Jarlskog and Anders Barany in December 1999, in which the difficulties and confusions in understanding various issues related to the discovery of QCD came to be fully appreciated.
The forthcoming part, titled The Making of QCD, will be a comprehensive historical study of the evolution of conceptions of the strong interactions from the late 1940s to the late 1970s, covering meson theory, Pauli's non-abelian gauge theory, S-matrix theory (from dispersion relations and Regge trajectories to the bootstrap program), current algebra, the dual resonance model and string theory for the strong interactions, QCD, and lattice gauge theory, as well as, briefly, the supersymmetry approach, the D-brane approach, and the string–gauge theory duality approach. This volume, by contrast, is a brief treatment, from a structural realist perspective, of the conceptual development from 1962 to 1972, covering the major advances in the current algebraic approach to QCD, together with a philosophical analysis of that historical movement.
The division of labor between the two parts of the project is as follows. This volume is more philosophically oriented and deals mainly with those conceptual developments within the scope of current algebra and QCD that are philosophically interesting from the perspective of structural realism, while the whole history of the making of QCD, in all its complexity, will be dealt with properly in the longer historical treatise. The two parts will be mutually supportive, with minimal overlap and no repetition.
The notion of scaling conceived and proposed by Bjorken in 1968 played a decisive role in the conceptual development of particle physics. It was a bridge leading from the hypothetical scheme of current algebra and its sum rules to predicted and observable structural patterns of behavior in deep inelastic lepton–nucleon scattering. Both its underlying assumptions and its subsequent interpretations pointed directly to the notion of constituents of hadrons, and its experimental verification posed strong constraints on the construction of theories about the constituents of hadrons and their interactions.
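To fix notation for readers unfamiliar with the subject (a standard textbook statement, not drawn from Bjorken's own papers): the inelastic lepton–nucleon cross section is governed by two structure functions $W_1(\nu, Q^2)$ and $W_2(\nu, Q^2)$, which depend a priori on the energy transfer $\nu$ and the squared momentum transfer $Q^2$ separately. Bjorken's hypothesis was that in the deep inelastic limit, $Q^2 \to \infty$ and $\nu \to \infty$ with their ratio fixed, they collapse to functions of a single dimensionless variable:
\[
\nu\, W_2(\nu, Q^2) \;\to\; F_2(x), \qquad M\, W_1(\nu, Q^2) \;\to\; F_1(x), \qquad x \equiv \frac{Q^2}{2M\nu},
\]
where $M$ is the nucleon mass. The disappearance of any intrinsic mass or length scale from the deep inelastic region is what the term "scaling" refers to, and it is this pattern that the SLAC data approximately confirmed.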
The practical need to analyze the deep inelastic scattering experiments planned and performed at SLAC in the mid to late 1960s provided Bjorken with the general context and major motivation to focus on the deep inelastic kinematic region in his constructive approach to the saturation of sum rules derived from local current algebra. The deep inelastic experiments themselves, however, were not designed to test Bjorken's scaling hypothesis. Rather, the experimenters, when designing and performing their experiments, were ignorant, if not completely unaware, of the esoteric current algebra and all the concepts and issues related to it. They had their own agenda. This being said, the fact remains that the important implications of their experiments would not have been properly understood and appreciated without being interpreted in terms of scaling and the subsequent theoretical developments it triggered.
The experimental confirmation of approximate scaling at SLAC stimulated intensive theoretical activity aimed at conceptualizing the observed short-distance behavior of hadron currents and at developing a self-consistent theory of the strong interactions, starting from and constrained by the observed scaling. At first, the most prominent of these efforts was the parton model, which originated in Bjorken's thinking on deep inelastic scattering and Feynman's speculations on hadron–hadron collisions, and which was made popular by Feynman's influential advocacy.
The parton model's assumption that the short-distance behavior of hadron currents should be described by free field theory, however, was challenged as soon as the scaling results were published. The theoretical framework the challengers used was renormalized perturbation theory. Detailed studies of renormalization effects on the behavior of currents, by Adler and many others, uncovered first the chiral anomaly and then the logarithmic violation of scaling, and thereby reinforced the conviction that the formal manipulations of PCAC and current algebra reasoning, which rendered these physical effects invisible, had serious limitations. The theoretically rigorous argument for scaling violation was soon incorporated by Kenneth G. Wilson, Curtis Callan, and others into the notion of broken scale invariance, which was taken to be the foundation of such approaches as Wilson's operator product expansion and Callan's scaling-law version of the renormalization group equation: approaches for conceptualizing the short-distance behavior of hadron currents, for giving a more detailed picture of that behavior than current algebra could offer, and even for building a general theory of the strong interactions.
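For orientation, the scaling-law version of the renormalization group equation mentioned here can be written schematically, in its simplest massless form and in standard textbook notation rather than that of Callan's original papers, as
\[
\left[ M \frac{\partial}{\partial M} + \beta(g)\, \frac{\partial}{\partial g} + n\, \gamma(g) \right] G^{(n)}(x_1, \dots, x_n; M, g) = 0 ,
\]
where $G^{(n)}$ is an $n$-point Green's function, $M$ the renormalization scale, and $\beta$ and $\gamma$ the functions that record how the coupling and the field normalization shift with that scale. Exact canonical scaling corresponds to $\beta = \gamma = 0$; broken scale invariance is the statement that, in renormalized perturbation theory, these functions do not vanish, and their nonzero values are what generate the logarithmic departures from scaling referred to above.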
Gell-Mann's idea of current algebra as a physical hypothesis was rendered testable by Adler's sum rules. The success of the Adler–Weisberger zero momentum transfer sum rule, which tested the integrated algebra, lent credibility to Gell-Mann's general idea. But what about Adler's nonzero momentum transfer sum rules, which tested local current algebra? When Bjorken raised with Adler the question of how the nonzero momentum transfer sum rules might be satisfied, at the Varenna summer school in July 1967 where both were lecturers, it struck Adler as a very serious problem, since all the conceptual development up to then had left undetermined the mechanism by which the nonforward neutrino sum rules could be saturated at large q². After the Solvay meeting in October 1967, Adler discussed the issue with his mentor Sam Treiman, and “then put the saturation issue aside, both because of the press of other projects and concerns, and a feeling that both shared that how the sum rule might be saturated ‘would be settled by experiment’ ” (Adler, 2003, 2009, private communications).
The response by Adler and Treiman to Bjorken's question seemed natural. What else but experiment could be the ultimate arbiter of a physical hypothesis? However, as we will show, relying solely on experiments missed an opportunity: the exploration of mechanisms for saturating the sum rules could have been used as a way to go beyond the conservative philosophy of the current algebra program, which was to abstract from field theory relations that might hold in general field theories, rather than to search for particular attractive field theory models that might underlie those general relations.
From the realist perspective, and also from that of hypothetico-deductive methodology, the importance of underlying fundamental entities in the theoretical sciences, entities which in most cases are hypothetical, unobservable, or even speculative in nature, such as quarks and gluons in QCD, is clear and understandable. It has deep roots in the human desire for explanation. However, it might be argued that the stress on the importance of underlying entities, which ground a reductive analysis of science, stands in direct opposition to the holistic stance of structuralism. According to this stance, the empirical content of a scientific theory lies in the global correspondence between the theory and the phenomena in the domain under investigation at the structural level, a correspondence that is cashed out in terms of mathematical structures without any reference to the nature of the phenomena, either in terms of their intrinsic properties or in terms of underlying unobservable entities. Thus for structuralists no ontological interpretation of structure would be possible or even desirable. A structuralist such as Bas van Fraassen (1997) would argue that if you insist on interpreting the mathematical structure anyway, then different ontological interpretations would make no difference to science. That is, no ontological interpretation should be taken seriously.
Sir George Biddell Airy (1801–1892) was a prominent mathematician and astronomer. He was an honorary fellow of Trinity College, Cambridge, fellow of the Royal Society and Astronomer Royal from 1835 until 1881. His many achievements include important work on planetary orbits, the calculation of the mean density of the earth and the establishment of the prime meridian at Greenwich. He was also consulted by the government on a wide range of issues and projects, serving on the weights and measures commission, the tidal harbours commission and the railway gauge commission as well as acting as an advisor for the repair of Big Ben and the laying of the Atlantic cable. His autobiography, edited by his son Wilfred, comprises ten chapters and is drawn from the astronomer's own records of the scientific work he carried out at Greenwich Observatory along with his printed reports and private and business correspondence.
Up to modernity, the majority of Christian thinkers presupposed the world of creation to be composed of two parts: the material and the spiritual, existing alongside one another as independent yet interacting realms. In the traditional exegesis of Genesis 1, for example, the creation of light in Genesis 1:3 (“Let there be light”) was interpreted as a spiritual light for spiritual beings in a spiritual world (kosmos noētos), preceding the creation of the corporeal light of the sun in the empirical world (kosmos aisthētikos) in Genesis 1:14.
This two-stock universe lost its plausibility with the advent of classical physics in the seventeenth century, when nature came to be seen as a seamless unity. The scientific intuition of the oneness of the universe, however, was initially combined with a narrow interpretation of the nature of the material. As Isaac Newton (1642–1727) argued in his Opticks, matter is basically atomic: “solid, massy, hard, impenetrable, moveable particles”. According to Newton, these particles, formed in the beginning by God and held together by the mechanical laws of nature, serve the divine purpose of the universe while at the same time being embraced by God, who is ubiquitously at work ordering, shaping, and reshaping the universe. For Newton, mechanism and theism were two sides of the same coin. How else to explain the orderliness of the otherwise errant particles?
The use of informational terms is widespread in molecular and developmental biology. The usage dates back to Weismann. In both protein synthesis and later development, genes are symbols, in that there is no necessary connection between their form (sequence) and their effects. The sequence of a gene has been determined by past natural selection, because of the effects it produces. In biology, the use of informational terms implies intentionality, in that both the form of the signal and the response to it have evolved by selection. Where an engineer sees design, a biologist sees natural selection.
A central idea in contemporary biology is that of information. Developmental biology can be seen as the study of how information in the genome is translated into adult structure, and evolutionary biology of how the information came to be there in the first place. Our excuse for writing a chapter concerning topics as diverse as the origins of genes, of cells, and of language is that all are concerned with the storage and transmission of information.
(Szathmáry and Maynard Smith, 1995)
Let us begin with the notions involved in classical information theory … These concepts do not apply to DNA because they presuppose a genuine information system, which is composed of a coder, a transmitter, a receiver, a decoder, and an information channel in between. No such components are apparent in a chemical system (Apter and Wolpert, 1965). To describe chemical processes with the help of linguistic metaphors such as “transcription” and “translation” does not alter the chemical nature of these processes.[…]
Scientists who speculate on philosophical questions usually agree that classical materialism – the view that reality consists of nothing but small massy particles bumping into one another in an absolute and unique space–time – is intellectually dead. Accounts of the universe now regularly involve notions such as that of manifold space–times, quantum realities that exist at a more ultimate level than, and are very different from, massy particles in one specific space, and informational codes that contain instructions for building complex integrated structures displaying new sorts of emergent property.
What this suggests is that the nature of the reality investigated by physics and biology is much more complex and mysterious than some Newtonian materialists thought (though of course Newton himself was as far from being a materialist as one can get). In particular, the role of information in any account of our universe has come to take on a new importance.
Most contributors to this volume distinguish three main types of information – Shannon information, “shaping” information, and semantic information.
Shannon information is a matter of how to input the maximum amount of information into a closed physical system. It is concerned, it might be said, with quantity rather than quality, in that it totally ignores questions of the significance or function of the information that a physical system might contain. This is a technical matter for information technologists, and I shall not consider it further.
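For reference only, the quantitative core of Shannon's theory can be stated in a single line (in standard notation, not taken from any of the chapters discussed here): a source that emits symbols with probabilities $p_i$ carries on average
\[
H = - \sum_i p_i \log_2 p_i
\]
bits of information per symbol. The measure depends only on the probability distribution, not on what the symbols mean, which is precisely the sense in which it concerns quantity rather than quality.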
A host of surveys indicate that what Christians, and indeed other religious believers, today affirm as ‘real’ fails to generate any conviction among many of those who seek spiritual insight and who continue regretfully as wistful agnostics in relation to the formulations of traditional religions – notably Christianity in Europe, and in intellectual circles in the USA. Many factors contribute to this state of affairs, but one of these, I would suggest, is that the traditional language in which much Christian theology, certainly in its Western form, has been and is cast is so saturated with terms that have a supernatural reference and colour that a culture accustomed to think in naturalistic terms, conditioned by the power and prestige of the natural sciences, finds it increasingly difficult to attribute any plausibility to it. Be that as it may, there is clearly a pressing need to describe the realities that Christian belief wishes to articulate in terms that can make sense to that culture without reducing its content to insignificance.
Correspondingly, there is also a perennial pressure, even among those not given to any form of traditional religiosity, to integrate the understandings of the natural world afforded by the sciences with very real, ‘spiritual’ experiences, which include interactions with other people and awareness of the transcendent.
Both of these pressures in contemporary life accentuate the need to find ways of integrating ‘talk about God’ – that is, theology – with the world view engendered and warranted by the natural sciences.
The most important single issue in the conversation of theology with science is whether and how God acts in or influences the world. Here I shall ask whether the notion of information can help theologians address this question. It is well known that traditional philosophies and theologies intuited a universal “informational” principle running through all things. Their sense that “Mind,” “Wisdom,” or “Logos” inhabits and globally patterns the universe has been repeated in widely different ways time and again: in ancient Greek philosophy, the Wisdom literature of the Hebrew Scriptures, Philo, early Christianity, Stoicism, Hegel, Whitehead, and others. But can the intuition that the universe is the bearer of an overarching meaning – of an informational principle actively present to the entire cosmic process – have any plausibility whatsoever in the age of science?
These days, after all, one must hesitate before connecting the Logos of theology immediately to patterns in nature. The life process as seen through the eyes of evolutionary biologists, to cite the main reason for such reluctance, scarcely seems to be the embodiment of any universal divine principle of meaning or wisdom. Contrary to the picture of cosmic order expressed in much religious thought, evolution involves seemingly endless experimentation with different “forms,” most of which are eventually discarded and replaced by those only accidentally suited to the demands of natural selection.
By the end of the modern period, a particular world view had become firmly entrenched in the public understanding. Unlike most philosophical positions, which are sharply distinguished from scientific theories, this world view was widely seen as a direct implication of science, and even as the sine qua non for all scientific activity. For shorthand, let's call this view “materialism.”
Materialism consisted of five central theses:
(1) Matter is the fundamental constituent of the natural world.
(2) Forces act on matter.
(3) The fundamental material particles or “atoms” – together with the fundamental physical forces, whatever they turn out to be – determine the motion of all objects in nature. Thus materialism entails determinism.
(4) All more complex objects that we encounter in the natural world are aggregates of these fundamental particles, and their motions and behaviors can ultimately be understood in terms of the fundamental physical forces acting on them. Nothing exists that is not the product of these same particles and forces. In particular, there are no uniquely biological forces (vitalism or “entelechies”), no conscious forces (dualism), and no divine forces (what came to be known as supernaturalism). Thus materialism implied the exclusion of dualism, downward causation (Bøgh Andersen et al., 2000), and divine activity.
“I refute it thus!” Samuel Johnson famously dismissed Bishop George Berkeley's argument for the unreality of matter by kicking a large stone (Boswell, 1823). In the light of modern physics, however, Johnson's simple reasoning evaporates. Apparently solid matter is revealed, on closer inspection, to be almost all empty space, and the particles of which matter is composed are themselves ghostly patterns of quantum energy, mere excitations of invisible quantum fields, or possibly vibrating loops of string living in a ten-dimensional space–time (Greene, 1999). The history of physics is one of successive abstractions from daily experience and common sense, into a counterintuitive realm of mathematical forms and relationships, with a link to the stark sense data of human observation that is long and often tortuous. Yet at the end of the day, science is empirical, and our finest theories must be grounded, somehow, “in reality.” But where is reality? Is it in acts of observation of the world made by human and possibly non-human observers? In records stored in computer or laboratory notebooks? In some objective world “out there”? Or in a more abstract location?
THE GROUND OF REALITY
When a physicist performs an experiment, he or she interrogates nature and receives a response that, ultimately, is in the form of discrete bits of information (think of “yes” or “no” binary answers to specific questions), the discreteness implied by the underlying quantum nature of the universe (Zeilinger, 2004).
THE INFORMATION-PROCESSING REVOLUTIONS
It is no secret that we are in the midst of an information-processing revolution based on electronic computers and optical communication systems. This revolution has transformed work, education, and thought, and has affected the life of every person on Earth.
The effect of the digital revolution on humanity as a whole, however, pales when compared with the effect of the previous information-processing revolution: the invention of moveable type. The invention of the printing press was an information-processing revolution of the first magnitude. Moveable type allowed the information in each book, once accessible only to the few people who possessed the book's hand-copied text, to be accessible to thousands or millions of people. The resulting widespread literacy and dissemination of information completely transformed society. Access to the written word empowered individuals not only in their intellectual lives, but in their economic, legal, and religious lives as well.
Similarly, the effect of the printed word is small when compared with the effect of the written word. Writing – the discovery that spoken sounds could be put into correspondence with marks on clay, stone, or paper – was a huge information-processing revolution. The existence of complicated, hierarchical societies with extended division of labor depends crucially on writing. Tax records figure heavily in the earliest cuneiform tablets.
Just as printing is based on writing, writing stems from one of the greatest information-processing revolutions in the history of our planet: the development of the spoken word.