By
Giacomo Morpurgo, Born Florence, Italy, 1927; Laurea, 1948 (physics), University of Rome; Professor of Physics, University of Genoa; elementary particle physics (theory and experiment) and nuclear physics (theory).
Speaking of the birth, in 1969, of the parton model, David Gross wrote: “From then on I was always convinced of the reality of the quarks, not just as mnemonic devices for summarizing hadron symmetries, that they were then universally regarded to be, but as physical pointlike constituents of the nucleon” (italics mine). In a letter of reply (note 1) I noted that while it is hard to predict how the notion of quarks will evolve, it is certain that, already since 1965, their most productive description was a realistic one.
In a review article about the discovery of quarks, Michael Riordan stated: “After several years of fruitless searches most particle physicists agreed that although quarks might be useful mathematical constructs, they had no innate physical reality as objects of experience.” Again I disagree. For many people trying to understand the remarkable developments of hadron spectroscopy, the quarks of the nonrelativistic quark model (NRQM) were, already five years before partons, not mathematical constructs or mnemonic devices but something very real. I started a long experiment (from 1965 to 1982) to search for real free quarks because of the quantitative results (well beyond group theory) that I had obtained with the NRQM.
Of course in that period many theorists did not like the NRQM. As one example, at Vienna in 1968 my rapporteur talk (note 6) on the NRQM was inserted in the session on “Current Algebra.”
By
Leonard Susskind, Born New York City, 1940; Ph.D., 1965 (physics), Cornell University; Professor of Physics at Stanford University; high-energy physics (theory).
In this chapter I present a personal reminiscence of the development of our current ideas about quark confinement. I describe what I remember of my own involvement and that of the people who influenced me. If others remember it differently, I hope they will not be too angry.
By the end of the 1960s our empirical knowledge of hadrons consisted of a vast mountain of data about their spectrum, their low- and high-energy interactions, and their electromagnetic and weak properties. To some extent the story of the eventual interpretation in terms of QCD was like digging a tunnel through the mountain with crews of diggers starting independently at the two ends. At one end was the short-distance behavior of local currents and its interpretation in terms of freely moving quark-parton constituents. At the other end was the low-momentum-transfer Regge structure, including a spectrum of highly excited rotational states, shrinking diffraction peaks, and multihadron final states of peripheral collisions, but no free quarks. Sometime in 1973 the two tunnel crews discovered that they had met, and a complete picture of the strong interactions existed. Of course the two crews were not entirely unaware of each other. The Regge workers were beginning to organize the trajectories by quantum numbers suggested by the quark model. Eventually, the Regge picture culminated in 1968 with a set of scattering amplitudes based on the duality principle of R. Dolen, D. Horn, and C. Schmid.
In the spring of 1955 in Moscow there was a small conference on QED and elementary particle theory that took place at the Lebedev Physical Institute from March 31 through April 7. Among the participants were a few foreigners, including Ning Hu and Gunnar Källen. I remember it quite well, as it was my first conference on quantum field theory (QFT) problems with scientists from abroad. My short contribution concerned finite Dyson transformations for renormalized Green's functions and matrix elements in QED.
The central point of the conference was Lev Davydovich Landau's review talk “Basic Problems of Quantum Field Theory,” devoted to the ultraviolet behavior in local QFT. The point is that a few months earlier, the problem of short-distance behavior in QED had been successfully attacked by Landau and his brilliant pupils Alesha Abrikosov and Isaak Khalatnikov. They managed to find a closed approximation to the Schwinger–Dyson equations for the two propagators and the three-vertex function that was compatible with renormalizability and gauge invariance. Moreover, this so-called three-gammas approximation admitted a solution in the massless limit that, in modern terms, was equivalent to the summation of leading ultraviolet logarithms.
This solution had a peculiar feature, controversial from the physical point of view, that attracted attention and excited the imagination: the “ghost pole” in the renormalized photon propagator amplitude, or the Moscow-zero puzzle in the formal expression for the “physical” electron charge.
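In modern notation (a standard leading-logarithm summary, added here for illustration; it is not quoted from the original talk), the summation of the leading ultraviolet logarithms yields the effective coupling

\[
\bar{\alpha}(Q^2) \;=\; \frac{\alpha}{1 - \dfrac{\alpha}{3\pi}\,\ln\dfrac{Q^2}{m^2}},
\]

where α is the fine-structure constant and m the electron mass. The denominator vanishes at the enormous momentum scale Q² = m² exp(3π/α), producing the ghost pole; read the other way, demanding that the theory hold up to an infinite cutoff drives the renormalized charge to zero, the “Moscow zero.”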
By
Harry Lipkin, Born New York City, 1921; Ph.D., 1950 (physics), Princeton University; Professor of Physics at the Weizmann Institute of Science, Israel; School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Tel Aviv, Israel; high-energy physics (theory).
I begin with a tribute to a great physicist who taught me how to think about quarks and physics in general, John Bardeen. A few sentences from John could often teach you more and give you deeper insight than ten hours of lectures from almost anyone else. In 1966, when I began to take quarks seriously, I was unknowingly thinking about them in the language I had learned from John during two years at the University of Illinois: as quasi-particle degrees of freedom describing the low-lying elementary excitations of hadronic matter. Unfortunately, I did not realize how much my own thinking had been influenced by John Bardeen until he was gone. I dedicate this paper to his memory.
Were quarks real? Quarks as real as Cooper pairs would have been enough. Quarks leading to anything remotely approaching the exciting physics of the BCS theory would have been more than enough. John always emphasized that Cooper pairs were not bosons, and that superconductivity was not Bose condensation. The physics was all in the difference between Cooper pairs and bosons. I was not disturbed when quarks did not behave according to the establishment criteria for particles. The physics might all be in the difference between quarks and normal particles. One had to explore the physics and see where the quark model led.
The arguments of the BCS critics that the theory was not gauge invariant did not disturb John; he knew where the right physics was.
By
Nicholas Samios, Born New York City, 1932; Ph.D., 1957 (physics), Columbia University; Director, Brookhaven National Laboratory; high-energy particle physics (experimental).
The era of studying particle resonance production in the mesonic and baryonic domain was truly exciting and productive. As one looks back, the most important findings occurred in a relatively short time period – roughly 1958–1964, with the preliminaries in the 1950s and lots of details in the 1970s and 1980s. This period of intense activity had many characteristics, among which are the following:
1. Accelerators came into their own. Productive work had previously been done with cosmic rays, but now came the Cosmotron, Bevatron, AGS, and PS machines, all contributing important physics results.
2. There was strong interplay between experiment and theory. Global symmetry, the Sakata model, the Pais–Piccioni conjecture, the Treiman–Yang angle, the Jackson angle, the Lee–Yang inequalities (and of course, the Gell-Mann–Nishijima formula and the Gell-Mann–Okubo mass formula), all attest to this close relationship.
3. The early experimental results – even with low statistics – were usually correct. As you will see, the discovery of the ρ, K*, φ, and η just popped out. On the other hand, one had to use some caution, for some of the early indications could be misleading, a case in point being the τ spin-parity, where Robert Oppenheimer cautioned Jay Orear not to bet on horses.
4. As data accumulated, a few incorrect results emerged – some of a major nature, which required large efforts in time and money to correct.
I begin by discussing the Barkas Table, the November 1957 version. It is worth noting that this earliest of compilations is very short – 16 entries.
By
Gerson Goldhaber, Born Chemnitz, Germany, 1924; Ph.D., 1950 (physics), University of Wisconsin; Professor in the Graduate School, Physics Department and at the Lawrence Berkeley Laboratory and Center for Particle Astrophysics, University of California at Berkeley; high-energy physics (experimental).
As I look back at the first three years or so at SPEAR, I consider this one of the most revolutionary, or perhaps the most revolutionary, experiments in the history of particle physics. It was certainly the most exciting time – in a laboratory, that is – that I have ever experienced. In this chapter I discuss the period 1973–76, which saw the discoveries of the ψ and ψ′ resonances, the χ states and most of the psion spectroscopy, the D⁰, D⁺ charmed-meson doublet, and the D*⁰ and D*⁺ doublet. I will also refer briefly to some more recent results.
Most of these discoveries were made with the SLAC-LBL Magnetic Detector – or, as it later became known, the MARK I – that we operated at SPEAR from 1973 to 1976. The groups involved in this work were led by Burton Richter and Martin Perl of SLAC and by William Chinowsky, Gerson Goldhaber, and George Trilling of LBL.
The discovery of the ψ
Some of my personal reminiscences regarding the weekend of the ψ discovery have already been published, and I will only allude to them briefly here.
Our first task was to learn how our detector behaved in the SPEAR environment. For this purpose we developed two independent analysis systems, one at LBL and the other at SLAC. The overall data acquisition was due to Martin Breidenbach.
By
Jerome Friedman, Born Chicago, Illinois, 1930; Ph.D., 1956 (physics), University of Chicago; Professor of Physics, Massachusetts Institute of Technology; Nobel Prize in Physics, 1990; high-energy physics (experimental).
In 1961 Murray Gell-Mann and Yuval Ne'eman independently introduced a classification scheme, based on SU(3) symmetry, that placed hadrons into families on the basis of spin and parity. Like the periodic table for the elements, this scheme had predictive as well as descriptive powers. Hadrons that were predicted within this framework, such as the Ω⁻, were later discovered.
In 1964 Gell-Mann and George Zweig independently proposed quarks as the building blocks of hadrons, as a way of generating the SU(3) classification scheme. When the quark model was first proposed, it postulated three types of quarks – up (u), down (d), and strange (s) – having charges +⅔, −⅓, and −⅓, respectively; each of these was hypothesized to be a spin-½ particle. In this model the nucleon (and all other baryons) is made up of three quarks, and each meson consists of a quark and an antiquark. For example, as the proton and neutron both have zero strangeness, they are (u,u,d) and (d,d,u) systems, respectively. Though the quark model provided the best available tool for understanding the properties of the hadrons that had been discovered at the time, it was thought by many to be merely a mathematical representation of some deeper dynamics, albeit one of heuristic value.
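As a quick check of these assignments (an illustrative calculation, not part of the original text), the quark charges add up to the observed nucleon charges:

\[
Q_p = Q_u + Q_u + Q_d = \tfrac{2}{3} + \tfrac{2}{3} - \tfrac{1}{3} = 1,
\qquad
Q_n = Q_d + Q_d + Q_u = -\tfrac{1}{3} - \tfrac{1}{3} + \tfrac{2}{3} = 0,
\]

in units of the proton charge, while the quark–antiquark composition of mesons likewise guarantees integer meson charges.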
By
Gerard 't Hooft, Born Den Helder, The Netherlands, 1946; Ph.D., 1972 (physics), University of Utrecht; Professor of Physics at the Institute for Theoretical Physics, University of Utrecht; high-energy physics (theory).
Like most other presentations by scientists in this Symposium, my account of the most important developments that led toward our present view of the fundamental interactions among elementary particles is a personal one, recounting discoveries I was just about to make when someone else beat me to it. But there is also something else I wish to emphasize: the dominant position that theory, in its relation to experiment, has reoccupied during the last two decades. In particular, quantum field theory not only fully regained respectability but has become absolutely essential for understanding those basic facts now commonly known as the “Standard Model.” So much happened here, so many discoveries were made, that the space allotted to theory in this volume falls far short of covering it all. Therefore, I will limit myself to the nicest goodies among the many interesting developments in theory, and of those I'll pick only the ones that were of direct importance to me.
Renormalization
Before the seventies there was only one renormalizable quantum field theory that seemed to give a reasonable and useful description of (parts of) the real world: quantum electrodynamics. Its remarkable successes in explaining, among other things, the Lamb shift and the anomalous magnetic moment of the electron did not go unnoticed. Yet the idea that other interactions should also be described in the context of renormalizable field theories became less and less popular. Indeed, the notion of renormalizability was quite controversial, and to some it still is.
By
Peter Galison, Born New York City, 1955; Ph.D., 1983 (physics and history of science), Harvard University; Mallinckrodt Professor of History of Science and of Physics, Harvard University; history of science, high-energy physics (theory).
The broad sweep of theoretical claims and programs commands our attention: even the title of this book, The Rise of the Standard Model, points to theory as the capstone of physics. But outside the commitment of theorists to principles of their practice such as causality, determinism, unification, and symmetry breaking, there are commitments built into the hardware of the laboratory. Less dramatic perhaps, and no doubt less often spoken of, these traditions of instrumentation shape the practice of experimental physics and embody views about the nature of acceptable empirical evidence. In this chapter, I want to explore the coming together of two great lines of instruments in the twentieth century. On one side is the image tradition, instantiated in the sequence of cloud chambers, nuclear emulsions, and bubble chambers. These devices make pictures, the delicate arrays of crisscrossed lines that have come to serve as symbols not only of particle physics but of physics more generally. On the other side stands a competing logic tradition, this one aiming not to make pictures but instead to produce counts – the staccato clicks of a Geiger–Müller counter rather than the glossy print from a cloud chamber. In the line of such counters came a host of other electronic devices that built their persuasive power not through the sharpness of images but through the accumulation of a statistically significant number of clicks.
Where is the frontier of physics? Some would say 10⁻³³ cm, some 10⁻¹⁵ cm, and some 10⁺²⁸ cm. My vote is for 10⁻⁶ cm. Two of the greatest puzzles of our age have their origins at this interface between the macroscopic and microscopic worlds. The older mystery is the thermodynamic arrow of time, the way that (mostly) time-symmetric microscopic laws acquire a manifest asymmetry at larger scales. And then there is the superposition principle of quantum mechanics, a profound revolution of the twentieth century. When this principle is extrapolated to macroscopic scales, its predictions seem wildly at odds with ordinary experience.
This book deals with both these ‘mysteries,’ the foundations of statistical mechanics and the foundations of quantum mechanics. It is my thesis that they are related. Moreover, I have teased the reader with the word ‘foundations,’ a term that many of our hardheaded colleagues view with disdain. I think that new experimental techniques will soon subject these ‘foundations’ to the usual scrutiny, provided the right questions and tests can be formulated. Historically, it is controlled observation that transforms philosophy into science, and I am optimistic that the time has come for speculations on these two important issues to undergo that transformation.
In the next few pages I provide previews of the book: Section 1.1 is a statement of the main ideas, and Section 1.2 is a chapter-by-chapter guide.
There are two principal themes: time's arrows and quantum measurement. In both areas I will make significant statements about what are usually called their foundations. These statements are related, and involve modification of the underlying hypotheses of statistical mechanics. The modified statistical mechanics contains notions that are at variance with certain primitive intuitions, but it is consistent with all known experiments.
I will try to present these ideas as intellectually attractive, but this virtue will be offered as a reason for study, not as a reason for belief. Historically, intellectual satisfaction has not been a reliable guide to scientific truth. For this reason I have striven to propose experimental and observational tests where I could, even where such experiments are not feasible today. The need for this hardheaded, or perhaps intellectually humble, approach is particularly felt in the two areas that I will address. The foundations of thermodynamics and the foundations of quantum mechanics have been among the most contentious areas of physics; indeed, some would deny their place within that discipline. In my opinion, this situation is a result of the paucity of relevant experiment.
In the last chapter we enumerated ‘arrows of time.’ There was a subtheme concerned with which candidates made it onto the list and which didn't, as we tried to eliminate arrows that were immediate consequences of others. Now the subtheme becomes the theme.
We are concerned with correlating arrows of time. Our most important conclusion will be that the thermodynamic arrow of time is a consequence of the expansion of the universe. Coffee cools because the quasar 3C273 grows more distant. We will discuss other arrows, in particular the radiative and the biological, but for them the discussion is a matter of proving (or perhaps formulating) what you already believe. For the thermo/cosmo connection there remains significant controversy.
As far as I know, it was Thomas Gold who proposed that the thermodynamic arrow of time had its origins in cosmology, in particular in the expansion of the universe. Certainly there had been a lot of discussion of arrows of time before his proposal, but in much of this discussion you could easily get lost, not knowing whether someone was making a definition or solving a problem. Now I'm sure the following statement slights many deep thinkers, but I would say that prior to Gold's idea the best candidate for an explanation of the thermodynamic arrow was that there had been an enormous fluctuation: if you have a big enough volume and you wait long enough, you will get a fluctuation big enough for life on earth.