This contribution reviews the current status of optical wide field survey astronomy and the basic techniques that have been developed to capitalize on the large volumes of data generated by modern optical survey instruments. Topics covered include: telescope design constraints on wide field imaging; the properties of CCD detectors and wide field CCD mosaic cameras; preprocessing CCD data and combining independent digitized frames; optimal detection of images and digital image centering and photometry methods. Although the emphasis is on optical imaging problems, most of the techniques reviewed apply to any large-format two-dimensional astronomical image data.
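One of the centering methods mentioned above, the intensity-weighted centroid of a background-subtracted stamp, can be sketched in a few lines. The function and the toy star image below are my own illustration, not code from the text:

```python
import numpy as np

def centroid(image, background=0.0):
    """Intensity-weighted centroid (x, y) of a background-subtracted stamp."""
    data = np.asarray(image, dtype=float) - background
    data = np.clip(data, 0.0, None)          # ignore negative residuals
    total = data.sum()
    if total == 0.0:
        raise ValueError("no flux above background")
    y, x = np.indices(data.shape)
    return (x * data).sum() / total, (y * data).sum() / total

# A symmetric fake star centred on pixel (2, 2) of a 5x5 stamp:
star = np.array([[0, 0, 0, 0, 0],
                 [0, 1, 2, 1, 0],
                 [0, 2, 8, 2, 0],
                 [0, 1, 2, 1, 0],
                 [0, 0, 0, 0, 0]], dtype=float)
cx, cy = centroid(star)   # by symmetry, both coordinates are 2.0
```

In practice the stamp would be cut from a sky-subtracted CCD frame, and more refined schemes (Gaussian fitting, marginal profiles) build on the same idea.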
Wide Field Survey Astronomy
Background
Astronomy is basically an observational science, rather than an experimental one, and the development and advancement of the subject has relied heavily on surveys of the sky at optical wavelengths to expand our knowledge of the observable Universe. Surveys form a basic foundation of observational astronomy, and provide three generic types of information:
(a) quantitative statistical information on the distribution of objects in our own galaxy and the Universe
(b) the ability to discover radically new types of object
(c) the means of selecting representative samples of certain types of (rare) objects, particularly the brightest examples, for further study with large telescopes.
Statistical surveys are beginning to rely ever more heavily on the wide field multi-object fibre spectroscopy capabilities of large telescopes, described elsewhere in this volume.
Astronomical telescopes are devices that collect as much radiation as possible from astronomical (stellar) objects and concentrate it into as sharp (small) an image as possible. Both collecting area and angular resolution play a role. The relative merit of these two functions has changed over the years in optical astronomy: angular resolution dominated initially and then, as the atmospheric seeing limit was reached, collecting area became the most important factor. It is therefore customary these days to express the quality of a telescope by its (collecting) diameter rather than by its angular resolution. With the introduction of techniques which overcome the limits set by atmospheric seeing, the emphasis is shifting back to angular resolution. This time, however, it is set by the diffraction limit of the telescope, so that both the angular resolution and the collecting power of a telescope will be determined by its diameter. Both telescope functions will therefore go hand in hand.
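The diffraction limit referred to above scales inversely with aperture; a small sketch (my own, using the Rayleigh criterion θ ≈ 1.22 λ/D) makes the numbers concrete:

```python
import math

def diffraction_limit_arcsec(wavelength_m, diameter_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcseconds."""
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

# An 8 m telescope observing at 550 nm:
theta_8m = diffraction_limit_arcsec(550e-9, 8.0)   # roughly 0.017 arcsec
```

Comparing this with typical atmospheric seeing of around one arcsecond shows how much resolution is regained once seeing-limited imaging is overcome.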
Although image selection and various speckle image reconstruction techniques have been successful in giving diffraction limited images (see, e.g., the paper by Oskar von der Lühe in the First Canary Island Winter School, 1989), the most powerful and promising technique for all astronomical applications is the one using adaptive optics. That is because, for an unresolved image, it puts most of the collected photons into as small an image as possible, which benefits discrimination against the sky background, high spectral and spatial resolution spectroscopy, and interferometric imaging with telescope arrays.
The new generation of 8-10m telescopes is opening up important possibilities for polarimetry of astrophysically interesting sources, mainly because the large collecting area is particularly advantageous in this technique, which requires a high S/N ratio. This course starts by emphasizing the importance of polarimetry in astronomy and giving some examples of polarizing phenomena in everyday life. Then an introduction to the Stokes parameters and to Mueller calculus is given, with examples on how to describe the most common polarizing optical components, and the main mechanisms producing polarized light in astrophysics are reviewed. The section devoted to instruments starts with a brief overview of the classical photopolarimeter, follows with a description of an imaging polarimeter, with examples of data obtained and an analysis of the sources of errors, and ends with a discussion of modern spectropolarimetry. The following section is devoted to an analysis of the gains of large 8–10 m telescopes for polarimetry and to a review of the polarimeters planned for them. The course ends with a discussion of polarimetry of AGN, as an example of a field of research where polarimetry has provided important results by disentangling unresolved geometries and mixed spectral components.
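The Stokes/Mueller formalism mentioned above lends itself to a short numerical sketch. Below is my own minimal illustration (not code from the course) of the standard Mueller matrix of an ideal linear polarizer acting on an unpolarized Stokes vector:

```python
import numpy as np

def linear_polarizer(theta):
    """Mueller matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return 0.5 * np.array([[1,    c,    s,   0],
                           [c,  c*c,  c*s,   0],
                           [s,  c*s,  s*s,   0],
                           [0,    0,    0,   0]])

unpolarized = np.array([1.0, 0.0, 0.0, 0.0])   # Stokes vector (I, Q, U, V)
out = linear_polarizer(0.0) @ unpolarized      # polarizer at 0 degrees
I, Q, U, V = out
degree = np.sqrt(Q**2 + U**2 + V**2) / I       # emerging beam is fully polarized
```

Half the intensity is transmitted and the output beam is 100% linearly polarized, exactly as the matrix algebra predicts; chains of optical components are modelled by multiplying their Mueller matrices in order.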
The beauty of polarimetry
Astronomy is an observational science, not an experimental one in the usual sense, since for the understanding of the objects in the Universe we cannot perform controlled experiments, but have to rely on observations of what these objects do, independently of us.
An alternative title of this material could be “The Data Everyone Would Like to Get for their Research!” The first thing we seem to do in astronomy is ‘see’ something, be it simply looking at the sky, using a big telescope, or helping ourselves with sophisticated adaptive optics or space probes. But the very next thing we want to do is get that light into a spectrograph! We might get spectral information from colors, energy distributions, modest resolution or real honest high resolution spectroscopy, but we desperately need such information. Why? Well, because that's where most of the physical information is, and higher spectral resolution means access to more and better information. High resolution implies actually resolving the structure of the spectrum. Naturally we want to do this as precisely as possible, not only pushing toward good spectral resolution and high signal-to-noise, but also by understanding how the equipment has modified the true spectrum and by weeding out problems and undesirable characteristics. The main focus here will be on the machinery of spectroscopy, but oriented toward optical spectrographs and the spectral lines they are best suited to analyze. I do not concentrate on the specific instruments, but rather on the techniques and thought patterns we need. These are the fundamental things you can take with you and apply to any spectroscopic work you do. Of course, you will always have to fill in specific details for the particular machinery and tools you use.
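As a concrete handle on what "high resolution" means, the resolving power R = λ/Δλ maps directly onto a velocity resolution c/R; the short sketch below (my own illustration) makes the conversion explicit:

```python
# Resolving power R = lambda / delta_lambda; the corresponding
# (non-relativistic) velocity resolution is delta_v = c / R.
C_KM_S = 299792.458  # speed of light in km/s

def velocity_resolution_km_s(resolving_power):
    return C_KM_S / resolving_power

# "Real honest" high resolution at R = 100000 resolves ~3 km/s,
# enough to separate thermally broadened stellar line profiles:
dv = velocity_resolution_km_s(100000)
```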
Astronomy is entering a new observational era with the advent of several Large Telescopes, 8 to 10 metres in size, which will shape the kind of Astrophysics that will be done in the next century. Scientific focal plane instruments have always been recognized as key factors enabling astronomers to obtain the maximum performance out of the telescope in which they are installed. Building instruments is therefore not only a state of the art endeavour, but the ultimate way of reaching the observational limits of the new generation of telescopes. Instruments also define the type of science that the new telescopes will be capable of addressing in an optimal way. It is clear therefore that whatever instruments are built in the coming years, they will influence the kind of science that is done well into the 21st century.
The goal of the 1995 Canary Islands Winter School of Astrophysics was to bring together advanced graduate students, recent postdocs and interested scientists and engineers, with a group of prominent specialists in the field of astronomical instrumentation, to make a comprehensive review of the driving science and techniques behind the instrumentation being developed for large ground based telescopes. This book is unique in that it combines the scientific ideas behind the instruments, something at times not appreciated by engineers, with the techniques required to design and build scientific instruments, something that few astronomers grasp during their education.
Chapter 1 reviews the image restoration/reconstruction problem in its general setting. We first discuss linear methods for solving the problem of image deconvolution, i.e. the case in which the data is a convolution of a point-spread function and an underlying unblurred image. Next, non-linear methods are introduced in the context of Bayesian estimation, including Maximum-Likelihood and Maximum Entropy methods. Finally, the successes and failures of these methods are discussed along with some of the roots of these problems and the suggestion that these difficulties might be overcome by new (e.g. pixon-based) image reconstruction methods.
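The Maximum-Likelihood approach mentioned above, specialized to Poisson noise, leads to the well-known Richardson–Lucy iteration. The 1-D NumPy sketch below is my own toy illustration of that scheme (not code from the chapter), deblurring a noise-free point source:

```python
import numpy as np

def richardson_lucy_1d(data, psf, n_iter=50):
    """Richardson-Lucy deconvolution (Maximum-Likelihood under Poisson noise)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    estimate = np.full_like(data, data.mean(), dtype=float)  # flat start
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode='same')
        ratio = data / np.maximum(blurred, 1e-12)            # avoid 0/0
        estimate *= np.convolve(ratio, psf_mirror, mode='same')
    return estimate

# Blur a single 'star' (a delta function) with a symmetric PSF, then deconvolve:
truth = np.zeros(32); truth[16] = 100.0
psf = np.array([1.0, 2.0, 4.0, 2.0, 1.0])
observed = np.convolve(truth, psf / psf.sum(), mode='same')
restored = richardson_lucy_1d(observed, psf, n_iter=200)
```

With noisy data the iteration must be stopped early (or regularized, e.g. by the pixon methods of Chapter 3), since unconstrained Maximum-Likelihood estimates eventually amplify noise.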
Chapter 2 discusses the role of language and information theory concepts for data compression and solving the inverse problem. The concept of Algorithmic Information Content (AIC) is introduced and shown to be crucial to achieving optimal data compression and optimized Bayesian priors for image reconstruction. The dependence of the AIC on the selection of language then suggests how efficient coordinate systems for the inverse problem may be selected. This motivates the selection of a multiresolution language for the reconstruction of generic images.
Chapter 3 introduces pixon-based image restoration/reconstruction methods. The relationship between image Algorithmic Information Content and the Bayesian incarnation of Occam's Razor is discussed, as is the relationship of multiresolution pixon languages to image fractal dimension. Also discussed is the relationship of pixons to the role played by the Heisenberg uncertainty principle in statistical physics and how pixon-based image reconstruction provides a natural extension to the Akaike information criterion for Maximum Likelihood estimation.
This paper reviews near infrared instrumentation for large telescopes. Modern instrumentation for near infrared astronomy is dominated by systems which employ state-of-the-art infrared array detectors. Following a general introduction to the near infrared wavebands and transmission features of the atmosphere, a description of the latest detector technology is given. Matching of these detectors to large telescopes is then discussed in the context of imaging and spectroscopic instruments. Both the seeing-limited and diffraction-limited cases are considered. Practical considerations (e.g. the impact of operation in a vacuum cryogenic environment) that enter into the design of infrared cameras and spectrographs are explored in more detail and specific examples are described. One of these is a 2-channel IR camera and the other is a NIR echelle spectrograph, both of which are designed for the f/15 focus of the 10-m W. M. Keck Telescope.
The Near Infrared Waveband
In the last ten years there has been tremendous growth in the field of Infrared Astronomy. This growth has been stimulated in large part by the development of very sensitive imaging devices called infrared arrays. These detectors are similar, but not identical, to the better-known silicon charge-coupled device or CCD, which is limited to wavelengths shorter than 1.1 µm. In particular, near infrared array detectors are now sufficiently sensitive that images of comparable depth to those obtained with visible-light CCDs can be achieved from 1.0 µm to 2.4 µm and high resolution IR spectrographs are now feasible.
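The 1.1 µm silicon cutoff quoted above follows directly from the photoelectric condition hν > E_gap: photons with less energy than the bandgap cannot excite an electron across it. The sketch below is my own, using hc ≈ 1.2398 eV·µm and approximate bandgap values:

```python
# Detector cutoff wavelength from the semiconductor bandgap:
# lambda_c = h*c / E_gap, i.e. lambda_c [um] ~= 1.2398 / E_gap [eV].
def cutoff_um(bandgap_ev):
    return 1.2398 / bandgap_ev

silicon = cutoff_um(1.12)   # ~1.1 um: why CCDs stop at the near-IR edge
insb = cutoff_um(0.23)      # ~5.4 um: InSb (cooled) reaches the thermal IR
```

Smaller-bandgap materials such as InSb or HgCdTe therefore extend sensitivity into the infrared, at the price of requiring cryogenic operation to suppress thermally generated dark current.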
This lecture introduces the opportunities presented by ground-based telescopes for new discoveries in the thermal infrared, and discusses techniques used to make sensitive observations in an environment with high background flux levels from atmospheric emission and from the telescope structure and mirrors.
Mid-IR astronomy—opportunities and problems
The capability now exists to observe mid-IR astronomical objects with spatial resolution of a third of an arcsecond and sensitivities reaching well below a mJy. Both imaging and spectroscopy with new array instruments on optimized large telescopes are producing new data on sources from comets to active galactic nuclei. With sensitivity to emission from cool dust, diagnostic lines from ionized gas and molecular species, and the capability to look through clouds opaque in the visible, many new results are appearing, and many more can be anticipated. In particular, our understanding of the star formation process should improve significantly in the next decade. Yet all of this is achieved operating through the Earth's atmosphere, which absorbs and distorts the signals and which, together with the telescope structure itself, radiates into the beam up to a million times the power detected from the source. The problems encountered, and the techniques used to make ground based mid-IR observations, will be discussed here.
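One standard remedy for the enormous background described above is chopping and nodding: the secondary mirror switches rapidly between two beam positions to subtract the sky, and the telescope then nods so the source swaps beams, cancelling the slowly varying telescope-emission offset as well. The toy simulation below is my own, with made-up flux values and an illustrative (unrealistically small) noise level:

```python
import numpy as np

rng = np.random.default_rng(1)

def frame(source, sky, telescope):
    """One detector reading: a faint source on an enormous background."""
    return rng.normal(source + sky + telescope, 1.0)  # illustrative noise

source = 1.0                  # source flux (arbitrary units)
sky = 1.0e6                   # atmospheric emission, ~10^6 x the source
tel_a, tel_b = 500.0, 480.0   # telescope emission differs between beams

def chop_nod_cycle():
    # Chop: rapid A/B beam switching removes the common sky emission...
    chop1 = frame(source, sky, tel_a) - frame(0.0, sky, tel_b)
    # ...nod: swap the source into the other beam so the residual
    # telescope-emission offset (tel_a - tel_b) also cancels.
    chop2 = frame(0.0, sky, tel_a) - frame(source, sky, tel_b)
    return 0.5 * (chop1 - chop2)

signal = np.mean([chop_nod_cycle() for _ in range(400)])  # ~= source flux
```

Each half of the cycle removes the sky but leaves the beam-dependent telescope offset; only the full chop-nod difference isolates the astronomical signal.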
IRAS (Infrared Astronomical Satellite) revealed how fascinating and complex the IR sky is at wavelengths of 12, 25, 60 and 100 µm. The IRAS mission lasted for 300 days in 1983, completing an all-sky survey with a 57-cm diameter cooled telescope.
Physics is the most fundamental of the sciences, and some knowledge of it is required in fields as disparate as chemistry, biology, engineering, medicine, and architecture. Our experience in teaching physics to a wide variety of audiences in the U.S. and Europe over many years is that, while students may acquire some familiarity with formal concepts of physics, they are all too often uneasy about applying these concepts in a variety of practical situations. As an elementary example, they may be able to quote the law of conservation of angular momentum in the absence of external torques, but be quite unable to explain why a spinning top does not fall over. The physicist Richard Feynman coined the phrase “fragile knowledge” to describe this kind of mismatch between knowledge of an idea and the ability to apply it.
In our view there is really only one way of acquiring a robust ability to use physics: the repeated employment of physical concepts in a wide variety of applications. Only then can students appreciate the strength of these ideas and feel confident in using them. This book aims to meet this need by providing a large number of problems for individual study. We think it very important to provide a full solution for each one, so that students can check their progress or discover where they have gone wrong. We hope that users of this book will be able to acquire a working knowledge of those parts of physics they need for their science.