We review the evidence that spectral curvature in the extended emission of radio galaxies is caused by synchrotron losses, and that its spatial variation can be interpreted to yield ages and expansion speeds. One of the biggest worries has been the true value of the magnetic field, but X-ray detections of inverse-Compton radiation are beginning to suggest that “minimum energy” estimates are remarkably accurate. A critical test is to compare model and observed spectra over a broad frequency range; to date this has only been done for Cygnus A, and the results proved controversial. Here we discuss several more cases and begin to draw some general conclusions.
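As a sketch of the underlying argument (numerical coefficients omitted; the quantities are the standard ones and are not specific to any source discussed here), synchrotron ageing ties the break frequency \(\nu_{\rm br}\) to the source age \(t\) and magnetic field \(B\) through

\[
  t \;\propto\; \frac{B^{1/2}}{B^{2} + B_{\rm IC}^{2}}\,\nu_{\rm br}^{-1/2},
  \qquad B_{\rm IC} \approx 3.2\,(1+z)^{2}\ \mu\mathrm{G},
\]

where \(B_{\rm IC}\) is the magnetic field equivalent to the microwave-background energy density; when inverse-Compton losses are negligible this reduces to \(t \propto B^{-3/2}\,\nu_{\rm br}^{-1/2}\). The strong dependence on \(B\) is why the accuracy of the minimum-energy estimate matters so much for the derived ages.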
Hotspots are usually well fitted by continuous injection models, as expected. In two cases the implied injection index is flatter than 0.5, too flat to be produced by standard Fermi acceleration in a non-relativistic shock. The bridge spectra are reasonably well fitted by single-burst models, but in some objects the injection index is not constant across the lobes, tending instead to steepen in the inner bridge, where the break frequencies are lowest. The true spectra may curve more gradually than the standard models predict, possibly because of mixing of electron populations with different ages. Our results are limited by the inaccuracy of the absolute flux density scale, especially at low frequencies, and a fresh attack on the flux scale would be timely.
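For reference, the textbook test-particle result behind the 0.5 benchmark: first-order Fermi acceleration at a non-relativistic shock of compression ratio \(r\) gives a particle spectrum and synchrotron injection index

\[
  N(E) \;\propto\; E^{-p}, \qquad p = \frac{r+2}{r-1}, \qquad \alpha_{\rm inj} = \frac{p-1}{2},
\]

so even the strongest such shock (\(r = 4\)) yields \(p = 2\) and \(\alpha_{\rm inj} = 0.5\), while weaker shocks only steepen the spectrum; injection indices flatter than 0.5 therefore require something beyond this simple picture.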