Antenna arrays are a key enabler of next-generation millimeter-wave communication and radar systems, as shrinking antenna dimensions allow larger arrays that compensate for the reduced link budget. However, conventional phase-controlled arrays exhibit a frequency-dependent scan angle, which appears as a loss toward a link partner at a fixed position. The bandwidth limitation introduced by this so-called beam squint effect hinders larger array sizes and higher data rates, thereby creating a demand for timed arrays as a solution. This paper gives a quantified overview of the beam squint phenomenon, different hardware architectures, as well as evaluation parameters and common shortcomings of true-time-delay (TTD) elements. A broad variety of TTD realizations from the literature are compared in terms of their operating principles and performance. Finally, the delay interpolation principle, its non-idealities, and their impact on a hierarchically time-delay-controlled D-band antenna array are described. Extended results on a previously published, continuously tunable TTD implementation with a center frequency of 144 GHz, a bandwidth of 26 GHz, and a delay range of 1.75 ps, which requires only 0.53 × 0.3 mm² of core chip area, are presented. Measurement results have been obtained from a demonstrator manufactured in a 130 nm BiCMOS technology.
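
As a purely illustrative sketch of the beam squint described above (not taken from the paper), the snippet below evaluates the well-known relation sin θ(f) = (f0/f)·sin θ0 for a phase-steered uniform linear array over an assumed 26 GHz band around 144 GHz, and the inter-element true-time delay that would remove the squint; the steering angle and half-wavelength spacing are assumptions chosen only for the example.

```python
import numpy as np

# Illustrative sketch with assumed parameters (not from the paper):
# a uniform linear array steered with fixed phase shifts designed at f0
# points toward a frequency-dependent angle, sin(theta(f)) = (f0/f)*sin(theta0),
# whereas true-time-delay steering keeps the beam direction constant over the band.

c = 3e8                                   # speed of light in m/s
f0 = 144e9                                # design (center) frequency in Hz
band = np.array([131e9, 144e9, 157e9])    # lower edge, center, upper edge (assumed 26 GHz band)
theta0 = np.deg2rad(30.0)                 # nominal steering angle (assumed)

# Beam direction of the phase-steered array versus frequency (beam squint)
theta_f = np.arcsin(np.clip((f0 / band) * np.sin(theta0), -1.0, 1.0))
for f, th in zip(band, theta_f):
    print(f"{f/1e9:6.1f} GHz -> beam at {np.rad2deg(th):5.2f} deg "
          f"(squint {np.rad2deg(th - theta0):+.2f} deg)")

# Inter-element true-time delay that removes the squint for lambda0/2 spacing
d = c / f0 / 2                            # element spacing: half a wavelength at f0
delta_tau = d * np.sin(theta0) / c
print(f"Required inter-element delay: {delta_tau*1e12:.2f} ps")
```

For these assumed values the beam drifts by a few degrees across the band edges while the required inter-element delay stays frequency independent, which is the property that motivates TTD-based (timed) arrays.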