The importance of phase response in the audio chain has been
brought into greater focus recently by equipment claims of phase
coherency (the output signal has the same phase relationships as
the input signal). It is not particularly obvious that two different
frequency components of a signal can go into a device at
precisely the same time and emerge at different times, but it is
extremely common. All audio components distort the phase of
the signal to some degree-even air alters the time alignment of a
signal, but the biggest offenders are loudspeakers and their
crossover networks. Phase shifts in the audio signal destroy the
wave shape of the important attack characteristics of many
instruments and hamper our ability to perceive the localization of
the image, smearing the apparent source. They can change the
steady state waveforms of vocal sounds so that the singer
seems to be ten feet wide.
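To make the effect concrete, here is a small numerical sketch (the sample rate, harmonic amplitudes, and delay values are illustrative assumptions, not taken from any measurement): three harmonics are launched in phase, then each is delayed by a different amount, much as a crossover's frequency-dependent phase shift would do. The component amplitudes are unchanged, so a steady-state frequency-response measurement would look identical, yet the time-domain wave shape is not.

import numpy as np

fs = 48000                      # sample rate in Hz (assumed for illustration)
t = np.arange(0, 0.02, 1 / fs)  # 20 ms of signal
f0 = 200.0                      # fundamental frequency, Hz (illustrative)

# Original signal: three harmonics starting "at the same time" (all in phase).
harmonics = [(1.0, 1 * f0), (0.5, 2 * f0), (0.33, 3 * f0)]
original = sum(a * np.sin(2 * np.pi * f * t) for a, f in harmonics)

# Same amplitudes, but each component delayed by a different amount,
# a rough stand-in for frequency-dependent phase shift in a crossover.
delays = [0.0, 0.4e-3, 0.9e-3]  # per-harmonic delay in seconds (illustrative)
shifted = sum(a * np.sin(2 * np.pi * f * (t - d))
              for (a, f), d in zip(harmonics, delays))

# The amplitude of each component is identical in both signals,
# yet the composite wave shape (and its attack) differs.
print("peak of original waveform:     ", np.max(np.abs(original)))
print("peak of phase-shifted waveform:", np.max(np.abs(shifted)))

The printed peaks differ even though no component has been attenuated, which is exactly the kind of wave-shape alteration described above.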
Historically, the phase integrity of the audio signal has been
considered much less important than amplitude and
harmonic/intermodulation distortions, but as more of those
problems are solved and the quality of reproduction improves,
phase distortion stands out in greater relief. The question of the
audibility of these distortions has become the object of heated
discussions regarding the perceivability of absolute phase,
frequency-dependent phase shifts, and the rate of phase shift.
Nonetheless we know that the ear is sensitive to phase and uses
phase cues to help determine directionality. In the belief that
proper attention to phase does produce better sound, I will
discuss the design of crossover networks which offer minimum
phase distortion.