In an ideal transmission medium of "infinite" bandwidth such as
free space or (in most cases) a well-terminated coaxial
transmission line, the phase of a signal passing through the
transmission path is directly related to the signal frequency and
the propagation velocity. A plot of frequency vs. phase would
result in a straight line with a slope dependent on the propagation
velocity and effective electrical length of the path. In this ideal
case, the phase delay term defined by equation 1 on the following
page would remain constant regardless of the signal frequency.
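Equation 1 itself is not reproduced in this excerpt; the short sketch below assumes the conventional definition of phase delay, tφ = θ/(360·f) with θ in degrees, and illustrates why it stays constant for an ideal path: the phase of a line of length L and propagation velocity v is θ = 360·f·L/v, so tφ reduces to L/v at every frequency. The length and velocity values used are arbitrary examples.

    # Ideal path: phase grows linearly with frequency, so the phase delay
    # t_phi = theta / (360 * f)  (conventional definition, assumed here
    # since equation 1 is not reproduced) is the same at every frequency.

    def ideal_phase_deg(f_hz, length_m=1.0, v_prop_m_s=2.0e8):
        """Phase shift of an ideal line: 360 * f * (length / velocity)."""
        return 360.0 * f_hz * length_m / v_prop_m_s

    def phase_delay_s(phase_deg, f_hz):
        """Phase delay t_phi in seconds."""
        return phase_deg / (360.0 * f_hz)

    for f in (10e6, 100e6, 1e9):
        t_phi = phase_delay_s(ideal_phase_deg(f), f)
        print(f"{f/1e6:7.1f} MHz   t_phi = {t_phi*1e9:.3f} ns")

    # Prints 5.000 ns at every frequency (length / velocity), the flat
    # phase-delay behaviour of the ideal transmission path.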
When a single-frequency signal is applied to a filter, amplifier, or
other device with limited bandwidth, the phase delay becomes
frequency-sensitive. The plot of phase vs. frequency is no longer a
straight line, and the phase delay (tφ) varies as the frequency is
changed.
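As an illustration, the sketch below substitutes a single-pole low-pass response for the band-limited device (the corner frequency is an arbitrary example, not taken from the text) and evaluates the same phase-delay expression at several frequencies; the result is no longer constant.

    import math

    def single_pole_phase_deg(f_hz, f_corner_hz=100e6):
        """Phase lag of a one-pole low-pass response, a stand-in for any
        band-limited filter or amplifier."""
        return math.degrees(math.atan(f_hz / f_corner_hz))

    def phase_delay_ns(f_hz, f_corner_hz=100e6):
        """t_phi = theta / (360 * f), expressed in nanoseconds."""
        return single_pole_phase_deg(f_hz, f_corner_hz) / (360.0 * f_hz) * 1e9

    for f in (10e6, 50e6, 100e6, 300e6):
        print(f"{f/1e6:6.1f} MHz   phase = {single_pole_phase_deg(f):6.2f} deg"
              f"   t_phi = {phase_delay_ns(f):.3f} ns")

    # The phase-vs-frequency plot is curved rather than straight, and the
    # printed phase delay changes with frequency (about 1.59 ns at 10 MHz
    # falling to about 0.66 ns at 300 MHz).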
Phase linearity can be expressed either as the maximum deviation
from the ideal straight-line phase vs. frequency plot that would be
produced by an ideal transmission line of similar electrical length,
or simply by a tabular listing of the phase deviation at a number of
discrete frequencies.
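One simple way to put a number on this, sketched below, is to fit a least-squares straight line to tabulated phase data (used here as a stand-in for the ideal line of similar electrical length) and report the largest deviation from it; the frequency and phase values are hypothetical.

    import numpy as np

    def phase_linearity_deg(freqs_hz, phase_deg):
        """Maximum deviation of measured phase from a best-fit straight
        line, standing in for the ideal line of similar electrical length."""
        slope, intercept = np.polyfit(freqs_hz, phase_deg, 1)
        deviation = phase_deg - (slope * freqs_hz + intercept)
        return float(np.max(np.abs(deviation)))

    # Hypothetical tabulated data: phase of a device at discrete frequencies.
    freqs = np.array([10e6, 50e6, 100e6, 200e6, 300e6])
    phase = np.array([-18.2, -91.5, -184.0, -371.8, -566.0])   # degrees

    print(f"Phase linearity: +/- {phase_linearity_deg(freqs, phase):.2f} deg")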
In most cases, the value of knowing the phase deviation for a
single-frequency signal passing through a device is limited, since
amplifiers are generally called upon to process signals consisting
of many frequency components, such as modulated or keyed
carriers.