Chapter 15. Video Adapters
A video adapter accepts raw video data
from the CPU, processes that data, and supplies it to the monitor in
a form that the monitor can display. In DOS text-mode days, that
wasn't a demanding job. Early video adapters simply
interfaced the CPU to the monitor, did little or no manipulation of
the raw data, and depended on the CPU itself to render the data into
a form usable by the monitor. When Windows arrived, the emphasis
shifted from text mode to graphics mode, which increased video
processing demands dramatically.
That made it impractical to use the CPU to perform video processing,
and a new generation of video adapters, called graphics
accelerators, was born. A graphics accelerator
offloads the video processing burden from the main CPU by serving as
a dedicated video coprocessor. In doing so, it not only frees up the
main CPU but also reduces the amount of video data that crosses the
system bus, which also contributes to faster system performance. All
modern video adapters are also graphics accelerators.
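To make the offloading idea concrete, the following minimal sketch in C contrasts the two approaches for filling a rectangle on screen. Everything hardware-specific in it is assumed purely for illustration: the framebuffer and register addresses, the register layout, and the FILL_RECT command code do not correspond to any real adapter.

#include <stdint.h>

/* Hypothetical hardware layout, for illustration only: a 1024 x 768,
 * 32-bit linear framebuffer plus a few memory-mapped accelerator
 * registers. Real adapters differ in every detail. */
#define FB_WIDTH   1024
#define FB_HEIGHT  768
static volatile uint32_t *framebuffer = (uint32_t *)0xA0000000;  /* assumed address */

/* Assumed accelerator command registers (not a real device). */
static volatile uint32_t *accel_cmd  = (uint32_t *)0xB0000000;
static volatile uint32_t *accel_args = (uint32_t *)0xB0000010;
#define ACCEL_CMD_FILL_RECT 0x01

/* Unaccelerated approach: the CPU computes and writes every pixel,
 * so the whole rectangle's worth of data crosses the system bus. */
void fill_rect_cpu(int x, int y, int w, int h, uint32_t color)
{
    for (int row = y; row < y + h; row++)
        for (int col = x; col < x + w; col++)
            framebuffer[row * FB_WIDTH + col] = color;
}

/* Accelerated approach: the CPU writes only a short command
 * describing the rectangle; the adapter's coprocessor fills the
 * pixels itself out of its own video memory. */
void fill_rect_accel(int x, int y, int w, int h, uint32_t color)
{
    accel_args[0] = (uint32_t)x;
    accel_args[1] = (uint32_t)y;
    accel_args[2] = (uint32_t)w;
    accel_args[3] = (uint32_t)h;
    accel_args[4] = color;
    *accel_cmd = ACCEL_CMD_FILL_RECT;  /* a few writes instead of w*h writes */
}

For a 640 x 480 fill, the unaccelerated version pushes more than 300,000 pixel writes across the system bus, while the accelerated version sends only a handful of register writes and lets the adapter do the rest.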
Formerly, all video adapters were separate expansion cards, a form in
which they are still readily available today. However, demand for lower
costs has made motherboards with embedded video circuitry much more
common, a trend that is likely to continue. Embedded video adapters are
inexpensive and tightly integrated, but the drawback is that upgrading
the video may require replacing the motherboard. Even so, all current
video adapters are good enough that anyone other than a hardcore gamer
is likely to find them more than adequate.