Color Depth
The combination of the display modes supported by your graphics adapter and
the color capability of your monitor determines how many colors can be displayed.
For example, a display that can operate in SuperVGA (SVGA) mode can display
up to 16,777,216 colors (usually rounded to 16.8 million) because it can process
a 24-bit description of each pixel. The number of bits used to describe a
pixel is known as its bit depth. With a 24-bit bit depth, eight bits are
dedicated to each of the three additive primary colors -- red, green and blue.
This bit depth is also called true color because it can produce the roughly
10,000,000 colors discernible to the human eye, while a 16-bit display is only
capable of producing 65,536 colors. Displays jumped from 16-bit color to 24-bit
color because working in 8-bit increments makes things much simpler for
developers and programmers.
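If you'd like to check that math yourself, the short Python sketch below (an illustration added here, not part of any display standard) walks through the same arithmetic: three 8-bit channels give 256 levels each, and 256 * 256 * 256 works out to 16,777,216.

```python
# Illustrative arithmetic for the bit-depth figures quoted above.
bits_per_pixel = 24
bits_per_channel = bits_per_pixel // 3      # 8 bits each for red, green and blue

levels_per_channel = 2 ** bits_per_channel  # 256 intensity levels per channel
total_colors = levels_per_channel ** 3      # 256 * 256 * 256

print(levels_per_channel)  # 256
print(total_colors)        # 16777216 -- the "16.8 million" figure

print(2 ** 16)             # 65536 -- what a 16-bit display can manage
```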
Simply put, color bit depth refers to the number of bits used to describe the color of a single pixel. The bit depth determines the number of colors that can be displayed at one time. Take a look at the following chart to see the number of colors different bit depths can produce.
Bit Depth | Number of Colors
----------|-----------------------------------------
1         | 2 (monochrome)
2         | 4 (CGA)
4         | 16 (EGA)
8         | 256 (VGA)
16        | 65,536 (High Color, XGA)
24        | 16,777,216 (True Color, SVGA)
32        | 16,777,216 (True Color + Alpha Channel)
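Every entry in the "Number of Colors" column is simply 2 raised to the bit depth. A tiny Python loop (an illustrative sketch, not part of the chart) reproduces those figures:

```python
# Each "Number of Colors" entry in the chart is 2 ** bit_depth.
for depth in (1, 2, 4, 8, 16, 24):
    print(f"{depth:>2}-bit: {2 ** depth:,} colors")

# Output:
#  1-bit: 2 colors
#  2-bit: 4 colors
#  4-bit: 16 colors
#  8-bit: 256 colors
# 16-bit: 65,536 colors
# 24-bit: 16,777,216 colors
```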
You will notice that the last entry in the chart is for 32 bits. This is a special graphics mode used by digital video, animation and video games to achieve certain effects. Essentially, 24 bits are used for color and the other 8 bits are used as a separate layer for representing levels of translucency in an object or image.
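As a rough illustration of how that extra layer works, the sketch below packs a 32-bit pixel as 8 bits each of alpha, red, green and blue. The channel order and the helper name are assumptions chosen for demonstration; real formats vary (ARGB, RGBA and others).

```python
def pack_argb(alpha, red, green, blue):
    """Pack four 8-bit values (0-255) into one 32-bit pixel.

    24 bits carry the color; the remaining 8 bits carry the
    alpha (translucency) level described above.
    """
    return (alpha << 24) | (red << 16) | (green << 8) | blue

opaque_orange = pack_argb(255, 255, 128, 0)      # fully opaque
half_seen_orange = pack_argb(128, 255, 128, 0)   # roughly 50% translucent

print(hex(opaque_orange))      # 0xffff8000
print(hex(half_seen_orange))   # 0x80ff8000
```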
Nearly every monitor sold today can handle 24-bit color using a standard VGA connector, as discussed previously.
Power Consumption
Power consumption varies greatly among display technologies. CRTs are fairly
power-hungry, drawing about 110 watts for a typical display, compared with LCDs,
which average between 30 and 40 watts. In a typical home computer setup with a
CRT-based display, the monitor accounts for over 80% of the electricity used!
Because most users don't interact with the computer for much of the time it is
on, the U.S. government initiated the Energy Star program in 1992. Energy Star
compliant equipment monitors user activity and suspends non-critical processes,
such as maintaining the visual display, until you move the mouse or tap the keyboard.
According to the EPA, if you use a computer system that is Energy Star compliant,
it could save you approximately $400 a year on your electric bill! Similarly,
because of the difference in power usage, an LCD monitor might cost more up
front but end up saving you money in the long run.
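To see where those savings come from, the back-of-the-envelope sketch below compares annual running costs using the wattages quoted above. The daily usage hours and the electricity rate are assumptions added for illustration, not figures from the EPA or from this article.

```python
# Rough annual running-cost comparison using the wattages quoted above.
hours_per_day = 8        # assumed hours the monitor is powered on
rate_per_kwh = 0.12      # assumed electricity price in dollars per kilowatt-hour

def annual_cost(watts):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

crt_cost = annual_cost(110)   # typical CRT draw from the text
lcd_cost = annual_cost(35)    # midpoint of the 30-40 watt LCD range

print(f"CRT: ${crt_cost:.2f}/yr, LCD: ${lcd_cost:.2f}/yr, "
      f"difference: ${crt_cost - lcd_cost:.2f}/yr")
```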
CRT technology is still the most prevalent system in desktop displays. Because standard CRT technology requires a certain distance between the beam projection device and the screen, monitors employing this type of display technology tend to be very bulky. Other technologies make it possible to have much thinner displays, commonly known as flat-panel displays. Liquid Crystal Display (LCD) technology works by blocking light rather than creating it, while Light Emitting Diode (LED) and gas plasma work by lighting up display screen positions based on the voltages at different grid intersections. LCDs require far less energy than LED and gas plasma technologies and are currently the primary technology for notebook and other mobile computers. As flat panel displays continue to grow in screen size and improve in resolution and affordability, expect them to gradually replace CRT-based displays.