The graphics card plays an essential role in the PC. It takes the digital information that the computer produces and turns it into something human beings can see. On most desktop computers, the graphics card converts that digital information into an analog signal for the monitor to display; on laptops, the data stays digital all the way to the screen, because laptop displays are themselves digital devices.


RADEON 64-MB AGP Graphics Card
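
To make that digital-to-analog step a little more concrete, here is a rough Python sketch of the arithmetic involved. It is illustrative only: the function name is made up, and on a real card the conversion is handled by dedicated converter circuitry (the RAMDAC), not by software. The 0.7-volt full-scale value matches the range used by analog VGA video signals.

```python
def digital_to_analog(value, bits=8, full_scale_volts=0.7):
    """Map a digital intensity (0 to 2**bits - 1) onto an analog voltage."""
    max_value = (1 << bits) - 1      # 255 for an 8-bit color channel
    return full_scale_volts * value / max_value

# A mid-gray pixel (128, 128, 128) becomes three roughly equal voltages,
# one each on the monitor's red, green and blue signal lines.
print(round(digital_to_analog(128), 3))   # 0.351
```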

If you look very closely at the screen of a typical PC, you can see that everything on the screen is made up of individual dots. These dots are called pixels, and each pixel has a color. On some screens (for example, on the original Macintosh), a pixel could have just two colors -- black or white. On some screens today, a pixel can be one of 256 colors. On many screens, the pixels are full-color (also known as true color) and have 16.8 million possible shades. Since the human eye can only discern about 10 million different colors, 16.8 million colors is more than enough for most people.
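
Those color counts come straight from the number of bits the card stores for each pixel: every additional bit doubles the number of possible colors. Here is a quick Python sketch of that arithmetic (the output comments simply restate the numbers above):

```python
# Each additional bit per pixel doubles the number of possible colors.
for bits in (1, 8, 24):
    print(f"{bits:>2} bits per pixel -> {2 ** bits:,} colors")

# Output:
#  1 bits per pixel -> 2 colors            (black or white)
#  8 bits per pixel -> 256 colors
# 24 bits per pixel -> 16,777,216 colors   (true color, about 16.8 million)
```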

The goal of a graphics card is to create the set of signals that tells the monitor which color to display at each of those dots. If you have read How Computer Monitors Work and How Television Works, you have a good sense of what those signals are and how a monitor turns them into light.
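
One simple way to picture what the card is working from is a frame buffer: one color value stored for every pixel, read out row by row in the same order the monitor draws the screen. The sketch below is purely illustrative (the 640x480 resolution, the solid blue fill and the scan_out name are arbitrary choices for this example), not the way any particular card is built.

```python
WIDTH, HEIGHT = 640, 480

# One (red, green, blue) triple per pixel, each channel 0-255; here the whole
# screen is filled with a single shade of blue.
framebuffer = [[(0, 0, 255)] * WIDTH for _ in range(HEIGHT)]

def scan_out(buffer):
    """Yield pixels left to right along each row, rows from top to bottom."""
    for row in buffer:
        for pixel in row:
            yield pixel

# The first three pixels of the top scan line:
print([pixel for _, pixel in zip(range(3), scan_out(framebuffer))])
# [(0, 0, 255), (0, 0, 255), (0, 0, 255)]
```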

In this edition of HowStuffWorks, you'll learn all about graphics cards and how they turn your computer's data into the images you see on the screen.