Video and Monitors

If you want the very best performance for video gaming, you can buy several very expensive video cards and spend more on them than on the rest of the system. Everyone else can get perfectly adequate performance from a $30 card or from video integrated into the mainboard.

Monitors

Computer monitors and flat panel TV sets are essentially the same thing. The monitor is designed to sit on your desk two feet away from you, so it can display small letters and fine detail. The TV set is designed to sit on a table across the room. It will generally have a brighter picture than a monitor (to be seen from a distance) and it will have a lower resolution and a bigger screen, so everything will be bigger and easier to read.

If you are getting older and have trouble seeing fine print, then you may want to buy something sold as a TV set but use it as a computer monitor.

Computer monitors cost between $100 and $350 in sizes from 19 to 26 inches. Then at around 30 inches the price goes way up to $1000 but the amount of information that can be displayed on the screen (the resolution) jumps way up also. A potential sweet spot is to buy a 23 inch (measured diagonally) computer monitor for $180 that has a resolution of 1920 x 1080 (which happens to be the TV resolution called 1080p used for most Blu-Ray HD movies).

TVs generally cost more than computer monitors, but in exchange a good TV will have better control over color and contrast, and its picture may be processed to make it look sharper and more realistic.

Most monitors and TV sets are LCD panels. An LCD starts with a white back light. The color is generated by tiny bits of red, green, and blue glass at each dot on the screen. The “LCD” itself is a circuit that acts like a valve to control the amount of the back light that shines through the little pieces of red, green, and blue glass. The LCD “valve” can be off, cutting off light, or open to let all the light through, or somewhere in between.
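
As a very rough illustration, you can think of each dot as three valve settings between 0.0 (fully closed) and 1.0 (fully open), one per colored filter. The little Python sketch below is only a toy model of that idea; the names and the 0-255 brightness scale are made up for illustration and do not come from any real LCD driver.

    # Toy model of one LCD dot: a white back light shining through red, green,
    # and blue filters, each with its own "valve" (0.0 = blocked, 1.0 = wide open).

    BACKLIGHT = 255  # brightness of the white back light, on an arbitrary 0-255 scale

    def lcd_dot(red_valve, green_valve, blue_valve):
        """Return the (R, G, B) light that escapes this dot."""
        return (round(BACKLIGHT * red_valve),
                round(BACKLIGHT * green_valve),
                round(BACKLIGHT * blue_valve))

    print(lcd_dot(1.0, 1.0, 1.0))  # (255, 255, 255) -> white: all three valves open
    print(lcd_dot(0.0, 0.0, 0.0))  # (0, 0, 0) -> black: the back light is still on, just blocked
    print(lcd_dot(1.0, 0.5, 0.0))  # (255, 128, 0) -> orange: lots of red, some green, no blue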

One recent improvement in LCD monitors is that the best ones use LEDs to generate the white back light instead of a fluorescent bulb. As you know, every fluorescent bulb eventually burns out, and when that happens the monitor is dead. LEDs, however, last almost forever.

There are also Plasma TV sets. In a Plasma set, each dot is a small light bulb that is either red, green, or blue, and the controller decides how much light each bulb should generate.

The LCD starts with white light and then takes light away, first by blocking part of it with the valve and then by filtering out all but the red, green, or blue part of what is left. The Plasma screen generates the colored light directly. Plasma screens can be brighter, but there is a limit to how small a Plasma light bulb can be, which makes them impractical for computer monitors. That is why the technology is used only in wall mounted TV sets.

There is a new technology on the horizon called OLED. Like Plasma, OLED generates individual dots of colored light, but with a much smaller LED circuit. Consider a largely black screen. In a traditional LCD screen the back light is completely on and white, so if the screen is dark it is because the LCD valves are all in the off position and blocking the light. You are still generating all that light and using all that electricity. In an OLED screen, however, only the non-black parts of the screen generate tiny amounts of light, and the black part of the screen uses no power. OLED is hard to manufacture, so it will begin to show up on phones and other small devices where battery power is important and then slowly migrate up to larger screens.

Although TVs and computer monitors are optimized differently (fine detail up close versus a big picture far away), they still mostly use the same parts. Therefore, modern computer monitors tend to be available in resolutions that match, or at least come reasonably close to, one of the two HDTV resolutions. A 720p TV program is broadcast at a resolution of 1280x720, and you will find computer monitors at 1280x768 or 1280x800 (basically 720p with a few extra lines of dots). Similarly, a 1080p picture is 1920x1080, but you might find a few 24 inch computer monitors with a resolution of 1920x1200.
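
The arithmetic behind “a few extra lines of dots” is easy to check. This throwaway Python snippet just multiplies out the resolutions mentioned above; nothing in it is specific to any particular monitor.

    # Pixel counts for the two HDTV resolutions and their close computer-monitor cousins.
    resolutions = [
        ("720p TV", 1280, 720),
        ("1280x768 monitor", 1280, 768),
        ("1280x800 monitor", 1280, 800),
        ("1080p TV", 1920, 1080),
        ("1920x1200 monitor", 1920, 1200),
    ]

    for name, width, height in resolutions:
        print(f"{name:18} {width}x{height} = {width * height:,} pixels")

    # 1280x768 is 720p plus 48 extra lines, and 1920x1200 is 1080p plus 120 extra lines.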

We Only Really See Three Colors

The human eye has three types of “cone” cells that sense color. One group of cells is most sensitive to light near the red end of the spectrum, with reduced sensitivity to nearby colors. The other two groups are most sensitive to colors near green and colors near blue, respectively.

Yellow is between Red and Green on the visible light spectrum. When you see something that is yellow, the Red sensitive cones react some (but not as strongly as they do for Red) and the Green cones react some (but not as strongly as they do for Green). Your brain has learned that when the Red and Green cones are both reacting to a point of light, then that light is “Yellow”.

This means that if someone holds up two sheets of paper at a distance, one of pure yellow and the other covered with tiny alternating Red and Green dots, you will not be able to see the difference. From a distance, both sheets look yellow.
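
A quick way to convince yourself of this is to average a Red dot and a Green dot numerically, which is roughly what your eye does when it blurs them together at a distance. This Python sketch is purely illustrative; the (255, 0, 0) style numbers are just the usual way of writing a color as amounts of Red, Green, and Blue.

    # Blur tiny alternating Red and Green dots together, as the eye does from a distance.
    red = (255, 0, 0)
    green = (0, 255, 0)

    # Average the two dots channel by channel.
    blended = tuple((a + b) // 2 for a, b in zip(red, green))
    print(blended)  # (127, 127, 0): equal parts red and green, no blue -> the eye calls it yellow

The blend comes out dimmer than a solid yellow sheet, which would be (255, 255, 0), because each colored dot covers only half the area; the hue, however, is the same.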

That is why computer monitors and TV sets can produce every color we can see simply by controlling the amount of Red, Green, and Blue light emitted from each tiny dot position on the screen. The LCD panel commonly used today generates a bright white light and then removes colors and blocks part of the light. Plasma and OLED generate light at each dot and control how bright or dark each dot should be.

2D or 3D Hardware Assist

Microsoft released Windows 3.0 in the first quarter of 1990. It was an instant success, but it used a lot of CPU to move the cursor around the screen, move and resize windows, and scroll text up and down. Soon Windows was shipping on every computer, and hardware vendors had a great incentive to optimize its performance. By 1995 every video hardware vendor had added hardware that accelerated the cursor, scrolling, moving, and the rest of the two-dimensional Windows/Office interface.

For the next 10 years, the big problems in Windows video came from 3D games and multimedia (displaying DVD or TV pictures in a window on the screen). Rather than extending the operating system interface that handled the mouse, menus, titles, status bars, buttons, and scroll bars of standard text windows, Microsoft created an entirely separate programming interface for games and multimedia called DirectX.

Of course, there is some overlap. An application that displays TV or a DVD movie on the screen uses the old interface to draw the title bar, the menu, and any buttons that stop, fast forward, or reverse, and uses DirectX only to run the actual movie in the big middle window. For ten years video card vendors pretty much ignored the Windows interface and concentrated on DirectX, games, and multimedia. This meant that the Windows user interface was frozen at a time when video cards had 2-4 megabytes of memory.

This changed with Windows Vista. The old Windows interface now makes use of the same 3D capability on the video card that games have used for the last decade. This allows some of the nifty 3D displays and rotations that Vista makes possible (though almost nothing really uses them).

Integrated

All laptop computers and some desktop computers (with MATX mainboards) have integrated video built into the mainboard. Until 2008, integrated video meant crummy video with limited performance. Today Intel, Nvidia, and AMD/ATI all offer integrated video chips based on current generation GPUs with a reasonable amount of dedicated video memory. This will not provide the kind of performance that gamers want from a 3D system, but it is good enough for every other purpose. It will also reduce electrical power use because integrated video is more efficient than a separate card.

If you look at systems with integrated video, look for DirectX 10 support (rather than 9). Also look for hardware support in the video chip for H.264 and VC1 (the formats used on HD Blu-Ray disk files). Even if you do not intend to use the system for video, support for HD in the hardware is an easy indicator that you are getting a modern system and not one of the older chips that only does DVD MPEG 2.

Connection

The integrated video or separate adapter card connects to the monitor over an old analog VGA connector or a newer digital DVI or HDMI connector. The digital connectors produce the best picture.

DVI and HDMI are basically the same connection with different plugs. You can buy a cable that is DVI on one end and HDMI on the other to convert between them. DVI is mostly a computer standard but is used on a few consumer electronic products, while HDMI is mostly a consumer electronic standard that exists on some computers to connect them to HD TV sets and to allow the playback of Blu-Ray movies on a big screen TV (if your computer has a Blu-Ray disk reader).  

There is a new standard called DisplayPort, but it exists on about three products currently available for sale. It will be important in the future, but whether that means 2009 or 2010 nobody can say at the moment.

If you have lots of money and buy one of those 30 inch Dell, HP, or Samsung monitors with the 4 or 6 million pixel resolutions, you may need a “dual link” DVI connection. That is available on some video adapter cards (check the specs).
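
The reason is bandwidth. Those 30 inch panels run at 2560x1600, which is where the 4 million pixel figure comes from, and a single DVI link tops out at a 165 MHz pixel clock. The Python sketch below is only a back-of-the-envelope check; the 10 percent blanking overhead is an assumed ballpark for reduced-blanking timings, not a number from any spec sheet.

    # Rough check of why a 30 inch 2560x1600 panel needs dual-link DVI.
    SINGLE_LINK_PIXEL_CLOCK = 165e6  # a single DVI link tops out at a 165 MHz pixel clock

    def pixel_clock(width, height, refresh_hz=60, blanking_overhead=0.10):
        """Approximate pixel clock needed, assuming ~10% extra for blanking intervals."""
        return width * height * refresh_hz * (1 + blanking_overhead)

    for width, height in [(1920, 1200), (2560, 1600)]:
        clock = pixel_clock(width, height)
        verdict = "fits a single link" if clock <= SINGLE_LINK_PIXEL_CLOCK else "needs dual link"
        print(f"{width}x{height}: ~{clock / 1e6:.0f} MHz -> {verdict}")

A 1920x1200 monitor squeaks in under the single-link limit; the 30 inch panel needs well over that limit, hence the second link.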

Resolution

The screen has a certain number of lines. Each line has a certain number of dots. In 1987 a screen with 768 lines of 1024 dots was about as much as you could hope for. Today there is no practical limit on screen size or resolution, although it helps to win the lottery before you go shopping.

Screens used to be nearly square (actually a 4:3 ratio). Then DVD movies came along with a more cinematic “wide aspect” image in a 16:9 ratio, which produced a second set of resolutions for wide screen displays. Eventually the video card vendors stopped supporting only a fixed list of resolutions. Today the monitor tells the video card at start up time which resolutions it supports, the card tells the operating system, and Windows lets you choose.
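
If you want to check those ratios yourself, reducing width and height to lowest terms is all it takes. A small Python snippet, using only resolutions already mentioned in this section:

    # Reduce some common resolutions to their width:height aspect ratios.
    from math import gcd

    for width, height in [(1024, 768), (1280, 720), (1920, 1080), (1920, 1200)]:
        divisor = gcd(width, height)
        print(f"{width}x{height} is {width // divisor}:{height // divisor}")

    # 1024x768 comes out 4:3, the two HDTV sizes come out 16:9, and
    # 1920x1200 comes out 8:5, better known as 16:10.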

However, this does not work for devices that really are just TV sets. They may connect to a DVI or HDMI plug on the computer, but the rules of TV are different from the rules for computer monitors. Mostly, there is the problem of “overscan”. For sixty years, TV stations broadcast a slightly larger picture than the TV sets actually displayed. This was required for picture tube TVs, where the picture at the left and right ends of the tube was slightly distorted by the way picture tubes were manufactured. The old sets simply covered the left and right ends of the tube so you didn’t see the distortion. The difference between what was broadcast and what you saw is called “overscan”.

When you hook your computer up to something that really, really wants to be a TV, you may find that the sides, top, and bottom of the desktop have been “pushed” off the edges of the display. You have to install the Nvidia or ATI control program and then adjust the overscan to make sure you see the entire desktop.
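
To get a feel for how much of the desktop can vanish, here is a back-of-the-envelope estimate. The 5 percent figure is only a commonly quoted ballpark for TV overscan, not a measurement from any particular set, so treat the output as illustrative.

    # Rough estimate of how many desktop pixels a TV's overscan can hide.
    def hidden_pixels(width, height, overscan_fraction=0.05):
        """Pixels cropped from each axis if the TV overscans by the given fraction."""
        return round(width * overscan_fraction), round(height * overscan_fraction)

    lost_w, lost_h = hidden_pixels(1920, 1080)
    print(f"A 1080p desktop could lose about {lost_w} columns and {lost_h} rows to overscan.")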

A 20” computer monitor on your desk packs roughly as many dots as a 50” TV set into a fraction of the area, so it shows much finer detail up close. The highest TV resolution is 1920x1080, and from across the room your eye cannot see finer detail than that anyway. Less expensive TV sets have an even lower resolution of 1280x720.
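
You can put numbers on that difference in detail. Taking the 23 inch 1920x1080 monitor mentioned earlier and a 50 inch TV at the same 1920x1080 resolution, this small Python sketch works out how many pixels each one packs into an inch, measured along the diagonal:

    # Pixels per inch for a desktop monitor versus a wall-mounted TV at the same resolution.
    from math import hypot

    def pixels_per_inch(width, height, diagonal_inches):
        """Pixels along the diagonal divided by the diagonal size in inches."""
        return hypot(width, height) / diagonal_inches

    print(f"23 inch 1920x1080 monitor: {pixels_per_inch(1920, 1080, 23):.0f} pixels per inch")
    print(f"50 inch 1920x1080 TV:      {pixels_per_inch(1920, 1080, 50):.0f} pixels per inch")

The TV spreads the same 1920x1080 grid over more than four times the area, which is fine from across the room but looks coarse from two feet away.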