Stands for "Video Graphics Array." VGA is the standard monitor or display interface used in most PCs. If a monitor is VGA-compatible, it should work with most computers. The VGA standard was originally developed by IBM in 1987 and allowed for a display resolution of 640x480 pixels. Since then, many revisions of the standard have been introduced. The most common is SVGA (Super Video Graphics Array), which allows for resolutions greater than 640x480, such as 800x600 or 1024x768. A standard VGA connector has 15 pins and is shaped like a trapezoid.


Today, the VGA analog interface is also used for high-definition video, including resolutions of 1080p and higher. While the transmission bandwidth of VGA is high enough to support even higher-resolution playback, picture quality can degrade depending on cable quality and length. How discernible this degradation is depends on the individual's eyesight and the display, though it is more noticeable when switching to and from digital inputs such as HDMI or DVI.
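The bandwidth demands above can be made concrete with a rough pixel-clock estimate. The sketch below is an illustration, not an exact timing calculation: it assumes a flat ~25% overhead for horizontal and vertical blanking, whereas real timing standards (such as VESA CVT) compute blanking per mode. The function name and the overhead factor are assumptions for this example.

```python
# Rough pixel-clock estimate for an analog VGA signal. The 1.25 factor
# is an assumed ~25% blanking overhead; real standards vary per mode.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(640, 480), (1024, 768), (1920, 1080)]:
    print(f"{w}x{h} @ 60 Hz: roughly {approx_pixel_clock_mhz(w, h, 60):.0f} MHz")
```

The jump from tens of MHz at 640x480 to roughly 150 MHz at 1080p is why cable quality and length matter far more at high resolutions.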

Standard graphics modes are:

• 640x480 in 16 colours or monochrome (the latter matching IBM's lesser Multi-Color Graphics Array (MCGA) standard)

• 640x350 or 640x200 in 16 colours or monochrome (EGA compatibility mode)

• 320x200 in 4 or 16 colours

• 320x200 in 256 colours (Mode 13h)
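These mode limits follow directly from framebuffer size: the VGA memory window at segment A000h is 64 KB, so 320x200 at one byte per pixel (Mode 13h) just fits, while the 16-colour 640x480 mode has to be split across four bit planes. A minimal sketch of the arithmetic, using a hypothetical helper name:

```python
# Framebuffer size for standard VGA modes. The 64 KB window at A000h
# is why 320x200 is the largest mode with a full byte per pixel.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 8))  # Mode 13h: 64,000 bytes, just under 64 KB
print(framebuffer_bytes(640, 480, 4))  # 16-colour mode: 153,600 bytes total,
                                       # stored as four 38,400-byte planes
```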


The 640x480 16-colour and 320x200 256-colour modes had fully redefinable palettes, with each entry selectable from an 18-bit (262,144-colour) RGB table, although the high-resolution mode is most familiar from its use with a fixed palette under Microsoft Windows. The other colour modes defaulted to standard EGA- or CGA-compatible palettes (including the ability for programs to redefine the 16-colour EGA palette from a master 64-colour table), but could still be redefined if desired using VGA-specific commands.
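The 18-bit table comes from the VGA DAC storing 6 bits per channel (values 0-63), so 64 x 64 x 64 = 262,144 possible colours. A common way to display such an entry on modern hardware is to rescale each channel to the 0-255 range; the sketch below illustrates that conversion (the function name is an assumption for this example):

```python
# A VGA DAC palette entry holds 6 bits per channel (0-63), giving the
# 18-bit, 262,144-colour table described above. Rescale each channel
# to 0-255 to get an equivalent modern 24-bit RGB triple.
def dac_to_rgb24(r6, g6, b6):
    scale = lambda c: c * 255 // 63
    return (scale(r6), scale(g6), scale(b6))

print(dac_to_rgb24(63, 63, 63))  # brightest white: (255, 255, 255)
print(dac_to_rgb24(42, 21, 0))   # a typical EGA-style brown
```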
