What Does Graphics Card Mean?
A video card, also called a graphics card (among other names), is responsible for processing the data that comes from the main processor (the CPU) and converting it into information that can be displayed on devices such as monitors and televisions. It is worth mentioning that this component can come in a great variety of architectures, although the same name is commonly used even when it refers to a video chip integrated into a motherboard; in that case, it is more accurate to speak of the GPU (Graphics Processing Unit).
Since their inception, graphics cards have incorporated various features and functions, such as the ability to tune in to television broadcasts or to capture video from an external device. It is important to note that this is not a component found exclusively in current computers: graphics cards have existed for more than four decades, and today they are also an indispensable part of video game consoles, both portable and home systems.
Their creation dates back to the late 1960s, when printers were left behind as the means of visualizing computer activity and monitors came into use. At first, resolutions were tiny compared to the high definition we all know today. It was thanks to Motorola's research and development work that the chips' characteristics became more complex, and its products led to the standardization of the term video card.
As personal computers and the first video game consoles became popular, graphics chips were integrated into motherboards, since this considerably reduced manufacturing costs. At first glance, this presents a clear disadvantage: the impossibility of upgrading the hardware. However, these were closed systems, designed with each and every one of their components in mind, so that the final product was consistent and offered the highest possible performance.
It should be noted that this is still the case with consoles today, and it is thanks to this kind of unalterable design that, after a few years, developers obtain results far superior to their first experiments. This is not possible on a PC, no matter how powerful, since a software company cannot account for every possible combination of its customers' machines. In addition, a computer's architecture has weak points precisely because its parts are interchangeable, the most notable being the distance between the memory, the graphics card and the main processor.
In the early 1980s, IBM built on the design of the unforgettable Apple II and popularized the interchangeable video card, although in its case the card only offered the ability to display characters on the screen. It was an adapter with a modest 4 KB of memory (current cards can have 2 GB, more than 500,000 times as much) and was used with a monochrome monitor. This was the starting point, and improvements were not long in coming.
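As a quick check of that ratio (assuming binary prefixes, where 1 GB = 1024 MB and 1 MB = 1024 KB, an assumption not spelled out in the text):

\[
\frac{2\ \text{GB}}{4\ \text{KB}} = \frac{2 \times 1024 \times 1024\ \text{KB}}{4\ \text{KB}} = 524{,}288 \approx 5 \times 10^{5}
\]

That is, roughly half a million times the capacity of that first adapter.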
Some time later, IBM standardized the term VGA, which refers to a video card technology capable of offering a resolution of 640 pixels wide by 480 high, as well as to the monitors that could display those images and the connector required to use them. After work by several companies dedicated exclusively to graphics, Super VGA (also known as SVGA) saw the light of day, increasing the available resolution (to 1024 x 768) as well as the number of colors that could be displayed simultaneously (from 16 colors at 640 x 480 to 256 at 1024 x 768).
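As a rough back-of-the-envelope illustration (the bit depths are inferred from the color counts given, not figures stated in the text), the memory needed just to store one full screen grows considerably between the two modes: 16 colors correspond to 4 bits per pixel and 256 colors to 8 bits per pixel, so

\[
640 \times 480 \times \frac{4}{8} = 153{,}600\ \text{bytes} \approx 150\ \text{KB}, \qquad 1024 \times 768 \times \frac{8}{8} = 786{,}432\ \text{bytes} = 768\ \text{KB},
\]

which helps explain why video cards needed steadily more memory as these standards appeared.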