If you’ve been thinking about building your PC, then there’s no better time to do so than now.
While picking up a pre-built computer is always an option, building one yourself often provides better value and is an excellent experience for many.
But, if you decide to go down this PC-building route, how easy is it to do so?
Where do all the components go? Where does the motherboard fit? How much RAM do you need?
And: Do VGA cables go into the graphics card?
The short answer is yes: if your graphics card has a VGA port, the cable plugs straight into it, connecting your monitor to the GPU's output.
That said, VGA is an aging analog standard, so it won't let a modern monitor take full advantage of your GPU. It has long been superseded by digital connectors like HDMI and DisplayPort.
That said, if you want to learn more about VGA cables, then this article will go over what you need to know about VGA, GPUs, and everything else in-between.
What is VGA?
To work and complete tasks, a computer typically uses various cables and connectors. VGA was one of them.
Introduced back in 1987 by IBM, VGA (which stands for Video Graphics Array) is a video connectivity port that was long a standard for computers.
Having been in use for over 30 years at this point, the VGA cable sports the familiar male 15-pin D-sub connector.
The cable transfers an analog RGB signal from another connected device — in this case, your computer’s GPU. Furthermore, it’s able to support both standard-definition and high-definition resolutions.
There are plenty of VGA cables available on the market, which also come with other names, including:
- RGB connector
- HD15
- Mini Sub D15
- Mini D15
Unlike the more recent connectors that followed it, however, VGA cables aren't capable of transmitting audio signals, which means you need a separate connection for sound.
This can be frustrating for any media where audio is essential.
This limitation also led to the creation of the HDMI connector back in the early 2000s, as it was advanced enough to provide a higher resolution while also carrying both audio and video signals.
Luckily, there are options to convert your VGA cable to HDMI, like an adapter cable, although a native HDMI output will be more convenient and provide better quality.
What is a GPU?
Commonly known as the graphics card, the GPU (graphics processing unit) is one of the most critical components of any modern computer system. Essentially, the GPU is responsible for the images that are displayed on your monitor.
Initially, the GPU was invented for gaming purposes, as well as other graphically intensive tasks.
The first widely available GPU marketed as such was the GeForce 256 add-in board (AIB), released by Nvidia in 1999. Nvidia later built on this lead with CUDA (Compute Unified Device Architecture), its proprietary platform for general-purpose computing on its own GPUs.
Since then, the GPU has constantly evolved to adapt to more graphically intensive environments, like modern video games.
Most notably, GPUs take the draw commands issued by the CPU and perform the heavy math in parallel, rendering game graphics in real time.
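To see the kind of data-parallel math this involves, here's a rough CPU-side sketch using NumPy. The vectorized array operation stands in for work a GPU would spread across thousands of cores; the `brighten` function and the flat gray test frame are illustrative inventions, not anything from a real graphics pipeline.

```python
import numpy as np

# Simulate the kind of data-parallel math a GPU performs when shading pixels:
# apply the same brightness transform to every pixel of a frame at once.
def brighten(pixels: np.ndarray, factor: float) -> np.ndarray:
    # One vectorized expression touches every element; on a GPU, each
    # pixel would effectively be handled by its own thread in parallel.
    return np.clip(pixels * factor, 0, 255).astype(np.uint8)

# A flat gray 640x480 RGB frame (the classic VGA resolution).
frame = np.full((480, 640, 3), 100, dtype=np.uint8)
brighter = brighten(frame, 1.5)
print(brighter[0, 0])  # every channel scaled from 100 to 150
```

The same "one operation, millions of pixels" shape is why graphics work maps so naturally onto GPU hardware.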
But it’s not just gaming computers that use GPUs, as these powerful components are also utilized for other purposes.
For one thing, the GPU’s ability to solve complex mathematical calculations is also the exact reason why Bitcoin miners used them in the past to compute intensive hash problems.
Aside from that, GPUs are also used for:
- Genomic sequencing in life sciences
- Back-testing financial models and trading strategies
- Fraud detection
- AI and machine learning workloads, such as in self-driving cars
- 3D rendering
- Video editing
However, they are most commonly found in gaming PCs, accelerating graphics workloads to enable the most immersive game environments.
GPUs come in two forms: integrated and discrete. The former is built into your computer's processor (as in most laptops and notebooks), while the latter is a separate card that plugs into your system. Usually, the discrete type is what's used in building gaming PCs.
Do VGA Cables Go Into the Graphics Card?
While GPUs are powerful and very useful, you won't be able to use them without a cable connecting them to your display, which is where VGA comes into the picture.
Because gaming monitors need to display high refresh rates and graphical fidelity in real time, they connect directly to the GPU's display outputs.
In general, a CPU with integrated graphics can run games on its own, especially older ones that don't require high graphical settings.
However, the bigger and more graphically-intensive a game becomes, the more power it needs, hence your GPU.
While your CPU handles all the computer program's instructions, your GPU helps accelerate the creation and rendering of images, video, and animations.
Unfortunately, VGA ports on GPUs are just about obsolete; the analog connection can't carry the higher pixel densities and refresh rates that modern games demand.
The good news is that better alternatives are now readily available.
Instead of VGA, we now have HDMI, DisplayPort, and Thunderbolt to connect computers to monitors and other devices.
And, unlike VGA, all of these carry a robust digital signal, which lets them provide higher resolutions like 1440p, 4K, and even 8K gaming (given that you're using a recent standard such as DisplayPort 1.4).
Do You Need a GPU for VGA Cables?
Yes. Some form of GPU, whether integrated or discrete, generates every video signal your monitor receives, and that's especially true if you're going to use your monitor for gaming.
That being said, most modern GPUs no longer come with VGA sockets, nor do gaming monitors.
As early as the 2010s, the VGA cable was being phased out in favor of HDMI and DisplayPort. Both are now the standard, and it would be counterintuitive to build a modern PC using VGA cables unless you're specifically building a vintage machine for older games.
The original VGA standard delivered a maximum resolution of 640 x 480 at a 60 Hz refresh rate; later extensions pushed the same analog connector up to HD (1080p), and it remained the standard for computer monitors for years.
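A quick back-of-the-envelope calculation shows how far modern displays have outgrown that original 640 x 480 at 60 Hz mode. This sketch multiplies raw pixel counts by refresh rate and deliberately ignores blanking intervals and color depth, so the numbers are illustrative rather than exact cable bandwidths.

```python
# Back-of-the-envelope pixel throughput: why 640x480@60Hz was manageable
# for analog VGA, but modern resolutions demand far more (blanking
# intervals and bits-per-pixel are ignored for simplicity).
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

vga_era = pixels_per_second(640, 480, 60)     # original VGA mode
uhd_4k = pixels_per_second(3840, 2160, 60)    # a modern 4K gaming target

print(f"VGA 640x480@60:  {vga_era:,} pixels/s")
print(f"4K 3840x2160@60: {uhd_4k:,} pixels/s")
print(f"4K needs roughly {uhd_4k // vga_era}x the throughput")
```

Even this simplified comparison makes it clear why a connector designed around the first figure couldn't keep up with the second.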
However, it has since been replaced with more modern solutions like HDMI and DisplayPort, which are digital rather than analog and can carry audio alongside video over a single cable.