For the longest time, a VGA connection was the standard. It started with a max resolution of 640 x 480.
Much has changed since then, and the capabilities of VGA cables have increased drastically. Today, I want to discuss whether VGA is that bad.
Let’s take a look.
The maximum resolution for VGA is 1080p at a 60 Hz refresh rate. For most people, that is more than enough for daily tasks.
However, when you push VGA above 720p, you start to notice a significant loss in quality. VGA is not that bad for people with light workloads.
I am sure you still have a lot of questions. My goal is to answer as many of those questions as possible.
We will start by taking a look at how VGA stacks up in 2022. So, let’s jump into it.
VGA in 2022: Is It That Bad?
VGA is quickly becoming outdated.
It has been losing relevance over the last few years, and you might struggle to find a new motherboard that supports it. New motherboards generally support HDMI and DisplayPort instead.
After looking at most Nvidia 30 series, Nvidia Quadro 4000 series, and AMD 6000 series graphics cards, it is clear that these manufacturers are phasing out VGA. That does not mean they don’t support it, though.
If you need to use VGA on a new system, you must use an adapter that converts DisplayPort or HDMI to VGA. Modern GPUs output a digital signal, while VGA carries an analog one.
My point is that the lack of support is making it harder to justify using VGA.
We also need to consider what the other connections, such as HDMI and DisplayPort, offer:
- VGA supports a maximum of 1080p, but already struggles to do so.
- HDMI 2.1 can reach up to 10K at 120 Hz. (Unrealistic with today’s hardware but still incredible.)
- DisplayPort is similar to HDMI. However, with a few compression techniques, it can push out 16K at 120 Hz.
- DVI is complicated. Single-link DVI tops out at 1920 x 1200 (1200p), and dual-link can go higher, but it is one of the rarest connections today.
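To see why VGA's analog signal strains at higher resolutions, it helps to estimate the pixel clock a cable must carry. This is a rough back-of-the-envelope sketch, not an exact VESA timing calculation; the ~1.2x blanking-overhead factor is an assumption (real timings vary by mode).

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.2):
    """Rough pixel clock estimate in MHz.

    The `blanking` factor approximates horizontal/vertical blanking
    overhead (an assumption; exact VESA timings differ per mode).
    """
    return width * height * refresh_hz * blanking / 1e6

for name, (w, h) in {"720p": (1280, 720),
                     "1080p": (1920, 1080),
                     "4K": (3840, 2160)}.items():
    print(f"{name} @ 60 Hz: ~{approx_pixel_clock_mhz(w, h, 60):.0f} MHz")
```

The estimate lands near 66 MHz for 720p and roughly 150 MHz for 1080p, which matches the article's experience: the higher the analog frequency VGA has to push through its cable and DAC, the more signal quality suffers.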
Is VGA Bad For Gaming?
According to a report by Statista, there are over 1.7 billion gamers in the world. The majority of them don’t have high-end hardware.
So, to say VGA is bad for gaming would be insulting to the hundreds of millions of people who still use the technology. Also, if you are using VGA, chances are you aren’t trying to run games at a higher resolution than 1080p.
With all of that said, VGA loses quality as you bump up the resolution. So, you might experience a few problems while playing with VGA at 1080p.
These problems include:
- Resolution drops.
- Screen tearing.
- Frame time lag. (It's a huge issue and can make games unplayable. I have seen it happen to someone trying to play Battlefield 2042 at 1080p via VGA; the game was simply too demanding.)
- Blank screens. They only last a second or two but break immersion, and it can feel game-breaking when it happens every few minutes.
Look, if you have a monitor that supports VGA and none of the other connections, you should still play games. Using VGA is not bad for gaming; it is just not the best connection for it.
If you have a VGA monitor and a GPU with both VGA and HDMI/DisplayPort, you will always be limited to the lowest common denominator.
So, using an adapter from HDMI to VGA means you are limited to the capabilities of VGA. In some cases, a poor adapter can degrade the signal even further.
Is VGA Bad For Content Creation?
This is a complicated question to answer.
There are many variables that we need to consider. There are also a lot of misconceptions that I see people throwing around.
VGA supports the full 24-bit RGB color spectrum. So, in terms of color accuracy, you shouldn’t run into any problems, right?
Well, you run into problems when the signal quality going from your PC to the monitor starts to decline. When the quality dips, colors might not be represented accurately.
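As a quick sanity check on that 24-bit figure, 8 bits per channel across three channels works out to about 16.7 million distinct colors:

```python
# 24-bit RGB: 8 bits each for red, green, and blue.
bits_per_channel = 8
channels = 3
total_colors = 2 ** (bits_per_channel * channels)
print(f"{total_colors:,} colors")  # 16,777,216
```

So the color palette itself is not VGA's weakness; the weakness is that those values travel as analog voltages, which noise and cable loss can distort before they reach the screen.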
So, for the best color accuracy, you would ideally stay at 720p. You can edit video at 720p while targeting a higher output resolution, but it makes life a bit more complicated.
As I mentioned with gaming, VGA is not particularly bad for content creation, but it is not the best option, especially as display technology advances.
Let me explain:
If you create content for TV, you would want to be able to target 4K resolution. You also want to see your work in 4K with and without HDR. However, you cannot do that using VGA.
Is VGA Bad For Office Work?
VGA is not bad for office work, but I can see scenarios where using it could impact your productivity.
We have established that VGA struggles at 1080p and cannot go above it, so what does this mean for office work?
Higher resolutions offer more real estate on your monitor; this allows you to do more, especially if you have two monitors side-by-side.
It is the same concept that we covered in the content creation section for design or development work.
Why Doesn’t New Hardware Have VGA Connections?
Anyone who buys a new piece of equipment is unlikely to use VGA.
So, instead of putting resources into supporting an outdated standard, companies devote those resources to making the new standards as good as possible.
You also need to consider that if someone buys a new Quadro GPU for content creation, they can most likely get a monitor to go with it.
If you decide to get a very cheap monitor, it might have a VGA connection. This is great for people who are still using older GPUs or motherboards.
On the other hand, if you have a monitor with only VGA, you will probably need adapters if you decide to get a new motherboard or GPU.
While writing this article, I encountered a few challenging moments. How can you ultimately say whether VGA is bad?
It has worked for so many years, and a lot of people still use it.
That is why the most accurate answer is to say that it is not the best.
Vance is a dad, former software engineer, and tech lover. Knowing how a computer works becomes handy when he builds Pointer Clicker. His quest is to make tech more accessible for non-techie users. When not working with his team, you can find him caring for his son and gaming.
Sunday 23rd of January 2022
Some would say VGA with a flat panel screen opens the door to potential signal conversion issues. Flat panels use digital signals, so an analog signal would be converted twice: once from digital to analog at the PC output, and then from analog back to digital at the monitor. If you use good quality D/A converters and cable, I doubt this is a problem, but it does add complexity to the signal path, whereas back with CRT monitors the signal was always analog. Still, I know plenty of systems using VGA with no complaints. Some even take a digital output and use a converter back to VGA. Seems to work just fine up to 1080p, so I guess whatever works, right?