Look at your TV, laptop, or smartphone screen very closely. You’ll see small dots or pixels that change color with the visual running on the screen.
Depending on the display resolution, the dots may be visible from a fair distance away or may not be discernible at all, no matter how hard you look.
The pixels in 4K and 8K displays are extremely hard to detect. On a 480p or 720p screen, pixels float right in front of your eyes, even from a few inches away.
So, are screen resolutions only about the number of pixels displays pack in vertically and horizontally? Aren’t there other differences between them or ways one is better than another?
Keep reading to find out and get the complete lowdown on the different display resolutions and what they bring to the table.
- What is WXGA?
- What is 720p (HD)?
- What is 1080p (FHD)?
- What is 4K (UHD)?
- A Few Other Display Resolutions Worth Mentioning
- WXGA, 720p, 1080p, and 4K: Stacking Them Up
- Comparison Table: WXGA, 720p, 1080p, and 4K
What is WXGA?
WXGA (wide extended graphics array) is a 1280x800 display resolution, meaning it has 1,280 columns and 800 rows of pixels. It’s a popular display resolution for tablets and laptops with a 16:10 aspect ratio.
There’s also a 16:9 variant of WXGA with a 1366x768 resolution, and WXGA+, which is 1440x900 pixels at a 16:10 aspect ratio.
WXGA was a popular display resolution for LCD TVs around 2006. After FHD made its foray, WXGA became associated mostly with small, low-end screens.
What is 720p (HD)?
Quite similar to WXGA, 720p (HD) is a screen resolution with 1,280 horizontal and 720 vertical pixels, or less than half the pixel count of 1080p. The “p” in 720p stands for “progressive scanning,” or non-interlaced video (more on that later).
When HD televisions first came out, 720p was the display and broadcast resolution of choice. It looked noticeably sharper than the standard definition (SD) or XGA displays.
However, with the introduction of 1080p screens, 720p lost ground and is no longer considered the baseline for sharp visuals on laptop and desktop computer screens.
Technically, the abbreviation “HD” denotes any resolution greater than standard definition, so 1080p and even 4K can be referred to as HD.
But the HD label stuck to 720p since it was the first resolution to one-up SD.
What is 1080p (FHD)?
Referred to as Full HD or FHD, the 1080p resolution is 1,920 columns and 1,080 rows of pixels in the 16:9 aspect ratio. That’s 2.25 times the pixel count of 720p. The 4:3 variant of 1080p has a resolution of 1440x1080 pixels.
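To see where the 2.25x figure comes from, here’s a quick back-of-the-envelope check in plain Python:

```python
# Raw pixel counts: 1080p vs. 720p
fhd = 1920 * 1080  # 2,073,600 pixels
hd = 1280 * 720    # 921,600 pixels

print(round(fhd / hd, 2))  # 2.25
```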
Then there are ultrawide FHD (2560x1080; 21:9) and Full HD+ (2160x1080; 18:9, or 2400x1080; 20:9) resolution variations.
1080p has been the standard resolution for display hardware, television shows, and movies for quite some time.
Things, however, have transitioned to 4K in the past few years. But gaming or streaming at resolutions higher than 1080p may warrant a more powerful video card.
1080p gaming, as a result, is very prevalent among current mainstream gamers.
Progressive and Interlaced FHD Screen Resolution
There’s another flavor of FHD, called 1080i, wherein the letter “i” stands for “interlaced.” 1080p and 1080i differ in the way the visuals get scanned.
A non-interlaced or progressive FHD video (1080p) is refreshed sequentially, line by line from top to bottom, without skipping a line.
With an interlaced video, alternate lines of a visual are refreshed during the initial scan, and the remaining lines are filled in on the next pass to complete the image.
In numerical speak, the 1,080 lines in an interlaced video are split into two 540-line fields. With 1080p, there’s no such demarcation, and all lines are drawn sequentially.
Similarly, in a 720p video, 720 lines of resolution are progressively scanned.
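The split described above can be sketched in a few lines of Python. This is a toy illustration, with line numbers standing in for pixel rows, not a real video pipeline:

```python
def progressive_scan(lines):
    """1080p-style: draw every line in order, top to bottom, in one pass."""
    return list(lines)

def interlaced_scan(lines):
    """1080i-style: draw the even-indexed lines first, then the odd ones."""
    first_field = lines[0::2]   # lines 0, 2, 4, ...
    second_field = lines[1::2]  # lines 1, 3, 5, ...
    return first_field, second_field

frame = list(range(1080))  # 1,080 line numbers standing in for pixel rows

full_pass = progressive_scan(frame)
field_1, field_2 = interlaced_scan(frame)

print(len(full_pass))              # 1080: one complete pass
print(len(field_1), len(field_2))  # 540 540: two half-resolution fields
```

Weaving the two fields back together reconstructs the full frame, which is exactly what a 1080i display does across two refreshes.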
Although these refreshes happen within a fraction of a second, you may notice a difference in how smoothly the videos are rendered if you watch them closely or run them in slow motion.
Compared to 1080i, 1080p is better since it projects visuals onto the screen much more efficiently, improving the overall experience. 1080p is, therefore, the standard.
Non-interlaced videos with their progressive scanning method have a decreased likelihood of line movements or discernible flickering compared to interlaced videos.
What is 4K (UHD)?
4K is a video resolution with roughly four times the pixel count of 1080p. 4K pixel arrangements include 4096x2160 (true 4K) and 3840x2160 (UHD). 4K resolution is, at times, also referred to as 2160p.
The terms 4K and UHD (ultra-high definition) are often used interchangeably but aren’t synonymous. The abbreviation UHD can also denote resolutions higher than 4K. 8K, for instance, is also UHD.
UHD is the 4K standard of choice for the TV industry and online video-sharing websites, such as YouTube.
True 4K resolution (4096x2160) is primarily used in digital cinema. It is the native 4K resolution of all 4K monitors and digital projectors compliant with the Digital Cinema Initiatives (DCI) standard.
The difference between true 4K and UHD boils down to their marginally different aspect ratios. But because UHD’s horizontal resolution of 3,840 pixels doesn’t cross the 4K threshold, DCI doesn’t consider UHD as 4K.
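The aspect-ratio difference between the two is easy to verify with Python’s standard library:

```python
from math import gcd

def describe(width, height):
    """Return total pixels, reduced aspect ratio, and decimal ratio."""
    d = gcd(width, height)
    ratio = f"{width // d}:{height // d}"
    return width * height, ratio, round(width / height, 2)

print(describe(4096, 2160))  # DCI "true" 4K: (8847360, '256:135', 1.9)
print(describe(3840, 2160))  # UHD: (8294400, '16:9', 1.78)
```

DCI 4K is slightly wider (about 1.9:1) than UHD’s 16:9 (about 1.78:1), even though both share the same 2,160-pixel height.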
A few other 4K resolutions include:
- 4096x2560 (16:10; 1.60:1)
- 4096x2304 (16:9; 1.77:1)
- 3840x2400 (16:10; 1.60:1)
- 3840x1600 (12:5; 2.40:1)
- 3840x1080 (32:9; 3.55:1)
The Beginnings of 4K
Depending on the device and implementation, 4K arrived on the scene at different times.
Hitachi, in 1984, released a CMOS graphics processor that supported up to 4K screen resolution in monochrome.
In 2003, the Dalsa Origin, the first commercially available 4K camera, was released. YouTube started supporting 4K video uploads in 2010, around the time several leading camera companies began making 4K cameras.
Movie theaters started 4K movie screenings in 2011, although 4K projectors were doing the rounds as early as 2004. The first 4K projectors for homes were released in 2012 by Sony.
After that, gaming consoles, smartphones, and other electronic devices adopted 4K screens and video recording capabilities.
4K Streaming Requirements
Internet bandwidth of 10 to 20 Mbps is needed for streaming online 4K content. Some service providers could require more.
Although a 4K TV or an external monitor is not mandatory to play 4K content, 4K videos look their best on a native 4K display.
A Few Other Display Resolutions Worth Mentioning
There are quite a few other screen resolutions preceding or interspersed with the above.
A 480p screen is a standard definition (SD) display, typically with 640x480 pixels.
480p’s vertical resolution is always 480 pixels, but the horizontal resolution can be 640 pixels (4:3) or 854 pixels (16:9). The 4:3 aspect ratio has been more the norm.
The 640x480 arrangement dates back to the 1980s, when IBM launched the first VGA displays. Several early plasma televisions used 480p screens, mainly in the 4:3 aspect ratio.
Besides 4:3 and 16:9, other 480p video aspect ratios include:
- 3:2 (720x480)
- 16:10 (768x480)
- 5:3 (800x480)
- 18:10 (864x480)
Although it’s been a long time since 480p was the norm for televisions, devices with smaller screens can still look plenty sharp at 480p.
qHD is short for “quarter high definition.” It’s a fourth of 1080p resolution. Its screen resolution is 960x540 pixels. It often gets confused with QHD, which stands for “quad high definition.”
qHD is a slight bump over 480p and was popular during the early 2010s, the heyday of small-screen smartphones like the Samsung Galaxy S4 Mini, HTC Sensation, and Microsoft Lumia 535.
Like 480p, qHD is almost non-existent now, except in devices with tiny screens like a smartwatch.
QHD is four times 720p, or 2560x1440 pixels. Per the DCI, 2K resolution is actually 2048x1080 pixels, but most laptops and PC monitors marketed as 2K have 1440p displays.
QHD is also loosely referred to as 2K, or sometimes “2.5K,” since its horizontal resolution sits between 2,000 and 3,000 pixels.
The 16:9 wide aspect ratio QHD is abbreviated as WQHD. Displays with the ultra-wide 21:9 aspect ratio are called Ultra-wide QHD and have a resolution of 3440x1440 pixels.
QHD is not as big of a jump as 4K is from 1080p. Most people, therefore, opt for a 4K monitor or TV as an upgrade over their FHD screen devices. In fact, there aren’t many QHD televisions.
Phone manufacturers tried QHD panels on their smartphones when 1080p screens were standard. But because 2K did not look significantly better than 1080p and was more power-hungry, companies swiftly abandoned 2K on phones. FHD+ is the new flavor.
Some smartphone companies, however, use 2K displays on their flagship devices to justify the price difference between their premium offerings and mid-range devices.
Similar to 2K, 5K (5120x2880; 16:9) is somewhat of a niche resolution between 4K and 8K. Most manufacturers don’t make or employ 5K screens in their devices.
The resolution is certainly not standard with TVs or in the digital cinematography space where 4K is more the standard.
5K is mainly employed in computer monitors. If you can discern pixels on a 4K monitor, 5K would iron them out.
A 5K screen is also good at addressing some of the shortcomings of viewing 4K content on a 4K display.
For example, viewing a 4K image pixel-for-pixel on a 4K display leaves no room to see anything else on the screen. A 5K display, on the other hand, provides about 720 more vertical pixels and 1,280 additional horizontal pixels with a 4K picture open.
In the real world, that means enough room for an Explorer window, web page, multiple software utilities, etc., besides a 4K image. The extra display estate would help people who use several windows simultaneously.
Introduced in September 2014, the now-discontinued Dell UltraSharp UP2715K was the first 5K monitor. Some noteworthy currently available 5K panels include the 2020 5K iMac and the LG 27MD5KL-B UltraFine 5K Display.
8K resolution is what 4K was not very long ago — expensive and out of reach. If you’re holding off your 4K TV purchase for an 8K TV, be ready to spend a lot of money.
Besides being ultra-expensive, 8K isn’t a significant jump over 4K at the same display size. A 45-inch 4K TV looks noticeably sharper than an FHD TV of the same size; the same leap isn’t as apparent from 4K to 8K.
8K needs screens bigger than most existing TVs to show its magic. A 65-inch TV is the bare minimum, but 80 inches or larger is where 8K can truly shine. Not to mention, those televisions cost a lot.
Also, movies are not made in 8K yet, and there isn’t much 8K content in general.
Even the most tech-savvy individual content creators for the web and indie production houses have just started shooting in native 4K. It would take them a few years to transition from 4K to 8K. So that’s yet another pushback.
Considering the hurdles, it could take some time for 8K to truly penetrate the market.
WXGA, 720p, 1080p, and 4K: Stacking Them Up
Technical differences aside, no single display resolution completely dominates or replaces the others. Here’s where each one stands at present:
4K Is the Current Favorite
4K resolution is slowly becoming the norm in TVs after being out of reach of the masses for long.
Due to this democratization of 4K, movie studios and content producers are making more 4K material than before. Or perhaps it’s the other way around, with the content driving adoption.
Besides 4K TVs, 4K gaming monitors, 4K security cameras, and smartphones with native 4K displays or 4K video recording capabilities are also cropping up in large numbers.
4K packs a lot more detail into every inch. Curves are smoother and lines sharper in 4K. If the move from 720p to 1080p was noticeable, the transition from FHD to 4K is plain to see.
5K and 8K are, no doubt, sharper than 4K, but those panels are expensive and, therefore, will take time to become the standard—particularly 8K.
FHD Resolution Will Be Around
If you thought the emergence of 4K marked the beginning of the decline of 1080p, you’re not wrong.
Contrary to what most people might have expected, though, 1080p is still around and will continue to be for a few years at least, mainly through smartphones and external monitors.
Full HD TVs, however, could become history sooner rather than later as the panels are getting bigger and 1080p isn’t sharp enough to complement those giant screens. Smartphones have found a sweet spot with 1080p+ resolution displays.
720p and lower-resolution displays, on the other hand, are now pretty much non-existent on phones and other devices with larger screens.
Cost and Size Difference
The more pixels, the higher the price. A 4K display costs more to produce than an FHD panel. And it’s not just the pixel count; the panel type also determines the cost.
On another note, the lower the resolution, the smaller the file size. For example, 60 minutes of 480p video occupies approximately 500 MB of space on your disk or card.
An hour of RAW 4K content, on the contrary, can consume more than 100 GB of storage. Not to mention, frame rates and bitrates also affect the file size.
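The relationship between bitrate, duration, and file size can be sketched as follows. The bitrates used here are rough, assumed figures for illustration, not measurements:

```python
def video_size_gb(bitrate_mbps, minutes):
    """Approximate file size in GB for a given average bitrate."""
    bits = bitrate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000  # bits -> bytes -> GB

# ~1.1 Mbps is a plausible compressed 480p bitrate (assumed figure):
# roughly 0.5 GB for an hour
print(video_size_gb(1.1, 60))

# ~250 Mbps is in the ballpark of RAW-class 4K capture (assumed figure):
# roughly 112 GB for an hour
print(video_size_gb(250, 60))
```

This is why frame rate and codec matter as much as resolution: they determine the bitrate, and the bitrate, not the pixel count alone, determines the file size.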
Comparison Table: WXGA, 720p, 1080p, and 4K
Here’s a more number-based comparison between the different display resolutions:
| | WXGA | 720p | 1080p | 4K |
|---|---|---|---|---|
| Other name(s) | n/a | HD, HD+ | FHD, FHD+ | UHD |
| Aspect ratio(s) | 16:10; 16:9 | 4:3; 16:9 | 4:3; 16:9; 21:9; 18:9; 16:10; 3:2 | 4:3; 16:10; 16:9 |
- The total number of pixels is calculated by multiplying the pixels across the length and breadth of the screen. For instance, 1280x800 = 1.02 million pixels.
- Based on the aspect ratio, the number of horizontal and vertical pixels could vary. For example, 1280x720 is 16:9 HD resolution. HD at 4:3 is 960x720.
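Applying that multiplication to the resolutions discussed above:

```python
# Total pixel counts for the resolutions compared in this article
resolutions = {
    "WXGA": (1280, 800),
    "720p (HD)": (1280, 720),
    "1080p (FHD)": (1920, 1080),
    "4K (UHD)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / 1e6:.2f} million)")
```

WXGA works out to about 1.02 million pixels, 720p to 0.92 million, 1080p to 2.07 million, and 4K UHD to 8.29 million.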
What is WUXGA?
WUXGA (widescreen ultra-extended graphics array) is a display resolution with a 16:10 aspect ratio and 1920x1200 pixels arrangement. The resolution is commonly found in laptops and desktop computer monitors.
As important as resolution is to the sharpness and detail in an image, it’s not the be-all and end-all of picture quality. In fact, it’s entirely possible for a 4K screen to look inferior to a 2K or even an FHD display if the panel is not good.
4K is a better screen to look at, but don’t ignore other aspects of the screen, such as color reproduction, contrast ratio, viewing angles, maximum brightness, etc., when shopping for a TV or external monitor. The display type (OLED, IPS LCD, QLED) is another critical aspect not to ignore.
In the future, 8K or even 10K will become more accessible. But we are quite a few years away from that reality, so don’t hold your breath. Currently, 4K is plenty sharp for all screen sizes and viewing requirements if you’re chasing pixels.
Catherine Tramell has been covering technology as a freelance writer for over a decade. She has been writing for Pointer Clicker for over a year, further expanding her expertise as a tech columnist. Catherine likes spending time with her family and friends and her pastimes are reading books and news articles.