A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish.
That seems in line with common knowledge? Say you want to keep your viewing angle at ~40° for a home cinema; at 2.5m of distance, that means your TV needs a horizontal width of ~180cm, which corresponds to a diagonal of roughly 80", give or take a few inches depending on the aspect ratio.
For a more conservative 30° viewing angle at the same distance, you’d need roughly a 60" TV. So 4K is perceivable at that distance regardless, and 8K is a waste of everyone’s time and money.
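For anyone who wants to check the geometry themselves, here’s a rough sketch of the trigonometry (the function name and the 16:9 assumption are mine, not from the study):

```python
import math

def diagonal_for_viewing_angle(distance_m, angle_deg, aspect=(16, 9)):
    """Screen diagonal (inches) that fills a given horizontal viewing angle
    at a given distance, assuming a flat 16:9 screen viewed head-on."""
    width_m = 2 * distance_m * math.tan(math.radians(angle_deg / 2))
    w, h = aspect
    diag_m = width_m * math.hypot(w, h) / w   # horizontal width -> diagonal
    return diag_m / 0.0254                    # metres -> inches

print(round(diagonal_for_viewing_angle(2.5, 40)))  # ~82" for a 40 degree angle
print(round(diagonal_for_viewing_angle(2.5, 30)))  # ~61" for a 30 degree angle
```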
That’s why I have a 65" and sit barely 2m from it. Stick on a 4k Dolby Vision encoded file through Jellyfin. Looks fucking great!
This study was brought to you by every streaming service.
Sure but, hear me out, imagine having most of your project sourcecode on the screen at the same time without having to line-wrap.
I’ve been using “cheap” 43" 4K TVs as my main monitor for over a decade now. I used to go purely with Hisense; they have great colour and PC text clarity, and I could get them most places for $250 CAD. But with this year’s model they switched from an RGB subpixel layout to BGR, which is tricky to get working cleanly on a computer, even when forcing a BGR layout in the OS. One trick is to just flip the TV upside down (yes, it actually works), but it made the whole physical setup awkward. I went with a Sony recently for significantly more, but the picture quality is fantastic.
And then there’s the dev that still insists on limiting lines to 80 chars & you have all that blank space to the side & have to scroll forever per file, sigh….
80 is a tad short these days, but that’s still kind of win/win since now you can have way more files all showing side-by-side.
Split screen yo
Really depends on the size of the screen, the viewing distance, and your age/eye condition. For most people 720p or 1080p is just fine. With 4K you will get somewhat better detail on the fabric of clothes and in environments, but not a huge difference.
8k is gonna be a huge waste and will fail.
This finding is becoming less important by the year. It’s been quite a while since you could easily buy an HD TV - they’re all 4K, even the small ones.
And then all your old media looks like shit due to upscaling. Progress!
4k is way better than 1080p, it’s not even a question. You can see that shit from a mile away. 8k is only better if your TV is comically large.
I can immediately tell when a game is running at 1080p on my 2K monitor (yeah, I’m not interested in 4K over higher refresh rate, so I’m picking the middle ground.)
It’s blatantly obvious when everything suddenly looks muddy and washed together.
As someone who has a 4k monitor, 1440p is a great middle ground for gaming
I think that’s relevant to the discussion though. Most people sit like two feet from their gaming monitor and lean forward in their chair to make the character go faster.
But most people put a big TV on the other side of a boring white room, with a bare white ikea coffee table in between you and it, and I bet it doesn’t matter as much.
I bet the closest people ever are to their TV is when they’re at the store buying it…
I think you overestimate the quality of many humans’ eyes. Many people walk around with slightly bad vision no problem. Many older folks have bad vision even corrected. I cannot distinguish between 1080 and 4k in the majority of circumstances. Stick me in front of a computer and I can notice, but tvs and computers are at wildly different distances.
And the size of most people’s TV versus how far away they are.
Yeah, a lot of people have massive TVs if they’re into sports, but most people have more reasonably sized TVs.
I used to have 20/10 vision, this 20/20 BS my cataract surgeon says I have now sucks.
That’s what you humans get for having eyes inherited from fish
Seriously. Eyes basically disprove intelligent design because they’re kinda shitty at what they do.
You’re just jealous we can breathe above water, cephalopod.
Oh look I’m a human I only have two arms and can’t even squirt ink at people. Oh ha ha I have all these stupid bones so if I’m locked in a prison I can’t get out, like a loser.
I’ve been saying this for years.
It depends on how far away you sit. But streaming has taken over everything and even a little compression ruins the perceived image quality of a higher-DPI display.
I have 65" 4K TV that runs in tandem with Beelink S12 pro mini-pc. I ran mini in FHD mode to ease up on resources and usually just watch streams/online content on it which is 99% 1080p@60. Unless compression is bad, I don’t feel much difference. In fact, my digitalized DVDs look good even in their native resolution.
For me 4K is a nice-to-have but not a necessity when consuming media. 1080p still looks crisp with enough bitrate.
I’d add that maybe this 4K-8K race is sort of like mp3@320kbps vs flac/wav. Both sound good when played on a decent system. But flac is only nicer on specific hardware that a typical consumer wouldn’t buy. Almost none of us own studio-grade 7.1 systems at home. A JBL speaker is what we have, and I doubt flac sounds noticeably better on it than mp3@192kbps.
Yeah, when I got my most recent GPU, my plan had been to also get a 4K monitor and step up from 1440p to 4K. But when I was sorting through the options to find the few with decent specs all around, I realized that there was nothing about 1440p that left me disappointed, and the 4K monitor I had used at work already indicated that I’d just be zooming the UI anyway.
Plus even with the new GPU, 4k numbers weren’t as good as 1440p numbers, and stutters/frame drops are still annoying… So I ended up just getting an ultra-wide 1440p monitor that was much easier to find good specs for and won’t bother with 4k for a monitor until maybe one day if it becomes the minimum, kinda like how analog displays have become much less available than digital displays, even if some people still prefer the old ones for some purposes. I won’t dig my heels in and refuse to move on to 4k, but I don’t see any value added over 1440p. Same goes for 8k TVs.
Interestingly enough, I was casually window browsing TVs and was surprised to find that LG killed off their OLED 8K TVs a couple years ago!
Until/if we get to a point where more people want/can fit 110in+ TVs into their living rooms - 8K will likely remain a niche for the wealthy to show off, more than anything.
An overly compressed 4K stream will look far worse than good-quality 1080p. We keep upping the resolution without adopting newer codecs or adjusting the bitrate.
I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I came away more confused.
For an ELI5 explanation, this is what happens when you lower the bit rate: https://youtu.be/QEzhxP-pdos
No matter the resolution of the video, if the amount of information per frame is so low that it has to lump differently coloured pixels together, it will look like crap.
On codecs and bitrate? It’s basically codec = file type (.avi, .mp4) and bitrate is how much data is sent per second for the video. Videos only track what changed between frames, so a video of a still image can be 4k with a really low bitrate, but if things are moving it’ll get really blurry with a low bitrate even in 4k.
“File types” like avi, mp4, etc are container formats. Codecs encode video streams that can be held in different container formats. Some container formats can only hold video streams encoded with specific codecs.
ah yeah I figured it wasn’t quite right, I just remember seeing the codec on the details and figured it was tied to it, thanks.
I’ll add another explanation for bitrate that I find understandable: you can think of resolution as basically the max quality of a display; no matter the bitrate, you can’t display more information/pixels than the screen possesses. Bitrate, on the other hand, represents how much information you are receiving from e.g. Netflix. If you didn’t use any compression, in HDR each pixel would require 30 bits, or 3.75 bytes of data. A 4K screen has about 8.3 million pixels, so an HDR stream running at 60 fps would require nearly 1.9GB/s of download without any compression. Bitrate is basically the measure of that: how much we’ve managed to compress that data flow. There are many ways to achieve this compression, and a lot of it relates to how individual codecs work, but put simply, one of the many methods effectively involves grouping pixels into larger blocks (e.g. 32x32 pixels) and saying they all have the same colour. As a result, at low bitrates you’ll start to see blocking and other visual artifacts that significantly degrade the viewing experience.
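To put a number on that back-of-the-envelope figure, here’s the arithmetic spelled out (same assumptions as above: 3840x2160, 10 bits per colour channel, 60 fps, no chroma subsampling or compression):

```python
# Uncompressed bandwidth for a 4K HDR stream: 3840x2160 pixels,
# 30 bits per pixel (10 bits each for R, G, B), 60 frames per second.
width, height = 3840, 2160
bits_per_pixel = 30
fps = 60

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 8 / 1e9:.2f} GB/s")  # ~1.87 GB/s uncompressed
print(f"{bits_per_second / 1e6:.0f} Mbit/s")    # ~14930 Mbit/s, vs. the ~15-25 Mbit/s a typical 4K stream actually gets
```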
As a side note, one cool thing that codecs do (not sure if literally all of them do it, but I think most by far) is that not every frame is encoded in its entirety. You have I, P and B frames. I-frames (also known as keyframes) are full frames: they’re fully defined and are basically like a complete picture. P-frames don’t define every pixel; instead they encode the difference from the previous frame, e.g. that the pixel at x: 210, y: 925 changed from red to orange. B-frames do the same, but they can reference both previous and future frames. That’s why you might sometimes notice in a stream that, even when the quality setting isn’t changing, every couple of seconds the picture becomes really clear, then gradually degrades, then suddenly jumps back up in quality.
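Here’s a toy sketch of the I-frame/P-frame idea (nothing like a real codec, just the bookkeeping: a “P-frame” below is a dict of {pixel index: new value} deltas applied on top of the previously decoded frame):

```python
# Toy I/P encoding: the first frame is stored in full (I), every later frame
# stores only the pixels that changed relative to the previous frame (P).
def encode(frames):
    encoded = [("I", list(frames[0]))]                # keyframe: full copy
    for prev, cur in zip(frames, frames[1:]):
        delta = {i: v for i, (p, v) in enumerate(zip(prev, cur)) if p != v}
        encoded.append(("P", delta))                  # changed pixels only
    return encoded

def decode(encoded):
    decoded = []
    for kind, data in encoded:
        if kind == "I":
            decoded.append(list(data))
        else:                                         # patch the previous frame
            frame = list(decoded[-1])
            for i, v in data.items():
                frame[i] = v
            decoded.append(frame)
    return decoded

frames = [[0, 0, 0, 0], [0, 9, 0, 0], [0, 9, 9, 0]]   # tiny 4-"pixel" video
assert decode(encode(frames)) == frames
```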
The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.
Suppose you’ve recorded something in 1080p (a lower resolution). You could upscale it to 4K, but then the extra pixels that can’t be computed from the data have to be made up (interpolated).
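As a crude illustration of what “making up pixels” means, here’s a nearest-neighbour 2x upscale (real scalers use far smarter interpolation, but the point is the same: the extra pixels are guesses, not recorded detail):

```python
import numpy as np

# Nearest-neighbour 2x upscale: each 1080p pixel is duplicated into a 2x2
# block, so the output has 4x the pixels but zero additional detail.
frame_1080p = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
print(frame_4k.shape)  # (2160, 3840)
```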
In summary, the TV in my living room might be more capable, but my streaming provider probably isn’t sending enough data to really use it.
This is true. That said, if you can’t tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you’re sitting too far away. In which case there’s no point in going with 4K.
At the right seating distance, there is a benefit to be had even by going with an 8K TV. However, very few people sit close enough/have a large enough screen to benefit from going any higher than 4K:

Source: https://www.rtings.com/tv/learn/what-is-the-resolution
4k with shit streaming bitrate is barely better than high bitrate 1080p
But full bitrate 4k from a Blu-ray IS better.
Full Blu-Ray quality 1080p sources will look significantly better than Netflix 4K.
Hence, “4K” doesn’t actually matter unless your panel is gigantic or you’re sitting very close to it. Resolution is a very small part of our perceived notion of quality.
ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.
They don’t need to; this study does it for them. 94 pixels per degree is the top end of what’s perceptible. On a 50" screen 10 feet away, 1080p = 93 PPD. Closer than 10 feet, or larger than 50", or some combination of both, and it’s better to have a higher resolution.
For millennials home ownership has crashed, but TVs are cheaper and cheaper. For the half of motherfuckers rocking a 70" TV that cost $600 in their shitty apartment, sitting 8 feet from the screen, it’s pretty obvious 4K is better at 109 vs 54 PPD.
Also although the article points out that there are other features that matter as much as resolution these aren’t uncorrelated factors. 1080p TVs of any size in 2025 are normally bargain basement garbage that suck on all fronts.
My 50" 4K TV was $250. That TV is now $200, nobody is flexing the resolution of their 4k TV, that’s just a regular cheap-ass TV now. When I got home and started using my new TV, right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It’s not people trying to defend their purchase, it’s people questioning the methodology of the study because the difference between 1080p and 4k is stark unless your TV is small or you’re far away from it. If you play video games, it’s especially obvious.
Old people with bad eyesight watching their 50" 12 feet away in their big ass living room vs young people with good eyesight 5 feet away from their 65-70" playing a game might have inherently differing opinions.
12’ 50" FHD = 112 PPD
5’ 70" FHD = 36 PPD
The study basically says that FHD is about as good as you can get 10 feet away on a 50" screen all other things being equal. That doesn’t seem that unreasonable
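For anyone who wants to reproduce those pixels-per-degree figures, here’s the arithmetic as a quick sketch (16:9 geometry assumed; the helper function and its name are mine):

```python
import math

def pixels_per_degree(diagonal_in, distance_ft, horizontal_px=1920, aspect=(16, 9)):
    """Horizontal pixel count divided by the horizontal angle (in degrees)
    the screen subtends at the given viewing distance."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    angle_deg = 2 * math.degrees(math.atan(width_in / 2 / (distance_ft * 12)))
    return horizontal_px / angle_deg

print(round(pixels_per_degree(50, 10)))                     # ~93  (1080p, 50", 10 ft)
print(round(pixels_per_degree(50, 12)))                     # ~112 (1080p, 50", 12 ft)
print(round(pixels_per_degree(70, 5)))                      # ~36  (1080p, 70", 5 ft)
print(round(pixels_per_degree(70, 8, horizontal_px=3840)))  # ~109 (4K, 70", 8 ft)
```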
Resolution doesn’t matter as much as pixel density.
Right? “Yeah, there is a scientific study about it, but what if I don’t read it and go by feelings instead? Then I’ll be right and won’t have to reexamine shit about my life, isn’t that convenient.”
deleted by creator
Please note at 18-24" with a 27" screen 4K does not max out what the eye can see according to this very study. EG all the assholes who told you that 4K monitors are a waste are confirmed blind assholes.
They are a waste of time since the things with enough fidelity to matter run like shit on them without a large investment. It’s just a money sink with little reward.
Are you talking about 8K or 4K? Not only can you game in 4K with a cheap card depending on the game, the desktop and everything else just looks nicer.
Either. 1440p is about the line I draw before the extra fidelity is no longer worth the performance hit.
Your own budget is by definition your business, but I can run some stuff in 4K on the desktop I bought in 2020 for $700. “Not worth it TO ME” requires no defense, but it’s pretty silly to call it a money sink with no reward when we’re talking about PC gaming, you know, where you game on a 24-32" screen a foot or two from your face. The study clearly says it’s not.
I have at one point made my living in hardware, and I would not advise running at 4K or higher without good reason. You being able to run at 4K does not in any way change the terrible value proposition of trading frames and latency for fidelity. I would not recommend a 4K or 8K monitor to anyone not wanting to go absolutely silly. Run a multiscreen setup at a lower resolution like a normal person. Don’t make your own preferences or sunk costs your position on tech in general.
Credentials like “made my living in hardware” are both non-specific and non-verifiable; they mean nothing. I have two 27" 4K 60Hz monitors because last-gen hardware just isn’t that expensive.
When not gaming this looks nicer than 2x FHD, and I run games in either 1080p or 4K depending on what settings need to be set to get a consistent 60 FPS. My hardware isn’t poverty level, nor is it expensive. An entry-level Mac would be more expensive.
Leaving gaming aside, isn’t it obvious to you that 4K looks nicer in desktop use, or are your eyes literally failing?
I have 2 college diplomas and worked 10 years in the industry at IBM alone. You’re not going to cow me or tell me I have no credentials; those accusations mean nothing. I don’t really get why you are so aggressively pushing this nonsense. Do you just love tech slop so much? Are you getting a kickback with every 4K monitor sold? Why, of all the hills to die on, is it this one?
And no, 4K desktops do not “look nicer”; everything is stupidly tiny for no reason. Unless you have like 250 shortcuts on your desktop, what is the point?
Subjective obviously.
Oh, there are more pixels, sure. But it’s not worth the money, and most (and a big most) applications want more frames, smoother movement and less input lag over more pixels. The push for 4K gaming has gone nowhere and it has been more than 10 years. You want to watch some 4K video? Sure! That is a use case, but just get a TV with the higher brightness, slower refresh rate and comparably tiny price tag. I can’t stop people from buying stupid crap, but I am judging them.
What about the vast majority of people who stare at screens for work?
Frame rates aren’t really important, it’s making things more readable in less space.
The cost / benefit is a completely different dynamic.
Oh, I said before that there are use cases. Most work monitors are 1080p since Excel doesn’t really benefit from 4K+. However, I have seen some graphic designers want the higher resolutions, for example.
The vast majority of working people will get pissed at you if you change their monitor to an ultra-high resolution without scaling it to look like 1080p (I have been the one getting yelled at). No one wants to squint to use their workstation.
There’s this thing called scaling that allows you to see things in an appropriate size but higher definition.
Anyone who uses spreadsheets regularly wants the extra real estate. Anyone who works with complex documents wants the extra real estate.
It’s not about more dots on your 24 inch, it’s about larger monitors that can display more stuff simultaneously. Instead of 4x 1080p monitors you can have 2x larger 4k monitors. Offer this to anyone who makes money by staring at a screen all day and they’ll tell you it’s worth it.
And yet, as I have stated, this is not the case for most users. I remember when a national bank here decided to do an “upgrade” to 4K monitors; there was so much pushback from users (in this case mortgage lending) that after installing the monitors I was back two weeks later to change them back.
People who use spreadsheets regularly (myself included) would rather have a second monitor, or a bigger one, than one 4K one. I have a 32 inch 1080p monitor as my secondary and it works great at a cheap price. I went with one that is brighter with a slower refresh rate, since I don’t need or want that on a secondary. And if you are going big, why spend the money on a 4K one if you are just going to use scaling anyway?