I don’t want 8K. I want my current 4K streaming to have less pixelation. I want my sound to be less compressed. Make them closer to Ultra HD Blu-ray disc quality before forcing 8K down our throats… unless doing that gives us better 4K overall.
Bingo. If I were still collecting DVDs/HD DVDs like I was in the ’90s, it might be an issue. Streaming services and other online media routed through the TV can hardly buffer fast enough to keep up with playback at 720p, so what the fuck would I want with a TV that can show a higher-quality picture it also can’t display without stutter-buffering through the whole of a 1:30:00 movie?
> Streaming services and other online media routed through the TV can hardly buffer fast enough to keep up with playback at 720p
This is a problem with your internet/network, not the TV.
Yep, just imagine how bad the compression artefacts will be if they double the resolution but keep storage/network costs the same.
Doubling the dimensions makes it 4x the data.
Not if you only double it in one direction. Checkmate.
That’s not true for compressed video. Doubling both dimensions only roughly doubles the bitrate needed for the same quality on modern codecs (H.265, AV1, etc.).
Yeah, 4K means jack if it’s compressed to hell. If you end up with each pixel repeated 4x to save on storage and bandwidth, you’ve effectively just recreated 1080p without upscaling.
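For anyone who wants the raw numbers, here’s a quick back-of-envelope in Python (the pixel counts are exact; the ~2x bitrate scaling is just the codec rule of thumb mentioned above, not computed here):

```python
# Pixel-count arithmetic behind the thread: doubling both dimensions
# quadruples the raw pixel data. (The "roughly 2x bitrate for the same
# quality" figure for modern codecs is a rule of thumb, not computed here.)
resolutions = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

base_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base_pixels:.0f}x 1080p)")
```

So 8K is 16x the raw pixels of 1080p; if the streaming bitrate budget stays flat, the encoder has to smear that much harder.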
Just like internet service: I’d rather have guaranteed low latency than 5Gbps.
Probably because I don’t even care about 1080p TVs. They all look the same.
I’d buy one if it came with every David Attenborough (or similar) nature documentary included. I don’t need 8k for games or movies or anything else but I’ll watch the shit out of whatever high budget nature documentaries are produced and put my nose against the screen to see the critter details.
Put them into Microsoft Windows as mandatory, enabled-by-default content or something.
I can’t see ever owning an 8k display unless it’s like 200 inches… The ones that are available now are too expensive to be a justifiable purchase and there’s not really an abundance of content that takes advantage of the format.
LASIK actually made a huge difference in being able to appreciate the sharpness of 4K, but I doubt 8K is as big a leap.
I don’t know if it has changed, but when I started looking around to replace my set about 2 years ago, it was a nightmare of marketing “gotchas”.
Some TVs were advertising 240fps but only had 60fps panels with special tricks to double the frame rate twice, or something silly. Other TVs offered 120fps, but only on one HDMI port. More TVs wouldn’t work without internet. Even more had shoddy UIs that were confusing to navigate and did stuff like default to their own proprietary software showing Fox News on every boot (Samsung). I gave up when I found out that most of them had abysmal latency, since they all ran crappy software that messed with color values for no reason. So I just went and bought the cheapest TV at a bargain overstock store. Days of shopping time wasted, and a customer lost.
If I were shown something that advertised with 8K at that point, I’d have laughed and said it was obviously a marketing lie like everything else I encountered.
Asus makes their own version of a 4K OLED using an LG panel, with no shitty ‘smart’ software.
In that situation, Asus is the shitty part, though it is nice to see more TV-sized monitors. Fuck HDMI.
Did I miss something with Asus recently? I’ve only had good experiences with their hardware.
ASUS used to be the GOAT brand. They have since enshittified, and the biggest hit was their customer service; it’s 100% ass now. The products themselves are really hit or miss now too.
I’d consider you lucky. I’ve had many experiences with their hardware across different segments (phones, tablets, laptops, mainboards, NICs, displays, GPUs).
They’re an atrocious vendor with extremely poor customer support (and shitty software practices for UMA systems and motherboards).
I don’t think many people have been as unfortunate as I have with them, but the general consensus is that they mark their products up considerably relative to the competition (particularly mainboards & GPUs).
To be fair, their contemporaries aren’t much better.
Dang.
I switched to ASRock for my AMD build for specific feature sets and reading ASUS AM5 stuff it looks like that was a good idea.
But ASRock 800-series AM5 boards are killing Granite Ridge X3D CPUs en masse. Funnily enough, it happened to me.
I begrudgingly switched to Asus after my CPU was RMA’d as that was the only other vendor to offer ECC compat on a consumer platform.
How about 7800X3D?
I work off metered data. I’m happy with 360p.
There was a while when I exclusively used apps that let me lower the bitrate of the music I listened to. I’m not rocking crazy good headsets, and I really saw no reason to use up larger amounts of data listening to music over the sound of a lawnmower while walking around the yard for an hour. If I was going to leave music on without wifi, the higher bitrate just didn’t seem worth it.
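For a sense of how much data that actually saves, a rough calc in Python (the bitrates are illustrative values I picked, not any particular app’s tiers):

```python
# Data used by an hour of streamed audio at a constant bitrate.
# These bitrates are common illustrative values, not any specific app's tiers.
for kbps in (96, 160, 320):
    mb_per_hour = kbps * 1000 / 8 * 3600 / 1e6  # kilobits/s -> megabytes/hour
    print(f"{kbps} kbps for an hour ≈ {mb_per_hour:.0f} MB")
```

Roughly 43 MB vs 144 MB for the hour of mowing, which adds up fast on a metered plan.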
Also, if you have poor bandwidth in an area, it plays back better.
Yup - not a solution for everyone, but routers typically have Quality of Service (QoS) settings that will do something similar, where they target a certain bandwidth threshold.
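For the curious, the throttling idea behind that kind of QoS is basically a token bucket. Here’s a toy Python sketch of the concept (the class and the numbers are mine, for illustration; this is not anything a real router actually runs):

```python
import time

class TokenBucket:
    """Toy token-bucket shaper: traffic passes at a steady target rate,
    and a sender that exceeds it gets delayed (the QoS 'threshold' idea)."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def consume(self, nbytes: float) -> None:
        # Refill tokens for the time elapsed, capped at the burst size.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes > self.tokens:
            # Not enough budget: wait until enough tokens would have accrued.
            time.sleep((nbytes - self.tokens) / self.rate)
            self.tokens = 0.0
        else:
            self.tokens -= nbytes

# Shape a pretend stream to ~1 MB/s no matter how fast the source pushes.
bucket = TokenBucket(rate_bytes_per_s=1_000_000, burst_bytes=64_000)
for chunk in range(5):
    bucket.consume(256_000)  # "send" a 256 kB chunk
    print(f"chunk {chunk} released at {time.monotonic():.2f}")
```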
Maybe if they add 3D, people will buy them!
/s
Forget 3D, I want smellovision!
Well they say all new tech is driven by the porn industry, so, um…
Yes.
Man, I would. I am 100% the target demographic: jumped down the 3D TV rabbit hole and loved it. Totally knew it was a gimmick, but didn’t care. Would have friends over for 3D movie parties.
But adding them to my Plex server sucked. TAB (top-and-bottom) and SBS (side-by-side) files were half-assed, the PlayStation I used took sooooooo damn long to freaking start the movie, and skipping around was an issue.
Tbf, one of the use cases for high-pixel-density display technologies is VR headsets.
Yeah, very much looking forward to headsets with 8k panels. Most are up to 4k now, and it’s getting pretty good. If it stays at 4k for a bit, that would be fine. But it’s definitely an area where 8k will still be a very noticeable upgrade.
Even if the only short-term practical use for an 8k panel is how far away a 4k or 1080p screen would be clear to read in an augmented reality situation, that would be reason enough. But I personally will gladly lower quality settings to run VR games in 8k instead of 4k as well.
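The rough arithmetic on why headsets are different from TVs: the panel is stretched across a huge field of view, so pixels per degree stays well below what the eye can resolve. A quick sketch, with ballpark per-eye resolutions and FOV that I’m assuming rather than quoting from any spec:

```python
# Pixels per degree (PPD): headset pixels are spread across a huge field of
# view, so density per degree stays far below the ~60 PPD that 20/20 vision
# resolves. Per-eye resolutions and the 100-degree FOV are ballpark guesses.
headsets = {
    "4K-class headset (~2000 px/eye wide)": (2000, 100),
    "8K-class headset (~4000 px/eye wide)": (4000, 100),
}
for name, (px_wide, fov_deg) in headsets.items():
    print(f"{name}: ~{px_wide / fov_deg:.0f} PPD (eye resolves ~60 PPD)")
```

Even the 8K-class panel only gets you to ~40 PPD, so unlike TVs, the upgrade stays visible.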
Honestly, at 3m (10 feet) I can’t see the pixels at 1080p. My corrected vision is no longer 20/20.
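That tracks with the usual 1-arcminute rule of thumb for 20/20 acuity. A quick Python check, assuming a 55-inch 16:9 panel (the screen size is my guess; the comment doesn’t give one):

```python
import math

# Sanity-check against the ~1 arcminute rule of thumb for 20/20 acuity.
# The 55-inch 16:9 panel is an assumption; the comment doesn't give a size.
diagonal_m = 55 * 0.0254
width_m = diagonal_m * 16 / math.hypot(16, 9)
pixel_pitch_m = width_m / 1920  # 1080p horizontal resolution
distance_m = 3.0

arcmin = math.degrees(math.atan(pixel_pitch_m / distance_m)) * 60
print(f"One 1080p pixel subtends ~{arcmin:.2f} arcmin at {distance_m} m")
# ~0.73 arcmin: already below the ~1 arcmin threshold, so pixels blend
# together even for 20/20 vision, let alone weaker corrected vision.
```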
I mostly use my TV for gaming and watching old movies and anime.
The former task will be unviable at 8k and make my GPU cry, and the latter one makes 8k unnecessary.
I really don’t see the point in 8k displays right now.
Blu-ray standards haven’t caught up; that’s probably the answer for most who can afford it.
I’ll take an 8k computer monitor though. In fact, send two. Kthnx.
What’s the point? Even if you pay extra for “4K” streaming, it’s compressed to hell and the quality is no better than 1080p. What are you going to even watch on an 8K TV?
If memory serves, last year’s summer Olympics were televised (to some degree, anyways) at 8K, even if it was just a technological flex or test.
I can’t imagine the bitrate was high enough to make much difference in quality… But I don’t know what the technical details were.
Here’s something, though it seems to have a heavy “look at how great Intel is” spin to it: https://www.digitalcameraworld.com/news/the-olympics-is-being-streamed-in-8k-but-its-kind-of-a-secret
Nothing, with Blu-ray stagnant at 4K. Not that I care; my aging eyes are fine sticking with 4K forever.