Pemalite said:
Arkaign said:
Supposedly 900p (1600x900) was 'impossible' to tell from 1080 (1920x1080), so how is 3840x2160 checkerboard supposed to be distinguishable from 3840x2160 native in a living room setup unless you have a 100" TV?

Because Checkerboard isn't 3840x2160. It is two 1920x1080 frames.

Best to put it into pixel counts to give you a better idea.

900P is 1,440,000 pixels.
1080P is 2,073,600 pixels.
Checkerboard is 4,147,200 pixels.
4K is 8,294,400 pixels.

There is a difference between all of them; anyone who has seen them in the flesh can testify to that.
With that said, there are a few tricks to cover up such inadequacies, like upscaling, anti-aliasing, blur etc., which help blur the lines.
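If you want to sanity-check those figures yourself, here's a quick Python sketch; the checkerboard number is simply modelled as two 1080p frames' worth of shaded pixels per output frame, as described above:

# Quick sanity check of the pixel counts quoted above.
resolutions = {
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# Checkerboard modelled as two 1080p frames of shaded pixels per output frame.
print(f"Checkerboard 4K: {2 * 1920 * 1080:,} shaded pixels per frame")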

Arkaign said:

IMHO, 900p is a bit blurry but not horrendously so at typical viewing distances to a 60" TV. I find that indeed most people can't tell the difference when I change my HTPC gaming setup between 900/1080. 720 does look pretty awful though.


I disagree. 900P is horrible. 720P is disgusting. 1080P is dated. - I have been a 1440P gamer for years. Before that I was running Eyefinity 1080P panels for a total of 5760x1080.


Arkaign said:
I think we're in the realm of diminishing returns with resolution; I wish console resources were going to 60fps ultra 1080p instead. I'm not even going to replace my 2560x1440 144Hz G-Sync screen for most of my PC gaming until I can reasonably buy enough GPU power for ~100fps at 4K native.


Disagree.
I got to see a 5K panel in action a few months ago. All I could say was: wow.
Once I find a high refresh rate 4K (or better) non-TN panel monitor at a good price, I'm jumping on it like flies to poop.


Arkaign said:
I have a 10-bit 4K 42" display for the bedroom that I tried out briefly, but even with twin 1070s, the experience wasn't great. Dual GPU is not efficient enough to make it worthwhile, so I split them back up into HTPC + gaming PC once again (they replaced a 970 and an R9 390; I briefly replaced the 390 with a 480, but it was incredibly underwhelming, so I doubt I'll buy another AMD GPU anytime soon).


Your mistake was thinking the Radeon RX 480 was a replacement for the 390. That was never AMD's intention. The 390 is actually the faster card in a ton of games, especially when compared against the 4GB version of the RX 480.
That mistake lies with you, not AMD.

Arkaign said:
Going by that, what would be the ultimate 2D setup? Where moving beyond it wouldn't be noticeable?

64K resolution (61,440x34,560) on a 200" wall-flush curved OLED (or better) 12-bit color 240Hz display.
200FPS with maximum AA, fully accurate Vsync, and no perceivable delay (sub-2ms).
400TF GPU with 2TB dedicated memory (HBM3 or GDDR6).
Hybrid CPU: a quad-core at 8GHz paired with a 64-core at 4GHz (most games load 1 to 4 cores pretty heavily and scale poorly from there, but a ton of secondary cores could help with background OS/AI/MP/networking/etc).
2TB eight-channel 4GHz main system memory on a 1024-bit bus, for keeping the entire OS/game in memory at all times.

$299? :D LOL



Okay. So.
1) Eyes don't see the world in pixels. You can have vision that exceeds 20/20.

2) Eyes don't see the world in terms of "frames per second".
3) There is little sense in making a 64-core CPU. CPUs aren't meant for highly parallel tasks; that's the job of the GPU, so keep them serialised. - Also, the frequency they operate at isn't their performance.
4) You will not have system RAM on a 1024-bit bus. Like, ever. It requires far too many traces on the motherboard to be economically feasible.

Did you wake up today and feel like being a disagreeable person just for the sake of being argumentative and assumptive?

You make a bunch of replies based on your opinions (900p is horrible, 1080p is dated, etc). Your opinion is valid for you, but sitting way back at a buddy's house looking at a 60" TV, SW Battlefront doesn't look terrible by any means. Your own chart pretty much says this for a 60" TV @ 10'.
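For what it's worth, the usual back-of-the-envelope check is whether one pixel subtends less than roughly one arcminute (the nominal 20/20 acuity limit) at the viewing distance. A rough Python sketch, assuming a 60" 16:9 panel viewed from 10 feet, purely as an illustration:

import math

# Rough acuity check: angular size of one pixel versus the ~1 arcminute
# rule of thumb for 20/20 vision. 60" at 10 ft is just the scenario above.
diagonal_in = 60.0
distance_in = 10.0 * 12.0
width_in = diagonal_in * 16.0 / math.hypot(16.0, 9.0)  # ~52.3" wide

for name, horiz_px in [("900p", 1600), ("1080p", 1920), ("4K", 3840)]:
    pixel_pitch_in = width_in / horiz_px
    arcmin = math.degrees(math.atan(pixel_pitch_in / distance_in)) * 60.0
    print(f"{name}: ~{arcmin:.2f} arcmin per pixel")

Both 900p and 1080p come out at or below roughly one arcminute per pixel at that distance, which lines up with people struggling to tell them apart from the couch.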

On your 4K comments, derp, I've SEEN 4K, because I've done it with 1070s on a 43" 60Hz setup at close range. I simply preferred 1440p @ 100+ FPS on G-Sync; the smoothness mattered more to me than the resolution bump. You basically started an argument on this point only to agree with me (e.g., give me a high refresh 4K/5K whenever that happens, but for now 1440P/100+ >> 2160P/60, and ESPECIALLY 2160P/30). It was weirdly placed as well, because I was quite clearly referring to consoles pushing 4K, where we know to expect a lot of 30fps games.

Then you make assumptions about why I got the 480: I gifted my 390 to a friend whose card died, and I needed a drop-in for an HTPC that rarely sees games other than for guests; it was not an upgrade. Unfortunately the 480 was a whiny, coil-whine mess with basically zilch for OC headroom.

Then you raise some weirdly argumentative points about an insanely hypothetical spec list. Of course the human brain doesn't process visual data in 'frames per second', but higher FPS appears smoother; this is quite obvious. What's the point in really arguing about a hypothetical 1024-bit bus when speaking of a vaporware system with 2TB of memory?

It's sort of ironic, because this is far from the most off-the-wall thing on that spec list: http://www.kitguru.net/components/graphic-cards/anton-shilov/samsung-expects-graphics-cards-with-6144-bit-bus-48gb-of-hbm-memory-onboard/

^^ Yes, that is an article about GPU/HBM, not specifically system RAM, but as we've seen with consoles, the two are becoming merged in many cases. So if it makes you feel better, make my ludicrous hypothetical system have a single pool of 2TB HBM3 on a 1024-bit interface, lol.
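And just to put numbers on it: raw interface bandwidth is simply bus width times effective transfer rate. A rough sketch; the transfer rates below are illustrative ballpark figures, not anything taken from the article:

# Back-of-the-envelope bandwidth: bus width (bits) / 8 * transfer rate (GT/s) = GB/s.
def bandwidth_gb_s(bus_bits, gt_per_s):
    return bus_bits / 8 * gt_per_s

print(bandwidth_gb_s(256, 8.0))    # typical 256-bit GDDR5 card at 8 GT/s  -> 256 GB/s
print(bandwidth_gb_s(1024, 2.0))   # one 1024-bit HBM2-style stack at 2 GT/s -> 256 GB/s
print(bandwidth_gb_s(6144, 2.0))   # the article's 6144-bit figure at 2 GT/s -> 1536 GB/s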

And where exactly did I state that clock speed was the only thing that determines performance? Uarch, IPC, scaling - all of this has a ton to do with the end result. An 8GHz Intel Atom gen1 would be 8GHz trash. An 8GHz Jaguar would be better, but not ideal. An 8GHz Haswell would be pretty damned good. Would you have preferred that I state some particular basis for the core design on a hypothetical level? Fine: a custom 8GHz Ice Lake quad plus 64 cores on the big.LITTLE concept (64 on-package ARM-style cores to run the OS/etc in the background without notably interfering with primary gaming threads). Until GPGPU gets more mature, AI in particular, amongst other things, does better on the CPU.
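To spell that out: single-thread throughput is roughly IPC times clock, so the same 8GHz means very different things on different cores. Toy Python sketch; the IPC values here are invented purely to illustrate the point, not measured figures:

# Toy illustration: single-thread throughput ~ IPC x clock (GHz).
cores = {
    "Atom gen1 @ 8GHz": (0.4, 8.0),
    "Jaguar @ 8GHz":    (0.8, 8.0),
    "Haswell @ 8GHz":   (1.8, 8.0),
    "Haswell @ 4GHz":   (1.8, 4.0),
}
for name, (ipc, ghz) in cores.items():
    print(f"{name}: relative throughput ~{ipc * ghz:.1f}")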

I am well aware of core loading and how difficult it is to utilize multicore setups to the max; quite frequently one main core sees 90%+ usage while the others vary widely. Extremely talented devs with a great grasp of a particular setup can do more, as Naughty Dog did with TLOU on PS3 (there are some great making-of videos showing just how well they loaded the Cell's SPEs).

My whole post was meant to be a light-hearted observation and what-if kind of offhand thing, not an Asperger's episode. I've worked in IT for over a quarter century, and I wouldn't think to bore the crap out of everyone by going into excessive detail about something so utterly meaningless.