
Crowd is completely blown away by the 4K HDR graphics of the PS4 Pro

Random_Matt said:
I'm assuming upscaled 4K, not powerful enough for native 4K, for AAA games that is.

It's 2x 1080p, which is about 4 million pixels, something around 1600p. They're using an upscaling method that, according to Digital Foundry, makes it look almost as good as true 4K.
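A quick sketch of the arithmetic behind that claim (illustrative Python only; the "16:9 equivalent" height is just a square-root conversion added here, not a figure from Digital Foundry):

```python
import math

# Checkerboard on PS4 Pro is commonly described as two 1920x1080 frames'
# worth of samples per 4K output frame.
checkerboard_pixels = 2 * 1920 * 1080      # 4,147,200 samples
native_4k_pixels = 3840 * 2160             # 8,294,400 pixels

# The 16:9 resolution with the same pixel count (purely illustrative).
equiv_height = math.sqrt(checkerboard_pixels * 9 / 16)
equiv_width = equiv_height * 16 / 9

print(f"checkerboard samples per frame: {checkerboard_pixels:,}")
print(f"native 4K pixels per frame:     {native_4k_pixels:,}")
print(f"16:9 equivalent: ~{equiv_width:.0f} x {equiv_height:.0f} (about {equiv_height:.0f}p)")
```

Run as written, this lands in the roughly 1500-1600p ballpark, which is where the "around 1600p" shorthand comes from.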



Arkaign said:
Supposedly 900p (1600x900) was 'impossible' to tell from 1080 (1920x1080), so how is 3840x2160 checkerboard supposed to be distinguishable from 3840x2160 native in a living room setup unless you have a 100" TV?

Because Checkerboard isn't 3840x2160. It is two 1920x1080 frames.

Best to put it into pixel counts to give you a better idea.

900P is 1,440,000 pixels.
1080P is 2,073,600 pixels.
Checkerboard is 4,147,200 pixels.
4K is 8,294,400 pixels.

There is a difference between all of them; anyone who has seen them in the flesh can testify to that.
With that said, there are a few tricks to cover up such inadequacies, like upscaling, anti-aliasing, blur, etc., which help blur the lines.
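For anyone who wants to reproduce those figures, here is a minimal sketch that recomputes them and adds each one's ratio to 1080p (the ratios are an illustrative addition, not part of the original post):

```python
# Pixel counts from the post above, plus how each compares to 1080p.
resolutions = {
    "900p (1600x900)": 1600 * 900,
    "1080p (1920x1080)": 1920 * 1080,
    "checkerboard (2x 1080p)": 2 * 1920 * 1080,
    "4K (3840x2160)": 3840 * 2160,
}

baseline = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name:<25} {pixels:>9,} pixels ({pixels / baseline:.2f}x 1080p)")
```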

Arkaign said:

IMHO, 900p is a bit blurry but not horrendously so at typical viewing distances to a 60" TV. I find that indeed most people can't tell the difference when I change my HTPC gaming setup between 900/1080. 720 does look pretty awful though.


I disagree. 900P is horrible. 720P is disgusting. 1080P is dated. - I have been a 1440P gamer for years. Before that I was running Eyefinity 1080P panels for a total of 5760x1080.


Arkaign said:
I think we're in the realm of diminishing returns with resolution; I wish console resources were going to 60fps ultra 1080p instead. I'm not even going to replace my 2560x1440 144Hz G-Sync screen for most of my PC gaming until I can reasonably buy enough GPU power for ~100fps at 4K native.


Disagree.
I got to see a 5K panel in action a few months ago. All I could say was: Wow.
Once I find a high-refresh-rate 4K (or better) non-TN panel monitor at a good price, I'm jumping on it like flies to poop.


Arkaign said:
I have a 10-bit 4K 42" display for the bedroom that I tried out briefly, but even with twin 1070s, the experience wasn't great. Dual GPU is not efficient enough to make it worthwhile, so I split them back up to HTPC + Gaming PC once again (replaced a 970 and R390, I briefly replaced the 390 with a 480, but it was incredibly underwhelming, so I doubt I'll buy another AMD GPU anytime soon).


Your mistake was thinking the Radeon 480 was a replacement for the 390. That was never AMD's intention. The 390 is actually the faster card in a ton of games, especially when compared against the 4GB version of the Radeon 480.
That mistake lies with you, not AMD.

Arkaign said:
Going by that, what would be the ultimate 2D setup? Where moving beyond it wouldn't be noticeable?

64K resolution (61,440x34,560) on a 200" wall-flush curved OLED (or better) 12-bit color 240Hz display
200FPS with maximum AA, fully accurate Vsync, and no perceivable delay (sub-2ms)
400TF GPU with 2TB dedicated memory (HBM3 or GDDR6)
Hybrid quad-core 8GHz CPU with a 64-core 4GHz CPU (most games demand from 1 to 4 cores pretty heavily and scale poorly from there, but a ton of secondary cores could help with background OS/AI/MP/networking/etc.)
2TB eight-channel 4GHz main system memory on a 1024-bit bus, for keeping the entire OS/game in memory at all times

$299? :D LOL
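Just to put the display line of that (deliberately absurd) speclist in perspective, here is a back-of-the-envelope sketch of its raw, uncompressed video bandwidth versus plain 4K60. The bit depths (12-bit RGB for the dream display, 8-bit RGB for 4K60) and the lack of any compression are assumptions for illustration:

```python
# Raw video bandwidth = width * height * refresh * bits per pixel.
def raw_bandwidth_gbit_s(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

dream = raw_bandwidth_gbit_s(61_440, 34_560, 240, 36)  # "64K", 240Hz, 12-bit RGB
uhd60 = raw_bandwidth_gbit_s(3_840, 2_160, 60, 24)     # 4K60, 8-bit RGB

print(f'"64K" @ 240Hz: {dream:,.0f} Gbit/s (~{dream / 8000:.1f} TB/s)')
print(f"4K    @  60Hz: {uhd60:,.0f} Gbit/s")
print(f"ratio: roughly {dream / uhd60:,.0f}x")
```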



Okay. So.
1) Eyes don't see the world in pixels. You can have vision that exceeds 20/20.

2) Eyes don't see the world in terms of "frames per second".
3) There is little sense in making a 64-core CPU. CPUs aren't meant for highly parallel tasks; that's the job of the GPU, so keep them serialised. - Also, the frequency they operate at isn't a measure of their performance.
4) You will not have system RAM on a 1024-bit bus. Like, ever. It requires far too many traces on the motherboard to be economically feasible.



--::{PC Gaming Master Race}::--

Did you wake up today and feel like being a disagreeable person just for the sake of being argumentative and assumptive?

You make a bunch of replies based on your opinions (900p is horrible, 1080p is dated, etc.). Your opinion is valid for you, but sitting way back at a buddy's house looking at a 60" TV, SW Battlefront doesn't look terrible by any means. Your own chart pretty much says this for a 60" TV @ 10'.

On your 4K comments, derp, I've SEEN 4K because I've done it with 1070s on a 43" 60Hz setup at close range. I simply preferred 1440p @ 100+FPS on G-Sync; the smoothness was much better to me than the resolution bump. You basically started an argument on this point only to agree with me (e.g., give me high-refresh 4K/5K whenever that happens, but for now 1440p/100+ >> 2160p/60, and ESPECIALLY 2160p/30). It was weirdly placed as well, because I was quite clearly referring to consoles pushing 4K, when we know to expect a lot of 30fps games.

Then you make assumptions about why I got the 480. I gifted my 390 to a friend whose card died, and I needed a drop-in for an HTPC that rarely sees games other than for guests; it was not an upgrade. Unfortunately the 480 was a whiny, coil-whine mess that had basically zilch for OC.

Then you make some weird argumentative points about an insanely hypothetical speclist. Of course the human brain doesn't process visual data in 'frames per second', but higher FPS appears smoother; this is quite obvious. What's the point in really arguing about a hypothetical 1024-bit bus when speaking of a vaporware system with 2TB of memory?

It's sort of ironic, because this is far from the most off-the-wall thing on that speclist: http://www.kitguru.net/components/graphic-cards/anton-shilov/samsung-expects-graphics-cards-with-6144-bit-bus-48gb-of-hbm-memory-onboard/

^^ Yes, that is an article about GPU/HBM, not specifically system RAM, but as we've seen with consoles the two are becoming merged in many cases, so if it makes you feel better, make my ludicrous hypothetical system have a single pool of 2TB HBM3 on a 1024-bit interface, lol.

And where exactly did I state specifically that clock speed was the only thing that determined performance? Uarch, IPC, scaling: all of this has a ton to do with what the end result would be. An 8GHz gen-1 Intel Atom would be 8GHz trash. An 8GHz Jaguar would be better, but not ideal. An 8GHz Haswell would be pretty damned good. Would you have preferred that I state some particular basis for the core design on a hypothetical level? Fine: a custom 8GHz Ice Lake quad plus 64 cores on the big.LITTLE concept (64 on-package ARM-style cores to run the OS/etc. in the background without notably interfering with primary gaming threads). Until GPGPU gets more mature, AI in particular, amongst other things, does better on the CPU.

I am well aware of core loading and how difficult it is to utilize multicore setups to the max; quite frequently one main core sees 90%+ usage while the others vary widely. Extremely talented devs with a great grasp of a particular setup can do more, as Naughty Dog did with TLOU on PS3 (there are some great videos on the making of it showing just how well they loaded the Cell's SPEs).

My whole post was meant to be a light-hearted observation and what-if kind of offhand thing, not an Asperger's episode. I've worked in IT for over a quarter century, and I wouldn't think to bore the crap out of everyone by going into excessive detail about something so utterly meaningless.



Arkaign said:

Did you wake up today and feel like being a disagreeable person just for the sake of being argumentative and assumptive?


It is how I wake up every day. Due to walls in my premises... I can only get out of bed on the wrong side.


Arkaign said:

You make a bunch of replies based on your opinions (900p is horrible, 1080p is dated, etc.). Your opinion is valid for you, but sitting way back at a buddy's house looking at a 60" TV, SW Battlefront doesn't look terrible by any means. Your own chart pretty much says this for a 60" TV @ 10'.

Yet the chart doesn't tell the entire story. You still see benefits with increased resolutions; I can go in-depth and technical once again if you so desire.

Arkaign said:

On your 4K comments, derp, I've SEEN 4K because I've done it with 1070s on a 43" 60Hz setup at close range. I simply preferred 1440p @ 100+FPS on G-Sync; the smoothness was much better to me than the resolution bump. You basically started an argument on this point only to agree with me (e.g., give me high-refresh 4K/5K whenever that happens, but for now 1440p/100+ >> 2160p/60, and ESPECIALLY 2160p/30). It was weirdly placed as well, because I was quite clearly referring to consoles pushing 4K, when we know to expect a lot of 30fps games.

Far from it.
I didn't start an argument about that point; you probably misconstrued my intention. I merely added my opinion, take it as you will. You obviously took it as hostile, which is far from my original intention.

Arkaign said:

Then you make assumptions about why I got the 480. I gifted my 390 to a friend whose card died, and I needed a drop-in for an HTPC that rarely sees games other than for guests; it was not an upgrade. Unfortunately the 480 was a whiny, coil-whine mess that had basically zilch for OC.

No, I didn't make any assumptions about why you bought the Radeon 480; I couldn't care less why you bought it. You could use it as a door-stop or as some exotic type of freaky toilet paper.
I merely stated that you need to re-align your expectations with that card relative to AMD's prior cards; it's not a replacement for AMD's high-end offerings.

I am under the firm belief that there is no such thing as a bad card, only a bad price. - Coil whine is also not an AMD issue; that is a manufacturer issue, and nVidia cards can exhibit the same problem.

Arkaign said:

Then you make some weird argumentative points about an insanely hypothetical speclist. Of course the human brain doesn't process visual data in 'frames per second', but higher FPS appears smoother; this is quite obvious. What's the point in really arguing about a hypothetical 1024-bit bus when speaking of a vaporware system with 2TB of memory?

Because you brought it up?

I think you don't understand the point of a forum.
A forum is for people to share and express themselves in writing... But it also allows other people to reply to and break down those expressions.
If that is something you do not agree with, perhaps you might need to think about why you are here?
Anyway. No need for the attacks.

Arkaign said:
And where exactly did I state specifically that clock speed was the only thing that determined performance? Uarch, IPC, scaling: all of this has a ton to do with what the end result would be. An 8GHz gen-1 Intel Atom would be 8GHz trash. An 8GHz Jaguar would be better, but not ideal. An 8GHz Haswell would be pretty damned good. Would you have preferred that I state some particular basis for the core design on a hypothetical level? Fine: a custom 8GHz Ice Lake quad plus 64 cores on the big.LITTLE concept (64 on-package ARM-style cores to run the OS/etc. in the background without notably interfering with primary gaming threads). Until GPGPU gets more mature, AI in particular, amongst other things, does better on the CPU.

Glad we cleared that up.

Arkaign said:
I am well aware of core loading and how difficult it is to utilize multicore setups to the max; quite frequently one main core sees 90%+ usage while the others vary widely. Extremely talented devs with a great grasp of a particular setup can do more, as Naughty Dog did with TLOU on PS3 (there are some great videos on the making of it showing just how well they loaded the Cell's SPEs).

That's not the point.
You see, you can take CPU development in one of two directions.

You can spend transistors to make the CPU wider by throwing more cores at the problem... Or you can spend transistors to bolster the performance of fewer cores.
CPUs tend to work on highly serialised tasks that benefit from a few high-performing cores rather than lots of cores, so it makes sense to be conservative with your core counts (Intel could build a 100-core CPU if it wanted...) and focus on fewer, higher-performing cores.

Then you move the highly parallel stuff to a chip that is designed to handle that, like a GPU.
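A minimal illustration of why this is the usual advice: Amdahl's law (not something either poster cites, and the 40% parallel fraction below is an assumed figure) shows how quickly extra cores stop helping when much of a game's frame work is serial:

```python
# Amdahl's law: overall speedup from N cores when only a fraction p of
# the work can run in parallel. Illustrative sketch only.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

p = 0.40  # hypothetical: 40% of the frame work parallelises cleanly
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:>2} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
# Even at 64 cores the speedup is capped near 1 / 0.6 ~= 1.67x, which is
# why spending transistors on fewer, faster cores often pays off more.
```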

As for Naughty Dog, TLOU, and the PlayStation 3: I could go in depth on its technical underpinnings, but I'll just state that it looks as good as it does thanks to very smart artistic assets and smart use of those assets.

The PlayStation 3 is also not representative of the newer hardware we have today; the industry has gone through various paradigm shifts since the PS3's launch a decade ago. GPUs are far more powerful, far more flexible, and far more programmable, so there is no need for Cell-styled CPU architectures. (Cell wasn't really a good all-round CPU anyway.)

Arkaign said:
My whole post was meant to be a light-hearted observation and what-if kind of offhand thing, not an Asperger's episode. I've worked in IT for over a quarter century, and I wouldn't think to bore the crap out of everyone by going into excessive detail about something so utterly meaningless.

As a Carer... making fun of the disabled, ill, or frail is not on. Don't do it.

And you have obviously misconstrued my intention as hostile; it wasn't. Lighten up.



--::{PC Gaming Master Race}::--

So who's getting the Pro?



Ruler said:
So who's getting the Pro?

I may; I'll probably wait until there are better deals next year. I wish I had more time to mess with gaming, but I remember the most time I spent with my PS3 was with Gran Turismo 5. I have had an X1 and a vanilla PS4 for brief periods, but ended up giving them away because, with my limited gaming time, I just find myself on my PC.

I'm really happy about PSVR and hope it takes off; it could really help PC VR by proxy, giving devs more of an install base to make more ambitious titles. I also expect $200 headsets of pretty decent quality in short order (fall '17) thanks to the incredible downward price pressure out of China. None of the components are exotic or all that expensive now unless you're going for bleeding edge, and for something I might spend 30 minutes a month with, I couldn't justify Vive or Rift pricing.

-Pemalite, lol, no worries mate, I find myself a bit grumpy oftentimes. I got home, saw a reply, and was looking forward to it, and then found a bunch of assorted nitpicks that rubbed me the wrong way. I am the one who has to restrain himself from getting extremely pedantic, because I will go overboard with detail at the drop of a hat, and I work hard to mentally manage myself not to do so. This is the internet, chances are nobody cares anyway, so I keep it light on purpose. Cheers.



Ruler said:
So who's getting the Pro?

Depends on the features I get in 1080p.

I don't plan on getting a 4K HDR TV this year.



Ruler said:
So who's getting the Pro?

My brother-in-law will. I'll wait for Scorpio...



Ruler said:
So who's getting the Pro?

Eventually, I guess. Until I get a 4K TV it's pointless, though. Unless it turns out that I can set games up at 1080/60 instead of 4K/30, in which case I'll be getting it ASAP. But I don't think that's a thing.



Ruler said:
So who's getting the Pro?

Me.



Watch me stream games and hunt trophies on my Twitch channel!

Check out my Twitch Channel!:

www.twitch.tv/AzurenGames