Otter said:
Nu-13 said:

1800p is the minimum. We don't see 720p PS4 games, right? Then we won't really see 1440p PS5 games (not including disgraces like arc).

I know I said I'd leave this thread, but it's too irksome to read this and not respond lol

You cannot arbitrarily compare 1440p for next-gen systems to 720p for current-gen. 720p is a blurry image by modern standards that hides lots of detail like textures and depth of field, which made a move to higher resolutions a necessity to actually exploit the hardware's potential. It's not adequate for most people's gaming setups unless they have a small 27" TV and sit quite far from their display. Even content on YouTube rarely sits at 720p; 1080p is the standard for what people experience and expect.
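For scale, the raw pixel counts make the gap obvious:

1280x720 ≈ 0.92M pixels
1920x1080 ≈ 2.07M pixels
2560x1440 ≈ 3.69M pixels
3840x2160 ≈ 8.29M pixels

So 1440p pushes exactly 4x the pixels of 720p and roughly 1.8x those of 1080p, before reconstruction adds anything on top.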

1440p with quality reconstruction is not a blurry image that hides lots of detail, and it is not below standards; in fact, it is a quality very few console gamers have ever experienced. It's a brilliant, clear image that makes a notable improvement on 1080p and has delivered some of (if not the) best-looking games of this console generation (Uncharted 4, Death Stranding, God of War, Horizon, and soon TLOU2 as well). Even 1080p (reconstructed to 4K) will be far more acceptable next generation than 720p is this generation, so I expect we will see it, especially in experimental ray tracing titles and in game settings where 60fps/RTX is offered as an option rather than the default.

There's an actual human experience behind these numbers, which is why 30-60fps has been the standard for 30 years; the numbers aren't just going up for the sake of it. Developers are not going to mandate 1800p because it's X% of 4K or X multiples of 1080p. They're going to look at the balance of visual characteristics in their game and aim for a combination that achieves their vision and is most impressive for the end user, which is also why dynamic resolution and VRS will permeate next gen (see the sketch below).
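To make the dynamic resolution point concrete, here's a minimal sketch of the kind of feedback loop engines use, assuming a simple damped proportional controller; the names (DynamicRes, targetFrameMs, etc.) are hypothetical, not any specific engine's API:

```cpp
// Hypothetical dynamic resolution heuristic: each frame, nudge the
// render scale so GPU frame time converges on the target budget.
#include <algorithm>

struct DynamicRes {
    double targetFrameMs = 16.6; // 60 fps GPU budget
    double scale = 1.0;          // per-axis fraction of native resolution
    double minScale = 0.6;       // floor, e.g. ~1296p on a 2160p output
    double maxScale = 1.0;

    // Call once per frame with the measured GPU time.
    void update(double gpuFrameMs) {
        double headroom = targetFrameMs / gpuFrameMs; // >1 = under budget
        // Damped proportional step so the scale doesn't oscillate.
        scale = std::clamp(scale * (1.0 + 0.25 * (headroom - 1.0)),
                           minScale, maxScale);
    }

    int renderWidth(int nativeW) const  { return static_cast<int>(nativeW * scale); }
    int renderHeight(int nativeH) const { return static_cast<int>(nativeH * scale); }
};
```

Because the scale applies per axis, a 0.75 scale already cuts the pixel count to about 56%, which is why dynamic resolution recovers frame time so quickly in heavy scenes.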

Again, I would really love to see someone who has played a game like God of War at 1440p on a 50" 4K display complain that the image quality is bad and that gamers will not be happy with it over the coming years.

Just because a game is 1440p doesn't mean everything in said game is rendered at 1440p.
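As a purely illustrative example of that point, assuming a typical mixed-resolution setup (the buffer choices below are hypothetical, not taken from any particular game):

```cpp
// Hypothetical mixed-resolution render target layout for a "1440p" game:
// the headline number only describes the main color buffer.
struct RenderTargets {
    int sceneW = 2560, sceneH = 1440;        // main geometry pass at 1440p
    int fxW = sceneW / 2, fxH = sceneH / 2;  // half-res particles/transparency
                                             //   (a quarter of the pixel cost)
    int aoW = sceneW / 2, aoH = sceneH / 2;  // ambient occlusion, often half-res
    int uiW = 3840, uiH = 2160;              // UI composited at full output res
};
```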

Bofferbrauer2 said:

1: Until we know more about RDNA2 and Ampere, I'd hold off on that case. AMD could be closing the gap with a strong leap, Ampere might not turn out as great as expected, and so on...

Either way, there will definitely be a sizable uptick in performance; the increased transistor budget gives nVidia some room to move on that front and to account for any stuff-ups.

shikamaru317 said:

2. Nvidia's 20 series will be 2 years old when next-gen consoles launch. There is a good chance that AMD's RDNA 2 ray tracing will be more efficient than Nvidia's 20 series raytracing, even if it is less efficient than Nvidia's upcoming 30 series. And like I said before, developers may make ray tracing an optional feature in many next-gen games, giving gamers a choice between ray tracing and higher resolution and/or framerate. 

There is a chance that AMD's RDNA 2 will be less efficient too, and less capable on the ray tracing front; let's not count our chickens just yet... nVidia has had the technology edge for years now.

shikamaru317 said:

3. PC games don't receive the same level of optimization that console games do, and console hardware has less OS overhead than PC's. So comparing Control on PC to next-gen console games and saying that you won't see a graphical improvement if they aim for native 4K is a bad comparison.  

Do consoles receive extra optimization? Sure.
But they aren't receiving a level of optimization that leaves a Radeon 7850 unable to run the same games at PlayStation 4/Xbox One levels of visuals, for the most part.

And OS overhead is actually *less* on PC; Windows 10 doesn't reserve two CPU cores and 3GB of RAM the way the console operating systems do.




www.youtube.com/@Pemalite