Pemalite said:
Otter said:
I know I said I'd leave this thread, but it's too irksome to read this and not respond lol
You cannot arbitrarily compare 1440p for next-gen systems to 720p for current-gen. 720p is a blurry image by modern standards which hides lots of detail like textures and DOF, making a move to higher resolutions a necessity to actually exploit the hardware's potential. It's not adequate for most people's gaming setups unless they have a small 27" TV and sit quite far from their display. Even content on YouTube rarely sits at 720p; 1080p is the standard people experience and expect.
1440p with quality reconstruction is not a blurry image that hides lots of detail, and it is not below standards; in fact it is a quality very few console gamers have ever experienced. It's a brilliantly clear image that makes a notable improvement on 1080p and has delivered some of the best-looking games of the console generation, if not the best (Uncharted 4, Death Stranding, God of War, Horizon, soon TLOU2 as well). Even 1080p (reconstructed to 4K) next generation will be far more acceptable than 720p was this generation, so I expect we will see it, especially in experimental ray tracing titles and in game settings where 60fps/RTX is offered as an option but not the default. There's an actual human experience behind these numbers, which is why 30-60fps has been the standard for 30 years; the numbers aren't just going up for their own sake. Developers are not going to mandate 1800p because it's X% of 4K or X multiples of 1080p. They're going to look at the balance of visual characteristics in their game and aim for a combination that achieves their vision and is most impressive for the end user; this is also why dynamic resolution and VRS will permeate next gen.
Again, I would really love to see someone who has played a game like God of War at 1440p on a 50" 4K display complain that the image quality is bad and that gamers will not be happy with it over the coming years.
|
Just because a game is "1440p" doesn't mean everything in said game is rendered at 1440p.
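To make that concrete, here is a minimal Python sketch of what that can look like in practice: a frame loop that renders its main pass at a dynamic fraction of 1440p (the dynamic resolution Otter mentions above) while keeping an effects buffer at half resolution. All of the names, thresholds, and frame timings are illustrative assumptions, not taken from any real engine.

```python
# Hypothetical sketch: a "1440p" game where the main pass scales
# with GPU load and effects render at half the main resolution.

TARGET_FRAME_MS = 16.7           # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0  # clamp on the dynamic scale
BASE_W, BASE_H = 2560, 1440      # nominal "1440p" output

def next_render_scale(scale, last_frame_ms):
    """Nudge the render scale up or down based on the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS:           # over budget: render fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.9:   # headroom: claw quality back
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

def frame_targets(scale):
    """Resolutions for this frame: main pass scaled, effects at half of that."""
    main = (int(BASE_W * scale), int(BASE_H * scale))
    effects = (main[0] // 2, main[1] // 2)        # a quarter of the pixel count
    return main, effects

# Simulated per-frame GPU costs (ms) standing in for real timings.
scale = 1.0
for frame_ms in [15.0, 18.2, 19.0, 16.5, 14.1, 13.9]:
    scale = next_render_scale(scale, frame_ms)
    main, effects = frame_targets(scale)
    print(f"scale={scale:.2f} main={main} effects={effects}")
```

The point of the sketch is that the "1440p" on the box describes the nominal output target, not every buffer in the pipeline; particles, transparency, and similar passes often run at a fraction of it.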
Bofferbrauer2 said:
1. Until we know more about RDNA 2 and Ampere, I'd hold off on that case. AMD could close the gap with a strong leap, Ampere could turn out less impressive than expected, and so on...
|
There will definitely be a sizable uptick in performance either way; the increased transistor budget gives nVidia some room to move on that front and to account for any stuff-ups.
shikamaru317 said:
2. Nvidia's 20 series will be 2 years old when next-gen consoles launch. There is a good chance that AMD's RDNA 2 ray tracing will be more efficient than Nvidia's 20 series raytracing, even if it is less efficient than Nvidia's upcoming 30 series. And like I said before, developers may make ray tracing an optional feature in many next-gen games, giving gamers a choice between ray tracing and higher resolution and/or framerate.
|
There is a chance that AMD's RDNA 2 will be less efficient and less capable on the ray tracing front too; let's not count our chickens just yet... nVidia has had the technology edge for years now.
shikamaru317 said:
3. PC games don't receive the same level of optimization that console games do, and console hardware has less OS overhead than PCs. So comparing Control on PC to next-gen console games and saying that you won't see a graphical improvement if they aim for native 4K is a bad comparison.
|
Do consoles receive extra optimization? Sure. But they aren't receiving a level of optimization that stops a Radeon HD 7850 from running console games with PlayStation 4/Xbox One levels of visuals, for the most part.
And OS overhead is actually *less* on PC: Windows 10 doesn't reserve 2 CPU cores and 3GB of RAM the way the consoles do.
|
Guys, so much talk about resolution is pointless. Anything at 1080p is already fantastic. I have a 1080p 120-inch projector, and I've played the same consoles and the same games on my 4K HDR 50-inch Panasonic; I've played in full 4K on my Xbox One X, and I still prefer it in 1080p on the projector. (The reason I bought both is that the projector is in my house back in Portugal, while the TV is with me in the UK, where I work. I'd love a projector here too, but I have no space.)
Movement seems more natural and the overall image looks more realistic, not as fake as on LCD TVs. The only good thing about high resolution is the supersampling, which is why my Xbox One X looks better when it renders at 4K for a 1080p screen: the image is clean, without a single jaggie even at 120 inches. In fact, when I build my next PC I will look for a 1080p monitor to get the benefit of supersampling; I prefer anti-aliasing over resolution any day of the week. Even N64 games look fantastic with AA on.
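For what it's worth, the downsampling effect being described here is easy to demonstrate. Below is a minimal Python sketch of why a 4K render shown on a 1080p display smooths jaggies: each output pixel is the box-filtered average of a 2x2 block of rendered pixels (4K to 1080p is exactly 2x in each dimension). The toy 8x8 image and the `downsample_2x` helper are illustrative assumptions, not from any real engine.

```python
import numpy as np

def downsample_2x(img):
    """Average non-overlapping 2x2 blocks, e.g. 3840x2160 -> 1920x1080."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy 8x8 stand-in for a 4K frame: a hard diagonal edge, black vs. white.
hi = np.fromfunction(lambda y, x: (x > y).astype(float), (8, 8))
lo = downsample_2x(hi)
print(lo)  # edge pixels come out as 0.25/0.5/0.75 instead of hard 0/1 steps
```

Those fractional edge values are exactly the intermediate shades that read as a smooth line instead of a staircase; that is all ordered-grid supersampling is.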
On the other hand, 720p is unacceptable; it looks way blurrier, and it's very noticeable.
Forget resolution; 1440p is more than anyone really needs at any screen size.
Start worrying about 60fps as a minimum on all games, with at least 4x AA.
1080p is more than sharp enough; the bigger differences come from AA and many other effects. I've also played games at 1440p on PC that look horrible even with 16x AA.
Image quality is about way more than resolution. Even HDR is just marketing, because contrast looks better on my projector than on my 4K HDR Panasonic. In fact, everyone who comes over is way more impressed by the image size and how natural and realistic it looks; no one has praised my 4K TV. It's barely better than a good 1080p TV and has no wow factor.
If you want quality, get good-quality equipment and great speakers. A cheap Chinese-brand 4K HDR set will look miles worse than a top-of-the-range 1080p TV.
Last edited by victor83fernandes - on 13 March 2020