Chrkeller said:
Soundwave said:
The PS5 is what it is. There's no magic to it; if there were, it would run Cyberpunk 2077 better than it does, unless you're saying they purposely gimped the port. 1440p/60 fps at medium-tier settings is the max it can do. Maybe you can squeeze out a touch more, but I seriously doubt the Cyberpunk 2077 developers left 30-40% of the performance on the floor. If the PS5 were able to run that game at 60 fps/4K, I'm sure they'd be happy to do it. It can't, because resolutions above 1440p are incredibly taxing on any GPU and eat up a disproportionate amount of power to make an already clean-looking image even sharper.
Just like a 2050 isn't a PS5 exactly, a PS5 certainly isn't a high end GPU either.
If we're going to bring a 2050 into this: sure, a 2050 runs anything a PS5 can with some dropped settings, but there's no game a PS5 can run right now that a 2050 wouldn't be able to run in a playable state, barring exclusivity nonsense. It's not like the PS5 is running anything at High/Ultra settings, and its ray tracing abilities are shit too, and that's going to be the standard for the next 4-5 years, like it or not. The PS5 Pro is never going to touch the sales of the base model.
|
Pretty sure everyone agrees. A 1070 runs 95% of the same games as a 4090. Running the same games is common; it isn't 1990 anymore. Running the same games doesn't equate to the same class of hardware.
|
Well, that's pushing it to an extreme, but a PS5 isn't much more powerful than a 2050 to begin with. It's maybe double the performance, and that doesn't mean jack shit today because resolution and frame rate can basically eat that difference up in a second, so those two pieces of hardware are the same class of hardware.
It's not like a PS5 is running Alan Wake II on Ultra with ray tracing, not even close.
In my day, simply flipping from low to medium settings was not a generational leap, so I'm not sure why I'm supposed to believe that marks a different class of hardware today.