sc94597 said:
HoloDust said:

~40-45fps @4K native, I reckon, from seeing how the 5090 performs?

At 1800p (~56% of 4k) over an Oculink eGPU (Ryzen HX 370 as the mobile processor), I am able to cap it at 60fps with DLAA. Given that this is on a 13.3-inch display, the image is very clean. I also tried it on my 5120 x 1440 UW display (~90% of 4k) and got a variable 65-90 fps with DLSS Quality. I can almost cap it at 60fps without DLSS, but there are too many drops to 50fps for it to be a smooth experience. Plus, even DLSS Quality looks better than the native AA or no AA. If I enable DLAA I can almost get a stable 50fps though, which isn't that bad of a frame rate for a game like this, and this monitor supports G-Sync.

But yeah, native 4k 60fps is likely not going to happen on the 4090 at max settings for this one. Even if I popped the 4090 into a full desktop build, without the slight (~5%) performance penalty of Oculink, I think this would still be true.
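For context, the resolution percentages quoted here are just pixel-count ratios against native 4K. A minimal sketch of that math in Python (the pixel_ratio helper is purely illustrative, using only resolutions mentioned in the post):

```python
# Minimal sketch of the pixel-count math behind the percentages above.
# Nothing here is measured data, just (width * height) ratios against
# a native 4K (3840 x 2160) reference.

def pixel_ratio(width: int, height: int, ref=(3840, 2160)) -> float:
    """Fraction of the reference resolution's total pixel count."""
    return (width * height) / (ref[0] * ref[1])

# 5120 x 1440 ultrawide vs 3840 x 2160: ~89% of 4K's pixels,
# which lines up with the ~90% figure quoted above.
print(f"5120 x 1440 vs 4K: {pixel_ratio(5120, 1440):.0%}")
```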

I remember playing Oblivion on a GeForce 8600GT back in the day at 20-30 fps, at like 1024 x 768, if I recall correctly. So this is definitely a wonderful experience compared to how I first played the game.

Yeah, I guess that's what UE5 over Gamebryo does to even the most powerful of GPUs.

That said, a 3050 laptop (so 2048 shaders at 1950+ MHz):

Atrocious "optimization" - I looked up Atomfall on a 3050; it runs at 60fps@1080p on high settings, 100% scale... of course, it doesn't run on UE5, but on an in-house engine.

UE5 is the bane of the modern VG industry.

Anyway, back to Switch 2 - I went down the rabbit hole, looked up some 10-year-old benchmarks, and did some cross-referencing: Switch 2 vs the base PS4 should be around 1.6x. The initial math on where SW2 falls relative to the 2050/3050 mobile still stands (some 5-6% below a 1050 Ti), but those old benchmarks pitting the 1050 Ti against the R7 265 give a clearer picture of where it sits compared to the PS4, as well as against the PS4 Pro (around 1.6x SW2) - so I guess it lands right in the middle between the base PS4 and the PS4 Pro.
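To make that chain of estimates concrete, here is a rough sketch of the arithmetic; every multiplier is a conjecture from the paragraph above (the 0.945 factor is just the "5-6% below a 1050 Ti" figure), not a measured benchmark result:

```python
# Rough sketch of the cross-referencing arithmetic above. All multipliers
# are the post's own estimates (or implied by them), not measured data.

ps4 = 1.0                # baseline: PS4-class GPU (R7 265 as the stand-in)
sw2 = 1.6 * ps4          # estimate: Switch 2 ~1.6x the base PS4
ps4_pro = 1.6 * sw2      # estimate: PS4 Pro ~1.6x Switch 2, i.e. ~2.56x PS4

# Switch 2 sitting ~5-6% below a GTX 1050 Ti implies the 1050 Ti lands
# at roughly 1.7x a PS4-class GPU in those old benchmarks.
gtx_1050_ti = sw2 / 0.945

print(f"Switch 2    ~ {sw2:.2f}x PS4")
print(f"PS4 Pro     ~ {ps4_pro:.2f}x PS4")
print(f"GTX 1050 Ti ~ {gtx_1050_ti:.2f}x PS4 (implied)")
```

Worth noting that 1.6x is exactly the geometric midpoint between 1x (base PS4) and ~2.56x (PS4 Pro under these estimates), which is what "right in the middle" amounts to here.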

Of course, these are just conjectures based on equivalent PC GPUs and cross-referencing various benchmarks across the years, sometimes with a fairly limited selection.