Trumpstyle said:
Okay, my point was that a native 60fps game should hover around 50ms, a 30fps game around 100ms, and a cloud service will add roughly another 50ms on top. If something else is happening, I either blame the testing method or the game devs.
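For what it's worth, those ballpark figures match a simple pipeline model. Here's a minimal sketch, assuming roughly three frames of pipeline (input sample, render, scan-out); the pipeline depth is an assumption for illustration, not a measured value.

```python
# Back-of-the-envelope latency from frame rate, assuming ~3 frames in flight.
# The pipeline depth is an assumption, not a measurement.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def expected_latency_ms(fps: float, frames_in_flight: float = 3.0) -> float:
    """Rough end-to-end latency: pipeline depth times frame time."""
    return frames_in_flight * frame_time_ms(fps)

print(f"60 fps: ~{expected_latency_ms(60):.0f} ms")   # ~50 ms
print(f"30 fps: ~{expected_latency_ms(30):.0f} ms")   # ~100 ms
```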
It largely depends on how it's being tested, which is why you need the same or a similar testing environment to make a proper comparison. Cloud won't automatically add around 50ms; it depends on a lot of factors. Sometimes it will add far more, sometimes less. As the technology improves, and people's internet connections along with it, cloud will get better. It's just a matter of when, and Nvidia so far seems to be leading the pack.
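As a rough illustration of why the cloud overhead isn't a fixed number, here's a sketch that just sums the extra stages a stream adds. All of the component values are placeholders I made up, not measurements; real numbers depend on the server location, codec, and the player's connection.

```python
# Extra latency a cloud stream adds on top of local latency.
# All values below are illustrative placeholders, not measurements.

def cloud_added_latency_ms(encode_ms: float, network_rtt_ms: float,
                           decode_ms: float, buffering_ms: float = 0.0) -> float:
    """Sum of the additional stages introduced by streaming."""
    return encode_ms + network_rtt_ms + decode_ms + buffering_ms

# Good case: nearby datacenter, fast hardware codec, wired connection.
print(cloud_added_latency_ms(encode_ms=5, network_rtt_ms=15, decode_ms=5))    # ~25 ms
# Worse case: distant server, Wi-Fi jitter absorbed by a larger buffer.
print(cloud_added_latency_ms(encode_ms=8, network_rtt_ms=45, decode_ms=8,
                             buffering_ms=15))                                # ~76 ms
```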
If you think something is off, you need to find another outlet that has done similar testing in a similar environment and shows that it's off. From what I heard in the video in your example, NXGamer is deducting screen latency from his numbers. DF isn't deducting screen latency from what I can tell, since they are using LDAT. Those are completely different testing methodologies, which is why you see different numbers. Maybe next time you should research this yourself before assuming something is off...
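To make that concrete, here's a toy example with made-up numbers (not DF's or NXGamer's actual results) showing how subtracting display latency changes the reported figure even when the underlying chain is identical.

```python
# Illustrative only: how deducting display latency shifts the reported number.
ldat_click_to_photon_ms = 85.0   # hypothetical end-to-end measurement
display_latency_ms = 20.0        # hypothetical screen processing + response time

system_only_ms = ldat_click_to_photon_ms - display_latency_ms
print(f"End-to-end (LDAT-style): {ldat_click_to_photon_ms:.0f} ms")
print(f"Display deducted (NXGamer-style): {system_only_ms:.0f} ms")
```

Same chain, but the reported numbers differ by exactly the display latency, which is why comparing the two outlets head to head is misleading.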