Xxain said:
Cerebralbore101 said:

Like having a 2070 Super vs. a 2080 Ti. At 1440p on Witcher 3 with high (but not max) settings, that's roughly 100 FPS for 10 TF and 135 FPS for 12 TF. At max settings, it would be something like 80 FPS at 1440p for 10 TF and 100 FPS at 1440p for 12 TF.

Source: LogicalIncrements.com

Use their chart and mouse over the GPU section to see frame rates for Witcher 3.

Another Source for the TFlops of the GPUs: https://www.engadget.com/2018/09/14/nvidia-rtx-mesh-shaders-dlss-tensor-cores/

But ultimately not a major performance difference? I'm tryna see what the fuss is about. I have a feeling this thread is just tech hoes flexing about something that is ultimately minor.

It's a huge improvement over the base model PS4 and XB1.

But yeah, it's not that big of a difference. One console could launch at $400 with 10 TF while another launches at $500 with 12 TF. When factoring in price-to-performance, both machines would be on equal footing in the graphics race. It would just be a repeat of PS4 Pro vs. XB1X.
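
To put rough numbers on that price-to-performance point, here's a quick back-of-the-envelope sketch in Python. The $400/10 TF and $500/12 TF figures are hypothetical launch specs, the frame rates are the ballpark Witcher 3 numbers quoted above, and FPS never scales perfectly linearly with TFLOPs:

```python
# Rough comparison of two hypothetical consoles. Prices, TFLOPs, and FPS
# are the ballpark numbers from this thread, not measured benchmarks.

consoles = {
    "10 TF @ $400": {"tflops": 10, "price": 400, "fps_high": 100, "fps_max": 80},
    "12 TF @ $500": {"tflops": 12, "price": 500, "fps_high": 135, "fps_max": 100},
}

# Compute per dollar: both machines land at roughly 24-25 GFLOPs/$.
for name, c in consoles.items():
    print(f"{name}: {c['tflops'] / c['price'] * 1000:.1f} GFLOPs per dollar")

# Raw compute gap vs. the quoted frame-rate gaps.
a = consoles["10 TF @ $400"]
b = consoles["12 TF @ $500"]
print(f"TF advantage:  {b['tflops'] / a['tflops'] - 1:.0%}")      # 20%
print(f"FPS gap, high: {b['fps_high'] / a['fps_high'] - 1:.0%}")  # 35%
print(f"FPS gap, max:  {b['fps_max'] / a['fps_max'] - 1:.0%}")    # 25%
```

So the 12 TF box would have a ~20% raw compute edge, but per dollar the two come out almost identical (25 vs. 24 GFLOPs per dollar), which is why they'd be on equal footing.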