Bofferbrauer2 said:
curl-6 said:

Is that a 5-year-old desktop GPU though? Cos if so, even a card from 2013 is likely going to have more grunt than a mobile GPU from 2015. Furthermore, we can see here that the aesthetic of the game is basically completely broken at these settings. It's more likely the Switch port will try to maintain the core look as much as possible, which means it won't have the luxury of stripping things back this way. The footage shown in the Direct suggests they're taking an approach closer to Doom 2016 and Wolfenstein II, where as much of the rendering pipeline as possible is kept intact and the tradeoffs are made in things like resolution, alpha, etc.

You could just have checked the video; then you'd know they used a 750 Ti coupled with a Core 2 Duo. That combo barely compares to the Xbox One (slightly more GPU power in theory, but it gets held back so much by the dual-core CPU and its FSB that the Xbox should almost be able to run circles around the rig). In effect it's about what the Switch should reach.

The 750 Ti is a PS4-level GPU. Maxwell is much more efficient per streaming multiprocessor than the GCN architecture in the XB1/PS4, and CPUs don't evolve as fast as GPUs. The Core 2 Duo quite easily outperforms the Jaguar CPUs:

https://www.game-debate.com/cpu/index.php?pid=1961&pid2=15&compare=%20APU%20A6-5200M%20Quad-Core-vs-Intel%20Core%202%20Duo%20E8400%203.0GHz

Granted, the PS4/XB1 can use 6 or 7 cores instead of 4, but at a lower clock than desktop Jaguars, and even then the difference would still be in favor of the Core 2 Duo. If it were a console, the Core 2 Duo + GTX 750 Ti would be close to the PS4 graphically, and much more balanced than the Wii U (the console with the largest CPU-GPU gap I can think of).
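To make the clocks-versus-cores tradeoff concrete, here's a rough back-of-envelope sketch. The clock speeds and core counts are real spec figures (E8400 at 3.0 GHz, console Jaguar around 1.6 GHz with 6-7 cores available to games), but the ~1.5x per-core IPC advantage for the Core 2 is a loose illustrative assumption, not a benchmark result:

```python
# Crude proxies for CPU performance. IPC ratio is an assumption
# (Core 2's wide out-of-order core vs Jaguar's small low-power core),
# NOT a measured value.

def per_core(clock_ghz, relative_ipc):
    # Proxy for single-thread performance: clock * relative IPC.
    return clock_ghz * relative_ipc

def aggregate(cores, clock_ghz, relative_ipc):
    # Proxy for fully-threaded throughput: cores * per-core score.
    return cores * per_core(clock_ghz, relative_ipc)

# Core 2 Duo E8400: 2 cores @ 3.0 GHz, assumed ~1.5x Jaguar's per-core IPC.
# Console Jaguar: ~1.6 GHz, 7 cores usable by games, IPC baseline of 1.0.
print(f"Per-core: Core 2 {per_core(3.0, 1.5):.1f} vs Jaguar {per_core(1.6, 1.0):.1f}")
print(f"Aggregate: Core 2 Duo {aggregate(2, 3.0, 1.5):.1f} vs 7-core Jaguar {aggregate(7, 1.6, 1.0):.1f}")
```

Under these assumptions the per-core score heavily favors the Core 2, while the aggregate scores end up close; which one matters more depends on how well a given game spreads its work across threads.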