
Overhead/waste on PCs in gaming graphics is nowhere near what it was in the bad old days. There are really two distinct areas to think about:

CPU: Here a PC usually has quite a bit more going on. It would get boring to walk through every detail, but suffice it to say that unless you're running a bare-bones setup with absolutely nothing in the background, a console OS will have less happening that demands CPU attention. Result: on consoles you can generally get away with a good bit less CPU power, though sadly that makes 60fps titles hard to hit unless you really cut down on API overhead and make good use of GPGPU (the quick frame-budget sketch below shows why the margins are so thin).
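To put rough numbers on the 60fps point, here's a back-of-the-envelope sketch. The background-load figures are assumptions I picked purely for illustration, not measurements of any real OS:

```cpp
// Back-of-the-envelope frame-budget math for the 60fps point above.
// The background-load figures are illustrative assumptions, not measurements.
#include <cstdio>

int main() {
    const double frame_budget_ms = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 fps

    const double console_os_ms    = 1.0;  // assumed: lean console OS reservation
    const double pc_background_ms = 3.0;  // assumed: desktop OS + background services

    printf("Total frame budget at 60 fps: %.1f ms\n", frame_budget_ms);
    printf("Left for the game on console: %.1f ms\n", frame_budget_ms - console_os_ms);
    printf("Left for the game on PC:      %.1f ms\n", frame_budget_ms - pc_background_ms);
    return 0;
}
```

The point is just that at 60fps every millisecond of background work comes straight out of a ~16.7 ms budget, so a leaner OS buys real headroom even on a weaker CPU.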

GPU: A vastly different story. Nvidia and AMD have made massive strides in driver and architecture efficiency. Back in DX9 and earlier, draw calls were a massive roadblock to performance, but multithreaded drivers are a thing now, and newer/better DX versions make the gap between console and PC tiny; DX12 and Vulkan eliminate any real difference at all (the rough draw-call sketch below shows the scale of it). This isn't a new 2016+ era thing either: there are TONS of games where an i3 and a 750 Ti (a dual-core CPU and a GPU roughly equal in power to the 7850-class GPU in the PS4's APU) meet or exceed console performance easily. See 720p/900p games like BF4 and SWBF that run at 1080p on a 750 Ti with console detail settings.
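To see why draw calls used to be such a roadblock, here's a toy model. The per-call costs are made-up illustrative numbers, not real driver measurements, but they capture the order-of-magnitude difference:

```cpp
// Toy model of per-draw-call CPU cost: why DX9-era APIs choked and DX12/Vulkan don't.
// The per-call costs are illustrative assumptions, not measured driver figures.
#include <cstdio>

int main() {
    const int draws_per_frame = 5000;     // a busy scene
    const double dx9_cost_us  = 30.0;     // assumed: single-threaded DX9-era driver path
    const double dx12_cost_us = 2.0;      // assumed: thin API + multithreaded recording

    const double dx9_ms  = draws_per_frame * dx9_cost_us  / 1000.0;
    const double dx12_ms = draws_per_frame * dx12_cost_us / 1000.0;

    // At 60 fps the whole frame budget is ~16.7 ms, so the DX9-era figure alone
    // blows the budget many times over, while the modern figure fits inside it.
    printf("DX9-era driver:  %.0f ms/frame on draw submission\n", dx9_ms);
    printf("DX12/Vulkan:     %.0f ms/frame on draw submission\n", dx12_ms);
    return 0;
}
```

This is why DX9-era games had to batch aggressively to stay under a few thousand draws, while low-overhead APIs let PCs submit console-like workloads without the CPU becoming the wall.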

TLDR: It's not 2009 anymore. Consoles aren't meaningfully more efficient than PCs at putting their GPUs to work.