ethomaz said:
No it will never happen...
What the consoles do better than a PC is use their 1.8 TFLOPS more fully than the PC uses its 2.5 TFLOPS, for example... it is like the console uses 90% of its 1.8 TFLOPS while a PC uses 50% of its 2.5 TFLOPS... there is a lot of overhead on PC that is not present on consoles.
In raw power the GPU in a PC will always be more powerful, but the result is only noticeable with far more powerful GPUs like the GTX 680, HD 7970, HD 7990, etc...
Anyway, the HD 7990 is a dual-GPU card that uses CrossFire to work... CrossFire is even less efficient than SLI... so the unused raw power in this GPU is even bigger than in the HD 7970 (single-GPU).
You forget that while consoles can technically run the same games a high-end PC can, they do so at a massive cost to image quality and framerate.
You get reduced lighting, reduced particles, reduced textures, reduced resolution, reduced texture filtering, reduced shader effects, reduced anti-aliasing, reduced geometry... heck, even reduced map sizes, player counts, and AI enemy counts in some cases.
Seriously, try getting your PS3 to run any PS3 game at 7680x1440 and watch it drop to 1fps or lower. It can't keep up with my PC, and neither could the PS4. That's the point of the PC: to use the better hardware for much better visuals and framerates; otherwise there would be no PC gaming master race.
So yes, while a console can play the same games as a PC, graphically the difference is like comparing a Wii to a PS3: it can't compete, never has, and never will.
Also, those percentages are just plucked out of thin air, with zero factual backing.
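For what it's worth, even taking those invented figures at face value, a quick back-of-envelope check shows how thin the claimed gap is. To be clear, both the utilization percentages and the TFLOPS numbers below are the quoted poster's assumptions, not measurements:

```python
# Back-of-envelope effective throughput using the quoted (unsourced) figures.
# The 90%/50% utilization and the 1.8/2.5 TFLOPS numbers are assumptions
# from the post above, not measured values.

def effective_tflops(raw_tflops: float, utilization: float) -> float:
    """Raw throughput scaled by an assumed utilization fraction."""
    return raw_tflops * utilization

console = effective_tflops(1.8, 0.90)  # the claimed console case
pc      = effective_tflops(2.5, 0.50)  # the claimed PC case

print(f"console: {console:.2f} effective TFLOPS")  # 1.62
print(f"pc:      {pc:.2f} effective TFLOPS")       # 1.25

# Even granting those numbers, a higher-end PC GPU of the era
# (e.g. roughly 3.1 TFLOPS for a GTX 680) at the same pessimistic
# 50% utilization would land around 1.55 effective TFLOPS.
```

So even under the quoted poster's own numbers the console's lead is small, and it vanishes entirely the moment a stronger PC GPU enters the comparison.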