
Pemalite has it nailed. People need to stop thinking that 'console = magically amazing use of resources' like it was back when companies had to learn how to squeeze incredible tricks out of bizarre architectures.

x86 + GCN is pretty well understood and mature at this point, and Windows gaming has gotten extremely efficient at using the hardware. This is one reason an i3 + 750 Ti can match or exceed PS4 performance in most cases. Even the most impressive console games (UC4 comes to mind) are impressive primarily because of the budget and incredible talent at work. The art direction, the motion capture, the genuine effort put into minute background details all add up, and the result is awesome. But in pure performance terms, UC4 at 1080p/30 with high textures, decent AA, etc. looks like something that would run pretty close to that on a similarly specced PC.

Hence, Neo/Scorpio aren't going to magically do 4K with ease. Native 4K is four times the pixels of 1080p, while ~6 TF is only a bit over three times the base PS4's ~1.84 TF. 6 TF of AMD GPU is pretty good, but not ideal for 4K gaming without cutting some corners.
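Some rough back-of-the-envelope math makes the point. Just a sketch in Python; the ~1.84 TF PS4 and ~6 TF Neo/Scorpio numbers are the commonly quoted specs, and FLOPS per pixel is obviously a crude proxy for real rendering cost:

```python
# Back-of-the-envelope: per-pixel compute budget, PS4 at 1080p vs a ~6 TF console at 4K.
PS4_TFLOPS = 1.84
NEO_SCORPIO_TFLOPS = 6.0

PIXELS_1080P = 1920 * 1080   # ~2.07 million
PIXELS_4K = 3840 * 2160      # ~8.29 million

pixel_ratio = PIXELS_4K / PIXELS_1080P            # 4.0x the pixels
compute_ratio = NEO_SCORPIO_TFLOPS / PS4_TFLOPS   # ~3.26x the raw compute
print(f"4K has {pixel_ratio:.1f}x the pixels of 1080p")
print(f"6 TF is only {compute_ratio:.2f}x the PS4's 1.84 TF")

# Compute budget per pixel per frame at 30 fps:
flops_per_pixel_ps4 = PS4_TFLOPS * 1e12 / (PIXELS_1080P * 30)
flops_per_pixel_neo = NEO_SCORPIO_TFLOPS * 1e12 / (PIXELS_4K * 30)
print(f"PS4 @ 1080p/30: ~{flops_per_pixel_ps4:,.0f} FLOPS per pixel per frame")
print(f"6 TF @ 4K/30:   ~{flops_per_pixel_neo:,.0f} FLOPS per pixel per frame")
# A ~6 TF box actually has slightly *less* per-pixel headroom at native 4K/30
# than the base PS4 has at 1080p/30 -- hence "not without cutting corners".
```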

Now, turn off AA, scale back the lighting, and I think it can look pretty good. Personally, in a typical console setting (say a 60" TV at a 10' couch distance), I'd prefer 1080p/60 with the extra resources going into very high polycount models, UHD textures everywhere, etc. You can see a great example of this on PC: run vanilla Skyrim maxed out at 4K, then run modded Skyrim with HD textures, model upgrades, HDR, etc. at 1080p, and the 1080p game looks MILES better.
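To put a rough number on those "extra resources", here's a quick sketch assuming the alternative target would have been 4K/30. Pixel throughput is only a crude stand-in for shading cost, but it shows where the headroom comes from:

```python
# Resolution-vs-assets trade-off: pixels shaded per second at 4K/30 vs 1080p/60.
def pixels_per_second(width, height, fps):
    """Pixels shaded per second at a given resolution and frame rate."""
    return width * height * fps

budget_4k30 = pixels_per_second(3840, 2160, 30)     # ~249 Mpix/s
budget_1080p60 = pixels_per_second(1920, 1080, 60)  # ~124 Mpix/s

print(f"4K/30:    {budget_4k30 / 1e6:.0f} Mpix/s")
print(f"1080p/60: {budget_1080p60 / 1e6:.0f} Mpix/s")
saved = (1 - budget_1080p60 / budget_4k30) * 100
print(f"1080p/60 shades roughly {saved:.0f}% fewer pixels per second than 4K/30,")
print("freeing that slice of the frame budget for higher-polycount models,")
print("bigger textures, better lighting, etc.")
```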

I'm not anti-4K either. I have a 4K 60 Hz DisplayPort panel (Dell) and a 1440p 144 Hz display, and I even prefer 1440p/60 over 1080p/144 by a pretty big margin. That's sitting at my computer desk in my gaming throne, though :) I have exceptional vision, but 4K @ 10' is meh to me. I can tell 720p from 1080p at that distance, but 1080p vs 4K at that distance is tougher (granted, I only have a 60" TV for my gaming HTPC; with an 85" or larger I might feel differently).
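For what it's worth, the acuity math lines up with that. A quick sketch, assuming a flat 16:9 panel and the common ~60 pixels-per-degree rule of thumb for 20/20 vision:

```python
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_inches, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat 16:9 screen."""
    aw, ah = aspect
    width = diag_inches * aw / math.hypot(aw, ah)   # screen width in inches
    fov_deg = math.degrees(2 * math.atan(width / (2 * distance_inches)))
    return horiz_pixels / fov_deg

DISTANCE = 10 * 12  # 10 feet, in inches
for name, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(f"{name:>5}: ~{pixels_per_degree(60, px, DISTANCE):.0f} pixels per degree")

# Roughly 52 PPD for 720p (under the ~60 PPD acuity threshold, so the jump to
# 1080p is visible), ~78 PPD for 1080p and ~156 PPD for 4K (both past it, so
# 1080p vs 4K is much harder to tell apart at this size and distance).
```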