JRPGfan said:
LegitHyperbole said:
Too much cost for too little benefit. In Spider-Man 2 you're playing the game at such speed you aren't going to notice reflections, and overall it just looks like the contrast has been adjusted. In Cyberpunk and The Witcher 3 you'd swear you'd just adjusted gamma or contrast, it's that pitiful. I do see some really cool stuff on PC, but even 4090s struggle to bring about the really cool stuff. We are struggling for 60fps; even on the Pro, 60fps is still an issue with some patches. Raytracing should be completely ignored until the hardware can hit 4K 60, and then try supplementing with RT, but it shouldn't be a tool to ignore creating great graphics. It should be a complete afterthought. |
The real mind-f*** is pathtracing. There's these really old games they've added it to, which then look much, much better, but have a 4090 doing like 15-20 fps, while older and weaker cards are down in the 1-2 fps range.
As soon as you unlock more power, developers will find a way to waste it :P |
Yep, old games can take it better than newer games because they're not trying to do graphical evolution and this tech at the same time. Minecraft can do raytracing so well because it's so basic. This power should never have been unlocked on console; like you said, it's the biggest waste we've ever seen, and one that has buggered the "next-gen" feel of 9th gen.