TallSilhouette said:
The full PS6 (not cross-gen) era, I'm guessing. So maybe 2030?
I don't think we'll see next-next-gen-only titles by 2030, at least not if the rumor about the PS6 launching in 2027 is accurate, which we don't know yet.
In any case, if MSoft is already going for it now, I won't be surprised if the change happens sooner than that.
Cyran said:
Good video. One thing that surprised me, in this video and multiple others (the Level1Techs review, for example), is that for a single-player game like Cyberpunk, multi-frame generation seems to work really well, to the point that if you have a 4K 240 Hz monitor it might be the best experience with multi-frame generation on. I plan to use mine for more than gaming, like running AI LLMs and eventually training, which is where the 5090 does have a big advantage over a 4090. Either way, I'm coming from a 3090, so I still see a huge increase in gaming plus the huge increase in AI stuff. My only concern with power draw is tripping the circuit breaker in my office, as it would be a pain trying to get an electrician to run another line to my office, since it's on the opposite side of the house from my breaker box. I'm already running a 1600-watt PSU, so from that standpoint I've got that covered.
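As an aside on the breaker concern in the quoted post above: here's a minimal sanity-check sketch, assuming a standard North American 15 A / 120 V circuit and the common 80% continuous-load rule. None of these numbers come from the thread except the 575 W figure, which is the RTX 5090's official TGP; the rest-of-system estimate is hypothetical.

```python
# Rough headroom check for a household circuit (assumed values, not from the thread).
BREAKER_AMPS = 15        # assumed breaker rating
LINE_VOLTS = 120         # assumed North American line voltage
CONTINUOUS_FACTOR = 0.8  # common 80% rule for continuous loads

usable_watts = BREAKER_AMPS * LINE_VOLTS * CONTINUOUS_FACTOR  # 1440 W budget

# Hypothetical system draw: 575 W (RTX 5090 TGP) plus an assumed ~300 W
# for CPU, drives, fans, and losses. Monitors on the same circuit add more.
system_watts = 575 + 300

print(f"Safe continuous budget: {usable_watts:.0f} W")
print(f"Estimated system draw:  {system_watts} W")
print("OK" if system_watts <= usable_watts else "Risk of tripping the breaker")
```

One implication worth noting: a 1600 W PSU can't actually draw 1600 W continuously from a 15 A / 120 V circuit, so the PSU rating alone doesn't settle the breaker question.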
Keep in mind that while you see more frames, only one in four is real, and that real one is the one that dictates latency. If you're sensitive to that or play fast-paced games, it may feel weird.
Reflex 2 should help with that, but Nvidia hasn't launched it yet, so we don't know for sure.
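To make the "one in four" point concrete, here's a minimal sketch with hypothetical numbers (not taken from any review) showing how 4x multi-frame generation raises the displayed frame rate without improving the underlying input latency:

```python
# Toy illustration of 4x multi-frame generation: displayed fps goes up,
# but input latency still tracks the rendered (real) frame rate.
rendered_fps = 60      # hypothetical base render rate
mfg_factor = 4         # 1 real frame + 3 generated frames

displayed_fps = rendered_fps * mfg_factor
real_frame_interval_ms = 1000 / rendered_fps  # input cadence follows real frames

print(f"Displayed frame rate:  {displayed_fps} fps")            # 240 fps on screen
print(f"Real frame interval:   {real_frame_interval_ms:.1f} ms")  # still ~16.7 ms
```

In other words, a 240 Hz output driven by 60 rendered fps still responds like 60 fps, and in practice slightly worse, since frame generation adds some buffering overhead on top.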
Jizz_Beard_thePirate said:
ASUS PCIe Slot Q-Release Slim mechanism may scratch your GPU, first RTX 5090 affected
Pretty big fuckup by Asus
NVIDIA has removed "Hot Spot" sensor data from GeForce RTX 50 GPUs
https://videocardz.com/pixel/nvidia-has-removed-hot-spot-sensor-data-from-geforce-rtx-50-gpus
It feels like Asus can't get it right nowadays. The worst part is that they had a system that worked so well everyone else copied it, and now they've decided to abandon that for a new thing... that creates new problems.
Granted, this is only a problem for reviewers, not regular users who rarely change GPUs, but it's another stain on their perceived quality.
Getting rid of the Hot Spot data is weird. I wonder why they've done it.