Jizz_Beard_thePirate said:
JEMC said:

To be fair, we're reaching the limits of what can be done with silicon chips and of how much the nodes can be shrunk. Nvidia will have no node advantage this gen, while AMD will go, I think, from 5/6nm to 4nm, which isn't a big improvement either.

There's also the physical limitation of how big the chips can be. AMD tried the chiplet approach to solve that and failed in its first attempt, while Nvidia has yet to try.

We can expect architectural improvements and new designs, of course, but that alone can't make up for the other two problems.

And let's not forget that we're seeing new engines that don't give a f*ck about optimization and shoehorn in the different upscaling techs to make the games run by brute force.

So, all in all, this is the new reality, and you'd be a fool to think that a miracle will happen and things will go back to the way they used to be.

While I do agree that there is an impending wall of doom coming, I think Nvidia still could have configured the lineup a lot better while still making a ton of money. There's nearly a 10,000 CUDA core difference between the 5080 and the 5090. If they had configured the 5080 as a GB202 part and given it 12,000 or 14,000 CUDA cores, it probably could have matched a 4090. And if they had priced it at $1,000, priced the rest accordingly, and upped the 5070's VRAM to 16GB, this would have been a pretty good generation. But instead they chose the super greedy route, where the only GPU in the lineup that's remotely worth it is the 5070 Ti. The rest of the GPUs feel bad to buy, including the 5090, whereas the 4090 at least felt godly if you had the big bucks. The 5090 feels meh at best.

I honestly have no idea what Nvidia is trying to do with the 5080. For one, it's the first time that the xx80 part gets roughly half the cores of the top-of-the-line part. There was a time when those would have been xx70 specs, but not anymore.

But then there's the fact that they're using the full GB203-400 chip for the 5080, which means that the inevitable 5080 Super will either have to be a heavily cut-down GB202, or Nvidia will just replace the 2GB memory chips with 3GB ones, maybe add a slight frequency bump, and call it a day.

It's obvious that they've taken advantage of AMD not having high-end parts and cut the specs as much as they could, and then some.

Bofferbrauer2 said:
Jizz_Beard_thePirate said:

NVIDIA’s GeForce RTX 50 GPUs Expected to Have Limited Launch Availability; Team Green To “Nitpick” Retailer Distribution of SKUs

https://wccftech.com/nvidia-geforce-rtx-50-gpus-expected-to-have-limited-launch-availability/

NVIDIA GeForce RTX 5090 appears in first Geekbench OpenCL & Vulkan leaks

https://videocardz.com/newz/nvidia-geforce-rtx-5090-appears-in-first-geekbench-opencl-vulkan-leaks

Worst case a 26% uplift, best case 37%

Somehow, the RTX 50 keeps getting worse

There's a good reason why Nvidia is killing off the 4000 series cards so soon, unlike with past launches.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.