
Darc Requiem said:

The RTX 40 series is actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060ti vs a 3090, and then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF Nvidia.

4090 - 16384 Cuda Cores
4080 (16GB) - 9728 Cuda Cores (59.38%)
4080 (12GB) - 7680 Cuda Cores (46.88%)

3090 - 10496 Cuda Cores
3070 - 5888 Cuda Cores (56.1%)
3060ti - 4864 Cuda Cores (46.34%)

I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap between models big enough to fit the Ti variants coming next year, so those bring enough of a performance improvement to look good.

After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what Nvidia has been using these last few generations, with the exception of the 4080 being AD103, when Nvidia rarely uses the x03 name.

I couldn't disagree more. This is about Nvidia trying to hide the fact that they are charging double the price for what should be the 4070 and 4060ti. There is a 6656 CUDA core gap between the "4080" 16GB and the 4090, but only a 1792 CUDA core difference between the 4090 and the RTX 6000, which has the CUDA core count the eventual 4090ti will have, since it uses the full AD102 die. Also, here is the RTX 20 series for further reference.

2080ti - 4352 Cuda Cores
2070 - 2304 Cuda Cores (52.9%)
2060 - 1920 Cuda Cores (44.1%)
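For anyone who wants to double-check the numbers being argued over, here's a quick Python sketch that recomputes each card's CUDA core count as a percentage of its generation's flagship, using only the core counts quoted in this thread:

```python
# CUDA core counts as quoted in the thread: flagship plus the two
# lower-tier cards being compared, per generation.
generations = {
    "RTX 40": (("4090", 16384), [("4080 16GB", 9728), ("4080 12GB", 7680)]),
    "RTX 30": (("3090", 10496), [("3070", 5888), ("3060ti", 4864)]),
    "RTX 20": (("2080ti", 4352), [("2070", 2304), ("2060", 1920)]),
}

for gen, (flagship, cards) in generations.items():
    flag_name, flag_cores = flagship
    for name, cores in cards:
        pct = 100 * cores / flag_cores  # share of the flagship's cores
        print(f"{gen}: {name} has {cores} cores, {pct:.2f}% of the {flag_name}")
```

Run it and you get 59.38% / 46.88% for the two 4080s, 56.10% / 46.34% for the 3070 and 3060ti, and 52.94% / 44.12% for the 2070 and 2060, which is the pattern the original comparison is based on.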