To both of you:
Darc Requiem said:
I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap between models big enough to fit the -Ti models that will come next year, so those bring enough of a performance improvement to look good. After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what Nvidia has been using these last few generations, except that the 4080 uses AD103 when Nvidia rarely uses the x03 designation. |
I couldn't disagree more. This is about Nvidia trying to hide the fact that they are charging double the price for what should be the 4070 and 4060 Ti. There is a 6656 CUDA core gap between the "4080" 16GB and the 4090, but only a 1792 CUDA core difference between the 4090 and the RTX 6000, which has the full AD102 die and therefore the CUDA core count the eventual 4090 Ti will have. Here is the RTX 20 series for further reference:
2080 Ti - 4352 CUDA cores
2070 - 2304 CUDA cores (52.9%)
2060 - 1920 CUDA cores (44.1%) |
And
Bofferbrauer2 said:
JEMC said:
I don't really agree with that. Sure, the difference between the 4090 and the real 4080 is massive, but I think it has more to do with Nvidia leaving a gap between models big enough to fit the -Ti models that will come next year, so those bring enough of a performance improvement to look good. After all, while Nvidia didn't say a thing, GALAX confirmed the chips used on each GPU, and they don't differ much from what Nvidia has been using these last few generations, except that the 4080 uses AD103 when Nvidia rarely uses the x03 designation. |
The 12GB 4080 has fewer CUDA cores than either of the 3080 models (8704 for the 10GB version; 8960 for the 12GB version), while the 3080 Ti was almost a full 3090 with just half the VRAM. Also, both 3080 models had higher memory bandwidth than either of the 4080 models. While I can understand Nvidia wanting to leave a bigger gap between the 4090 and some 4080 Ti down the line, the gap between the 4080 and 4090 is too big: it has enough room for three "3080 Ti"-style intermediate models, where Ampere only needed one between the 3080 and 3090. Basically, the 16GB 4080 is more like a 4070 Ti with a 3080 Ti price tag. |
I guess we'll have to agree to disagree here, because I don't think Nvidia would take a 4060-class chip and disguise it as a 4080 unless:
A) They'd gone nuts and lost all connection with reality
B) They somehow know that AMD's RDNA3 sucks and can use a mid-range chip to compete with their high-end stuff
Personally, for the sake of competitiveness and the market, I don't like either option.
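For what it's worth, the tier ratios both posts lean on are easy to check. A quick sketch (the Turing counts are the ones quoted above; the Ada counts of 16384 / 9728 / 7680 are the publicly listed figures, consistent with the 6656-core gap Darc Requiem cites):

```python
# CUDA core counts: Turing numbers as quoted in the thread,
# Ada numbers from public spec listings (consistent with the
# 6656-core 4080 16GB -> 4090 gap mentioned above).
cards = {
    "RTX 2080 Ti": 4352, "RTX 2070": 2304, "RTX 2060": 1920,
    "RTX 4090": 16384, "RTX 4080 16GB": 9728, "RTX 4080 12GB": 7680,
}

def pct(card: str, flagship: str) -> float:
    """Core count of `card` as a percentage of `flagship`."""
    return 100 * cards[card] / cards[flagship]

# Turing: where each tier sat relative to the 2080 Ti
print(f"2070 / 2080 Ti:    {pct('RTX 2070', 'RTX 2080 Ti'):.1f}%")      # 52.9%
print(f"2060 / 2080 Ti:    {pct('RTX 2060', 'RTX 2080 Ti'):.1f}%")      # 44.1%

# Ada: where the two "4080" models sit relative to the 4090
print(f"4080 16GB / 4090:  {pct('RTX 4080 16GB', 'RTX 4090'):.1f}%")    # 59.4%
print(f"4080 12GB / 4090:  {pct('RTX 4080 12GB', 'RTX 4090'):.1f}%")    # 46.9%
```

By Turing's yardstick, the 12GB "4080" sits at almost exactly the cut-down ratio a xx60-class card used to, which is the core of the naming complaint.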
Please excuse my bad English.
Currently gaming on a PC with an i5-4670K @ stock (for now), 16GB RAM at 1600 MHz and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.