Captain_Yuri said:

NVIDIA GeForce RTX 40 Graphics Card Rumors: AD102 GPU For RTX 4090 24 GB, AD103 GPU For RTX 4080 16 GB, AD104 GPU For RTX 4070 12 GB, Titan-Class Up To 48 GB & 900W

https://wccftech.com/nvidia-geforce-rtx-40-graphics-card-rumors-ad102-gpu-for-rtx-4090-24-gb-ad103-gpu-for-rtx-4080-16-gb-ad104-gpu-for-rtx-4070-12-gb-titan-class-up-to-48-gb-900w/

900 Watt Titan?

Tbh that's a pretty large CUDA core gap between a 4080 and a 4090.

This gen they had to use GA102 for the 3080, whereas next gen, if they do decide to use AD103 for the 4080, either Nvidia is being greedy or they think that top-end RDNA 3 won't be able to compete with a full-spec AD102. It could also be that using AD103 was the only way to keep the 4080 price somewhat reasonable, whereas the 4090 might be crazy high. But we will see.

The cache increase will be a very interesting factor as well. We know that AMD's Infinity Cache helps a lot with achieving high fps at lower resolutions. So next gen you get a massive increase with Lovelace, going from 6 MB of L2 cache + a 384-bit memory bus to 96 MB of L2 cache + a 384-bit memory bus, versus 512 MB of L3 cache + a 256-bit memory bus on RDNA 3.

Also, a 300 W TDP for a 70-class card is kind of insane...
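To put the bus-width numbers from the quote in perspective, here's a quick back-of-the-envelope bandwidth calculation. The memory data rates are my own assumptions (the rumor doesn't specify them), so treat the outputs as ballpark figures:

```python
# Rough sketch: theoretical peak memory bandwidth from bus width and data rate.
# Data rates are assumptions: 21 Gbps GDDR6X as on the 3090 Ti, and a guessed
# 18 Gbps GDDR6 for RDNA 3 -- neither is confirmed by the rumor.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8 bits per byte) * Gbps per pin."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "AD102 (rumored): 384-bit @ 21 Gbps": (384, 21.0),
    "RDNA 3 (rumored): 256-bit @ 18 Gbps": (256, 18.0),
}

for name, (bus, rate) in configs.items():
    print(f"{name} -> {peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
# 384-bit @ 21 Gbps -> 1008 GB/s; 256-bit @ 18 Gbps -> 576 GB/s.
# The big caches exist precisely to compensate for raw bandwidth gaps like this.
```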

I'm having trouble separating what I hope from what I believe regarding these rumors, but I find it hard to believe the power consumption figures listed by kopite.

A 900 W card, really? That's simply insane. I could understand running some stress tests to see if the PCB can handle that, especially with the "power excursions" that can occur, but for an actual retail card it's really hard to believe. Just think about the coolers we've seen on the 3090 and 3090 Ti cards and then try to imagine them on a card that uses twice as much power. You would need at least a 240/280 AIO cooler to keep it in check, if not a bigger one.
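For a sense of scale, here's a crude radiator-sizing estimate. The per-millimetre figure is just what a 240 mm AIO on a 450 W 3090 Ti works out to; it's a rough enthusiast rule of thumb, not a spec:

```python
# Crude radiator sizing for a hypothetical 900 W card. The dissipation figure
# is derived from a 450 W 3090 Ti cooled by a 240 mm AIO -- a rough rule of
# thumb, not a measured number.

REFERENCE_CARD_W = 450        # 3090 Ti board power
REFERENCE_RADIATOR_MM = 240   # AIO radiator length used on such a card

watts_per_mm = REFERENCE_CARD_W / REFERENCE_RADIATOR_MM  # ~1.9 W per mm

card_power_w = 900
required_mm = card_power_w / watts_per_mm  # ~480 mm of radiator
print(f"A {card_power_w} W card would need ~{required_mm:.0f} mm of radiator "
      f"at similar temperatures and noise -- e.g. a full 480 mm unit.")
```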

Using smaller chips for the 4080 and 4070 doesn't seem right either, unless Ada brings much, much bigger improvements than first thought, or Nvidia knows that, for some reason, AMD has screwed up RDNA 3. Or, well, cost, as you said.

As for the 4070 using 300 W of power... oh, c'mon! I mean, let's see: the 3080, using the GA102 chip and Samsung's inefficient 8N process, has a 320 W TDP. And I'm supposed to believe that the smaller AD104, with rumored similar SM and CUDA core counts (although those could be doubled again), will use almost the same amount of power despite being on the best process node available? How?!
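The one way this could add up: dynamic power scales roughly as P ∝ C·V²·f, so a node's efficiency gains can be eaten entirely by chasing clocks. A toy illustration with made-up numbers:

```python
# Toy illustration of dynamic power scaling: P ~ C * V^2 * f.
# All numbers below are hypothetical, just to show how a more efficient
# node can still land at a similar TDP if clocks and voltage are pushed.

def relative_power(node_factor: float, freq_scale: float, volt_scale: float) -> float:
    """Power relative to baseline: node efficiency factor * f * V^2."""
    return node_factor * freq_scale * volt_scale ** 2

# Say the new node alone cuts power ~40% at the same clocks (factor 0.6),
# but clocks go up ~35% and voltage ~15% to chase performance:
print(relative_power(0.6, 1.35, 1.15))  # ~1.07 -> roughly the same power
```

Purely illustrative, but it shows why "better node" and "same TDP" aren't automatically contradictory.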

/rant
