Bofferbrauer2 said:
Captain_Yuri said:

NVIDIA GeForce RTX 40 Graphics Card Rumors: AD102 GPU For RTX 4090 24 GB, AD103 GPU For RTX 4080 16 GB, AD104 GPU For RTX 4070 12 GB, Titan-Class Up To 48 GB & 900W

https://wccftech.com/nvidia-geforce-rtx-40-graphics-card-rumors-ad102-gpu-for-rtx-4090-24-gb-ad103-gpu-for-rtx-4080-16-gb-ad104-gpu-for-rtx-4070-12-gb-titan-class-up-to-48-gb-900w/

900 Watt Titan?

Tbh that's a pretty large CUDA core gap between a 4080 and a 4090:

This gen, they had to use GA102 for the 3080, whereas next gen, if they do decide to use AD103 for the 4080, either Nvidia is getting greedy or they think top-end RDNA 3 won't be able to compete with a full-spec AD102. It could also be that using AD103 was the only way to keep the 4080 price somewhat reasonable, while the 4090 might be crazy high. We will see, though. The cache increase will be a very interesting factor as well. We know that AMD's Infinity Cache helps a lot with achieving high fps at lower resolutions. So next gen, Lovelace gets a massive increase, going from 6 MB of L2 cache on a 384-bit memory bus to 96 MB of L2 cache on a 384-bit bus, versus 512 MB of L3 cache on RDNA 3 with a 256-bit bus.
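For a rough feel for why a big on-die cache can stand in for bus width, here's a toy effective-bandwidth model. Everything in it is an assumption for illustration: the hit rates and cache bandwidths are made-up numbers, not leaked specs.

```cpp
#include <cstdio>

// Toy model: effective bandwidth = hit_rate * cache_bw + (1 - hit_rate) * dram_bw.
// Every number below is an illustrative assumption, not a measured or leaked spec.
int main() {
    struct Config {
        const char* name;
        double dram_bw;   // GB/s delivered by the memory bus
        double cache_bw;  // GB/s assumed for the on-die cache
        double hit_rate;  // assumed fraction of traffic served from cache
    };

    const Config configs[] = {
        {"384-bit bus + 96 MB L2 (assume 60% hits)", 1008.0, 5000.0, 0.60},
        {"256-bit bus + 512 MB L3 (assume 75% hits)", 576.0, 2000.0, 0.75},
    };

    for (const Config& c : configs) {
        double effective = c.hit_rate * c.cache_bw + (1.0 - c.hit_rate) * c.dram_bw;
        std::printf("%-45s -> ~%.0f GB/s effective\n", c.name, effective);
    }
    return 0;
}
```

The resolution angle falls out of the same model: the working set grows with render resolution, so the hit rate drops at 4K, which lines up with Infinity Cache helping most at lower resolutions.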

Also, a 300 W TDP for a 70-class card is kind of insane...

I wonder if that 900W GPU isn't actually two AD102 dies in some kind of SLI setup; two roughly 450 W GPUs would add up to 900 W.

Either way, no need for a heater anymore if you've got a Lovelace GPU. I wonder how Nvidia plans to use these in the laptop market if they are so power-hungry...

Captain_Yuri said:

NVIDIA claims superiority with its Game Ready drivers, jokes about competitors’ sub-par beta drivers with multiple forks

https://videocardz.com/newz/nvidia-claims-superiority-with-its-game-ready-drivers-jokes-about-competitors-sub-par-beta-drivers-with-multiple-forks

While AMD drivers aren't as buggy as they once were, Nvidia overall has better driver support, especially when you consider that AMD has stopped driver support for its 300 series GPUs, released in 2015, while Nvidia continues to support its 900 series GPUs, released in 2014...

Could be due to Maxwell supporting DX12_1 while GCN3 only supports DX12_0 and WDDM 2.0 (Maxwell: WDDM 2.1). GCN3 is just too outdated for modern games.
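For anyone curious, the DX12_1 vs. DX12_0 split is easy to check on your own hardware. Here's a minimal Windows/D3D12 sketch (error handling omitted; nothing here is specific to Maxwell or GCN3, it just queries whatever adapter you have):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    // Try feature level 12_1 first (what Maxwell exposes), then fall back
    // to 12_0 (the ceiling for GCN3). nullptr selects the default adapter.
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1,
                                    IID_PPV_ARGS(&device)))) {
        std::puts("Adapter supports feature level 12_1");
    } else if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                           IID_PPV_ARGS(&device)))) {
        std::puts("Adapter tops out at feature level 12_0");
    } else {
        std::puts("No D3D12-capable adapter found");
    }
    if (device) device->Release();
    return 0;
}
```

A driver can keep shipping updates for a 12_0 card, of course; the feature-level gap just gives a vendor a convenient cutoff line.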

Meh, I think the 300 series had plenty of life left in it.

Just look at how well the open-source community driver performs in tests, and it can still play a ton of games. The fact that Kepler, with its terrible architecture, received longer driver support than the 300 series, which had a much more modern feature set, is nuts.
