Captain_Yuri said:
             7900 XTX             4080
Memory       24 GB, 384-bit bus   16 GB, 256-bit bus
ROPs         192                  112
Bandwidth    960.0 GB/s           716.8 GB/s
Shader units 96 CUs               76 SMs
RT cores     96                   76
FP32 compute 61.42 TFLOPS         48.74 TFLOPS
TDP          355 W                320 W
Design       Chiplet              Monolithic


Yet somehow, these two are within single-digit percent of each other in raster and around 30-40% apart in ray tracing, in favor of the 4080. It's like the inverse of Ampere vs RDNA 2, except that Ampere at least had the ray tracing and DLSS advantage, while RDNA 3 has no advantage other than price. And when a person is already spending $1000 and the competition is $200 more but gives you superior ray tracing + DLSS while being more power efficient... AMD just might have sold the 4080 for Nvidia...
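For what it's worth, the on-paper gaps implied by the quoted table work out like this (a quick sketch using only the numbers posted above; real-world raster results are, as noted, far closer than the theoretical compute gap suggests):

```python
# Spec figures quoted in the comparison table above.
xtx = {"tflops": 61.42, "bandwidth_gbps": 960.0, "tdp_w": 355}
rtx_4080 = {"tflops": 48.74, "bandwidth_gbps": 716.8, "tdp_w": 320}

def pct_advantage(a: float, b: float) -> float:
    """Percent by which value a exceeds value b."""
    return (a / b - 1) * 100

compute_gap = pct_advantage(xtx["tflops"], rtx_4080["tflops"])            # ~26% more theoretical FP32
bandwidth_gap = pct_advantage(xtx["bandwidth_gbps"], rtx_4080["bandwidth_gbps"])  # ~34% more bandwidth
power_gap = pct_advantage(xtx["tdp_w"], rtx_4080["tdp_w"])                # ~11% higher TDP
```

Despite a ~26% theoretical compute lead and ~34% more bandwidth at ~11% higher rated power, the cards land within single digits of each other in raster, which is exactly the oddity being pointed out.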

I don't know how a company can create a chiplet architecture that is less efficient than a monolithic one, but here we are... Even their transient spikes feel like Ampere spikes, just on TSMC.

Overall, both the 7900 XTX and the 4080 are terrible products for the price. Hopefully both will get a price cut sooner rather than later.

Agree, overall. As I mentioned in my post above, when dealing with high-tier cards, people go for sheer performance. If RT is a generation behind and there's no advantage in wattage, the selling points for AMD shrink. People are counting cost-per-frame but failing to take into account the superior RT performance, as well as the likely immense strides DLSS 3.0 will make in the coming 1-2 years. I really can't believe how poorly they managed efficiency on these cards; they're thirsty as hell. I was equally surprised by the new line of Ryzen CPUs, which have some insane consumption figures attached to them.
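For clarity, the cost-per-frame metric people keep citing is just price divided by average frame rate. The frame rates below are purely hypothetical placeholders for illustration, not benchmark results; only the $1000 and $1200 price points come from this thread:

```python
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second of performance."""
    return price_usd / avg_fps

# HYPOTHETICAL fps values, chosen only to show the arithmetic:
# if both cards averaged 100 fps in raster, the thread's prices give
xtx_cpf = cost_per_frame(1000, 100)   # $10.00 per fps
rtx_cpf = cost_per_frame(1200, 100)   # $12.00 per fps
```

The point being made is that this simple ratio ignores everything the metric can't see: RT performance, upscaling quality, and power draw.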

Personally, I think the 4080 will still be tough to move, at this price at least. There were probably a few gamers on the fence, waiting to see what RDNA3 could provide, and I have a feeling most of them were a bit disappointed; I know I was. This reminds me of when I held off on the 980 Ti, waiting to see what AMD had to offer. It turned out to be a waste of time back then, and I believe this is the same case now. As you say, a couple of hundred dollars either way isn't much to folks willing to dish out $1,000+ on a GPU.