Bofferbrauer2 said:

Higher TDP?

  • HD 7970: 250W
  • R9 380X: 190W
  • RX 470: 120W
  • RX 570: 150W
  • RX 5600: 150W
  • RX 6600XT: 160W
  • RX 7600: 170W
  • RX 7600XT: 190W

Up to the RX 470, the TDP actually went down a lot for the same number of CUs, with only a small clock speed increase each generation.

Since then, however, clock speeds have been ramped up significantly, and the TDP went up with them - but not by enough to be the reason for the higher performance on its own, though it certainly helped.
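For the curious, here's that W-per-CU trend as a quick Python sketch. The TDPs are the ones listed above; the CU counts and approximate boost clocks are my own additions from memory of the TechPowerUp database, so treat them as assumptions rather than quoted specs:

```python
# Rough W-per-CU trend for the cards listed above. TDPs are from the
# post; CU counts and boost clocks are approximate values from memory
# (TechPowerUp), included here as assumptions for illustration.
cards = [
    # (name, TDP [W], CUs, approx. boost clock [MHz])
    ("HD 7970",    250, 32,  925),
    ("R9 380X",    190, 32,  970),
    ("RX 470",     120, 32, 1206),
    ("RX 570",     150, 32, 1244),
    ("RX 5600",    150, 32, 1620),
    ("RX 6600 XT", 160, 32, 2589),
    ("RX 7600",    170, 32, 2655),
    ("RX 7600 XT", 190, 32, 2755),
]

for name, tdp, cus, clock in cards:
    print(f"{name:11s} {tdp / cus:5.2f} W/CU at ~{clock} MHz")
```

The W/CU figure bottoms out at the RX 470 (~3.75 W/CU versus ~7.8 W/CU for the 7970) and only creeps back up as the clocks more than double - which is the point being made above.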

Context. :P
I was talking about the RX 470 vs the RX 570, as they are essentially the same part.


JEMC said:
Jizz_Beard_thePirate said:

Intel says ‘stay tuned’ to those asking for Arc B770

https://videocardz.com/newz/intel-says-stay-tuned-to-those-asking-for-arc-b770

Looking at the TechPowerUp GPU database, which is what I mostly use to compare GPUs besides Guru3D (they're the only ones that still include my GTX 1070 in some tests), there's an almost 30% difference in performance between the B580 and the 5060 Ti 16GB.

That's quite a big gap, and it would be unfair to ask the B770 to deliver that kind of performance jump. But if Intel can make the B770 just 5-10% slower than the 5060 Ti 16GB and launch it at $350, it would be a hell of a card that would put Nvidia and AMD in trouble... if Intel can actually produce enough of them.

One can dream, right?
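For the napkin math on that (the ~30% gap and the $350 target are the numbers from the post above; the $429 MSRP for the 5060 Ti 16GB is my assumption):

```python
# Hypothetical B770 value proposition, per the post above. The ~30%
# B580 -> 5060 Ti 16GB performance gap and the $350 target are from
# the post; the $429 MSRP for the 5060 Ti 16GB is an assumption.
RTX_5060_TI = 1.30          # relative performance, B580 = 1.0
RTX_5060_TI_PRICE = 429     # assumed MSRP, USD
B770_PRICE = 350            # target price from the post

for deficit in (0.05, 0.10):             # 5% or 10% behind the 5060 Ti
    b770 = RTX_5060_TI * (1 - deficit)   # B770 relative performance
    value = (b770 / B770_PRICE) / (RTX_5060_TI / RTX_5060_TI_PRICE)
    print(f"B770 at {deficit:.0%} behind: {b770:.2f}x a B580, "
          f"{value:.2f}x the perf-per-dollar of the 5060 Ti 16GB")
```

Even at a 10% deficit, that would work out to roughly 10% better performance-per-dollar than the 5060 Ti 16GB at MSRP.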

How -is- that 1070 faring these days?


Bofferbrauer2 said:

I wish I still had the old PC with the Core 2 Duo E4300 and Radeon X1650 Pro (it was my stepfather's PC, and despite the low-power hardware inside it got hot as hell) so I could pair that thing with a 5090...

I still have my Core 2 Quad Q9650 PC, paired up with an SFF GDDR5 GeForce GT 1030.

I also have a mini-ITX PC with an old Intel Atom 330 dual-core, 2GB of RAM and a PCI Radeon HD 4650 (NOT PCI-E) graphics card.
The 4650 was interesting as it used a PCI-E to PCI bridge chip; it was an extremely rare and unique part, which I assume was aimed at industrial/signage use as it was passively cooled. It was actually my test bench for the more modern Oblivion shaders I was developing back in the day... and for seeing how far I could push the very first Intel Atom dual-core in gaming. (Hint: not very far.)

And of course I have an Athlon XP 2600+ with 512MB of RAM, a pair of 3dfx Voodoo 2s and a Radeon 9550 for TruForm (early tessellation) in Morrowind, all on Win98.

Conina said:
haxxiy said:

You reminded me that early 28 nm and early GCN were both a big oof in terms of power consumption.

The 7970 was exceptionally bad in that regard: a typical AMD flagship GPU, clocked way past the efficient point of its power-frequency curve.

Crossfire systems needed upwards of 1,000 watts:

"A 1,000 watt power supply should be the minimum to ensure stable operation. In the case of the Radeon HD 7970 GHz Edition, however, even this is not enough - especially when overclocking. We ourselves opted for a 1,600 watt power supply after the system could not be operated stably in extreme situations with a 1,000 watt power supply."

I ran 4x Radeon HD 6950s in Crossfire back in the day, and managed that on a 950W Corsair PSU.

The interesting tidbit is that the jump from 3 to 4 GPUs resulted in a negligible power consumption increase, but also a negligible performance increase; AMD's drivers at the time simply weren't capable of utilizing a fourth GPU effectively.

...But I was also gaming at 5760x1080 back then, roughly 1.7x the pixels of 1440p, which was stupidly demanding.
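For reference, the pixel math (nothing assumed beyond the two resolutions):

```python
# 5760x1080 (triple 1080p Eyefinity) vs. 2560x1440 (1440p).
eyefinity = 5760 * 1080   # 6,220,800 pixels
qhd       = 2560 * 1440   # 3,686,400 pixels
print(f"{eyefinity / qhd:.2f}x")   # ~1.69x the pixels of 1440p
```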
