haxxiy said:
JEMC said:

It's incredible how much Intel, AMD and Nvidia push their products away from their efficiency sweet spot just for a relatively small increase in performance. As those tests show, Nvidia could have easily toned down the card a bit and still offered a huge leap in performance while being a marvel of efficiency.

It's no wonder overclocking the newer CPUs and GPUs is almost pointless.

I mean, it has always been kind of like this, with few exceptions, due to the quadratic relation between voltage and power consumption. It's just far more noticeable now due to the latter reaching ludicrous levels.

(And in the 80s and early 90s CPUs consumed like 10 Watts...)
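That quadratic relation is just the usual dynamic-power rule of thumb, P ≈ C · V² · f: chasing a higher clock usually means feeding the chip more voltage, and the voltage term goes in squared. A minimal sketch with made-up numbers (the capacitance constant and the voltage/frequency pairs are purely illustrative, not real silicon data):

```python
# Rough sketch of the dynamic-power rule of thumb: P ~ C * V^2 * f.
# The effective capacitance and the V/f pairs are made-up illustrative
# values, not measurements of any real chip.

C = 4.0e-8  # effective switched capacitance in farads (arbitrary)

def dynamic_power(voltage_v, freq_hz):
    """Dynamic switching power in watts: P = C * V^2 * f."""
    return C * voltage_v ** 2 * freq_hz

sweet_spot = dynamic_power(1.00, 2.5e9)  # hypothetical 1.00 V @ 2.5 GHz
pushed     = dynamic_power(1.20, 3.0e9)  # hypothetical 1.20 V @ 3.0 GHz

print(f"sweet spot: {sweet_spot:.0f} W")   # ~100 W
print(f"pushed:     {pushed:.0f} W")       # ~173 W
print(f"+{(3.0 / 2.5 - 1) * 100:.0f}% clock costs "
      f"+{(pushed / sweet_spot - 1) * 100:.0f}% power")
```

So a 20% clock bump that needs an extra 0.2V can cost over 70% more power, which is exactly why the factory-pushed configs sit so far from the efficiency sweet spot.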

And yes, 10W was already considered extreme back then. I can still remember my old 286 having no cooler at all, not even a passive one, and reports of Motorola's 68k series, specifically the 68020 and 68030, running hot, while the 68040 and 68060 needed a (passive) cooler. Only with the Pentium Pro did active cooling really start to take off.

Here are some power consumption numbers from old Pentium chips:

Pentium 90: 9W

Pentium 133: 11.3W

Pentium MMX 233: 17W

Pentium Pro 166: 35W

Pentium II 350: 21.7W

These days, all of those would go into ultrathin laptops. But back then they drew a lot of power, which is why Intel started making mobile versions of the Pentium II (until then, laptops still mostly used 486 DX2/DX4 chips in 3.3V mode, or simply the original P54C Pentiums), which ran at about half the power consumption.

Mobile Pentium II 333: 11.8W

Pentium III also stayed below 35W. It's only with the Pentium 4 that TDPs started exploding.

It wasn't just the CPUs either. Have a look at those old non-3D-capable ATi GPUs: none of them came with a cooler on any chip. Only with the 3D-capable ATi Rage and Nvidia Riva TNT (yes, Nvidia made GPUs before the GeForce series, which is also where the tuning tool RivaTuner got its name) or the 3DFX Voodoo Banshee did GPUs start getting coolers.

But with the coming of 3D graphics, there was an extra roadblock on the way to high TDPs: early GPUs got their power only through the PCI/AGP/PCIe slot, which limits what a card can draw (at most 75W on PCIe, and even less on PCI and AGP). Cooling solutions weren't sophisticated enough to handle much more than that either, so early cards either couldn't run at their expected speeds (most later 3DFX chips ran into this) or sounded like a hairdryer on speed (looking at you, GeForce FX 5800!). And while board power jumped rather quickly to about 250W from there, it then stayed there for quite a while, right up until the last GPU generation:

GeForce 8800 GTX: 145W

GeForce GTX 285: 203W

GeForce GTX 480: 250W (it was nicknamed 'Thermi', after its Fermi codename, for a good reason)

GeForce GTX 780Ti: 230W

GeForce RTX 2080Ti: 250W

But now it's exploding again. At this rate, by 2030 a 300W model will be low mainstream, almost entry-level. I mean, how many modern GPUs still get released without any extra power connector at all? Even entry-level cards these days have at least a 6-pin.
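For scale, here's a quick back-of-the-envelope of how much a card can draw while staying within spec, depending on its plugs (75W from the slot, 75W per 6-pin, 150W per 8-pin; the example card configurations are just illustrations):

```python
# Back-of-the-envelope board power ceilings from the usual PCIe limits:
# 75 W from the x16 slot, 75 W per 6-pin plug, 150 W per 8-pin plug.
# The example card configurations below are only illustrations.

SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_ceiling(connectors):
    """Maximum in-spec board power for a given set of auxiliary plugs."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_ceiling([]))                  # slot-only card:  75 W
print(board_power_ceiling(["6-pin"]))           # one 6-pin:      150 W
print(board_power_ceiling(["6-pin", "8-pin"]))  # 6-pin + 8-pin:  300 W
print(board_power_ceiling(["8-pin", "8-pin"]))  # dual 8-pin:     375 W
```

A slot-only card caps out at 75W, while a dual-8-pin card can already sit at 375W, so once even cheap boards routinely ship with one or two plugs, a 300W "mainstream" card stops looking far-fetched.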