Captain_Yuri said:
Well, it will be hard to tell until it actually comes out, but most likely it will be the former, like Kopite's other leaks. Logically it makes no sense for a 4070 Ti to use 400 watts with only 7,680 cores when the 4080, with its 10,000+ cores, will use 420 watts. I think the 4070 Ti can spike to 400 watts the way a 3070 Ti can spike to 360 watts, but on average it should be much less, just as a 3070 Ti generally uses 300-ish watts on average. So a 4070 Ti will be capable of handling 400 watts, but on average it will probably be closer to 300 watts.
I really hope it's the max power usage, but yeah, we'll see.
Also, if there's one thing all the rumors seem to agree on so far, it's that Ada will be more power-hungry than Ampere. In that case, a 4070 Ti would use more power than a 3070 Ti, and if the latter already uses 300-ish watts, then it makes sense for the new one to use more than that, especially with the extra, and faster, VRAM.
Bofferbrauer2 said:
If this hypothetical 4070 Ti is really a 400 W card, then Lovelace would be a pretty weak increase in terms of performance per watt. I really hope this is the peak power consumption and regular use is quite a bit lower; otherwise I fear NVidia will get trounced by AMD in that domain (while probably retaining the pure performance crown). This also wouldn't bode very well for mobile GPUs, as it would mean the biggest GPU chip in laptops would only be something like a 4060, since the others would be too damn power-hungry.
As I said, I hope it's just the, what do they call it, max thermal design power? The maximum it can draw when overclocked and overvolted.
As for laptops... well, they've managed to put a 12900K in some laptops. It's clear that if they want to put something in a laptop, they'll find a way to do so.
Please excuse my bad English.
Currently gaming on a PC with an i5-4670K @ stock (for now), 16 GB RAM @ 1600 MHz, and a GTX 1070
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.