ethomaz said:

I agree with @drkohler.

The example you gave is with multipliers... so you didn't change the base clock, you just went to a higher multiplier... like 10x200 = 2000 MHz or 11x200 = 2200 MHz... no change in voltage.

But increasing the base clock has a limit... you keep raising it until you have to increase the voltage to stay stable.

Now remember that GPUs only have the base clock (AMD GPUs, anyway... nVidia has two clocks, but I won't get into that).


I never mentioned anything about multipliers, nor do we have the option to play with multipliers on GPUs.

My point still stands: when you raise the voltage, heat and power consumption increase quadratically, not linearly. (I.e., in a linear circuit the increase goes with the square of the voltage.)
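To put a made-up number on that square law: at a fixed clock, bumping the core voltage from 1.00 V to 1.10 V multiplies dynamic power by 1.10² ≈ 1.21, so a 10% voltage increase already costs you roughly 21% more power and heat.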

If you increase the clock speed and not the voltage, the increase in heat and power consumption is only linear, not quadratic, and you don't *have* to increase the voltage to run at a higher clock speed (until you hit a wall and need more voltage, that is). Again, decades' worth of proof from overclockers is available for you to peruse via Google.
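As a rough sketch of the difference (assuming the textbook dynamic-power approximation P ≈ C·V²·f; the ratios below are illustrative, not measurements from any real GPU):

```python
# Back-of-the-envelope comparison using the textbook dynamic-power
# approximation P ≈ C * V^2 * f. Numbers are illustrative, not real GPU data.

def relative_power(v_ratio: float, f_ratio: float) -> float:
    """Dynamic power relative to stock: quadratic in voltage, linear in clock."""
    return v_ratio ** 2 * f_ratio

# Clock-only overclock: +20% frequency at stock voltage -> +20% power.
print(relative_power(v_ratio=1.0, f_ratio=1.2))   # 1.2

# Same +20% clock plus a +10% voltage bump for stability -> ~+45% power.
print(relative_power(v_ratio=1.1, f_ratio=1.2))   # 1.452
```

Running it prints 1.2 and then about 1.45: the clock-only bump adds 20% power, while the combined bump adds roughly 45%, which is exactly why voltage is the knob overclockers watch.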

Also, nVidia stopped running the shader clock separately from the core clock some time ago.



--::{PC Gaming Master Race}::--