JEMC said:

What you're not taking into account with the 12100F is the frequency that CPU runs at: the optimal one.

Every chip, CPU and GPU alike, has an optimal setting where it runs at the highest efficiency possible. If your chip doesn't reach that point, it can't reach its full potential, but if the chip goes beyond that point, you get more performance at the cost of efficiency.

The 12100 is a low-end CPU that Intel runs at its peak efficiency setting, but the 14700 and 14900 parts are high-end CPUs where every bit of extra performance matters, and Intel values raw performance more than efficiency. That's why those chips run well beyond the peak efficiency point, making them a lot less efficient.

Indeed. The voltage-frequency curve is like a sigmoid function, with frequency on the Y axis and voltage on the X axis, and the exact curve differs dramatically from product to product. Mind, the most efficient point is technically near the idle frequencies (think an undocked Switch, or battery-saving mode on phones), while what counts as "optimal" here is a bit more subjective.
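A minimal sketch of that shape and what it does to efficiency, assuming a logistic V-f curve, the classic dynamic-power approximation P ≈ C·V²·f, and a fixed leakage floor. Every constant here is made up for illustration, not taken from real silicon:

```python
# Illustrative sketch only: a logistic V-f curve plus the classic
# dynamic-power approximation P ~ C * V^2 * f, with a constant
# static/leakage floor. None of these numbers come from real silicon.
import math

def freq_ghz(v):
    """Hypothetical sigmoid V-f curve: frequency saturates as voltage rises."""
    f_max, v_mid, k = 6.0, 1.0, 8.0   # assumed ceiling (GHz), midpoint (V), steepness
    return f_max / (1.0 + math.exp(-k * (v - v_mid)))

def power_w(v, f):
    """Dynamic power C*V^2*f plus a constant leakage term (both assumed)."""
    c_eff, p_static = 4.0, 3.0
    return c_eff * v * v * f + p_static

for v in (0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3):
    f = freq_ghz(v)
    p = power_w(v, f)
    print(f"V={v:.2f} V  f={f:.2f} GHz  P={p:5.1f} W  perf/W={f / p:.3f}")
```

With these made-up constants, perf/W peaks around the knee of the curve (~0.9 V here) and then falls steadily as the V² term takes over, which is exactly the "beyond the peak efficiency point" regime described above.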

Intel has been pushing frequency again because the cost of IPC gains in x86 these days is huge (think doubling the transistor count for 10-20% more IPC), so it's easier to push the cost onto the customer (in the shape of power bills) by raising frequency as far as it will go.
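A back-of-envelope version of that trade, again assuming P ≈ C·V²·f and an assumed voltage bump per frequency step (illustrative numbers only):

```python
# Back-of-envelope: what buying performance with frequency alone costs in
# power, assuming dynamic power P ~ C * V^2 * f and that each frequency
# step needs a voltage bump. All figures are illustrative assumptions.
base_f, base_v = 5.0, 1.10   # GHz, volts -- an assumed operating point
for df, dv in [(0.05, 0.04), (0.10, 0.08), (0.15, 0.13)]:
    f, v = base_f * (1 + df), base_v * (1 + dv)
    power_ratio = (v / base_v) ** 2 * (f / base_f)
    print(f"+{df:.0%} frequency -> roughly +{power_ratio - 1:.0%} power")
```

Even with these gentle voltage assumptions, the last 10-15% of frequency costs several times its share in power, whereas an IPC gain of the same size would cost transistors instead of watts.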