
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Anytime you talk about Intel CPUs and their power usage

Intel’s Next-Gen Battlemage “Xe2” & Celestial “Xe3” dGPUs & Panther Lake, Nova Lake iGPUs Receive Support In HWiNFO

https://wccftech.com/intel-battlemage-xe2-celestial-xe3-dgpus-panther-lake-nova-lake-igpus-support-hwinfo/

Intel Core Ultra 7 155H CPU Fails Miserably Against The AMD Ryzen 7 7840U In Linux Benchmarks

https://wccftech.com/intel-core-ultra-7-155h-cpu-fails-miserably-amd-ryzen-7-7840u-linux-benchmarks/

Intel's first attempt at chiplet design for consumers continues to feel very meh


NVIDIA First To Offer Driver Support For New Vulkan H.265 & H.264 Video Encode Extensions

https://wccftech.com/nvidia-first-to-offer-driver-support-new-vulkan-h-265-h-264-video-encode-extensions/

Lots of drama over Igor's Lab's review of the new Alphacool fans. Igor's Lab has been a bit sus for some time now, especially with the nonsense conclusions they gave about 4090s and melting connectors: they were never able to replicate the faults themselves and instead just offered theories about the cause. But this drama is much worse. Not only was their fan testing bad, with wildly different results compared to every other reviewer (where the Noctua NF-A12x25 was generally superior), but the testing itself made zero sense, as it showed more airflow with a 60mm radiator than with a 25mm one. After getting criticized, they doubled down and made a reaction video, which made things a lot worse. While this wasn't Igor himself but rather one of his employees, his handling of it was also bad.

LG Teases 2024 OLED “UltraGear” Gaming Monitors: Can Switch Between 480Hz FHD & 240Hz UHD Modes On The Go

https://wccftech.com/lg-teases-2024-oled-ultragear-gaming-monitors-switch-480hz-fhd-240hz-uhd-modes/

"The monitor would have the ability to switch to a blazing-fast 480Hz refresh rate when you are in Esports or high-refresh-rate gaming scenarios, and at the time of media consumption, you can switch to the 4K resolution mode, which also comes with a 240Hz refresh rate."

2024 might just be the greatest year for PC gamers to buy a new monitor!

Last edited by Jizz_Beard_thePirate - on 20 December 2023

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


It seems that Intel's chips are somewhat comparable to AMD's when locked to an eco mode around 90W; for example, the 14700K and 7700X are similar in FPS per watt. However, Intel has nothing against X3D efficiency. The chart is somewhat skewed because the X3D parts run at lower wattages here, and that increases efficiency.



numberwang said:

It seems that Intel's chips are somewhat comparable to AMD's when locked to an eco mode around 90W; for example, the 14700K and 7700X are similar in FPS per watt. However, Intel has nothing against X3D efficiency. The chart is somewhat skewed because the X3D parts run at lower wattages here, and that increases efficiency.

The thing is, even when Intel gets power-limited into an eco mode, it still delivers worse FPS/watt than a 7700X running at stock settings. And the 7700X is not considered especially conservative in its power draw, which is why that part only barely beats its four-year-old predecessor, the 3700X. Had they also put the 7700X into Eco mode, it would have run away from the 14700K yet again.

The chart isn't skewed. Intel's P-cores are simply guzzling too much power, and that's not helped at all by Intel's insistence on clocking them higher and higher. They get much more economical when their clock speeds drop to about 4.5GHz, but at that point they wouldn't be able to compete with AMD on performance anymore.



The more I look at that methodology and presentation, the less I like it.

According to the chart above, the i3-12100F (4P+0E) beats nearly all AMD CPUs without X3D cache in efficiency. That looks strange, considering its 4 P-cores have to run at high clocks, which should be the worst case for efficiency. I don't know what is happening there.

There is another, more meaningful test with two CPUs locked to 86W, and there the outcomes are very comparable again. I think that is the key for Intel CPUs: you have to lock them at around 90W to get good efficiency. That will likely force the use of E-cores and lower clocks on the P-cores.

We know that Intel decided to unlock their CPUs to win benchmarks, and that destroys their efficiency at 200W+. I think it would be better to sell these CPUs locked to 90W (14700) and 120W (14900) and enable full power via an overclocking option.



The fact you're not taking into account regarding the 12100F is the frequency that CPU runs at: the optimal one.

Every chip, CPU and GPU has an optimal setting where it runs at the highest efficiency possible. If your chip doesn't reach that point, it can't maximise its full potential, but if the chip goes beyond that point, you get more performance at the cost of losing efficiency.

The 12100 is a low end CPU that Intel runs at its max efficiency setting, but the 14700 and 14900 parts are high end CPUs where every bit of extra performance matters, and Intel values raw performance more than efficiency. That's why those chips run beyond that peak efficiency point, making them a lot less efficient.

Don't get me wrong, AMD does the same, because they need that extra bit of performance to beat the competition. Check their video running the 7950X in the 65W and 105W Eco modes and you'll see how much more efficient that CPU is when it isn't pushed to its limit.

That's why the X3Ds are so efficient: the extra cache limits how much power those chips can run at, resulting in lower frequencies, and that puts them much closer to their optimal setting.

Intel's biggest problem, then, is that even though both companies run their chips beyond their peak efficiency point, Intel pushes its high-end chips a lot further past that point than AMD does, which is why its CPUs suffer more in terms of efficiency.
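The "peak efficiency point" idea above can be sketched with a toy model (not real benchmark data, purely illustrative): assume a fixed static power floor for the package and performance that grows with diminishing returns on the dynamic part. Efficiency then rises until the static overhead is amortized and falls off once you push past the knee:

```python
import math

# Toy model, NOT real benchmark numbers: total package power has a fixed
# "static" floor (uncore, I/O), and FPS grows with diminishing returns on
# the dynamic portion, modelled here as a square root.
P_STATIC = 30.0  # assumed fixed overhead in watts (illustrative)

def perf(power_w: float) -> float:
    """Hypothetical FPS at a given total package power (toy model)."""
    return 30.0 * math.sqrt(max(power_w - P_STATIC, 0.0))

def efficiency(power_w: float) -> float:
    """FPS per watt under the toy model."""
    return perf(power_w) / power_w

# With this model efficiency peaks at 2 * P_STATIC (60 W here) and falls
# off on both sides, mirroring the argument: below the knee the overhead
# dominates, above it every extra watt buys less and less FPS.
for p in (40, 60, 90, 125, 200, 253):
    print(f"{p:>3} W: {perf(p):6.1f} FPS, {efficiency(p):.2f} FPS/W")
```

This also illustrates why a power-limited high-end chip looks so much better in FPS/watt charts than the same silicon at stock limits.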



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Jizz_Beard_thePirate said:

Lots of drama over Igor's Lab's review of the new Alphacool fans. Igor's Lab has been a bit sus for some time now, especially with the nonsense conclusions they gave about 4090s and melting connectors: they were never able to replicate the faults themselves and instead just offered theories about the cause. But this drama is much worse. Not only was their fan testing bad, with wildly different results compared to every other reviewer (where the Noctua NF-A12x25 was generally superior), but the testing itself made zero sense, as it showed more airflow with a 60mm radiator than with a 25mm one. After getting criticized, they doubled down and made a reaction video, which made things a lot worse. While this wasn't Igor himself but rather one of his employees, his handling of it was also bad.

What I don't get is that we already had this issue where Linus himself and a few of his staff ended up doing roughly the same thing, then doubling down on it. Are these tech review companies getting paid to be intentionally misleading with their tests? Because a few shitty employees willingly mucking around with what really should be objective test results have nothing to gain but scorn and ire.

Yet these people keep doubling down, especially when they are wrong, and what exactly do they think that will get them? I'm genuinely curious as hell to know why this has so specifically become a thing with these tech reviewers in recent years, and what they could possibly gain by fucking around and not giving us the actual results that everyone else is getting (unless the company providing them the hardware paid for praise, or for less of a slant?).



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:

Are these tech review companies getting paid to be intentionally misleading with their tests?

Their sponsors pay them.

They are double- and triple-dipping on revenue streams because advert revenue and t-shirts aren't enough to cover the cost of most things.

Shame that long-credible websites such as AnandTech are just shells of their former selves.



--::{PC Gaming Master Race}::--

JEMC said:

The fact you're not taking into account regarding the 12100F is the frequency that CPU runs at: the optimal one.

That's what I don't like about this chart. Any old CPU can win the efficiency crown if you downclock it enough (in the case of the i3-12100F, with a factory downclock). That is not a meaningful method for game-efficiency analysis, since we would dip way below 60fps with a 12100F. The 7800X3D/7950X3D using only 61W/65W is a second strange point: you get better efficiency at those lower wattages, but why aren't they running at 120W as designed? There is a specific bug in Cyberpunk that prevents it from using AMD multi-threading. It could be that the AMD CPUs are less utilized and thus run at lower wattage, which increases efficiency.



Chazore said:
Jizz_Beard_thePirate said:

Lots of drama over Igor's Lab's review of the new Alphacool fans. Igor's Lab has been a bit sus for some time now, especially with the nonsense conclusions they gave about 4090s and melting connectors: they were never able to replicate the faults themselves and instead just offered theories about the cause. But this drama is much worse. Not only was their fan testing bad, with wildly different results compared to every other reviewer (where the Noctua NF-A12x25 was generally superior), but the testing itself made zero sense, as it showed more airflow with a 60mm radiator than with a 25mm one. After getting criticized, they doubled down and made a reaction video, which made things a lot worse. While this wasn't Igor himself but rather one of his employees, his handling of it was also bad.

What I don't get is that we already had this issue where Linus himself and a few of his staff ended up doing roughly the same thing, then doubling down on it. Are these tech review companies getting paid to be intentionally misleading with their tests? Because a few shitty employees willingly mucking around with what really should be objective test results have nothing to gain but scorn and ire.

Yet these people keep doubling down, especially when they are wrong, and what exactly do they think that will get them? I'm genuinely curious as hell to know why this has so specifically become a thing with these tech reviewers in recent years, and what they could possibly gain by fucking around and not giving us the actual results that everyone else is getting (unless the company providing them the hardware paid for praise, or for less of a slant?).

Yea, pretty much. It feels like ridiculous amounts of hubris just because they have access to things that regular folks do not. It's a very easy way to kill credibility, and while Linus is too big and would need multiple fuck-ups for it to have an impact, Igor's Lab is nowhere near as big, so they need to get their shit together.



                  


JEMC said:

The fact you're not taking into account regarding the 12100F is the frequency that CPU runs at: the optimal one.

Every chip, CPU and GPU has an optimal setting where it runs at the highest efficiency possible. If your chip doesn't reach that point, it can't maximise its full potential, but if the chip goes beyond that point, you get more performance at the cost of losing efficiency.

The 12100 is a low end CPU that Intel runs at its max efficiency setting, but the 14700 and 14900 parts are high end CPUs where every bit of extra performance matters, and Intel values raw performance more than efficiency. That's why those chips run beyond that peak efficiency point, making them a lot less efficient.

Indeed. The voltage-frequency curve is like a sigmoid function:

With frequency on the Y axis and voltage on the X axis. These curves differ dramatically from product to product. Mind you, the most efficient point is technically near idle frequencies (think an undocked Switch, or a phone's battery-saver mode), while the definition of what counts as "optimal" here is a bit more subjective.

Intel has been pushing frequency again because the cost of IPC increases in x86 these days is huge (something like doubling the transistor count for 10-20% more IPC), so it's easier to put the cost at the foot of the customer (in the shape of power bills) by just increasing frequency as much as they can.
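Why does cranking frequency shift the cost onto the power bill? Dynamic power scales roughly as C·V²·f, and beyond the efficiency knee voltage must also rise with frequency, so power grows much faster than clock speed. A small sketch with made-up (purely illustrative) voltage-curve numbers:

```python
# Sketch of why high clocks wreck efficiency: dynamic power ~ C * V^2 * f,
# and above the knee voltage rises with frequency, so power grows roughly
# cubically with clock speed. All constants here are illustrative, not
# taken from any real CPU's V-f table.
def dynamic_power(freq_ghz: float, v_base: float = 0.9,
                  v_per_ghz: float = 0.08, c: float = 10.0) -> float:
    """Relative dynamic power at a given clock (toy linear V-f model)."""
    v = v_base + v_per_ghz * freq_ghz  # assumed linear V-f slope
    return c * v * v * freq_ghz

p_low, p_high = dynamic_power(4.5), dynamic_power(5.7)
print(f"5.7 GHz costs {p_high / p_low:.2f}x the power of 4.5 GHz "
      f"for only {5.7 / 4.5:.2f}x the clock")
```

The power ratio always exceeds the clock ratio in this model, which is the whole point: the last few hundred MHz are the most expensive ones.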