
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Fully Connected 16-Pin Connector On The NVIDIA GeForce RTX 4090 Ends Up Melting Too

https://wccftech.com/fully-connected-16-pin-connector-on-the-nvidia-geforce-rtx-4090-ends-up-melting-too/

The shitty connector cases continue

GALAX’s Monster GeForce RTX 4090 HOF Graphics Cards Breaks 20 GPU OC World Records

https://wccftech.com/galaxs-monster-geforce-rtx-4090-hof-graphics-cards-breaks-20-gpu-oc-world-records/

With two of those connectors, I'd hope so

Total Board Power (TBP) vs. real power consumption and the inadequacy of software tools vs. real measurements

https://www.igorslab.de/en/graphics-cards-and-their-consumption-read-out-rather-than-measured-why-this-is-easy-with-nvidia-and-nearly-impossible-with-amd/

While 4090s are rated for 450 watts, in game they generally run at 388 watts which is below their rating. Idk why Nvidia felt the need to go to 450 watts at all tbh.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


At this point, I don't think we'll get a proper answer as to why the adapters melt until we get something official from Nvidia or PCI-SIG.

As for the TBP of the 4090, it's indeed strange how Nvidia left those +50W extra there. Could it be for the transients? Like taking them into account so that a power supply that's able to deliver constant 450W of power will be able to withstand the surges without failing?
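Just to put rough numbers on the transient idea: millisecond-scale spikes can exceed the sustained rating by a large factor, so a rating with headroom above the typical in-game draw buys margin. The spike multiplier below is a hypothetical illustration, not a measured value.

```python
# Sketch of the PSU-headroom argument. Only the 450 W rating and the
# 388 W in-game figure come from the thread; the multiplier is made up.
RATED_TBP_W = 450        # NVIDIA's stated Total Board Power for the 4090
TYPICAL_GAMING_W = 388   # average in-game draw cited above
SPIKE_MULTIPLIER = 1.8   # hypothetical transient excursion factor

transient_peak = RATED_TBP_W * SPIKE_MULTIPLIER

print(f"Sustained rating:       {RATED_TBP_W} W")
print(f"Typical gaming draw:    {TYPICAL_GAMING_W} W")
print(f"Possible transient peak: {transient_peak:.0f} W")
```

A PSU sized only for the 388 W average would have far less margin against spikes like that than one sized for the full 450 W rating.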



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

It's more to do with the workload. If you use it for rendering with Blender and such, it will use the full 450 watts, but it will also finish the render nearly 2x faster than a 3090 Ti.





Then the article, and the graph you've posted, are useless, because they don't use the workloads that push the cards, and their power consumption, to their limits.

After all, what's the point of the article: comparing stated TBP with real power consumption, or complaining, once again, that AMD uses a weird method to estimate its TBP that makes it nearly impossible to verify?




It shows the power usage and efficiency in gaming vs. previous gens. The 3090 Ti, for example, uses more watts for significantly less performance.

Last edited by Jizz_Beard_thePirate - on 12 November 2022




If you want to talk about the power actually used and the efficiency in gaming, TBP doesn't really matter as it's not the real power consumption figure you'll use to calculate the efficiency.
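The point can be shown with a tiny perf-per-watt calculation. The 388 W and 450 W figures come from earlier in the thread; the FPS number is purely hypothetical.

```python
def efficiency_fps_per_watt(avg_fps: float, power_w: float) -> float:
    """Performance per watt; meaningful only with measured draw."""
    return avg_fps / power_w

fps = 120.0          # hypothetical average FPS, for illustration only
measured_w = 388.0   # measured in-game draw cited in the thread
tbp_w = 450.0        # rated TBP

# Dividing by the rated TBP instead of the measured draw understates
# efficiency whenever the card runs below its rating.
from_measured = efficiency_fps_per_watt(fps, measured_w)
from_tbp = efficiency_fps_per_watt(fps, tbp_w)
print(f"{from_measured:.3f} fps/W (measured) vs {from_tbp:.3f} fps/W (TBP)")
```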




Since I posted the first three, here's another TechPowerUp article, albeit this time it's a bit different: it compares the same CPU with its E-cores enabled vs. disabled:

RTX 4090 & 53 Games: Core i9-13900K E-Cores Enabled vs Disabled Review: https://www.techpowerup.com/review/rtx-4090-53-games-core-i9-13900k-e-cores-enabled-vs-disabled/
     >>The E-cores enabled give a 0.9% increase at 1080p, 0.4% at 1440p and 0.1% at 4K

With that said, the average results don't tell the whole story, with some titles seeing a 10% increase or decrease with the E-cores enabled, so this is one of those things that depends on the game, as long as you're CPU limited, of course.

It's also worth mentioning the Far Cry games tested. Despite both of them using the Dunia engine, Far Cry 5 hates the E-cores while Far Cry 6 loves them.
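A toy calculation shows how per-game swings can cancel out in the average. The game labels and percentages below are made up for illustration; the real per-game numbers are in the linked review.

```python
# Hypothetical % FPS change with E-cores enabled, per game.
deltas = {
    "Game A": +10.0,   # loves the E-cores
    "Game B": -9.0,    # hates them
    "Game C": +1.5,
    "Game D": -0.5,
}

average = sum(deltas.values()) / len(deltas)
print(f"average delta: {average:+.2f}%")  # looks like nothing happened
print(f"per-game spread: {min(deltas.values())}% .. {max(deltas.values())}%")
```

A near-zero average with a wide spread is exactly the Far Cry 5 vs. Far Cry 6 situation: the headline number hides two opposite results.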




Captain_Yuri said:

Total Board Power (TBP) vs. real power consumption and the inadequacy of software tools vs. real measurements

https://www.igorslab.de/en/graphics-cards-and-their-consumption-read-out-rather-than-measured-why-this-is-easy-with-nvidia-and-nearly-impossible-with-amd/

While 4090s are rated for 450 watts, in game they generally run at 388 watts which is below their rating. Idk why Nvidia felt the need to go to 450 watts at all tbh.

The reason why it's so much lower has nothing to do with efficiency or Nvidia overstating the power consumption; it's just that the card is so powerful that the CPU limits the GPU in some of those games, and thus the GPU isn't running at full speed in them.

If none of the games had been CPU limited, the real power consumption would also be similar to its TBP, just like with all the other cards.



Bofferbrauer2 said:
Captain_Yuri said:

Total Board Power (TBP) vs. real power consumption and the inadequacy of software tools vs. real measurements

https://www.igorslab.de/en/graphics-cards-and-their-consumption-read-out-rather-than-measured-why-this-is-easy-with-nvidia-and-nearly-impossible-with-amd/

While 4090s are rated for 450 watts, in game they generally run at 388 watts which is below their rating. Idk why Nvidia felt the need to go to 450 watts at all tbh.

The reason why it's so much lower has nothing to do with efficiency or Nvidia overstating the power consumption; it's just that the card is so powerful that the CPU limits the GPU in some of those games, and thus the GPU isn't running at full speed in them.

If none of the games had been CPU limited, the real power consumption would also be similar to its TBP, just like with all the other cards.

I think he only mentions that in the upscaling segment, not in the native 4K segment.







Finally got my chonky boi set up. New GPU, PSU, and SSD. Had a bit of a scare after the rebuild when my PC would turn on but had no picture or power to the mouse and keyboard. After more troubleshooting than I care to admit, it turned out that my RAM had come loose during the upgrade. Reseating it fixed the problem easily, lol. I haven't had the chance to benchmark or anything yet, but it feels nice to just have this thing up and running near its final form after so much planning and waiting. The last things I need to do are optimize my storage setup and get some new RAM over the holidays. Other than that, I'll just be watching that power connector like a hawk. Can't wait to game, create, and experiment!