Cyran said:
Setting aside the PCI-SIG standard, which is where the 150 W rating for an 8-pin comes from: just looking at how an 8-pin is wired, with 18AWG wire it would max out at ~250 W, so ~500 W total for two. 18AWG is typically what comes with a PSU. If you have a fully modular PSU you could get a custom 16AWG cable, which raises the max to ~325 W. It's possible some high-end PSUs ship 16AWG cables; I never really checked. From the rumored specs for the 12-pin NVIDIA developed, they seem to assume 16AWG cable for a max of ~650 W. That would be the same max as 2x8-pin using 16AWG, but most likely your PSU uses 18AWG for a max of ~500 W. On the PSU side you're usually only limited by the max rail amperage, so no issue there. That's why, generally speaking, splitting one 8-pin into 2x8-pin will actually work, even though you're getting into iffy territory with 18AWG cable if you actually max it out. The 150 W limit is usually enforced on the GPU side of the connector by not asking for more than 150 W. You're technically not PCI-SIG certified if you pull more, but nothing stops a maker from selling a non-certified GPU. In fact basically every high-end GPU isn't, since the PCIe gen 3 spec allows for 300 W max (slot + 6-pin + 8-pin), so any GPU with 2x8-pin is technically outside the spec. That might have changed with the PCIe gen 4 spec. The short answer is that it really depends on how the GPU side is made and how much power NVIDIA (or the third-party designs) allow it to draw; an 8-pin connector is capable of a lot more power than PCI-SIG allows.
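If anyone wants to see where those wattage figures roughly come from, here's a quick back-of-the-envelope sketch in Python. The per-pin amp ratings are assumptions based on typical Mini-Fit Jr style terminal ratings (~7 A for 18AWG, ~9 A for 16AWG); real limits depend on the terminal, housing, and temperature, not just wire gauge.

```python
# Rough check of the wattage figures quoted above.
# Per-pin current ratings are assumptions (typical Mini-Fit Jr style terminals);
# actual limits depend on the terminal, connector housing, and temperature.

PINS_12V = {"8-pin PCIe": 3, "NVIDIA 12-pin": 6}   # +12V power pins per connector
AMPS_PER_PIN = {"18AWG": 7.0, "16AWG": 9.0}        # assumed safe amps per pin
VOLTAGE = 12.0

for connector, pins in PINS_12V.items():
    for gauge, amps in AMPS_PER_PIN.items():
        watts = pins * amps * VOLTAGE
        print(f"{connector} with {gauge}: ~{watts:.0f} W")

# Roughly matches the post:
#   8-pin PCIe with 18AWG: ~252 W
#   8-pin PCIe with 16AWG: ~324 W
#   NVIDIA 12-pin with 16AWG: ~648 W
```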
Thanks for the explanation as I didn't know about any of that. September 1st is gonna be one fun day loll.
PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850