Captain_Yuri said: Also, the original 12-pin rumour stated that its purpose was to deliver 600 watts. So I am assuming a dual 8-pin to single 12-pin adapter can do just that; otherwise the adapter feels kinda pointless, but who knows.
Forgetting the PCI-SIG standards, which are where the 150-watt rating for the 8-pin comes from, and just looking at how an 8-pin is wired: if you're using 18AWG, it would max out at ~250 watts per connector, so ~500 watts total for two. 18AWG is typically what comes with a PSU. If you have a fully modular PSU, you could get a custom 16AWG cable, which would raise the max to ~325 watts per connector. It's possible some high-end PSUs use 16AWG; I never really checked. It seems like in the rumoured specs for the 12-pin developed by NVIDIA, they're assuming 16AWG cable for a max wattage of ~650. That would match the max of 2x8-pin using 16AWG, but most likely your PSU uses 18AWG, for a max of ~500 watts.
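To make that arithmetic concrete, here's a minimal sketch of where the ~250 W and ~325 W figures come from, assuming roughly 7 A per pin for 18AWG and 9 A per pin for 16AWG (ballpark terminal ratings I'm using for illustration, not official PCI-SIG or Molex numbers):

```python
# Ballpark wattage for an 8-pin PCIe connector from wire gauge alone.
# Per-pin current ratings below are rough assumptions, not official figures.
RAIL_VOLTAGE = 12.0  # the connector delivers power on the 12V rail
LIVE_PINS = 3        # an 8-pin PCIe connector carries 12V on 3 pins

amps_per_pin = {"18AWG": 7.0, "16AWG": 9.0}  # assumed per-pin ratings

for gauge, amps in amps_per_pin.items():
    watts = RAIL_VOLTAGE * LIVE_PINS * amps
    print(f"{gauge}: ~{watts:.0f} W per 8-pin, ~{watts * 2:.0f} W for 2x8-pin")

# 18AWG: ~252 W per 8-pin, ~504 W for 2x8-pin
# 16AWG: ~324 W per 8-pin, ~648 W for 2x8-pin
```

Those round to the ~250/~500 and ~325/~650 figures above.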
On the PSU side, you're usually only limited by the max rail amperage, so there's no issue there. That's why, generally speaking, splitting one 8-pin cable into 2x8-pin will actually work.
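As a quick illustration of why the rail is rarely the bottleneck, assuming a made-up but typical single-rail unit rated at 62 A on the 12 V rail:

```python
# Sanity check: is the PSU's 12V rail the bottleneck when one cable
# is split into 2x8-pin? 62 A is a hypothetical but typical rating
# for a single-rail ~750 W unit.
rail_amps = 62.0
rail_watts = 12.0 * rail_amps
spec_draw = 2 * 150  # two spec-compliant 8-pins at 150 W each

print(f"12V rail capacity: ~{rail_watts:.0f} W vs. {spec_draw} W drawn")
# ~744 W vs. 300 W drawn -> plenty of headroom on the rail; the wire
# gauge and connector pins are the practical limit, not the rail.
```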
The 150 watts is usually regulated on the GPU connector side, so the card doesn't ask for more than 150 watts. You're technically not PCI-SIG certified if you're pulling more, but there's nothing stopping a maker from selling a non-certified GPU. Actually, basically every high-end GPU isn't, since the PCIe gen 3 spec allows for 300 watts max (PCIe slot + 6-pin + 8-pin); any GPU with 2x8-pin is technically outside the spec. That might have changed with the PCIe gen 4 spec.
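Spelling out that budget with the standard PCI-SIG per-source limits:

```python
# Why a 2x8-pin card is technically outside the PCIe gen 3 power spec.
# These per-source limits are the standard PCI-SIG numbers.
SLOT = 75        # W from the PCIe slot itself
SIX_PIN = 75     # W per 6-pin connector
EIGHT_PIN = 150  # W per 8-pin connector

spec_ceiling = SLOT + SIX_PIN + EIGHT_PIN  # 300 W max per the gen 3 spec
dual_8pin = SLOT + 2 * EIGHT_PIN           # what a 2x8-pin board can pull

print(spec_ceiling, dual_8pin)  # 300 375 -> 75 W over the spec ceiling
```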
The short answer is that it really depends on how the GPU side is made and how much power NVIDIA, or the 3rd-party designs, allow it to draw; an 8-pin connector is capable of a lot more power than PCI-SIG allows.