
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

And some PC gaming news:

PlayStation plans to release more 1st party titles on PC

https://www.sony.net/SonyInfo/IR/library/corporatereport/CorporateReport2020_E.pdf

" Targeted outcomes include growth in active users, stronger retention and a shorter cash conversion cycle, from which expanded cash flow can be expected. We will explore expanding our 1st party titles to the PC platform, in order to promote further growth in our profitability. "

Doki Doki Literature Club creator demonetized on YouTube for using his own music



The reply from YouTube is what makes this thing funny:

"(2/2) Your channel may be lacking context about the creative value you’re adding to make the content unique. We recommend changing your overall content strategy and reapply in 30 days to send your channel for a new review. Check out best practices here: https://yt.be/help/reused-content "

It's like their Twitter account is run by bots as well.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
vivster said:

The 3090 is specced for 350W TGP and comes with a maximum possible power draw of 375W. I doubt you'll have much fun overclocking here unless you're also undervolting.

For comparison, the 2080ti sits at only 250W TGP.

Yea, but the 2080 Ti will go well above its power ratings to hit those clock speeds, and I imagine so will the 3090, as long as you have the power supply and a good enough card to handle it. But granted, every generation overclocking becomes less and less of a thing, so you could be correct as well.

The 2080ti has headroom of 125W, plenty for overclocking. How much do you think you will be able to overclock with barely 25W?

So yeah, I'm upping my prediction. Not only will it not be possible to get the 3090 to 2000MHz without liquid nitrogen, but it won't be physically possible without electronically altering the card.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

I'm still surprised by the high TDP of the two big cards and by how little difference there is between the 3090 and the 3080, even though the former has 14GB more VRAM; that alone should already eat up the extra 30W of the 3090.


That Sony will launch more games on PC is great news for us, and it will be a delight to read the meltdowns some users will have once someone makes a thread about it. I only hope they test their ports better, to avoid the many troubles Horizon has had. Should we start guessing which game will come next? I'd say Days Gone, because it didn't do so well in terms of sales.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

vivster said:
Captain_Yuri said:

Yea, but the 2080 Ti will go well above its power ratings to hit those clock speeds, and I imagine so will the 3090, as long as you have the power supply and a good enough card to handle it. But granted, every generation overclocking becomes less and less of a thing, so you could be correct as well.

The 2080ti has headroom of 125W, plenty for overclocking. How much do you think you will be able to overclock with barely 25W?

So yeah, I'm upping my prediction. Not only will it not be possible to get the 3090 to 2000MHz without liquid nitrogen, but it won't be physically possible without electronically altering the card.

Where are you getting the 25W from?



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
vivster said:

The 2080ti has head room of 125W, plenty for overclocking. How much do you think you will be able to overclock with barely 25W?

So yeah, I'm upping my prediction. Not only will it not be possible to get the 3090 to 2000MHz without liquid nitrogen, but it won't be physically possible without electronically altering the card.

Where are you getting the 25W from?

The 3090 has a TGP (Total Graphics Power) of 350W. That is the base specification. With the 12-pin connector, which is just 2x 8-pin, the GPU will be able to draw an absolute maximum of 375W. That's not much room to grow. For comparison, the 2080ti had a 250W TGP. An overclocked 2080ti at 2000MHz will already draw close to the max of 375W.

Now, unless I'm getting something wrong here: if the basic configuration, read non-OC, is already close to 350W, then there isn't much room up to the physical limit of 375W.
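A quick back-of-the-envelope sketch of that headroom math, using the figures quoted in this exchange (the 75W slot and 150W per 8-pin values are the PCI-SIG spec numbers; the TGPs are from the posts above):

```python
# Headroom check from the spec numbers quoted in this thread.
PCIE_SLOT_W = 75   # PCIe x16 slot budget, per PCI-SIG spec
PIN8_W = 150       # one 8-pin PCIe power connector, per PCI-SIG spec

def spec_ceiling(num_8pin: int) -> int:
    """Maximum board power allowed on paper: slot plus connectors."""
    return PCIE_SLOT_W + num_8pin * PIN8_W

for name, tgp, n8 in [("2080 Ti", 250, 2), ("3090", 350, 2)]:
    ceiling = spec_ceiling(n8)  # 375W for both: the 12-pin is wired as 2x 8-pin
    print(f"{name}: TGP {tgp}W, spec ceiling {ceiling}W, headroom {ceiling - tgp}W")

# 2080 Ti: TGP 250W, spec ceiling 375W, headroom 125W
# 3090:    TGP 350W, spec ceiling 375W, headroom 25W
```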



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Captain_Yuri said:

Where are you getting the 25W from?

The 3090 has a TGP (Total Graphics Power) of 350W. That is the base specification. With the 12-pin connector, which is just 2x 8-pin, the GPU will be able to draw an absolute maximum of 375W. That's not much room to grow. For comparison, the 2080ti had a 250W TGP. An overclocked 2080ti at 2000MHz will already draw close to the max of 375W.

Now, unless I'm getting something wrong here: if the basic configuration, read non-OC, is already close to 350W, then there isn't much room up to the physical limit of 375W.

If the way you are calculating the max of 375 Watts is by adding the standard specification for those 8-pin connectors and then adding the PCIe slot wattage, so 150 Watts + 150 Watts + 75 Watts, we have seen in the past that those connectors can provide a lot more power than the standard specs suggest, as long as the power supply is capable of it.

For example, take two very power-hungry cards, the Titan Z and the 295X2. Both of those have only dual 8-pin connectors, nothing more, yet their estimated power draw is almost 500 Watts. It's why the recommended power supply is 750 Watts at a minimum, and it's highly recommended to get 850 Watts or more. And you could overclock those on top of that.

https://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_z_review,7.html

https://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,12.html

And of course, there's the third option of AIB partners adding more connectors, like Zotac is doing.
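To put those counterexamples in the same terms as the spec math above (the ~500W draw figures are the rough estimates from the Guru3D reviews linked here, not precise measurements):

```python
# How far past the paper PCI-SIG budget the dual 8-pin cards above go.
PCIE_SLOT_W, PIN8_W = 75, 150

cards = {
    "Titan Z":  {"n8pin": 2, "est_draw_w": 500},  # ~500W per Guru3D's estimate
    "R9 295X2": {"n8pin": 2, "est_draw_w": 500},  # likewise, roughly
}

for name, c in cards.items():
    budget = PCIE_SLOT_W + c["n8pin"] * PIN8_W    # 375W on paper
    over = c["est_draw_w"] - budget
    print(f"{name}: spec budget {budget}W, estimated draw ~{c['est_draw_w']}W, "
          f"~{over}W over spec")
```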



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
vivster said:

The 3090 has a TGP (Total Graphics Power) of 350W. That is the base specification. With the 12-pin connector, which is just 2x 8-pin, the GPU will be able to draw an absolute maximum of 375W. That's not much room to grow. For comparison, the 2080ti had a 250W TGP. An overclocked 2080ti at 2000MHz will already draw close to the max of 375W.

Now, unless I'm getting something wrong here: if the basic configuration, read non-OC, is already close to 350W, then there isn't much room up to the physical limit of 375W.

If the way you are calculating the max of 375 Watts is by adding the standard specification for those 8-pin connectors and then adding the PCIe slot wattage, so 150 Watts + 150 Watts + 75 Watts, we have seen in the past that those connectors can provide a lot more power than the standard specs suggest, as long as the power supply is capable of it.

For example, take two very power-hungry cards, the Titan Z and the 295X2. Both of those have only dual 8-pin connectors, nothing more, yet their estimated power draw is almost 500 Watts. It's why the recommended power supply is 750 Watts at a minimum, and it's highly recommended to get 850 Watts or more. And you could overclock those on top of that.

Then I don't understand why, everywhere I looked, they were talking about "MAX" power over PCIe and the connectors. Where I come from, "MAX" means "MAX". There must be a limit where the PSU can't, or won't, allow more power through any connector, otherwise they might as well not bother with safety at all.

I actually expected all highly overclocked cards to come with 3x 8-pin.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:
Captain_Yuri said:

If the way you are calculating the max of 375 Watts is by adding the standard specification for those 8-pin connectors and then adding the PCIe slot wattage, so 150 Watts + 150 Watts + 75 Watts, we have seen in the past that those connectors can provide a lot more power than the standard specs suggest, as long as the power supply is capable of it.

For example, take two very power-hungry cards, the Titan Z and the 295X2. Both of those have only dual 8-pin connectors, nothing more, yet their estimated power draw is almost 500 Watts. It's why the recommended power supply is 750 Watts at a minimum, and it's highly recommended to get 850 Watts or more. And you could overclock those on top of that.

Then I don't understand why, everywhere I looked, they were talking about "MAX" power over PCIe and the connectors. Where I come from, "MAX" means "MAX". There must be a limit where the PSU can't, or won't, allow more power through any connector, otherwise they might as well not bother with safety at all.

I actually expected all highly overclocked cards to come with 3x 8-pin.

Yea, I am not sure either. I didn't have to think about power supplies or how they function until now, when I can actually afford a card that might require me to think about it loll. I am hoping the reviews can explain it, or it will turn out like you said, or only AIB partners with 3 connectors will have a chance at those clock speeds. All I know is, nothing tells the full story these days until we see actual proof. The only reason I am so keen on that 2100MHz is the 21 theme and that "rumour" of 2100MHz on UserBenchmark, but granted, that was the 3080.

Also, the original 12-pin rumour stated that its purpose was to deliver 600 Watts. So I am assuming a dual 8-pin to single 12-pin adapter can do just that, otherwise the adapter feels kinda pointless, but who knows.

Last edited by Jizz_Beard_thePirate - on 29 August 2020

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Also, the original 12-pin rumour stated that its purpose was to deliver 600 Watts. So I am assuming a dual 8-pin to single 12-pin adapter can do just that, otherwise the adapter feels kinda pointless, but who knows.

Forgetting the PCI-SIG standard, which is where the 150W rating for an 8-pin comes from, and just looking at how an 8-pin is wired: with 18AWG it would max out at ~250W, so ~500W total for two. 18AWG is typically what comes with a PSU. If you have a fully modular PSU you could get a custom 16AWG cable, which would raise the max to ~325W; it's possible some high-end PSUs use 16AWG, I never really checked. It seems like the rumoured specs for the 12-pin developed by NVIDIA assume the use of 16AWG cable, for a max wattage of ~650W. That would be the max of 2x 8-pin using 16AWG, but most likely your PSU uses 18AWG, for a max of ~500W.

On the PSU side you are usually only limited by the maximum rail amperage, so no issue there. That's why, generally speaking, splitting one 8-pin into 2x 8-pin will actually work.

The 150W is usually regulated on the GPU connector side, so it doesn't ask for more than 150W. You are technically not PCI-SIG certified if you pull more, but there is nothing stopping a maker from selling a non-certified GPU. Actually, basically every high-end GPU is not certified, since the PCIe Gen 3 spec allows for 300W max (PCIe slot + 6-pin + 8-pin), so any GPU with 2x 8-pin is technically outside the spec. That might have changed with the PCIe Gen 4 spec.

The short answer is that it really depends on how the GPU side is made and how much power NVIDIA, or the third-party designs, will allow it to draw; an 8-pin connector is capable of a lot more power than the PCI-SIG spec allows.
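A sketch of the wire-gauge arithmetic above. The per-conductor ampacities are ballpark ratings I'm assuming for Mini-Fit-Jr-style terminals, not figures from the post; an 8-pin PCIe connector carries three 12V conductors and the 12-pin carries six:

```python
# Rough connector capacity from wire gauge, following the arithmetic above.
# Ampacity per conductor is an assumed ballpark rating, not an official spec.
V_RAIL = 12.0
AMPS_PER_WIRE = {"18AWG": 7.0, "16AWG": 9.0}   # assumed safe amps per conductor
HOT_WIRES = {"8-pin": 3, "12-pin": 6}          # 12V conductors per connector

for gauge, amps in AMPS_PER_WIRE.items():
    for conn, wires in HOT_WIRES.items():
        print(f"{conn} @ {gauge}: ~{V_RAIL * amps * wires:.0f}W")

# 8-pin  @ 18AWG: ~252W -> ~500W for two, matching the post
# 8-pin  @ 16AWG: ~324W -> ~650W for two
# 12-pin @ 16AWG: ~648W -> in line with the rumoured ~600W 12-pin figure
```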

Last edited by Cyran - on 29 August 2020

Cyran said:
Captain_Yuri said:

Also, the original 12-pin rumour stated that its purpose was to deliver 600 Watts. So I am assuming a dual 8-pin to single 12-pin adapter can do just that, otherwise the adapter feels kinda pointless, but who knows.

Forgetting the PCI-SIG standard, which is where the 150W rating for an 8-pin comes from, and just looking at how an 8-pin is wired: with 18AWG it would max out at ~250W, so ~500W total for two. 18AWG is typically what comes with a PSU. If you have a fully modular PSU you could get a custom 16AWG cable, which would raise the max to ~325W; it's possible some high-end PSUs use 16AWG, I never really checked. It seems like the rumoured specs for the 12-pin developed by NVIDIA assume the use of 16AWG cable, for a max wattage of ~650W. That would be the max of 2x 8-pin using 16AWG, but most likely your PSU uses 18AWG, for a max of ~500W.

On the PSU side you are usually only limited by the maximum rail amperage, so no issue there. That's why, generally speaking, splitting one 8-pin into 2x 8-pin will actually work, even though you are getting into iffy territory with an 18AWG cable if you are actually maxing it out.

The 150W is usually regulated on the GPU connector side, so it doesn't ask for more than 150W. You are technically not PCI-SIG certified if you pull more, but there is nothing stopping a maker from selling a non-certified GPU. Actually, basically every high-end GPU is not certified, since the PCIe Gen 3 spec allows for 300W max (PCIe slot + 6-pin + 8-pin), so any GPU with 2x 8-pin is technically outside the spec. That might have changed with the PCIe Gen 4 spec.

The short answer is that it really depends on how the GPU side is made and how much power NVIDIA, or the third-party designs, will allow it to draw; an 8-pin connector is capable of a lot more power than the PCI-SIG spec allows.

Thanks for the explanation, as I didn't know about any of that. September 1st is gonna be one fun day loll.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850