
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
Captain_Yuri said:
GOT IT!

Congratulations on getting the card... but bloody hell at its power consumption.

Well, at least you'll save money on heating this winter.

Thanks lol. Tbh 400 watts is where I start to worry about my power supply, so maybe I won't overclock and just enjoy the lower noise levels instead. We will see though!




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
GOT IT!

Sick.

I wouldn't worry about overclocking that either... at least for now lol. Good get.



hinch said:
Captain_Yuri said:
GOT IT!

Sick.

I wouldn't worry about overclocking that either... at least for now lol. Good get.

Now we just need the 3080 and PS5 to ship. Come on people! (Ik Ik, PS5 is gonna release in November but still)




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

The one interesting thing about Ampere is:

3080 @ Computerbase.de: A 50W lower TDP would've meant only 4% lower performance and 25% better efficiency than 2080Ti, not 10%.

https://www.reddit.com/r/nvidia/comments/iuht5l/3080_computerbasede_a_50w_lower_tdp_wouldve_meant/

So if that's true, that means Samsung's 8N is pretty efficient at a certain power target (similar to every other node), but Nvidia decided to push it to the brink, where efficiency goes out the window.

That's very interesting. I wonder why they've decided to push it so much and screw up the efficiency numbers for so little gain.
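
To put rough numbers on that claim, here's a quick perf-per-watt sanity check (Python). The performance and power figures below are illustrative assumptions picked to reproduce the quoted 10% and 25% deltas, not Computerbase's actual measurements:

# Illustrative numbers only: perf normalized to 2080 Ti = 1.00,
# power figures assumed to roughly match each card's gaming draw.
cards = {
    "2080 Ti (stock)":   {"perf": 1.00, "watts": 270},
    "3080 (320 W)":      {"perf": 1.30, "watts": 320},
    "3080 (270 W, -4%)": {"perf": 1.30 * 0.96, "watts": 270},
}

baseline = cards["2080 Ti (stock)"]["perf"] / cards["2080 Ti (stock)"]["watts"]
for name, c in cards.items():
    eff = c["perf"] / c["watts"]
    print(f"{name}: {eff / baseline - 1:+.0%} perf/W vs 2080 Ti")
# Prints +0%, +10% and +25%: with these assumed figures, a 50 W lower
# power target turns a ~10% efficiency lead into a ~25% one, at a ~4% perf cost.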



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

For those looking for reviews of custom RTX 3080 cards, Videocardz has put together a roundup covering Asus, Gigabyte, MSI, EVGA, Zotac, Galax, etc.: https://videocardz.com/newz/nvidia-geforce-rtx-3080-custom-graphics-cards-review-roundup

Also, in case someone still had hopes of a comeback, SLI is officially dead:

NVIDIA SLI Support Transitioning to Native Game Integrations
https://videocardz.com/press-release/nvidia-sli-support-transitioning-to-native-game-integrations
With the emergence of low level graphics APIs such as DirectX 12 and Vulkan, game developers are able to implement SLI support natively within the game itself instead of relying upon a SLI driver profile. The expertise of the game developer within their own code allows them to achieve the best possible performance from multiple GPUs. As a result, NVIDIA will no longer be adding new SLI driver profiles on RTX 20 Series and earlier GPUs starting on January 1st, 2021. Instead, we will focus efforts on supporting developers to implement SLI natively inside the games. We believe this will provide the best performance for SLI users.

Existing SLI driver profiles will continue to be tested and maintained for SLI-ready RTX 20 Series and earlier GPUs.

For GeForce RTX 3090 and future SLI-capable GPUs, SLI will only be supported when implemented natively within the game.

Now my worry is if they'll be honest and work with developers to implement multi-GPU support in their games... or work with devs to implement SLI in their games in a way that makes it impossible for AMD to make those games work in X-fire even if they tried.

Knowing Nvidia, my bet is the first option, but I doubt they'll bother much with it.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Captain_Yuri said:

The one interesting thing about Ampere is:

3080 @ Computerbase.de: A 50W lower TDP would've meant only 4% lower performance and 25% better efficiency than 2080Ti, not 10%.

https://www.reddit.com/r/nvidia/comments/iuht5l/3080_computerbasede_a_50w_lower_tdp_wouldve_meant/

So if that's true, that means Samsung's 8N is pretty efficient at a certain power target (similar to every other node), but Nvidia decided to push it to the brink, where efficiency goes out the window.

That's very interesting. I wonder why they've decided to push it so much and screw up the efficiency numbers for so little gain.

To be fair, you can do the same thing to almost every CPU and GPU out there, especially in the case of the ones that are loosely binned or have conservative power targets. Undervolting Ryzen CPUs yields particularly crazy results:

https://www.overclock.net/threads/why-you-should-undervolt-your-ryzen-3000-part-ii.1739796/

An example for Nvidia GPUs:

https://www.reddit.com/r/nvidia/comments/9idtco/rtx_2080_downvolt_the_easy_way/

30-40 W less power, or better, for the same performance or even small overclocks.
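
For anyone curious to try the lazy version of this at home, a plain power cap gets a similar (if smaller) perf/W win without touching the voltage curve. Here's a minimal sketch using NVML through the pynvml package; note this is power limiting, not the proper curve undervolt from the links above (that's usually done in MSI Afterburner), and setting the limit needs admin/root rights:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Read the current board power limit and the range the board allows (milliwatts).
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"Limit: {current_mw / 1000:.0f} W (allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Drop the cap by ~40 W, clamped so we never go below the board's minimum.
target_mw = max(min_mw, current_mw - 40_000)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

pynvml.nvmlShutdown()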

Last edited by haxxiy - on 17 September 2020

Captain_Yuri said:
Alright got some more AIB info coming in.

*snip*
So, a 2-2.5% increase in performance at the low cost of about 15% more power consumption. I think Mr. Atkinson describes what we're all feeling about that.

I do wonder what the aftermarket Ti cards will require to pull off optimal performance.
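
In perf-per-watt terms that factory OC is a step backwards; a quick check with the figures quoted above, in the same spirit as the earlier calculation:

perf_gain, power_gain = 1.025, 1.15   # +2.5% perf for +15% power, as quoted
print(f"perf/W change: {perf_gain / power_gain - 1:+.1%}")   # about -10.9%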



haxxiy said:
JEMC said:

That's very interesting. I wonder why they've decided to push it so much and screw up the efficiency numbers for so little gain.

To be fair, you can do the same thing to almost every CPU and GPU out there, especially in the case of the ones that are loosely binned or have conservative power targets. Undervolting Ryzen CPUs yields particularly crazy results:

https://www.overclock.net/threads/why-you-should-undervolt-your-ryzen-3000-part-ii.1739796/

An example for Nvidia GPUs:

https://www.reddit.com/r/nvidia/comments/9idtco/rtx_2080_downvolt_the_easy_way/

30-40 W less power, or better, for the same performance or even small overclocks.

Yeah, I remember that Tom's Hardware did an article about undervolting AMD's Fury back in the day, but it still surprises me how conservative all manufacturers are in order to make sure that all CPUs and GPUs perform as promised.

And it also makes me wonder how bad a chip has to be to get discarded, even with those margins.

Mummelmann said:
Captain_Yuri said:
Alright got some more AIB info coming in.
*snip*

So, a 2-2.5% increase in performance at the low cost of about 15% more power consumption. I think Mr. Atkinson describes what we're all feeling about that.

I do wonder what the aftermarket Ti cards will require to pull off optimal performance.

Better not think about that right now or you'll get nightmares.

It also makes you wonder how much power the 3090 will use.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.



JEMC said:

For those that, despite the current problems, will try Rocket League next week when it goes free to play AND didn't read the article about it in today's news, here's something for you:

https://www.pcgamer.com/rocket-league-goes-free-to-play-on-the-epic-games-store-next-week/

But the big cookie is the free-to-play launch that's coming next week. Rocket League is available for wishlisting on the Epic Games Store, and anyone who picks it up in the first month of Epic release will also get a $10 discount coupon for use on games and DLC in the Epic Games Store. (So even if you don't care about Rocket League, just add it to your library and get ten bucks off the next EGS thing that you do care about.) 

Not sure if it will work for those coming from Steam, and the discount only works when used on games that cost more than $15.

I still have to use my current $10 coupon, valid until November, before I get the new one.

Any good deals currently on EGS? Maybe I'll get "Twin Mirror". Or is it also coming to Game Pass?