
Carzy Zarx's PC gaming emporium - Catch up on all the latest PC Gaming related news

 

Zarx changed his avatar again. Thoughts?

Noice - 244 votes (61.31%)
So soon? I just got used to the last one - 13 votes (3.27%)
it sucks - 22 votes (5.53%)
Your cropping skills are lacking - 13 votes (3.27%)
Too noisy, can't tell WTF it even is - 13 votes (3.27%)
Meh - 32 votes (8.04%)

Total: 337

The main reason for my new PC was VR games; some of them (Half-Life: Alyx, Fallout 4, Hellblade: Senua's Sacrifice) didn't run perfectly at high settings without resolution scaling on my GTX 1070 and i5-4670K.

So far, all VR games I've tested on my new PC (5800X / RTX 3070) run great at the highest settings with 200% SteamVR resolution scaling. Many of them even run at 300% resolution scaling, some at 400%.

I hope that I can use my preordered Reverb G2 (with a much higher native resolution than my Oculus Rift) at 150% SteamVR resolution scaling in most VR games. If not, there is always the hope of an RTX 3080 Ti.
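For anyone wondering what those percentages mean in actual pixels: SteamVR's slider scales the total pixel count per eye, so 200% is about 1.41x per axis. A rough back-of-the-envelope sketch (panel resolutions are the published per-eye specs; treating the slider as a pure pixel-count multiplier is my simplification, since SteamVR also renders some distortion-correction overhead on top):

```python
import math

# Native per-eye panel resolutions (published specs).
HEADSETS = {
    "Oculus Rift CV1": (1080, 1200),
    "HP Reverb G2":    (2160, 2160),
}

def render_resolution(native, scale_percent):
    """SteamVR's slider scales total pixel count, so each axis grows
    by sqrt(scale). Ignores the extra distortion-correction overhead
    SteamVR renders on top of the panel resolution."""
    w, h = native
    axis = math.sqrt(scale_percent / 100)
    return round(w * axis), round(h * axis)

for name, native in HEADSETS.items():
    for scale in (100, 150, 200, 300):
        w, h = render_resolution(native, scale)
        print(f"{name} @ {scale:3d}%: {w}x{h} per eye ({w*h/1e6:.1f} MPix)")
```

By that math, the G2 at 150% is roughly the same pixel load as the Rift at 540%, so hoping for a 3080 Ti isn't unreasonable.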



hinch said:

I wouldn't say bad at higher resolutions. Rasterization is pretty much on par at 4K and scales better at 1440p; it's just the RT performance that's a bit eh. RDNA 3 isn't a bad shout, or waiting it out for Nvidia's 7nm cards.

This new console generation means a tonne of bad ports, so heavy brute-forcing is going to be the way. Indeed, about the Ryzen gen 1 feeling: it's a good product and competitive in a lot of ways, but not a home run. And with RT being prevalent in the next generation of games, this seems like a stopgap until RDNA 3 comes out.


Isn't the consistent drop in overall performance when increasing resolution a surefire sign that they didn't go wide enough? For higher resolutions you need to go wide, because there is more work to do in parallel, and they obviously didn't go that route. Imagine where those cards would be if not for the crazy clocks. Now the big question is how well RDNA scales horizontally. If it does, Nvidia has a big problem on their hands, because they are already quite wide and don't really have much potential to go faster until they get on a new node. They will be forced to significantly remodel their architecture.

Question for the room: how much do you think Samsung's node is at fault for the obscenely bad efficiency of Ampere? Will we be able to get double-digit improvements on the Ampere refreshes on TSMC? If not, who is the most likely culprit for scratching at 500 watts?
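To put rough numbers on the wide-vs-fast trade-off, here's a toy model. It assumes dynamic power ~ units × frequency × voltage², and that voltage has to rise roughly in step with frequency near the top of the V/f curve; both are textbook approximations, not measured Ampere/RDNA 2 data:

```python
# Toy model: dynamic power ~ units * frequency * voltage^2, with the
# crude assumption that voltage must track frequency near the top of
# the voltage/frequency curve. Performance assumes perfect scaling.
def relative_power(units, freq):
    voltage = freq                     # assumption: V rises with f
    return units * freq * voltage ** 2

def relative_perf(units, freq):
    return units * freq                # assumption: no scaling losses

for label, units, freq in [
    ("baseline",             1.0, 1.0),
    ("2x wider, same clock", 2.0, 1.0),
    ("same width, 2x clock", 1.0, 2.0),
]:
    perf, power = relative_perf(units, freq), relative_power(units, freq)
    print(f"{label:22s} perf {perf:.0f}x  power {power:.0f}x  "
          f"perf/W {perf/power:.2f}x")
```

Same nominal 2x performance either way, but the clocked-up chip burns ~8x the power while the wider one burns ~2x. That's why going wide is the efficient path, provided the workload and architecture actually scale.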




Hmm, so these cards are no good. Wait for the 5050 Ti or 6060 Ti, gotcha.



The 6000 series are pretty terrible products when you realize that AMD is asking people to spend $650/$1000 on a GPU just to turn off settings. It's amusing how, in games that use heavy ray tracing, even Turing does better than the 6800 XT, and that's before DLSS. You know... the GPUs that everyone said would age badly?

And the biggest kicker is, the raster performance isn't even that good depending on where you look. At 4K, the 6800 XT loses to the 3080. At 1440p, it trades blows or leads by a small margin depending on the game. At 1080p, it wins; in VR, it loses. Hell, it gets killed in some older titles like The Witcher 3 at every resolution. Looking at Cyberpunk, ray tracing on these cards isn't even supported at launch. You know, one of the most highly anticipated games of the generation?

The 6000 series is just for AMD fans and no one else. The VRAM capacity argument has been nonsense in the past and continues to be nonsense today, as the benchmarks prove. For most people, it's better to wait until AMD has a proper GPU that is actually worth the asking price, instead of forking over $650/$1000 for potato ray tracing performance, good raster performance, and zero answer to DLSS. Because at those prices, you really shouldn't need to turn off settings to justify a purchase.

As for TSMC vs. Samsung: I do think Nvidia going back to TSMC might be their Pascal-type leap on all fronts.

Last edited by Captain_Yuri - on 22 November 2020


Captain_Yuri said:

The 6000 series are pretty terrible products when you realize that AMD is asking people to spend $650/$1000 on a GPU just to turn off settings. It's amusing how, in games that use heavy ray tracing, even Turing does better than the 6800 XT, and that's before DLSS. You know... the GPUs that everyone said would age badly?

I was going to defend them, since the RX 6800 + RX 6800 XT are quite fast in the newest games (AC Valhalla, Dirt 5, Watch Dogs Legion).

So if you aren't very interested in raytracing or VR games, they will deliver good performance for years; their VRAM especially makes them quite future-proof.

Unfortunately(?), I'm very interested in raytraced reflections, raytraced shadows/lighting, and VR games.

But until they have a solution similar to DLSS to even out the performance hit of activated raytracing, I'm better off with Ampere.

https://www.eurogamer.net/articles/digitalfoundry-2020-amd-radeon-rx-6800-and-6800-xt-review?page=5 

Control with RTX activated (which looks awesome!) at 1440p brings an RX 6800 down to 33 fps on average, with slowdowns to 25 fps. Thanks to DLSS 2.0, I'm playing Control on my RTX 3070 with RTX activated at the highest 1440p settings at 60-100 fps.
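For context on why DLSS buys back so much performance here: at 1440p output, DLSS 2.0 renders internally at a fraction of the resolution and upscales. A quick sketch using the per-axis scale factors Nvidia published for DLSS 2.0 (exact fps gains obviously vary per game and scene):

```python
# Internal DLSS 2.0 render resolutions at a 2560x1440 output.
# Per-axis scale factors as published by Nvidia for DLSS 2.0.
OUTPUT = (2560, 1440)
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

native_pixels = OUTPUT[0] * OUTPUT[1]
for mode, axis in MODES.items():
    w, h = round(OUTPUT[0] * axis), round(OUTPUT[1] * axis)
    print(f"{mode:12s}: renders {w}x{h} "
          f"({w*h/native_pixels:.0%} of the native pixel load)")
```

So even in Quality mode the GPU shades well under half the native pixels, and RT costs scale with resolution too, which is most of the gap between 33 fps and 60-100 fps.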



Conina said:

I was going to defend them, since the RX 6800 + RX 6800 XT are quite fast in the newest games (AC Valhalla, Dirt 5, Watch Dogs Legion).

So if you aren't very interested in raytracing or VR games, they will deliver good performance for years; their VRAM especially makes them quite future-proof.

[...]

Both AC and Dirt 5 are AMD-sponsored titles, so it makes sense why those perform better on the 6000 series. Watch Dogs Legion seems to be on par with the 3080 in some reviews but behind in a lot of others from what I can tell, unless you go 1080p. And RT is broken in Watch Dogs Legion on AMD cards, so I wouldn't believe any RT benchmarks.

We have seen the VRAM capacity theory many times before, as AMD has always had more VRAM than Nvidia, but we have never seen it become a major factor, so I doubt it will matter outside of edge cases. Especially with the bandwidth advantage of the 3000 series, as there's more to VRAM than just capacity.
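To put the bandwidth point in numbers: peak bandwidth is just bus width times per-pin data rate. A quick sketch with the published launch specs; note it ignores the 6800 XT's 128 MB Infinity Cache, which is AMD's way of compensating for the narrower bus:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
# Figures are the published launch specs.
CARDS = {
    "RTX 3080 (320-bit GDDR6X @ 19 Gbps)":  (320, 19.0),
    "RX 6800 XT (256-bit GDDR6 @ 16 Gbps)": (256, 16.0),
}

for card, (bus_bits, gbps) in CARDS.items():
    print(f"{card}: {bus_bits / 8 * gbps:.0f} GB/s peak")
```

That's 760 GB/s vs 512 GB/s raw, so capacity alone isn't the whole story.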




Months of hype, only to get disappointing products. Overpriced and yet all sold out. On top of that, even those willing to fork out the money for whatever reason can't buy them. The bad pricing becomes even worse for those outside the US of A.

Also, some of the features are only for people on Zen 3. Bleh.

Last edited by green_sky - on 22 November 2020

So the Dark Hero is supposed to release this week. I would say good luck but I hope you all fail so I can get one.




vivster said:

Isn't the consistent drop in overall performance when increasing resolution a surefire sign that they didn't go wide enough? For higher resolutions you need to go wide, because there is more work to do in parallel, and they obviously didn't go that route. Imagine where those cards would be if not for the crazy clocks. Now the big question is how well RDNA scales horizontally. If it does, Nvidia has a big problem on their hands, because they are already quite wide and don't really have much potential to go faster until they get on a new node. They will be forced to significantly remodel their architecture.

Question for the room: how much do you think Samsung's node is at fault for the obscenely bad efficiency of Ampere? Will we be able to get double-digit improvements on the Ampere refreshes on TSMC? If not, who is the most likely culprit for scratching at 500 watts?

To be fair, I'm not sure what else they could have done. They could make the top Big Navi die have more CUs, but that would increase the silicon budget, which in turn adds cost and power and would need a beefier cooling solution. And Navi 21 would end up more expensive than, or on par with, the 3080, but still with lower RT performance. As for scaling, RDNA scales well with more CUs, as we've seen in the lineup thus far. The next generation of cards will be interesting, with both Nvidia and AMD seemingly wanting to go with chiplet designs.

In any case, AMD have met their performance targets. And Nvidia too, with top-tier performance across the board. Though they went with Samsung, which kinda threw a spanner in the works and allowed AMD to release cards in the same ballpark, which could potentially take some market share from them.

7nm will be faster, as power-efficiency gains will allow clocks to increase. I'm assuming Nvidia will release the big Super variants next year with 3080/3070 Supers, akin to what they did with Turing's Super launches, and potentially a 3080 Ti. Not going to predict the performance increase, as it's going to depend on Nvidia and whether they plan to have a stack ready to compete with AMD's new cards next year.



Of course this isn't PC, but I still find it interesting.

The PS5's memory runs really hot, close to 100°C (at 22-23°C ambient), and the console is not that great at getting heat out.

That memory will be cooking in hot summers if you don't have AC.