
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Conina said:
Bofferbrauer2 said:

So, some are better on Radeon, most are better on the 3080. All in all, it's not that bad for the 6800 XT for a first try.

Conina said:

A lot of dropped frames for the RX 6800 XT in Fallout 4 (5% dropped frames), No Man's Sky (3% dropped frames) and especially Subnautica (31% dropped frames).

What are synth frames?

Frame reprojection to keep framerates steady at 90 fps (or 80 fps or 120 fps, depending on the VR headset).

If a GPU can output a frame every 11 milliseconds, 100% would be "new frames" (green). If some scenes are more complex and the game needs 11-22 milliseconds, a whole frame is dropped and the user sees the same image for 22 milliseconds without it adapting to his changed head position.

If the GPU needs 22-33 milliseconds, two whole frames are dropped and the user sees the same image for 33 milliseconds without it adapting to his changed head position.

There are similar frame reprojection methods for PSVR, SteamVR, Oculus and some other VR platforms, e.g. ASW (Asynchronous Space Warp) for the Oculus devices. They deliver a smoother VR experience.

If the GPU needs 11-22 milliseconds, the missing frame is replaced by the previous frame, but adapted to the head movement. The reprojection itself also takes a few milliseconds, though, and if there isn't enough time for it, the frame is dropped instead of reprojected.

Dropped frames are more noticeable than synthetic frames.
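
To make the timing concrete, here is a rough sketch of that per-refresh decision, for a 90 Hz headset (≈11.1 ms budget). The ~2 ms reprojection cost and the thresholds are my own illustrative assumptions, not the actual SteamVR/Oculus logic:

```python
# Toy model of the frame-pacing logic described above.
# Assumed numbers: 90 Hz refresh, ~2 ms to reproject an old frame.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000 / REFRESH_HZ   # ~11.1 ms per refresh
REPROJECTION_COST_MS = 2.0            # assumed warp overhead, not a measured value

def classify_refresh(render_time_ms: float) -> str:
    """Classify one refresh interval by how long the GPU took on the frame."""
    if render_time_ms <= FRAME_BUDGET_MS:
        return "new"        # GPU finished in time: a real frame (green)
    if render_time_ms <= 2 * FRAME_BUDGET_MS - REPROJECTION_COST_MS:
        return "synthetic"  # too slow, but enough time to warp the last frame
    return "dropped"        # no time even to reproject: stale image repeats

for t in (9.0, 14.0, 21.0, 27.0):
    print(f"{t:4.1f} ms -> {classify_refresh(t)}")
```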

So the Radeon doesn't produce such synth frames yet, but it's not a hardware problem per se, more likely something that could be fixed with a driver update, if I understand correctly?




Retail copies of Cyberpunk (PS4) out in the wild. Looks like no more delays!



Bofferbrauer2 said:
Conina said: [...]

So the Radeon doesn't produce such synth frames yet, but it's not a hardware problem per se, more likely something that could be fixed with a driver update, if I understand correctly?

The Radeon also produces synth frames (if it has enough time). If they had chosen slightly lower settings for Fallout 4, No Man's Sky and Subnautica, frame reprojection probably would have worked.

It worked for Hellblade most of the time:



Feels like the 6000 series will not age well. They're already bad at complex and high-resolution games. There is absolutely no way anyone should buy those cards. RDNA 3 is gonna be so much better, and if you really need a card right now, there are Nvidia cards. The 6000 series really feels like Ryzen Gen 1.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

I wouldn't say bad at higher resolutions. Rasterization is pretty much on par at 4K and it scales better at 1440p; it's just the RT performance which is a bit eh. RDNA 3 isn't a bad shout, or waiting it out for Nvidia's 7nm cards.

This new console generation means a tonne of bad ports, so heavy brute-forcing is going to be the way. I agree about the Ryzen Gen 1 feeling: it's a good product and competitive in a lot of ways, but not a home run. And with RT being prevalent in the next generation of games, this seems like a stopgap until RDNA 3 comes out.

Last edited by hinch - on 22 November 2020


The main reason for my new PC was VR games; some of them didn't run perfectly with high settings and without resolution scaling on my GTX 1070 and i5-4670K (Half-Life: Alyx, Fallout 4, Hellblade: Senua's Sacrifice).

So far, all VR games I've tested on my new PC (5800X / RTX 3070) run great with the highest settings and 200% SteamVR resolution scaling. Many of them even run with 300% resolution scaling, some with 400%.

I hope that I can use my preordered Reverb G2 (with a much higher native resolution than my Oculus Rift) with 150% SteamVR resolution scaling in most VR games. If not, there is the hope for an RTX 3080 Ti.
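
For reference: as far as I know, SteamVR's percentage scales the total pixel count, so the per-axis factor is its square root. A quick sketch of what that means for the render target (per-eye panel resolutions; the real render target also gets extra margin for lens distortion, which I'm ignoring here):

```python
# Rough render-target math for SteamVR's resolution-scaling slider.
# Assumes the percentage applies to total pixels, so each axis scales
# by the square root; lens-distortion margin is left out.
import math

def render_target(width: int, height: int, scale_pct: int) -> tuple:
    axis = math.sqrt(scale_pct / 100)
    return round(width * axis), round(height * axis)

panels = {"Oculus Rift": (1080, 1200), "Reverb G2": (2160, 2160)}  # per eye
for hmd, (w, h) in panels.items():
    for pct in (100, 150, 200):
        print(f"{hmd} @ {pct}%: {render_target(w, h, pct)}")
```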



hinch said:

I wouldn't say bad at higher resolutions. Rasterization is pretty much on par at 4K and it scales better at 1440p. [...]

Isn't the consistent drop in relative performance when increasing resolution a surefire sign that they did not go wide enough? For higher resolutions you need to go wide, because there is more work to do in parallel, and they obviously did not go that route. Imagine where those cards would be if not for the crazy clocks. Now the big question is how well RDNA scales horizontally. If it does, then Nvidia has a big problem on their hands, because they are already quite wide and don't really have much potential to go faster until they get on a new node. They would be forced to significantly remodel their architecture.

Question into the room: how much do you think Samsung's node is at fault for the obscenely bad efficiency of Ampere? Will we see double-digit improvements in the Ampere refreshes on TSMC? If not, what is the most likely culprit for scratching at 500 watts?
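
As a back-of-the-envelope illustration of the wide-vs-fast trade-off (toy model: dynamic power roughly units × clock × V², with voltage rising roughly linearly with clock; all numbers made up for illustration):

```python
# Toy efficiency model: same throughput, different width/clock mixes.
# Assumes perf ~ units * clock and dynamic power ~ units * clock * V^2,
# with voltage scaling linearly with clock. Purely illustrative numbers.

def perf(units: int, ghz: float) -> float:
    return units * ghz

def power(units: int, ghz: float, volts_per_ghz: float = 0.5) -> float:
    v = volts_per_ghz * ghz          # crude V-f curve
    return units * ghz * v ** 2

for name, units, ghz in (("narrow & fast", 60, 2.4), ("wide & slow", 80, 1.8)):
    p, w = perf(units, ghz), power(units, ghz)
    print(f"{name}: perf={p:.0f}, power={w:.0f}, perf/W={p / w:.2f}")
```

Same throughput in both configs, but the wide-and-slow one needs roughly 40% less power in this toy model, which is why going wide is the usual answer for high resolutions and why clocking past the efficiency knee gets so expensive.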



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Hmm, so these cards are no good. Wait for 5050 Ti or 6060 Ti, gotcha.



The 6000 series are pretty terrible products when you realize that AMD is asking people to spend $650/$1000 on a GPU only to turn settings off. It's amusing how, in games that use heavy ray tracing, even Turing does better than the 6800 XT, and that's before DLSS. You know... the GPUs that everyone said would age badly?

And the biggest kicker is, the raster performance isn't even that good depending on where you look. At 4K, the 6800 XT loses to the 3080. At 1440p, it trades blows or leads by a small margin depending on the game. At 1080p, it wins; in VR, it loses. Hell, it gets killed in some older titles like The Witcher 3 at every resolution. And looking at Cyberpunk, ray tracing isn't even supported on these cards at launch. You know, one of the most highly anticipated games of the generation?

The 6000 series is just for AMD fans and no one else. The VRAM capacity argument has been nonsense in the past and continues to be nonsense today, as the benchmarks prove. For most people, it's better to wait until AMD has a proper GPU that is actually worth the asking price, instead of forking over $650/$1000 for potato ray tracing performance, good raster performance and zero answer to DLSS. Because at those prices, you really shouldn't need to turn off settings to justify a purchase.

As for TSMC vs Samsung: I do think Nvidia going back to TSMC might be their Pascal-type leap on all fronts.

Last edited by Jizz_Beard_thePirate - on 22 November 2020


PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

The 6000 series are pretty terrible products when you realize that AMD is asking people to spend $650/$1000 on a GPU only to turn settings off. It's amusing how, in games that use heavy ray tracing, even Turing does better than the 6800 XT, and that's before DLSS. You know... the GPUs that everyone said would age badly?

I was going to defend them, since the RX 6800 and RX 6800 XT are quite fast in the newest games (AC Valhalla, Dirt 5, Watch Dogs: Legion).

So if you aren't very interested in ray tracing or VR games, they will deliver good performance for years; especially their VRAM makes them quite future-proof.

Unfortunately(?) I'm very interested in ray-traced reflections, ray-traced shadows/lighting, and VR games.

But until they have a solution similar to DLSS to even out the performance hit of activated ray tracing, I'm better off with Ampere.

https://www.eurogamer.net/articles/digitalfoundry-2020-amd-radeon-rx-6800-and-6800-xt-review?page=5 

Control with RTX activated (which looks awesome!) at 1440p brings an RX 6800 down to 33 fps on average, with slowdowns to 25 fps. Thanks to DLSS 2.0, I'm playing Control on my RTX 3070 with RTX activated and the highest 1440p settings at 60-100 fps.