haxxiy said:
Captain_Yuri said:

It's more to do with having a discussion point rather than taking them as gospel and ignoring the reviews. In the past, the devs gave vague requirements that showed min and recommended with potato specs. This gen is different: the devs are giving us fairly detailed requirements with interesting patterns.

So it's not like people won't be posting performance numbers as they come out, but until then, it's an interesting point of discussion as we move into the next generation of games.

I don't think saying a 6700 XT/2070 Super for 1080p60 at medium and a 6800 XT/3080 for 4K60 at epic is "interesting"; it's just wildly inaccurate.

If the former is true, the latter will barely reach 30 fps at 4K. If the latter is true, then the former would likely be >120 fps at the same settings, let alone at medium. Not to mention that if the 3080 and the 6800 XT are equivalent here, then the 6700 XT will outperform the 2070 Super by 20-25%. If they aren't, they could have picked better-matched GPUs to compare at the higher end. So nothing about it makes sense... as usual for these spec sheets.
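To put rough numbers on the first point (purely illustrative; the pixel ratio is exact, but the preset cost and GPU speedup are assumptions, not benchmark data): 4K pushes 4x the pixels of 1080p, and an epic preset costs more per pixel than medium, so even a noticeably faster GPU lands well short of 60 fps.

```python
# Back-of-envelope check. All tunable numbers below are assumptions
# for illustration, not measured benchmark results.
fps_1080p_medium = 60        # the claimed 6700 XT / 2070 Super target
pixel_ratio = (3840 * 2160) / (1920 * 1080)  # exactly 4.0x the pixels at 4K
preset_cost = 1.4            # assumed extra per-pixel cost of epic vs medium
gpu_speedup = 1.9            # assumed 6800 XT / 3080 advantage over the 1080p pair

fps_4k_epic = fps_1080p_medium * gpu_speedup / (pixel_ratio * preset_cost)
print(round(fps_4k_epic, 1))  # → 20.4, nowhere near a 60 fps target
```

Under these assumptions the recommended 4K GPUs land in the 20-30 fps range, which is the inconsistency being pointed out: the two rows of the spec sheet imply very different performance baselines.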

Edit - maybe I'm being too harsh, since the department/employee responsible for these likely has limited time and a limited variety of test configurations. But still, in this case, the original point stands.

Well, there are too many variables to consider as to exactly why they would list those specs, but I do think there could be a degree of logic to it, based on an average performance target.

Firstly, with Returnal, the 6700XT/2070 Super is meant for High settings, not Medium, according to their specs. A 2070 Super is generally within 6-8% of a 2080. So let's use The Callisto Protocol as an example game, with Techpowerup's 1080p benchmark:

https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

You can see that a 6700XT is pretty close to a 2080, which is pretty close to a 2070 Super (if we extrapolate the percentages), on average in that game. But you can also see that there is a bit of a gap between the 6600XT/3060 and the 6700XT/2080/2070 Super. So if the goal is to tell people "here are the GPUs required to hit a 60fps average at 1080p High", and a 6600XT/3060 is too slow to hit that average while a 2070 Super/6700XT can hit it... well, there you go. For all we know, it could be a situation where a 2070 Super just hits a 60fps average while a 6700XT does 65fps, but since AMD doesn't have another GPU in between that hits 60fps, they decided to list the 6700XT.
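That selection logic, picking the slowest GPU from each vendor that still clears the fps target, can be sketched like this (the fps averages below are made-up placeholders, not Techpowerup's numbers):

```python
# Hypothetical 1080p High averages for illustration only.
avg_fps = {
    "RX 6600 XT": 52, "RTX 3060": 50,
    "RX 6700 XT": 65, "RTX 2070 Super": 61,
}

def vendor(name):
    return "AMD" if name.startswith("RX") else "NVIDIA"

def minimum_spec(fps_by_gpu, target=60):
    """Per vendor, pick the slowest GPU that still averages >= target fps."""
    picks = {}
    for gpu, fps in fps_by_gpu.items():
        if fps < target:
            continue  # too slow to make the spec sheet
        v = vendor(gpu)
        if v not in picks or fps < picks[v][1]:
            picks[v] = (gpu, fps)
    return {v: gpu for v, (gpu, _) in picks.items()}

print(minimum_spec(avg_fps))
# → {'AMD': 'RX 6700 XT', 'NVIDIA': 'RTX 2070 Super'}
```

Note how the two listed GPUs can end up with different headroom above the target (65 vs 61 here), which is exactly why a spec-sheet pairing doesn't imply the two cards are equivalent.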

And as for the 3080/6800XT requirements, it could be a situation where the 6800XT does, say, 60fps and the 3080 does 65fps, but a 3070 Ti is too weak or runs out of VRAM at 4K, so they listed a 3080 instead. I think the main issue is that people are thinking of the 6700XT's averages around launch, when it was pretty close to a 3070. But the 6700XT did not age very well, and some recent game testing shows it underperforming even against a 3060 Ti or 2080.

So overall, I don't think the requirements are nonsensical, as we already have AMD-sponsored games that show a 6700XT performing similarly enough to a 2070 Super. It's just that you can't treat them as accurate performance reviews, but rather as a vague preview of potential performance expectations.

Last edited by Jizz_Beard_thePirate - on 18 January 2023

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850