Captain_Yuri said:
JRPGfan said:

I heard a rumor that these cards do ray tracing better than the 2080 Ti does.
Not as fast as the 3080 etc., but more than fast enough to use in games.

Also, AMD seems to have a huge edge in 1440p gaming (a bit smaller at 4K).

It's crazy that AMD is back at the top end again; it's been so long (it felt like AMD just let Nvidia always be faster and make bigger-die cards).

As for there being no counterpart to DLSS yet, it's been said that one will launch sometime in early 2021.

The question, of course, is how much better and against which card, because the 3080 is 25–40% faster than the 2080 Ti with RT enabled, depending on the game.

These are cherry-picked benchmarks, so I will wait for the actual reviews. The 1440p results are actually closer than I expected; there are 4 out of 10 titles where AMD has a huge edge over the 3080.

Yeah, it is pretty crazy that after what, 3–4 generations of GPUs, we finally have an AMD GPU that can go toe to toe in rasterization. There's no way to put it other than amazing.

Then the question is which games will support AMD's version of DLSS. Not to mention, how's the image quality?

JRPGfan said:
Random_Matt said:
Nvidia fanboys will always buy green. Had this discussion on overclockers: Intel shills are still buying their chips, and the justifications change all the time.

People forget DLSS 2.0 (the DLSS that's actually good) is only used in about 12 games so far.
Nvidia has blinded people with marketing names for new techniques and such.

This is like FreeSync vs. G-Sync.
Going forward, I suspect just as many games will make use of AMD's version of DLSS, which isn't card-dependent.
If it's decent enough, devs may even drop using DLSS in the future.

This narrative needs to stop. There is no "AMD version of DLSS" and there won't be in the foreseeable future. DLSS is an entirely new technology that is only possible thanks to AI cores on the card and a machine-learning farm to train for games. AMD has neither of those things, but those things are precisely what gives DLSS its huge performance edge. You cannot achieve the same results without these exact components. Anything else will just be some post-processing image sharpener that guesses about the image and fills it in accordingly, which will yield worse results with worse performance. Comparing DLSS with G-Sync shows a fundamental lack of understanding.
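To make the "post-processing image sharpener" contrast concrete, here is a minimal sketch, purely illustrative and not any vendor's actual technique: a plain 3x3 sharpening convolution. It boosts local contrast in the pixels it is given, but unlike a trained upscaler it cannot reconstruct detail that was never rendered.

```python
# Illustrative only: a naive 3x3 sharpening kernel, the kind of cheap
# post-process filter being contrasted with DLSS above.
KERNEL = [[ 0, -1,  0],
          [-1,  5, -1],
          [ 0, -1,  0]]

def sharpen(image):
    """Apply the kernel to a 2D grayscale image (list of rows), clamping at edges."""
    h, w = len(image), len(image[0])

    def px(y, x):
        # Clamp coordinates so border pixels reuse their nearest neighbor.
        y = min(max(y, 0), h - 1)
        x = min(max(x, 0), w - 1)
        return image[y][x]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc = sum(KERNEL[dy][dx] * px(y + dy - 1, x + dx - 1)
                      for dy in range(3) for dx in range(3))
            row.append(min(max(acc, 0), 255))  # clip to 8-bit range
        out.append(row)
    return out

# A flat region stays flat: sharpening redistributes existing contrast
# and adds no new information.
flat = [[100] * 5 for _ in range(5)]
print(sharpen(flat)[2][2])  # → 100
```

On a uniform region the filter is a no-op, which is the point: simple sharpening only amplifies edges already present in the frame, whereas a learned upscaler infers plausible detail from training data.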

A more apt comparison would be PhysX, though the application range of PhysX is much more niche than that of DLSS.


