
FSR 2.0! Now with temporal upsampling! (and better image quality)

Here is FSR 2.0 (1440->4k) vs Native 4k+TAA+Sharpening enabled:

https://live.staticflickr.com/65535/51971605235_5cdec485da_k.jpg    (click the link for the full-size picture if you want to compare for yourself)

It still looks like it beats out Native 4k imo (even with TAA + Sharpening also applied to the 4k picture).

Granted this is a still picture.
We still need more "in-motion" clips... but even there (from the short clips we have seen), it looked good.

It honestly looks better than Native.

edit:

Also this:

There is a link to a PDF; if you scroll down to page 55, you reach the FSR 2.0 section.
Go down further and you'll find the screen grabs from the different modes (not just FSR 2.0 Quality).

The Performance mode (720p upscaled to 4K) looks damn good considering it's from 720p.
Basically the same black magic as DLSS... impressive that it upscales that well.
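For reference, if I recall AMD's FSR 2.0 documentation correctly, the presets scale each axis by a fixed factor (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so the internal render resolutions at 4K output work out as below; a quick sketch (note a 720p input actually corresponds to the 3.0x Ultra Performance preset, while 2.0x Performance renders at 1080p):

```python
# Render resolutions implied by FSR 2.0's per-axis scale factors
# (factors as published by AMD for the FSR 2.0 presets).

PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w, out_h, factor):
    """Internal resolution the GPU actually shades for a given preset."""
    return round(out_w / factor), round(out_h / factor)

for name, factor in PRESETS.items():
    w, h = render_resolution(3840, 2160, factor)
    print(f"{name:>17}: {w}x{h} -> 3840x2160")
```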

Last edited by JRPGfan - on 30 March 2022

JRPGfan said:

Here is FSR 2.0 (1440->4k) vs Native 4k+TAA+Sharpening enabled:

https://live.staticflickr.com/65535/51971605235_5cdec485da_k.jpg    (click the link for the full-size picture if you want to compare for yourself)

It still looks like it beats out Native 4k imo (even with TAA + Sharpening also applied to the 4k picture).

Granted this is a still picture.
We still need more "in-motion" clips... but even there (from the short clips we have seen), it looked good.

It honestly looks better than Native.

edit:

Also this:

There is a link to a PDF; if you scroll down to page 55, you reach the FSR 2.0 section.
Go down further and you'll find the screen grabs from the different modes (not just FSR 2.0 Quality).

The Performance mode (720p upscaled to 4K) looks damn good considering it's from 720p.
Basically the same black magic as DLSS... impressive that it upscales that well.

Feels like the sharpening introduced a bit more aliasing on the plastic green wrapper covering the eyes, whereas in FSR 2.0's output it is corrected and looks better; definitely an improvement.

I do wish there were a benchmark tool featuring plenty of wires and cables to show off FSR 2.0 both in stills and in motion, because I want to see more of what it can correct over native res.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

So it's out... benchmark reviews.
https://www.techpowerup.com/review/amd-fidelity-fx-fsr-20/2.html

TLDR:
DLSS 2.3 has slightly better image stability when boosting up from lower resolutions (Performance mode, and 1080p upsampling).
At 1440p and 4K in Quality mode, they are basically the same (DLSS 2.3 and FSR 2.0). FSR 2.0 handles ghosting slightly better than DLSS.
Performance-wise, FSR 2.0 is almost as good as DLSS too (on Nvidia cards, which were used for the DLSS comparison).
Basically, in gameplay you wouldn't notice whether it was one or the other.
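A rough intuition for where both techniques get their speedup: the GPU only shades the lower internal resolution, and just the upscale pass runs at output resolution. A quick pixel-count sketch (upscale-pass overhead excluded, so these are upper bounds on the saving):

```python
# Fraction of 4K output pixels actually shaded when rendering at a
# lower internal resolution; the remainder is filled in by the upscaler.

def shaded_fraction(render, output=(3840, 2160)):
    rw, rh = render
    ow, oh = output
    return (rw * rh) / (ow * oh)

print(f"1440p internal: {shaded_fraction((2560, 1440)):.0%} of 4K pixels shaded")
print(f"1080p internal: {shaded_fraction((1920, 1080)):.0%} of 4K pixels shaded")
```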


TechPowerUp quotes:
"AMD has achieved the unthinkable—the new FidelityFX Super Resolution FSR 2.0 looks amazing, just as good as DLSS 2.0, actually DLSS 2.3 (in Deathloop). Sometimes even slightly better, sometimes slightly worse, but overall, this is a huge win for AMD."

"When comparing "DLSS Quality" against "FSR 2.0 Quality," spotting minor differences is possible, but for every case I found, I'd say it's impossible to declare one output better than the other; it's pretty much just personal preference, or not even that."

"In terms of performance, FSR 2.0 deserves praise, too. While it is a little bit more demanding than FSR 1.0, which is not surprising given the additional logic, it's still mighty fast and pretty much identical to DLSS 2.0 on even NVIDIA hardware, which is able to offload many DLSS upscaling operations to the Tensor Cores. No doubt, on AMD hardware, there will be additional optimizations, but we wouldn't be able to compare performance against DLSS then because it's an NVIDIA exclusive technology. "

-----------------------

Also, AMD did a major update to how their cards handle DX11.
Supposedly there are 10-20% gains in many games, and much higher minimum framerates (i.e. fewer drops).

For years, people have said AMD needed to fix their DX11 support to draw level with Nvidia.
Well, it looks like they finally did some work on it.



Nice. Now all we need to see is how long it takes for AMD to actually get it implemented in a load of games going forward; hopefully within a year we'll see 50+ games supported, instead of a few handpicked titles.

Nvidia is still improving their side of things with DLSS, so it's looking like we now have two viable options on the table to choose from.

Not really sure why TechPowerUp bothered with that last line. If AMD is going to get exclusive FSR 2.0 optimisations on its own cards, then of course Nvidia is already doing the same with its cards and DLSS; and if AMD does go heavy on its own hardware, it will eventually become pointless to compare the two on Nvidia's hardware anyway.




According to Hardware Unboxed, FSR 2.0 gains on the AMD GPUs they tested didn't differ much from Nvidia's in Deathloop. The techniques AMD is using seem to be perfectly compatible with Nvidia GPUs, including some older ones. This means any possible AMD "advantage" in the near future would likely be the result of poor Nvidia optimization as opposed to any special AMD optimizations. It should be easy to tell in the games that support both FSR 2.0 and DLSS 2.0+.

FSR 2.0 will be huge for the new consoles, non-RTX GPUs, and maybe the industry as a whole. Developers can now push graphics to a significantly higher level even on older hardware. The one hardware-related aspect of the PS5/Series X that really disappointed me was the lack of a Tensor Core equivalent, which I had thought was necessary for achieving such crazy reconstruction results, based on the comments and analyses from tech nerds all over the internet.




It's ironic that AMD has added life to my old GTX 1080 laptop. Sure, I'll take the extra performance on my RX 6900 XT desktop, but that's not something I was worried about to begin with. My laptop has started to show its age as of late.



Darc Requiem said:

It's ironic that AMD has added life to my old GTX 1080 laptop. Sure, I'll take the extra performance on my RX 6900 XT desktop, but that's not something I was worried about to begin with. My laptop has started to show its age as of late.

AMD cards age like fine wine too.
Usually they don't put in as much driver-optimisation work at launch as Nvidia does,
so over time their drivers see bigger improvements.
Meanwhile, it's as if Nvidia purposefully gives older cards worse drivers as time goes on (time to upgrade $$$$).

This means if you look at launch reviews, you might see an AMD card losing out against a competing price-class Nvidia card (at launch).
Then five years down the line, it's actually performing better than the competing Nvidia card in newly released titles.

But yeah, AMD's approach of everything being open source, and for everyone, is nice.

They have to work a bit harder for sales than Nvidia does, since they're the underdog.
Sometimes it really highlights how much BS Nvidia pulls, though.
Case in point: G-Sync vs FreeSync. Paying $150 extra to Nvidia for something barely noticeable, vs a free solution that works for everyone.

Also, Nvidia is quick to come up with new technologies, then dictate how they're used by developers.
Like tessellation? Remember when that was an issue?
Nvidia was better at it than AMD, so it paid benchmark makers to use more of it than was necessary (i.e. you couldn't even tell the difference visually anymore on models with increased amounts; invisible tessellated sea meshes, etc.), just so Nvidia could look better in (non-game) benchmarks in reviews.

They also paid to have DX10.1 removed from an Assassin's Creed game, I think it was, because it favored AMD in benchmarks.
Nvidia plays hardball, and cheats at times to come out ahead. Walled gardens, etc.
Feels bad, man. Same with Intel; they were doing tons of suspect things like that in the x86 space for years.

It's amazing that AMD can even compete (marketshare-wise... it's like 30% in GPUs and CPUs?).
Hard not to root for the underdog (because where would we be if it was only Intel and Nvidia, dictating everything and prices too?)

Last edited by JRPGfan - on 13 May 2022

FSR 2.0 running on the Steam Deck in 40Hz mode with Deathloop.

Doesn't look that bad on that 7" 800p screen, tbh. Still soft with some shimmering, but it may be worth the trade-off in battery life for Deck users in portable mode.

There's also this, which compares native 800p vs FSR 2.0.

Turning it on with uncapped FPS, there are gains of up to 10fps in a static scene. The trade-off is a softer image, though with AA and a slightly over-sharpened/post-processed look.
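To put an fps gain like that into frame-time terms (which is what the Deck's fixed-refresh modes actually care about), here's a small conversion sketch; the 30 -> 40 fps figures are illustrative assumptions, not measurements from the clip:

```python
# Converting an fps gain into the per-frame time budget it frees up.
# The 30 -> 40 fps numbers below are hypothetical, chosen only to
# illustrate a 10 fps improvement.

def frame_time_ms(fps):
    """Milliseconds the GPU has per frame at a given framerate."""
    return 1000.0 / fps

before, after = 30.0, 40.0  # hypothetical: a 10 fps gain
saved = frame_time_ms(before) - frame_time_ms(after)
print(f"{before:.0f} fps = {frame_time_ms(before):.1f} ms/frame")
print(f"{after:.0f} fps = {frame_time_ms(after):.1f} ms/frame")
print(f"Headroom gained: {saved:.1f} ms per frame")
```

The same headroom can instead be spent on battery life by capping the framerate, which is the trade-off being described above.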

Last edited by hinch - on 14 May 2022

Digital Foundry released their review on it as well:

It's very informative, so I'd give it a watch. Certainly not the "DLSS killer" that some sites claimed, but it is very good nonetheless.

Last edited by Captain_Yuri - on 14 May 2022

             
