
Forums - PC Discussion - FSR 2.0! now with Temporal Up-Sampling! (and better image quality)

One thing I will say is people should keep their expectations in check until DF comes out with their comparisons. FSR 2.0 isn't an alternative to DLSS... it's an alternative to TAA upsampling, which has been implemented in various games for a while. The benefit is that it can be implemented on a wider variety of hardware, but we have seen what TAA upscaling can do, and its limitations. There's a reason why AMD is only comparing images that are being upscaled to 4K (and not 1440p), and why they aren't doing image comparisons against DLSS despite Deathloop having DLSS.

So while what AMD is doing with FSR 2.0 is fantastic, I would keep your expectations in check until DF does their comparisons, because whether it's AMD, Nvidia, or Intel, they will always show their product in the best light.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

JRPGfan said:

It already has DLSS....
So adding FSR 2.0 should be like 1-2 days of work (3 max, according to AMD).
It's just a question of whether they can spare (or bother with) a programmer working on it for a day or two.

I know it has DLSS; I was saying that they should properly implement FSR 2.0 once it's fully released. 

They already have FSR 1.0, but it's shit, mainly due to how CDPR's version of TAA works.

You say it'll take that much time, but we only just got patch 1.5 this year, while the game was released way back in Dec 2020. I wouldn't expect FSR 2.0 to be put into the game until later this year, if not next year at best.

They can spare it; they clearly made absolute bank from 2077 sales. They aren't poor; it's just a matter of "do they care to do it sooner rather than later?".



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Kyuu said:

So Digital Foundry (kind of) made the first meaningful comparison between DLSS2 and TSR (via Unreal Engine 4 which was not supposed to support TSR?), and the results were great. In stills, they're VERY close (as opposed to TAAU being outclassed), but in motion DLSS2 starts showing noticeable advantages in a number of aspects as expected.

Curiously, the PS5 version used FSR1. We'll see if TSR gets any improvements from Unreal Engine 5, and how it'll compare to FSR2.

Not that surprising, since AMD hasn't said that the PS5 supports FSR 2, which is odd given they announced that older cards from their main competitor do. Of all recent GPUs, only the PS5's hasn't been confirmed to support FSR 2. Perhaps Microsoft made a deal, since they specifically did mention Xbox consoles. It's very doubtful, and it would push Sony toward Intel/Nvidia for the PS6, but it's possible in theory.



Please excuse my (probably) poor grammar

Here is FSR 2.0 (1440p->4K) vs native 4K with TAA + sharpening enabled:

https://live.staticflickr.com/65535/51971605235_5cdec485da_k.jpg    (click the link for the full-size picture if you want to compare yourself)

It still looks like it beats out native 4K, imo (even with TAA + sharpening also applied to the 4K picture).

Granted, this is a still picture.
We still need more "in-motion" clips... but even there (in the short clips we have seen), it looked good.

It honestly looks better than native.

edit:

Also this:

There is a link to a PDF, and if you scroll down to page 55, it reaches the FSR 2.0 part.
Go down further and you'll find screen grabs from the different modes (not just FSR 2.0 Quality).

The Performance mode (720p upscaled to 4K) looks damn good considering it's from 720p.
Basically the same black magic as DLSS... impressive that it upscales that well.
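For context, AMD's published FSR 2.0 quality modes use fixed per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so the 720p-to-4K case above corresponds to the 3.0x Ultra Performance factor. A quick sketch of how the internal render resolution falls out of the output resolution (the mode table comes from AMD's documentation; the helper function itself is just for illustration):

```python
# Per-axis scale factors for FSR 2.0's quality modes, as published by AMD.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution the game renders at before FSR 2.0 upscales it."""
    scale = FSR2_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# At a 3840x2160 output: Performance renders at 1920x1080,
# Ultra Performance at 1280x720 (the "720p -> 4K" case).
for mode in FSR2_MODES:
    print(mode, render_resolution(3840, 2160, mode))
```

This is also why the 4K comparisons look so favorable: Quality mode at 4K still renders from a full 2560x1440, which gives the temporal upscaler far more data to work with than, say, Quality mode at a 1080p output would.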

Last edited by JRPGfan - on 30 March 2022

JRPGfan said:

Here is FSR 2.0 (1440p->4K) vs native 4K with TAA + sharpening enabled:

https://live.staticflickr.com/65535/51971605235_5cdec485da_k.jpg    (click the link for the full-size picture if you want to compare yourself)

It still looks like it beats out native 4K, imo (even with TAA + sharpening also applied to the 4K picture).

Granted, this is a still picture.
We still need more "in-motion" clips... but even there (in the short clips we have seen), it looked good.

It honestly looks better than native.

edit:

Also this:

There is a link to a PDF, and if you scroll down to page 55, it reaches the FSR 2.0 part.
Go down further and you'll find screen grabs from the different modes (not just FSR 2.0 Quality).

The Performance mode (720p upscaled to 4K) looks damn good considering it's from 720p.
Basically the same black magic as DLSS... impressive that it upscales that well.

Feels like the sharpening introduced a bit more aliasing on the plastic green wrapper covering the eyes, whereas in FSR 2.0's version it is corrected and looks better; definitely an improvement.

I do wish there were a benchmark tool featuring plenty of wires and cables to show off FSR 2.0 both in stills and in motion, because I want to see more of what it can correct over native res. 





So it's out... benchmark reviews.
https://www.techpowerup.com/review/amd-fidelity-fx-fsr-20/2.html

TLDR:
DLSS 2.3 has slightly better image stability at lower resolutions boosted up (Performance mode, and 1080p upsampling).
At 1440p and 4K, in Quality mode, they are basically the same (DLSS 2.3 and FSR 2.0). FSR 2.0 handles ghosting slightly better than DLSS.
Performance-wise, FSR 2.0 is almost as good as DLSS (on Nvidia cards, which were used for the DLSS comparison).
Basically, in gameplay you wouldn't notice if it was one or the other.


TechPowerUp quotes:
"AMD has achieved the unthinkable—the new FidelityFX Super Resolution FSR 2.0 looks amazing, just as good as DLSS 2.0, actually DLSS 2.3 (in Deathloop). Sometimes even slightly better, sometimes slightly worse, but overall, this is a huge win for AMD."

"When comparing "DLSS Quality" against "FSR 2.0 Quality," spotting minor differences is possible, but for every case I found, I'd say it's impossible to declare one output better than the other; it's pretty much just personal preference, or not even that."

"In terms of performance, FSR 2.0 deserves praise, too. While it is a little bit more demanding than FSR 1.0, which is not surprising given the additional logic, it's still mighty fast and pretty much identical to DLSS 2.0 on even NVIDIA hardware, which is able to offload many DLSS upscaling operations to the Tensor Cores. No doubt, on AMD hardware, there will be additional optimizations, but we wouldn't be able to compare performance against DLSS then because it's an NVIDIA exclusive technology. "

-----------------------

Also, AMD did a major update to how their cards handle DX11.
Supposedly there are 10-20% gains in many games, and much higher minimum framerates (i.e. fewer drops).

For years, people have said AMD needed to fix their DX11 support to pull equal with Nvidia.
Well, it looks like they finally did some work on it.



Nice. Now all we need to do is see how long it takes for AMD to actually get it implemented in a load of games going forward; hopefully within a year we'll see 50+ games supported, instead of a few handpicked titles.

Nvidia is still improving their side of things with DLSS, so it's looking like we now have two viable options on the table to choose from.

Not really sure why TechPowerUp bothered with that last line, because if AMD is going to get exclusive FSR 2.0 optimisations on AMD cards, then of course Nvidia is already doing the same with their own cards and DLSS; and if AMD does go heavy on their own hardware, it's eventually going to become pointless to compare the two on Nvidia's hardware.




It's ironic that AMD has added life to my old GTX 1080 laptop. Sure, I'll take the extra performance on my RX 6900 XT desktop, but it's not something I was worried about to begin with. My laptop has started to show its age as of late.



Darc Requiem said:

It's ironic that AMD has added life to my old GTX 1080 laptop. Sure, I'll take the extra performance on my RX 6900 XT desktop, but it's not something I was worried about to begin with. My laptop has started to show its age as of late.

AMD cards age like fine wine, too.
Usually they don't put in as much work on driver optimisations at launch as Nvidia does,
so over time their drivers see bigger improvements.
Meanwhile, it's like Nvidia purposefully gives older cards worse drivers as time goes on (time to upgrade, $$$$).

This means that if you look at launch reviews, you might see an AMD card losing out against a competing Nvidia card in the same price class (at launch),
then five years down the line it's actually performing better than that competing Nvidia card in newly released titles.

But yeah, AMD's approach of everything being open source, and for everyone, is nice.

They have to work a bit harder for sales than Nvidia does, since they're the underdog.
Sometimes it really highlights how much BS Nvidia pulls, though.
Case in point: G-Sync vs FreeSync. Paying Nvidia $150 extra for something barely noticeable, versus a free solution that works for everyone.

Also, Nvidia is quick to come up with new technologies, then dictate how they're used by developers.
Like tessellation? Remember when that was an issue?
Nvidia was better at it than AMD, so it paid benchmark makers to use more of it than was necessary (i.e. you couldn't even tell the difference visually any more on models with the increased amounts; the invisible tessellated sea mess, etc.), just so Nvidia could look better in (non-game) benchmarks in reviews.

They also paid to have DX10.1 support removed from an Assassin's Creed game, I think it was, because it favored AMD in benchmarks.
Nvidia plays hardball, and cheats at times to come out ahead. Walled gardens, etc.
Feels bad, man. Same with Intel; they were doing tons of similarly suspect things in the x86 space for years.

It's amazing that AMD can even compete (marketshare-wise... it's like 30% in GPUs and CPUs?).
Hard not to root for the underdog (because where would we be if it was only Intel and Nvidia dictating everything, including prices?).

Last edited by JRPGfan - on 13 May 2022

FSR 2.0 running on the Steam Deck in 40Hz mode with Deathloop.

Doesn't look that bad on that 7" 800p screen, tbh. Still soft, with some shimmering, but it may be worth the trade-off in battery life for Deck users in portable mode.

There's also this, which compares native 800p vs FSR 2.0.

Turning it on with uncapped FPS, there are gains of up to 10fps in a static scene. The trade-off is a softer image, though with AA and a slightly over-sharpened/post-processed look.

Last edited by hinch - on 14 May 2022