
FSR 2.0! now with Temporal Up-Sampling! (and better image quality)

Yup, vastly improved over the spatial tech they had going on, which was okay, but gave results not much better than a simple sharpening filter, without affecting the UI, text, etc. At best FSR 1.0 felt like AMD's stop-gap solution and a marketing device to push their own upscaling brand until their true FSR (2.0) was ready - their answer to Nvidia's DLSS.

Moving to temporal upscaling was the only way to go. With the industry moving towards better upscaling techniques, this is pretty much what was needed from them. They were falling behind Nvidia in many key areas, like having a good upscaler, though now it's looking a lot better for AMD. Granted, they still need better ray tracing performance - something 'insiders' have been saying RDNA 3 will fix.

For the graph, the internal resolution is what the game renders at before the upscaler does its thing, and the output resolution is what is displayed on screen. So for me, I have a 1440p monitor. When I enable FSR 2.0 in a compatible game at the Quality setting, it will render the game at 960p and upscale it to 1440p, and it should (assuming FSR 2 works as intended) give an image close to native or better, with anti-aliasing applied.
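To make the resolution math concrete, here's a quick sketch using FSR 2.0's published per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the function name and rounding are my own illustration, not anything from AMD's SDK:

```python
# Hypothetical helper: derive the internal render resolution from the output
# resolution and an FSR 2.0 quality mode. The scale factors are AMD's published
# per-axis values; everything else is an assumption for this sketch.
FSR2_SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and quality mode."""
    factor = FSR2_SCALE_FACTORS[mode]
    return round(output_w / factor), round(output_h / factor)

if __name__ == "__main__":
    for mode in FSR2_SCALE_FACTORS:
        w, h = internal_resolution(2560, 1440, mode)
        print(f"{mode:>17}: renders at {w}x{h}, outputs at 2560x1440")
    # Quality mode at 1440p renders at roughly 1707x960 - the "960p" above.
```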


hinch said:

For the graph, the internal resolution is what the game renders at before the upscaler does its thing, and the output resolution is what is displayed on screen. So for me, I have a 1440p monitor. When I enable FSR 2.0 in a compatible game at the Quality setting, it will render the game at 960p and upscale it to 1440p, and it should (assuming FSR 2 works as intended) give an image close to native or better, with anti-aliasing applied.

I too game at 1440p, so this is good to know.

I just have to wonder now if CDPR will work out implementing 2.0 in Cyberpunk, and whether they'll rework their version of TAA, because afaik 1.0 has trouble with its scaling due to how 2077's TAA works, which involves perf drops, ghosting at times, and a blurry image in any sort of motion (even with motion blur completely disabled along with CA).

I'm also hoping this gets devs in the future to rethink how they go about their own AA solutions so they won't disrupt DLSS and FSR 2.0, because the last thing we want is a dev screwing over new tech because they thought their game would look better with a malformed version of AA that other devs aren't using in the same manner.




As a console player I'm hoping 1440p 60fps with all the fixings becomes the standard now. I'm pretty sure the consoles were built for that, so FSR could take them to 4K.




Chazore said:
hinch said:

For the graph, the internal resolution is what the game renders at before the upscaler does its thing, and the output resolution is what is displayed on screen. So for me, I have a 1440p monitor. When I enable FSR 2.0 in a compatible game at the Quality setting, it will render the game at 960p and upscale it to 1440p, and it should (assuming FSR 2 works as intended) give an image close to native or better, with anti-aliasing applied.

I too game at 1440p, so this is good to know.

I just have to wonder now if CDPR will work out implementing 2.0 in Cyberpunk, and whether they'll rework their version of TAA, because afaik 1.0 has trouble with its scaling due to how 2077's TAA works, which involves perf drops, ghosting at times, and a blurry image in any sort of motion (even with motion blur completely disabled along with CA).

I'm also hoping this gets devs in the future to rethink how they go about their own AA solutions so they won't disrupt DLSS and FSR 2.0, because the last thing we want is a dev screwing over new tech because they thought their game would look better with a malformed version of AA that other devs aren't using in the same manner.

Cool, yeah it's nice to have options.. since I don't currently have an RTX card and don't plan to upgrade until later this year. But yeah, at least it's usable at this resolution, unlike FSR 1 lol.

I did read that FSR 2.0 overrides TAA in games so it shouldn't be as much of an issue.
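For what it's worth, that matches how AMD describes the integration: the FSR 2.0 pass sits where the engine's TAA pass would normally run and does the anti-aliasing itself. Here's a minimal, purely illustrative sketch of that ordering - every function and buffer name below is invented for the example, not any engine's or SDK's real API:

```python
# Illustrative sketch only: where a temporal upscaler like FSR 2.0 sits in a
# frame, replacing the engine's TAA pass. All names here are invented stand-ins.

def rasterize_jittered(render_res):
    """Stand-in for the jittered geometry/lighting passes at internal resolution."""
    return {"color": "...", "depth": "...", "motion_vectors": "...", "res": render_res}

def fsr2_upscale(gbuffer, display_res):
    """Stand-in for the FSR 2.0 dispatch; it also performs the anti-aliasing."""
    return {"color": "upscaled + anti-aliased", "res": display_res}

def post_and_ui(image):
    """Film grain, chromatic aberration, sharpening, then UI composition."""
    return image

def render_frame(render_res=(1707, 960), display_res=(2560, 1440)):
    gbuffer = rasterize_jittered(render_res)    # 1. render low-res with sub-pixel jitter
    image = fsr2_upscale(gbuffer, display_res)  # 2. FSR 2.0 runs in place of TAA
    return post_and_ui(image)                   # 3. full-res post effects and UI last

print(render_frame())
```

The practical upshot is that whatever the game's own TAA was doing, artifacts included, no longer applies once the FSR 2.0 path is active.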


Chazore said:
hinch said:

For the graph, the internal resolution is what the game renders at before the upscaler does its thing, and the output resolution is what is displayed on screen. So for me, I have a 1440p monitor. When I enable FSR 2.0 in a compatible game at the Quality setting, it will render the game at 960p and upscale it to 1440p, and it should (assuming FSR 2 works as intended) give an image close to native or better, with anti-aliasing applied.

I too game at 1440p, so this is good to know.

I just have to wonder now if CDPR will work out implementing 2.0 in Cyberpunk, and whether they'll rework their version of TAA, because afaik 1.0 has trouble with its scaling due to how 2077's TAA works, which involves perf drops, ghosting at times, and a blurry image in any sort of motion (even with motion blur completely disabled along with CA).

I'm also hoping this gets devs in the future to rethink how they go about their own AA solutions so they won't disrupt DLSS and FSR 2.0, because the last thing we want is a dev screwing over new tech because they thought their game would look better with a malformed version of AA that other devs aren't using in the same manner.

It already has DLSS....
So adding FSR 2.0 should be like 1-2 days of work (3 max), according to AMD.
It's just a question of whether they can spare (or bother with) a programmer working on it for a day or two.
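The reason the estimate is so low is that FSR 2.0 asks the engine for roughly the same per-frame inputs DLSS 2.x already needs, so a game with DLSS has most of the plumbing done. A rough sketch of that overlap - the labels are descriptive, not the actual field names of either SDK:

```python
# Rough, descriptive comparison of per-frame inputs; these are labels I chose,
# not the real structures of the DLSS or FSR 2.0 SDKs.
DLSS2_INPUTS = {
    "jittered color buffer",
    "depth buffer",
    "motion vectors",
    "camera jitter offsets",
    "exposure value",
}
FSR2_INPUTS = DLSS2_INPUTS | {
    "(optional) reactive mask for transparent/animated surfaces",
}

if __name__ == "__main__":
    shared = DLSS2_INPUTS & FSR2_INPUTS
    extra = FSR2_INPUTS - DLSS2_INPUTS
    print(f"Inputs already wired up for DLSS: {len(shared)} of {len(FSR2_INPUTS)}")
    print("Additional FSR 2.0 work:", ", ".join(sorted(extra)) or "none")
```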



Chazore said:

I'm also hoping this gets devs in the future to rethink how they go about their own AA solutions so they won't disrupt DLSS and FSR 2.0, because the last thing we want is a dev screwing over new tech because they thought their game would look better with a malformed version of AA that other devs aren't using in the same manner.

Now that you mention it, that could be a reason why Nintendo isn't doing any AA in their Switch games, as they'll be upscaled next-gen with DLSS.

Or does the implementation need some sort of AA applied before upscaling?




TomaTito said:
Chazore said:

I'm also hoping this gets devs in the future to rethink how they go about their own AA solutions so they won't disrupt DLSS and FSR 2.0, because the last thing we want is a dev screwing over new tech because they thought their game would look better with a malformed version of AA that other devs aren't using in the same manner.

Now that you mention it, that could be a reason why Nintendo isn't doing any AA in their Switch games, as they'll be upscaled next-gen with DLSS.

Or does the implementation need some sort of AA applied before upscaling?

If the engine they made the game on allows for it, it's supposedly not that bad (DLSS or FSR 2.0).
The thing is, I'm not sure how much Nintendo cared about forward thinking when they designed their games.
Like, when they made the last Pokémon game, did they think "in the future, we might port this to Switch 2, with FSR 2 or DLSS" while planning it? If they did, it's probably not that many extra steps. However, knowing Nintendo... they probably didn't, and it could take a few weeks or months for each game.



Kyuu said:

I wonder what GTX 1070 being the weakest hardware on Nvidia's side to support FSR 2.0...

If FSR 2.0 runs on a GTX 1070, it will also run on a GTX 1060 or GTX 1050 Ti.

It's the same architecture with the same driver.

Just because AMD drew an arbitrary line for their recommendation doesn't mean that slower models with the same architecture can't benefit (with lower settings at 1080p, or with 900p, 800p or 720p output).



JRPGfan said:

From AMD marketing:

"AMD FidelityFX Super Resolution 2.0 technology is a temporal upscaling solution with incredible image quality that is the result of many years of research into upscaling technologies. It has been built by AMD from the ground up to deliver similar or better than native image quality and help boost framerates in supported games."

"FSR 2.0 temporal upscaling uses frame color, depth, and motion vectors in the rendering pipeline and leverages information from past frames to create very high-quality upscaled output and it also includes optimized high-quality anti-aliasing. Spatial upscaling solutions like FSR 1.0 use data from the current frame to create the upscaled output and rely on the separate anti-aliasing incorporated into a game’s rendering pipeline. Because of these differences, FidelityFX Super Resolution 2.0 delivers significantly higher image quality than FSR 1.0 at all quality mode presets and screen resolutions."

Video explaining it:

The short of it?

It boosts performance while giving pretty much near-equal image quality.



Zoomed-in screengrabs (leftmost = native 4K, then FSR 1.0, and lastly FSR 2.0):

PCgameshardware.de has a tool for comparison if you want to see for yourself:

Link here:
https://www.pcgameshardware.de/commoncfm/comparison/clickSwitchNew.cfm?article=1391135&page=1&draft=-1&rank=3

And you conveniently don't mention that some of those are using Quality mode: going from 1440p to "4K". And the final result is still worse than DLSS going from 1080p to 4K years ago. By the time FSR manages to handle 1440p to 4K perfectly, DLSS will be close to converting 720p into 4K perfectly. If FSR ever manages proper 1080p-to-4K conversions many years from now, DLSS might already be flirting with 540p-to-4K and 1080p-to-8K conversions.
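To put those comparisons in raw pixel terms, a quick back-of-the-envelope calculation (plain arithmetic on the resolutions mentioned above, nothing vendor-specific):

```python
# Back-of-the-envelope: output pixels produced per rendered pixel for the
# source/target pairs mentioned above. Plain arithmetic, no vendor data.
RESOLUTIONS = {
    "540p":  (960, 540),
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def pixel_ratio(src: str, dst: str) -> float:
    sw, sh = RESOLUTIONS[src]
    dw, dh = RESOLUTIONS[dst]
    return (dw * dh) / (sw * sh)

if __name__ == "__main__":
    for src, dst in [("1440p", "4K"), ("1080p", "4K"), ("720p", "4K"),
                     ("540p", "4K"), ("1080p", "8K")]:
        print(f"{src} -> {dst}: {pixel_ratio(src, dst):.2f}x the pixels")
    # 1440p -> 4K is a 2.25x reconstruction job, 1080p -> 4K is 4x,
    # 720p -> 4K is 9x, and 540p -> 4K or 1080p -> 8K are both 16x.
```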



Kyuu said:
Conina said:

If FSR 2.0 runs on a GTX 1070, it will also run on a GTX 1060 or GTX 1050 Ti.

It's the same architecture with the same driver.

Just because AMD drew an arbitrary line for their recommendation doesn't mean that slower models with the same architecture can't benefit (with lower settings at 1080p, or with 900p, 800p or 720p output).

As a GTX 1060 6GB owner, I sure hope you're right and my PC can benefit from it without any weird workarounds!

Probably AMD didn't want to advertise FSR 2.0 for the GTX 1060 due to its high share on Steam.

The GTX 1070 - 1080 Ti plus the GTX 16xx series already make up more than 20% of the Steam hardware base. With the GTX 1060 included, 30% of the Steam hardware base would benefit from FSR 2.0 (all of the competitor's GPUs without DLSS access).

Meanwhile, only 5% of the Steam hardware base that will benefit from FSR 2.0 are AMD's own GPUs. Even if they include the other Polaris GPUs (which I expect) and the R9 series (which won't happen), twice as many of the competitor's legacy GPUs would benefit from FSR 2.0 as their own legacy GPUs.

With the GTX 1060 officially included, the ratio would be over 3:1 in favor of the competitor's benefit.