
How much do you care about the graphical leap between consoles at this point?

Pemalite said:
DonFerrari said:

Hey, I guess you made a slight mistake by putting the increase instead of the decrease, since you can't reduce something by 113%.

Yeah, I did. But essentially you are decreasing by more than half. - Couldn't be bothered to redo the calculation, but my point still stands.
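As a quick sanity check on that claim, here is a minimal sketch in Python, assuming the 113% figure quoted above is the increase being converted:

# A +113% increase and the corresponding decrease are not the same percentage.
increase = 1.13                  # +113% expressed as a fraction
ratio = 1.0 + increase           # the larger value is 2.13x the smaller one
decrease = 1.0 - 1.0 / ratio     # going from the larger value back down to the smaller
print(f"{decrease:.0%}")         # ~53%, i.e. "more than half"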

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, meanwhile, was chosen for cost and actually made the power consumption worse.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you can make a 100 USD card and a 1000 USD card of the same generation look almost the same just because one has DLSS and the other doesn't. And if the 1000 USD card also uses DLSS, what would its GPU budget be spent on instead?




DonFerrari said:
Pemalite said:

Yeah, I did. But essentially you are decreasing by more than half. - Couldn't be bothered to redo the calculation, but my point still stands.

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, meanwhile, was chosen for cost and actually made the power consumption worse.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you can make a 100 USD card and a 1000 USD card of the same generation look almost the same just because one has DLSS and the other doesn't. And if the 1000 USD card also uses DLSS, what would its GPU budget be spent on instead?

What you're saying here doesn't really make sense. A $100 DLSS card versus a $1000 DLSS card... the $1000 Nvidia card will still have DLSS, so it will also be able to process high end ray tracing effects and things of that nature.

Not that $100 modern Nvidia cards are even available; the cheapest DLSS-capable card in production, the RTX 2060 Super, is $400.

It's only against AMD that the comparison becomes valid: AMD doesn't have DLSS, which means cheaper Nvidia cards can outperform their more expensive ones. But an Nvidia-to-Nvidia comparison doesn't work the same way.



DonFerrari said:
Pemalite said:

Yeah, I did. But essentially you are decreasing by more than half. - Couldn't be bothered to redo the calculation, but my point still stands.

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, meanwhile, was chosen for cost and actually made the power consumption worse.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you can make a 100 USD card and a 1000 USD card of the same generation look almost the same just because one has DLSS and the other doesn't. And if the 1000 USD card also uses DLSS, what would its GPU budget be spent on instead?

DLSS doesn't increase lighting, texture detail, geometric complexity and such.
It just cleans up the image so it seems higher resolution than it actually is... This is a path we have been going down for years; Sony put the approach front-and-center with checkerboard rendering on the Playstation 4 Pro... And, like DLSS, it brought caveats such as artifacts with it.

So it will never turn a $100 GPU into a $1,000 one; nVidia wouldn't cannibalize its Geforce Titan profit margins like that.

Not only that, but nothing is stopping anyone from taking that high-end GPU, setting all the graphics to 11 and doing the exact same thing, just better.

The idea with any frame reconstruction is that you can sit at a decent resolution (again, every GPU has an optimal resolution range!) in order to bolster visual effects, and then just rely on upscaling the image.
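To put rough numbers on that idea (the resolutions below are illustrative assumptions, not figures from any particular game):

# Pixels shaded per frame at native 4K versus a lower internal resolution
# that is then reconstructed/upscaled back to 4K.
native_4k = 3840 * 2160          # ~8.3 million pixels
internal = 2560 * 1440           # ~3.7 million pixels
saved = 1 - internal / native_4k
print(f"{saved:.0%} fewer pixels to shade")   # ~56%; that budget can go to effects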

Soundwave said:

What you're saying here doesn't really make sense. A $100 DLSS card versus a $1000 DLSS card... the $1000 Nvidia card will still have DLSS, so it will also be able to process high end ray tracing effects and things of that nature.

Not that $100 modern Nvidia cards are even available; the cheapest DLSS-capable card in production, the RTX 2060 Super, is $400.

It's only against AMD that the comparison becomes valid: AMD doesn't have DLSS, which means cheaper Nvidia cards can outperform their more expensive ones. But an Nvidia-to-Nvidia comparison doesn't work the same way.

AMD has its own alternative technologies to DLSS, such as DirectML. - AMD's approach shifts the burden from itself (as it's relying on Microsoft's technology) to developers though, whereas nVidia is using a proprietary approach.
https://www.overclock3d.net/news/software/microsoft_s_directml_is_the_next-generation_game-changer_that_nobody_s_talking_about/1

DirectML, which next-gen consoles will have:


You also have other approaches like AMD's image sharpening:
https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

Obviously temporal reconstruction, checkerboarding and other frame reconstruction techniques are approaches developers have taken, or will take, to achieve fake 4K - which is "good enough".
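For a sense of what a post-process sharpening pass does, here is a minimal unsharp-mask sketch in Python/NumPy. It is a generic illustration only, not AMD's actual Radeon Image Sharpening/CAS algorithm:

import numpy as np

def sharpen(img, amount=0.5):
    # Toy unsharp mask on a 2D grayscale image with values in [0, 1]:
    # blur the image, then add back a scaled copy of the lost detail.
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    blur = sum(padded[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0   # 3x3 box blur
    return np.clip(img + amount * (img - blur), 0.0, 1.0)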




Pemalite said:
DonFerrari said:

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, meanwhile, was chosen for cost and actually made the power consumption worse.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you can make a 100 USD card and a 1000 USD card of the same generation look almost the same just because one has DLSS and the other doesn't. And if the 1000 USD card also uses DLSS, what would its GPU budget be spent on instead?

DLSS doesn't increase lighting, texture detail, geometric complexity and such.
It just cleans up the image so it seems higher resolution than it actually is... This is a path we have been going down for years; Sony put the approach front-and-center with checkerboard rendering on the Playstation 4 Pro... And, like DLSS, it brought caveats such as artifacts with it.

So it will never turn a $100 GPU into a $1,000 one; nVidia wouldn't cannibalize its Geforce Titan profit margins like that.

Not only that, but nothing is stopping anyone from taking that high-end GPU, setting all the graphics to 11 and doing the exact same thing, just better.

The idea with any frame reconstruction is that you can sit at a decent resolution (again, every GPU has an optimal resolution range!) in order to bolster visual effects, and then just rely on upscaling the image.

Soundwave said:

What you're saying here doesn't really make sense. A $100 DLSS card versus a $1000 DLSS card... the $1000 Nvidia card will still have DLSS, so it will also be able to process high end ray tracing effects and things of that nature.

Not that $100 modern Nvidia cards are even available; the cheapest DLSS-capable card in production, the RTX 2060 Super, is $400.

It's only against AMD that the comparison becomes valid: AMD doesn't have DLSS, which means cheaper Nvidia cards can outperform their more expensive ones. But an Nvidia-to-Nvidia comparison doesn't work the same way.

AMD has its own alternative technologies to DLSS, such as DirectML. - AMD's approach shifts the burden from itself (as it's relying on Microsoft's technology) to developers though, whereas nVidia is using a proprietary approach.
https://www.overclock3d.net/news/software/microsoft_s_directml_is_the_next-generation_game-changer_that_nobody_s_talking_about/1

DirectML, which next-gen consoles will have:


You also have other approaches like AMD's image sharpening:
https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

Obviously temporal reconstruction, checkerboarding and other frame reconstruction techniques are approaches developers have taken, or will take, to achieve fake 4K - which is "good enough".

Thanks, it was more or less what I expected: a better reconstruction technique than what is available from AMD (perhaps much better), but not enough to overcome a, let's say, 10x performance gap between two cards.

Sure, it may help the Switch 2 get even more ports with less trouble for devs, but it won't bring quality as close as Soundwave is claiming.




The last time I recall new graphics blowing my mind was probably with the pre-rendered era: Donkey Kong Country, Myst, FF7, and Resident Evil 2.

N64/PSX to Dreamcast was probably the last jump I really cared about. The jump from Dreamcast/PS2 to Wii/PS3 was nowhere near as impressive. The main problem with older generations is that they don't always translate well to more recent screen ratios and resolutions. But other than that, I don't see as substantial a difference between games of 2001 and 2020 as I did between 1988 and 1992, 1992 and 1996, or 1998 and 2001. Those were all gigantic leaps; that doesn't happen any longer.



Jumpin said:

The last time I recall new graphics blowing my mind was probably with the pre-rendered era: Donkey Kong Country, Myst, FF7, and Resident Evil 2.

N64/PSX to Dreamcast was probably the last jump I really cared about. The jump from Dreamcast/PS2 to Wii/PS3 was nowhere near as impressive. The main problem with older generations is that they don't always translate well to more recent screen ratios and resolutions. But other than that, I don't see as substantial a difference between games of 2001 and 2020 as I did between 1988 and 1992, 1992 and 1996, or 1998 and 2001. Those were all gigantic leaps; that doesn't happen any longer.

You should play Death Stranding; the graphics blew my mind several times. It's gotten to the point where what's behind the screen feels real and breathtaking.



Jumpin said:

The last time I recall new graphics blowing my mind was probably with the pre-rendered era: Donkey Kong Country, Myst, FF7, and Resident Evil 2.

N64/PSX to Dreamcast was probably the last jump I really cared about. The jump from Dreamcast/PS2 to Wii/PS3 was nowhere near as impressive. The main problem with older generations is that they don't always translate well to more recent screen ratios and resolutions. But other than that, I don't see as substantial a difference between games of 2001 and 2020 as I did between 1988 and 1992, 1992 and 1996, or 1998 and 2001. Those were all gigantic leaps; that doesn't happen any longer.

Top shelf console graphics in 2001:

Soon-to-be-replaced console graphics in 2020:

Not a gigantic leap? Really?


Pemalite said:
DonFerrari said:

Sure thing. Nintendo capped the performance of both the CPU and GPU, either for cost or for battery consumption. The display, meanwhile, was chosen for cost and actually made the power consumption worse.

I didn't see your opinion on DLSS itself and how much it will really change the landscape. I have a hard time believing you can make a 100 USD card and a 1000 USD card of the same generation look almost the same just because one has DLSS and the other doesn't. And if the 1000 USD card also uses DLSS, what would its GPU budget be spent on instead?

DLSS doesn't increase lighting, texture detail, geometric complexity and such.
It just cleans up the image so it seems higher resolution than it actually is... This is a path we have been going down for years; Sony put the approach front-and-center with checkerboard rendering on the Playstation 4 Pro... And, like DLSS, it brought caveats such as artifacts with it.

So it will never turn a $100 GPU into a $1,000 one; nVidia wouldn't cannibalize its Geforce Titan profit margins like that.

Not only that, but nothing is stopping anyone from taking that high-end GPU, setting all the graphics to 11 and doing the exact same thing, just better.

The idea with any frame reconstruction is that you can sit at a decent resolution (again, every GPU has an optimal resolution range!) in order to bolster visual effects, and then just rely on upscaling the image.

Soundwave said:

What you're saying here doesn't really make sense. A $100 DLSS card versus a $1000 DLSS card... the $1000 Nvidia card will still have DLSS, so it will also be able to process high end ray tracing effects and things of that nature.

Not that $100 modern Nvidia cards are even available; the cheapest DLSS-capable card in production, the RTX 2060 Super, is $400.

It's only against AMD that the comparison becomes valid: AMD doesn't have DLSS, which means cheaper Nvidia cards can outperform their more expensive ones. But an Nvidia-to-Nvidia comparison doesn't work the same way.

AMD has its own alternative technologies to DLSS, such as DirectML. - AMD's approach shifts the burden from itself (as it's relying on Microsoft's technology) to developers though, whereas nVidia is using a proprietary approach.
https://www.overclock3d.net/news/software/microsoft_s_directml_is_the_next-generation_game-changer_that_nobody_s_talking_about/1

DirectML, which next-gen consoles will have:


You also have other approaches like AMD's image sharpening:
https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

Obviously temporal reconstruction, checkerboarding and other frame reconstruction techniques are approaches developers have taken, or will take, to achieve fake 4K - which is "good enough".

I had a look at the comparisons; it doesn't look like a game changer by any stretch. And that car pic sharpening looks like a bullshot to me.



DonFerrari said:

Thanks, it was more or less what I expected: a better reconstruction technique than what is available from AMD (perhaps much better), but not enough to overcome a, let's say, 10x performance gap between two cards.

Sure, it may help the Switch 2 get even more ports with less trouble for devs, but it won't bring quality as close as Soundwave is claiming.

It won't bring a theoretical Switch 2 console any closer to next-gen devices than normal, because next-gen has the same technology available.
Developers can thus reduce the rendering resolution, redirect that additional processing to bolster visuals, and then do an upscale on a Playstation 5/Xbox Series X.

It's an efficiency improvement, not one that is exclusive to nVidia or the Switch 2... The entire industry is going to start using this technology; nVidia just happened to be ahead of the curve. (Just like with ray tracing, to be fair.)
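As a toy example of that "render low, upscale for output" structure (nearest-neighbour only, with assumed resolutions; the real reconstruction passes being discussed are far smarter than this):

import numpy as np

def upscale_nearest(img, factor=2):
    # Naive nearest-neighbour upscale: duplicate each pixel factor x factor times.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

internal = np.random.rand(1080, 1920)    # frame rendered internally at 1080p
output = upscale_nearest(internal, 2)    # presented at 2160p
print(output.shape)                      # (2160, 3840)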

KratosLives said:

I had a look at the comparisons; it doesn't look like a game changer by any stretch. And that car pic sharpening looks like a bullshot to me.

It's not a bullshot.
If you bothered to read the link, you would see that it was demonstrated in real time via DirectML.

Keep in mind that it is upsampling a photo, not upsampling in-game visuals.




Focusing on "photorealistic" graphics is a rabbit hole not worth chasing. These games never hold up over time, and I for one would rather boot up a game in 5 years like BotW, Mario Odyssey, Luigi's Mansion 3, etc. The art direction of those games will allow them to hold up better over time, and with a bigger focus on GAMEPLAY they are a superior experience compared to the rinse-and-repeat cycle of Microsoft/Sony IPs. And besides, if you want a focus on graphics, go PC.



Nintendo with the Switch: