
VGC: Switch 2 Was Shown At Gamescom Running Matrix Awakens UE5 Demo

zeldaring said:
Soundwave said:

Do you ask Digital Foundry when to poop and pee too, or do you have your own opinion on these things? lol. Digital Foundry's job is to slow things down and analyze them frame by frame; no one actually plays video games that way.

Again here is Spider-Man Miles Morales on the PS4 versus the PS5. 

I think any reasonable person can say the PS4 version holds up extremely well. Shockingly well, actually. Frankly, the PS5 is a rather underwhelming generational difference. If Switch 2 runs Tears of the Kingdom at 60 frames per second, I'm not falling out of my seat in amazement. Sorry, but I don't think a game that runs fine on a previous-generation console, just at double the frame rate, is a "holy crap!" moment. In the past we'd laugh at a generational leap that small.

There's honestly nothing on the PS5/XSX that makes me go "wow, what a huge generational leap" the way PS3 over PS2 or PS2 over PS1 did. The most impressive thing I've seen on the PS5/XSX three years in is The Matrix Awakens demo; that's really the only thing that made me go "whoa, OK, that is a step up," and it's not even an actual game, lol. Probably because it would cost a fortune to make a full game look like that. Diminishing returns are definitely setting in. Getting to visual fidelity wildly past the PS4 means trending into photorealism, and for visuals of that caliber I think you're talking about a budget that even most big-ticket studios cannot sustain.

A lot of the power of the PS5/XSX is being sucked away by having to push a ridiculous number of pixels (4K) and by calculating lighting bounces/reflections that, half the time, you have to stop and study closely to even notice. Studios are fine with that because they don't want to hire an art staff the size of a Hollywood movie's to do their graphics anyway (so they don't really want to go far beyond a PS4 tier of graphics). Most PS5 games just look like PS4 games on steroids running at a higher resolution.

We heard the same thing with Wii U ports: the newer architecture would make games run better than on the 360, but the majority didn't. I don't expect that with Switch 2, but the best thing to do is wait and see the ports; that honestly gives the best idea of where Switch 2 is at.

False equivalence: the Switch, which wasn't too different from the Wii U in terms of pure TFLOPS, managed without breaking a sweat, and that's because of the ARM/NVIDIA architecture. The Wii U opted for IBM hardware that was not up to par with x86 architectures.

But indeed, the wait-and-see approach is always the best.



Switch Friend Code : 3905-6122-2909 


60 fps vs 30 fps in a last-gen game is supposed to be impressive for $500? lol. Am I supposed to fall off my couch when Tears of the Kingdom on Switch 2 runs at 60 fps with a higher resolution?

We really have hit diminishing returns hard.

Doesn't surprise me when you look at it, though. The PS4 was 1.84 teraflops and the PS5 is 10.28 teraflops (yeah, yeah, different architectures, but still), but right off the hop, 4K basically requires the PS5 to render 4x the pixels, so a lot of that power increase is neutered right off the bat. Start throwing in real-time reflections and the rest of your gain goes straight down the toilet.

Whereas the PS2 was 6.2 gigaflops and the Xbox 360 was around 250 gigaflops ... that's a monstrous upgrade, even though the Xbox 360 had to render at 720p versus 480p. And the 360 came just 5 years after the PS2.
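
Just to put rough numbers on that, here's a back-of-the-envelope sketch using the commonly quoted peak-flops figures (which gloss over all the architectural differences):

```python
# Back-of-the-envelope: how much of each generation's flops jump
# survives after paying for the resolution bump. Peak-flops figures
# are the commonly quoted ones and ignore architectural differences.

def per_pixel_gain(old_flops, new_flops, old_res, new_res):
    """Raw flops gain divided by the growth in pixels rendered."""
    pixel_growth = (new_res[0] * new_res[1]) / (old_res[0] * old_res[1])
    return (new_flops / old_flops) / pixel_growth

# PS4 (1.84 TFLOPS, 1080p) -> PS5 (10.28 TFLOPS, 4K)
print(per_pixel_gain(1.84, 10.28, (1920, 1080), (3840, 2160)))  # ~1.4x per pixel

# PS2 (6.2 GFLOPS, 480p) -> Xbox 360 (~250 GFLOPS, 720p)
print(per_pixel_gain(6.2, 250, (640, 480), (1280, 720)))        # ~13x per pixel
```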

Here's God of War Ragnarok on PS5 vs PS4 ... and GoW Ragnarok is one of the better-looking PS5 games; it's not like it looks dated compared to others on the PS5.

We're supposed to be excited by this kind of jump? lol. I'd be surprised if an Average Joe could really even tell much of a difference here.



Mar1217 said:
zeldaring said:

We heard the same thing with Wii U ports: the newer architecture would make games run better than on the 360, but the majority didn't. I don't expect that with Switch 2, but the best thing to do is wait and see the ports; that honestly gives the best idea of where Switch 2 is at.

False equivalence: the Switch, which wasn't too different from the Wii U in terms of pure TFLOPS, managed without breaking a sweat, and that's because of the ARM/NVIDIA architecture. The Wii U opted for IBM hardware that was not up to par with x86 architectures.

But indeed, the wait-and-see approach is always the best.

Ummm, the Switch has like 2x the GPU power of the Wii U.

https://thegamingsetup.com/console-power-comparison-chart
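
For what it's worth, the commonly cited peak figures back that up (ballpark numbers only; they vary by source and by docked/portable clocks):

```python
# Rough GPU comparison using commonly cited peak FP32 figures.
# Ballpark only; exact numbers vary by source and clock mode.
WIIU_GFLOPS = 176             # Latte GPU, common estimate
SWITCH_DOCKED_GFLOPS = 393    # Tegra X1, 256 CUDA cores @ 768 MHz
SWITCH_PORTABLE_GFLOPS = 196  # Tegra X1 @ ~384 MHz

print(SWITCH_DOCKED_GFLOPS / WIIU_GFLOPS)    # ~2.2x docked
print(SWITCH_PORTABLE_GFLOPS / WIIU_GFLOPS)  # ~1.1x portable
```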



Soundwave said:

IMO after you get past PS4-tier graphics, the differences in visuals become much more subtle, because either you invest that horsepower in subtle effects like light bounces/reflections or ... you spend a Hollywood-movie-style budget on higher-end models/visuals ... and the second option is not possible for most studios; even huge studios can't be making $200 million video games. Graphics don't magically come for free. The other thing is that lighting doesn't scale linearly like people think. Just because you have 5x more powerful hardware doesn't mean your lighting is going to look 5x better. It means your GPU can get bogged down calculating light bounces that basically force it to max out, while the end result on screen actually isn't that big of a difference.

This isn't fully true. Heavily improved lighting from path tracing is gonna make a big difference to visuals when it becomes standard. Diminishing returns are setting in, but there's still a long way to go from PS4-tier graphics.



Soundwave said:

60 fps vs 30 fps in a last-gen game is supposed to be impressive for $500? lol. Am I supposed to fall off my couch when Tears of the Kingdom on Switch 2 runs at 60 fps with a higher resolution?

We really have hit diminishing returns hard.

Doesn't surprise me when you look at it, though. The PS4 was 1.84 teraflops and the PS5 is 10.28 teraflops (yeah, yeah, different architectures, but still), but right off the hop, 4K basically requires the PS5 to render 4x the pixels, so a lot of that power increase is neutered right off the bat. Start throwing in real-time reflections and the rest of your gain goes straight down the toilet.

Whereas the PS2 was 6.2 gigaflops and the Xbox 360 was around 250 gigaflops ... that's a monstrous upgrade, even though the Xbox 360 had to render at 720p versus 480p. And the 360 came just 5 years after the PS2.

And yet despite that "monstrous" upgrade, if you go back and look at the titles released in the 360's first year, they pretty much just look like HD versions of PS2 games. It wasn't until the likes of Gears of War started showing up a year into the 360's lifecycle, with engines properly architected to make use of the new hardware features, that we really started seeing what the HD consoles could do. Likewise, early-generation PS4/Xbox One games weren't a massive graphical leap over the previous generation.

Diminishing returns are probably becoming a factor, especially when it comes to memory (which "only" doubled between the PS4/XB1 and PS5/XSX generations, albeit with a much faster storage subsystem added), but new console generations taking a while to prove their worth over their predecessors isn't as new as you might think.



Norion said:
Soundwave said:

IMO after you get past PS4-tier graphics, the differences in visuals become much more subtle, because either you invest that horsepower in subtle effects like light bounces/reflections or ... you spend a Hollywood-movie-style budget on higher-end models/visuals ... and the second option is not possible for most studios; even huge studios can't be making $200 million video games. Graphics don't magically come for free. The other thing is that lighting doesn't scale linearly like people think. Just because you have 5x more powerful hardware doesn't mean your lighting is going to look 5x better. It means your GPU can get bogged down calculating light bounces that basically force it to max out, while the end result on screen actually isn't that big of a difference.

This isn't fully true. Heavily improved lighting from path tracing is gonna make a big difference to visuals when it becomes standard. Diminishing returns are setting in, but there's still a long way to go from PS4-tier graphics.

To be honest, not really. You can cheat a lot with baked lighting to make it look similar.

If you want "way bettererer graphics" than the PS4 era, the budgets become prohibitive. Movies cost $200 million for maybe 45 minutes' worth of effects shots/CGI ... a video game needs to be like 30 hours long and, in many cases, have 5x as many environments, and no artist is just magically going to work on this stuff for free. You have to pay people to do that, and even as a huge studio, what happens if you spend $150 million on a game (before marketing) and it doesn't do as well as you hoped? Bravo, now your entire studio is bankrupt.

You have to massively expand your staff to get games to look far better than the PS4, because really a PS4 can already produce fairly realistic-looking visuals. Like, what is the top next-gen game right now? Starfield? Starfield doesn't look that much better than God of War Ragnarok, which runs fine on the PS4.

Also, no console can really ever do full-blown ray tracing. For movie effects, the light bounces in a scene take hours to render for a single frame, and a few seconds of footage can take days on massive render farms at top quality.

It's fool's gold IMO. You're better off just going with baked lighting and letting a talented visual artist tweak the look area by area, because the performance cost is absurd.

Even a PS2 game could cripple a PS5 if you really crank the light bounces to be accurate; to me it's not worth it. Some random kid in Nebraska playing your game isn't really going to appreciate or give a crap what those light rays are doing.
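
To put a rough number on why accurate bounces are so brutal in real time, here's a sketch of the ray budget; every parameter is an illustrative assumption, not a measured spec:

```python
# Sketch of the ray budget real-time path tracing would need.
# Every parameter is an illustrative assumption, not a hardware spec.
WIDTH, HEIGHT = 3840, 2160  # 4K output
SAMPLES_PER_PIXEL = 64      # offline renderers often use hundreds or more
BOUNCES = 4                 # rays traced per sample path
FPS = 60

rays_per_second = WIDTH * HEIGHT * SAMPLES_PER_PIXEL * BOUNCES * FPS
print(f"{rays_per_second / 1e9:.0f} billion rays/s")  # ~127 billion

# Console-class ray-tracing hardware is on the order of ~10 billion
# rays/s, which is why real-time RT leans on a few samples per pixel
# plus aggressive denoising instead of brute-force accuracy.
```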



Soundwave said:
Norion said:

This isn't fully true. Heavily improved lighting from path tracing is gonna make a big difference to visuals when it becomes standard. Diminishing returns are setting in, but there's still a long way to go from PS4-tier graphics.

To be honest, not really. You can cheat a lot with baked lighting to make it look similar.

If you want "way bettererer graphics" than the PS4 era, the budgets become prohibitive. Movies cost $200 million for maybe 45 minutes' worth of effects shots/CGI ... a video game needs to be like 30 hours long and, in many cases, have 5x as many environments, and no artist is just magically going to work on this stuff for free. You have to pay people to do that, and even as a huge studio, what happens if you spend $150 million on a game (before marketing) and it doesn't do as well as you hoped? Bravo, now your entire studio is bankrupt.

You have to massively expand your staff to get games to look far better than the PS4, because really a PS4 can already produce fairly realistic-looking visuals. Like, what is the top next-gen game right now? Starfield? Starfield doesn't look that much better than God of War Ragnarok, which runs fine on the PS4.

Also, no console can really ever do full-blown ray tracing. For movie effects, the light bounces in a scene take hours to render for a single frame, and a few seconds of footage can take days on massive render farms at top quality.

It's fool's gold IMO. You're better off just going with baked lighting and letting a talented visual artist tweak the look area by area, because the performance cost is absurd.

Even a PS2 game could cripple a PS5 if you really crank the light bounces to be accurate; to me it's not worth it. Some random kid in Nebraska playing your game isn't really going to appreciate or give a crap what those light rays are doing.

If you haven't watched the Digital Foundry coverage of the Cyberpunk path tracing update, you should; it really does make a big difference to the visuals. What you're missing is that improved lighting from ray tracing, and later on path tracing, is going to make game development easier. Look at how much better Mario 64 looks with improved lighting. You don't need a big budget to take advantage of this large boost to visual fidelity, so even indie developers making games with simple visuals are gonna benefit from it. The performance cost is massive now, but the PS6 and next Xbox will be able to do path tracing, so it's just a matter of time.



Regarding diminishing returns, that would be resolution: 2K versus 4K, not much difference. 30 fps versus 60 fps is massive. As for lighting, it matters significantly on the right display. On a cheap $200 Costco TV the improved lighting is maybe meh, but on a high-end OLED it's night and day.

My main point is that the one thing we don't need is 4K, native or upscaled. 2K is fine, and the power should go to fps.
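
Framed as raw pixel throughput, the trade looks like this (a simple sketch that ignores the fact that per-pixel shading cost isn't constant across modes):

```python
# Raw pixel throughput of common resolution/frame-rate targets.
# Ignores that per-pixel shading work isn't constant across modes.
modes = {
    "1080p @ 60": (1920 * 1080, 60),
    "1440p @ 60": (2560 * 1440, 60),
    "4K @ 30":    (3840 * 2160, 30),
    "4K @ 60":    (3840 * 2160, 60),
}

for name, (pixels, fps) in modes.items():
    print(f"{name}: {pixels * fps / 1e6:.0f}M pixels/s")

# 1080p@60 ~124M, 1440p@60 ~221M, 4K@30 ~249M, 4K@60 ~498M.
# 4K/60 costs 4x the raw throughput of 1080p/60 before anything
# actually looks better.
```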



720p to 1080p felt like a bigger jump than 1080p to 4K. 4K has been underwhelming; yeah, it looks nice, but it didn't have that wow factor. Maybe when we move to 8K.

I would take 1080p 60 fps over 4K 30 fps any day.



Bite my shiny metal cockpit!

Norion said:
Soundwave said:

To be honest, not really. You can cheat a lot with baked lighting to make it look similar.

If you want "way bettererer graphics" than the PS4 era, the budgets become prohibitive. Movies cost $200 million for maybe 45 minutes' worth of effects shots/CGI ... a video game needs to be like 30 hours long and, in many cases, have 5x as many environments, and no artist is just magically going to work on this stuff for free. You have to pay people to do that, and even as a huge studio, what happens if you spend $150 million on a game (before marketing) and it doesn't do as well as you hoped? Bravo, now your entire studio is bankrupt.

You have to massively expand your staff to get games to look far better than the PS4, because really a PS4 can already produce fairly realistic-looking visuals. Like, what is the top next-gen game right now? Starfield? Starfield doesn't look that much better than God of War Ragnarok, which runs fine on the PS4.

Also, no console can really ever do full-blown ray tracing. For movie effects, the light bounces in a scene take hours to render for a single frame, and a few seconds of footage can take days on massive render farms at top quality.

It's fool's gold IMO. You're better off just going with baked lighting and letting a talented visual artist tweak the look area by area, because the performance cost is absurd.

Even a PS2 game could cripple a PS5 if you really crank the light bounces to be accurate; to me it's not worth it. Some random kid in Nebraska playing your game isn't really going to appreciate or give a crap what those light rays are doing.

If you haven't watched the Digital Foundry coverage of the Cyberpunk path tracing update, you should; it really does make a big difference to the visuals. What you're missing is that improved lighting from ray tracing, and later on path tracing, is going to make game development easier. Look at how much better Mario 64 looks with improved lighting. You don't need a big budget to take advantage of this large boost to visual fidelity, so even indie developers making games with simple visuals are gonna benefit from it. The performance cost is massive now, but the PS6 and next Xbox will be able to do path tracing, so it's just a matter of time.

The performance cost for truly accurate lighting will always be enormous. 

Hollywood movies still take hours to render a single frame, largely because even several GPUs need that much time to accurately account for the light bounces. And these are workstations that put a PS5 to shame; they probably have more performance today than a PS6 will have. If they could render at even 3 frames per second, they obviously would.
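
For scale, a quick sketch of that gap; the hour-per-frame figure is illustrative, since film render times range from minutes to many hours per frame:

```python
# Gap between an offline film render and a real-time frame budget.
# The hour-per-frame figure is an illustrative assumption.
OFFLINE_SECONDS_PER_FRAME = 60 * 60  # 1 hour per frame
REALTIME_BUDGET_SECONDS = 1 / 60     # ~16.7 ms at 60 fps

print(f"{OFFLINE_SECONDS_PER_FRAME / REALTIME_BUDGET_SECONDS:,.0f}x")
# 216,000x: the shortfall that real-time rendering has to fake away.
```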

A game console working in real time is always going to have to fake it, and even that will absolutely tank its performance.

Yeah, Mario 64 has great, cool reflections on the water (in a cherry-picked area to show them off), but this is a game from 1996 that needs a $1500 GPU to run like that, lol, which kinda proves the point.