
How do the visuals on the Nintendo Switch compare to those of the Xbox 360 & PS3?

 

The Nintendo Switch hardware is...

A big leap over 7th gen: 71 votes (40.11%)
A minor leap over 7th gen: 72 votes (40.68%)
About the same as 7th gen: 24 votes (13.56%)
Actually WORSE than last gen: 10 votes (5.65%)
Total: 177 votes
SegataSanshiro said:
Miyamotoo said:

First, take a look at that list; most of those things don't exist in GTA V on PS3/360, and they are quite impressive and advanced things for Wii U hardware, not to mention the older PS3/360 hardware. Empty forest areas!? So we're ignoring the huge wildlife of Zelda BotW, with small and big animals, bugs, enemies, lakes, rivers..!? You're also ignoring the fact that BotW's huge world is built so the player has extensive physics-based interaction with it (setting grass on fire, fire spreading in the direction of the wind, wind that affects cloud formation, wind that affects the player and enemies, lightning that's attracted by metal and can set grass on fire, cutting down trees that fall into water and float away, very impressive enemy AI, ponds that grow and vanish based on the weather...). All of those things are very demanding.

 

Talking about shading and lighting, you don't know what you're talking about; if you looked at the link I posted you would see for yourself. Textures are better in GTA V because good textures tend to look better in a city environment than in a wild world, but the world of Zelda BotW is much bigger, physics-based, and heavily interactive, so it's much more demanding. Korok Forest has FPS issues, but GTA V on PS3/360 is also often below 30 FPS.

Talking about lighting, shading and effects:

I've read Brainchild's stuff before on Era and that is a dude who truly knows his stuff. He's like the Neil deGrasse Tyson of explaining how games work lol.

Yeah, but according to @quickrick he is just some random Nintendo fanboy. :D

 

Solid-Stark said:
It's basically PS3-like visuals rendered at full 720p/900p/1080p, where the PS3 was usually sub-720p.

Actually, I think there is no single multiplatform game available on PS3/360 that runs on Switch at 720p; most games run at 1080p, while a few of them run at 900p. Also, because of its modern tech/architecture, the Switch supports most modern effects that the PS3 didn't.

 

GhaudePhaede010 said:
Switch barely looks better than Wii U. That should be all you need to know.

But that's wrong. For instance, ARMS looks quite good and runs at 1080p/60 FPS, MK8D on Switch is 1080p compared to 720p on Wii U, and Mario Odyssey looks much better and is much more demanding than Mario 3D World while running at 900p... and we're still talking only about the Switch's first-year games; later games will be more advanced and more impressive.




I think the Switch could handle GTA 5, so I'd say it's the same or slightly worse than PS3/360. I only got Odyssey and Xenoblade 2 for the Switch, and neither of these games looks better than The Last of Us or Uncharted 3.



To me the biggest difference is that most PS3/360 games aimed for 720p at 30 FPS while most Switch games aim for 1080p at 60 FPS (lower for open world games). That's a huge difference for me and really all I need. If Mario Kart 9 runs at 1080p/60 FPS with better effects than Mario Kart 8, that's perfectly fine in my book. Same goes for the next Zelda game if it runs at 1080p/30 FPS with some other improvements.
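For a rough sense of what that jump in targets costs, here's a back-of-the-envelope sketch in Python (pixel count only; real GPU cost doesn't scale linearly with resolution, so treat the ratio as illustrative):

```python
# Raw pixel-throughput comparison of the two targets mentioned above.
def pixels_per_second(width, height, fps):
    return width * height * fps

last_gen_target = pixels_per_second(1280, 720, 30)   # 720p30
switch_target = pixels_per_second(1920, 1080, 60)    # 1080p60

print(f"720p30:  {last_gen_target / 1e6:.1f}M pixels/s")   # ~27.6M
print(f"1080p60: {switch_target / 1e6:.1f}M pixels/s")     # ~124.4M
print(f"ratio:   {switch_target / last_gen_target:.1f}x")  # ~4.5x
```

That ratio is why "same visuals, higher resolution and frame rate" is still a meaningful hardware gap.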



curl-6 said:
bonzobanana said:

Watch Dogs on Wii U vs the PS3 and 360 versions is a massive win for the 360 and PS3, because they both utilise their hard drives to stream in data quickly. The lack of a hard drive in the Wii U was a big issue, and I don't think the extra 500MB was enough to compensate, although the Wii U did have 2GB of memory, 4x the 360 and PS3, both of which set aside a small amount of memory for their background operating system (not much, though).

My point is that Watch Dogs is a similar game to GTA and struggled on Wii U, while Zelda, beautiful as it is, makes no effort to present a realistic world; visually it is more of an update on Zelda Wind Waker. Games like Zelda should not be used for comparison because they simply don't test the hardware. They don't attempt realistic physics, textures, lighting, etc. However visually pleasing they are, it is not really fair. Frankly, it's the reason Nintendo doesn't need strong hardware: most of their own games (if not all) are done in a cartoon style. The fact that PCs are emulating Zelda BOTW in 8K at high frame rates pretty much shows the underlying hardware. You aren't seeing that for PS3 or 360 games. Wind Waker on GameCube was achieved with only 8 GFLOPS of GPU performance and still looked fantastic.

https://www.youtube.com/watch?v=qF1Itaye-z8

Everyone realistically knows Zelda BOTW would run on 360 and PS3; it's just a question of how the game would compare, and it could certainly use hard drive streaming of data to compensate for the lack of main memory. The core engine of Zelda BOTW doesn't look particularly complicated or resource-hungry.

It's a mistake to assume that a game not being realistic means it isn't demanding. Ratchet & Clank on PS4 is a cartoon; do you think that would run on Wii U without downgrades? BotW has a ton of effects and techniques that are demanding for its hardware.

You can port almost anything to anything if you downgrade it enough. Could PS3/360 run a mildly downgraded version of Botw? Of course they could. Could they run it as it is now, intact? Of course not; they don't have the memory, and hard drive streaming won't close the gap because Botw already does that.

We will have to disagree about the cartoon graphics. I'm not saying there aren't demanding parts in such games, but compared to a game attempting realistic graphics, which has to do a lot more with regard to physics, realistic textures that constantly change from scene to scene, plus many other effects that try to mimic real life, I don't think it is a fair comparison. When you put PC games on very low graphics settings, or even mod them to run on weaker hardware than intended, you can actually end up with a game that looks cartoon-like and Nintendo in style.

1GB of main memory, optical drive reading, plus a 3GB flash memory cache vs 360-420MB, a faster optical drive (360), and a much larger HDD cache is difficult to compare, especially when you have to factor in CPU performance differences, memory bandwidth, and greater raw GPU performance vs a later but marginally weaker GPU architecture. We know that when a multiformat title is on all three consoles (PS3, 360 and Wii U), most of the time the 360 wins, followed by the PS3 and then the Wii U. Yes, the Wii U wins occasionally for a few games, but overall the 360 is the easy winner. Zelda could easily be better on 360 and PS3 if programmed to the maximum of their abilities, enhanced using the greater CPU performance. Again, though, we can speculate as much as we want; it will never happen, so we can all believe what we want to believe. I own a PS3, 360 and Wii U, and I believe that, properly programmed, the PS3 is the most powerful, then the 360, and then the Wii U, based on the evidence I've seen and the games I play. I would say Digital Foundry's comparisons certainly back up my view that the 360 is stronger than the Wii U.



Very similar. I would say in handheld mode it's basically on par; in docked mode the Switch just has higher resolution.

I don't think any Switch game yet looks as good as The Last of Us on PS3.



sc94597 said:
bonzobanana said:

Unless this has changed with a later firmware, the Switch still caps its CPU cores at 1GHz and only 3 are used for games, and this applies to both docked and undocked modes. While these are more capable CPUs than those in the 360 and PS3, the latter easily surpass the Switch by the sheer speed they run at: 3.2GHz. While many have said the reason the Switch can't run L.A. Noire well is that it's optimised to utilise the PS3's Cell processors, the 360 did in fact run the game well too, and I don't think it's unfair to say both the 360 and PS3 easily surpass the Switch's CPU performance.

https://www.youtube.com/watch?v=SZT8lB0icC8

Isn't it something like 9,000 MIPS for the Wii U, 13,000 MIPS for the Switch (due to the 1GHz limit), but something like 20,000 MIPS for the 360 and maybe 28,000-40,000 MIPS for the PS3? The PS4 and Xbone are up to 34,000-38,000 MIPS. If the Tegra CPUs were run at full speed, of course, it would be different, but they aren't; they run at only about half speed in the Switch. You can imagine that if Nintendo released more CPU performance with a later firmware, it would comfortably surpass the 360 and be closer to the other consoles. Both the PS4 and Xbone never pushed CPU performance in their consoles, being only a mild jump from the last gen.

There are cheap octa-core Android tablets that exceed 30,000 MIPS of CPU performance but of course have much, much weaker GPU performance than the Switch. For comparison, a current AMD Ryzen CPU can exceed 300,000 MIPS. CISC chips tend to get more work done per cycle as they have a larger instruction set (a generalisation).

However, mobile chipsets tend to utilise the main CPU for secondary tasks too and don't have as many support processors as non-mobile chipsets, so a comparison of mobile vs non-mobile without factoring that in would not be fair. A mobile chipset at 10,000 MIPS is weaker than a non-mobile 10,000 MIPS chipset, which again is weaker than a CISC 10,000 MIPS non-mobile chipset. I'm just making the point that the issues with L.A. Noire on Switch are very likely down to weak CPU performance, especially as the issue affects both docked and undocked modes.

A while back there was a thread on this topic where I calculated this based on estimates scoured from the internet.

http://gamrconnect.vgchartz.com/post.php?id=8432244

sc94597 said: 

It is really hard to compare real-world performance for CPUs with such drastically different architectures without benchmarks (and even with benchmarks it is difficult.)

One way to measure theoretical CPU performance is in DMIPS, though (basically how many million instructions per second the processor can perform, after accounting for differences in instruction sets, via a generalized benchmark called Dhrystone.) An instruction set is the set of all instructions that the CPU's machine language provides for.

So for the A57 the recorded statistic is 4.1-4.5 DMIPS/MHz. Let's take the low end, 4.1 DMIPS/MHz. Multiply that by a clock speed of 1020 MHz and we get 4182 DMIPS/core.

Scouring the web, it looks like Espresso is 2877.32 DMIPS/core.

The ratio of performance is then (4 cores * 4182 DMIPS/core) / (3 cores * 2877.32 DMIPS/core) ≈ 1.94 times more instructions/sec for the Switch's CPU than Espresso.

So more or less twice as many instructions per second, on a basic comparison. 

The Xbox 360's CPU provides 5638.90 DMIPS at 3.2 GHz across all cores.

Which gives a ratio of (4 * 4182) DMIPS / 5638.90 DMIPS, or about 3 times the Xenon.

Not even going to bother comparing to the CELL because the architecture is so odd. 

Jaguar has about 3.6 DMIPS/MHz, so 1,750 MHz * 3.6 DMIPS/MHz * 8 cores = 50,400 DMIPS (for the Xbox One).

So the Switch's CPU is about 33% of the Jaguar @ 1,750 MHz (assuming both use all of their cores.)

Jaguar in the PS4/XBO ≈ 3x the A57 in Switch; the A57 in Switch ≈ 2x the Espresso in Wii U ≈ 3x the Xenon in Xbox 360 (theoretically; assuming all cores can be used to the max.)
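For anyone who wants to verify the arithmetic, here's a minimal Python sketch using the same figures quoted in this post (the DMIPS estimates themselves are the post's assumptions, not independently verified):

```python
# DMIPS comparison using the estimates quoted above.
switch = 4.1 * 1020 * 4   # 4x A57 @ 1020 MHz, 4.1 DMIPS/MHz -> ~16,728
wii_u  = 2877.32 * 3      # 3x Espresso                      -> ~8,632
xenon  = 5638.90          # Xenon, all 3 cores, as quoted
jaguar = 3.6 * 1750 * 8   # 8x Jaguar @ 1750 MHz             -> 50,400

print(f"Switch vs Wii U:  {switch / wii_u:.2f}x")   # ~1.94x
print(f"Switch vs Xenon:  {switch / xenon:.2f}x")   # ~2.97x
print(f"Switch vs Jaguar: {switch / jaguar:.0%}")   # ~33%
```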

Performance is of course different, because we know Microsoft and probably Nintendo don't use all of their cores at max. 

Also note that developers have commented that the Xenon has better real-world performance than the Espresso, but that could just be a matter of not bothering to optimize for the Wii U's advantages in ports.

 

I don't know where you got your MIPS estimates for Xbox 360/PS3. Even if we assume only three cores can be used at max, that still gives us much more performance than the Xenon. 

I got my Xbox 360 estimate from here. 

https://www.neogaf.com/threads/wii-u-cpu-espresso-die-photo-courtesy-of-chipworks.513471/page-15#post-58036908

We also have to recall that in gaming there are diminishing returns the more cores you have, just because not everything is parallelizable. 
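This is essentially Amdahl's law. A quick sketch (the 70% parallel fraction below is an illustrative assumption, not a measured figure for any game engine):

```python
# Amdahl's law: speedup on n cores when only a fraction p
# of the work can be parallelized.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 3, 6):  # assume 70% of frame work parallelizes
    print(f"{cores} cores: {amdahl_speedup(0.7, cores):.2f}x")
# 2 cores: 1.54x, 3 cores: 1.88x, 6 cores: 2.40x -- returns fall off fast.
```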

The 360 CPU is well documented. I remembered it at 20,000, but it's slightly below that at 19,200 MIPS, certainly well beyond your estimates.

https://en.wikipedia.org/wiki/Instructions_per_second

Remember, that's 6 threads at 3.2GHz, so even if the individual cores are weaker, they run very fast. The 360 also has excellent memory bandwidth: 256GB/s for the 10MB of video memory and 25GB/s for main memory. Bandwidth starvation is one of the factors that can slow down multicore CPUs. One thread is pretty much dedicated to the background operating system, I believe.
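As a crude illustration of the bandwidth-starvation point, using the figures quoted above (real contention behaviour is far messier than an even split):

```python
# Bytes of main-memory bandwidth available per CPU cycle on the 360,
# assuming the quoted 25 GB/s is split evenly across the 3 cores.
main_bw = 25e9    # bytes/s, figure quoted above
clock = 3.2e9     # Hz
cores = 3

per_chip = main_bw / clock
print(f"per chip: {per_chip:.1f} bytes/cycle")          # ~7.8
print(f"per core: {per_chip / cores:.1f} bytes/cycle")  # ~2.6
```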

The PS3 is only 10,200 MIPS for its dual-thread PPC core, but it is of course supported by the 7 SPEs of the Cell, which all run at 3.2GHz and boost performance 3-4x when properly programmed. They are also extremely efficient at parallel tasks like augmenting the GPU's feature set, decoding sound, etc.

There is no way the Switch matches the CPU performance of those two consoles, especially the PS3, unless Nintendo unlocks the firmware, which may not be possible for thermal and battery reasons. ARM A57s are good, but just 3 of them at 1GHz is surely not competitive, and they may be required to do additional processing like Wi-Fi, Bluetooth, etc. In CPU terms I would put the Switch between the Wii U and the 360. It's probably about a 50% boost over the Wii U, which is a huge difference, but still noticeably weaker than the 360 and PS3. Also, mobile chipsets are prone to thermal throttling, which you don't get on the 360, PS3 or Wii U, although I think Nintendo set the CPUs at 1GHz precisely to prevent any thermal throttling; that would seem the likely reason for the cap.

 

 



bonzobanana said:

1. The 360 CPU is well documented. I remembered it at 20,000, but it's slightly below that at 19,200 MIPS, certainly well beyond your estimates.

https://en.wikipedia.org/wiki/Instructions_per_second

2. Remember, that's 6 threads at 3.2GHz, so even if the individual cores are weaker, they run very fast.

The 360 also has excellent memory bandwidth: 256GB/s for the 10MB of video memory and 25GB/s for main memory. Bandwidth starvation is one of the factors that can slow down multicore CPUs. One thread is pretty much dedicated to the background operating system, I believe.

3. The PS3 is only 10,200 MIPS for its dual-thread PPC core, but it is of course supported by the 7 SPEs of the Cell, which all run at 3.2GHz and boost performance 3-4x when properly programmed. They are also extremely efficient at parallel tasks like augmenting the GPU's feature set, decoding sound, etc.

4. There is no way the Switch matches the CPU performance of those two consoles, especially the PS3, unless Nintendo unlocks the firmware, which may not be possible for thermal and battery reasons.

5. ARM A57s are good, but just 3 of them at 1GHz is surely not competitive, and they may be required to do additional processing like Wi-Fi, Bluetooth, etc.

6. In CPU terms I would put the Switch between the Wii U and the 360. It's probably about a 50% boost over the Wii U, which is a huge difference, but still noticeably weaker than the 360 and PS3. Also, mobile chipsets are prone to thermal throttling, which you don't get on the 360, PS3 or Wii U, although I think Nintendo set the CPUs at 1GHz precisely to prevent any thermal throttling; that would seem the likely reason for the cap.

 

 

1. You say it is well documented, but you didn't provide said documentation. The Wikipedia page has no source for its Xenon and Cell statistics. Plus, one must distinguish DMIPS (the result of an actual benchmark, Dhrystone) from MIPS (which is a pretty useless measurement across architectures, as Permalite noted earlier.)

2. Only if you can take advantage of six threads, and very few games do. There is only so much you can parallelize, and the more threads we're talking about, the lower the returns. This is gaming, not video editing.

3. Okay, but when we're talking about such a large difference in GPU power, why even care about the SPEs? These computations can just be done on the GPU. The SPEs were a hassle for PS3 development anyway, and many games suffered because of it.

4. I agree, there is no way. The Switch's CPU is definitely better for gaming. 

5. This is mostly speculative/overly assertive on your part. 

6. The Switch's dock is there to prevent throttling. If throttling were a thing, we'd notice when our Switches got hot. Now, there is likely throttling to reduce power draw in less intensive games, but that is entirely dependent on the requirements of the game.

It might be valuable to read the quote that quickrick provided, citing his random developer.

"Cell and Xenon are good in highly optimized SIMD code. Xenon = 3 cores at 3.2 GHz, four multiply-adds per cycle (76.8 GFLOP/s). That's significantly higher theoretical peak than the 4x ARM cores on Switch can achieve. But obviously it can never reach this peak. You can't assume that multiply-add is the most common instruction (see Broadwell vs Ryzen SIMD benchmarks for further proof). Also Xenon vector pipelines were very long, so you had to unroll huge loops to reach good perf with it. Branching and indexing based on vector math results was horrible (~40 cycle stall to move data between register files). ARM NEON is a much better instruction set and OoO and data prefetch helps even in SIMD code.

If you compare them in standard C/C++ game code, ARM and Jaguar both stomp over the old PPC cores.
 I remember that it was common consensus that the IPC in generic code was around 0.2. So both Jaguar and ARM should be 5x+ faster per clock than those PPC cores (IIRC Jaguar average IPC was around 1.0 in some real life code benchmark, this ARM core should be close). However you can also write low level optimized game code for PPC, so it all depends on how much resources you had to optimize and rewrite the code. Luckily those days are a thing of the past. I don't want to remember all those ugly hacks we had around the code base to make the code run "well enough". The most painful thing was that CPU didn't have a data prefetcher. So you had to know around 2000 cycles in advance which memory regions your future code is going to access, and prefetch that data to cache. If you didn't do this, you would get 600 cycle stalls on memory loads. Those PPC cores couldn't even prefetch linear arrays."
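To put the quoted 76.8 GFLOP/s in context, here is the peak arithmetic spelled out; the Switch-side figure assumes 8 FP32 FLOPs per cycle per A57 core (NEON with fused multiply-add), a commonly cited theoretical number rather than a measurement, and as the quote stresses, neither chip gets near its peak in real code:

```python
# Theoretical peak FP32 throughput, counting a multiply-add as 2 FLOPs.
xenon_peak = 3 * 3.2e9 * 4 * 2   # 3 cores, 4 MADDs/cycle -> 76.8 GFLOP/s
a57_peak = 4 * 1.02e9 * 8        # assumption: 8 FLOPs/cycle/core -> ~32.6

print(f"Xenon peak:  {xenon_peak / 1e9:.1f} GFLOP/s")
print(f"4x A57 peak: {a57_peak / 1e9:.1f} GFLOP/s")
```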

Last edited by sc94597 - on 28 January 2018

sc94597 said:
[quote snipped: identical to sc94597's full post directly above]

You seem to have a strong bias in favour of the Switch rather than sitting on the fence and looking at the evidence and how the Switch performs. Your bias seems to be negating all evidence that differs from what you want to believe. Those MIPS/DMIPS figures are admittedly only a rough guide to comparing chips, but you seem to be taking that as an opportunity to believe it is the Switch that is under-represented, when it's more likely the other way around, for the reasons I've given previously. You would surely accept that the PowerPC figures are pretty much in line with many other PowerPC CPUs that are variants of the same processors in the PS3 and 360; just correlate them with the other PowerPC chips in that list of performance figures. There are no wild claims.

Dhrystone is just an integer performance check, and those figures are mainly DMIPS anyway.

It's not as if those PPC figures seem strange: they state 2 MIPS per MHz per core on the 360, whereas the Wii was 2.3 MIPS per MHz with out-of-order execution (unlike the in-order cores in the PS3 and 360). So even with the dual threads on each core, they are stating a low figure; it's only because there are 3 cores running at 3.2GHz that the final figure is so good, basically (2 MIPS/MHz x 3 cores) x 3,200 MHz ≈ 19,200 MIPS. The CPU design of the 360 seems to be built around generating high floating-point performance.
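The same estimate as a one-liner (figures as quoted; the DMIPS vs MIPS caveat from earlier in the thread still applies):

```python
# 360 Xenon estimate as described above:
# 2 MIPS/MHz/core x 3 cores x 3200 MHz
print(2 * 3 * 3200)  # 19200
```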

You look at L.A. Noire on Switch and you can clearly see a system with a CPU bottleneck in the design, and this is hardly surprising with Nintendo.

Both the specification and the real-world performance support my view, I believe, but I'm happy to change my opinion if any evidence comes along to change it. I was an early adopter of the Wii U, but it was clear when I started playing it that there was a CPU issue, and that was confirmed when the actual specs were leaked. I just don't think Nintendo is too bothered about CPU performance. Surely they could have run the Tegra at full CPU speed if they wanted to, but they didn't.



If the 7th gen had gotten a mid-gen upgrade, like a hypothetical "PS3 Pro", it would have been like today's Switch.

I would say it's gen 7.5.



ClassicGamingWizzz said:
RolStoppable said:
It's a huge leap. Some games look even better than anything on the PS4.

I want to know which game on Switch is better than ANYTHING on the PS4. I am curious.

The home menu UI. :P