
How do the visuals on the Nintendo Switch compare to those of the Xbox 360 & PS3?

 

The Nintendo Switch hardware is...

- A big leap over 7th gen: 71 votes (40.11%)
- A minor leap over 7th gen: 72 votes (40.68%)
- About the same as 7th gen: 24 votes (13.56%)
- Actually WORSE than last gen: 10 votes (5.65%)

Total: 177 votes

The last video Digital Foundry made about Bayonetta on the Switch is a great example to study. The game runs at the same resolution as the Xbox 360 version, with the same textures; the difference? 30 to 50 fps on the 360, with slowdowns and tearing, versus a stable 60 fps on the Switch...

Having said that, it shows at least a 30% increase in power over the 360 version... the only way to get the same fps out of the 360 version is to play it on the One X via backwards compatibility, which improves the game.

https://www.youtube.com/watch?v=R_M0gX0GE0o  (5:30 minutes)
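As a rough sanity check on that figure, here's a minimal sketch; the fps values are just the ranges quoted above (not frame-by-frame measurements), so treat the output as a ballpark:

```python
# Rough, illustrative estimate of the Switch's uplift in Bayonetta,
# using the fps ranges mentioned above (not precise measurements).
x360_fps_low, x360_fps_high = 30, 50   # Xbox 360: fluctuates, with tearing
switch_fps = 60                        # Switch: locked

x360_midpoint = (x360_fps_low + x360_fps_high) / 2   # ~40 fps
uplift_vs_midpoint = switch_fps / x360_midpoint - 1
uplift_vs_best = switch_fps / x360_fps_high - 1

print(f"Uplift vs 360 midpoint:  {uplift_vs_midpoint:.0%}")  # ~50%
print(f"Uplift vs 360 best case: {uplift_vs_best:.0%}")      # ~20%
```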



PS4: Tryklon  Steam: Tryklon

Switch: 0307-6588-7010 | New 2DS XL: 2037-2612-6964

MacBook Air (Mid 2017) | iPhone SE | Apple Watch Series 3

Pemalite said:
bonzobanana said:

Memory bandwidth is a bit limited and there is pressure to keep storage resources low. In that situation there will be games that perform better on the PS3 and 360 than on the Switch in certain situations.

The Switch has Delta Colour Compression.
Maxwell's implementation of Delta Colour Compression allowed it to achieve a 20-30% bandwidth saving over Kepler.
https://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/3

That's the 3rd-generation implementation, so there was another 15-20% on top of that.

Thus bandwidth between the 7th-gen consoles and the Switch isn't a direct comparison based purely on the memory bus.

And thanks to Maxwell's tile-based rasterization it's able to cull more efficiently, meaning it does less work and can put the limited bandwidth it has to greater effect.
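To make that concrete, here's a minimal sketch of how compression changes the effective-bandwidth picture; the 25.6GB/s figure and the 25% saving are assumptions (the commonly cited Switch bus figure and the middle of the range above), not measured values:

```python
# Illustrative only: how delta colour compression changes "effective" bandwidth.
# The bus figure and the savings percentage are assumptions, not measurements.
switch_raw_bw = 25.6      # GB/s, commonly cited LPDDR4 figure (docked)
dcc_saving    = 0.25      # assume ~25% of colour traffic never hits the bus

# If a quarter of the would-be traffic is compressed away, the same physical
# bus behaves like a larger one for colour-heavy workloads.
effective_bw = switch_raw_bw / (1 - dcc_saving)

print(f"Effective bandwidth for colour traffic: ~{effective_bw:.1f} GB/s")  # ~34 GB/s
```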


bonzobanana said:

I still see people today write that the Wii U is superior to the 360 and PS3, and yet I know from experience this is untrue overall. The Wii U is a weaker console that benefits from fantastic Nintendo games.

To be fair... The Wii U's GPU, like the Switch's, is more efficient. It's not as efficient as Maxwell, but it's certainly better than the R580-derived hybrid chip in the Xbox 360.
The Achilles heel is of course that paltry 12.8GB/s of bandwidth, which the eDRAM is supposed to help mitigate; how effectively that is leveraged will vary between developers.
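For a sense of how 32MB of eDRAM can mask a 12.8GB/s main bus, here's a back-of-the-envelope check that a 720p frame's main render targets fit on-die; the 32-bit colour and depth formats are assumptions, and real games vary:

```python
# Back-of-the-envelope: does a 720p colour + depth buffer fit in 32MB of eDRAM?
# Formats are assumptions (32-bit colour, 32-bit depth/stencil); real games vary.
width, height = 1280, 720
bytes_per_pixel_colour = 4
bytes_per_pixel_depth  = 4

colour_mb = width * height * bytes_per_pixel_colour / (1024 ** 2)
depth_mb  = width * height * bytes_per_pixel_depth  / (1024 ** 2)

print(f"Colour: {colour_mb:.1f} MB, depth: {depth_mb:.1f} MB, "
      f"total: {colour_mb + depth_mb:.1f} MB of 32 MB")
# ~7 MB total, so the highest-traffic targets can stay on-die and the
# 12.8 GB/s main bus mostly handles textures and CPU data.
```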

bonzobanana said:

I'm not a Switch owner, but I'm looking to buy one at some point and am curious about its performance level. It does feel like many are ignoring the Switch's lack of CPU resources, but the games are showing this to be an issue.

The Switch's CPU can hold its own against the last-generation consoles. It's amazing how far CPU performance has come in the last decade.
The Switch's CPU would have easily beaten the Xbox 360's and Wii U's CPUs, no contest, if it actually ran at its original clocks.

Better low-level APIs and more efficient drivers have lightened the CPU burden somewhat though, which will play in the Switch's favor.

bonzobanana said:

It's at a lower level in pure gflops. Something like 157 gflops in portable mode, or 188 gflops with a performance boost at the expense of battery life. From what I've read, 157 gflops is the norm in most games, not 188. That's 19 gflops less than the Wii U and something like 50-90 less than the 360 and PS3 in portable mode.

Gflops are irrelevant unless all other things are equal.
A GPU with fewer flops can beat a GPU with more flops.
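For anyone wondering where those headline figures come from, peak gflops is just core count x 2 ops per cycle x clock; the 256-core count and the clocks below are the commonly cited Switch figures, treated here as assumptions rather than confirmed spec:

```python
# Peak GFLOPS is just arithmetic: shader cores x 2 FLOPs per cycle (FMA) x clock.
# The core count and clocks are the commonly cited Switch figures (assumptions).
def peak_gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

print(f"Portable (307.2 MHz): {peak_gflops(256, 0.3072):.0f} GFLOPS")  # ~157
print(f"Docked   (768 MHz):   {peak_gflops(256, 0.768):.0f} GFLOPS")   # ~393
# None of this says how much of that peak a given architecture turns into frames.
```

Which is exactly why the number alone says so little: it's a theoretical peak with no efficiency term in it.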

bonzobanana said:

In the PC world we aren't seeing claims that a slightly newer architecture and feature set radically changes game performance. A new low-end budget GPU of 200 gflops doesn't magically outperform an old high-end GPU of 200 gflops, even if one is DX10 and the other DX12. Sometimes the older model might actually perform better, as it may have a higher-end spec elsewhere: more memory, higher-bandwidth memory, etc. It was designed with fewer compromises and cost reductions. On screen you might get one or two improved effects if you are lucky, but mostly nothing.

Radeon 5870:
* 2.72 Teraflops.
* 153.6GB/s of bandwidth.

Radeon 7850:
* 1.76 Teraflops.
* 153.6GB/s of bandwidth.

No contest that the Radeon 5870 would win, right? I mean, flops are all that matter? And if you have the same bandwidth there isn't a radical difference in performance, right? Those are YOUR claims.

And you are wrong.
https://www.anandtech.com/bench/product/1062?vs=1076

The Radeon 7850 wins hands down.

Fact of the matter is... architectural efficiency is important, and people need to stop ignoring it in favor of using something simple like flops or bandwidth as their only comparison point. It's stupid, it's inaccurate and it's wrong. Stop doing it.
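One way to picture that efficiency gap is performance per flop; a minimal sketch, where the teraflop figures are the ones quoted above but the fps values are hypothetical stand-ins for a benchmark result (see the AnandTech link for real numbers):

```python
# Performance-per-flop, in the spirit of the 5870 vs 7850 comparison above.
# TFLOPS figures are from the post; the fps values are hypothetical stand-ins.
cards = {
    "Radeon 5870 (VLIW5)": {"tflops": 2.72, "fps": 45.0},  # hypothetical fps
    "Radeon 7850 (GCN)":   {"tflops": 1.76, "fps": 52.0},  # hypothetical fps
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['tflops']:.1f} fps per TFLOP")
# The newer architecture extracts far more real performance from each flop.
```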

bonzobanana said:

Like the Wii U, though, I'm sure the Switch will have many comparisons with newer and older hardware over its lifetime, and the evidence accumulating now already clearly shows its strengths and weaknesses.

And so it should.

quickrick said:

It's still far too early to judge that the Switch CPU is weaker than the 360/PS3 CPU; it just came out, and it runs most games better, aside from L.A. Noire, which is probably the most demanding port. But that game especially targeted the PS3's CPU, and the 360 version ran a bit rough as well, so it's not exactly a fair comparison. We need more games before coming to this conclusion.

In some tasks the Switch's CPU will beat the Xbox 360/Playstation 3's CPU. In other tasks it will lose.
I.e. it should win with integers every day of the week, as it's a more balanced architecture, but lose in iterative-refinement floating point, which the Cell will dominate.

It's a very different ISA anyway.
The main issue with the Switch CPU is that it is being held back by its clock rate; if it were running 80% faster it would have won, no contest.

For now though... ports are our best gauge of the CPU capabilities of the device.

I have made the point myself about better optimisation of bandwidth, but there still seems to be a shortfall, and if anything your figures seem to confirm it. As I said previously, the Xbox One has a pretty recent GPU architecture but still benefits from that 32MB of ultrafast memory to supplement main memory bandwidth, which is in itself much higher than the Switch's anyway.

My point about graphics cards was a low-end chipset of recent times, as used in Nintendo hardware, vs a high-performance part of the past sharing similar gflops output, and I'm sure I've seen PC benchmarks where the older cards actually came out faster because they weren't compromised as much. Your comparison seems more like mid-range to old high-range. I certainly wasn't thinking about such a comparison; I was thinking of a high or mid-range old GPU vs a low-end/mobility part of more recent times with similar gflops. In the case of the Wii U's Radeon GPU, after all the debate about 176 or 352 gflops, it was really the exceptionally low power draw of the Wii U that confirmed it couldn't be 352 gflops and was likely a mobility Radeon GPU rather than the desktop version. That seemed to fit better with the confirmed spec and was a better fit for a small console with limited cooling.

I don't agree with your comparison, as I say, but even if we accept it, most game frame rates between the two look broadly similar, and roughly speaking the newer chipset achieves about a 50% frame rate boost for the same gflops. That doesn't seem consistent with some other comparisons I've seen, where they fitted older GPUs and newer GPUs to the same motherboard and CPU setup. For all I know those could have been historic frame rates, with the older GPU matched to an older motherboard and a weaker CPU, which would make the comparison utterly pointless. But even if we accept it, that means the 155 gflops is boosted to about 230 gflops or something, putting it squarely into the PS3/360 performance level and no more.

It doesn't seem right though. With games like Xenoblade 2 dropping to 368p at times, and Doom at very low resolutions too, both at 30fps as well. Some Switch games suffer from slower frame rates when docked. This must surely be a CPU or memory bandwidth issue. However, where a game has to drop resolution significantly to maintain frame rates in portable mode, it feels like a GPU performance issue. The CPU is the same, and memory bandwidth goes further at lower resolutions.

We are clearly both in agreement about the CPU performance anyway; yes, the Tegra at full clocks would be comfortably superior to the Xbox 360 in CPU terms, but Nintendo didn't use full clocks, and only 3 CPU cores are used for actual games while the other is dedicated to the operating system and background tasks. I still secretly hope Nintendo might release some extra CPU performance with a later firmware update, but maybe that is unlikely. Sony did increase the PSP's clocks slightly, so it is possible. I'm unsure if at full clocks it would beat a PS3 with all Cell cores firing and fully optimised code; I strongly suspect not, considering many believe not even the Xbox One or PS4 achieve that, and there are some benchmarks that show the Cell stronger in real-world performance than the PS4/Xbox One.

I tend to look at what developers are achieving with the hardware as well as the spec numbers: see what's happening on screen and where the compromises are, especially where the compromises are repeated by many different developers. I think you always have to factor in real-world evidence.

I'm not yet convinced of the Tegra's graphical superiority in portable mode, with such low clocks, vs the 360/PS3. To me it looks much more like the 4GB of main memory is the real saviour: room for better graphics and textures without the constant need to swap/stream data as on the 360 and PS3. I feel this is the huge strength of the system compared to the 360/PS3. I honestly wonder whether, if the PS3 and 360 had had that available memory, it couldn't both speed up and allow more sophisticated game engines. The PS3 especially was a nightmare. I remember running Fallout 3 with DLC at 480i on my PS3 just because it helped reduce the slowdown, which was awful in Operation Anchorage and the other DLC. Capcom's pressure on Nintendo to boost memory from 2GB to 4GB on the Switch I feel was a great move; I just wish they had asked for higher CPU clocks at the same time. You have made a case for the Switch graphics hardware, but despite being a big Nvidia fan myself I'm not convinced by it at this point for portable performance. For docked, yes, it's great and easily superior to the 360/PS3, no question.



bonzobanana said:

I have made the point myself about better optimisation of bandwidth, but there still seems to be a shortfall, and if anything your figures seem to confirm it. As I said previously, the Xbox One has a pretty recent GPU architecture but still benefits from that 32MB of ultrafast memory to supplement main memory bandwidth, which is in itself much higher than the Switch's anyway.

The Xbox One has more bandwidth by default than the Switch.
The Xbox One also has eSRAM to lend a hand with things like bandwidth-intensive render targets.

But in terms of architectural efficiency... Maxwell beats Graphics Core Next.

Graphics Core Next does come into its own once you start to leverage Asynchronous Compute, however; but when you start to compare the Switch against the Xbox 360, the Switch starts to look more favorable.

bonzobanana said:

My point about graphics cards was a low-end chipset of recent times, as used in Nintendo hardware, vs a high-performance part of the past sharing similar gflops output, and I'm sure I've seen PC benchmarks where the older cards actually came out faster because they weren't compromised as much. Your comparison seems more like mid-range to old high-range.

My comparison was to prove the point that GPUs of different architectures cannot have their performance determined just by using flops.


bonzobanana said:

In the case of the Wii U's Radeon GPU, after all the debate about 176 or 352 gflops, it was really the exceptionally low power draw of the Wii U that confirmed it couldn't be 352 gflops and was likely a mobility Radeon GPU rather than the desktop version.

And yet it was most likely a Very Long Instruction Word (VLIW) 5-way derived Radeon GPU architecture, which had all the accompanying advantages over the R580-derived GPU in the Xbox 360.

bonzobanana said:

With games like Xenoblade 2 dropping to 368p at times, and Doom at very low resolutions too, both at 30fps as well.


Doom is doing some good effects though. Unless you disagree? Then you need to point me towards an Xbox 360 game that has hardware-accelerated particles with hardware-accelerated lighting and shadowing.

The Switch is doing things with Doom that the Xbox 360 simply can't.

bonzobanana said:

Some Switch games suffer from slower frame rates when docked. This must surely be a CPU or memory bandwidth issue. However, where a game has to drop resolution significantly to maintain frame rates in portable mode, it feels like a GPU performance issue. The CPU is the same, and memory bandwidth goes further at lower resolutions.

Could be a combination of all of the above. Or none at all.
Until you look at the performance profiling for the engine running on the actual hardware, the only thing you can make is assumptions, which is not something I can get behind.

bonzobanana said:


We are clearly both in agreement about the CPU performance anyway; yes, the Tegra at full clocks would be comfortably superior to the Xbox 360 in CPU terms, but Nintendo didn't use full clocks, and only 3 CPU cores are used for actual games while the other is dedicated to the operating system and background tasks.

Well, the CPU is more efficient even at those lower clocks; that isn't really up for debate.
Even having just 3 cores for gaming is fine.

But the clocks are what hold things back. I'll probably do some more research on this at a later date and see how both chips compare on that front; I just haven't had the time recently.


bonzobanana said:


I still secretly hope Nintendo might release some extra CPU performance with a later firmware update, but maybe that is unlikely. Sony did increase the PSP's clocks slightly, so it is possible.

That would be nice, but it's probably unlikely. Battery life is fairly short on the Switch, and higher CPU clocks would erode it further.

bonzobanana said:


I'm unsure if at full clocks it would beat a PS3 with all Cell cores firing and fully optimised code; I strongly suspect not, considering many believe not even the Xbox One or PS4 achieve that, and there are some benchmarks that show the Cell stronger in real-world performance than the PS4/Xbox One.

In some tasks it would be faster than the Cell... especially with integers.

The Cell was very specific about where it could really show its capabilities... and games/game engines have tons of different demands.

The best analogy would be a car race... where the 8-core Jaguar/4-core Tegra can cruise down to the finish line at 100 miles per hour.
The Cell, however, will do 50 miles per hour for the bulk of the race... but when the road conditions are just right, it can peak at 150 miles per hour.

The Cell has the higher potential speed, but it will lose the race every time, as it's not able to maintain that speed under all conditions.
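Putting made-up numbers on that analogy (none of these are real benchmark figures, just an illustration of sustained vs peak throughput):

```python
# The car-race analogy with invented numbers: a steady design vs a peaky one.
# "Speed" is arbitrary work-per-second and the workload split is made up.
steady_speed = 100                                  # cruises at this rate everywhere
peaky_speed  = {"simd_friendly": 150, "everything_else": 50}
workload_mix = {"simd_friendly": 0.2, "everything_else": 0.8}

peaky_average = sum(peaky_speed[k] * workload_mix[k] for k in peaky_speed)

print(f"Steady chip average: {steady_speed}")
print(f"Peaky chip average:  {peaky_average:.0f}")  # 150*0.2 + 50*0.8 = 70
```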

bonzobanana said:

I tend to look at what developers are achieving with the hardware as well as the spec numbers: see what's happening on screen and where the compromises are, especially where the compromises are repeated by many different developers. I think you always have to factor in real-world evidence.

Real world evidence is what I live by.

However, keep in mind that the Switch hasn't been on the market for a year yet... We have yet to see the best looking and most demanding games on the platform... The Xbox 360, Playstation 3 and Wii U are done. Finished. Tapped out.

bonzobanana said:


I'm not yet convinced of the Tegra's graphical superiority in portable mode, with such low clocks, vs the 360/PS3.

Maxwell beats Graphics Core Next in terms of efficiency every day of the week. (Outside of Asynchronous Compute that is.)
Graphics Core Next obliterates VLIW4 in terms of efficiency every day of the week.
VLIW4 was a substantial improvement over VLIW5 in terms of efficiency as it was a more balanced layout.

I have already shown an example of a Graphics Core Next GPU out-performing a VLIW5 GPU that had almost a teraflop of extra single-precision capability; that same kind of performance divide would exist between a VLIW5 GPU and the R580 as well.

And the Xbox 360/Playstation 3 GPUs predate Graphics Core Next, VLIW4 and, in some aspects, VLIW5.
The newer the GPU, the more work it can do per flop; that's the crux of it.

bonzobanana said:

To me it looks much more like the 4GB of main memory is the real saviour: room for better graphics and textures without the constant need to swap/stream data as on the 360 and PS3.

That's a massive part of it.
7th gen consoles were extremely memory limited.

bonzobanana said:

I honestly wonder whether, if the PS3 and 360 had had that available memory, it couldn't both speed up and allow more sophisticated game engines.

It could.

bonzobanana said:

The PS3 especially was a nightmare. I remember running Fallout 3 with DLC at 480i on my PS3 just because it helped reduce the slowdown, which was awful in Operation Anchorage and the other DLC.

Net Immerse/Gamebryo/Creation Engine was just never a good fit for the Playstation 3.

bonzobanana said:

Capcom's pressure on Nintendo to boost memory from 2GB to 4GB on the Switch I feel was a great move; I just wish they had asked for higher CPU clocks at the same time.

Even just a 20% higher clocked CPU would have helped significantly.

bonzobanana said:

You have made a case for the Switch graphics hardware, but despite being a big Nvidia fan myself I'm not convinced by it at this point for portable performance. For docked, yes, it's great and easily superior to the 360/PS3, no question.

I'm not an ardent nVidia fan. I primarily sit in the AMD camp, as they typically provide better price/performance.

Here we can see the Microsoft Surface Pro beat the Geforce 7800GT and 7900GS.
https://www.anandtech.com/show/6877/the-great-equalizer-part-3/3

The Playstation 3's RSX has a 24:8:24:8 core layout running at 550MHz with 22.4GB/s of bandwidth.
Which means it should fall below the Geforce 7800GTX in terms of capability (as the 7800GTX has more ROPs and bandwidth).

Tegra K1 is beating the Microsoft Surface Pro by 50% or more.
https://www.anandtech.com/show/8296/the-nvidia-shield-tablet-review/5

The Tegra X1 is beating the K1 by another 60% - 80% and in a few instances by 400%.
https://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/4

So we can account for the fact that the Switch runs at just 30% of the X1's clock speed and still come out on top.

Ergo, the Switch beats the Playstation 3 and Xbox 360.
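Strung together, the reasoning in those links looks roughly like the sketch below; every multiplier is a placeholder read off the cited benchmarks rather than a measured figure, and linear clock scaling is assumed (the most pessimistic case for the Switch), so treat the output as a ballpark:

```python
# The shape of the argument above, chained together. Every multiplier is a
# rough placeholder taken from the linked benchmarks, and clock scaling is
# assumed to be linear (the harshest assumption for the Switch).
baseline_7800gt     = 1.0                       # Geforce 7800GT-class reference
surface_pro_hd4000  = baseline_7800gt * 1.2     # placeholder: beats the 7800GT/7900GS
tegra_k1            = surface_pro_hd4000 * 1.5  # "by 50% or more"
tegra_x1_full_clock = tegra_k1 * 1.8            # "60% - 80%", taking the upper end

switch_portable = tegra_x1_full_clock * 0.307   # portable GPU clock vs the Shield TV X1

print(f"Switch portable vs a 7800GT-class card: ~{switch_portable:.2f}x")
# Roughly level with the 7800-class baseline even under the harshest assumptions;
# the RSX's 8 ROPs and 22.4GB/s put it at a disadvantage against that class of
# card, which is how the Switch still comes out on top in portable mode.
```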



--::{PC Gaming Master Race}::--

Something is wrong with quoting on my phone. I own Bayonetta 1 on Xbox 360 (which means I have it for Xbox One X), Wii U, and Switch.

I'm personally disappointed with the image quality of the Switch version of Bayonetta 1 on my 4K TV. Unlike most of my other Switch games, it looked really muddy. I also fired up Bayonetta 2 and it looks really sharp and vibrant.

I really can't use Bayonetta 1 as an example of Switch shortcomings because Bayo 2 looks so much better! And, like Digital Foundry said, it may actually be the best console version available. I'll compare the original Xbox 360 version to the Switch version and maybe even the Xbox One X version myself, tonight. I feel like there's a lack of aliasing on the Switch but I could just be remembering things wrong.

Personally, optional portability makes it the version to own, anyway.



d21lewis said:
Something is wrong with quoting on my phone. I own Bayonetta 1 on Xbox 360 (which means I have it for Xbox One X), Wii U, and Switch.

I'm personally disappointed with the image quality of the Switch version of Bayonetta 1 on my 4K TV. Unlike most of my other Switch games, it looked really muddy. I also fired up Bayonetta 2 and it looks really sharp and vibrant.

I really can't use Bayonetta 1 as an example of Switch shortcomings because Bayo 2 looks so much better! And, like Digital Foundry said, it may actually be the best console version available. I'll compare the original Xbox 360 version to the Switch version and maybe even the Xbox One X version myself, tonight. I feel like there's a lack of aliasing on the Switch but I could just be remembering things wrong.

Personally, optional portability makes it the version to own, anyway.

The Bayo ports to Switch are very barebones conversions; they're not bad per se, they just don't do much to take advantage of the hardware beyond the almost automatic framerate boost that comes from a faster GPU. I mean, it's cool that both games run smoother than the 360/Wii U versions even in portable mode, but the docked mode feels like an afterthought.




I knew there was a reason I was subscribed to this guy!

https://youtu.be/DCfR54Wq4x8



d21lewis said:
I knew there was a reason I was subscribed to this guy!

https://youtu.be/DCfR54Wq4x8

This guy thinks the Wii U and Switch have around the same Gflops, which is laughable.



d21lewis said:
I knew there was a reason I was subscribed to this guy!

https://youtu.be/DCfR54Wq4x8

A few inaccuracies in the video, but he echoes my own sentiments on the GPU front.



--::{PC Gaming Master Race}::--