
How do the visuals on the Nintendo Switch compare to those of the Xbox 360 & PS3?

 

The Nintendo Switch hardware is...

A big leap over 7th gen: 71 votes (40.11%)
A minor leap over 7th gen: 72 votes (40.68%)
About the same as 7th gen: 24 votes (13.56%)
Actually WORSE than last gen: 10 votes (5.65%)

Total: 177 votes
bonzobanana said:
curl-6 said:

It's a mistake to assume that a game not being realistic means it isn't demanding. Ratchet & Clank on PS4 is a cartoon; do you think that would run on Wii U without downgrades? BotW has a ton of effects and techniques that are demanding for its hardware.

You can port almost anything to anything if you downgrade it enough. Could PS3/360 run a mildly downgraded version of BotW? Of course they could. Could they run it as it is now, intact? Of course not; they don't have the memory, and hard drive streaming won't close the gap because BotW already does that.

We will have to disagree about the cartoon graphics. I'm not saying there aren't some demanding parts in such games, but compared to a game attempting realistic graphics, which has to do a lot more with regard to physics, realistic textures that constantly change from scene to scene, and many other graphical effects that try to mimic real life, I don't think it is a fair comparison. When you put PC games on very low graphics settings, or even mod them to run on weaker hardware than intended, you can actually end up with a game that looks cartoon-like and Nintendo in style.

1GB of main memory, optical drive reading, plus a 3GB flash memory cache versus 360-420MB, a faster optical drive (360), and a much larger HDD cache is difficult to compare, especially when you have to factor in CPU performance differences, memory bandwidth, and greater raw GPU performance versus a later GPU architecture with marginally weaker performance. We know that generally, when a multiformat title is on all three consoles (PS3, 360, and Wii U), most of the time the 360 wins, followed by the PS3 and then the Wii U. Yes, the Wii U wins occasionally for a few games, but overall the 360 is the easy winner. Zelda could easily be better on 360 and PS3 if programmed to the maximum of their abilities, enhanced using their greater CPU performance. Again, though, we can speculate as much as we want; it will never happen, so we can all believe what we want to believe. I own a PS3, 360, and Wii U, and I believe that, properly programmed, the PS3 is the most powerful, then the 360, and then the Wii U, based on the evidence I've seen and the games I play, and I would say Digital Foundry's comparisons certainly back up my view that the 360 is stronger than the Wii U.

Multiplats aren't really a good point of comparison in this case, as they were built with engines/teams that had 8 years of accumulated experience and optimization for PS3/360, while the Wii U never sold well enough for devs to invest the resources to really get the most out of it. Zelda will often hit scenarios where it needs more than 500MB of RAM in use at once, and when that happens, PS3/360 will hit a brick wall and downgrades will be required.

But honestly, this debate has been done to death for the past 5 years; there's nothing to say that hasn't been said a million times already. This thread isn't about Wii U. It's about Switch.

Last edited by curl-6 - on 29 January 2018

bonzobanana said:

1. You seem to have a strong bias in favour of the Switch, rather than sitting on the fence and looking at the evidence and how the Switch performs. Your bias seems to be negating all evidence that differs from what you want to believe.

2. Those MIPS/DMIPS figures are admittedly only a rough guide to comparing chips, but you seem to be using that as an opportunity to believe it is the Switch that is under-represented, when it's more likely the other way around, for reasons I've given previously. You would obviously understand that the PowerPC figures are pretty much in line with many other PowerPC CPUs that are variants of the same processors in the PS3 and 360. Just correlate them with other PowerPC chips in that list of performance figures. There are no wild claims.

3. Dhrystone is just an integer performance check, and those figures are mainly DMIPS anyway.

4. It's not like those PPC figures seem strange: they state 2 MIPS per MHz per core on the 360, whereas the Wii was 2.3 MIPS per MHz with out-of-order execution like the PS3. So even with dual threads on each core, they are stating a low figure. It's only because there are 3 cores running at 3.2GHz that the final figure is so good; basically 6 x 3200, which is roughly 19,200. The 360's CPU design seems to be built around generating high floating-point performance.

5. You look at L.A. Noire on Switch and you can clearly see a system with a CPU bottleneck in the design, and this is hardly surprising with Nintendo.

6. Both the specifications and real-world performance support my view, I believe, but I'm happy to change my opinion if any evidence comes along to change it.

7. I was an early adopter of the Wii U, but it was clear when I started playing it that there was a CPU issue, and that was confirmed when the actual specs were leaked. I just don't think Nintendo are too bothered about CPU performance. Surely they could have run the Tegra at full CPU speed if they wanted to, but they didn't.

1. You might have a point if I hadn't provided evidence in the form of (a) Dhrystone benchmark results, and (b) the developer testimonial about relative real-world IPC of ARM/Jaguar vs. PPC. Furthermore, you made certain assumptions, like developers being able to take equal advantage of all of Xenon's threads for games, which deserve to be critiqued. Compare it to PC gaming: in what games does one gain a huge benefit from using six threads over four? I can only think of simulation-heavy games like Cities: Skylines and Total War, with Crysis 3 being the exception for an action game.

2. Real-world benchmarks are usually a good predictor of relative performance (as long as one contextualizes them in their assessment), but theoretical predictions are pretty useless without considering the architectural differences; that is why we've rejected this MIPS prediction (which is usually based on the best-performing instruction set's performance) as being anything useful. Again, as Pemalite said,

"MIPS is only relevant if the CPU's being compared are of an identical architecture.

Nor is it representative of a complete processors capabilities anyway.

I mean... There are soundcards with 10,000+ MIPS, you aren't running Crysis on them."

3. Well yeah, when you have a dedicated and specialized thing called a GPU, you're going to be mainly using that to compute FLOPS where you can. Hence the usefulness of Dhrystone when it comes to measuring the viability of general-purpose processing. The people in the NeoGAF link I provided ran a more general benchmark that includes FP operations anyway, and the conclusion was pretty much the same.

4. But again, you can't just add up the MIPS across the various cores and say: "See, it is more powerful because it has more MIPS." Most computations are going to be on two cores, with each extra core giving diminishing returns (see the back-of-envelope sketch at the end of this post).

5. I look at L.A. Noire and see an unpolished game. There is no reason why frame rate should affect gameplay speed if the game were polished. The number of high-quality Switch titles that reach a solid 60fps relative to PS3/360 titles tells us just as much about the relative CPU performance, unless one thinks the GPU is offsetting a lot of the CPU's tasks, which could very well be a possibility.

6. If so, then provide the evidence rather than speculation. 

7. Who knows, they could always free up the fourth core once they optimize their OS further. But generally, what you're seeing is not just a trend with Nintendo. The other platform-makers knew that by dividing the processing between the CPU and GPU more efficiently they could achieve better results. It is why GPGPU was such a big buzzword on gaming forums when the PS4/XBO launched. When you have vector operations to compute, it makes much more sense to use a processor that has hundreds of weak cores versus one that has three or four strong ones (a small illustration follows below). In Nintendo's case, I suspect the Tegra was under-clocked for TDP/heat considerations.
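
To illustrate the kind of workload meant above, here is a minimal Python sketch (purely illustrative, not code from any console SDK; the particle-update workload and chunk count are assumptions): an element-wise vector update has no dependencies between elements, so it splits cleanly into independent pieces that could each be handed to a separate weak core or GPU thread group.

```python
# Minimal sketch: an element-wise vector operation is "embarrassingly parallel".
# Each output element depends only on its own inputs, so the work divides evenly
# across any number of cores; a GPU's hundreds of simple cores each take a slice.
import numpy as np

positions = np.random.rand(1_000_000, 3)   # hypothetical particle positions
velocities = np.random.rand(1_000_000, 3)  # hypothetical particle velocities
dt = 1.0 / 60.0                            # one 60fps frame step

# Serial view: one strong core walks the whole array.
updated_serial = positions + velocities * dt

# Parallel view: the same work cut into independent chunks; each chunk could go
# to a different core or GPU thread block with no coordination between them.
chunks = 8  # stand-in for "hundreds of weak cores"
updated_parallel = np.concatenate([
    p + v * dt
    for p, v in zip(np.array_split(positions, chunks),
                    np.array_split(velocities, chunks))
])

assert np.allclose(updated_serial, updated_parallel)  # same result either way
```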

Edit: By the way, here is some more evidence about the DMIPS figures.

http://lowendmac.com/musings/05/0513.html

"Update: Under Linux, the Xbox 360 has Dhrystone benchmark scores roughly comparable to a 1.25 GHz G4 Mac mini (also running Linux). That's a lot less power than we ever would have expected from a triple-core 3.2 GHz PowerPC machine."

Last edited by sc94597 - on 28 January 2018

Tell me an open world PS3 game that looks as good as this: 



"The strong do what they can and the weak suffer what they must" - Thoukydides

Alkibiádēs said:

Tell me an open world PS3 game that looks as good as this: 

Virtual Hydlide on Saturn looks better. In VH it looks like a real out-of-shape white guy wearing a bad costume! It looks so real! I don't see a real-looking fat white guy in a bad costume in Zelda... do you? Exactly! Saturn has 8 processors. It has the power of Model 2 games and true-to-life fat white guys!



Alkibiádēs said:

Tell me an open world PS3 game that looks as good as this: 

And runs above 720p. As far as I can recall, not a single complex open world game on PS3/360 was higher than 720p.

Last edited by curl-6 - on 28 January 2018

fatslob-:O said:

It's questionable if the PS3 was "technically superior" since there were many other pitfalls in hardware. Hardware capability is defined with respect to bottlenecks and software.

In overall performance, the PlayStation 3 was technically superior to the 360; it's not even up for debate at this point, as it's been done to death.

fatslob-:O said:

Calling ports "shit" is too shallow when no game design perfectly matches hardware bottlenecks for every platform ... (NFSU2 wasn't hot on the GC but it was a perfectly competent port for what the hardware could do)

You are just confirming my point. If a game doesn't account for a platform's various hardware nuances and suffers from erratic performance and compromises to visual fidelity, then it is a shit port.

Every game is built with a main platform in mind; Skyrim, for instance, was a shit port on the PlayStation 3 because the Xbox 360 was the main platform it was developed for.

This isn't the only time in console history this has happened either.
The Wii U suffered a lot of the same issues as the PlayStation 3 with shit ports.


fatslob-:O said:

For what it is, the GC was overrated and the PS2 went underrated in terms of hardware capability, and the two are closer (possibly about even) than what most of the hardware/enthusiast community thinks ... (it had too many real-world pitfalls to be deemed "superior", and it showed, since many ports suffered due to the games not being originally designed around GC bottlenecks; it's arguably one of the biggest reasons why the original Xbox was able to keep up with a supercharged GC, since it had programmable shaders)

GC hardware is lamer than what most hardcore gamers believe, and most people had no idea that the PS2 well surpassed the PS3 in terms of hardware and software design complexity ...

I disagree.
When a game was built around the GameCube's hardware, it showed it was a step up above the PlayStation 2.
If we start talking about multiplatform ports, then the improvements would be marginal...

We all remember how marginal the improvement was between 6th and 7th gen early on; the improvement in Call of Duty 3 between the PS2 and Xbox 360 wasn't exactly a generational divide, all things considered... So one would assume the differences would be even smaller for a console of the same gen that was more capable.

In the end though, the games really do speak for themselves: Xbox and GameCube games built for the hardware were a big step up over the PlayStation 2. I can't believe I am having this discussion in 2018, to be honest.

fatslob-:O said:

(the Polymorph engine is only useful for geometry amplification/tessellation, but developers figure that they can just pass high-polycount meshes instead, since they don't seem to think the higher vertex attribute bandwidth consumption and more expensive vertex shader is a problem)

I am aware. However, the Polymorph engine will still kick the TruForm engine in the nuts on the Xbox 360... whilst the PlayStation 3 weeps in the corner.




www.youtube.com/@Pemalite

Pemalite said:

In overall performance, the PlayStation 3 was technically superior to the 360; it's not even up for debate at this point, as it's been done to death.

It's still pretty debatable. The PS3 was NUMA in its most extreme form. Segmented physical memory meant level designs had to be smaller to avoid stutters; the SPEs did not have access to main memory and instead had 256KB local stores, so a DMA engine had to be used to communicate between the two; it had a heterogeneous processor and virtual memory; the lack of eDRAM made the PS3 struggle more often with alpha effects compared to the 360; and lower geometry performance and no unified shaders made load balancing nearly impossible, so some efficiency is lost right there ... (a similar situation applies to the Wii U)

Pemalite said:

You are just confirming my point. If a game doesn't account for a platform's various hardware nuances and suffers from erratic performance and compromises to visual fidelity, then it is a shit port.

Every game is built with a main platform in mind; Skyrim, for instance, was a shit port on the PlayStation 3 because the Xbox 360 was the main platform it was developed for.

This isn't the only time in console history this has happened either.
The Wii U suffered a lot of the same issues as the PlayStation 3 with shit ports.

Calling ports shit severely undermines the technical difficulty in a developer's work, and I don't think you understand the hardships that technical developers have to go through ... (not every port can be built to take advantage of each platform; sometimes it's not even possible, as in the case of 6th gen)

If ports were made to take specific advantage of each piece of hardware, especially in the divergent case, then you'd get visual differences instead of compromises, and there would be no technically "inferior" or "superior"; it would probably come down to subjectivity ...

Pemalite said: 

I disagree.

When a game was built around the GameCube's hardware, it showed it was a step up above the PlayStation 2.
If we start talking about multiplatform ports, then the improvements would be marginal...

We all remember how marginal the improvement was between 6th and 7th gen early on; the improvement in Call of Duty 3 between the PS2 and Xbox 360 wasn't exactly a generational divide, all things considered... So one would assume the differences would be even smaller for a console of the same gen that was more capable.

In the end though, the games really do speak for themselves: Xbox and GameCube games built for the hardware were a big step up over the PlayStation 2. I can't believe I am having this discussion in 2018, to be honest.

Because many games used different technologies back then, comparisons couldn't be made on a binary basis; at that point, the terms "superior" and "inferior" became subjective ...

There were GC games built in a way that couldn't be run on the PS2, but the same easily applied the other way around. There were things each could do that the other couldn't, so neither had a definitive advantage ... (the GC wasn't very good at vertex processing or alpha effects to the same degree as the PS2, and the PS2 didn't have texture compression or a flexible texture combiner system)

I recall an instance at Beyond3D where one developer said the GC was easily the worst-performing platform of the three based on his experience, especially in cases where the GPU had to clip triangles ...

Pemalite said: 

I am aware. However, the Polymorph engine will still kick the TruForm engine in the nuts on the Xbox 360... whilst the PlayStation 3 weeps in the corner.

Tessellation is useless since nearly no developers are using it anymore (the concept was great but the technology/implementation sucked, plus there were issues with quad shading efficiency), and I doubt it's an advantage for the Switch since it doesn't have very high geometry throughput to begin with (384M tri/s at the absolute lowest? The 360 was able to do 500M tri/s while the PS3 was half that rate?). Polymorph and TruForm ended up being dead silicon; I bet async compute will get more traction than current tessellation technology ever could ...



I can't believe people here really believe that PS3 is better than Switch. I mean... I just can't even argue with someone like this



EricFabian said:
I can't believe people here really believe that PS3 is better than Switch. I mean... I just can't even argue with someone like this

Well, to be fair, many of these people believed that Sony should have used the Cell processor in the PS4 as well.

I don't understand what we are arguing right now. If it's sheer power, then that's a given: Switch, all day. If we are just talking visuals, though, then as someone else already pointed out, at the start of this generation many people claimed the PS4 didn't look that advanced in comparison to the PS3; now they would never say that. It's going to take exclusives and other games created with the Switch in mind to see what the system can truly do.



EricFabian said:
I can't believe people here really believe that PS3 is better than Switch. I mean... I just can't even argue with someone like this

The fact we're even having this discussion is quite silly, frankly. It's like saying "the 3DS has better graphics than the Vita because I haven't seen anything on Vita that looks better than RE Revelations".

Last edited by curl-6 - on 28 January 2018