Pemalite said:
bonzobanana said:

Memory bandwidth is a bit limited and there is pressure to keep storage use low. In that situation there will be games that perform better on PS3 and 360 than on Switch.

The Switch has Delta Colour Compression.
Maxwell's implementation of Delta Colour Compression allowed it to achieve a 20-30% bandwidth saving over Kepler.
https://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/3

That's 3rd gen. So there was another 15-20% on top of that.

Thus bandwidth isn't directly comparable between the 7th gen consoles and the Switch based purely on the memory bus.

And thanks to Maxwell's tile-based rasterization it's more efficient at culling, meaning it does less work and can put the limited bandwidth it has to greater effect.
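To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python). The 25.6GB/s figure is the commonly quoted Switch memory bandwidth and the savings range is the one from the AnandTech article linked above; treat it as illustrative only, since the real saving is workload-dependent:

```python
# Back-of-the-envelope effective bandwidth. Assumptions (mine, not from the thread):
# 25.6 GB/s is the commonly quoted Switch LPDDR4 figure at the docked memory clock,
# and the 20-30% saving is the Maxwell DCC range from the AnandTech article above.
RAW_BANDWIDTH_GBS = 25.6

for dcc_saving in (0.20, 0.25, 0.30):
    # If DCC removes X% of the traffic, the bus behaves as if it were 1/(1-X) larger.
    effective = RAW_BANDWIDTH_GBS / (1.0 - dcc_saving)
    print(f"{dcc_saving:.0%} saving -> ~{effective:.1f} GB/s effective")
```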


bonzobanana said:

I still see people today write that the Wii U is superior to the 360 and PS3, and yet I know from experience this is untrue overall. The Wii U is a weaker console that benefits from fantastic Nintendo games.

To be fair... the Switch's GPU, like the Wii U's, is the more efficient design. The Wii U's isn't as efficient as Maxwell, but it's certainly better than the R580-hybrid chip in the Xbox 360.
The Achilles' heel is of course that paltry 12.8GB/s of main memory bandwidth, which the eDRAM is supposed to mitigate; how effectively it is leveraged will vary between developers.
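For a sense of how much the eDRAM can take off that bus, here's a rough render-target traffic sketch. The overdraw and read/write factors are assumptions I've picked for illustration, not measured figures:

```python
# Rough colour+depth buffer traffic for a 720p/60 game, to show what keeping
# render targets in eDRAM takes off the 12.8 GB/s main memory bus.
# Overdraw and read/write factors are assumed, illustrative values.
width, height, fps = 1280, 720, 60
bytes_per_pixel = 4 + 4            # 32-bit colour + 32-bit depth
overdraw = 3.0                     # assumed average shaded fragments per pixel
read_write_factor = 2.0            # blending/depth-test traffic roughly doubles it

traffic_gbs = width * height * bytes_per_pixel * overdraw * read_write_factor * fps / 1e9
print(f"~{traffic_gbs:.1f} GB/s of render-target traffic kept off the main bus")
```

Even with conservative assumptions that's a meaningful slice of a 12.8GB/s bus, which is why how well a developer uses the eDRAM matters so much.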

bonzobanana said:

I'm not a Switch owner, but I'm looking to buy one at some point and am curious about its performance level. It does feel like many are ignoring the Switch's lack of CPU resources, but the games are showing this to be an issue.

The Switch's CPU can hold its own against the last generation consoles. It's amazing how far CPU performance has come in the last decade.
The Switch's CPU would have easily beaten the Xbox 360's and Wii U's CPUs, no contest, if it actually ran at its original clocks.

Better low-level APIs and more efficient drivers have lightened the CPU burden somewhat though, which will play in the Switch's favor.

bonzobanana said:

It's at a lower level in pure gflops: something like 157 gflops in portable mode, or 188 gflops with a performance boost at the expense of battery life. From what I've read, 157 gflops is the norm in most games, not 188. That's 19 gflops less than the Wii U and something like 50-90 less than the 360 and PS3 in portable mode.

Gflops are irrelevant unless all other things are equal.
A GPU with fewer flops can beat a GPU with more flops.
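For context, that headline number is just shader cores × clock × 2 FLOPs per cycle (one FMA), which is exactly why it says nothing about how efficiently those flops get used. A quick sketch, using the widely reported 256 CUDA cores of the Tegra X1 and the GPU clocks people usually quote for the Switch (take the exact clocks as assumptions):

```python
# FP32 GFLOPS = shader cores * clock (MHz) * 2 FLOPs per core per cycle (FMA) / 1000.
# 256 CUDA cores is the widely reported Tegra X1 figure; the clocks below are the
# commonly quoted Switch GPU clocks, not official numbers.
def gflops(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz * 2 / 1000.0

for label, clock_mhz in (("portable", 307.2), ("docked", 768.0)):
    print(f"{label}: ~{gflops(256, clock_mhz):.0f} GFLOPS FP32")
```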

bonzobanana said:

In the PC world we aren't seeing claims that a slightly newer architecture and feature set radically changes game performance. An old high-end GPU of 200 gflops isn't magically outperformed by a new low-end budget one of 200 gflops just because one is DX10 and the other DX12. Sometimes the older model might actually perform better, as it may have a higher-end spec elsewhere: more memory, higher-bandwidth memory, etc. It was designed with fewer compromises and cost reductions. On screen you might get one or two improved effects if you are lucky, but mostly nothing.

Radeon 5870:
* 2.72 Teraflops.
* 153.6GB/s of bandwidth.

Radeon 7850:
* 1.76 Teraflops.
* 153.6GB/s of bandwidth.

No contest that the Radeon 5870 would win right? I mean, flops is all that matters? And if you have the same bandwidth there isn't a radical difference in performance right? Those are YOUR claims.

And you are wrong.
https://www.anandtech.com/bench/product/1062?vs=1076

The Radeon 7850 wins hands down.

Fact of the matter is... architectural efficiency is important, and people need to stop ignoring it in favor of using something simple like flops or bandwidth as their only comparison point. It's stupid, it's inaccurate, and it's wrong. Stop doing it.
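If you want to put a number on that efficiency gap, the interesting metric is performance per flop rather than raw flops. A tiny sketch; the fps figures below are hypothetical placeholders to show the calculation, not results pulled from the AnandTech bench page:

```python
# Performance-per-flop comparison. TFLOPS figures are the ones quoted above;
# the fps numbers are hypothetical placeholders to illustrate the calculation,
# NOT results from the linked AnandTech bench page.
cards = {
    "Radeon 5870 (VLIW5)": {"tflops": 2.72, "fps": 40.0},   # placeholder fps
    "Radeon 7850 (GCN)":   {"tflops": 1.76, "fps": 45.0},   # placeholder fps
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['tflops']:.1f} fps per TFLOP")
```

Plug in real numbers from the bench link and GCN's higher fps-per-TFLOP is what "architectural efficiency" means in practice.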

bonzobanana said:

Like the Wii U, though, I'm sure the Switch will see many comparisons with newer and older hardware over its lifetime, and the evidence accumulating now already shows its strengths and weaknesses clearly.

And so it should.

quickrick said:

It's still far too early to judge that the Switch CPU is weaker than the 360/PS3 CPUs; it just came out, and runs most games better, aside from L.A. Noire, which is probably the most demanding port. But that game especially targeted the PS3's CPU, and the 360 ran a bit, so it's not exactly a fair comparison. We need more games to come to this conclusion.

In some tasks the Switch's CPU will beat the Xbox 360/PlayStation 3's CPUs. In other tasks it will lose.
I.e. it should win at integer work every day of the week as it's a more balanced architecture, but lose at iterative-refinement floating point, which the Cell will dominate.

It's a very different ISA anyway.
The main issue with the Switch CPU is that it is being held back by its clock rate; if it were running 80% faster it would win, no contest.

For now though... ports are our best gauge of the CPU capabilities of the device.

I have made the point myself about better use of bandwidth, but there still seems to be a shortfall, and if anything your figures seem to confirm it. As I said previously, the Xbox One has a pretty recent GPU architecture but still benefits from that 32MB of ultra-fast memory to supplement main memory bandwidth, which is itself much higher than the Switch's anyway.

My point about graphics cards was a low-end chipset of recent times, as used in Nintendo hardware, versus a high-performance one of the past with similar gflops output, and I'm sure I've seen PC benchmarks where the older cards actually came out faster because they weren't as compromised. Your comparison seems more like mid-range versus old high-end. I certainly wasn't thinking about such a comparison; I was thinking of a high- or mid-range old GPU versus a low-end/mobility one of more recent times with similar gflops. In the case of the Wii U's Radeon GPU, after all the debate about 176 or 352 gflops, it was really the exceptionally low power draw of the Wii U that confirmed it couldn't be 352 gflops and was likely a mobility Radeon GPU rather than the desktop version. That seemed a better fit for the confirmed spec and for a small console with limited cooling.

I don't agree with your comparison, as I say, but even if we accept it, most game frame rates between the two look broadly similar, and roughly speaking the newer chipset achieves about a 50% frame-rate boost for the same gflops. That doesn't match some other comparisons I've seen where older and newer GPUs were fitted to the same motherboard and CPU setup. For all I know those could have been historic frame rates with the older GPU matched to an older motherboard and a weaker CPU, which would make the comparison utterly pointless. But even if we accept it, that means the 155 gflops is boosted to about 230 effective gflops or so, putting it squarely into the PS3/360 performance level and no more.

It doesn't seem right though, with games like Xenoblade 2 dropping to 368p at times, and Doom at very low resolutions too, both at 30fps. Some Switch games suffer from slower frame rates when docked, which must surely be a CPU or memory bandwidth issue. However, where a game has to drop resolution significantly to maintain frame rates in portable mode, it feels like a GPU performance issue: the CPU is the same, and memory bandwidth goes further at lower resolutions.
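Rough arithmetic on that estimate (the 50% uplift is just my reading of the comparison above, not a measured figure, and the 360/PS3 range is simply the one implied by my earlier numbers):

```python
# Very rough effective-GFLOPS estimate. The 1.5x uplift is an assumed figure
# from the comparison above; the 360/PS3 range is simply 157 + 50 to 157 + 90
# as quoted earlier in the thread, not an official spec.
switch_portable_gflops = 157.0
assumed_uplift = 1.5

effective = switch_portable_gflops * assumed_uplift
print(f"~{effective:.0f} effective GFLOPS vs a ~207-247 GFLOPS ballpark for 360/PS3")
```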

We are clearly both in agreement about the CPU performance anyway: yes, the Tegra at full clocks would be comfortably superior to the Xbox 360 in CPU terms, but Nintendo didn't use full clocks, and only 3 CPU cores are used for actual games; the other is dedicated to the operating system and background tasks. I still secretly hope Nintendo might unlock some CPU performance with a later firmware, but maybe that is unlikely. Sony did increase the PSP's clocks slightly, so it is possible. I'm unsure whether at full clocks it would beat a PS3 with all Cell cores firing and fully optimised code; I strongly suspect not, considering many consider that not even the Xbox One or PS4 achieve that, and there are some benchmarks that show the Cell stronger in real-world performance than the PS4/Xbone.

I tend to look at what developers are achieving with the hardware as well as the spec numbers: see what's happening on screen and where the compromises are, especially where the same compromises are repeated by many different developers. I think you always have to factor in real-world evidence.

I'm not yet convinced of the Tegra's graphical superiority over the 360/PS3 in portable mode with such low clocks. To me it looks much more like the 4GB of main memory is the real saviour: room for better graphics and textures without the constant need to swap/stream data as on the 360 and PS3. I feel this is the huge strength of the system compared to the 360/PS3. I honestly wonder, if the PS3 and 360 had had that much memory available, whether it wouldn't have both sped them up and allowed more sophisticated game engines. The PS3 especially was a nightmare; I remember running Fallout 3 with DLC at 480i on my PS3 just because it helped reduce the slowdown, which was awful in Operation Anchorage and the other DLC.

Capcom's pressure on Nintendo to boost the Switch's memory from 2GB to 4GB was, I feel, a great move; I just wish they had asked for higher CPU clocks at the same time. You have made a case for the Switch's graphics hardware, but despite being a big Nvidia fan myself I'm not convinced by it at this point for portable performance. Docked, yes, it's great and easily superior to the 360/PS3, no question.