Pemalite said:
The Switch has Delta Colour Compression.
To be fair... the Switch's GPU, like the Wii U's, is more efficient. It's not as efficient as Maxwell, but it's certainly better than the R580 hybrid chip in the Xbox 360.
The Switch's CPU can hold its own against the last-generation consoles. It's amazing how far CPU performance has come in the last decade.
Gflops are irrelevant unless all other things are equal.
Radeon 5870:
And so it should.
In some tasks the Switch's CPU will beat the Xbox 360/PlayStation 3's CPU; in other tasks it will lose. For now, though, ports are our best gauge of the device's CPU capabilities.
I have made the point myself about better optimisation of bandwidth, but there still seems to be a shortfall, and if anything your values seem to confirm it. As I said previously, the Xbox One has a fairly recent GPU architecture but still benefits from that 32MB of ultra-fast memory to supplement main memory bandwidth, which is itself much higher than the Switch's anyway.
My point about graphics cards was a low-end chipset of recent times, as used in Nintendo hardware, versus a high-performance part of the past with similar gflops output, and I'm sure I've seen PC benchmarks where the older cards actually came out faster because they weren't as compromised. Your comparison seems more like mid-range versus old high-range; I certainly wasn't thinking of such a comparison. I was thinking of a high- or mid-range old GPU versus a low-end/mobility part of more recent times with similar gflops. In the case of the Wii U's Radeon GPU, after all the debate about 176 vs 352 gflops, it was really the exceptionally low power draw of the Wii U that confirmed it couldn't be 352 gflops, and that it was likely a Mobility Radeon GPU rather than the desktop version. That seemed to fit better with the confirmed spec, and was a better fit for a small console with limited cooling.
I don't agree with your comparison, as I say, but even if we accept it, it looks like most game frame rates between the two are broadly similar, and roughly speaking the newer chipset achieves about a 50% frame-rate boost for the same gflops. That doesn't match some other comparisons I've seen, where they fitted older and newer GPUs to the same motherboard and CPU setup. For all I know, those could have been historic frame rates with the older GPU paired with an older motherboard and weaker CPU, which would make the comparison utterly pointless. But even if we accept it, that means the 155 gflops is boosted to an effective 230 gflops or so, putting it squarely into PS3/360 performance territory and no more. It doesn't seem right, though, with games like Xenoblade 2 dropping to 368p at times, and Doom at very low resolutions too, both at 30fps. Some Switch games suffer from slower frame rates when docked; that must surely be a CPU or memory bandwidth issue. However, where a game has to drop resolution significantly to maintain frame rate in portable mode, it feels like a GPU performance issue: the CPU is the same, and memory bandwidth goes further at lower resolutions.
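The back-of-envelope arithmetic above can be written out explicitly; a minimal sketch using the figures assumed in this post (155 gflops at portable clocks and a ~50% per-gflop efficiency advantage for the newer architecture, both rough estimates from the discussion rather than measured specs):

```python
# Rough effective-throughput estimate using the figures assumed in the post.
# Both numbers are discussion estimates, not confirmed specifications.
switch_portable_gflops = 155   # assumed FP32 throughput at portable clocks
efficiency_advantage = 0.50    # assumed newer-architecture gain per gflop

effective_gflops = switch_portable_gflops * (1 + efficiency_advantage)
print(f"{effective_gflops:.0f} effective gflops")  # ~230, i.e. 360/PS3 territory
```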
We are clearly both in agreement about CPU performance anyway: yes, the Tegra at full clocks would be comfortably superior to the Xbox 360 in CPU terms, but Nintendo didn't use full clocks, and only three CPU cores are used for actual games; the fourth is dedicated to the operating system and background tasks. I still secretly hope Nintendo might unlock some CPU performance with a later firmware, but maybe that is unlikely. Sony did increase the PSP's clocks slightly, so it is possible. I'm unsure whether at full clocks it would beat a PS3 with all Cells firing and fully optimised code; I strongly suspect not, considering many argue that not even the Xbox One or PS4 achieve that, and there are some benchmarks that show it stronger in real-world performance than the PS4/Xbox One.
I tend to look at what developers are achieving with the hardware as well as the spec numbers: see what's happening on screen and where the compromises are, especially where the same compromises are repeated by many different developers. I think you always have to factor in real-world evidence.
I'm not yet convinced of the Tegra's graphical superiority over the 360/PS3 in portable mode at such low clocks. To me it looks much more like the 4GB of main memory is the real saviour: room for better graphics and textures without the constant need to swap/stream data that the 360 and PS3 had. I feel this is the huge strength of the system compared to the 360/PS3. I honestly wonder, if the PS3 and 360 had had that memory available, whether it wouldn't have both sped things up and allowed more sophisticated game engines. The PS3 especially was a nightmare; I remember running Fallout 3 with DLC at 480i on my PS3 just because it helped reduce slowdown, which was awful in Operation Anchorage and the other DLC. Capcom's pressure on Nintendo to boost the Switch's memory from 2GB to 4GB was, I feel, a great move; I just wish they had asked for higher CPU clocks at the same time. You have made a case for the Switch's graphics hardware, but despite being a big Nvidia fan myself, I'm not convinced by it at this point for portable performance. For docked, yes: great, and easily superior to the 360/PS3, no question.