Hello,
The Wii U's power has been the subject of much speculation over the last year. Some developers initially praised it over the PS3/360 generation, yet most games don't seem to take advantage of the system's power, and multiplatform titles usually end up as the poorer versions. However, the system is not necessarily underpowered for Nintendo's target audience, as a few spectacular games have shown. For third-party developers, though, the main issues from a technical point of view are:
1) Nintendo chose the wrong architecture this time. They didn't foresee the industry shifting away from PowerPC to x86 (Apple, Sony, and Microsoft have all switched), and suddenly Nintendo is the odd one out.
2) The Wii was underpowered, but casual gamers created a large enough audience for third-party developers to still target it. Now, lacking a comparably compelling gimmick, that crowd isn't flocking to the Wii U. Many Nintendo players of yore have grown up and moved on to more hardcore platforms. The tablet controller could have been a nice hook if it hadn't already been overshadowed by the onslaught of annually updated tablets.
After wishing the Wii U the best of luck (I have nothing against it), I would like to compare the Wii U's GPU to the PS4's GPU, purely from a technical point of view. THIS IS, IN NO WAY, INTENDED TO BE A COMPARISON OF CONSOLES, but merely a technical comparison of two GPUs from the same manufacturer (AMD) built on different baseline architectures (VLIW vs. GCN).
I remember reading a technical analysis and comparison of the old and new architectures at the following link:
http://techreport.com/review/24086/a-first-look-at-amd-radeon-hd-8790m
First, I'm trying to figure out the approximate average efficiency increase of the new architecture (apart from its increased capability, scalability, and feature set, even before any optimization). The cards tested are:
GPU | ALUs | Core clock | Mem. clock | Mem. interface width | Memory | Fab. process | Die size |
Radeon HD 7690M | 480 | 600 MHz | 800 MHz | 128-bit | 1 GB GDDR5 | 40 nm | 118 mm² |
Radeon HD 8790M | 384 | 900 MHz | 1000 MHz | 128-bit | 2 GB GDDR5 | 28 nm | ~76.5 mm² |
Let's create a simple linear baseline that puts both architectures on equal footing, removing any inherent advantage either may have.
Relative to the HD 8790M, the HD 7690M:
-has 25% more cores (1.25),
-runs at 2/3 of the core clock (0.667),
-runs at 80% of the memory clock (0.80).
Therefore, to level the playing field, I need to multiply the HD 8790M results by 1.25 x 0.667 x 0.80 ≈ 0.667, which is a rough linear approximation of its "core- and frequency-equivalent" results.
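As a quick sanity check on that arithmetic, here is a minimal Python sketch (my own, not from the TechReport article) that derives the adjustment factor straight from the spec table above:

```python
# Derive the "core & frequency equivalent" adjustment factor for the HD 8790M.
# All numbers are transcribed from the spec table above.

specs = {
    "HD 7690M": {"alus": 480, "core_mhz": 600, "mem_mhz": 800},
    "HD 8790M": {"alus": 384, "core_mhz": 900, "mem_mhz": 1000},
}

old, new = specs["HD 7690M"], specs["HD 8790M"]

core_ratio  = old["alus"] / new["alus"]          # 480 / 384  = 1.25
clock_ratio = old["core_mhz"] / new["core_mhz"]  # 600 / 900  ~ 0.667
mem_ratio   = old["mem_mhz"] / new["mem_mhz"]    # 800 / 1000 = 0.80

factor = core_ratio * clock_ratio * mem_ratio
print(f"adjustment factor ~ {factor:.3f}")       # ~0.667
```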
Average Frame Rate (FPS, higher is better)
Game | HD 7690M | % | HD 8790M | % | HD 8790M adj. | % |
Battlefield 3 | 31 | 100.0% | 49 | 158.1% | 32.7 | 105.4% |
Borderlands 2 | 28 | 100.0% | 44 | 157.1% | 29.3 | 104.8% |
Far Cry 3 | 29 | 100.0% | 42 | 144.8% | 28.0 | 96.6% |
Hitman Abs. | 22 | 100.0% | 41 | 186.4% | 27.3 | 124.2% |
Sleeping Dogs | 31 | 100.0% | 47 | 151.6% | 31.3 | 101.1% |
Average | | 100.0% | | 159.6% | | 106.4% |
99th Percentile Frame Time (ms, lower is better)
Game | HD 7690M | % | HD 8790M | % | HD 8790M adj. | % |
Battlefield 3 | 39.2 | 100.0% | 24.7 | 63.0% | 37.1 | 94.5% |
Borderlands 2 | 88 | 100.0% | 54.3 | 61.7% | 81.5 | 92.6% |
Far Cry 3 | 55 | 100.0% | 39.8 | 72.4% | 59.7 | 108.5% |
Hitman Abs. | 103.8 | 100.0% | 53.2 | 51.3% | 79.8 | 76.9% |
Sleeping Dogs | 95.2 | 100.0% | 29.3 | 30.8% | 44.0 | 46.2% |
Average | | 100.0% | | 55.8% | | 83.7% |
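For anyone who wants to reproduce the "adj." columns and the averages, here is a short Python sketch. The raw FPS and frame-time numbers are the ones transcribed above, and the 0.667 factor is the one derived earlier (frame times are divided rather than multiplied, since lower is better):

```python
# Reproduce the adjusted HD 8790M columns and the average ratios above.
FACTOR = 0.667  # core- and frequency-equivalence factor derived earlier

# game: (HD 7690M, HD 8790M) average FPS
avg_fps = {
    "Battlefield 3": (31, 49),
    "Borderlands 2": (28, 44),
    "Far Cry 3":     (29, 42),
    "Hitman Abs.":   (22, 41),
    "Sleeping Dogs": (31, 47),
}

# game: (HD 7690M, HD 8790M) 99th-percentile frame time in ms
frame_time_99 = {
    "Battlefield 3": (39.2, 24.7),
    "Borderlands 2": (88.0, 54.3),
    "Far Cry 3":     (55.0, 39.8),
    "Hitman Abs.":   (103.8, 53.2),
    "Sleeping Dogs": (95.2, 29.3),
}

# Scale the 8790M down for FPS (higher is better) and up for frame times
# (lower is better), then express each result relative to the 7690M.
fps_ratios  = {g: (new * FACTOR) / old for g, (old, new) in avg_fps.items()}
time_ratios = {g: (new / FACTOR) / old for g, (old, new) in frame_time_99.items()}

print(f"adjusted avg FPS vs 7690M:       {sum(fps_ratios.values()) / 5:.1%}")   # ~106%
print(f"adjusted 99th pct time vs 7690M: {sum(time_ratios.values()) / 5:.1%}")  # ~84%
```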
Now, given that performance does not always scale linearly, these efficiency-gain figures are extremely conservative. We can therefore expect AT LEAST a 6.4% increase in average frame rates and a ~16% decrease in 99th-percentile frame times, which means not only better performance but also a more stable experience (less variance, fewer frame-rate dips). Again, these are the most conservative (absolute minimum) efficiency gains you would expect with no optimization whatsoever, on a core-for-core, clock-for-clock basis.
Now, let's move on to the actual differences between the Wii U's GPU and the PS4's GPU:
PS4 : 1152 shader ALUs, Wii U : ~320 shader ALUs (estimated), ratio : 1152 / 320 = 3.6
operating at a GPU core clock of 800 MHz vs 550 MHz, ratio : 800 / 550 ≈ 1.45
in addition to
PS4 memory bandwidth : 176 GB/s, Wii U : 12.8 GB/s (main memory), ratio : 13.75
Before even adding in the efficiency gains from moving from the older architecture to GCN:
- Simple math gives us a 3.6 x 1.45 ≈ 5.2x raw power difference.
- Given that the efficiency gains are at least ~6-7%, this widens the difference to a minimum of about 5.6x, or roughly 6 times (see the sketch below).
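Here is the same back-of-the-envelope arithmetic as a small Python sketch, using the figures quoted above (the Wii U ALU count is the usual estimate, and the 1.064 factor is the minimum GCN efficiency gain from the benchmark section):

```python
# Back-of-the-envelope GPU gap, using the figures quoted above.
PS4_ALUS, WIIU_ALUS   = 1152, 320     # shader ALUs (Wii U count is an estimate)
PS4_CLOCK, WIIU_CLOCK = 800, 550      # GPU core clock in MHz
PS4_BW, WIIU_BW       = 176.0, 12.8   # memory bandwidth in GB/s (Wii U: main memory)

alu_ratio   = PS4_ALUS / WIIU_ALUS        # 3.6
clock_ratio = PS4_CLOCK / WIIU_CLOCK      # ~1.45
bw_ratio    = PS4_BW / WIIU_BW            # 13.75

raw_gap   = alu_ratio * clock_ratio       # ~5.2x raw shader throughput
gcn_bonus = 1.064                         # minimum GCN efficiency gain estimated earlier

print(f"raw GPU gap:       {raw_gap:.2f}x")               # ~5.24x
print(f"with GCN gains:    {raw_gap * gcn_bonus:.2f}x")   # ~5.6x
print(f"memory bandwidth:  {bw_ratio:.2f}x")              # 13.75x
```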
There you have it... the PS4's GPU, very roughly speaking, yields at least 6 times the performance of the Wii U's GPU. Of course, a system is not composed of the GPU alone, but the other components tell a similar story: the memory available to games is also 5-6x that of the Wii U, the memory bandwidth is almost 14x, and the CPU is far more powerful (the Wii U's CPU is based on a 1999-era architecture).
Don't get me wrong, the Wii U is a nice machine; it's just not in the same league as the new-gen consoles, and the gap is larger than commonly thought, larger even than the one between the PS3/360 and the Wii. Nintendo knows this, and was simply hoping the touch controller would be well utilized and popular, which doesn't seem to have worked out. The Wii U is still far more powerful than the Wii, at least in terms of RAM and GPU, but it appeals to a different audience.