| JEMC said:
And now look at this table:
| SKU Name | Cores/Threads | Base Clock (15W) | Max Clock (>15W) | L2 Cache | iGPU | GCN CUs | GCN SPs | iGPU Clock | TDP | DDR4 Data Rate |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RX-421BD | 4/4 | 2.1 GHz | 3.4 GHz | 2 MB | Radeon R7 | 8 | 512 | 800 MHz | 12-35W | 2400 MHz |
| RX-418GD | 4/4 | 1.8 GHz | 3.2 GHz | 2 MB | Radeon R6 | 6 | 384 | 800 MHz | 12-35W | 2400 MHz |
| RX-216GD | 2/2 | 1.6 GHz | 3.0 GHz | 1 MB | Radeon R5 | 4 | 256 | 800 MHz | 12-15W | 1600 MHz |
| RX-421ND | 4/4 | 2.1 GHz | 3.4 GHz | 2 MB | N/A | N/A | N/A | N/A | 12-35W | 2400 MHz |
| RX-216TD | 2/2 | 1.6 GHz | 3.0 GHz | 1 MB | N/A | N/A | N/A | N/A | 12-15W | 1600 MHz |
Now, given that AMD first talked about these chips back in 2014, it is much more logical (though still inherently speculative) to assume that Nintendo could use them to power their machines.
Why wouldn't Nintendo use the most powerful of those SoCs, the 421BD, as the base of their home console, dialing the CPU frequency down to 1.6 GHz but doubling the GPU from 8 CUs to 16 (or even 20, like the PS4) to keep it around the 20W mark, and use the 216GD as the base for the handheld part of the NX, tweaking it to stay within the 10W mark?
Porting between the home console and the handheld would be a lot easier with the same components (just in different quantities), both would use tried-and-tested parts, and being SoCs they wouldn't be too pricey for the performance they deliver. Surprisingly enough, the home console could even end up more powerful than the PS4. It's a win-win scenario for Nintendo!
Now, am I going to say that Nintendo will use those chips? Of course not! There were SoCs more powerful than the Wii U long before it launched and Nintendo didn't use them back then, so no one can say what Nintendo will do.
But if I had to bet, I'd bet on one of these chips rather than on a Zen CPU plus an Arctic Islands GPU.
|
First, the PS4 has 18 CUs, not 20 (1152 SPs, not 1280, since each GCN CU holds 64 SPs).
But more to the point about the bolded part: that calculation wouldn't work. The GPU is by far the most power-hungry part of the chip, so going up to 16 CUs while lowering the CPU clock from 2.1 to 1.6 GHz would push the TDP to around 50W minimum, and that's at the GPU's base clock, not the 800 MHz turbo clock listed in that table. These APUs lower the CPU clock anyway when both the CPU and GPU parts are at full load (Intel instead opts to lower the GPU clock and give the CPU full resources), and they tend not to hold their turbo clock for long anyway (at least previous generations didn't).
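For what it's worth, here's that estimate as a rough back-of-envelope sketch in Python. Every number in it is an assumption for illustration (a roughly 15W/20W CPU/GPU split of the 35W budget, and simple linear scaling with clock and CU count), not a measured figure, but it shows why 16 CUs can't fit into a ~20W envelope:

```python
# Back-of-envelope check of the "16 CUs at ~20W" idea.
# All numbers are illustrative assumptions: we assume the RX-421BD's 35W
# full-tilt budget splits roughly 15W CPU / 20W GPU, that GPU power scales
# about linearly with CU count at a fixed clock, and that CPU power scales
# about linearly with frequency (optimistic, since voltage usually drops too).

CPU_POWER_AT_2_1GHZ = 15.0   # W, assumed CPU share of the 35W budget
GPU_POWER_8CU       = 20.0   # W, assumed GPU share (8 CUs @ 800 MHz)

def estimate_apu_power(cpu_clock_ghz: float, cu_count: int) -> float:
    """Crude linear scaling of the assumed baseline figures."""
    cpu_w = CPU_POWER_AT_2_1GHZ * (cpu_clock_ghz / 2.1)
    gpu_w = GPU_POWER_8CU * (cu_count / 8)
    return cpu_w + gpu_w

# Proposed console config: CPU at 1.6 GHz, GPU doubled to 16 CUs.
print(f"16 CUs @ 1.6 GHz CPU: {estimate_apu_power(1.6, 16):.1f} W")  # ~51 W
# Even a 12 CU part lands around 41 W under these assumptions.
print(f"12 CUs @ 1.6 GHz CPU: {estimate_apu_power(1.6, 12):.1f} W")
```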
I agree that Nintendo probably won't use Zen and GCN 1.3 (Arctic Islands), but I also have doubts about Carrizo or any earlier Bulldozer-based core, because the way its modules work is at odds with the needs of a console. My bet is more on Puma+ cores (a nicely improved Jaguar over the PS4/Xbox One one, with higher IPC and clock rates), which also fits Nintendo's low-power approach better, plus GCN 1.1 or 1.2 for the graphics part, both on 28nm.
Since both are only very incremental updates to the hardware of the PS4/Xbox One, beating them in processing power would require consuming practically just as much power as they do, which is not how Nintendo does things. It could beat the Xbox One, but I seriously doubt it would beat the PS4. At least when it comes to GPU power, that is; on the CPU side Nintendo could easily clock a Puma+ above 2 GHz without consuming more than its PS4/Xbox One counterparts, meaning less power needs to be diverted from the GPU to the CPU. So while weaker on paper, it could probably keep up pretty well.
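That CPU point is basically a dynamic-power argument: CMOS switching power scales roughly with frequency times voltage squared, so a higher clock doesn't have to mean proportionally higher consumption if the chip can run at a lower voltage. A minimal sketch of the scaling relation, with made-up voltages that are purely assumptions (no published Jaguar/Puma+ figures are used):

```python
# P_dyn ~ C * f * V^2 (CMOS dynamic power). Illustrative numbers only:
# the voltages below are assumptions, not published Jaguar/Puma+ figures.

def relative_power(f_new: float, f_old: float, v_new: float, v_old: float) -> float:
    """Dynamic power of the new operating point relative to the old one."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Jaguar at 1.6 GHz (PS4-like) vs a hypothetical Puma+ at 2.2 GHz:
print(relative_power(2.2, 1.6, 1.00, 1.00))  # ~1.38x if voltage stayed the same
print(relative_power(2.2, 1.6, 0.85, 1.00))  # ~0.99x if it needed ~15% less voltage

# Holding power constant across a 1.6 -> 2.2 GHz bump needs roughly
# sqrt(1.6 / 2.2) ~ 0.85x the voltage, which is the crux of the
# "higher clock without a bigger power budget" argument.
```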
Puma+ could also be interesting for a fusion concept (which I still don't really believe in, but hey, speculation), since AMD has made chips on that architecture specifically for tablets with a TDP under 5W (they mostly end up in low-price laptops instead, even 17-inch ones), so it basically has the ideal handheld counterpart, too.