JustBeingReal said:
It's all in the math and pixel count, dude. 1080p is 1920x1080 = 2,073,600 pixels and 480p is 640x480 = 307,200 pixels, so 1080p is 6.75x more demanding to render than 480p. That's with the same level of effects, physics, everything else being equal between the two resolutions. If the NX console had the same level of hardware as PS4, then the NX handheld could run the exact same level of visuals and every other feature, just at native 480p, if it was packing 1/6.75 of the hardware of the console.

When I say "Carrizo cut down," that's just an easier way of saying an APU that uses fewer Excavator CPU cores and less GPU hardware than Carrizo has, clocked low enough to fit within a given wattage budget. The CPU core type could also be Puma or Puma+; either would still outstrip Jaguar, since Puma is a newer architecture with better performance per watt.

As for the HBM comment, HBM can be used in whatever quantities Nintendo needs, depending on the cost they've outlined or factory output. From a power consumption perspective, 1 watt can allow for 35GB/s of bandwidth. It's a flexible technology that would work for any platform holder's needs in their next platform, and very efficient indeed, about 3x better than GDDR5 currently.
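The pixel math above is easy to sanity-check with a quick sketch (these are just the resolutions and the 35GB/s-per-watt HBM figure from the post, not official specs for any hardware):

```python
# Back-of-envelope: render cost scales with pixel count, assuming identical
# effects, physics, etc. at both resolutions (the post's simplifying assumption).

def pixels(width, height):
    return width * height

console_px = pixels(1920, 1080)   # 1080p: 2,073,600 pixels
handheld_px = pixels(640, 480)    # 480p:    307,200 pixels

ratio = console_px / handheld_px
print(ratio)  # 6.75 -> 1080p is 6.75x more demanding, all else being equal

# HBM figure quoted in the post: ~35 GB/s of bandwidth per watt.
# e.g. a hypothetical 70 GB/s memory bus would cost about 2 W:
print(70.0 / 35.0)  # 2.0
```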
Personally I think NX has a great chance of doing well, particularly if it's designed to put Nintendo's games at the forefront and break down the barriers between the handheld and console audiences that Nintendo covers. If Nintendo focuses on making a platform that caters to the 3rd party development community, it could very well be Nintendo's chance to win back business in not only the console market but also the handheld market. Even the 3DS lacks major releases like The Witcher 3, the Batman games, etc. If the NX console can run good versions of those games and the handheld version can play them too, even at 480p with lower graphical detail, it could mean big business for both Nintendo and their 3rd party partners.
As far as your last comment goes, you're not looking at things the right way. From a technical perspective it's the added pixels that cause the major demand on hardware. Sure, extra gameplay features add to that demand, but they're not the biggest performance hog on hardware; rendering extra pixels is.
Is Carrizo really all that special? I think the chip Nintendo would want AMD to start with is not Carrizo, but the Wii U chip itself. That's a 40nm part; I'm sure today they could look at that chip and improve its efficiency even further, then make a new chip based on that design. Replace the IBM CPU, get it on a 14nm process, and swap the embedded RAM on the GPU for likely more power-efficient HBM2.
The Wii U chip gets about 11 GFLOPS/watt (if we believe the 350 GFLOPS number that's usually cited for it) at 40nm. If it were somehow possible to shrink that to 14nm, it would probably land in the range of 22-23 GFLOPS/watt, which is fairly comparable to Carrizo, no? Considering it's a 4-5 year old chip, that's not bad, and I'm sure they could improve its power efficiency even further today beyond just die shrinks.
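Rough sketch of that estimate. Note the assumptions: the 350 GFLOPS figure is unconfirmed, the ~32W GPU power draw is inferred so the post's ~11 GFLOPS/watt holds, and the roughly 2x perf/watt gain from a 40nm-to-14nm shrink is purely a guess:

```python
# Back-of-envelope Wii U GPU efficiency and a hypothetical 14nm shrink.
# All inputs are speculative numbers from the post, not confirmed specs.

wii_u_gflops = 350.0                        # commonly cited, unconfirmed
wii_u_gpu_watts = 32.0                      # assumed, so ~11 GFLOPS/W works out

eff_40nm = wii_u_gflops / wii_u_gpu_watts   # ~10.9 GFLOPS/watt at 40nm
shrink_gain = 2.05                          # assumed perf/watt gain, 40nm -> 14nm
eff_14nm = eff_40nm * shrink_gain           # ~22.4 GFLOPS/watt, Carrizo-ish

print(round(eff_40nm, 1), round(eff_14nm, 1))
```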
As for pixels, I think the portable could maybe use a 960x540 resolution for really demanding 3D games, which corresponds 1:4 in pixel count to 1920x1080 on the console. For lower-end games like Kirby and the Rainbow Curse, Yoshi's Woolly World, or Star Fox Zero, I think even the portable could run at 1280x720 though.
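Quick check on those resolution ratios (game titles aside, this is just pixel arithmetic):

```python
# Pixel-count ratios between the proposed portable resolutions and 1080p.

def px(w, h):
    return w * h

print(px(1920, 1080) / px(960, 540))   # 4.0  -> 540p is exactly 1/4 of 1080p
print(px(1920, 1080) / px(1280, 720))  # 2.25 -> 720p is 1/2.25 of 1080p
```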
35GB/s of bandwidth per watt sounds OK, but wouldn't that still be too power hungry for a portable? The design of the handheld is going to be the trick here, I think, because the console is relatively easy to figure out when you can just plug it into the wall.
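To put that figure into a handheld context, here's a quick sketch. The bandwidth targets below are made-up illustrative numbers, not anything Nintendo has announced; only the 35GB/s-per-watt rate comes from the earlier post:

```python
# If HBM delivers ~35 GB/s per watt (figure from the earlier post),
# how much of a handheld's power budget would memory eat?

GB_PER_S_PER_WATT = 35.0

def memory_watts(bandwidth_gb_s):
    return bandwidth_gb_s / GB_PER_S_PER_WATT

# Hypothetical bandwidth targets:
print(round(memory_watts(70.0), 2))  # 2.0  W for 70 GB/s -> a lot for a portable
print(round(memory_watts(25.0), 2))  # 0.71 W for a more modest 25 GB/s
```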
*If* they can get PS4-level engines (with some effects stripped down) to run on a portable at a reduced resolution, then yes, I agree, they would probably get a shit-ton of Japanese third party support at least, and probably an OK amount of Western support too. I wonder if that's doable though.