Soundwave said:
Problem remains this is a massive waste of SMs. You could get the same performance from 8 SMs by just clocking a bit higher, and have a cheaper chip. Why pay for 12 SMs? It's a way larger chip; it doesn't really make a whole lot of sense. This doesn't even align with things Nintendo has done in the past, if that's the argument. 210 MHz, lol, the Wii's 2001-era GPU has a higher clock than that. These are insanely low clocks for a more expensive and much larger chip, for no reason. This is like going out of your way to buy a jumbo popcorn at the theater, paying the $6 premium for it, and then eating 10% of it. You could have just bought a freaking regular popcorn and not paid the extra money. And if the argument for doing so is "well, I did that because I'm cheap"... it's like, what? lol. How does that make any sense.
Because you'd save power. Most hardware runs at just a fraction of its potential efficiency because it's pushed far, far above the optimal point on the voltage vs. frequency curve. Dynamic power scales roughly with voltage squared times frequency, and higher clocks demand higher voltage, so a wide chip at low clocks can match the throughput of a narrow chip at high clocks while drawing noticeably less power. Besides, again, the older node would be cheaper even with a larger chip, and the frequencies would be higher than that.
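To put rough numbers on that, here's a back-of-envelope sketch using the classic CMOS dynamic power model, P ∝ N·C·V²·f. The 315 MHz clock and both voltage figures are made-up illustrative inputs, not real clocks or leaked specs; the point is just that matching the same SM × MHz throughput with fewer, faster SMs forces a higher voltage point on the curve, and the V² term is what kills you.

```python
# Back-of-envelope comparison: wide-and-slow vs. narrow-and-fast.
# Model: dynamic power P ~ N_sm * C * V^2 * f, with switched capacitance C
# per SM treated as a constant, so results are in relative units only.
# Both voltages below are hypothetical illustrative values, not chip specs.

def dynamic_power(num_sms: int, voltage: float, freq_mhz: float) -> float:
    """Relative dynamic power, proportional to N * V^2 * f."""
    return num_sms * voltage**2 * freq_mhz

# Equal theoretical throughput: 12 SMs * 210 MHz == 8 SMs * 315 MHz (2520 SM*MHz),
# but the higher clock needs a higher (assumed) voltage on the V/f curve.
wide_slow   = dynamic_power(num_sms=12, voltage=0.60, freq_mhz=210)
narrow_fast = dynamic_power(num_sms=8,  voltage=0.72, freq_mhz=315)

print(f"12 SMs @ 210 MHz: {wide_slow:7.1f} (relative units)")
print(f" 8 SMs @ 315 MHz: {narrow_fast:7.1f} (relative units)")
print(f"narrow/fast draws {narrow_fast / wide_slow:.2f}x the power")
```

With those invented voltages, the 8 SM config burns about 1.44x the power for identical SM × MHz, which is the whole case for paying for extra silicon and then downclocking it.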
What, you think Nintendo would eat the cost of the die shrink just so the chip can run at a higher frequency? Just so they could have better graphics?
Reminder: this is the same company that released the Wii U in 2012 with a CPU based on a ~1997 architecture, and underclocked a 15 W Tegra by roughly 65%, leaving the undocked Switch about 20 times slower than a GTX 1060.
All of that being said... I do think the console can be 5/4 nm, as I said before, and I hope it is. It's just that a lot of people here are fuming and screaming at the mere thought of the older node, when there's definitely a universe where it could happen.