Bofferbrauer2 said:
Games like Xenoblade Chronicles X were thought to be impossible on the Wii U's hardware, both in size and in graphical capability. Same with the physics in BotW. Nintendo isn't the only one who can manage these things, but it's most visible with them due to their hardware's lower power compared to their competitors'. Many late PS360 games would have been thought impossible before due to the small RAM, for instance. It's part of squeezing every last bit of power out of a hardware set.

Whether a game is thought to be impossible on the given hardware is ultimately irrelevant.
If it is running on the hardware, it's not impossible.

Nor does it mean the game in question is a graphical powerhouse; the Wii U was lacking in a multitude of areas.

Bofferbrauer2 said:
Wouldn't help at all. The approach was good - for the server chips! An 8-core Ryzen actually isn't much smaller than a native Intel 8-core chip, and the difference in size mainly comes from the Intel 8-cores having much more L3 cache (the Core i7-5960X and i7-6900K have 20MB, the 7820X clocks in at 11MB, compared to Ryzen's 8MB per CCX), since they derive from server dies. In servers, where Intel comes with a 28-core behemoth, having 4x8 cores instead is a massive manufacturing advantage, but at the size of desktop/console chips it makes no difference at all. I mean, AMD could theoretically come with 4 dual-cores, each with 512kB of L2 and 2MB of L3 cache, but I very much doubt they will, because it wouldn't help production at all: what is gained with such small chips is lost again to the redundancies that need to be built into each chip (which makes 4x2 cores bigger than a native 8-core).

You are mistaken; it would help.
You need to remember that every wafer has a certain density of defects: the more of the wafer's area a single chip takes up, the greater the chance that chip contains a fault, which decreases yields and thus increases costs.
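As a rough back-of-the-envelope, here's a toy Poisson yield model (Y = e^(-A*D0)); the defect density below is a made-up illustrative figure, not any fab's real number:

```python
import math

def poisson_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Expected fraction of defect-free dies: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * defect_density_per_cm2)

D0 = 0.2  # hypothetical defect density in defects/cm^2

# Small chiplet vs mid-size die vs big monolithic SoC (areas in mm^2)
for area in (80, 180, 350):
    print(f"{area:>3} mm^2 die -> {poisson_yield(area, D0):.1%} defect-free")
# ~85%, ~70%, ~50% respectively:
# yield falls off exponentially with area, not linearly.
```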

Now, companies tend to get around this to a degree by building chips with more hardware than they strictly need. For example, the PlayStation 4's GPU physically contains 20 CUs, but Sony only enables 18 of them; a die with a defect in a CU can still be sold, which increases yields.
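To put toy numbers on that redundancy trick, assume each CU independently has some small chance of containing a defect (the probability below is invented for illustration) and the chip is sellable as long as 18 CUs are good:

```python
from math import comb

def yield_with_spares(total_units: int, needed_units: int, p_defect: float) -> float:
    """Chance that at least `needed_units` of `total_units` identical blocks
    are defect-free, treating per-block defects as independent (binomial)."""
    max_bad = total_units - needed_units
    return sum(
        comb(total_units, k) * p_defect**k * (1 - p_defect)**(total_units - k)
        for k in range(max_bad + 1)
    )

p = 0.05  # hypothetical per-CU defect probability
print(f"Need all 18 of 18 CUs:       {yield_with_spares(18, 18, p):.1%}")  # ~39.7%
print(f"Need 18 of 20 CUs (2 spare): {yield_with_spares(20, 18, p):.1%}")  # ~92.5%
```

Two spare CUs more than double the number of sellable chips in this toy model.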

Now, the reason AMD didn't take a similar approach for chips with ~8 cores and under is simple: the dies were already relatively small and had good yields, so it was mostly unnecessary. Plus, they were already die-harvesting defective parts to sell as chips with smaller core counts.

But that can only take you so far.

If you were to build a monolithic next-gen SoC with an 8-core Ryzen complex, a beefier memory controller, and a big array of CUs for the GPU, you would start running into the same issue at Threadripper-like scale: the die would be stupidly massive, you would only get a few chips per wafer, and costs would skyrocket.
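A crude sketch of the cost math, using a standard gross-dies-per-wafer approximation; the wafer cost, defect density, and die areas are all hypothetical placeholders, not figures for any real product:

```python
import math

WAFER_COST = 9000.0     # hypothetical price of a 300 mm wafer (USD)
WAFER_DIAMETER = 300.0  # mm
D0 = 0.2                # hypothetical defects/cm^2

def dies_per_wafer(die_area_mm2: float) -> int:
    """Gross dies per wafer, with a simple edge-loss correction term."""
    wafer_area = math.pi * (WAFER_DIAMETER / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(die_area_mm2: float) -> float:
    good_fraction = math.exp(-(die_area_mm2 / 100.0) * D0)  # Poisson yield again
    return WAFER_COST / (dies_per_wafer(die_area_mm2) * good_fraction)

print(f"Monolithic 360 mm^2 SoC: ${cost_per_good_die(360):.0f} per good die")
print(f"4 x 90 mm^2 chiplets:    ${4 * cost_per_good_die(90):.0f} of good silicon")
# Roughly $115 vs ~$60 here; packaging costs and the duplicated uncore
# (the redundancy overhead from the quote above) eat into that gap,
# but they scale far more gently than monolithic yield losses do.
```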

Bofferbrauer2 said:

Fabs switched because DRAM was too cheap for too long, which drove the manufacturers to either change production or face bankruptcy. Some were big enough to do both, and those were the ones who stayed. At the time they made the switch, NAND flash was still very new and expensive, so for those who couldn't live off their DRAM anymore it was a pretty easy way out, as they didn't need to change much in terms of machinery. When the market then exploded, there were simply not enough manufacturers left in that domain, and having the biggest DRAM fab in Southeast Asia totally flooded (as in submerged under several meters of rainwater) just when demand started to rise again didn't help matters, either. There is no artificial demand inflation, just the fact that demand was much lower a couple of years ago - too low to feed all the producers at the time.

If NAND really does become oversupplied, then we will have a reversal of the pre-2015 situation, when DRAM was oversupplied and NAND in short supply. This might incite the same migration as back then, just from NAND to DRAM this time around.

I have to disagree.
Case in point: the DRAM price-fixing debacle.
https://en.wikipedia.org/wiki/DRAM_price_fixing

Right now manufacturers are playing the supply game, switching between NAND and DRAM production to maximize profits.
They also try to ramp up production of a particular technology when something like the next iPhone or Samsung flagship drops, which has a flow-on effect on other markets.




www.youtube.com/@Pemalite