
Forums - Sony Discussion - Can Sony afford to make PS5 more expensive than PS4 launch price?

 

How much should it cost?

$400 - 45 votes (52.94%)
$450 - 15 votes (17.65%)
$500 - 21 votes (24.71%)
$550 - 0 votes (0%)
$600 - 0 votes (0%)
$600+ - 4 votes (4.71%)

Total: 85
drkohler said:
twintail said:

While I think the PS5 will obviously be PS4 BC

That is in no way obvious. Adding a Zen-like core with SMT will already break many current PS4 games. Leave SMT out and you are on safer ground with BC.

On another note, I see a lot of posts (on many forums) that want a $1000 GPU, a $400 CPU, $250 of RAM and $400 of SSDs in a PS5, "but a $499 price would be too high". I wish these nuts would get some education before posting such nonsense.

I see a stripped-down custom Ryzen 2600 with a 9-10TF GPU in a SoC and 16GB of GDDR6 as a reasonable expectation by late 2020 for a new console at an affordable price point.

I think you are exaggerating... or you are out of touch with what's really possible at $399/$499 in 2020.

The best advice I can give is: don't look at PC prices, and don't ignore the gains that come with smaller fabrication processes.



TheBraveGallade said:
Just to break the mold, I wonder what the switch's influence in all this would be.

Absolutely nothing.



The public does not buy "powerful" consoles, it buys "cheap" consoles



Sony will be able to get away with a more expensive PS5 because people view PlayStation as much more than a games machine now. Most people on my friends list are watching Netflix, Hulu, Amazon Prime Video, PlayStation Vue, PlayStation Video, WWE Network, and other services more than gaming.

They will keep the focus on it being a games-first platform, but the necessity features are what will keep it selling to the masses. As this gen continues they will keep expanding on those features, and the PS5 will very likely bring a significant leap in accessibility and convenience, adding even more of them.



Stop hate, let others live the life they were given. Everyone has their problems, and no one should have to feel ashamed for the way they were born. Be proud of who you are, encourage others to be proud of themselves. Learn, research, absorb everything around you. Nothing is meaningless, a purpose is placed on everything no matter how you perceive it. Discover how to love, and share that love with everything that you encounter. Help make existence a beautiful thing.

Kevyn B Grams
10/03/2010 

KBG29 on PSN&XBL

Pemalite said:

 

Bofferbrauer2 said:
Basically everything. Just check what Nintendo manages to get out of their hardware even though it has been massively lagging behind in raw power since the Wii. This doesn't only apply to graphics directly, but also to other things like texture compression and anything that relies on recurring algorithms.

Nintendo's games aren't graphical powerhouses, they haven't been for generations.
But they do have some of the best artists in the industry, that is undeniable.

Games like Xenoblade Chronicles X were thought to be impossible on the Wii U's hardware, both in size and in graphical capabilities. Same with the physics in BotW. Nintendo isn't the only one who can manage these things, but it's most visible with them because of their lower power compared to their competitors. Many late PS360 games would have been thought impossible before due to the small RAM, for instance. It's part of squeezing every inch of power out of a hardware set.

Bofferbrauer2 said:

Less than that and it won't have enough to differentiate itself from the PS4 Pro or the XOX. Even at 7nm that will need a pretty big chip, and the bigger the chip, the more expensive it gets - nearly exponentially so, in fact.

Indeed. AMD also recognized this, so you know what they did? They used a fabric to stitch together smaller chips that have orders-of-magnitude better yields.
And thus costs were lowered substantially: the more working chips you get out of a wafer, the better.

I wouldn't be surprised if next-gen took a similar multi-chip approach.

Wouldn't help at all. The approach was good - for server chips! An 8-core Ryzen actually isn't much smaller than a native Intel 8-core chip, and the difference in size mainly comes from the Intel 8-cores having much more L3 cache (the Core i7 5960X/i7 6900K have 20MB and the 7820X clocks in at 11MB, compared to Ryzen's 8MB) since they derive from server dies. In servers, where Intel fields a 28-core behemoth, having 4x8 cores instead is a massive manufacturing advantage, but at desktop/console chip sizes it makes no difference at all. AMD could theoretically ship 4 dual-cores, each with 512kB L2 and 2MB L3 cache, but I very much doubt they will, because it wouldn't help production at all: whatever is gained with such small chips is lost again to the redundancies that have to be built into each one (which make 4x2 cores bigger than a native 8-core).
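The redundancy-overhead point above can be put into rough numbers. A minimal sketch, with made-up areas (none of these figures are real die measurements): every chiplet repeats some fixed uncore overhead (fabric links, I/O, test logic) that a monolithic die pays for only once, so at small core counts the split design spends more total silicon.

```python
# Illustrative arithmetic for the argument above. All areas are
# made-up placeholders, not real die measurements.

CORE_AREA_MM2 = 5.0          # one core plus its share of cache (assumed)
MONO_OVERHEAD_MM2 = 30.0     # shared uncore, paid once on a monolithic die
CHIPLET_OVERHEAD_MM2 = 12.0  # uncore duplicated on every chiplet (assumed)

def monolithic_area(cores):
    """Total silicon for one monolithic die with the given core count."""
    return cores * CORE_AREA_MM2 + MONO_OVERHEAD_MM2

def chiplet_area(cores, cores_per_chiplet):
    """Total silicon across all chiplets, each carrying its own overhead."""
    n_chiplets = cores // cores_per_chiplet
    return n_chiplets * (cores_per_chiplet * CORE_AREA_MM2 + CHIPLET_OVERHEAD_MM2)

print(monolithic_area(8))   # 70.0 mm^2 on one die
print(chiplet_area(8, 2))   # 88.0 mm^2 in total across four dies
```

With these illustrative numbers, the 4x2 configuration ends up costing more silicon than the native 8-core, which is the trade-off being described.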

Bofferbrauer2 said:

DRAM is not artificially inflated, the prices are high because there's simply not enough production capacity worldwide for those chips. Hopefully this will change next year when some fabs come online, but it might still not be enough if demand keeps growing like it does right now.

Oh. They are artificially inflated. Fabs switched production from DRAM to NAND.

We may actually have an oversupply of NAND soon, which means more fabs will switch back to fabricating DRAM.

https://epsnews.com/2017/11/07/samsung-end-tight-dram-supply-earlier-expected/
https://epsnews.com/2017/09/22/expect-tight-dram-supply-2018-possible-oversupply-nand-flash/

Fabs switched because DRAM was too cheap for too long, which drove manufacturers to either change production or face bankruptcy. Some were big enough to do both, and those were the ones who stayed. At the time of the switch, NAND flash was still very new and expensive, so for those who couldn't live off their DRAM anymore it was a pretty easy way out, as they didn't need to change much in terms of machinery. When the market then exploded, there were simply not enough manufacturers left in that domain, and having the biggest DRAM fab in south-east Asia totally flooded (as in submerged under several meters of rainwater) just as demand started to rise again didn't help matters either. There is no artificial price inflation, just the fact that demand was much lower a couple of years ago - too low to feed all the producers at the time.

If NAND really does get oversupplied, then we will have a reversal of the pre-2015 situation, when DRAM was oversupplied and NAND in short supply. This might incite the same movement as back then, just from NAND to DRAM this time around.




$400 at launch and get as many people to stay on the PS+ revenue stream.

This generation has been so much better for Sony that I struggle to see them mixing it up in terms of pricing. They took the market for granted once and it nearly cost them everything. They should remember how they grew, dominated, fell, and were reborn.



Bofferbrauer2 said:
Games like Xenoblade Chronicles X were thought to be impossible on the Wii U's hardware, both in size and in graphical capabilities. Same with the physics in BotW. Nintendo isn't the only one who can manage these things, but it's most visible with them because of their lower power compared to their competitors. Many late PS360 games would have been thought impossible before due to the small RAM, for instance. It's part of squeezing every inch of power out of a hardware set.

Whether a game is thought to be impossible or not for the given hardware is ultimately irrelevant.
If it is running on the hardware, it's not impossible.

Nor does it mean the game in question is a graphical powerhouse; the Wii U was lacking in a multitude of areas.

Bofferbrauer2 said:
Wouldn't help at all. The approach was good - for server chips! An 8-core Ryzen actually isn't much smaller than a native Intel 8-core chip, and the difference in size mainly comes from the Intel 8-cores having much more L3 cache (the Core i7 5960X/i7 6900K have 20MB and the 7820X clocks in at 11MB, compared to Ryzen's 8MB) since they derive from server dies. In servers, where Intel fields a 28-core behemoth, having 4x8 cores instead is a massive manufacturing advantage, but at desktop/console chip sizes it makes no difference at all. AMD could theoretically ship 4 dual-cores, each with 512kB L2 and 2MB L3 cache, but I very much doubt they will, because it wouldn't help production at all: whatever is gained with such small chips is lost again to the redundancies that have to be built into each one (which make 4x2 cores bigger than a native 8-core).

You are mistaken. It would help.
You need to remember that wafers have a percentage of defects; the larger the share of the wafer a chip takes up, the greater the chance it contains a fault, which decreases yields and thus increases costs.

Now, companies tend to get around this to a degree by building chips larger than they need to be. For example, the PlayStation 4's chip actually has more CUs than it uses, but because such a high number of chips had defects, Sony disabled a chunk of the chip to increase yields.

Now, the reason AMD didn't take a similar approach for chips with roughly 8 cores and under is simple: the chips were already relatively small and had good yields, so it was mostly unnecessary. Plus, they were die-harvesting those parts for chips with smaller core counts.

But that can only take you so far.

If you were to build a monolithic next-gen SoC with an 8-core Ryzen complex, a beefier memory controller, and a big array of CUs for the GPU, you would run into the same issue as Threadripper: the chip would be stupidly massive, you would only get a few chips per wafer, and costs would skyrocket.
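The defect-rate argument above can be sketched with a standard die-yield formula. This uses the negative-binomial yield model, yield = (1 + A*D0/alpha)^(-alpha); the defect density D0, clustering factor alpha, and both die areas are illustrative assumptions, not real fab data.

```python
# Sketch of the yield argument: a big monolithic die is far more likely
# to contain a defect than a small chiplet. Negative-binomial yield
# model with assumed (not real) parameters.

D0 = 0.5     # defects per cm^2 (assumed)
ALPHA = 2.0  # defect clustering factor (assumed)

def die_yield(area_cm2, d0=D0, alpha=ALPHA):
    """Expected fraction of dies of the given area that are defect-free."""
    return (1.0 + area_cm2 * d0 / alpha) ** (-alpha)

# One 400 mm^2 monolithic SoC vs. the same logic split into 100 mm^2
# chiplets that can be tested and binned individually before packaging.
mono_yield = die_yield(4.0)     # 0.25 -> 25% of the big dies are usable
chiplet_yield = die_yield(1.0)  # 0.64 -> 64% of the small dies are usable

print(f"monolithic: {mono_yield:.0%} of dies usable")
print(f"chiplet:    {chiplet_yield:.0%} of dies usable")
```

Because chiplets are tested individually, a defect costs only one small die rather than the whole SoC, which is why splitting a very large design can lower costs even though the packaging adds its own expense.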

Bofferbrauer2 said:

Fabs switched because DRAM was too cheap for too long, which drove manufacturers to either change production or face bankruptcy. Some were big enough to do both, and those were the ones who stayed. At the time of the switch, NAND flash was still very new and expensive, so for those who couldn't live off their DRAM anymore it was a pretty easy way out, as they didn't need to change much in terms of machinery. When the market then exploded, there were simply not enough manufacturers left in that domain, and having the biggest DRAM fab in south-east Asia totally flooded (as in submerged under several meters of rainwater) just as demand started to rise again didn't help matters either. There is no artificial price inflation, just the fact that demand was much lower a couple of years ago - too low to feed all the producers at the time.

If NAND really does get oversupplied, then we will have a reversal of the pre-2015 situation, when DRAM was oversupplied and NAND in short supply. This might incite the same movement as back then, just from NAND to DRAM this time around.

I have to disagree.
Hence the DRAM price fixing debacle.
https://en.wikipedia.org/wiki/DRAM_price_fixing

Right now manufacturers are playing the supply game, switching between NAND and DRAM to maximize profits.
They also ramp up production of a certain technology when something like the next iPhone or Samsung flagship drops, which has a flow-on effect on other markets.



--::{PC Gaming Master Race}::--

Nymeria said:
$400 at launch and get as many people to stay on the PS+ revenue stream.

This generation has been so much better for Sony that I struggle to see them mixing it up in terms of pricing. They took the market for granted once and it nearly cost them everything. They should remember how they grew, dominated, fell, and were reborn.

I think people aren't realizing what MS releasing the XB1X at $499 has done for Sony. That literally means they can launch at $499 and drop the price to $399 a year later.

I'm sure they can sell to at least 15M customers in one year at a $499 price point, especially if the hardware is powerful enough to justify such a price. All this also rides on what MS prices their console at, though: if MS says $499, Sony will match it; if they say $399, Sony will match that too.



Pemalite said: 

Bofferbrauer2 said:
Wouldn't help at all. The approach was good - for server chips! An 8-core Ryzen actually isn't much smaller than a native Intel 8-core chip, and the difference in size mainly comes from the Intel 8-cores having much more L3 cache (the Core i7 5960X/i7 6900K have 20MB and the 7820X clocks in at 11MB, compared to Ryzen's 8MB) since they derive from server dies. In servers, where Intel fields a 28-core behemoth, having 4x8 cores instead is a massive manufacturing advantage, but at desktop/console chip sizes it makes no difference at all. AMD could theoretically ship 4 dual-cores, each with 512kB L2 and 2MB L3 cache, but I very much doubt they will, because it wouldn't help production at all: whatever is gained with such small chips is lost again to the redundancies that have to be built into each one (which make 4x2 cores bigger than a native 8-core).

You are mistaken. It would help.
You need to remember that wafers have a percentage of defects; the larger the share of the wafer a chip takes up, the greater the chance it contains a fault, which decreases yields and thus increases costs.

Now, companies tend to get around this to a degree by building chips larger than they need to be. For example, the PlayStation 4's chip actually has more CUs than it uses, but because such a high number of chips had defects, Sony disabled a chunk of the chip to increase yields.

Now, the reason AMD didn't take a similar approach for chips with roughly 8 cores and under is simple: the chips were already relatively small and had good yields, so it was mostly unnecessary. Plus, they were die-harvesting those parts for chips with smaller core counts.

But that can only take you so far.

If you were to build a monolithic next-gen SoC with an 8-core Ryzen complex, a beefier memory controller, and a big array of CUs for the GPU, you would run into the same issue as Threadripper: the chip would be stupidly massive, you would only get a few chips per wafer, and costs would skyrocket.

Don't worry, I took that into account. That's why I said the advantages are eaten up by the disadvantages: at such a small scale (in square mm), there's not much to gain by splitting the chip up. The only thing I could see coming (and actually expect) is that it won't be an APU - not because of size, but for cooling reasons.

The Jaguar cores don't consume much power, but Ryzen, as efficient as it is, will consume more than the Jaguar cores unless the clock speed is very low. Add to this that the GPU part will probably also consume around 150-200W (more than that is very hard to cool in a small case like a console, and prone to breakdowns - RRoD, anyone?) and you get a monstrous APU that would be very hard to keep cool, especially when the weather is hot.

Bofferbrauer2 said:

Fabs switched because DRAM was too cheap for too long, which drove manufacturers to either change production or face bankruptcy. Some were big enough to do both, and those were the ones who stayed. At the time of the switch, NAND flash was still very new and expensive, so for those who couldn't live off their DRAM anymore it was a pretty easy way out, as they didn't need to change much in terms of machinery. When the market then exploded, there were simply not enough manufacturers left in that domain, and having the biggest DRAM fab in south-east Asia totally flooded (as in submerged under several meters of rainwater) just as demand started to rise again didn't help matters either. There is no artificial price inflation, just the fact that demand was much lower a couple of years ago - too low to feed all the producers at the time.

If NAND really does get oversupplied, then we will have a reversal of the pre-2015 situation, when DRAM was oversupplied and NAND in short supply. This might incite the same movement as back then, just from NAND to DRAM this time around.

I have to disagree.
Hence the DRAM price fixing debacle.
https://en.wikipedia.org/wiki/DRAM_price_fixing

Right now manufacturers are playing the supply game, switching between NAND and DRAM to maximize profits.
They also ramp up production of a certain technology when something like the next iPhone or Samsung flagship drops, which has a flow-on effect on other markets.

You are aware that that happened over 15 years ago?

When prices were so low in the early 2010s, it was almost impossible for those companies to survive; the market was oversaturated. Let's look at the manufacturers involved in the DRAM price fixing and what they are doing now, shall we? Infineon? Left the market entirely and nowadays mainly produces microcontrollers and power-controller chips. Elpida? Went bankrupt in 2012 because they were solely producing DRAM chips, which had almost no margins back then, and was acquired by Micron, the next one on our list. Micron makes DRAM, NAND and NOR flash memory; this broad portfolio is why they survived. Most of their production is still DRAM, and their most modern fabs exclusively produce DRAM. Hynix? Got auctioned for $3 billion in 2010 and was saved when SK Group bought a large part of the company, hence why it's called SK Hynix now. The other ones who still do DRAM today are Samsung and Toshiba, two big corporations, and SanDisk, who survived mainly because of the HDD market.

tl;dr: DRAM alone wasn't enough to survive the 2009-2015 era, when prices were so low that there was hardly any margin left.

However, checking the semiconductor fab lists of some companies, it doesn't seem like production is going to be expanded: SK Hynix is building 3 fabs, but all for NAND flash. TSMC is building 3 new fabs, but I have no idea what will be produced there in the end; it could be anything with them. Samsung at least is building a fab for both DRAM and V-NAND. So it's quite possible they are not willing to let prices drop again.



Intrinsic said:
Nymeria said:
$400 at launch and get as many people to stay on the PS+ revenue stream.

This generation has been so much better for Sony that I struggle to see them mixing it up in terms of pricing. They took the market for granted once and it nearly cost them everything. They should remember how they grew, dominated, fell, and were reborn.

I think people aren't realizing what MS releasing the XB1X at $499 has done for Sony. That literally means they can launch at $499 and drop the price to $399 a year later.

I'm sure they can sell to at least 15M customers in one year at a $499 price point, especially if the hardware is powerful enough to justify such a price. All this also rides on what MS prices their console at, though: if MS says $499, Sony will match it; if they say $399, Sony will match that too.

I'm not sure Microsoft will factor into Sony's thinking beyond not wanting to launch a year behind. With the Xbox One X out, I think it is safe to say Microsoft won't beat the PlayStation 5 to market. Right now I expect a 2020 release for the PS5 and the X1 successor. I think Sony will set the price at $400 and then it will be on Microsoft to match or not. I could actually see Microsoft waiting until 2021 and releasing a $500 system, marketing it as the more powerful option.