
Forums - Sony Discussion - [EurogamerDF] Orbis Unmasked: what to expect from the next-gen PlayStation

ninjablade said:
sunK1D said:
ninjablade said:
platformmaster918 said:
ninjablade said:
Honestly, it looks like the PS4 will be more powerful. Very disappointed in Microsoft; looks like I'm going with Sony next gen.

Don't judge on power alone. I mean, I'm certainly going with Sony next gen, but that's because they have the best first-party stable in the industry, imo. Power isn't everything, as evidenced by the DS, Wii, PS2, PS1, and 3DS.

Talking strictly about consoles, the PS1 and PS2 were beasts for their time; they were the most powerful consoles at launch. The PS2 blew the Dreamcast out of the water because it launched a year later. As for the Wii, sure it was a huge success, but Nintendo has basically ruined its name with gamers. Watch the Wii U not even sell 20 million this gen, and you can quote me on that.

Actually, from a technical perspective it goes PS2 < Dreamcast < GC < Xbox.

LOL, I owned a Dreamcast, was a Dreamcast fanboy, and all I have to say is you couldn't be further from the truth. MGS2 and GT3 looked a generation ahead of anything on Dreamcast; I thought it was CGI.

You don't have to be a fanboy to appreciate something.

Beauty is in the eye of the beholder. And I think Shenmue > GT 3 or any other PS2 game for that matter.

Shenmue http://youtu.be/IUvOoBBNg8Q  One of the most iconic cutscenes ever in gaming.

GT 3 http://youtu.be/6BS5OW7WnYk

Not to mention FSAA on the DC architecture!



sunK1D said:
Beauty is in the eye of the beholder. And I think Shenmue > GT 3 or any other PS2 game for that matter. Not to mention FSAA on the DC architecture!

Just read Beyond3D; the fact is the PS2 is more powerful than the Dreamcast. The Dreamcast couldn't even do the lighting in MGS2 or GT3, not to mention the PS2 could push 10x as many polygons as the Dreamcast and had more RAM as well.



Yup, the more I read about the 720 the more I wanna vomit: 2 cores and 3 gigs of RAM for the OS, Kinect, and crappy gimmicks. The PS4 is looking a good deal more powerful. Halo, you will be missed; Gears can be replaced with Uncharted and Forza with GT. Always preferred GT anyway.



ninjablade said:
Yup, the more I read about the 720 the more I wanna vomit: 2 cores and 3 gigs of RAM for the OS, Kinect, and crappy gimmicks. The PS4 is looking a good deal more powerful. Halo, you will be missed; Gears can be replaced with Uncharted and Forza with GT. Always preferred GT anyway.

Don't even need to post another picture since you treat a guess as fact.

Not only the RAM type, they're even guessing the amount wrong, lol.



ninjablade said:
Yup, the more I read about the 720 the more I wanna vomit: 2 cores and 3 gigs of RAM for the OS, Kinect, and crappy gimmicks. The PS4 is looking a good deal more powerful. Halo, you will be missed; Gears can be replaced with Uncharted and Forza with GT. Always preferred GT anyway.


Do you not think it's a little bit stupid to put so much faith in rumours?



slowmo said:

Do you not think it's a little bit stupid to put so much faith in rumours?


Considering a Ubisoft developer at NeoGAF has confirmed these specs are very close to the real thing, and Digital Foundry has commented on the PS4 probably being more powerful, I'm thinking there's a 90% chance the PS4 is more powerful, and 70% that it's a good deal more powerful. Anything could happen, but it's not looking good at the moment, and from the rumours it seems Microsoft is more set on a media box and Kinect than on gaming. That's why they chose slow RAM, which is not great for gaming but great for the OS and Kinect.



disolitude said:

This is a very informative post covering the cost of wafer use and chip manufacturing.

However, your pricing doesn't take into account the real cost of a GPU, CPU, or any chip: its development. Skilled staff, research and development, RTL hardware guys, testing, drivers, SDKs/APIs, marketing, license fees, patent disputes, equipment, etc. The more complex the chip, the higher the R&D costs. A 40nm chip is used for an AMD 6450 that retails for 39 bucks, while the 40nm 6970 retailed for $399. According to your logic both are 40nm, so they must cost the same, right? Even with different yields, where does the 10x markup come from? R&D for the 6970 was much higher than for the 6450, hence the higher cost to OEMs and consumers.

Your post seems to assume that just because AMD can push GPUs at $24 per chip at a 70% yield rate, that is the cost for AMD to sell that GPU to Sony (+$20 for manufacturing, apparently). The very first 7900M GPU cost AMD hundreds of millions of dollars to develop, and they have to recoup that cost somehow and hopefully make some money on it.

Now, I know it's not $300 per unit for Sony to use this GPU, but my post said "cost of parts at retail"... This is the only information we can use unless someone has the inside scoop at Sony and AMD. Everything else at this point is pure speculation.

I think you are confusing the costs... we are talking about how much money Sony needs to spend to manufacture each PS4 unit... the R&D costs are covered over the lifetime of the console by the profit on hardware and software.

So if Sony can manufacture a PS4 for $300 and sell it at $350, then they make a profit of $50... if Sony manufactures the PS4 for $400 and sells it at $350, then Sony is losing money on each PS4 sold... that's it.

The cost of manufacturing a console is the cost I tried to explain to you using the CPU/GPU/APU chip... of course there are other parts, like the HDD, BD drive, motherboard, etc... but a GPU chip doesn't cost $200 to manufacture; it's easily one of the cheaper parts of a console... the PS3 cost $600 because the BD drive alone cost $200... the Cell was expensive compared to other CPU/GPU chips, but it's not what made the PS3 cost $600, because without the BD drive Sony could have sold the PS3 for $450 and lost little money on it.

That's the point... Sony needs to cover the whole manufacturing cost of the PS4 with sales to retail, so that what happened with the PS3 (a loss on each console) doesn't happen again... the other production costs like R&D, marketing, etc. can be covered by software sales.

That's it.
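The per-unit margin argument above is simple arithmetic; here it is as a toy Python sketch, using the hypothetical $300/$350 figures from the post (not real Sony numbers):

```python
# Toy per-unit margin model using the example figures from the post
# (hypothetical numbers, not actual Sony costs).

def unit_margin(manufacturing_cost, retail_price):
    """Profit (or loss, if negative) on each console sold at retail price."""
    return retail_price - manufacturing_cost

# Costs $300 to build, sells for $350: $50 profit per unit.
print(unit_margin(300, 350))   # 50

# Costs $400 to build, still sells for $350: $50 loss per unit.
print(unit_margin(400, 350))   # -50
```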



ethomaz said:



I fully understand your points but I disagree with this logic.

Let's take the PS3's launch, for example. The initial estimated cost of the PS3 was $840, per iSuppli.

http://www.edge-online.com/news/isuppli-60gb-ps3-costs-840-produce/

"Some of the more expensive PS3 components that were charted by iSuppli include Nvidia’s $129 RSX graphics chip, the $125 blu-ray disc drive and the $89 IBM Cell processor."

So the RSX, which was based on older 90nm tech and offered middle-of-the-road performance at the time, cost Sony $129. The die size of the RSX is just under 200mm², so let's say 300 chips per wafer, like you stated before.

Because this isn't a cutting-edge chip, let's say the yields were 90%: $5000 / 270 chips = ~$19 per chip.

So iSuppli states that the RSX cost Sony $129, yet the chip costs $19 each to manufacture.

How so?
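The per-chip arithmetic in this post can be written out explicitly. A sketch using only the numbers assumed above ($5000 wafer, 300 gross dies, 90% yield — all the post's assumptions, not verified figures):

```python
# Cost per good chip, given wafer cost, gross dies per wafer, and yield.
# All inputs below are the assumptions from the post, not verified figures.

def cost_per_good_chip(wafer_cost, gross_dies, yield_rate):
    good_dies = gross_dies * yield_rate
    return wafer_cost / good_dies

# $5000 wafer, 300 gross chips, 90% yield -> $5000 / 270 = ~$18.52 per chip.
print(round(cost_per_good_chip(5000, 300, 0.90), 2))   # 18.52
```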



disolitude said:


lol

1. I'm pretty sure the RSX at 90nm was made on 200mm wafers... only the 65nm shrink moved to 300mm wafers... a 300mm wafer has roughly twice the area of a 200mm one... so you can cut about 150 chips from it instead of 300...

2. I'm also pretty sure the percentage of good RSX chips was below 90%... fewer than 130 good chips per wafer.

3. Today's 300mm wafer at 28nm costs $5000... a 200mm wafer at 90nm cost $8000-10000 in 2006.

So... $10000 / 130 = $77, but that's not what Sony pays... Sony needs to pay both NVIDIA and TSMC to manufacture the chips... in 2006 the RSX was a chip costing near $100 to manufacture... even more if the share of good chips was low.

A 70% or 80% rate of good chips can get you to the $129 estimated by the site.
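This counter-scenario uses the same cost-per-chip arithmetic with different assumed inputs (a $10000 wafer, ~150 gross dies from a 200mm wafer, sub-90% yield — again, the post's guesses, not confirmed figures):

```python
# Same cost-per-good-chip arithmetic, with the alternative assumed inputs:
# a $10000 wafer, ~150 gross dies (200mm wafer), and a sub-90% yield.

def cost_per_good_chip(wafer_cost, gross_dies, yield_rate):
    return wafer_cost / (gross_dies * yield_rate)

# ~87% yield -> ~130 good chips -> about $77 per chip, before any margins.
print(round(cost_per_good_chip(10000, 150, 0.87)))   # 77

# At 70% yield the raw manufacturing cost alone approaches $95.
print(round(cost_per_good_chip(10000, 150, 0.70)))   # 95
```

The gap between ~$19 and ~$95 per chip shows how much the estimate swings on the wafer size, wafer price, and yield assumptions the two posters disagree about.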



ethomaz said:



1. That's just not true. 300mm wafers have been in use since the early 2000s, even for 130nm nodes, let alone 90nm. All one has to do is google "300 mm wafer 90 nm". There is no way anyone as big as Nvidia would have used a 200mm wafer in 2006, due to cost.

Here is a nice PDF about the switch from 200mm to 300mm wafer technology, back from 2003.

http://www.google.ca/url?sa=t&rct=j&q=&esrc=s&frm=1&source=web&cd=4&cad=rja&ved=0CFsQFjAD&url=http%3A%2F%2Fwww.nanobuildings.com%2Fbat%2Fpresentations%2Fdownloads%2FMJamison_Presentation.pdf&ei=MC_7UJGiHoaFrAHjg4Bg&usg=AFQjCNHjo_VGZy63bXSRNf94_U8-9K5h_w&sig2=daGiC7s0TT9EEAoyE5_OoA

2. Why would the RSX, which was not a very good chip in 2006, have lower yields than the 7970M, the best mobile chip money can buy in 2012? The RSX is not the Cell or an Nvidia GeForce 200 series processor, which were cutting edge. It's a fairly basic GPU... even if the yield wasn't 90%, it had to be high.

3. The wafer cost figures you're quoting... let's see some links. Searching on Google, I see 300mm wafer prices quoted anywhere from $2000 to $10000.

The bottom line is that the manufacturing cost math doesn't add up if you admit that a 300mm wafer was used, which it was. All this "I'm pretty sure" stuff is just a way to spin the argument in your favor.

A second bottom line is that there is no way the RSX (a mediocre chip in 2006) cost $129 while the 7970M (an amazing chip in 2013) costs less. If it did, everyone would be making powerful consoles, including Nintendo.
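The 200mm-vs-300mm disagreement in this exchange comes down to simple geometry. A rough dies-per-wafer estimate (ignoring edge loss and scribe lines, using the ~200mm² RSX die size cited earlier in the thread) looks like this:

```python
import math

# Rough gross dies per wafer: wafer area divided by die area.
# Ignores edge losses and scribe lines, so real counts are somewhat lower.

def gross_dies(wafer_diameter_mm, die_area_mm2):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

# ~200 mm^2 die (the RSX size cited in the thread):
print(gross_dies(200, 200))   # 157 gross dies on a 200mm wafer
print(gross_dies(300, 200))   # 353 gross dies on a 300mm wafer
```

A 300mm wafer has 2.25x the area of a 200mm one, so the wafer-size assumption alone swings the per-chip cost estimate by more than a factor of two, before yield or wafer price even enter the picture.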