
[EurogamerDF] Orbis Unmasked: what to expect from the next-gen PlayStation

bananaking21 said:
BenVTrigger said:
bananaking21 said:
so I'm too tired to read the article, but if they are right, how powerful are these things going to be?


Could easily play Battlefield 3 and Crysis 2 on max settings in 1080p

i hope this is true!! this is how i feel!

 


Battlefield 3 and Crysis 2 are already old.
This year it's about Crysis 3 maxed out on PC (even if it takes a patch to max it out, like it did with Crysis 2 -.-) and Battlefield 4, which will use an updated Frostbite 2 engine.
It wouldn't be impressive if NEXT GEN consoles could only run three-year-old games on max settings, lol.


AMD is probably the cheapest option for consoles; otherwise we would be facing another $600 launch, where most kids would cry because they don't have the money.
At the same time, it does mean the hardware is already a lot more dated than the PS3/360 was in its day, i.e. both consoles will have much less room to overcome their limitations the way the PS3 did. This time we'll get a "360" situation on both sides: the 360 has been maxed out for years now while the PS3 is still being pushed, and we won't see anything like that again.
But maybe (hopefully) streaming becomes a good option for consoles, so it wouldn't matter.
PC graphics for everyone! ...You should write letters to Obama so you get internet as good as Europe's: 100 MB/s for cheap, everywhere. Google Fiber should become the standard worldwide. Even Europe is hungry for it, but we don't need it as badly as the USA, because here we already get decent speeds for cheap with no caps at all.

Soon the PC gets Nvidia's 700 series, i.e. new powerhouses that can't really be put to use at all.
There probably won't be a single PC game that maxes out the GTX 680 in the next three years, because consoles (even the next-gen ones) will still hold everything back for multiplatform games.



D-Joe said:
Nsanity said:
D-Joe said:
Nsanity said:
D-Joe said:
ethomaz said:
D-Joe said:

Funny how the media keep guessing wrong about the 720's RAM type.

Why?

Both will have GDDR5

Where did you hear that?

AMD China

They actually said that?

The GPU designer on the team, yeah.

Do you have a link?



M.U.G.E.N said:
VGKing said:
Why do people keep saying the next Xbox will be more powerful than the PlayStation? From what I understand, the NextBox is more focused on apps/multitasking etc., while the PS4 is made primarily for gaming. The PS4 would have an edge in most games, but the real deciding factor is the RAM: 8 GB of DDR3 for the Xbox versus 4 GB of GDDR5 for the PS4.


Mainly because two users over at GAF (proelite and aegis) are saying so, or at least trying to push that narrative. One of them still works for MS, and the other used to work for MS and is currently at Polygon, which IIRC has MS's backing (not sure about that one tbh, so feel free to correct me), so yeah.

 

The one they call Mr. Accurate, a dev under NDA, said both are pretty much equal :) just strengths in different areas.

 

I'm 100% sure that he works for Ubisoft.

"In my opinion both machines will be very close, with some details in favor for each one, but I think we will have a ps360 situation again with 2 machines very close.

And some details in the articles are simply wrong"

http://www.neogaf.com/forum/showpost.php?p=46551080&postcount=274

 



enditall727 said:
Oh, and does anybody know how high R&D costs are? I read somewhere that it cost Sony about $170 to build each Vita, but R&D supposedly drove the all-in cost up to around $310 by the time it was in retailers' hands,

meaning they were losing roughly $60 on each Vita sold at launch, given its $250 retail price.

Yeah, people very often neglect R&D, marketing and shipping costs, as well as the retailer's cut, taxes (import/sales) and assembly costs, when discussing whether or not a system is "profitable".

They just add up certain components (sometimes even just raw materials) and expect that to magically turn into a device sitting on retailers' shelves.
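To make that arithmetic concrete, here is a minimal sketch using the rumored Vita figures from the post above; the $170 / $310 / $250 numbers are the thread's estimates, not confirmed figures.

```python
# Per-unit profitability sketch using the rumored Vita numbers quoted above.
# All inputs are the thread's estimates, not official Sony figures.

bom_cost = 170.0      # rumored component/assembly cost per unit
all_in_cost = 310.0   # rumored cost once R&D, marketing, shipping etc. are spread per unit
retail_price = 250.0  # launch retail price

overhead_per_unit = all_in_cost - bom_cost   # everything beyond raw components
loss_per_unit = all_in_cost - retail_price   # ~$60 lost per console at launch

print(f"Overhead beyond components: ${overhead_per_unit:.0f} per unit")
print(f"Loss per unit at launch:    ${loss_per_unit:.0f}")
```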



enditall727 said:
disolitude said:
VGKing said:
The RAM difference between the Xbox 720 and the PS4 comes down to quantity vs. quality.
The Xbox has the quantity, the PS4 has the quality. The question is: which games work best with which set-up?


Nah, it really doesn't come down to that at all.

If this rumor is to be believed, it comes down to the following:

PS4 GDDR5 - able to push much more data per second (higher bandwidth), but with higher access latency.

720 DDR3 - pushes less data per second, but with lower latency, plus there is more of it to work with.

PS4 - a .50 caliber machine gun firing 5 rounds per second.

720 - a .30 caliber machine gun firing 10 rounds per second, with more ammo to fire.

 

I really don't know how else I can illustrate this to people... GDDR5 doesn't necessarily mean faster performance. In fact, for non-GPU tasks it's often worse.

this is confusing..

 

you basically just said that the 720's ddr3 is faster AND has more ammo than the PS4's GDDR5

 

I thought the GDDR5 was faster though..

Only faster for graphics... and the 720 has more DDR3 than the PS4 has GDDR5.
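As a rough illustration of the "quantity vs. bandwidth" point above, here is a minimal sketch of peak theoretical bandwidth for the two rumored setups; the bus widths and data rates are illustrative assumptions based on the rumors in this thread, not confirmed specs.

```python
# Back-of-the-envelope peak bandwidth comparison for the two rumored memory setups.
# Bus widths and data rates are illustrative assumptions, not confirmed hardware specs.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: float) -> float:
    """Peak theoretical bandwidth = bus width in bytes * transfers per second."""
    return (bus_width_bits / 8) * data_rate_mt_s * 1e6 / 1e9

# Rumored PS4 (Orbis): 256-bit GDDR5 at an effective 5500 MT/s -> ~176 GB/s
ps4_bw = peak_bandwidth_gb_s(256, 5500)

# Rumored next Xbox (Durango): 256-bit DDR3 at 2133 MT/s -> ~68 GB/s
xbox_bw = peak_bandwidth_gb_s(256, 2133)

print(f"GDDR5 setup: ~{ps4_bw:.0f} GB/s peak")
print(f"DDR3 setup:  ~{xbox_bw:.0f} GB/s peak")
# Bandwidth is only half the story: GDDR5 trades higher latency for that
# bandwidth, and total capacity (8 GB vs 4 GB) matters for non-GPU workloads.
```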



MaulerX said:
I don't know. If next gen lasts 10 years, then I can honestly see 8 GB of slower RAM prevailing over 4 GB of faster RAM in the long run. Also, while I'm not sure I believe the 3 GB the Durango supposedly reserves for its OS, compared with the alleged 512 MB reserved for the Orbis OS it sends the message that Microsoft wants to make the Durango a multitasking and multimedia powerhouse. I'm also interested to see the Durango's "secret sauce", which this article admits they really don't know much about.

Truth is, we now live in a world where a purely gaming console might not cut it for the masses. We might see a situation where the PS4 finally gets cross-game chat, but the 720 goes further with cross-game video chat while simultaneously recording your favorite TV show in the background and offering fast app switching (just my speculation based on how much RAM each box has and how much is reserved for the respective OS).


This would be more convincing if it weren't for the fact that the most popular gaming console of this generation was the one that was purely gaming focused.

I mean, hell, the Wii didn't even have competent online.



disolitude said:

A 7970M for 100 bucks? I want what you're smoking... the 7970M is a $200 upgrade over a 7800-series GPU on most laptops.

Do you know what a single GPU or CPU chip actually costs? $200 is the retail price for us consumers... and the same chip gets used across other models, so a $100 chip ends up in $200-$500 products; it all depends on how well the silicon clocks.

I'll try to explain how much a single chip costs to manufacture...

1. First, there is no such thing as the cost to manufacture a single GPU/CPU chip... the cost is to manufacture one wafer of chips. The price of a single 300 mm, 28 nm wafer manufactured by TSMC today is roughly $5,000, and that is fixed whether the chip is complex or not.

2. What defines the final cost of a single chip is how many "good" chips can be cut from that wafer. If the chip is simple and small, the wafer holds more chips; if it is complex and big, it holds fewer. E.g. a 300 mm, 65 nm wafer can hold 94 NVIDIA GT200s, while a 300 mm, 45 nm wafer can hold about 2,500 Atom processors. Of course those are the maximum numbers of chips that fit on those wafers, and you have to remember that the "bad" chips are useless. For a simple chip the loss to bad dies is low (under 10% of the chips on a wafer), but for a complex chip whose production hasn't matured yet you lose a lot, as with the GT200 or Cell (nearly 50% of the chips per wafer were lost).

The NVIDIA GT200 was a big, complex chip that cost about $112 per chip to make... of the 94 chips per wafer, roughly 40 were lost, so a wafer only yielded about 50 chips in the end. The GT200 went into two retail products, the GTX 280 ($600) and the GTX 260 ($300)... in this case NVIDIA chose to ship chips with at least 480 usable SPs, because fewer than 10% of the chips had the full 512 SPs usable. A wafer where 90% of the chips get discarded can never make money, so the bar was set where at least 60% of the chips had 480 usable SPs... which is why NVIDIA never shipped the full GT200.

I give that example because the GT200 was the first chip to cost more than $100; it was 500-600 mm² (a monster). So any chip under 500 mm² made at 55 nm was cheaper than the GT200 (< $100).

That ~$100+ for the GT200 is based on a 55 nm process with a wafer cost of $8,000... the wafer cost at 28 nm is $5,000, so it's cheaper; the same chip would cost under $50 at 28 nm.

3. Now, about the PS4: the estimated die size for the 7970M GPU is 212 mm² at 28 nm. A 300 mm wafer can hold roughly 300 chips of ~200 mm² (it depends on the chip's shape: square, rectangular, etc.), so a $5,000 wafer gives you ~300 candidates. From there it all depends on how many of those chips are good:

* 90% of the chips good: $5000 / 270 chips = ~$19 per chip
* 70% of the chips good: $5000 / 210 chips = ~$24 per chip
* 50% of the chips good: $5000 / 150 chips = ~$34 per chip

That's the cost of a single 7970M chip, not the full video card or the retail price to consumers.

4. Back to the PS4, and how Sony could buy or manufacture the 7970M chip, assuming 70% of the chips come out good:

+ Sony buys the design and manufactures it itself (like Microsoft did with the R500): ~$25 per chip
+ Sony has AMD manufacture it: ~$40-45 per chip (AMD uses GlobalFoundries, so roughly $10 margin per chip for each company)
+ Sony buys the design and manufactures at TSMC: ~$35-40 per chip (Sony only has to pay TSMC)
+ Sony has AMD manufacture it at TSMC: ~$45-50 per chip (same as the AMD/GlobalFoundries case, though I think TSMC is a little more expensive)

In any case it's impossible for a 212 mm² chip at 28 nm on a 300 mm wafer to cost Sony over $50 per chip... it's just impossible, unless fewer than 30% of the chips are good and usable.

So what am I smoking??? I think the guys here know nothing about how much a CPU/GPU chip costs.

 

That said... I think the full chip (CPU + GPU together as an APU) will be around 400 mm²: bigger, more complex, fewer good chips per wafer. So I'd estimate a cost to Sony of $60 to $80 per chip at ~50% good chips. The $200 I put in my estimate for you is the all-in cost of getting this chip working on the console motherboard, including the components needed to keep it cool and below critical temperatures.

 

And yes... AMD and Intel sell the same ~$50 chips in CPU models priced from $200 to $999 (the chips with the highest clock potential go into the top $999 CPUs).
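A minimal sketch of the arithmetic in points 3 and 4 above, using the post's own assumptions (a ~$5,000 wafer, ~300 candidate dies of ~212 mm², and a range of yields); the wafer price and die count are the poster's estimates, not official figures.

```python
# Cost-per-good-chip sketch, following the wafer-cost reasoning above.
# The wafer price and candidate-die count are the post's assumptions, not official data.

WAFER_COST = 5000.0   # ~$5,000 for a 300 mm, 28 nm TSMC wafer (per the post)
CANDIDATE_DIES = 300  # ~300 dies of ~212 mm^2 fit on a 300 mm wafer (per the post)

def cost_per_good_chip(yield_fraction: float) -> float:
    """Spread the fixed wafer cost over only the dies that come out usable."""
    good_dies = int(CANDIDATE_DIES * yield_fraction)
    return WAFER_COST / good_dies

for y in (0.9, 0.7, 0.5):
    print(f"{y:.0%} yield -> ~${cost_per_good_chip(y):.0f} per good chip")
# Roughly matches the bullets above: ~$19, ~$24 and ~$33-34 per chip.
```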



To add to that... here are some costs of older AMD/Intel CPUs.

Assuming:

0.002 Defects per mm^2
$4900 per 300mm CMOS Wafer
$5300 per 300mm SOI Wafer
$2700 per 200mm SOI Wafer
alpha of 1
Dies per Wafer = {[pi * (Wafer Diameter / 2)^2] / Die Area} - {pi * Wafer Diameter / [2 * Die Area]^(1/2)}
Yield = (1 + [Defects per Area * Die Area / alpha]) ^ (-alpha)
$10 burning, binning and packaging cost for single die chips
$15 burning, binning and packaging cost for dual die chips

AMD Agena (9XXX):
65nm, 300mm Wafer
285 mm^2 Die
= 203 Dies per Wafer
= 63.69% Yield
= 129 Usable Dies per Wafer
= $41.09 per Die
= $51.09 per Chip

intel Yorkfield (2x 6MB) (Q9XXX):
45nm, 300mm Wafer
2x 107 mm^2 Dies
= 2x 290 Dies per Wafer
= 82.37% Yield
= 2x 238 Usable Dies per Wafer
= $20.52 per 2x Die
= $35.52 per Chip

intel Kentsfield (2x 4MB) (Q6XXX):
65nm, 300mm Wafer
2x 143 mm^2 Dies
= 2x 213 Dies per Wafer
= 77.76% Yield
= 2x 165 Usable Dies per Wafer
= $29.56 per 2x Die
= $44.56 per Chip

 AMD Agena (8XXX):
65nm, 300mm Wafer
285 mm^2 Die
= 203 Dies per Wafer
= ~68.68% Yield
= ~139 Usable Dies per Wafer (of which ~133 could be sold as quads)
= ~$38.13 per Die
= ~$48.13 per Chip

AMD Kuma (7XXX):
65nm, 300mm Wafer
~150 mm^2 Die
= ~405 Dies per Wafer
= ~76.92% Yield
= ~312 Usable Dies per Wafer
= ~$17.01 per Die
= ~$27.01 per Chip

AMD Windsor (X2+):
90nm, 200mm Wafer
219 mm^2 Die
= 113 Dies per Wafer
= 69.54% Yield
= 79 Usable Dies per Wafer
= $35.83 per Die
= $45.83 per Chip

intel Wolfdale (6MB) (E8XXX):
45nm, 300mm Wafer
107 mm^2 Die
= 580 Dies per Wafer
= 82.37% Yield
= 477 Usable Dies per Wafer
= $10.26 per Die
= $20.26 per Chip

intel Conroe (4MB) (E6XXX):
65nm, 300mm Wafer
143 mm^2 Die
= 426 Dies per Wafer
= 77.76% Yield
= 331 Usable Dies per Wafer
= $14.78 per Die
= $24.78 per Chip

AMD Brisbane (X2+):
65nm, 300mm Wafer
126 mm^2 Die
= 488 Dies per Wafer
= 79.87% Yield
= 389 Usable Dies per Wafer
= $13.61 per Die
= $23.61 per Chip

intel "Ridgefield" (3MB) (E7XXX):
45nm, 300mm Wafer
~83 mm^2 Die
= ~757 Dies per Wafer
= ~85.76% Yield
= ~649 Usable Dies per Wafer
= ~$7.55 per Die
= ~$17.55 per Chip

intel Alendale (2MB) (E4XXX):
65nm, 300mm Wafer
111 mm^2 Die
= 558 Dies per Wafer
= 81.83% Yield
= 456 Usable Dies per Wafer
= $10.74 per Die
= $20.74 per Chip

AMD Manilla (Sempron):
90nm, 200mm Wafer
126 mm^2 Die
= 201 Dies per Wafer
= 79.87% Yield
= 160 Usable Dies per Wafer
= $16.85 per Die
= $26.12 per Chip

intel Conroe-L (4XX):
65nm, 300mm Wafer
90 mm^2 Die
= 715 Dies per Wafer
= 84.75% Yield
= 606 Usable Dies per Wafer
= $8.32 per Die
= $18.32 per Chip

http://www.overclockers.com/forums/showthread.php?t=550542
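For reference, here is a small sketch implementing the dies-per-wafer and yield formulas listed at the top of that post. The constants are the post's stated assumptions; the results land close to (though not always exactly on) the quoted per-chip figures, since the original numbers involve some rounding.

```python
import math

# Die-cost estimator using the formulas and assumptions from the post above:
# 0.002 defects/mm^2, alpha = 1, fixed wafer prices, flat packaging cost per chip.

DEFECTS_PER_MM2 = 0.002
ALPHA = 1.0

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Gross die count: wafer area over die area, minus an edge-loss term."""
    gross = math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def yield_fraction(die_area_mm2: float) -> float:
    """Yield = (1 + defect density * die area / alpha) ^ (-alpha)."""
    return (1 + DEFECTS_PER_MM2 * die_area_mm2 / ALPHA) ** (-ALPHA)

def cost_per_chip(wafer_cost: float, die_area_mm2: float,
                  wafer_diameter_mm: float = 300.0,
                  packaging_cost: float = 10.0) -> float:
    """Wafer cost spread over good dies, plus burn/bin/package cost per chip."""
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_fraction(die_area_mm2)
    return wafer_cost / good_dies + packaging_cost

# Example: AMD Agena (285 mm^2 die, 65 nm, $5,300 SOI wafer, single-die chip).
# Yield comes out at ~63.7% and the cost per chip lands near the ~$51 quoted above.
print(f"Agena yield:     {yield_fraction(285):.2%}")
print(f"Agena cost/chip: ${cost_per_chip(5300, 285):.2f}")
```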



ethomaz said:

*snip*

That was actually a really informative post. Thanks!



Scoobes said:

That was actually a really informative post. Thanks!

Thanks... here's an illustrative picture of a 300 mm wafer with 94 GT200 chips. You can see the dies around the edges are not complete chips, and of those 94 only a percentage are usable "good" chips.

http://www.anandtech.com/show/2549