Kynes said: You keep on talking without knowing what you're talking about. AMD's older cards had low efficiency due to the VLIW architecture. Nowadays both AMD and nVidia use high-efficiency scalar architectures, so as always, you spread technical bullshit. PS: It's funny how people use a number for the GDDR5 cost from a GAF user as if it's the truth. Those modules weren't in production until recently, and AFAIK they haven't been used on any card, so we don't know and can't even guess how much they cost. We can't know unless we have a quotation from the manufacturer, and I doubt a guy working at a bank has one.
- Windows overhead
- Drivers overhead
- Graphics API overhead
- PCI-E overhead
- A lot more overhead
And before you say I'm talking bullshit... please do some research first, because this is not my claim... MS is saying that current GPUs have about 50% raw power efficiency, the X360 had 60% efficiency, and the Nextbox will have close to 100% efficiency.
The eSRAM, Data Move Engines, and fixed-function units that don't exist on PC were created to achieve that... the PS4 has 8 ACEs with 8 queues each (standard GCN has 2 of each) to achieve close to 100% raw power efficiency too.
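To make the efficiency argument concrete, here's a minimal sketch of what those percentages mean for usable throughput. The peak TFLOPS values below are hypothetical placeholders (not official specs); the efficiency figures are the ones claimed above.

```python
# Minimal sketch: what "raw power efficiency" implies for effective throughput.
# Peak TFLOPS values are hypothetical placeholders; the efficiency percentages
# are the claimed figures (50% for a PC GPU with OS/driver/API/PCI-E overhead,
# ~100% for a console with fixed-function units and a low-overhead stack).

def effective_tflops(peak_tflops: float, efficiency: float) -> float:
    """Usable throughput after overhead, as peak * efficiency."""
    return peak_tflops * efficiency

pc_gpu      = effective_tflops(peak_tflops=2.0, efficiency=0.50)  # hypothetical 2.0 TFLOPS PC card
console_gpu = effective_tflops(peak_tflops=1.2, efficiency=1.00)  # hypothetical 1.2 TFLOPS console GPU

print(f"PC GPU effective:      {pc_gpu:.2f} TFLOPS")
print(f"Console GPU effective: {console_gpu:.2f} TFLOPS")
```

Under those placeholder numbers, a weaker console GPU at ~100% efficiency ends up ahead of a nominally stronger PC card at 50% efficiency, which is the point being argued.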
I'm not using random data or making assumptions... I'm just sharing what these companies say... not me... so call them bullshitters, not me.
The PS4 part... the whole cost estimate was made using 32 chips of 2Gb GDDR5... the manufacturers publish the cost of each chip... you can search on Google to find the official price of the 2Gb GDDR5 chips... with a mass-production contract, Sony will pay less than the official price.
In any case, nowhere close to the over-$300 cost of the Blu-ray drive.
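For the memory-cost point, here's a minimal sketch of the kind of estimate being described: 32 chips of 2Gb GDDR5 at a per-chip list price, with a discount for a mass-production contract. The per-chip price and discount below are hypothetical placeholders, not quoted figures.

```python
# Minimal sketch of the cost estimate described above: 32 x 2Gb GDDR5 chips
# (32 * 2 Gbit = 64 Gbit = 8 GB total) at an official per-chip price, minus a
# discount for a mass-production contract. Both the price and the discount are
# hypothetical placeholders, not real quotations.

CHIPS              = 32
CHIP_DENSITY_GBIT  = 2
OFFICIAL_PRICE_USD = 9.00   # hypothetical per-chip list price
CONTRACT_DISCOUNT  = 0.30   # hypothetical 30% mass-production discount

total_gb      = CHIPS * CHIP_DENSITY_GBIT / 8      # gigabits -> gigabytes
list_cost     = CHIPS * OFFICIAL_PRICE_USD
contract_cost = list_cost * (1 - CONTRACT_DISCOUNT)

print(f"Total memory:  {total_gb:.0f} GB GDDR5")
print(f"List cost:     ${list_cost:.2f}")
print(f"Contract cost: ${contract_cost:.2f}")
```

Swap in whatever per-chip price you find published and the same arithmetic gives the estimate being discussed.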