Persistantthug said:
When did I say the GPU may cost over $200? For one, it's an APU design with both, and for two, any prices I've given were clearly speculation that would include discounts due to customizations and volume.
superchunk said:
Earlier you were referring to price and wattage... I then went on to remind you that the GPUs in consoles are chips only, and custom as well, not the same as a full PC graphics card, and that's why a 7900-series GPU for the PS4 was entirely possible.
Now you see, I very well might be right. :)
Persistantthug said: But you don't know the wattage or the cost of a custom GPU chip. Keep in mind that I said chip... because unlike what you might buy for your PC with a 7950 or 7970, for example, consoles only use the chip, not the whole board.
The board/PCB is one of the least expensive components of a graphics card. The thermal/cooling system, the actual GPU chip, and the dedicated VRAM are where the bulk of the costs are, and all of those have to be in the console. This is why most people don't care for the distinction between the total cost of a console GPU and a graphics card on the PC. Here is the breakdown of the actual component costs for GPUs from last generation. It's not directly comparable to HD7970/GTX680, but it gives you an idea what AMD/NV charge Add-In Board Partners (AIBs) for each part.
Replace the HD6950 with an HD7950 to get your 3TFlop+ estimate and you are looking at $150-160 for the graphics card kit, purchased directly from AMD. But that's assuming the HD7950 costs as much as the HD6950 to manufacture. It likely costs more, because the HD6950 debuted at $299 while the HD7950 launched at $449, and AMD's GPU division isn't exactly making money hand over fist, which suggests that manufacturing costs are higher for the current gen than for the HD6xxx series. There have also been many articles, as well as NV's own presentation, stating that cost per wafer increases from the 40nm to the 28nm generation; this is known since TSMC raised 28nm wafer prices vs. the 40nm generation, so 28nm GPUs cost AMD substantially more to fab than the HD6950/6970 series. Anything faster than an HD7870 2GB (HD7970M) sounds like wishful thinking in a PS4/720 console due to the cost and power consumption limitations, unless they take something like an HD7950/8950 and gimp it on the ROP/memory bandwidth side (but then it won't be a high-end GPU).
The HD7870 has 20 Compute Units and each Compute Unit has 64 Stream Processors. If they remove 2 Compute Units (the red GCN blocks in the diagram below) and drop the GPU clock from 1000 MHz to 800 MHz, you end up with a 10% cut-down HD7870 with a 20% reduction in GPU clock speed:
18 Compute Unit custom "Pitcairn" HD7870 in PS4:
18 CUs x 64 SPs = 1152 Stream Processors @ 800 MHz GPU clock x 2 floating-point ops per clock = 1.84 TFLOPS (a quick sanity check of this math follows below).
^^^ A slightly cut-down HD7870 actually sounds like a very reasonable rumor for PS4 because these specs are very similar to AMD's fastest single GPU in the mobile space -- HD7970M:
1280 SPs @ 850 MHz
http://www.notebookcheck.net/AMD-Radeon-HD-7970M.72675.0.html
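To double-check the TFLOPS arithmetic above, here's a minimal Python sketch of the standard theoretical-throughput formula for GCN parts (the CU counts and clocks are the rumored/assumed figures from this post, not confirmed specs):

```python
def gcn_theoretical_tflops(compute_units, clock_mhz, sps_per_cu=64, flops_per_clock=2):
    """Theoretical single-precision peak: stream processors x clock x 2 FLOPs
    (one fused multiply-add) per clock. Real-world throughput is lower."""
    stream_processors = compute_units * sps_per_cu
    return stream_processors * clock_mhz * 1e6 * flops_per_clock / 1e12

print(gcn_theoretical_tflops(18, 800))    # rumored cut-down Pitcairn: ~1.84 TFLOPS
print(gcn_theoretical_tflops(20, 1000))   # full HD7870 GHz Edition:   ~2.56 TFLOPS
print(gcn_theoretical_tflops(20, 850))    # HD7970M (1280 SPs):        ~2.18 TFLOPS
```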
If we consider that the PS4 is not going to be the size of a mid-tower PC, we can deduce that thermal/heat dissipation limits and power consumption are much more important. In the smaller form factor of the PS4, using a cut-down Pitcairn is akin to using a near top-of-the-line mobile GPU in AMD's laptop product stack. I'd say that's a pretty reasonable balance of power consumption, performance and cost. There is a dramatic size difference between the HD7870 "Pitcairn" at 212mm2 vs. the HD7950/7970 "Tahiti XT" at 365mm2 (a 72% larger chip). If you are buying a 300mm wafer, the price per wafer is fixed, but you can fab more of the smaller-sized chips. Thus, your cost to manufacture each chip is lower and yields are higher, since larger chips tend to have more manufacturing flaws and are harder to fab. Over millions of chips, these costs and yields add up (rough sketch after the link below).
http://techreport.com/review/22573/amd-radeon-hd-7870-ghz-edition
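Here's a rough back-of-the-envelope sketch of why that die-size gap matters for cost. The wafer price below is a made-up placeholder (actual 28nm wafer pricing isn't public), and the dies-per-wafer formula is a common first-order approximation that ignores defect density:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # First-order approximation: wafer area / die area, minus an edge-loss term.
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

WAFER_COST_USD = 5000  # hypothetical placeholder price, purely for illustration

for name, area in [("Pitcairn (HD7870), 212mm2", 212), ("Tahiti (HD7950/70), 365mm2", 365)]:
    dies = gross_dies_per_wafer(area)
    print(f"{name}: ~{dies} gross dies/wafer, ~${WAFER_COST_USD / dies:.0f}/die before yield loss")
```

On top of the raw die count, the smaller chip also yields better, so the real per-chip cost gap is wider than this simple ratio suggests.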
@ superchunk,
For PS4 display support, I think upscaling to 4K TV resolution will be supported. I think Sony will want to market the PS4 as the next-gen multimedia device. Since the HD7000 series supports 4K displays, it's not unreasonable to believe that the PS4 could upscale movies to 4K, support 4K Blu-ray, etc. The PS4 could still have 4K media capability even if games run at 1080p only.
Learning a lot of interesting information from this thread, not so much on the specs since they're rumours, but on tech in general. Cheers people.
@Blue Falcon - as usual, you never fail to deliver. Great post, thanks for that breakdown table.
What I find interesting in it is that ATI's card that costs $199 (the 7870 can be found for as low as $209.99) is manufactured for $119, and the GPU part is just $50. Considering that the 2 other major parts are RAM and thermal, and both will be shared between CPU and GPU in a console, a GPU that is close to the 7870 really doesn't seem so far-fetched. I think that most (reasonable) people were expecting something like that, and though it may not be as big a jump as it should be considering the length of this gen, it's still some 9-10x the 360, and in my opinion, that's quite a leap.
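As a quick sanity check on that "9-10x" figure, here's a sketch using the commonly cited ~240 GFLOPS theoretical peak for the Xbox 360's Xenos GPU (ballpark theoretical numbers only, not real-world performance):

```python
XENOS_TFLOPS    = 0.24  # commonly cited theoretical peak for the Xbox 360 GPU
HD7870_TFLOPS   = 2.56  # full HD7870 GHz Edition
CUT_DOWN_TFLOPS = 1.84  # the rumored 18 CU / 800 MHz part from earlier in the thread

print(f"Full HD7870 vs 360:       {HD7870_TFLOPS / XENOS_TFLOPS:.1f}x")    # ~10.7x
print(f"Cut-down Pitcairn vs 360: {CUT_DOWN_TFLOPS / XENOS_TFLOPS:.1f}x")  # ~7.7x
```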
No doubt this gen has been squeezed to no end. New hardware will be a great experience for all of us. Compare Resistance: Fall of Man to something like The Last of Us, worlds apart. Launch games for next gen should be quite remarkable considering we've been in this one too long, over-utilizing limited hardware specs. Only awesome to come, in my opinion.
Before the PS3 everyone was nice to me :(
After seeing @Blue Falcon's costs for some parts of a manufactured video card... now I'm "almost" certain the PS4/720 will be sold for less than $400 in the US market.
$350 is now my new bet.
HoloDust said: @Blue Falcon - as usual, you never fail to deliver. Great post, thanks for that breakdown table. What I find interesting in it is that ATI's card that costs $199 (the 7870 can be found for as low as $209.99) is manufactured for $119, and the GPU part is just $50. Considering that the 2 other major parts are RAM and thermal, and both will be shared between CPU and GPU in a console, a GPU that is close to the 7870 really doesn't seem so far-fetched.
Thanks!
You are right that it's surprising that the GPU chips aren't actually $200+ parts. Most people don't factor in the supply-chain --> retail middle-men costs along the way. Then they assume that a $500 GPU costs $350-400 to manufacture. In reality, once AMD/NV sell the parts to AIBs, the AIBs have to put these parts together, pay for their marketing/advertising, shipping costs, warranty costs, etc., and then the AIBs and retailers also tack on profits for selling these products. Then we end up with a GPU that sells for $350 on Newegg but that MS/Sony could really buy directly for ~$160.
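To illustrate how those layers stack up, here's a toy markup chain; every percentage below is an assumed, illustrative figure, not an actual AIB or retailer margin:

```python
def estimated_retail_price(kit_cost, aib_overhead=0.30, aib_margin=0.15, retail_margin=0.12):
    """Hypothetical chain: AMD/NV kit -> AIB assembly/marketing/shipping/warranty
    overhead -> AIB profit -> retailer profit. All percentages are assumptions."""
    aib_cost = kit_cost * (1 + aib_overhead)
    aib_sell_price = aib_cost * (1 + aib_margin)
    return aib_sell_price * (1 + retail_margin)

# A ~$160 kit bought directly from AMD ends up well past $250 at retail
print(f"${estimated_retail_price(160):.0f}")  # ~$268 with these assumed margins
```

A console maker buying the same kit directly skips most of those layers, which is exactly why the console GPU ends up far cheaper than the retail card price suggests.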
Here is the cost breakdown for older generations. When the HD4890 sold for about $217 at retail, the components for it cost $121 (the GPU chip was $55). When the GTX285 was going for $350 MSRP, the total components were about half of that, or $163 (the GPU chip cost $90). It's pretty clear from the chart I posted earlier and this even older-gen chart that NV charges much more for their parts at similar performance levels (the HD4870 kit was $87 to buy and it had better performance than the GTX260 that you'd have to purchase for $114). This explains why AMD was able to win all 3 designs in next-gen consoles.
I haven't been able to find the latest generational cost for HD7000/GTX600 series but I'll link it if I do.
Based on these cost breakdowns, I'll be pretty disappointed if the Xbox 720 only ends up with an HD7770 GHz GPU. I was honestly hoping for an HD7850-7870 for both PS4/720. I suppose MS might focus more on differentiating features such as Kinect 2.0 or some other things we don't know about.
More tweaking of info... feel free to assist me with specific info, and especially ranking/comparison info, now that parts are starting to become clearer.
OK, OP is updated with as much detail as I can gather.
For anyone doubting current listings, note that many reputable developers have now stated that these are correct. (not my specific thread, but the same details listed on other sites)