
Nvidia's GTX480/Fermi is broken

Soleron said:
drkohler said:
...

Here is a good article about the situation as it stands now. It is about ATI's RV870 chip, but reading through the article (and occasionally between the lines) one can figure out the (serious) problems Nvidia is currently facing:

http://www.anandtech.com/video/showdoc.aspx?i=3740

On the other hand, I find it rather strange that people think an entire Fermi/RV870-based graphics card (chips at least as complex as the best processors available) should be available for a measly $199-$299.

That article is really good; I'd recommend it to anyone. It's the story of RV870's creation straight from the engineers who designed it. I'd also recommend the earlier article written about RV770.

The 5850 and 5870 are priced a little high compared with the previous generation. But that is to be expected given the lack of competition and the fact that demand still exceeds supply (proof: look at the 5970's stock situation). They're still rationing Cypress chips, so they can charge what the market will bear.

Compared to 2007 and earlier though, these prices are fantastic. Remember when top-of-the-line was $800 with the 8800 Ultra? Or how nothing under $250 was a decent gaming card with X1x00 and 7x00 cards?

You can't fairly compare this to the pre-2007 VGA situation, since cards like the 8800 GT and the HD4850 basically let the performance genie out of the bottle at prices any PC gamer who really cared about high-performance graphics could afford. The only way you can roll back pricing trends on premium cards is if/when you have a monopoly.

Currently, PC gamers are paying a pretty big premium for the HD5870's extra ~10% of single-GPU performance. Even the 5850 doesn't get the immediate nod as the best ~$290 VGA solution based on raw performance alone.

But there's no arguing with the struggle to keep the 5850 and 5870 in stock relative to output. Micro Center doesn't even bother putting them on shelves when they do have them; they keep them in the stock room or behind the counter. Considering this is five months after release, that's a bit unusual. It's for reasons like this that I don't think ATI is "gouging" consumers with these cards, as they could clearly sell a lot more of them if they could just keep vendors stocked. Given the higher margins on top-tier VGA cards relative to the volume consumer cards (all readily available), there's no reason why they wouldn't if they could.




UPDATE:

http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/

The GTX480 is 5% faster than a 5870 on average. So worse than we thought.

The GTX470 is 10-15% slower than that, so probably has a slight lead on the 5850.
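A quick back-of-envelope check of how those percentages chain together; the ~0.85x gap between an HD5850 and an HD5870 is my own ballpark assumption, not a figure from the article:

```python
# Chain the quoted percentages, with the HD5870 as the 1.00 baseline.
# Assumption: HD5850 ~= 0.85x an HD5870 (rough ballpark, not benchmarked here).
hd5870 = 1.00
gtx480 = 1.05 * hd5870                  # "5% faster than a 5870 on average"
hd5850 = 0.85 * hd5870                  # assumed

for slower in (0.15, 0.10):             # "10-15% slower than that"
    gtx470 = gtx480 * (1 - slower)
    lead = (gtx470 / hd5850 - 1) * 100
    print(f"GTX470 at {gtx470:.2f}x a 5870 -> ~{lead:.0f}% ahead of an HD5850")
```

So on those numbers the GTX470 ends up roughly 5-11% ahead of a 5850, i.e. a slight lead, as stated.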

Given the massive die size, yield problems and power issues, I don't believe there's a price Nvidia can put them at that is both profitable and competitive.

Another issue: the chip's DX11 support is not ready. It has image quality issues in parts of the DX11 Unigine benchmark.



^ Very interesting read, Soleron. In any case it would seem that Nvidia fucked up royally. It's just like the Nvidia FX 5xxx series all over again.




Soleron said:
...

Nvidia was claiming/shooting for 60% greater performance than Cypress, so this puts them in a pretty bad spot. They're not going to be able to sell them for much more than the current $399 and $299 prices of the HD5870 and HD5850. The MSRP on those cards is supposed to be $379 and $259 even though they've been selling for more. I just don't see how a 5% increase in overall performance can be expected to command any sort of price premium over that.
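To put a rough number on that, here's a performance-per-dollar break-even using the ~5% figure and the $399 street price quoted above (illustrative only):

```python
# Price at which a GTX480 would merely match an HD5870 on performance per dollar,
# taking the ~5% average lead and the $399 street price at face value.
hd5870_price = 399
gtx480_perf = 1.05                       # relative to HD5870 = 1.00
breakeven = hd5870_price * gtx480_perf
print(f"Perf/$ parity price for a GTX480: ~${breakeven:.0f}")   # ~$419
```

Anything much above ~$420 and you're paying more per frame than on a 5870, before even considering power and heat.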

The article also mentions only 5,000-8,000 are being produced, with losses expected on each unit? It's almost like these cards are just placeholders so that Nvidia can have something competitive with the Cypress VGA cards.

I'm hoping some of these issues can be fixed with driver revisions, but that thermal envelope and die size are a serious liability. A 70% fan speed at idle, at 70°C, is a bit disturbing. No headroom for overclocking.



Time to sell nVidia's stocks and buy AMD's! :D:D:D

I guess those cards will be very expensive.




greenmedic88 said:
...

First off, I doubt Nvidia will release these cards without DX11 support. That is a sure way to make these cards fail commercially; at the very least they will add it in the drivers so they can say their cards support DX11.

 

But yes, I agree with you: these cards are an obvious screwup. They tried to do something that is just a bit past today's technology and they're going to pay for it. If all these rumors turn out to be true, then Nvidia is screwed. They could potentially be in the same place ATI was back at the launch of the 2xxx series (R600), and that's not where they want to be: stuck with a line of cards that cost a fortune to produce and don't have the performance to back it up.




Nvidia still has their chipset business, workstation cards and a huge lineup of mainstream VGA cards as the meat and potatoes of their VGA business. Apparently those top-end enthusiast cards don't add up to a major portion of their overall business; 5,000-8,000 cards is not much even if they were taking big profit margins on each card, which they aren't. The main thing Nvidia loses from the uninspiring Fermi is prestige. Technically, they will still have the fastest single-GPU cards, just overpriced and below what was initially claimed.

I guess this just means Nvidia will continue to use its two-generation-old G9x series GPUs and last-gen GT200s for their bulk sellers. But if they start cutting out the GTX 260, 275 and 285, I'm really not sure what they plan on selling to fill in the gaps. Not more rebadged G9x chips at any rate.

Clearly ATI has the better business plan with respect to their VGA cards this gen.



greenmedic88 said:
...

They've stopped developing new chipsets. There will be no more Nvidia chipsets for either Intel or AMD platforms.

I agree, enthusiast cards don't add up to much, but since they can't make competitive Fermi derivatives their lower end will be rebadged G9x until at least the end of 2010. But those cards aren't competitive now, much less with a hypothetical part-refresh from AMD in the next few months, and certainly not with the confirmed complete refresh in H2 2010.

Currently their lineup looks like this: G 210, GT 220, GT 240, GTS 250, GTX 260, GTX 275, GTX 285, GTX 295.

The G 210 loses to the 5450 on value, with a larger die preventing price cuts.
The GT 220 loses to the 5570 on value.
The GT 240 loses to the 5670 on value, with a larger die preventing price cuts.
The GTS 250 is competitive with the 5750; still, the 5750 has a much smaller die (181mm^2 vs. ~260mm^2), which matters for cost per chip (see the rough sketch below).
The GTX260 through 295 are very low on stock, soon to be completely gone. Even when they are available, the (upcoming 5830 and) 5850 destroy the value proposition of the first three, and the 5970 that of the 295.

So that's one competitive product out of eight. And AMD could easily sell the 5750 profitably at $90 which would kill that too.
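To illustrate why those bigger dies make price cuts so hard, here's a rough dies-per-wafer comparison using the standard first-order approximation (300mm wafers, ignoring defect density and scribe lines, so treat the absolute numbers as illustrative only):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return math.floor(math.pi * radius ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Die areas as quoted above: 181mm^2 for Juniper (5750/5770), ~260mm^2 for the
# GTS 250's G92b.
for name, area in [("Juniper (5750/5770), 181mm^2", 181),
                   ("G92b (GTS 250), ~260mm^2", 260)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300mm wafer")
```

That works out to roughly 340 vs. 230 candidate dies per wafer, i.e. AMD gets on the order of 45-50% more chips out of the same wafer cost before yield even enters the picture, which is why a $90 5750 is plausible for them and painful for Nvidia to match.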

Only one area looks OK, and that's workstation cards, but heat could be an issue and AMD will have 5xxx workstation cards soon enough.



Dirt 2 DX11, 2x AA, max detail except Crowd @ HIGH and distant car detail @ HIGH
Catalyst 10.2: Malaysia Free Race, Ladara Rally, start dash view - 76 FPS
Catalyst 10.3: Malaysia Free Race, Ladara Rally, start dash view - 82 FPS


Catalyst 10.3 looks like it's worth ~5% extra performance, at least in a couple of examples.

Read the rest here: http://forum.beyond3d.com/showpost.php?p=1397342&postcount=57
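For what it's worth, that single Dirt 2 scene actually works out to a slightly bigger jump than 5%; a quick check on the numbers quoted above:

```python
# Gain from the Catalyst 10.2 -> 10.3 Dirt 2 figures above (one scene only).
fps_102, fps_103 = 76, 82
print(f"10.2 -> 10.3: +{(fps_103 / fps_102 - 1) * 100:.1f}%")    # ~ +7.9%
```

Closer to +8% in that scene, though the other results in the linked post may well average lower.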

So if Charlie is right, then it's neck and neck. :-/ Not a good time to be talking about margins as small as that.

I wonder if Nvidia is going to claim Fermi has untapped potential?