
Nvidia's GTX480/Fermi is broken

Soleron said:
ssj12 said:
And they have a reference GPU to prove this? If they do I'll believe them; if not, then frankly you're blowing smoke on an untested product.

He's been right on every statement or claim we can test. And these haven't been vague things; when he said them, they were considered raving fanboyism and complete fabrication. At the time there were many conflicting rumours, with sites changing their story every few days (Fudzilla has listed seven distinct release dates for Fermi so far; none correct), yet Charlie stuck to his March prediction, giving full and detailed reasoning.

- Last May, he said that Fermi would arrive in March. Everyone else was sure it would be in time for the Windows 7 launch.
- GT212 being cancelled. It was. GT214 being redone as GT215. It was. All of GT215-218 being severely delayed from the original Q2 timeframe to November at retail.
- Fermi needing two spins, and the timings of their respective tapeouts and first silicon. These are confirmed, because the six-month delay they imply has passed.
- Huge power requirements. The demos at CES (months after the claim) proved this with their power plugs, and it was confirmed again in last month's public statement that Fermi will run hotter than GT200, again months after he said it.
- The fake Fermi boards at GTC. First pointed out by him and now accepted.
- The clockspeeds and bins being disappointing were confirmed a few months later by the Tesla document, which I'll link if you ask.
- The rest of the stuff (how TSMC works, how long it takes to respin, the cost estimates, the BoM estimates) are all public knowledge if you phone the respective suppliers as a customer.

And all this with zero confirmed evidence to the contrary. He also has a forum record of a few hundred posts where you can see the story unfolding, with reasoning. If this were fabricated, he would have tripped up somewhere in that record by now.


The reason I actually believe Nvidia could fuck up like this with Fermi is my experience with 3D Vision. Never have I seen a company give such lackluster support to such a cool product.

Here's an example: Bioshock 2 has native 3D Vision support built into the game. The devs worked on it together with Nvidia and made the game 100% compatible. But guess what... the driver update for the game isn't ready yet. ETA unknown.

So of the three games I wanted to play on PC this year (Mass Effect 2, Bioshock 2, Alien vs Predator), NONE are 3D Vision compatible yet.

I think Nvidia is branching out into other markets way too fast and isn't able to deliver a competent product across all of its divisions.



disolitude said:
...



Yes. It is obvious from the design of Fermi (officially detailed last month) that it is meant as a GPGPU/workstation compute product rather than a desktop graphics card.

Nvidia's core business is going away. Clarkdale and Llano are removing the need for low-end discrete GPUs, which both AMD and Nvidia rely on for the volume to pay off the massive costs of designing a GPU. The high end is declining, because a $100 GPU is good enough for 95% of us now. And they've stopped making chipsets, because AMD and Intel are making the chipset irrelevant by bringing the northbridge and graphics on-die. ATI avoided this by merging with AMD, so they will make AMD's chipsets and AMD's on-die graphics. The goal is complete integration of CPU and GPU, and Nvidia doesn't have an x86 CPU to do it with. So they've bet everything on general-purpose compute.

The problem is that it's a tiny market, and as a company you need to stick rigidly to power limits and schedules. One screw-up (GT200's SP/DP performance) and you no longer have the initiative. Two (Fermi's delays and power issues) and you've lost the market. Also, Nvidia are pushing CUDA, 3D Vision and PhysX when no one will adopt them, because they're tied to one vendor's hardware. OpenCL, DirectCompute, Intel's Havok physics, and whatever becomes a cross-vendor 3D standard will actually get better adoption.
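To make the lock-in point concrete, here's a minimal SAXPY sketch in CUDA (my own illustration, not anything from Nvidia's docs or this thread). Source like this only builds with nvcc and only runs on Nvidia GPUs:

// Hypothetical example: SAXPY (y = a*x + y) as a CUDA kernel,
// tied to Nvidia hardware by construction.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;                                   // device buffers
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy); // 256 threads per block
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %.1f (expect 4.0)\n", hy[0]);      // 2*1 + 2 = 4
    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}

The OpenCL version of that same kernel is a short string handed to clBuildProgram() at runtime, and it runs on AMD, Intel or Nvidia hardware alike, which is exactly why I expect the cross-vendor APIs to win adoption.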

 



disolitude said:
...

The reason I actually believe Nvidia could fuck up like this with Fermi is my experience with 3D Vision. Never have I seen a company give such lackluster support to such a cool product.

Here's an example: Bioshock 2 has native 3D Vision support built into the game. The devs worked on it together with Nvidia and made the game 100% compatible. But guess what... the driver update for the game isn't ready yet. ETA unknown.

So of the three games I wanted to play on PC this year (Mass Effect 2, Bioshock 2, Alien vs Predator), NONE are 3D Vision compatible yet.

I think Nvidia is branching out into other markets way too fast and isn't able to deliver a competent product across all of its divisions.

Well, Nvidia's main market isn't even the general consumer; it's the professional market, basically their Quadro series GPUs. So really, their drivers will come when they come. They still need to add SLI support for BFBC2's beta.




AMD's driver support for Linux is still terrible (ATI cards). I'm an extreme-budget PC gamer, so I use the top-end stuff from 1-2 years ago. Until AMD's drivers are no longer complete shit, Nvidia is where I stay. :/



ssj12 said:
...

Well, Nvidia's main market isn't even the general consumer; it's the professional market, basically their Quadro series GPUs. So really, their drivers will come when they come. They still need to add SLI support for BFBC2's beta.

Professional may be the most profitable segment, but they need desktop sales for the volume to pay off the fixed costs of design. They could not survive on the current professional market alone, even if they dropped all desktop development, departments and employees.

@rendo

What if there were no Nvidia? It's a serious possibility: they've all but discontinued the GTX 260 and above (stock levels are very low), the lower-end cards can't compete (GT210 vs. 5450, GT220 vs. 5570, GT240 vs. 5670), and in the middle they're still selling the GTS 250, a die-shrunk card based on the 2006 G80 design. With no replacements in sight, how can they turn a profit?

AMD are trying their best given the size of the Linux market for them. Their closed driver, while flawed, is a lot better than it was three years ago, and the open driver is doing very well for the resources it has. Admittedly Nvidia's driver is better, but it's closed-source, as you know.



rendo said:
AMD's driver support for Linux is still terrible (ATI cards). I'm an extreme-budget PC gamer, so I use the top-end stuff from 1-2 years ago. Until AMD's drivers are no longer complete shit, Nvidia is where I stay. :/

lol... I'm sorry, every time someone says "it doesn't work in Linux" I remember this chart.

[chart image]



Soleron said:
Scoobes said:
When I saw a negative thread about Nvidia's new chip, I somehow knew you'd be the one posting it, Soleron, lol.

I like AMD, but I'm not a fanboy. I actually want competition in the GPU market; those 5xxx prices need to come down. Charlie's articles weren't really accepted by the mainstream until recently, when it turned out everything he's written about Fermi since early last year is correct (well, we'll know for certain next month).

I never said you were, but whenever I've been in a GPU thread and Nvidia's new chip has come up, you've always slated it. So when this thread appeared, I knew you'd be the one starting it.



I don't see a chart.



Scoobes said:
...

I never said you were, but whenever I've been in a GPU thread and Nvidia's new chip has come up, you've always slated it. So when this thread appeared, I knew you'd be the one starting it.

I don't like the idea of people waiting six months for a GPU I know won't be better value than the 5xxx series, when they wanted one right away.

My whole position rests on Charlie being reliable, of course, and so far he appears to be. I would change my view in a moment if he were proven wrong by this launch (and never believe his articles again). There is no other consistent source of pre-release GPU news.



Random Canadian said:
I don't see a chart.

Right-click and select "Show Picture"...