GeForce 8800GTS 512 (128 shaders) is here


Hey, I know there are many PC gamers here who might be interested in this one. No, this is not your traditional 90nm GTS. This one has more shaders and is much faster than the older card. It even matches the GTX in some benchmarks... while being cheaper.

The most interesting part is that this card is clocked at 650 MHz core / 1940 MHz memory / 1625 MHz shaders.

And it can be overclocked to 830 MHz core / 2060 MHz memory / 2000 MHz shaders... on stock air cooling.

(*Not my screenshot) 

 This card seems to be a real performer for the money. What do you think?
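For what it's worth, those overclocks work out to a fair bit of headroom over stock. A quick back-of-the-envelope calculation using the clock figures quoted above:

```python
# Headroom of the reported overclock over stock clocks, per domain (all MHz).
stock = {"core": 650, "memory": 1940, "shader": 1625}
oc    = {"core": 830, "memory": 2060, "shader": 2000}

for domain in stock:
    headroom = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain}: +{headroom:.1f}%")
# core: +27.7%, memory: +6.2%, shader: +23.1%
```

So the core and shader domains have over 20% headroom on stock air, while the memory is already much closer to its limit.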




What's the price range? Sorry, I'm too lazy to look today.

I'm looking to spend $150-$200 on a graphics card, but if I could get top-end performance for $250 I probably would, and that's about what the 8800GT costs... is this better than the GT?



LEFT4DEAD411.COM
Bet with disolitude: Left4Dead will have a higher Metacritic rating than Project Origin, 3 months after the second game's release.  (hasn't been 3 months but it looks like I won :-p )

It's slightly better than the GT, but even the GT is currently selling at $300 and up due to shortages, and the new GTS 512 MB uses the same G92 chip with all 8 shader banks operating (so it's in even tighter supply than G92s with only 7 shader banks operating).

Ben, if I were in your position (which I am, since I'm also budgeting $200-250 on a graphics card), I would wait and pounce on the GT as soon as it falls back down to MSRP.



Sounds good, I need to wait on money for a couple months anyway.




I'd also recommend the GT if you can find one $250 or under. This new card is supposed to be $400+, right?

Just how much better is this card than the current GTX?




Remember, a graphics card refresh is coming quite soon. Q1 2008 will bring R680 and D8E, while H1 2008 will have D9E and R700. The technology you would buy now has been available for nearly a year. Also, any (non-ATI) card you buy now will not be DirectX 10.1 compliant.



Ubuntu. Linux for human beings.

If you are interested in trying Ubuntu or Linux in general, PM me and I will answer your questions and help you install it if you wish.

Game_boy said:
Remember, a graphics card refresh is coming quite soon. Q1 2008 will bring R680 and D8E, while H1 2008 will have D9E and R700. The technology you would buy now has been available for nearly a year. Also, any (non-ATI) card you buy now will not be DirectX 10.1 compliant.

DX10.1 isn't that superb, though; mandatory 32-bit floating-point filtering and an imposed 4x AA minimum are among its major highlights.

These things will all be good down the road, when we can pack a few more transistors into the cards, but until then these features are more hindrance than help.

Realistically speaking, right now even DX10 hasn't shown itself to be worthy of the extra fuss, and in general I think DX9 is still the consumer-preferred standard. Until DX10 as a base API can be considered worthwhile, I don't see why 10.1 would be any different. Honestly, I'm starting to get tired of this view from the "graphics whore" side of the DX10 world that it's OK to sacrifice performance if things look just a tiny bit better. Sacrificing performance for eye candy is only a good idea up to the point where the hit becomes noticeable to the user... and then you need to back off a bit and leave it there.

This is of course one of the problems a lot of folks have with games like Crysis, and legitimately so. While most are woefully uninformed about how good a PC needs to be to run the game, they are correct in saying that if their PC can't run it at a somewhat reasonable frame rate, then it is nothing more than a very pretty slide show, worth little more than a few "oooh"s and perhaps one or two "aaah"s.



To Each Man, Responsibility

"Realistically speaking, right now even DX10 hasn't shown itself to be worthy of the extra fuss, and in general I think DX9 is still the consumer-preferred standard."

The big features of DX10 aren't that relevant to games, but they are hugely relevant elsewhere and worth the extra fuss.

Virtual GPU memory and GPU multitasking are the big features of DX10. That is why DX10 is Vista-only: the XP kernel can't support these two features.

These two features combine to let multiple windows make use of hardware-accelerated graphics simultaneously. This is a huge deal, and it's absolutely crucial for Windows to keep up with OS X and Linux as a modern OS.

It just doesn't make grass or snow look much better in Crysis and the like, so the gaming media glosses over it, but trust me, DX10 is a HUGE advance over DX9.



Yeah. With the DX7, DX8, and especially DX9 introductions, each new version made games look better for a reasonable performance cost. It was always a big improvement... but with DX10, I see slightly better graphics at a huge performance cost, and it's not really worth it to me.

I dunno, I'm not really a technical person.




vux984 said:
"Realistically speaking, right now even DX10 hasn't shown itself to be worthy of the extra fuss, and in general I think DX9 is still the consumer-preferred standard."

The big features of DX10 aren't that relevant to games, but they are hugely relevant elsewhere and worth the extra fuss.

Virtual GPU memory and GPU multitasking are the big features of DX10. That is why DX10 is Vista-only: the XP kernel can't support these two features.

These two features combine to let multiple windows make use of hardware-accelerated graphics simultaneously. This is a huge deal, and it's absolutely crucial for Windows to keep up with OS X and Linux as a modern OS.

It just doesn't make grass or snow look much better in Crysis and the like, so the gaming media glosses over it, but trust me, DX10 is a HUGE advance over DX9.

That's all well and good, but the point of buying a high-end graphics card is to enjoy games. If DX10 doesn't do much to make games look better or run faster, then DX10 (or 10.1) compliance is not a big selling point for a graphics card, which is what we're talking about. Even a "DX9 card" will run fine in Vista with DX10 installed; it just won't support Shader Model 4 and a handful of other things. Shader Model 3 already nailed down most of the important bits anyway, like looping and branching.