Garcian Smith said:
So says... a "graphics market analyst"? Some no-name analyst who may or may not have received industry payola is not an authority on the graphics card market. The fact is, nothing in the current lineup matches the remaining ATI DX10 gaming cards (that would be the 4850 and 4770, since everything else has been phased out by this point) at their $100 price point. I mean, what are gamers on a budget supposed to buy instead? The 5670? I guess they could pay an extra $40+ for a 5750, but that's only a small performance upgrade over the 4850 for not a small amount of money, and that card isn't good enough to run DX11 games anyway. And besides, most intelligent PC gamers (i.e. those who don't try to "future-proof") upgrade their graphics cards about once every 18 months anyway. It saves money and gets you a better product in the long run.
It's an extra $30 for a card that doesn't require a PCI-E 6-pin power adapter, uses far less power at both idle and load, and performs slightly better, about 7% on average in current-generation games. On top of that, the deals were always going to be temporary; the remaining 48xx stock is discounted for a good reason. As those cards are moved off the shelves, the current mid-range cards have been dropping in price to follow suit. BTW, most gamers on a budget have lower-end screens, so they don't exactly need top-of-the-line graphics hardware to push those pixels.
I'd love to see how the HD 5750 isn't fast enough to run DX11 games, indulge me. There's a scale of implementations from the low end all the way up to the high end, so no, it hasn't got anything to do with which DX level is implemented. Even adventure games can be DX11 if the developer wants that, and there are DX9 games like Crysis that make current-generation cards cry.
Do you know what it's like to live on the far side of Uranus?