
But the basic issue with getting decent performance from Sandy Bridge's integrated on-chip video doesn't change: it still requires consumers to buy a new computer in most instances, or to swap CPUs and motherboards in all others.

Naturally, nobody is going to buy a new motherboard and CPU just for the integrated video, so the focus is on the typical consumer who buys pre-built PCs, as most people do.

Better integrated video shouldn't do much more than drop the floor out from under the future entry-level VGA card market.

This doesn't matter much to developers either, since the only integrated video capable of running GPU-intensive games even at the low end will be in recently purchased PCs. Developers still have to account for the 99% of non-enthusiast PCs that predate Sandy Bridge when deciding where to draw the line on what hardware can play their games at acceptable performance levels.

This won't force GPU manufacturers ATI and Nvidia to produce better low-end cards; they'd simply stop selling them, position their next tier of cards as entry level, and market the performance advantages of discrete video cards.

But the advantages of decent integrated video performance don't change either. Regular consumers (not enthusiasts) should have to play less of the "can this computer run these titles?" guessing game, without resorting to shopping for a discrete VGA card when buying a new computer based on Sandy Bridge CPUs.