
Forums - PC - Intel's Sandy Bridge: Benchmarks and good graphics

NJ5 said:

Zlejedi, your standards are way too high; an HD 5750 is definitely not in the lower end of modern cards. The low end is cards costing $50 or even less.

It's harder for some enthusiasts to understand where the bottom floor is when it comes to VGA solutions.

Some have become so brainwashed into the VGA upgrade cycle that they have a hard time fathoming the use and value of any card under $100, $200, or $300, or wherever their arbitrary standard has been set.

I think the reality may be that an average PC gaming consumer cares less about the performance of their card and more about whether they can simply play a given game they just bought.

Besides, the bottom end still includes all those previous-generation VGA cards still in circulation (being sold new at retail), most of which are well below HD 5750 performance levels.



NJ5 said:

Zlejedi, your standards are way too high; an HD 5750 is definitely not in the lower end of modern cards. The low end is cards costing $50 or even less.


It's a €100 card, so it's the bare minimum a person who uses a PC for gaming should be aiming for ;) Going below that is counterproductive, as the savings shrink while performance drops sharply.

Generally the best cards are in the $100-200 range: they offer the best value and good performance, and even if you sell them after two years they often still retain at least 50% of their starting value.
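The resale argument can be put in back-of-the-envelope numbers. This is just a sketch using the 50%-after-two-years resale figure from the post above, which is the poster's assumption, not market data:

```python
# Effective annual cost of a graphics card if it resells for a fraction
# of its purchase price after a holding period (poster's assumption: 50%
# retained value after two years).
def net_cost_per_year(price, resale_fraction=0.5, years=2):
    """Purchase price minus resale value, spread over the holding period."""
    return price * (1 - resale_fraction) / years

for price in (100, 150, 200):
    print(f"${price} card -> ${net_cost_per_year(price):.0f}/year")
# $100 card -> $25/year
# $150 card -> $38/year
# $200 card -> $50/year
```

Under that assumption, even the $200 end of the range works out to about $50 a year of actual cost.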



PROUD MEMBER OF THE PSP RPG FAN CLUB

Zlejedi said:
Squilliam said:

You can't run 3D at more than 1280x720 due to how the HDMI 1.4 spec operates. It's not a bandwidth limitation.

The reason why they should have limited GPU power?

1. It means that instead of simply tacking on more functional units they would have been forced to make them a lot more efficient and this would translate to improved performance from bottom to top.

2. It means that PC game developers would be able to target a more consistent specification, and there wouldn't be as great a spread between the highest-end and the lowest-end GPU on the market.

That's for 3D. A normal 2D picture can go to 1080p without problems.

Ad 1. And they are working on power efficiency: the Radeon 5770 delivers performance similar to the 4870 at 100W instead of 180W, and the Radeon 5870 matches a pair of 4870s within a 186W TDP instead of 360W.

Ad 2. I'd rather they reduced this gap by increasing the performance of low-end cards instead of putting a leash on high-end development ;) And the next generation of CPUs is doing exactly that. By greatly increasing the power of integrated GPUs, they give developers a stable baseline to benchmark against, and it will also make tons of low-end cards pointless, which means GPU manufacturers will have to improve their low-end GPUs to the point where they deliver serious performance.
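The Ad 1 figures above imply a roughly 2x jump in performance per watt. A quick sketch of that arithmetic, using only the wattages and performance equivalences quoted in the post (these are the poster's figures, not measured benchmarks):

```python
# Performance-per-watt comparison from the quoted figures.
# Performance is normalized so one HD 4870 = 1.0 units.
def perf_per_watt(perf, watts):
    return perf / watts

# HD 5770: ~HD 4870 performance at 100W instead of 180W
gain_5770 = perf_per_watt(1.0, 100) / perf_per_watt(1.0, 180)
# HD 5870: ~two HD 4870s' performance at 186W instead of 360W
gain_5870 = perf_per_watt(2.0, 186) / perf_per_watt(2.0, 360)

print(f"5770 vs 4870:    {gain_5770:.2f}x perf/W")  # 1.80x
print(f"5870 vs 2x 4870: {gain_5870:.2f}x perf/W")  # 1.94x
```

So by the post's own numbers, the generational perf/W gain is about 1.8-1.9x, which is the kind of improvement Squilliam attributes largely to the process shrink rather than architecture.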

That power efficiency is the result of improvements in process technology: stuffing more units in when each unit requires less power thanks to better silicon fabrication. What I was talking about is performance per watt on any given process. GPUs would have evolved differently if they had had a limited power budget to work with, and it would have been much more practical to upgrade OEM machines to gaming spec.

At the moment there are laptops, basic OEM desktop units and enthusiast machines, each an order of magnitude apart in average performance. Smart developers have always targeted the lowest end possible. If anything, something like The Sims or StarCraft 2 is the ideal PC game because of its market inclusiveness. The high end has always been a fool's errand, but at least now everyone can see that. There are many good reasons why something like an HD 5870 is spinning its wheels trying to play any game at a typical PC desktop resolution; even at full HD you run out of settings to turn on for 99% of titles in existence.



Tease.

Squilliam said:
 

You can't run 3D at more than 1280x720 due to how the HDMI 1.4 spec operates. It's not a bandwidth limitation.

 

It's because the HDMI 1.4 chips on the market (which are HDMI 1.3 with mustard on top) are not fast enough to accept that bandwidth. They need to run at 297 MHz or higher. It would be too costly to mass-market the fast ones at this point, so 3D TVs are stuck using the standard 225 MHz HDMI chips... therefore the current 3D TVs are stuck at 720p 3D.
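Those 225 and 297 MHz figures can be sanity-checked with some pixel-clock arithmetic. A sketch, assuming the standard CEA-861 total timings and HDMI 1.4a frame packing (which doubles the vertical total of the equivalent 2D mode):

```python
# TMDS pixel clock required for a frame-packed 3D mode.
# h_total/v_total are the full CEA-861 totals (active + blanking) of the
# underlying 2D mode; frame packing doubles the vertical total.
def packed_pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * (2 * v_total) * refresh_hz / 1e6

modes = {
    "720p60 frame packing":  (1650, 750, 60),
    "1080p24 frame packing": (2750, 1125, 24),
    "1080p60 frame packing": (2200, 1125, 60),
}
for name, timing in modes.items():
    clk = packed_pixel_clock_mhz(*timing)
    verdict = "fits a 225 MHz chip" if clk <= 225 else "needs ~297 MHz"
    print(f"{name}: {clk:.1f} MHz ({verdict})")
# 720p60 frame packing: 148.5 MHz (fits a 225 MHz chip)
# 1080p24 frame packing: 148.5 MHz (fits a 225 MHz chip)
# 1080p60 frame packing: 297.0 MHz (needs ~297 MHz)
```

This also shows why 1080p/24-per-eye is the one full-HD mode that squeaks through on the 225 MHz chips: its packed clock is the same 148.5 MHz as 720p60.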

Hence the reason why disolitude didn't buy one yet and bought a 3D projector instead... :)

Apparently there are already firmware hacks which allow games to be played at the Blu-ray 1080p/24-per-eye setting. But it's not a pretty sight...



I wonder what clock rates Bulldozer will achieve. I keep reading it's supposed to be a high-frequency design with a longer pipeline (a la Pentium 4, but maybe not as exaggerated). Could we perhaps see near-5 GHz clock rates?



My Mario Kart Wii friend code: 2707-1866-0957

disolitude said:
Squilliam said:
 

You can't run 3D at more than 1280x720 due to how the HDMI 1.4 spec operates. It's not a bandwidth limitation.

 

It's because the HDMI 1.4 chips on the market (which are HDMI 1.3 with mustard on top) are not fast enough to accept that bandwidth. They need to run at 297 MHz or higher. It would be too costly to mass-market the fast ones at this point, so 3D TVs are stuck using the standard 225 MHz HDMI chips... therefore the current 3D TVs are stuck at 720p 3D.

Hence the reason why disolitude didn't buy one yet and bought a 3D projector instead... :)

Apparently there are already firmware hacks which allow games to be played at the Blu-ray 1080p/24-per-eye setting. But it's not a pretty sight...

That's the reason why most next-generation consoles won't target more than 720p 3D. By the time there are new TVs out that can accept higher resolutions, it will probably be 2-3 years down the road. It's almost cynical: they don't care about people wanting to play games, and those chips aren't that expensive. It's like they know you'll want to upgrade again if they limit the feature set.

Squilliam is very sad, though Squilliam hates all the current 3D implementations. Too much blank time on the current 3D sets' glasses makes for a bad experience.



Tease.

Squilliam said:
 

Squilliam is very sad, though Squilliam hates all the current 3D implementations. Too much blank time on the current 3D sets' glasses makes for a bad experience.


Squilliam may change his mind about 3D once disolitude posts pics of his new setup, and how much he spent on everything...



NJ5 said:

I wonder what clock rates Bulldozer will achieve. I keep reading it's supposed to be a high-frequency design with a longer pipeline (a la Pentium 4, but maybe not as exaggerated). Could we perhaps see near-5 GHz clock rates?

Likely we'll only see 5 GHz if we overclock. Overall I would say 4 GHz would perhaps be their highest clock, given that Phenom II tops out at 3.6 GHz. 4 GHz is a good marketing number; I wouldn't expect them to push higher than that, given their TDP limits and the fact that they might have timing problems getting their chips stable at that frequency.



Tease.

disolitude said:
Squilliam said:
 

Squilliam is very sad, though Squilliam hates all the current 3D implementations. Too much blank time on the current 3D sets' glasses makes for a bad experience.


Squilliam may change his mind about 3D once disolitude posts pics of his new setup, and how much he spent on everything...

Squilliam is getting a new TV in April next year, but Squilliam doesn't hold out any hope that the 3D offered will be adequate to Squilliam's needs without giving Squilliam a headache.

Squilliam is also getting a home theatre system, which he hopes will be controllable from his iTouch, because iOS rules for accessory compatibility. However, Squilliam doesn't know of a good amp to get.



Tease.

greenmedic88 said:

But the basic issue with getting decent performance from Sandy Bridge's integrated on-chip video doesn't change: it still requires consumers to buy a new computer in most instances, or a new CPU and motherboard in all others.

Correct. It's a step in the right direction though.

Naturally, nobody is going to buy a new mobo and CPU just for the integrated video solution, so the focus is on the typical consumer who buys pre-built PCs, like most normal people.

OEMs will adopt these platforms for pre-builts pretty fast, especially in laptops, because they no longer need to include a discrete card with all the expense, battery drain, cooling requirements, weight and support costs that adds. AMD's Q4 2010 netbook platform will also perform like a 5450 and kill Atom performance-wise, so the same goes there.

Better integrated video shouldn't do much more than drop the floor out from under the future entry-level VGA card market.

Yes. That will be the #1 effect.

This doesn't matter significantly to developers either, since the only integrated video that will run GPU-intensive games at the lower end would have to be in recently purchased PCs. They still have to account for the 99% of non-enthusiast PCs that predate Sandy Bridge when deciding where to draw the line on what hardware can play their games at acceptable performance levels.

Agreed.

This won't force GPU manufacturers ATI and Nvidia into producing better low-end GPU cards; they'd simply stop selling them and position their next tier of cards as entry level, while marketing the performance advantages of discrete video cards.

AMD will have no issue with this as all their netbook, laptop and mainstream desktop CPUs will include powerful graphics from next year. They lose no sales.

I don't believe Nvidia will survive. Llano/SB kills everything they have under $100, they aren't profiting on anything $200 and up (R&D screwups mean they have no product until late 2011), and the intermediate market won't be big enough to pay the R&D costs. I believe they will be replaced in the market by Intel GPUs within 3 years.

But the advantages of decent integrated video performance don't change either. Consumers (regular ones, not enthusiasts) should have to play less of the "can this computer play these titles" guessing game, without having to shop for a discrete VGA card when purchasing a new computer based on Sandy Bridge CPUs.

And with a guaranteed minimum GPU performance on all PCs (eventually), consumer apps (photo editing, video editing and playback, office, web applications) can start using the GPU for non-graphics acceleration and get a significant speedup.