
Intel's 2nd Generation Core processors: 29 new CPUs and enhanced graphics

HappySqurriel said:
NJ5 said:

There is one rather stupid thing and one rather scary/disturbing thing about these new CPUs (correct me if I'm wrong):

- stupid thing: the highest-end, overclockable Sandy Bridge CPUs have substantially better integrated graphics hardware, which occupies a large portion of the chip, as the picture in the OP shows. But the people who want to buy these highest-end chips are the ones who will rarely use the integrated graphics; obviously they will buy a good standalone graphics card instead. This looks like a waste of transistors and die area in these models.

- scary/disturbing thing: these CPUs can be remotely switched off by Intel:

http://www.techspot.com/news/41643-intels-sandy-bridge-processors-have-a-remote-kill-switch.html

The second one by itself is enough to make me very wary of buying one.

While I agree these integrated chips are problematic at the moment, if ATI/Nvidia can work out solutions similar to Hybrid CrossFireX to take advantage of these GPUs (or developers take advantage of these chips), it might not be such a waste.


Doesn't Nvidia have that Hybrid SLI thing where, when you're not running anything graphically intense, notebooks switch to the integrated solution? Couldn't they do the same thing here to save power and cut heat in a desktop?



ssj12 said:
Doesn't Nvidia have that Hybrid SLI thing where, when you're not running anything graphically intense, notebooks switch to the integrated solution? Couldn't they do the same thing here to save power and cut heat in a desktop?

You could do that, although I'm not certain it would have as much value as similar solutions have on notebooks ...
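For what it's worth, NVIDIA's switchable-graphics scheme (later branded Optimus) already gives a Windows program a documented way to ask for the discrete GPU instead of the integrated one. A minimal C++ sketch of that hint; the exported symbol is NVIDIA's documented mechanism, everything else here is just a stub:

// Minimal sketch, assuming a Windows build running on an NVIDIA
// switchable-graphics driver. Exporting NvOptimusEnablement = 1 is NVIDIA's
// documented hint that this process should run on the discrete GPU; without
// it, the driver may keep the app on the power-saving integrated GPU.
#include <windows.h>

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

int main() {
    // Rendering work would go here; the driver reads the export at process
    // start, so no API call is needed.
    return 0;
}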



HappySqurriel said:
You could do that, although I'm not certain it would have as much value as similar solutions have on notebooks ...


Considering the number of environmentally minded people realizing that the less electricity you use, the lower the bill is at the end of the month, it could be a decent selling point. At least it could be a selling point at Walmart, Best Buy, etc. Even I'd consider it a worthwhile feature if I were to build a new desktop or upgrade my current one in 4 years.
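As a rough check on the electricity argument, here is a back-of-the-envelope sketch; the 30 W idle-power gap between a discrete card and the integrated GPU, the 8 hours of use a day, and the $0.12/kWh price are all assumed round numbers, not measurements:

// Back-of-the-envelope sketch with assumed numbers (see above).
#include <cstdio>

int main() {
    const double extra_watts   = 30.0;  // assumed idle-power gap, dGPU vs IGP
    const double hours_per_day = 8.0;   // assumed daily usage
    const double usd_per_kwh   = 0.12;  // assumed electricity price

    double kwh_per_year = extra_watts * hours_per_day * 365.0 / 1000.0;
    std::printf("%.0f kWh/year, about $%.2f/year\n",
                kwh_per_year, kwh_per_year * usd_per_kwh);
    return 0;
}

Under those assumptions it works out to roughly $10 a year: real, but modest, which fits the reply below.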



ssj12 said:
Considering the number of environmentally minded people realizing that the less electricity you use, the lower the bill is at the end of the month, it could be a decent selling point. At least it could be a selling point at Walmart, Best Buy, etc. Even I'd consider it a worthwhile feature if I were to build a new desktop or upgrade my current one in 4 years.


I don't disagree; I just think that much longer battery life is a stronger selling point for a laptop than lower power use is for a desktop.



Intel has been promising a good graphics chipset for years now, and guess what, they still look like shite for games.




What sucks is I'm planning on getting a Core i7 processor today to finish off the new PC I'm building. Oh well, that's the name of the game: spend 1000 dollars one day, it's worth 500 dollars the next.




Cool, bring on the competition!



IllegalPaladin said:

Cool, bring on the competition!


April-May.

Actually AMD will certainly have the lead, because they will have 8-core processors while Intel will only have 4 SB cores until Q4.



Soleron said:
April-May.

Actually AMD will certainly have the lead, because they will have 8-core processors while Intel will only have 4 SB cores until Q4.


Hyper-threading and slightly higher performance per clock will make it a close fight, I think.




zarx said:
Hyper-threading and slightly higher performance per clock will make it a close fight, I think.

I don't think so. A similarly clocked SB leads four Phenom II cores by about 50%, so even eight of today's Phenom II cores would win. And BD will have higher IPC according to AMD (since it's their first all-new architecture since 2003, I would hope that is the case).

Bulldozer will clock very highly anyway; it is designed as a Pentium 4-style speed demon.
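To make the core-count arithmetic above explicit, a tiny sketch with assumed round numbers: 1.0 units of per-clock throughput for a Phenom II core, 1.5 for a Sandy Bridge core (the roughly 50% lead cited above), and a perfectly threaded workload:

// Sketch of the comparison above; both per-core figures are assumptions.
#include <cstdio>

int main() {
    const double phenom_core = 1.0;  // baseline per-core throughput (assumed)
    const double sb_core     = 1.5;  // ~50% per-core lead (assumed)

    double eight_phenom = 8 * phenom_core;  // 8.0 units
    double four_sb      = 4 * sb_core;      // 6.0 units

    std::printf("8x Phenom II: %.1f vs 4x SB: %.1f (%.0f%% lead)\n",
                eight_phenom, four_sb,
                (eight_phenom / four_sb - 1.0) * 100.0);
    return 0;
}

Of course, that lead only materializes in software that actually scales to eight threads.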