
Forums - General Discussion - INTEL - Most Profitable Quarter in HISTORY

TheRealMafoo said:
WilliamWatts said:
CatFangs806 said:
It's quite possibly all those netbooks with those cheap, single-core "Atom" processors in them. I can't believe they are still making single-core processors. Good for them, I guess: cheap to make, sold at a premium relative to what you get in a netbook, and they reap the rewards.

It can't be, because that line of processors is actually bringing their average margins DOWN, not up. I would suggest it's the higher-margin CULV, quad-core and Core 2 parts which are doing it.

My guess is that what's doing it is the slow pace of CPU tech advancement. Most of the cost of a CPU is R&D.

Let's say it takes me $100 million to design something, and it costs $20 to make. Let's assume I expect to sell that thing at 50,000 a year for 5 years, and then replace it with the next thing that costs me $100 million to design.

I project, then, that I will sell 250,000 of these things. Spread the $100 million across each unit ($400) and add the $20 manufacturing cost: it costs me $420 to make each one.

But if, after those 5 years, I can keep selling that thing for another year, the R&D is already paid off, so each extra unit costs me only 20 bucks, and I still get to sell it for something close to what I used to sell it for.
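That amortization arithmetic can be sketched in a few lines (same made-up numbers as above):

```python
# Illustrative sketch of the amortization arithmetic above
rnd = 100_000_000        # up-front design (R&D) cost
make = 20                # manufacturing cost per unit
units = 50_000 * 5       # planned volume: 50,000/year for 5 years

per_unit = rnd / units + make
print(per_unit)          # 420.0 -> R&D adds $400 on top of the $20 build cost

# After year 5 the R&D is fully recovered, so every additional
# unit costs only the $20 manufacturing cost.
print(make)              # 20
```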

This is the kind of thing that's happening. CPUs are no longer the slow part of a computer system. People need a lot more RAM or faster storage long before they run out of CPU.

I am writing this on a 4-year-old MacBook Pro, and if the CPU in this thing was 10x faster, I would not notice it.

I think the biggest bottleneck in any machine these days is the GPU, especially if it's integrated graphics. Mine works great, but I have to play Call of Duty 4 on low settings, and I have a quad core and 8 gigs of RAM. I shouldn't have to play games at those settings. Oh well, it's more optimized for HD video editing, which is one of the main reasons I bought the computer. Can't complain.




@WilliamWatts

Yes, the fab spinoff was a great idea, given the increased capital costs and the fact that AMD couldn't use all that capacity itself but would otherwise still be paying to have it.

AMD will not have any chance at being the performance leader until Bulldozer in 2011. But it doesn't matter! Over 95% of the market is looking for value, not absolute performance, and AMD is still competitive there. They just have to stay profitable. Their graphics and chipset side should help: 5xxx margins are much better than 4xxx, and Nvidia is no longer a threat in either market.

On the server side AMD will be completely competitive in 2 months. Magny-Cours will be 12 cores, priced against* Intel's quad- and hex-cores with similar performance and power consumption**.

*No, not against the 8 core. That will be much more expensive than current Xeons.
**There will be 8 and 12 core models with 55W typical power, and 6 core models with 35W typical power.



CatFangs806 said:
...

I think the biggest bottleneck in any machine these days is the GPU. Especially if it's integrated graphics. Mine works great, but I have to play Call Of Duty 4 in low settings. And I have a quad-core and 8 gigs of ram. I shouldn't have to play games in those settings. Oh well, it's more optimized for HD video editing, which is one of the main reasons I bought the computer. Can't complain.

You know a $50 graphics card would solve that, right? GPUs are really cheap.

AMD's upcoming 8xx integrated graphics will be DX11 and perform like a $50 card too.



Soleron said:
CatFangs806 said:
...

I think the biggest bottleneck in any machine these days is the GPU. Especially if it's integrated graphics. Mine works great, but I have to play Call Of Duty 4 in low settings. And I have a quad-core and 8 gigs of ram. I shouldn't have to play games in those settings. Oh well, it's more optimized for HD video editing, which is one of the main reasons I bought the computer. Can't complain.

You know a $50 graphics card would solve that? GPUs are really cheap.

AMD's upcoming 8xx integrated grahics will be DX11 and perform like a $50 card too.

My power supply is only 350 watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis on fairly high settings along with other new games?



CatFangs806 said:
Soleron said:
CatFangs806 said:
...

I think the biggest bottleneck in any machine these days is the GPU. Especially if it's integrated graphics. Mine works great, but I have to play Call Of Duty 4 in low settings. And I have a quad-core and 8 gigs of ram. I shouldn't have to play games in those settings. Oh well, it's more optimized for HD video editing, which is one of the main reasons I bought the computer. Can't complain.

You know a $50 graphics card would solve that? GPUs are really cheap.

AMD's upcoming 8xx integrated grahics will be DX11 and perform like a $50 card too.

My power supply is only 350Watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis fairly high along with other new games?

What's your CPU?

According to AnandTech, a 5770 will get 26fps at 1680x1050 Gamer Quality with a Core i7 920 at 3.33GHz (i.e. a higher-power CPU than the vast majority), and the whole system uses 236W under load, well below 350W.



Soleron said:
CatFangs806 said:
Soleron said:
CatFangs806 said:
...

I think the biggest bottleneck in any machine these days is the GPU. Especially if it's integrated graphics. Mine works great, but I have to play Call Of Duty 4 in low settings. And I have a quad-core and 8 gigs of ram. I shouldn't have to play games in those settings. Oh well, it's more optimized for HD video editing, which is one of the main reasons I bought the computer. Can't complain.

You know a $50 graphics card would solve that? GPUs are really cheap.

AMD's upcoming 8xx integrated grahics will be DX11 and perform like a $50 card too.

My power supply is only 350Watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis fairly high along with other new games?

What's your CPU?

According to Anandtech, a 5770 will get 26fps on 1680x1050 Gamer Quality with a Core i7 920 at 3.33GHz (i.e. higher power than the vast majority of CPUs) and uses 236W under load, well below 350W.

It's an AMD Athlon II X4 quad core at 2.6GHz.



CatFangs806 said:

My power supply is only 350Watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis fairly high along with other new games?

An HD 5670 1GB card has more performance than a 9800GT: 400 stream processors, reasonable memory bandwidth at 60GB/s, and it only draws 61W at load. I would say you can run that card, especially if your processor is a 65W-TDP dual or quad core. It costs $99 at the moment. You'd be able to run Crysis on a fairly high setting, if not High itself, on a 1280x1024 (17") monitor.

As you can see, it's performing admirably at that setting.

http://www.guru3d.com/article/radeon-hd-5670-review-test-crossfire/15

That's with 2xMSAA; if you turn that off, you should get a good, playable 30FPS on the 'gamer' high setting.



WilliamWatts said:
CatFangs806 said:

My power supply is only 350Watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis fairly high along with other new games?

An HD 5670 1GB card has more performance than a 9800GT with 400 Stream Processors, reasonable memory bandwidth at 60GB/S and only draws 61W at load. I would say you can run that card, especially if your processor is a 65W TDP dual or quad core. It costs $99 at the moment. You'd be able to run Crysis on a fairly high setting if not high itself on a 1280/1024 (17") monitor.

As you can see its performing admirably at that setting.

http://www.guru3d.com/article/radeon-hd-5670-review-test-crossfire/15

Thats with 2xMSAA if you turn that off you should yield good 30FPS playable on the 'gamer' high setting.

My quad core uses about 45W at load, although I've never actually maxed it out. The only thing I need to know now is how much of my power supply is being used with everything that's already in it. If the quad core uses 45W out of 350W, then the rest of the parts should be all right. I just don't want to fry my computer by overloading my PSU. Is there a way to tell how many watts of my power supply are being used?



Yeah, WW, I was going to post the 5670, but I chose the 5770. I was also going to say 'if you have a 65W dual core...', but then I looked at Anand's power numbers: he gets more than 100W below the limit with an OC'd i7 920, so any CPU will be fine.

Get a 5670 if you intend to run 1280x1024 on high, or a 5770 for 1680x1050 on medium-high. With your CPU (95W TDP, lower in practice) it should be completely fine on 350W. The 5xxx cards do have great idle and load power consumption.

Since Crysis is by far the most demanding game, a 5670 will run any other game on medium settings at 1680x1050, and a 5770 any game on High at 1920x1200.
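A rough way to answer the PSU question is just to add up load estimates for each component and compare against the rating. The wattages below are illustrative guesses (the CPU and GPU figures are vendor TDPs mentioned in this thread; the rest are rough ballparks), not measurements:

```python
# Rough PSU headroom check; all wattages are estimates, not measurements
components = {
    "CPU (Athlon II X4, TDP)": 95,   # ~45W typical load per the thread
    "GPU (HD 5770, TDP)": 108,
    "Motherboard + RAM": 50,
    "Hard drive + optical": 25,
    "Fans + misc": 15,
}

total = sum(components.values())
psu_rating = 350
print(total)               # 293 -> estimated worst-case draw
print(psu_rating - total)  # 57  -> remaining headroom
```

Since the estimate uses worst-case TDPs rather than typical draw, real headroom will usually be larger than this sketch suggests.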



CatFangs806 said:
WilliamWatts said:
CatFangs806 said:

My power supply is only 350Watts, though. I don't want to have to replace that. Is there a good graphics card that can run on that and at least run Crysis fairly high along with other new games?

My quad core uses about 45W at load, although I've never actually maxed it out. The only thing I need to know now is how much of my power supply is being used with everything that's already in it. If the quad core uses 45W out of 350W, then the rest of the parts should be all right. I just don't want to fry my computer overloading my PSU. Is there a way to tell how many watts of my power supply is being used?

You should be fine. Where did you measure 45W?