Personally I prefer Nvidia, but nonetheless, great PC. Are you going to overclock it?
Bet With routsounmanman: By the end of Q1 2008 Capcom WONT have announced a RE5 Wii Edition OR a new RE (classic gameplay) for the Wii (WON)
Your monitor's nicer than mine. I went for the Intel Quad, I regret it though; I should have bought a Duo Extreme.
I would cite regulation, but I know you will simply ignore it.
thanny said: Personally I prefer Nvidia, but nonetheless, great PC. Are you going to overclock it? |
This is one area where I don't care. My last 5 upgrades have gone back and forth between ATI and Nvidia (I had a 7800GTX).
They both make great cards. I just end up buying whoever gives the most bang for the buck, and right now, it's ATI.
Oh, and no overclocking for me. I should be good with this. If I need more GPU, I will buy another one and run Crossfire.
The CPU socket should last a while, so if I need more CPU, I will buy a new one. I have been burned a few times overclocking. That, and I never saw much difference in games; it was just great for benchmarks.
steven787 said: Your monitor's nicer than mine. I went for the Intel Quad, I regret it though; I should have bought a Duo Extreme. |
The Intel quad is nice; I just went AMD because it was $235 and came with a mobo and heatsink/fan.
TheRealMafoo said:
I am running XP. I lose some RAM (it only sees 3.5 GB), but I doubt I will miss it anytime soon. DX10 is a joke. Crysis on Very High in DX9 looks the same. I have seen very few areas where it adds visual quality (the smoke in BioShock is one), but for the most part, I don't mind losing those features if it means I don't have to run that POS OS. I wish MS would make DX10 for XP (I know it will never happen). |
Even I'm going to say WHAT?
http://webpages.charter.net/bliss/
http://hothardware.com/News/DX9_vs_DX10_with_Lost_Planet/
Truly DX10 and DX10.1 are massive improvements.
.Geenie. said: Brilliant. May I recommend The Witcher |
I played that game; for some reason I didn't really like it, maybe because I've never liked that type of game, but he might like it.
TheRealMafoo said:
The intel quad is nice, I just went AMD because it was $235, and came with a Mobo and Heatsink/fan. |
The Phenom is a lot of bang for the buck. I wasted a lot of money on mine; I was just saying that for the amount I spent I could have gotten the DuoX, or spent four hundred more for the QuadX.
ssj12 said:
Even I'm going to say WHAT? http://webpages.charter.net/bliss/ http://hothardware.com/News/DX9_vs_DX10_with_Lost_Planet/ Truly DX10 and DX10.1 are massive improvements. |
Lost Planet lost a lot of FPS in DX10 on Vista. I would rather turn up other parts of the game (like resolution or AA) with DX9 and get a better overall look.
The difference in Crysis is High vs. Very High. There is a hack to get Crysis to run in Very High mode on DX9, and it looks the same as DX10. It makes you wonder how much MS paid them to turn it off in the first place.
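For anyone curious, the commonly circulated form of that hack edits the per-preset config files that ship with Crysis. The path, file name, cvar name, and values below are from memory and may differ between game versions, so treat this as a sketch rather than exact instructions:

```ini
; Illustrative fragment of a Crysis CVar-group file, e.g.
; Crysis\Game\Config\CVarGroups\sys_spec_Shading.cfg
; The [default] block holds the Very High (spec 4) values; the
; numbered blocks override them for lower presets.
[default]
q_ShaderGeneral=3

[3]
q_ShaderGeneral=1
```

The usual form of the tweak is to copy the [default] (Very High) values over the entries in the [3] (High) block, so selecting "High" in-game renders with Very High settings under DX9/XP. Back up the original files first.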
TheRealMafoo said:
Lost Planet lost a lot of FPS in DX10 on Vista. I would rather turn up other parts of the game (like resolution or AA) with DX9 and get a better overall look. The difference in Crysis is High vs. Very High. There is a hack to get Crysis to run in Very High mode on DX9, and it looks the same as DX10. It makes you wonder how much MS paid them to turn it off in the first place. |
You can get the texture quality from that hack, but you cannot get the improved lighting.
TheRealMafoo said:
Lost Planet lost a lot of FPS in DX10 on Vista. I would rather turn up other parts of the game (like resolution or AA) with DX9 and get a better overall look. The difference in Crysis is High vs. Very High. There is a hack to get Crysis to run in Very High mode on DX9, and it looks the same as DX10. It makes you wonder how much MS paid them to turn it off in the first place. |
It's probably a limitation of the cards; the GeForce 8800 GTX/Radeon 3870 etc. are really extended DirectX 9 cards. They don't run Shader Model 4/4.1 as efficiently as they do Shader Model 3.0. From something I read a while back, the differences between DX10 and DX9 are a lot smaller on more modern cards.
Also, the DirectX 10 codepath is less optimised because fewer people use it, and they save it for the really heavy effects.
Tease.