
PC - New GeForce GTX 280! - View Post

sieanr said:
ssj12 said:
Soleron said:
The Radeon 4850 completely destroys the competition and also beats cards that are $100 (9800GTX) or even $200 (GTX 260) more expensive.

http://techreport.com/articles.x/14967

Your GTX 280 ($649) is beaten by 4850 Crossfire ($398).

 

And your point? Of course Crossfire on a new generation of cards will top a high-end card that's basically a mid-generation card. Let's have a game that actually takes advantage of CUDA, since nothing does at the moment, and then let's see what happens. From what I can guess, CUDA will be nVidia's trump card.


 

Do you actually know anything about computers?

 

CUDA has almost nothing to do with gaming. I'll repeat that so you can understand it: nothing. Its purpose is to run desktop programs on the GPU, which is great if you need to run an operation that plays to a GPU's strengths. AMD/ATI has its own system for doing the same thing, called Close To Metal. Neither of these will likely be used in games in the foreseeable future, if ever. There are plenty of reasons why, and they should be pretty damn obvious, but based on your apparent knowledge of computer tech it's not surprising that you think CUDA is some sort of magic bullet.

 

Havok on AMD GPUs is a different beast than CUDA altogether, although if CUDA does anything for gaming, it will be simplifying physics on the GPU. Anyway, back to AMD: basically they're making it easy to run Havok-accelerated physics on the GPU. The problem is that, at this point, communication between the CPU and GPU tends to be slow. So if the GPU runs any physics calculations that change the game environment - say a pillar falling to the ground - then this information needs to be sent back to the CPU, so the CPU knows that the AI has to walk around the pillar, or when the player runs into it. Since this would be something of a bottleneck, it's likely that all GPU physics in the near future will be purely eye candy, and "game changing" physics will stay on the CPU. Eventually physics will all be on the GPU, but not with this generation of cards.

 

Oh, and the fact that Nvidia is trumpeting CUDA tells you that even they know they have a bomb on their hands. They were caught complacent and are now stuck trying to push an outdated design on the public at an outrageous price. Unfortunately for Nvidia, the competition stepped up its game, designed a truly next-gen GPU, and got the process down to very good yields.

 

I can't wait to see 4870 Crossfire, then the 4870X2. Nvidia is going to have its ass handed to it, especially at the high end.

.... CUDA allows the card to be coded for directly, for different workloads like you said - physics, or to be more plain, PhysX.

Because CUDA kernels are written in a C/C++-style language, developers can code more directly against the card and interact with it at a lower level. Developers will be able to tap into every ounce of RAM and power in the card. This should be basic knowledge, because it's a similar platform to what the game consoles run on. Console GPUs are hand-coded for each game. This is why it takes developers years to tap into a console's power: they have to actually learn the hardware.

One practical application of CUDA, for example: Adobe is putting an nVidia-only accelerator into Photoshop CS4. If game developers coded their own accelerators for nVidia GPUs, guess what? Games like Crysis would be flying like a bird.

PC hardware hasn't had this problem before. You bitch about cost. Tell me, Mr. Brainiac, why does a 15% increase in performance on a SINGLE-CORE GPU cost a company a ton of money to put out? Simple. It's because they are hitting the difficulties presented by Moore's Law and the universal limit.

What do I mean by a universal limit in technology?

Let me use a direct quote:

"The physical limits to computation have been under active scrutiny over the past decade or two, as theoretical investigations of the possible impact of quantum mechanical processes on computing have begun to make contact with realizable experimental configurations. We demonstrate here that the observed acceleration of the Universe can produce a universal limit on the total amount of information that can be stored and processed in the future, putting an ultimate limit on future technology for any civilization, including a time-limit on Moore's Law. The limits we derive are stringent, and include the possibilities that the computing performed is either distributed or local. A careful consideration of the effect of horizons on information processing is necessary for this analysis, which suggests that the total amount of information that can be processed by any observer is significantly less than the Hawking-Bekenstein entropy associated with the existence of an event horizon in an accelerating universe."

Source

 

Simply put, we might be able to keep shrinking our transistors and jamming more in, but there is a limit. The more we push that limit, the more expensive the card. For every minor gain in power, the card itself will cost a ridiculous amount of money.

The GTX 280 costs $649 at standard clocks because it is at the limits of its size. 1.4 billion transistors give nVidia the most powerful single-core GPU on the planet. Does the fact that they are pushing the limits of our technology and of basic physical laws mean they have outdated tech? Hell no. If they put the GTX 280 in an SLI setup like the GX2, the card would be a beast. It would probably be enough to hold its own against the Radeon 4900X2.

Seriously, outdated tech? LOL. So a switch to more stream processors, which are still a mess of transistors and diodes, isn't outdated?

Do YOU know computers? If you did, you wouldn't be trying to act like a smart-ass. I obviously know the hardware better than you do.

 


