Troll_Monster said:
raytracing is the best way to get realistic lighting and shadowing because it simulates light, but it's far too taxing on the GPU to run in real time at all; that's why it's left for pre-rendered video.

perhaps some time in the future, when GPUs are powerful enough and designed to handle raytracing in real time, it can be used for games, but that future is a long way off.

That's exactly what the article is talking about.

Nvidia buying a company that makes a major raytracing renderer brings us closer to that future, or at least hints that it's a direction Nvidia is interested in exploring.

Perhaps it is too far off to show up in next-gen consoles, but it is likely to happen eventually, and I am curious how the people on this forum think it will influence gaming graphics, if at all (current rendering techniques combined with next-gen power increases might shrink the gap in graphical quality enough that almost nobody can see the difference).

Also, thanks for the links, Stranne. I haven't read the second one yet (long thread), but the article points out why it might not be practical for games and why it might not be such an improvement.

However, as the conclusion says, mixing both techniques might be a good compromise, and it would make sense from Nvidia's point of view: they already have good cards for rasterisation, so adding raytracing hardware could give them an edge for those times when using it makes sense (well, until ATI finds a way to do the same with their cards, at least).
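To see why raytracing is so taxing, here's a rough sketch (not from the article, just an illustration) of the core operation: a ray-sphere intersection test. A raytracer has to run tests like this for every pixel's ray against the scene, which is where the cost explodes.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    # direction is assumed to be normalized; returns the nearest
    # positive hit distance t, or None if the ray misses.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Ray from the origin looking down -z at a unit sphere 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
```

Even with just one primary ray per pixel, a 1920x1080 frame at 60 fps needs on the order of 120 million of these tests per second, before you add shadow rays, reflections, or scenes with thousands of objects. That scaling problem is exactly why a hybrid with rasterisation looks attractive.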



"I do not suffer from insanity, I enjoy every minute of it"