disolitude said:
Twistedpixel said:

I'll put it this way: they have 'ancient' 8 ROP units clocked at 500 MHz. They ARE fillrate limited, especially when considering any application of MSAA, which cuts their Z-rate quite considerably. Your GTX 295 has 8× the theoretical fillrate and probably 10-20× the fillrate in practice.
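The "8× theoretical" figure checks out if you run the numbers from the published specs (8 ROPs at 500 MHz for the PS3's RSX; 28 ROPs per GPU at 576 MHz, two GPUs, for the GTX 295) -- a rough back-of-envelope sketch, not a benchmark:

```python
# Rough theoretical pixel-fillrate comparison. The ROP counts and
# clocks below are published spec-sheet figures, not measurements.
def fillrate_gpixels(rops, clock_mhz):
    """Theoretical pixel fillrate in Gpixels/s: ROPs x core clock."""
    return rops * clock_mhz / 1000.0

rsx = fillrate_gpixels(rops=8, clock_mhz=500)          # PS3 RSX
gtx295 = 2 * fillrate_gpixels(rops=28, clock_mhz=576)  # two GPUs on one card

print(rsx)           # 4.0 Gpixels/s
print(gtx295)        # 32.256 Gpixels/s
print(gtx295 / rsx)  # ~8.1x, matching the "8x theoretical" claim
```

In-practice fillrate diverges from this because of memory bandwidth, compression, and MSAA behaviour, which is why the real-world gap can be wider than the theoretical one.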

Your issue is that you're running games optimised for 30-60 FPS and trying to get them up to 120 FPS with numerous bottlenecks. You're CPU limited and GPU limited at different points of the frame, and the higher you attempt to push your frame-rate, the more time your cards spend idle waiting for the CPU.

I would not have personally attempted 3D on anything less than DX11 hardware with the DX11 driver model. The biggest change DX11 gives us is the same multi-threaded, just-in-time rendering that the Xbox 360 has enjoyed up until now. Since you're most likely CPU limited in some fashion, splitting the load over multiple cores should give your PC the ability to feed the graphics card a lot more efficiently.

The Sony update is interesting, but I'm waiting for Fermi to come out to try 3D on the PC. I don't like the idea of paying so much money for a half-assed setup when I can get better 3D for cheaper on the PC, with a 24" 3D monitor and a graphics card that will cost less than upgrading the TV and give a better return in terms of experience. And if the implementation isn't half-assed on the PS3? Well, I already own the console and the glasses, and by the time I figure that out, the TVs that can display 3D will be both better and cheaper. Win/win.


Yeah, I figured as much about the PS3 video chip. And you're right about 3D not being optimized for 120 Hz on the PC. The funny thing is that I use a Samsung 67-inch DLP for 3D gaming, which uses checkerboard. I have a Viewsonic 3D monitor as well, but I like the DLP 3D much better (no ghosting).

So essentially I am running 1080p checkerboard, which is roughly 720p @ 60 Hz × 2. However, the 3D Vision drivers and the video card are pushing 1080p @ 60 Hz × 2, and only then do they interlace the signal into the checkerboard pattern, essentially wasting half of the performance...
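The "wasting half" point is just pixel arithmetic: a 1080p checkerboard frame only shows every other pixel per eye, so rendering each eye at full 1080p throws half the work away. A quick illustrative sketch (the resolutions are standard figures, not anything measured here):

```python
# Checkerboard 3D pixel budget, purely illustrative arithmetic.
full_1080p = 1920 * 1080            # pixels rendered per eye by the driver
checker_per_eye = full_1080p // 2   # pixels per eye actually displayed
p720 = 1280 * 720                   # 720p, for comparison with "720p x 2"

wasted_fraction = 1 - checker_per_eye / full_1080p

print(checker_per_eye)    # 1036800 -- close to 720p's 921600
print(wasted_fraction)    # 0.5 -- half the rendered pixels are discarded
```

This is also why the per-eye image ends up looking roughly like 720p: 1,036,800 displayed pixels per eye versus 921,600 for true 720p.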

But yeah, the Nvidia CES 2010 showing was quite impressive with the GF100 and multiple-display 3D... so that seems to be the way to go. I don't think you will look back on the console 3D once you get that going... not even for a second... lol

Unfortunately, display technology is advancing so rapidly that pretty much every year you'll get a news article claiming they've 'solved' the 3D problem with modern displays, and telling people who have been holding off on 3D that they ought to buy now. They'll do this every year until they actually 'solve' the problem.

Until modern games come out that render double but only require the game engine to update at half the display rate, I think you'll have problems with wasted performance. You're not really wasting 3D performance so much as CPU performance, and that's the area which simply does not scale nearly as well. You can easily double your 3D performance, but doubling your CPU performance is a multi-billion-dollar question.

The ideal setup I was considering migrating to in the mid term (1-2 years) is a dual-projector display system for both gaming and movies. Since each projector essentially blanks when not in use, you only have to display the relevant frame data, with no interlacing and at full resolution.



Do you know what it's like to live on the far side of Uranus?