
NVIDIA G-Sync Review



I took some bits and pieces from the review; the full review from Anandtech is at the link below.

http://www.anandtech.com/show/7582/nvidia-gsync-review

For those who don't know what it is:
G-Sync, a hardware solution for displays that enables a semi-variable refresh rate driven by a supported NVIDIA graphics card.
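(Not from the review, just an aside: below is a rough sketch in Python of the basic idea. The timing model and the render times are made up for illustration - it is not how the G-Sync module actually works internally, only the difference between a finished frame waiting for the next fixed 60Hz refresh tick and a finished frame being scanned out as soon as it is ready.)

```python
import math

# Simplified model (assumed numbers, ignores buffering/back-pressure):
# when does each frame appear on screen under fixed 60 Hz v-sync vs. a
# G-Sync-style variable refresh?

REFRESH_INTERVAL_MS = 1000.0 / 60.0   # fixed 60 Hz refresh period

def vsync_display_times(render_times_ms):
    """With v-sync, a finished frame waits for the next fixed refresh tick."""
    shown, done = [], 0.0
    for rt in render_times_ms:
        done += rt                                     # GPU finishes the frame
        ticks = math.ceil(done / REFRESH_INTERVAL_MS)  # next refresh it can ride
        shown.append(ticks * REFRESH_INTERVAL_MS)
    return shown

def variable_refresh_display_times(render_times_ms):
    """With variable refresh, the panel scans out as soon as the frame is done."""
    shown, done = [], 0.0
    for rt in render_times_ms:
        done += rt
        shown.append(done)
    return shown

renders = [16.0, 18.0, 22.0, 17.0, 25.0, 19.0]         # ms per frame, ~40-60 fps
for label, times in (("v-sync", vsync_display_times(renders)),
                     ("g-sync", variable_refresh_display_times(renders))):
    gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
    print(label, "frame-to-frame gaps (ms):", gaps)
```

With v-sync the frame-to-frame gaps flip between ~16.7ms and ~33.3ms whenever a frame misses the refresh deadline - that's the stutter the review keeps coming back to - while the variable-refresh gaps simply follow the render times.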

AC4
I enabled G-Sync, once again leaving the refresh rate at 60Hz and dove back into the game. I was shocked; virtually all stuttering vanished. I had to keep FRAPS running to remind me of areas where I should be seeing stuttering. The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 - 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before. I actually realized that I was playing Assassin’s Creed IV with an Xbox 360 controller literally two feet away from my PS4 and having a substantially better experience.

Batman AO
The improvement to Batman was insane. I kept expecting it to somehow not work, but G-Sync really did smooth out the vast majority of stuttering I encountered in the game - all without touching a single quality setting. You can still see some hiccups, but they are the result of other things (CPU limitations, streaming textures, etc…). That brings up another point about G-Sync: once you remove GPU/display synchronization as a source of stutter, all other visual artifacts become even more obvious. Things like aliasing and texture crawl/shimmer become even more distracting. The good news is you can address those things, often with a faster GPU, which all of a sudden makes the G-Sync play an even smarter one on NVIDIA’s part. Playing with G-Sync enabled raises my expectations for literally all other parts of the visual experience.

Sleeping Dogs

The first thing I noticed after enabling G-Sync is my instantaneous frame rate (according to FRAPS) dropped from 27-28 fps down to 25-26 fps. This is that G-Sync polling overhead I mentioned earlier. Now not only did the frame rate drop, but the display had to start repeating frames, which resulted in a substantially worse experience. The only solution here was to decrease quality settings to get frame rates back up again. I was glad I ran into this situation as it shows that while G-Sync may be a great solution to improve playability, you still need a fast enough GPU to drive the whole thing.
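(Same disclaimer as above - assumed numbers, not measured behavior. The point of the sketch is that once a frame takes longer than the panel's longest allowed refresh interval, the module has to re-scan the previous frame, and the new frame can end up waiting behind that repeat, which is roughly the degradation described in the Sleeping Dogs paragraph.)

```python
# Simplified model of the low-frame-rate case: if no new frame arrives within
# the panel's maximum refresh interval, the previous frame is scanned out
# again, and a frame that finishes during that repeat has to wait it out.
# All constants are assumptions for illustration.

MAX_REFRESH_INTERVAL_MS = 33.3   # assumed ~30 Hz panel floor
SCANOUT_MS = 7.0                 # assumed time to scan one frame to the panel

def display_times_with_repeats(render_times_ms):
    shown, done, last_scan_start = [], 0.0, 0.0
    for rt in render_times_ms:
        done += rt
        # Panel hit its refresh floor before the frame arrived: it started
        # repeating the previous frame at each forced refresh.
        while done > last_scan_start + MAX_REFRESH_INTERVAL_MS:
            last_scan_start += MAX_REFRESH_INTERVAL_MS
        # New frame can't start scanning until the current repeat finishes.
        start = max(done, last_scan_start + SCANOUT_MS)
        shown.append(start)
        last_scan_start = start
    return shown

slow = [38.0, 36.0, 40.0, 35.0]   # ms per frame, ~25-28 fps like the test above
times = display_times_with_repeats(slow)
print([round(b - a, 1) for a, b in zip(times, times[1:])])
# Frames that land just after a forced repeat get pushed back, so pacing is
# no longer tied purely to render time, on top of the small polling overhead.
```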

Final Words

After spending a few days with G-Sync, I’m just as convinced as I was in Montreal. The technology, albeit a relatively simple manipulation of display timing, is a key ingredient in delivering a substantially better gaming experience.

In pathological cases the impact can be shocking, particularly if you’re coming from a 60Hz panel today (with or without v-sync). The smoothness afforded by G-Sync is just awesome. I didn’t even realize how much of the v-sync related stutter I had simply come to accept. I’d frequently find a scene that stuttered a lot with v-sync enabled and approach it fully expecting G-Sync to somehow fail at smoothing things out this time. I always came away impressed. G-Sync also lowered my minimum frame rate requirement to not be distracted by stuttering. Dropping below 30 fps is still bothersome, but in all of the games I tested as long as I could keep frame rates north of 35 fps the overall experience was great. 

In many situations the impact of G-Sync can be subtle. If you’re not overly bothered by tearing or are ok with v-sync stuttering, there’s really nothing G-Sync can offer you. There’s also the fact that G-Sync optimizes for a situation that may or may not be so visible 100% of the time. Unlike moving to a higher resolution or increasing quality settings, G-Sync’s value is best realized in specific scenarios where there’s a lot of frame rate variability - particularly between 30 and 60 fps. Staying in that sweet spot is tougher to do on a 1080p panel, especially if you’ve already invested in a pretty fast video card.

If you’re already running games at a fairly constant 60 fps, what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it’s a lot harder to maintain 60 fps. Ideally I’d love to see a 2560 x 1440 G-Sync display with an IPS panel that maybe even ships properly calibrated from the factory. I suspect we’ll at least get the former.

==============================================================================

Well I'm interested..



 

Face the future.. Gamecenter ID: nikkom_nl (oh no he didn't!!) 


Mirrors Carmack, Sweeney and Andersson's impressions from the Eurogamer thingy.



 

Needs to be modified to work on AMD ASAP.



The thing is this will never be more than a niche tech unless they:

a) Incorporate the module into every Nvidia GPU so that every display works with it. (Thus giving Nvidia a great edge over AMD)

b) Propose a new standard form of signaling for monitors/input cables.

c) Release an "In Between" module that any card (Intel, AMD) can plug into that then plugs into a monitor.

Otherwise it is PhysX all over again. Except this has real potential...



Prediction for console Lifetime sales:

Wii:100-120 million, PS3:80-110 million, 360:70-100 million

[Prediction Made 11/5/2009]

3DS: 65m, PSV: 22m, Wii U: 18-22m, PS4: 80-120m, X1: 35-55m

I guarantee the PS5 comes out only 5-6 years after the launch of the PS4.

[Prediction Made 6/18/2014]

Eddie_Raja said:
The thing is this will never be more than a niche tech unless they:

a) Incorporate the module into every Nvidia GPU so that every display works with it. (Thus giving Nvidia a great edge over AMD)

b) Propose a new standard form of signaling for monitors/input cables.

c) Release an "In Between" module that any card (Intel, AMD) can plug into that then plugs into a monitor.

Otherwise it is PhysX all over again. Except this has real potential...


This is an insightful Q&A. It may answer some of your thoughts:

http://www.eurogamer.net/articles/digitalfoundry-carmack-sweeney-andersson-interview



 

Eddie_Raja said:
The thing is this will never be more than a niche tech unless they:
a) Incorporate the module into every Nvidia GPU so that every display works with it. (Thus giving Nvidia a great edge over AMD)

I think you have a misunderstanding here. The module goes into the TV/monitor. You also need a powerful GPU - don't even think below a 770/780...



drkohler said:
Eddie_Raja said:
The thing is this will never be more than a niche tech unless they:
a) Incorporate the module into every Nvidia GPU so that every display works with it. (Thus giving Nvidia a great edge over AMD)

I think you have a misunderstanding here. The module goes into the TV/monitor. You also need a powerful GPU - don't even think below a 770/780...


No, I absolutely understand the problem lol (it's the 650 Ti Boost and up, btw). Yes, the module is in the TV. That is the problem! You need 1) a new, crazy-expensive TV, and 2) a recent Nvidia GPU. Basically you need to be a well-off Nvidia fanboy. Until everyone can reasonably enjoy it, it will be almost a non-factor in purchasing decisions...


