
Forums - PC Discussion - GeForce GTX officially supports FREESYNC

Lafiel said:
Azzanation said:
So my GTX1080 will support Free sync?

yes, all Pascal and Turing cards afaik will with the new driver that will be released on Jan 15th

Correct. 





Good, good. Also sounds like a half-decent time to upgrade my graphics card (GTX 770 2GB), although I'm not liking the price too much. Still, not that bad.



Zkuq said:
Good, good. Also sounds like a half-decent time to upgrade my graphics card (GTX 770 2GB), although I'm not liking the price too much. Still, not that bad.

I plan to upgrade this year too (GTX 960 2GB), but I have decided to go team red this time; it's been a while.



Visit my eBay stampers store: eims-stampers

Deus Ex (2000) - a game that pushes the boundaries of what the video game medium is capable of to a degree unmatched to this very day.

Chazore said:
I'm still waiting for Freesync to completely surpass G-sync in every conceivable way, and then some, but until then I'll be sticking with my 1440p 165hz G-sync Acer Predator.

Where do you feel that Freesync falls short of G-sync?

Lafiel said:

yes, all Pascal and Turing cards afaik will with the new driver that will be released on Jan 15th

It is interesting how Nvidia didn't lock this down to just Turing.
Makes me wonder why they couldn't also roll this out to older cards like Maxwell, Kepler, Fermi, etc.




--::{PC Gaming Master Race}::--

Pemalite said:

Where do you feel that Freesync falls short of G-sync?

There's definitely lower quality control with FreeSync. Some FreeSync displays have a horrendously narrow variable refresh rate range, and some of the early implementations suffered from ghosting artifacts ... 
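Why a narrow variable refresh rate range is such a problem, sketched as a quick check. This is a conceptual illustration, not any real driver API; low framerate compensation (LFC) needs the panel's max refresh to be at least roughly double its minimum, so a too-slow frame can be shown twice. The example ranges below are typical of published FreeSync specs and are used purely for illustration.

```python
def supports_lfc(vrr_min_hz, vrr_max_hz):
    """LFC requires max >= 2 * min so frames can be doubled
    when the game drops below the VRR window."""
    return vrr_max_hz >= 2 * vrr_min_hz

# A cheap early FreeSync panel with a 48-75 Hz window can't do LFC,
# while a 48-144 Hz panel can:
print(supports_lfc(48, 75))   # False
print(supports_lfc(48, 144))  # True
```

This is the practical difference between a budget FreeSync display and a good one: below 48 fps, the narrow-range panel has to fall back to v-sync or tearing.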

Pemalite said:

It is interesting how Nvidia didn't lock this down to just Turing.
Makes me wonder why they couldn't also roll this out to older cards like Maxwell, Kepler, Fermi, etc.

Their older video cards likely didn't support the VESA Adaptive-Sync standard in their display engine ... 

I don't imagine that it was an arbitrary decision on Nvidia's part to lock out the feature on certain video card generations ... 



Conina said:
deskpro2k3 said:

And RTX doesn't,

Both Pascal and Turing cards support it with the next driver update.

Ah, my mistake, thanks for that clarification. I'm still not getting an RTX though, as I just bagged myself a new 1070 Ti.

and it looks beautiful.



fatslob-:O said:
Pemalite said:

Where do you feel that Freesync falls short of G-sync?

There's definitely lower quality control with FreeSync. Some FreeSync displays have a horrendously narrow variable refresh rate range, and some of the early implementations suffered from ghosting artifacts ... 

Thanks, Chazore.

Ghosting is a big issue... I find that a large proportion of VA panels tend to ghost a lot more than, say... TN or IPS, which compounds the issue.
You will get ghosting on a G-Sync VA panel as well.

Still, things have improved since then; I would imagine older, shittier panels are no longer a thing on the market.

fatslob-:O said:

Their older video cards likely didn't support the VESA Adaptive-Sync standard in their display engine ... 

I don't imagine that it was an arbitrary decision on Nvidia's part to lock out the feature on certain video card generations ... 


Pascal isn't that fundamentally different from Maxwell though.



--::{PC Gaming Master Race}::--

https://www.techspot.com/article/1454-gsync-vs-freesync/

I think that this is a good read regarding features. 



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Asus PG27UQ gaming on 3840 x 2160 @120 Hz GSYNC HDR| HTC Vive Pro :3

Reached PC master race level.

Pemalite said:

Thanks, Chazore.

Ghosting is a big issue... I find that a large proportion of VA panels tend to ghost a lot more than, say... TN or IPS, which compounds the issue.
You will get ghosting on a G-Sync VA panel as well.

Still, things have improved since then; I would imagine older, shittier panels are no longer a thing on the market.

You have the wrong person ...

The other issue with FreeSync is that the feature does not work with overdrive on the vast majority of displays. It also doesn't help that AMD botched the launch of their own FreeSync 2 certification program by certifying a display that doesn't even support their advertised paramount feature, low framerate compensation ... 
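For anyone unfamiliar with what low framerate compensation actually does: when a game's frame time falls below the display's VRR window, the driver repeats each frame an integer number of times so the effective scanout interval lands back inside the window. Here's a rough conceptual sketch of that logic; the function name, parameters, and 48-144 Hz window are all illustrative, not taken from any real driver.

```python
def lfc_refresh_interval(frame_time_ms, vrr_min_hz=48, vrr_max_hz=144):
    """Return (scanout_interval_ms, repeats) for a given frame time.

    Conceptual model only: repeats each frame enough times that the
    panel never has to hold a refresh longer than 1000/vrr_min_hz ms.
    """
    min_interval = 1000.0 / vrr_max_hz  # shortest interval the panel accepts
    max_interval = 1000.0 / vrr_min_hz  # longest interval the panel accepts
    if frame_time_ms <= max_interval:
        # Frame rate is inside (or above) the VRR window: present once,
        # clamped to the panel's fastest supported interval.
        return max(frame_time_ms, min_interval), 1
    # Below the window: repeat the frame n times so each scanout
    # interval fits inside the panel's supported range.
    repeats = 2
    while frame_time_ms / repeats > max_interval:
        repeats += 1
    return frame_time_ms / repeats, repeats
```

For example, a 30 fps game (33.3 ms frames) on a 48-144 Hz panel would have each frame scanned out twice at ~16.7 ms intervals, keeping the panel inside its window with no tearing or stutter.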

Pemalite said:

Pascal isn't that fundamentally different from Maxwell though.

Raven Ridge supported hardware-accelerated VP9 decoding but RX Vega didn't, even though the two share nearly identical GPU architectures (more so than Maxwell/Pascal), because they have different video engines. So it's not totally out of the realm of possibility as to why Maxwell was left out ...  

Plus things were totally different back then, since it was far from certain which competing standard was going to emerge on top. If you were Nvidia, rather than contemplating adding a competing standard to your own hardware, you'd lobby as hard as possible to push your own technology to be ubiquitous ... (hindsight is always 20/20)

Maxwell was architected well before the tides started turning against G-Sync. It started with the more diverse display selections, then came the Xbox One X, then the bigger blow of the HDMI Forum denying it, but the last straw was Intel adding hardware support. Things fell apart afterwards, with G-Sync turning downright sour; in the end, supporting the competing standard was an admission of defeat, because by that point they realized G-Sync was never going to take off ... 

Even before any HDMI 2.1 displays or VRR-capable Intel hardware released, Nvidia preemptively surrendered, as they just did, and sat there helplessly watching the generic adaptive refresh technology emerge victorious in decisive fashion. There's no doubt AMD played a big role in making FreeSync the ubiquitous standard, but just as much credit goes to other very important parties, like the HDMI Forum and Intel, for cementing it as the dominant industry standard ...  



AMD did the FreeSync 2 thing, and at that point the differences between it and G-Sync are so small that unless you place them side by side and carefully study things, you probably wouldn't even be able to tell which is which.

The fact that FreeSync doesn't make your monitor cost an additional $100-150 is reason enough for most people to get one of these instead.
Unless you're the person that wants the very best and doesn't care about costs at all..... FreeSync > G-Sync.

The fact that it's coming to TVs soon, and to future consoles.... FreeSync is the future.