
GeForce GTX officially supports FREESYNC

Zkuq said:
Good, good. Also sounds like a half-decent time to upgrade my graphics card (GTX 770 2GB), although I'm not liking the price too much. Still, not that bad.

I plan to upgrade this year too (GTX 960 2GB), but I have decided to go team red this time; it's been a while.



My Etsy store

My Ebay store

Deus Ex (2000) - a game that pushes the boundaries of what the video game medium is capable of to a degree unmatched to this very day.

Chazore said:
I'm still waiting for Freesync to completely surpass G-sync in every conceivable way, and then some, but until then I'll be sticking with my 1440p 165hz G-sync Acer Predator.

Where do you feel that Freesync falls short of G-sync?

Lafiel said:

Yes, all Pascal and Turing cards AFAIK will, with the new driver that will be released on Jan 15th.

It is interesting how nVidia didn't lock this down to just Turing.
Makes me wonder why they couldn't also roll this out to older cards like Maxwell, Kepler, Fermi, etc.




--::{PC Gaming Master Race}::--

Pemalite said:

Where do you feel that Freesync falls short of G-sync?

There's definitely lower quality control with FreeSync. Some of the FreeSync displays have horrendously small variable refresh rate ranges, and some of the early implementations suffered from ghosting artifacts ... 
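
To put rough numbers on why a narrow window matters, here's a minimal Python sketch (my own illustration, not anything from AMD's tooling) comparing how much of a typical 30-144 fps band two example ranges actually cover, using the commonly cited rule of thumb that low framerate compensation needs a max refresh of at least 2x the min; the specific ranges are assumptions for illustration:

```python
# Rough sketch: how much of a useful fps band a VRR window covers, and
# whether it is wide enough for low framerate compensation (LFC).
# "max >= 2x min" is the commonly cited LFC rule of thumb; the example
# display ranges below are illustrative, not measured.

def vrr_coverage(vrr_min: int, vrr_max: int,
                 fps_low: int = 30, fps_high: int = 144) -> float:
    """Fraction of the fps_low..fps_high band where VRR is active."""
    overlap = max(0, min(vrr_max, fps_high) - max(vrr_min, fps_low))
    return overlap / (fps_high - fps_low)

def supports_lfc(vrr_min: int, vrr_max: int) -> bool:
    """LFC needs headroom to repeat frames: max refresh >= 2x min."""
    return vrr_max >= 2 * vrr_min

for name, lo, hi in [("narrow early FreeSync panel", 48, 75),
                     ("wide 30-144 Hz panel", 30, 144)]:
    print(f"{name}: {vrr_coverage(lo, hi):.0%} of the band, "
          f"LFC possible: {supports_lfc(lo, hi)}")
```

A 48-75 Hz window covers barely a quarter of that band and is too narrow for frame doubling, which is exactly the kind of display that gave FreeSync its reputation here.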

Pemalite said:

It is interesting how nVidia didn't lock this down to just Turing.
Makes me wonder why they couldn't also roll this out to older cards like Maxwell, Kepler, Fermi, etc.

Their older video cards likely didn't support the VESA adaptive refresh standard in their display engine ... 

I don't imagine that it was an arbitrary decision on Nvidia's part to lock out the feature on certain video card generations ... 
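
For what it's worth, the range itself is advertised by the monitor in its EDID; whether the GPU's display engine can actually vary vblank timing against it is the hardware question. Here's a hedged sketch of reading the advertised min/max vertical refresh from the standard Display Range Limits descriptor (offsets per the EDID 1.3/1.4 base block; the EDID 1.4 "+255" offset flags are ignored for brevity, and the /sys path is a Linux-specific assumption):

```python
# Hedged sketch: pull the advertised min/max vertical refresh out of a raw
# 128-byte EDID base block. The four 18-byte descriptor slots start at byte
# 54; a display descriptor begins with three zero bytes, and tag 0xFD marks
# the Display Range Limits descriptor.

def vrr_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    for off in (54, 72, 90, 108):        # the four descriptor slots
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]            # min/max vertical field rate, Hz
    return None

# Example on Linux (path varies per connector):
with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
    print("advertised refresh range:", vrr_range_from_edid(f.read(128)))
```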



Conina said:
deskpro2k3 said:

And RTX doesn't.

Both Pascal and Turing cards support it with the next driver update.

Ah, my mistake, thanks for that clarification. I'm still not getting an RTX though, as I just bagged myself a new 1070 Ti.

And it looks beautiful.



CPU: Ryzen 7950X
GPU: MSI 4090 SUPRIM X 24G
Motherboard: MSI MEG X670E GODLIKE
RAM: CORSAIR DOMINATOR PLATINUM 32GB DDR5
SSD: Kingston FURY Renegade 4TB
Gaming Console: PLAYSTATION 5
fatslob-:O said:
Pemalite said:

Where do you feel that Freesync falls short of G-sync?

There's definitely lower quality control with FreeSync. Some of the FreeSync displays have horrendously small variable refresh rate ranges, and some of the early implementations suffered from ghosting artifacts ... 

Thanks, Chazore.

Ghosting is a big issue... I find that a large proportion of VA panels tend to ghost a lot more than, say... TN or IPS, which compounds the issue.
You will get ghosting on a G-Sync VA panel as well.

Still, things have improved since then; I would imagine older, shittier panels are no longer a thing on the market.

fatslob-:O said:

Their older video cards likely didn't support the VESA adaptive refresh standard in their display engine ... 

I don't imagine that it was an arbitrary decision on Nvidia's part to lock out the feature on certain video card generations ... 


Pascal isn't that fundamentally different from Maxwell though.



--::{PC Gaming Master Race}::--


https://www.techspot.com/article/1454-gsync-vs-freesync/

I think that this is a good read regarding features. 



Intel Core i7 8700K | 32 GB DDR 4 PC 3200 | ROG STRIX Z370-F Gaming | RTX 3090 FE| Crappy Monitor| HTC Vive Pro :3

Pemalite said:

Thanks, Chazore.

Ghosting is a big issue... I find that a large proportion of VA panels tend to ghost a lot more than, say... TN or IPS, which compounds the issue.
You will get ghosting on a G-Sync VA panel as well.

Still, things have improved since then; I would imagine older, shittier panels are no longer a thing on the market.

You have the wrong person ...

The other issue with FreeSync is that the feature does not work with overdrive on the vast majority of displays. It also doesn't help that AMD botched the launch of their own FreeSync 2 certification program by certifying a display that doesn't even support their advertised paramount feature, low framerate compensation ... 
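
For reference, the low framerate compensation being advertised is conceptually simple: when the game drops below the panel's VRR floor, the driver repeats each frame enough times that the scan-out rate lands back inside the supported window. A simplified sketch of that arithmetic (real drivers predict frametimes rather than react; the 48-144 Hz range is an assumed example):

```python
import math

# Simplified LFC arithmetic: repeat frames to keep scan-out inside the
# panel's VRR window when the game runs below its floor.
def lfc_scanout_hz(game_fps: float, vrr_min: float, vrr_max: float) -> float:
    if game_fps >= vrr_min:
        return game_fps                      # already in range: present 1:1
    repeats = math.ceil(vrr_min / game_fps)  # frames shown per rendered frame
    assert repeats * game_fps <= vrr_max, "window too narrow for LFC"
    return repeats * game_fps

for fps in (25, 40, 90):
    print(f"{fps} fps -> {lfc_scanout_hz(fps, 48, 144):g} Hz scan-out")
```

A display certified without LFC simply tears or stutters below its floor instead, which is why its absence undercut the certification.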

Pemalite said:

Pascal isn't that fundamentally different from Maxwell though.

Raven Ridge supported hardware-accelerated VP9 decoding but RX Vega didn't, even though the two share nearly identical GPU architectures (more so than Maxwell/Pascal), because they have different video engines. So it's not totally out of the realm of possibility that Maxwell was left out for a similar reason ...

Plus, things were totally different back then, since it was far from certain which competing standard was going to emerge on top. If you were Nvidia, you weren't contemplating adding a competing standard to your own hardware; you'd try to lobby as hard as possible to push your own technology to be ubiquitous ... (hindsight is always 20/20)

Maxwell was architected well before the tides started turning against G-Sync. It first started with the more diverse display selection, then came the X1X, then the bigger blow of the HDMI Forum denying it, and the last straw was Intel adding hardware support. Things fell apart afterwards, with sentiment on G-Sync turning downright sour, so in the end supporting the competing standard was an admission of defeat; they realized that G-Sync wasn't ever going to take off by that point ... 

Before any HDMI 2.1 displays or VRR-capable Intel hardware had even released, Nvidia preemptively surrendered, like they just did recently, and sat there helplessly watching as the generic adaptive refresh technology emerged victorious in a decisive manner. There's no doubt AMD played a big role in making FreeSync the ubiquitous standard, but just as much credit goes to the other very important parties, like the HDMI Forum and Intel, for cementing it as the dominant industry standard ...  



AMD did the FreeSync 2 thingy, and at that point the differences between it and G-Sync are so small that unless you place them side by side and carefully study things, you probably wouldn't even be able to tell which is which.

The fact that FreeSync doesn't make your monitor cost an additional $100-150 is reason enough for most people to get one of these instead.
Unless you're the person that wants the very best and doesn't care about costs at all ... FreeSync > G-Sync.

The fact that it's coming to TVs soon, and to future consoles ... FreeSync is the future.



m0ney said:
Zkuq said:
Good, good. Also sounds like a half-decent time to upgrade my graphics card (GTX 770 2GB), although I'm not liking the price too much. Still, not that bad.

I plan to upgrade this year too (GTX 960 2GB), but I have decided to go team red this time; it's been a while.

I upgraded on Black Friday from the GTX 960 2GB to a GTX 1070 8GB, which I found for $429 with a copy of Monster Hunter. The 960 served me well, and could have continued to do so if I had chosen to keep using it, especially considering its value (I think it was $200 when I picked it up a few months after release).



This was the best GeForce-related news for me in the past few months. My 4K FreeSync Samsung monitor is going to be happy about this, LOL. I'm finally going to be able to properly use the adaptive sync implementation and drop in-game V-Sync once and for all.



Current PC Build

CPU - i7 8700K 3.7 GHz (4.7 GHz turbo) 6 cores OC'd to 5.2 GHz with Watercooling (Hydro Series H110i) | MB - Gigabyte Z370 HD3P ATX | Gigabyte GTX 1080ti Gaming OC BLACK 11G (1657 MHz Boost Core / 11010 MHz Memory) | RAM - Corsair DIMM 32GB DDR4, 2400 MHz | PSU - Corsair CX650M (80+ Bronze) 650W | Audio - Asus Essence STX II 7.1 | Monitor - Samsung U28E590D 4K UHD, Freesync, 1 ms, 60 Hz, 28"