
GeForce GTX officially supports FREESYNC

m0ney said:
Zkuq said:
Good, good. Also sounds like a half-decent time to upgrade my graphics card (GTX 770 2GB), although I'm not liking the price too much. Still, not that bad.

I plan to upgrade this year too (GTX 960 2GB), but I've decided to go team red this time; it's been a while.

I upgraded on Black Friday from the GTX 960 2GB to a GTX 1070 8GB, which I found for $429 with a copy of Monster Hunter. The 960 served me well, and could have continued to do so if I had chosen to keep using it, especially given its value (I think it was $200 when I picked it up a few months after release).




This was the best GeForce-related news for me in the past few months. My 4K FreeSync Samsung monitor is going to be happy about this LOL. Finally going to be able to properly use adaptive sync and drop in-game V-Sync once and for all.



Current PC Build

CPU - i7 8700K 3.7 GHz (4.7 GHz turbo) 6 cores OC'd to 5.2 GHz with Watercooling (Hydro Series H110i) | MB - Gigabyte Z370 HD3P ATX | Gigabyte GTX 1080ti Gaming OC BLACK 11G (1657 MHz Boost Core / 11010 MHz Memory) | RAM - Corsair DIMM 32GB DDR4, 2400 MHz | PSU - Corsair CX650M (80+ Bronze) 650W | Audio - Asus Essence STX II 7.1 | Monitor - Samsung U28E590D 4K UHD, Freesync, 1 ms, 60 Hz, 28"

fatslob-:O said:
Pemalite said:

Thanks, Chazore.

Ghosting is a big issue... I find that a large proportion of VA panels tend to ghost a lot more than, say, TN or IPS, which compounds the issue.
You will get ghosting on a G-Sync VA panel as well.

Still, things have improved since then; I would imagine the older, shittier panels are no longer a thing on the market.

You have the wrong person ...

The other issue with FreeSync is that the feature does not work with overdrive on the vast majority of displays. It also doesn't help that AMD botched the launch of their own FreeSync 2 certification program by certifying a display that doesn't even support their advertised paramount feature, low framerate compensation ...

I was being facetious.

Don't get me wrong, I am not actually disagreeing with you. (Nor do I use FreeSync.)
Just wanted Chazore's take on it; sometimes those who aren't tech-heads like we are have a different perspective on where a technology falls short.
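
For anyone not familiar with the low framerate compensation (LFC) feature mentioned above: the idea is that when a game's framerate drops below the panel's minimum variable refresh rate, the driver presents each frame two or more times so the effective refresh rate stays inside the supported range. Here's a rough sketch of that idea in Python (not AMD's actual driver logic; the 48-144 Hz range is just a hypothetical panel):

```python
# Rough sketch (not AMD's actual driver logic) of the idea behind low
# framerate compensation (LFC): when the game's framerate falls below the
# panel's minimum variable refresh rate, each frame is presented multiple
# times so the effective refresh rate stays inside the supported range.

def lfc_refresh(content_fps, vrr_min, vrr_max):
    """Return (frame_multiplier, effective_refresh_hz) for a given framerate."""
    if content_fps >= vrr_min:
        return 1, min(content_fps, vrr_max)  # already inside the range
    multiplier = 1
    while content_fps * multiplier < vrr_min and content_fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return multiplier, content_fps * multiplier


if __name__ == "__main__":
    # Hypothetical 48-144 Hz FreeSync panel: 30 fps content gets doubled to
    # 60 Hz. This is also why LFC needs the panel's maximum refresh to be at
    # least roughly twice its minimum.
    for fps in (30, 40, 60, 100, 160):
        mult, hz = lfc_refresh(fps, vrr_min=48, vrr_max=144)
        print(f"{fps:>3} fps -> each frame shown {mult}x, panel runs at {hz:.0f} Hz")
```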

fatslob-:O said:
Pemalite said:

Pascal isn't that fundamentally different from Maxwell though.

Raven Ridge supported hardware-accelerated VP9 decoding but RX Vega didn't, even though the two share nearly identical GPU architectures (more so than Maxwell/Pascal); they have different video engines, so it's not totally out of the realm of possibility that this is why Maxwell was left out ...

Graphics Core Next has a different, more modular design philosophy than nVidia's efforts... AMD can update one part of the chip, like adding rapid packed math, while leaving the rest of the chip identical.

Plus there is a bigger need for more flexible hardware decode/encode on mobile for power consumption reasons... especially when such a large portion of users will be using VP9 via YouTube. - I also have a Raven Ridge device.
Not defending AMD's actions; it would be nice for the decode/encode blocks to be updated to bring power consumption down.

VP9 is hardware-accelerated on Vega, just not on the video engine; it's shader-assisted... The same happens on Polaris (which I also have).
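
If anyone wants to poke at this on their own machine, here's a minimal sketch, assuming a Linux system with VA-API drivers and the vainfo tool (from libva-utils) installed; those assumptions are mine, not something from the posts above. It only shows what the video engine advertises through VA-API, so a shader-assisted ("hybrid") path like the one on Polaris/Vega wouldn't show up here:

```python
# Minimal sketch, assuming Linux + VA-API drivers + vainfo (libva-utils).
# Lists the VP9 decode (VLD) profiles the driver advertises; fixed-function
# video-engine decode only, shader-assisted paths will not appear.
import shutil
import subprocess


def vaapi_vp9_decode_profiles():
    """Return the VP9 decode (VLD) profile lines reported by vainfo."""
    if shutil.which("vainfo") is None:
        raise RuntimeError("vainfo not found; install libva-utils to run this check")
    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    return [line.strip() for line in out.splitlines()
            if "VP9" in line and "VLD" in line]


if __name__ == "__main__":
    profiles = vaapi_vp9_decode_profiles()
    if profiles:
        print("Driver advertises fixed-function VP9 decode:")
        for profile in profiles:
            print("  " + profile)
    else:
        print("No fixed-function VP9 decode profile advertised by this driver.")
```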

fatslob-:O said:

Maxwell was architected well before the tides started turning against G-Sync. It started with the more diverse display selections, then came the X1X, then the biggest blow of the HDMI Forum denying it, but the last straw was Intel adding hardware support. Things fell apart afterwards, with G-Sync turning downright sour, so in the end supporting the competing standard was an admission of defeat for them; they realized G-Sync was never going to take off by that point ...

Before any HDMI 2.1 displays or VRR-capable Intel hardware were released, Nvidia preemptively surrendered, as they just did recently, and sat there helplessly watching as the generic adaptive refresh technology emerged victorious in decisive fashion. There's no doubt AMD played a big role in making FreeSync the ubiquitous standard, but just as much credit goes to other very important parties like the HDMI Forum and Intel for cementing it as the dominant industry standard ...

I am aware. Just having a whinge. Granted, I have a Pascal GPU anyway, so it's ultimately irrelevant to me.



--::{PC Gaming Master Race}::--