
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

fatslob-:O said:

Nvidia will finally support generic adaptive refresh rate implementations on their 10/20 series video cards after all these years ...

Didn't Intel recently mention their future GPUs would be supporting this as well? Hmm...



PS1 - ! - We must build a console that can alert our enemies.

PS2 - @ - We must build a console that offers online living room gaming.

PS3 - # - We must build a console that's powerful, social, costly, and does everything.

PS4 - $ - We must build a console that's affordable, charges for services, and pumps out exclusives.

PRO - % - We must build a console that's VR ready, checkerboard upscales, and sells but a fraction of the money printer.

PS5 - ^ - We must build a console that's a generational cross product, with RT lighting, and price hiking.

PRO - & - We must build a console that Super Res upscales and continues the cost increases.

EricHiggin said:

Didn't Intel recently mention their future GPUs would be supporting this as well? Hmm...

Yeah, Intel will start to support generic adaptive refresh rate technology with their Gen 11 iGPUs, which are going to release either later this year or next year if they suffer some more 10nm node delays ... 



fatslob-:O said:

Nvidia will finally support generic adaptive refresh rate implementations on their 10/20 series video cards after all these years ...

Does that mean I don't have to buy G-Sync monitors anymore?



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

vivster said:

Does that mean I don't have to buy G-Sync monitors anymore?

If you have either a 10 or 20 series video card then absolutely ... 

The only benefit G-Sync brings is its tighter standards, but as long as you do your research regarding the many options out there supporting adaptive refresh rate implementations, that can be circumvented. It arguably pays a high dividend in the end, since generic implementations of adaptive refresh rate are future proof, which means tons of devices in the future are guaranteed to work with the technology; this is not so for G-Sync ... 
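To make the "do your research" part concrete, here's a rough Python sketch of the one spec that actually separates those options: the VRR range, and whether it's wide enough for low framerate compensation (LFC) to repeat frames when the frame rate drops below it. The panel ranges below are made-up examples, not real monitor specs.

def supports_lfc(vrr_min, vrr_max):
    # LFC needs max >= 2x min so slow frames can be shown more than once
    return vrr_max >= 2 * vrr_min

def effective_refresh(fps, vrr_min, vrr_max):
    # Refresh rate the panel would run at, or None (back to tearing/v-sync)
    if vrr_min <= fps <= vrr_max:
        return fps  # frame rate sits inside the native VRR window
    if fps < vrr_min and supports_lfc(vrr_min, vrr_max):
        multiple = 2
        while fps * multiple < vrr_min:
            multiple += 1
        if fps * multiple <= vrr_max:
            return fps * multiple  # each frame is displayed 'multiple' times
    return None

for name, (lo, hi) in {"48-144 Hz panel": (48, 144), "48-75 Hz panel": (48, 75)}.items():
    print(name, "at 30 fps ->", effective_refresh(30, lo, hi))

The 48-144 Hz panel rescues 30 fps by doubling it into the window (60 Hz), while the narrow 48-75 Hz panel can't, which is exactly the kind of difference a spec sheet won't shout about.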



fatslob-:O said:
vivster said:

Does that mean I don't have to buy G-Sync monitors anymore?

If you have either a 10 or 20 series video card then absolutely ... 

The only benefit G-Sync brings is its tighter standards, but as long as you do your research regarding the many options out there supporting adaptive refresh rate implementations, that can be circumvented. It arguably pays a high dividend in the end, since generic implementations of adaptive refresh rate are future proof, which means tons of devices in the future are guaranteed to work with the technology; this is not so for G-Sync ... 

Future proofing isn't really something I do with my PC hardware. The only things I want or expect to last longer than 2 years are my current PC and my TV. If at some point my G-Sync monitor isn't compatible with whatever I'm doing anymore, it's time for a new one either way.

The important part would be that I'm not limited to G-Sync and have a greater variety to choose from when I buy my next monitor.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.


Here's a nice infographic ... 

Basically, Nvidia's drivers will automatically enable variable refresh rate on these approved displays that don't implement G-Sync, but you can still enable VRR manually in the driver settings for displays that weren't approved ... 
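For anyone who'd rather script that manual override than click through the control panel, here's a hedged sketch for Linux. The AllowGSYNCCompatible attribute name is an assumption on my part; check what your driver version actually exposes (e.g. with nvidia-settings -q all) before relying on it.

import subprocess

def enable_vrr_on_unvalidated_display():
    # Assumed attribute name, mirroring the GUI checkbox
    # "Allow G-SYNC on monitor not validated as G-SYNC Compatible"
    subprocess.run(["nvidia-settings", "-a", "AllowGSYNCCompatible=1"], check=True)

enable_vrr_on_unvalidated_display()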



vivster said:
fatslob-:O said:

Nvidia will finally support generic adaptive refresh rate implementations on their 10/20 series video cards after all these years ...

Does that mean I don't have to buy G-Sync monitors anymore?

No, now you have to buy a G-Sync Ultimate monitor if you want the best.

https://www.reddit.com/r/nvidia/comments/adexzm/announcing_gsync_compatible_gsync_ultimate/

Last edited by Conina - on 07 January 2019

JEMC said:
Pemalite said:

If those clocks are true and if AMD manages to improve their gaming performance, Intel could be in very big trouble.

After all, the 9900K works in a range of 3.6 to 5.0 GHz, making it only slightly faster in single-threaded programs than the (probably) much cheaper 3600X, but slower in multi-threaded ones. And then there's the 3700X, faster and with more cores.

But yes, it's better to wait until we know the final specs and then the reviews come in to gauge how these new processors compare with Intel's latest.

Zero chance the boost clock of every processor sits exactly 800 MHz above its base clock, no exceptions, and that the highest clocks, but not the highest TDP, are found in the cut-down 12-core part of all things. Not to mention we're still far from release. It smells like someone guessing at their dream specs.

But I get that AMD is often the target of wishful speculation; they're the underdogs who supposedly bring more value for your money to the table, so it's natural that people become emotionally invested in their products.
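To show why that uniform spacing smells wrong, here's a toy sanity check with invented figures that mirror the rumor's pattern; they are not the actual leaked numbers. Real stacks are binned per die, so a constant base-to-boost delta across the whole lineup is a red flag.

rumored_specs = {  # model: (base_mhz, boost_mhz), invented illustrative figures
    "6-core": (4000, 4800),
    "8-core": (4200, 5000),
    "12-core": (4400, 5200),
    "16-core": (4300, 5100),
}
deltas = [boost - base for base, boost in rumored_specs.values()]
if len(set(deltas)) == 1:
    print("Every part boosts exactly", deltas[0], "MHz above base - suspicious.")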




fatslob-:O said:

Nvidia will finally support generic adaptive refresh rate implementations on their 10/20 series video cards after all these years ...

The moment VRR (largely based on AMD's FreeSync) became the standard, Nvidia had no choice but to adopt it. They still have the G-Sync Ultimate tier for HDR content to sell premium monitors.

Oh, and it's worth noting that, even if a monitor doesn't pass Nvidia's certification, that doesn't mean it won't work. We'll still be able to enable it manually from the default off setting.

haxxiy said:
JEMC said:

If those clocks are true and if AMD manages to improve their gaming performance, Intel could be in very big trouble.

After all, the 9900K works in a range of 3.6 to 5.0 GHz, making it only slightly faster in single-threaded programs than the (probably) much cheaper 3600X, but slower in multi-threaded ones. And then there's the 3700X, faster and with more cores.

But yes, it's better to wait until we know the final specs and then the reviews come in to gauge how these new processors compare with Intel's latest.

Zero chance the boost clock of every processor sits exactly 800 MHz above its base clock, no exceptions, and that the highest clocks, but not the highest TDP, are found in the cut-down 12-core part of all things. Not to mention we're still far from release. It smells like someone guessing at their dream specs.

But I get that AMD is often the target of wishful speculation; they're the underdogs who supposedly bring more value for your money to the table, so it's natural that people become emotionally invested in their products.

You're right; people want AMD to all but obliterate Intel after so many years.

In any case, we'll probably know how much truth there is in that rumor in a couple of days, when AMD gives its CES presentation.



Please excuse my bad English.

Former gaming PC: i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Still happy with my current G-Sync monitor, so I've no reason to get all giddy over any sort of FreeSync monitor anytime soon. Maybe in 3-4 more years, by which point the quality and standards will hopefully have quadrupled.



Mankind, in its arrogance and self-delusion, must believe they are the mirrors to God in both their image and their power. If something shatters that mirror, then it must be totally destroyed.