| fatslob-:O said: Nvidia will finally support generic adaptive refresh rate implementations on their 10/20 series video cards after all these years ... |
The moment VRR (largely based on AMD's FreeSync) became part of the standard, Nvidia had no choice but to adopt it. They still have the G-Sync Ultimate tier for HDR content to sell premium monitors.
Oh, and it's worth noting that even if a monitor doesn't pass Nvidia's certification, that doesn't mean it won't work. We'll still be able to enable it manually; it will just default to off.
haxxiy said:
Zero chance the boost rate of every processor is neatly spaced 800 MHz away from the core clock, no exceptions, and that the highest clocks, but without the highest TDP, are found in the cut down 12 core part of all things. Not to mention we're still far from release. It smells like something guessing about their dream specs. But I get that AMD is often the target of wishful speculation, they're the underdogs who supposedly bring more value for your money to the table, so it's natural people become emotionally invested in their products. |
You're right: after so many years, people want AMD to all but obliterate Intel.
In any case, we'll probably know how much truth there is in that rumor in a couple of days, when AMD gives its CES presentation.
Please excuse my bad English.
Former gaming PC: i5-4670k@stock (for now), 16GB RAM 1600 MHz and a GTX 1070
Current gaming PC: R5-7600, 32GB RAM 6000MT/s (CL30) and a RX 9060XT 16GB
Steam / Live / NNID : jonxiquet Add me if you want, but I'm a single player gamer.