Pemalite said:

Did you just seriously state that G-Sync is not patented?


Because this patent submission says otherwise.
But don't take my word for it.
https://www.google.com/patents/US8120621

G-Sync is also trademarked.
https://www.geforce.com/hardware/technology/g-sync

Hence the G-Sync "™".

I'm not sure whether G-Sync is patented or not, but at the very LEAST it looks as if Nvidia DOESN'T have a monopoly on adaptive refresh rate technology ... (That is why the inclusion of adaptive refresh rates into HDMI went unopposed by Nvidia.)

Pemalite said:

When nVidia acquired Ageia, they acquired all their patents, licenses, trademarks, technology, everything.

Do I really need to hunt down the patents for all this as well?

Ageia existed for a little over half a decade before being acquired by Nvidia, and they're just one of many players out there in physics simulation, which is also a largely *solved* field, so the impact of software patents is minimized in that area ...

Pemalite said:

I cannot agree with this.

One of the reasons why AMD is so behind nVidia is because they are not overhauling their architectures. nVidia has been. Kepler and Maxwell were massive overhauls; Pascal, whilst being built from Maxwell's base, is still a solid improvement.
Volta should see a large shift as well.

Depends on what you define as 'overhaul' ... 

I think AMD has had lots of changes over time. GCN 3 had some invasive changes to the ISA where the microcode is totally incompatible with previous iterations, and GCN 5 (Vega) added tons of new hardware features ... (Vega could easily be described as the R600 of its time with its primitive shaders, tiled rasterizer, conservative rasterization modes, rapid packed math and other cruft.)

Pemalite said:

Delta Colour Compression is probably at it's most important on lower-end cards that are typically the most bandwidth constrained pieces of hardware.

It would go a long way in making low-end GPU's more e-sports friendly.

It's not as if end users are going to directly use the functionality ... (there are other ways to save bandwidth, such as texture compression and Hi-Z)

And no, I'd argue it's high-end GPUs that benefit the most from DCC, considering they have the highest ALU/bandwidth ratio ...
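For a rough sense of that ratio, here's a back-of-envelope sketch using approximate paper specs (the board picks and numbers are my own ballpark illustration, not exact figures):

```python
# Rough back-of-envelope comparison of ALU/bandwidth ratios.
# Specs are approximate paper numbers, for illustration only.
gpus = {
    "RX 550 (low end)":   {"fp32_tflops": 1.2,  "bandwidth_gbps": 112},
    "RX 580 (mid range)": {"fp32_tflops": 6.2,  "bandwidth_gbps": 256},
    "Vega 64 (high end)": {"fp32_tflops": 12.7, "bandwidth_gbps": 484},
}

for name, s in gpus.items():
    # FLOPs available per byte of memory bandwidth: the higher this is,
    # the more likely the chip starves for bandwidth, so compression
    # like DCC buys it proportionally more.
    flops_per_byte = (s["fp32_tflops"] * 1e12) / (s["bandwidth_gbps"] * 1e9)
    print(f"{name}: ~{flops_per_byte:.1f} FLOPs per byte of bandwidth")
```

On those ballpark numbers the high-end part has roughly 2-2.5x as many FLOPs to feed per byte of bandwidth as the low-end one, which is the ALU/BW point above.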

Pemalite said:

Funny. I use it.

Besides, you are missing the point entirely and nitpicking. It's the fact older hardware misses functionality of newer hardware. It really is that simple.

If it's never used it may as well not be there to begin with ... 

Pemalite said:

I am not talking about HDR.

That's technically what '10-bit colour' is in this context, i.e. HDR10 ... (the 10 as in 10 bits per channel)

Pemalite said:

People might be wanting it for uses outside of gaming.

Then demand that the devs program it into whatever app has 3D graphics ... (VSR isn't anything special either, since it's just another method of supersampling.)
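Conceptually the driver-side trick is simple: render at a higher resolution than the display, then filter down. A minimal numpy sketch of that downscale step (the function name and the plain box filter are my own illustration, not how any particular driver implements it):

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Box-filter a rendered frame down by 2x in each dimension.

    frame: (H, W, 3) float array rendered at 2x the display resolution.
    Each output pixel is the average of a 2x2 block of rendered samples,
    which is exactly what ordered-grid supersampling does.
    """
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Pretend we rendered at 3840x2160 for a 1920x1080 display.
rendered = np.random.rand(2160, 3840, 3).astype(np.float32)
displayed = downsample_2x(rendered)
print(displayed.shape)  # (1080, 1920, 3)
```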

Pemalite said:

Lower end GPU's are more attractive HTPC solutions. They should support the latest and greatest standards.


No one wants a loud, power hungry heap of crap like a Radeon RX 580 in an HTPC.

And there is more that newer GCN parts support that GCN 1.0 doesn't. I was merely using a few examples.

Then just get an RX 550 (50 watts) or wait for Raven Ridge/Vega 12 and be done with it ...

AMD just isn't obligated to update their entire product stack every time a new microarchitecture releases ... (AMD is probably sticking with Vega as the new baseline since it's their biggest change yet in terms of feature set.)

Pemalite said:

Reduced precision can have an impact on image quality.

I have already gone to great lengths elaborating upon FP16 and its impact on potential image quality in other threads, so I would rather not have to repeat that again here.

Or it could have none of those side effects once we consider that developers can control it ... (the only reason half precision got a bad rep is that there were no good tools for using it in the past, but that's changed now)

Impacting image quality isn't a real argument against using FP16 when the above is in play ...
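To make that concrete, here's a small numpy sketch (my own illustration, not anyone's shader code) of why FP16 is plenty for colour math headed to an 8/10-bit output but a bad fit for things like large world-space positions, which is exactly the choice developers get to make:

```python
import numpy as np

# FP16 has a 10-bit mantissa (~3 significant decimal digits).
# For colour values in [0, 1] that get quantised to an 8- or 10-bit
# output anyway, the rounding error sits well below one output step.
colour = np.float32(0.73529411)            # some shaded colour channel
err = abs(np.float32(np.float16(colour)) - colour)
print(err, err < 1 / 1024)                 # True: under one 10-bit step

# For large world-space coordinates the spacing between representable
# FP16 values grows quickly, so positions stay in FP32 instead.
pos = np.float32(4097.3)
print(np.float16(pos))                     # rounds to 4096.0
```

Keep the half-precision maths on the bandwidth- and ALU-heavy colour work and the full-precision maths on geometry, and there's no visible difference to argue about.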

Pemalite said:


iOS will never be a high-end gaming platform.

Mac doesn't have the market penetration to even be remotely relevant in the gaming industry.

Anyway. I think we have beaten this discussion to death now. The horse is dead.

But Mac IS relevant in the gaming industry, since lots of developers are willing to port games using the Metal gfx API ...