fatslob-:O said:

I'm not sure whether G-Sync is patented or not but at the very LEAST it looks as if Nvidia DOESN'T have a monopoly on adaptive refresh rate technology ... (That is why the inclusion of adaptive refresh rate into HDMI was unopposed by Nvidia.) 

Did the link not work? I just provided information regarding the patent.

fatslob-:O said:
Ageia existed for a little over half a decade before being acquired by Nvidia, and they're just one of the many players out there in physics simulation, which is also a largely *solved* field, so the impact of software patents is minimized in that area ...

Ageia Physics is covered under patent #20050075849.
https://www.google.com/patents/US7895411

fatslob-:O said:

Depends on what you define as 'overhaul' ... 

I think AMD has had lots of changes over time. GCN 3 had some invasive changes to the ISA where the microcode is totally incompatible with previous iterations, and GCN 5 (Vega) added tons of new hardware features ... (Vega could easily be described as the R600 of its time with its primitive shaders, tiled rasterizer, conservative rasterization modes, rapid packed math and other cruft.)

Each revision of Graphics Core Next, whilst being a significant update, is still only an iterative update to Graphics Core Next.

Remember, GCN is extremely modular; large chunks of Graphics Core Next were actually left untouched until Vega, half a decade later.

fatslob-:O said:

It's not as if end users are going to directly use the functionality ... (there's other ways to save bandwidth such as texture compression, and Hi-Z) 

And no I'd argue it's high end GPUs that benefit the most from DCC considering they have the highest ALU/BW ratio ...

It's the low-end GPUs with DDR3 memory, or GDDR5 on a 64-bit bus, that are the most desperate for bandwidth; they can struggle even with 720p gaming.
Playing around with a Radeon R7 240, increasing the DDR3 RAM from 800 MHz to 1 GHz (25%) brought a sizable increase in performance; it was the difference between Overwatch being unplayable and Overwatch being playable.
Those are the kinds of gains Delta Colour Compression can bring. Not something to be shoved aside.
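To put rough numbers on why that narrow bus hurts so much, here's a back-of-the-envelope bandwidth calculation (the 64-bit bus and clocks are just illustrative figures; actual R7 240 boards vary between partners):

```python
# Theoretical peak memory bandwidth for a narrow DDR3 card
# (illustrative figures only; real boards differ between partners).
def bandwidth_gbs(mem_clock_mhz, bus_width_bits=64, transfers_per_clock=2):
    # DDR transfers twice per clock; bytes/s = transfers/s * bus width in bytes
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

print(bandwidth_gbs(800))   # ~12.8 GB/s at an 800 MHz memory clock
print(bandwidth_gbs(1000))  # ~16.0 GB/s at 1 GHz -- the same ~25% uplift
```

Every frame's worth of colour data that DCC avoids writing out is a real slice of those few GB/s, which is why it matters most at the bottom of the stack.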

Rebadges suck.

fatslob-:O said:
If it's never used it may as well not be there to begin with ...


But they are used. Perhaps not in the numbers that would please you, but still used.
Heck, AMD found it useful enough to make an iterative update to True Audio with Polaris by bringing us True Audio Next.

fatslob-:O said:
That's technically what '10-bit colour' is, just HDR10 ... (10 as in 10-bit)

I am referring to colour depth: the number of distinct colours that can be portrayed on screen.

HDR, or High Dynamic Range, is a technology meant to capture detail in the darkest and lightest parts of an image simultaneously.

You can have 10-bit colour and no HDR; you can have HDR with only 8-bit colour.
HDR10 refers to HDR AND 10-bit colour.
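Just to make the colour-depth arithmetic concrete (nothing vendor-specific here, it's purely how many values each channel can hold):

```python
# Distinct colours an RGB pipeline can represent at a given per-channel bit depth.
def distinct_colours(bits_per_channel, channels=3):
    return (2 ** bits_per_channel) ** channels

print(f"{distinct_colours(8):,}")   # 16,777,216     (~16.7 million)
print(f"{distinct_colours(10):,}")  # 1,073,741,824  (~1.07 billion)
```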

Nor am I talking about the GPU's actual output anyway. I was referring to the video engine specifically, which doesn't support the necessary capabilities for HEVC Main 10, Rec. 2020, etc.

https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
https://en.wikipedia.org/wiki/Rec._2020

fatslob-:O said:
Then demand that the devs have it programmed in whatever app that has 3D gfx ... (VSR isn't anything special either since it's just another method for supersampling.)

That is a messy way of doing things. It's much cleaner having such controls in a single unified place rather than spotty support between games.

Virtual Super Resolution is also implemented differently on AMD's and nVidia's hardware.

nVidia, for example, implemented it as a shader program with a Gaussian blur filter on top for good measure.

AMD's implementation is superior: it is done directly in hardware on the display controllers, resulting in no performance penalty.
The downside to AMD's approach, however, is hardware support. If the hardware doesn't support it, you aren't getting it.
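For anyone unfamiliar with what VSR/DSR actually does underneath: the game renders at a higher resolution and the frame gets filtered back down to the display resolution. A minimal sketch of that downscaling step (a plain box filter over a NumPy array, purely to show the idea; the vendors' real filters differ, as noted above):

```python
import numpy as np

def downsample_box(frame, factor):
    """Average each factor x factor block of pixels into one output pixel."""
    h, w, c = frame.shape
    frame = frame[:h - h % factor, :w - w % factor]  # crop to a multiple of factor
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

rendered = np.random.rand(2160, 3840, 3)   # stand-in for a 4K render target
displayed = downsample_box(rendered, 2)    # down to 1080p for the actual display
print(displayed.shape)                     # (1080, 1920, 3)
```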

fatslob-:O said:

Then just get an RX 550 (50 watts) or wait for Raven Ridge/Vega 12 and be done with it ...

AMD just isn't obligated to update their entire product stack every time a new microarchitecture releases ... (AMD is probably sticking with Vega as a new baseline since it's their biggest change yet in terms of feature sets.)

The RX 550 is a shit HTPC card.

It has no low-profile, single-slot variants in wide availability here, let alone passively cooled ones.

The HTPC crowd does tend to be attracted to a very specific set of GPUs, you know.

fatslob-:O said:

Or it could have none of the side effects once we consider that developers can control it ... (the only reason half precision got a bad rep was that there were no good tools to use it in the past, but that's changed now) 

Impacting image quality is not a real argument to not use FP16 since the above is in play ...

I have already gone into great depth before on the negative impacts FP16 can have on image quality, and on the advantages it brings in regards to performance and power consumption, so I would rather not go over it all here again.
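That said, for anyone wanting a quick sense of why the precision question is real at all, here's a minimal NumPy illustration rather than anything GPU-specific (float16 only keeps a 10-bit mantissa):

```python
import numpy as np

print(np.finfo(np.float16).eps)   # ~0.000977: smallest relative step near 1.0
print(np.float16(1000.1))         # prints 1000.0: the .1 is below the local step size

# Accumulating many small contributions, the way shader math often does:
acc = np.float16(0.0)
for _ in range(10000):
    acc += np.float16(0.1)
print(acc)                        # stalls at 256.0 instead of reaching ~1000
```

Whether that matters in practice depends, as fatslob says, on the developer picking where to use it.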

fatslob-:O said:
But Mac IS relevant in the gaming industry since lots of developers are willing to port games using the Metal gfx API ...

The API is irrelevant.
Mac has always had a sizable number of developers willing to throw their support behind it, but the majority of games just don't end up on the platform.

Their market size in AAA gaming does make them irrelevant. That's never going to change. Just like fetch, it's just never going to happen.



--::{PC Gaming Master Race}::--