vivster said:
The i9 7900X seems like a nice upgrade to a 6850K.

Depends.

The i7 6850K is half the price of the i9 here and uses less power, both at load and at idle.
And in lightly threaded applications the performance gain is minimal.

http://www.anandtech.com/bench/product/1905?vs=1728

So unless you have a use for the extra four CPU cores, I wouldn't bother upgrading.

Now if you are on a Westmere, Thuban, or Bulldozer Hex/Octo... that is an entirely different story.

fatslob-:O said:

My argument was Microsoft has ABSOLUTE control over the DirectX specification and that remains true no matter the circumstances ... 

Except they don't, for the reasons I outlined earlier.
This discussion is becoming tiresome.

fatslob-:O said:


G-sync is not patented. (How else would the HDMI forum be able to standardize adaptive refresh rates without Nvidia's approval? Nvidia knows this too since they're one of the HDMI members too!)

 

Did you just seriously state that G-Sync is not patented?

Because this patent submission says otherwise.
But don't take my word for it.
https://www.google.com/patents/US8120621

G-Sync is also trademarked.
https://www.geforce.com/hardware/technology/g-sync

Hence the G-Sync "™".

fatslob-:O said:
As for patenting software such as Physx or Hairworks, that's nearly impossible since tons of physics simulation software were already written or already had their patents expired before Nvidia rolled out their own solution ... (Better luck next time for them ?)


When nVidia acquired Ageia, they acquired all their patents, licenses, trademarks, technology, everything.
Do I really need to hunt down the patents for all this as well?

fatslob-:O said:

I don't see the issue with it. That's what the vast majority of the chip designers do in this industry when creating *new* microarchitectures ... (Why throw away millions of man hours or years of valid research the majority of which can be perfectly reused ? Heck, AMD Zen reused portions of their own in-house existing logic design while incorporating what it reverse engineered from their competitors designs. There's benefits too from reusing logic design like fixing CPU hardware bugs and believe it or not GPU hardware bugs too.)

I cannot agree with this.

One of the reasons AMD is so far behind nVidia is that they have not been overhauling their architectures; nVidia has. Kepler and Maxwell were massive overhauls, and Pascal, whilst built on Maxwell's base, is still a solid improvement.
Volta should see a large shift as well.

fatslob-:O said:

The 520 and the 530 are OEM exclusive and cannot be purchased individually so that's not an issue to customers who are going to buy new graphics cards. (By then Raven Ridge APUs will be suitable replacements for them.)

Doesn't matter what channel they are sold in. They are still rebadging junk from over half a decade ago and passing it off as something new.

It wasn't acceptable when nVidia was doing it years ago, it's not acceptable today with AMD.

fatslob-:O said:

DCC is a performance enhancing feature so the lowest end parts lacking it is no big loss

Delta Colour Compression is probably at its most important on lower-end cards, which are typically the most bandwidth-constrained pieces of hardware.
It would go a long way in making low-end GPUs more e-sports friendly.
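To illustrate the core idea (this is only a toy sketch of delta encoding, not AMD's actual tile-based DCC hardware algorithm): neighbouring pixels in a frame tend to be similar, so storing the first value plus small differences takes far fewer significant bits than storing every pixel outright, which is where the bandwidth savings come from.

```python
def delta_encode(pixels):
    """Store the first value plus successive differences."""
    deltas = [pixels[0]]
    for prev, cur in zip(pixels, pixels[1:]):
        deltas.append(cur - prev)
    return deltas

def delta_decode(deltas):
    """Reconstruct the original values by running summation."""
    pixels = [deltas[0]]
    for d in deltas[1:]:
        pixels.append(pixels[-1] + d)
    return pixels

# A smooth scanline: large absolute values, tiny deltas.
scanline = [200, 201, 201, 202, 203, 203, 204]
encoded = delta_encode(scanline)
assert delta_decode(encoded) == scanline
print(encoded)  # [200, 1, 0, 1, 1, 0, 1] -- small deltas pack tightly
```

The deltas after the first pixel fit in a couple of bits each, whereas the raw values need eight; real DCC exploits the same redundancy per tile.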

fatslob-:O said:

True Audio never went big so that's not a loss either.

Funny. I use it.
Besides, you are missing the point entirely and nitpicking. It's the fact that older hardware lacks functionality that newer hardware has. It really is that simple.

fatslob-:O said:
HDR10 support only requires WDDM 2.1 drivers (even original GCN can do HDR10 since that only requires changes to the content and software backends)

I am not talking about HDR.

fatslob-:O said:
VSR can be programmed in the game instead or AMD could just choose to give the features to the original GCN since there's no good reason hardware couldn't do it ...

People might be wanting it for uses outside of gaming.

fatslob-:O said:
You might have a point with HDMI 2.0 and maybe even HEVC (an open alternative like AV1 which is getting support from the biggest companies could supplant it in the end) ...

Lower-end GPUs are more attractive HTPC solutions. They should support the latest and greatest standards.

No one wants a loud, power-hungry heap of crap like a Radeon RX 580 in an HTPC.

And there is more that newer GCN parts support that GCN 1.0 doesn't. I was merely using a few examples.

fatslob-:O said:

I bet we won't ever see the fruits until software developers will start using these hardware features ... (You're right that AMD needs to invest but it's not in the hardware like you think, it's in the games.)

It's both.

fatslob-:O said:

How would you know that ? Do you have any data to back up that claim ? (precision issues are handled by the developers if there's going to be concerns about graphical quality compared to last time)

Reduced precision can have an impact on image quality.
I have already gone to great lengths elaborating upon FP16 and its impact on potential image quality in other threads, so I would rather not repeat all that here.
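The short version can be demonstrated numerically (a minimal sketch using NumPy's IEEE half type; the gradient here is a hypothetical stand-in for any smoothly varying quantity in a shader): FP16 has only a 10-bit mantissa, so closely spaced values collapse onto the same representable number, which is exactly the kind of quantisation that shows up as banding in subtle gradients.

```python
import numpy as np

# A smooth 0..1 gradient at full precision.
gradient = np.linspace(0.0, 1.0, 4096, dtype=np.float32)

# Round-trip it through FP16, as a half-precision shader path would.
as_fp16 = gradient.astype(np.float16).astype(np.float32)

unique_fp32 = len(np.unique(gradient))   # 4096 distinct shades
unique_fp16 = len(np.unique(as_fp16))    # noticeably fewer survive
max_error = float(np.max(np.abs(gradient - as_fp16)))

print(unique_fp32, unique_fp16, max_error)
```

Near 1.0 the FP16 step size is about 0.0005, so several adjacent FP32 shades quantise to one value; whether that matters depends on what the developer stores in half precision, which is the judgment call I was describing.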

fatslob-:O said:

That's changing with their Metal gfx API (which also has FP16 support) gaining traction with Windows ports.

iOS will never be a high-end gaming platform.
Mac doesn't have the market penetration to even be remotely relevant in the gaming industry.

Anyway. I think we have beaten this discussion to death now. The horse is dead.



--::{PC Gaming Master Race}::--