

Pemalite said:

Microsoft may not need S3 graphics anymore. But it is still supporting their technologies even in DirectX 12.
Besides. You are missing the entire point of this.

I was providing examples of where hardware manufacturers contributed to the DirectX specification. Nitpicking at a few edge-case scenarios isn't changing that fact.

As for Qualcomm. They likely license AMD's technologies and patents anyway.
ADRENO is a word play on "Radeon". - And Qualcomm bought the technology from AMD in the first place.

As for patenting computer graphics technology? Nope. Still happening. G-Sync, PhysX, Hairworks and so on are examples. Might not be happening at the same rate as it used to during the 3D boom, but it's still occurring.

My point was never that IHVs didn't contribute to the DirectX spec, so that's just a strawman ... 

My argument was Microsoft has ABSOLUTE control over the DirectX specification and that remains true no matter the circumstances ... 

G-Sync is not patented. (How else would the HDMI Forum be able to standardize adaptive refresh rates without Nvidia's approval? Nvidia knows this too, since they're one of the HDMI Forum members!) 

As for patenting software such as PhysX or Hairworks, that's nearly impossible, since tons of physics simulation software was already written, or already had its patents expire, before Nvidia rolled out their own solution ... (Better luck next time for them?) 

I think the age of patenting computer graphics technology is really over once we head for real-time physically based global illumination solutions like path tracing or other light transport methods, because by then the field of real-time computer graphics will have largely been *solved* ... (the only motivation behind creating patents is to capitalize on solutions for unsolved problems) 

Pemalite said:

It is a big issue.


AMD hasn't created "4x more graphics architectures since GCN".
Graphics Core Next 2, 3, 4, 5 and so on... Are all just iterative updates of GCN 1.0. They aren't brand new top-to-bottom designs.

Not only that, but AMD is rebadging GCN 1.0 parts from 2012 even today with the Radeon RX 520 and 530.

And by rebadging those parts, they miss out on some vital functionality... Their Tessellation performance is laughable, they have no Delta Colour Compression, no True Audio, no power saving and power state enhancements, no HEVC decoding, no HDMI 2.0... And if I remember correctly, also no 10-bit colour support, no Virtual Super Resolution... And I could go on, but I made my point.


As for Vega... Vega is just the GPU on top of the Radeon RX 500 stack. It is what Fury was to the Radeon 300 series.

I don't see the issue with it. That's what the vast majority of chip designers in this industry do when creating *new* microarchitectures ... (Why throw away millions of man-hours and years of valid research, the majority of which can be perfectly reused? Heck, AMD's Zen reused portions of their own in-house logic design while incorporating what they reverse engineered from their competitors' designs. There are benefits to reusing logic design too, like fixing CPU hardware bugs and, believe it or not, GPU hardware bugs.) 

The 520 and the 530 are OEM exclusive and cannot be purchased individually, so that's not an issue for customers who are going to buy new graphics cards. (By then Raven Ridge APUs will be suitable replacements for them.) 

Tessellation performance is mostly tied to geometry throughput, so it's no secret why AMD GPUs compare badly with their Nvidia counterparts. DCC is a performance-enhancing feature, so the lowest-end parts lacking it is no big loss, and True Audio never went big so that's not a loss either. HDR10 support only requires WDDM 2.1 drivers (even original GCN can do HDR10 since that only requires changes to the content and software backends), and VSR can be programmed in the game instead, or AMD could just choose to give the feature to the original GCN since there's no good reason the hardware couldn't do it ... 

You might have a point with HDMI 2.0 and maybe even HEVC (an open alternative like AV1 which is getting support from the biggest companies could supplant it in the end) ... 

Pemalite said:


Agreed.


But part of the issue is... AMD only made the Radeon group an independent entity again a couple of years ago, and they are still abiding by their old plan; we won't see the fruits of AMD's separation of the Radeon group for another year or two.

AMD needs to invest. Not just iterate. 

nVidia has done an excellent job innovating even without any real competition... And that has paid off for them.

I bet we won't ever see the fruits until software developers start using these hardware features ... (You're right that AMD needs to invest, but it's not in the hardware like you think, it's in the games.)

Pemalite said:


Just because they support FP16, doesn't mean there will be any gains with it over FP32.

The number of GPUs available with double-rate FP16 is minuscule. And then if you compare those GPUs with Steam's statistics... Well. You get the point.

That depends; if your game is ALU bound or has lots of register pressure you will almost certainly see a gain with FP16 ... 

FP16-capable hardware need not be double rate, and it's getting a lot of traction fast in hardware ... (AMD APUs with GCN 3, discrete AMD GPUs with GCN 3/4, Intel Broadwell/Skylake/Kaby Lake CPUs, and soon we'll have Vega, Raven Ridge, and Coffee Lake, so FP16 support in hardware is far from 'minuscule'.) 
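
To put numbers on the register-pressure angle, here's a minimal sketch in Python, with numpy's float16 standing in for GPU half precision (an illustration, not a GPU measurement):

import numpy as np

# One million values at half vs single precision.
half = np.ones(1_000_000, dtype=np.float16)
single = np.ones(1_000_000, dtype=np.float32)

# FP16 halves the footprint, which is why it relieves register
# pressure: two halves pack where one float32 sat.
print(half.nbytes)    # 2000000 bytes
print(single.nbytes)  # 4000000 bytes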

Pemalite said:

Then the performance gains will likely not be worth it for the significantly reduced precision. 

How would you know that? Do you have any data to back up that claim? (precision issues are handled by the developers if there are concerns about graphical quality, unlike last time)
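
For what it's worth, the scale of FP16's precision limits is easy to demonstrate; numpy's float16 follows the same IEEE 754 binary16 format GPU half precision uses (a sketch, not measured GPU output):

import numpy as np

# binary16 keeps a 10-bit mantissa: roughly 3 decimal digits.
print(float(np.float16(0.1)))  # 0.0999755859375

# Above 2048, consecutive integers stop being representable.
print(np.float16(2049.0))      # 2048.0

# Naive accumulation drifts badly, which is why developers keep
# accumulators at FP32 and use FP16 only where it's safe.
acc = np.float16(0.0)
for _ in range(10_000):
    acc += np.float16(0.01)
print(acc)                     # stalls near 32, nowhere near 100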

Pemalite said:

On the contrary. That isn't what I think.

Apple isn't doing high-end AAA games to any great extent anyway.

That's changing with their Metal gfx API (which also has FP16 support) gaining traction with Windows ports. 

There are 3 gfx APIs (DX12, GNM/X, Metal), or make that 4 if you count Vulkan with AMD's extension, which will support FP16! (AMD, Apple, Intel, Microsoft and Sony are all in this together!) 

If I had to bet one thing it would be FP16 getting more traction than async compute (even though it's featured in every modern API) ... 



vivster said:
The i9 7900X seems like a nice upgrade to a 6850K.

Depends.

The i7 6850K is half the price of the i9 here. Uses less power. (Load and Idle).
And in lightly threaded applications there is a minimal performance gain.

http://www.anandtech.com/bench/product/1905?vs=1728

So unless you have a use for the extra 4x CPU cores. I wouldn't bother upgrading.

Now if you are on a Westmere, Thuban, Bulldozer Hex/Octo... That is an entirely different story.

fatslob-:O said:

My argument was Microsoft has ABSOLUTE control over the DirectX specification and that remains true no matter the circumstances ... 

Except they don't for reasons I outlined prior.
This discussion is becoming dull.

fatslob-:O said:


G-Sync is not patented. (How else would the HDMI Forum be able to standardize adaptive refresh rates without Nvidia's approval? Nvidia knows this too, since they're one of the HDMI Forum members!)

 

Did you just seriously state that G-Sync is not patented?

Because this patent submission says otherwise.
But don't take my word for it.
https://www.google.com/patents/US8120621

G-Sync is also trademarked.
https://www.geforce.com/hardware/technology/g-sync

Hence the G-Sync "™".

fatslob-:O said:
As for patenting software such as PhysX or Hairworks, that's nearly impossible, since tons of physics simulation software was already written, or already had its patents expire, before Nvidia rolled out their own solution ... (Better luck next time for them?)


When nVidia acquired Ageia, they acquired all their patents, licenses, trademarks, technology, everything.
Do I really need to hunt down the patents for all this as well?

fatslob-:O said:

I don't see the issue with it. That's what the vast majority of chip designers in this industry do when creating *new* microarchitectures ... (Why throw away millions of man-hours and years of valid research, the majority of which can be perfectly reused? Heck, AMD's Zen reused portions of their own in-house logic design while incorporating what they reverse engineered from their competitors' designs. There are benefits to reusing logic design too, like fixing CPU hardware bugs and, believe it or not, GPU hardware bugs.)

I cannot agree with this.

One of the reasons why AMD is so behind nVidia is because they are not overhauling their architectures. nVidia has been. Kepler and Maxwell were massive overhauls; Pascal, whilst built from Maxwell's base, is still a solid improvement.
Volta should see a large shift as well.

fatslob-:O said:

The 520 and the 530 are OEM exclusive and cannot be purchased individually, so that's not an issue for customers who are going to buy new graphics cards. (By then Raven Ridge APUs will be suitable replacements for them.)

Doesn't matter what channel they are sold in. They are still rebadging junk from over half a decade ago and passing it off as something new.

It wasn't acceptable when nVidia was doing it years ago, it's not acceptable today with AMD.

fatslob-:O said:

DCC is a performance-enhancing feature, so the lowest-end parts lacking it is no big loss

Delta Colour Compression is probably at its most important on lower-end cards, which are typically the most bandwidth-constrained pieces of hardware.
It would go a long way in making low-end GPUs more e-sports friendly.

fatslob-:O said:

True Audio never went big so that's not a loss either.

Funny. I use it.
Besides, you are missing the point entirely and nitpicking. It's the fact that older hardware misses functionality of newer hardware. It really is that simple.

fatslob-:O said:
HDR10 support only requires WDDM 2.1 drivers (even original GCN can do HDR10 since that only requires changes to the content and software backends)

I am not talking about HDR.

fatslob-:O said:
VSR can be programmed in the game instead, or AMD could just choose to give the feature to the original GCN since there's no good reason the hardware couldn't do it ...

People might be wanting it for uses outside of gaming.

fatslob-:O said:
You might have a point with HDMI 2.0 and maybe even HEVC (an open alternative like AV1 which is getting support from the biggest companies could supplant it in the end) ...

Lower-end GPUs are more attractive HTPC solutions. They should support the latest and greatest standards.

No one wants a loud, power hungry heap of crap like a Radeon RX 580 in a HTPC.

And there is more that newer GCN parts support that GCN 1.0 doesn't. I was merely using a few examples.

fatslob-:O said:

I bet we won't ever see the fruits until software developers start using these hardware features ... (You're right that AMD needs to invest, but it's not in the hardware like you think, it's in the games.)

It's both.

fatslob-:O said:

How would you know that ? Do you have any data to back up that claim ? (precision issues are handled by the developers if there's going to be concerns about graphical quality compared to last time)

Reduced precision can have an impact on image quality.
I have already gone to great lengths elaborating on FP16 and its impact on potential image quality in other threads, so I would rather not have to repeat that again here.

fatslob-:O said:

That's changing with their Metal gfx API (which also has FP16 support) gaining traction with Windows ports.

iOS will never be a high-end gaming platform.
Mac doesn't have the market penetration to even be remotely relevant in the gaming industry.

Anyway. I think we have beaten this discussion to death now. The horse is dead.



--::{PC Gaming Master Race}::--

Pemalite said:
vivster said:
The i9 7900X seems like a nice upgrade to a 6850K.

Depends.

The i7 6850K is half the price of the i9 here. Uses less power. (Load and Idle).
And in lightly threaded applications there is a minimal performance gain.

http://www.anandtech.com/bench/product/1905?vs=1728

So unless you have a use for the extra 4x CPU cores. I wouldn't bother upgrading.

Now if you are on a Westmere, Thuban, Bulldozer Hex/Octo... That is an entirely different story.

Should've looked at computerbase benchmarks first. In my relevant applications the 7900X is about 3-5% better than the 6850K. That's not enough to upgrade. Though it could still do noticeably better when I consider that my numerous side applications will be spread over more cores, which leaves more resources for more intensive stuff like gaming.

So overall it might be a 10% increase, maybe, when playing RL with numerous other applications.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Time for the news:

 

SALES & "SALES"/DEALS

There's a new Humble Bundle: the Humble micro Jumbo Bundle! It's a collection of low-priced indie games. Also, if you enter your email, you can get two of those games for free!

 

Now let's take a look at Steam's Midweek Madness sales:

 

SOFTWARE

-empty-

 

MODS/EMULATORS

New screenshots released for Half-Life’s 1999 demo recreation, Black Mesa: Uplink Redux
http://www.dsogaming.com/news/new-screenshots-released-for-half-lifes-1999-demo-recreation-black-mesa-uplink-redux/
Modder ‘Hezus’ is currently working on Black Mesa: Uplink Redux and released some brand new screenshots for it. Black Mesa: Uplink is a recreation of the Half-Life demo (Half-Life: Uplink) released by Valve in early 1999, which featured content that was scrapped from the original storyline.

 

Warcraft: Total War alpha mod brings Azeroth to the turn-based battlefield
http://www.pcgamer.com/warcraft-total-war-alpha-mod-brings-azeroth-to-the-turn-based-battlefield/
Earlier this year, hobbyist modder Eoghan Wolfkin released an alpha state, work-in-progress Total War total conversion mod that brought World of Warcraft to Total War by way of the latter's Medieval 2 instalment. Posted in Total War Center in March, Wolfkin stressed that their mod was incomplete but could be used as a "resource" for future Warcraft mods. It's now made its way onto ModDB.

>>Here is its ModDB page, if you're interested.

 

GAMING NEWS

Sonic Mania will feature a competitive multiplayer mode
http://www.dsogaming.com/news/sonic-mania-will-feature-a-competitive-multiplayer-mode/
SEGA has announced that Sonic Mania will feature a competitive multiplayer mode. In order to showcase this mode in action, the team released a new trailer dedicated to it that can be viewed below.

 

Tempest 4000 is coming to the PC this Holiday season
http://www.dsogaming.com/news/tempest-4000-is-coming-to-the-pc-this-holiday-season/
Atari today announced Tempest 4000, a visually stunning, action-packed shooter based on the classic arcade game, Tempest, will be available this holiday season on the PC. Developed by legendary game designer Jeff Minter, Tempest 4000 remains faithful to the original fast-paced gameplay while adding exciting new features and updated graphics.

 

Ni no Kuni II: Revenant Kingdom – Official PC System Requirements
http://www.dsogaming.com/news/ni-no-kuni-ii-revenant-kingdom-official-pc-system-requirements/
Bandai Namco has revealed the official PC requirements for Ni no Kuni II: Revenant Kingdom. According to the specs, PC gamers will need at least an Intel Core i5-4460 or an AMD FX-6300 CPU with 4GB of RAM and an NVIDIA GeForce GTX 750Ti or an AMD Radeon R7 260x graphics card.

>>There's another Ni no Kuni II news article, in the next post.

 

Valve Announces Artifact – DOTA 2 Card Game
http://www.dsogaming.com/news/valve-announces-artifact-dota-2-card-game/
During tonight's The International, Valve satisfied everyone's curiosity and announced that they will be releasing a new digital DOTA 2 card game called Artifact. Artifact will feature most things that DOTA does. So, the game will have lanes as well as characters, and there will also be cards for pushing waves of minions. Valve has released a teaser trailer for the game and it can be found below.

>>There's a thread about this new game, made by guess who... yes, Shikamo.

 

Gearbox reveals its new FPS/card game hybrid, Project 1v1
http://www.dsogaming.com/news/gearbox-reveals-its-new-fpscard-game-hybrid-project-1v1/
Gearbox has just unveiled a new game it’s been working on. This game, called Project 1v1, is a competitive first-person shooter that promises to combine the action of fast-paced 1v1 first-person combat with the metagame strategy of a collectible card game.

 

Turn 10 and Microsoft reveal the fourth list of cars for Forza Motorsport 7
http://www.dsogaming.com/news/turn-10-and-microsoft-reveal-the-fourth-list-of-cars-for-forza-motorsport-7/
Turn 10 Studios and Microsoft revealed the fourth list of cars that will be featured in Forza Motorsport 7. This week, Turn 10 revealed 102 American cars and trucks from as early as 1970. PC and Xbox One racers will be able to drive cars such as Ford Shelby GT350R, Chevrolet Camaro, Dodge Viper, Ford Fiesta and Focus, as well as Jeep Grand Cherokee.

 

NBA 2K18 – First gameplay footage shown in this behind the scenes video
http://www.dsogaming.com/videotrailer-news/nba-2k18-first-gameplay-footage-shown-in-this-behind-the-scenes-video/
2K Games has released a new behind-the-scenes video for NBA 2K18, showing the first gameplay footage from it. This video reveals major updates to the appearance of uniforms and the players themselves for the most true-to-life visuals yet.

 

Rumor: Rez Infinite Possibly Coming to PC? [UPDATE]
http://www.dsogaming.com/news/rumor-rez-infinite-possibly-coming-to-pc/
Enhance Studios posted a gif on their Twitter page not too long ago, teasing fans about a possible PC port, leaving us with just two digits and no explanation – 8.9.~
UPDATE: The game has just been released on Steam!



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

The news, chapter two:

 

Shinji Mikami recommends playing The Evil Within 2 on 'Casual' setting
http://www.pcgamer.com/shinji-mikami-recommends-playing-the-evil-within-2-on-casual-setting/
Last week we learned that The Evil Within 2 will push its psychological horror leanings "much harder" than its predecessor, and now series mastermind Shinji Mikami has recommended playing on the game's 'Casual' setting.

 

Ni No Kuni 2 special editions revealed, season pass confirmed
http://www.pcgamer.com/ni-no-kuni-2-special-editions-revealed-season-pass-confirmed/
Ni No Kuni 2: Revenant Kingdom won't make its previously-planned 2017 release, but that hasn't stopped publisher Bandai Namco from announcing three planned release editions, a season pass—yes, there will be one of those—and a preorder bonus, even though the game isn't actually available for preorder on PC just yet.

 

Overwatch Summer Games skins revealed
http://www.pcgamer.com/overwatch-summer-games-skins/
The Overwatch Summer Games seasonal event is live. The now-annual event launched today, bringing with it a pile of new Olympic and summer skins, unlocking last year's Summer Games loot, and bringing back the Rocket League-like game mode Lúcioball.

 

Heroes of the Storm's controversial Hanamura map is being removed and reworked
http://www.pcgamer.com/heroes-of-the-storms-controversial-hanamura-map-is-being-removed-and-reworked/
Heroes of the Storm players have not been happy with Hanamura. The Overwatch-themed map where players must escort payloads in order to damage the enemy core was introduced in April of this year, but quickly drew the ire of nearly the entire HotS community.

 

A new Destiny 2 trailer shows off the chaos of competitive PvP
http://www.pcgamer.com/a-new-destiny-2-trailer-shows-off-the-chaos-of-competitive-pvp/
A new Destiny 2 trailer hit today, this time with a focus on the competitive PvP modes. While it offers few revelations for those familiar with the original, it's still an energetic montage (with some seriously bad music) that condenses the action of the 'Crucible' (which is what Destiny calls its multiplayer) into a couple of minutes. For those unfamiliar with the game, the class breakdown is pretty helpful—and I never tire of seeing the agile Hunter class wield a hand cannon. I get the impression that with mouse and keyboard they're going to be a nuisance on PC, but maybe Bungie's already on the case. There's some heavy super usage in there too, if you've yet to see those.

>>And after watching that, you can go and take a bath while enjoying, wait for it, the Destiny 2 scented candles!... I have no words.

 

No Man's Sky expansion 'Atlas Rises' will launch this week for free
http://www.pcgamer.com/no-mans-sky-expansion-atlas-rises-will-launch-this-week-for-free/
After weeks – or months? – of ARG shenanigans, it looks like Hello Games is gearing up to unveil the next major No Man's Sky update. Today they lifted the curtain just a tad, enough to reveal that update 1.3 is called Atlas Rises and that it will release some time this week.

 

PUBG weekly update lets dead players view markers on mini and world maps
http://www.pcgamer.com/pubg-weekly-update-lets-dead-players-view-markers-on-minimap/
PlayerUnknown's Battlegrounds has deployed its 20th weekly patch to Test Servers ahead of live introduction tomorrow. With it, comes a typical host of bug fixes, some optimisation tweaks, and a world and mini-map adjustment designed to aid team play.

 

GTA Online gets rad machine gun-equipped truck, new Adversary Mode in latest update
http://www.pcgamer.com/gta-online-gets-rad-machine-gun-equipped-truck-new-adversary-mode-in-latest-update/
GTA Online's latest update adds a new variant to the game's Overtime Rumble Adversary mode, called Overtime Shootout. This is a turn-based version of the target range mode, where players alternately use their vehicles to try and land the highest score. Playing Overtime Shootout from now until August 14 will bag you double GTA$ and Rockstar Points.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Pemalite said:

Did you just seriously state that G-Sync is not patented?


Because this patent submission says otherwise.
But don't take my word for it.
https://www.google.com/patents/US8120621

G-Sync is also trademarked.
https://www.geforce.com/hardware/technology/g-sync

Hence the G-Sync "™".

I'm not sure whether G-Sync is patented or not but at the very LEAST it looks as if Nvidia DOESN'T have a monopoly on adaptive refresh rate technology ... (That is why the inclusion of adaptive refresh rate into HDMI was unopposed by Nvidia.) 

Pemalite said:

When nVidia acquired Ageia, they acquired all their patents, licenses, trademarks, technology, everything.

Do I really need to hunt down the patents for all this as well?

Ageia existed for a little over half a decade before being acquired by Nvidia, and they're just one of the many players out there in physics simulation, which is also a largely *solved* field, so the impact of software patents is minimized in that area ...

Pemalite said:

I cannot agree with this.

One of the reasons why AMD is so behind nVidia is because they are not overhauling their architectures. nVidia has been. Kepler and Maxwell were massive overhauls; Pascal, whilst built from Maxwell's base, is still a solid improvement. 
Volta should see a large shift as well.

Depends on what you define as 'overhaul' ... 

I think AMD has had lots of changes over time. GCN 3 had some invasive changes to the ISA where the microcode is totally incompatible with previous iterations, and GCN 5 (Vega) added tons of new hardware features ... (Vega could easily be described as the R600 of its time with its primitive shaders, tiled rasterizer, conservative rasterization modes, rapid packed math and other cruft.)

Pemalite said:

Delta Colour Compression is probably at its most important on lower-end cards, which are typically the most bandwidth-constrained pieces of hardware.

It would go a long way in making low-end GPUs more e-sports friendly.

It's not as if end users are going to directly use the functionality ... (there are other ways to save bandwidth, such as texture compression and Hi-Z) 

And no I'd argue it's high end GPUs that benefit the most from DCC considering they have the highest ALU/BW ratio ... 

Pemalite said:

Funny. I use it.

Besides, you are missing the point entirely and nitpicking. It's the fact that older hardware misses functionality of newer hardware. It really is that simple.

If it's never used it may as well not be there to begin with ... 

Pemalite said:

I am not talking about HDR.

That's technically what '10-bit colour' is, just HDR10 ... (10 as in 10-bit) 

Pemalite said:

People might be wanting it for uses outside of gaming.

Then demand that the devs have it programmed in whatever app that has 3D gfx ... (VSR isn't anything special either since it's just another method for supersampling.)

Pemalite said:

Lower-end GPUs are more attractive HTPC solutions. They should support the latest and greatest standards.


No one wants a loud, power hungry heap of crap like a Radeon RX 580 in a HTPC.

And there is more that newer GCN parts support that GCN 1.0 doesn't. I was merely using a few examples.

Then just get an RX 550 (50 watts) or wait for Raven Ridge/Vega 12 and be done with it ...

AMD just isn't obligated to update their entire product stack every time a new microarchitecture releases ... (AMD is probably sticking with Vega as a new baseline since it's their biggest change yet in terms of feature sets.) 

Pemalite said:

Reduced precision can have an impact on image quality.

I have already gone to great lengths elaborating on FP16 and its impact on potential image quality in other threads, so I would rather not have to repeat that again here.

Or it could have none of the side effects once we consider that developers can control it ... (the only reason half precision got a bad rep was because there were no good tools to use it in the past, but that's changed now) 

Impacting image quality is not a real argument against using FP16 since the above is in play ... 

Pemalite said:


iOS will never be a high-end gaming platform.

Mac doesn't have the market penetration to even be remotely relevant in the gaming industry.

Anyway. I think we have beaten this discussion to death now. The horse is dead.

But Mac IS relevant in the gaming industry, since lots of developers are willing to port games using the Metal gfx API ... 



fatslob-:O said:

I'm not sure whether G-Sync is patented or not but at the very LEAST it looks as if Nvidia DOESN'T have a monopoly on adaptive refresh rate technology ... (That is why the inclusion of adaptive refresh rate into HDMI was unopposed by Nvidia.) 

Did the link not work? I just provided information regarding the patent.

fatslob-:O said:
Ageia existed for a little over half a decade before being acquired by Nvidia, and they're just one of the many players out there in physics simulation, which is also a largely *solved* field, so the impact of software patents is minimized in that area ...

Ageia Physics is covered under patent #20050075849.
https://www.google.com/patents/US7895411

fatslob-:O said:

Depends on what you define as 'overhaul' ... 

I think AMD has had lots of changes over time. GCN 3 had some invasive changes to the ISA where the microcode is totally incompatible with previous iterations, and GCN 5 (Vega) added tons of new hardware features ... (Vega could easily be described as the R600 of its time with its primitive shaders, tiled rasterizer, conservative rasterization modes, rapid packed math and other cruft.)

GCN 3, whilst being a significant update, is still only an iterative update to Graphics Core Next.

Remember, GCN is extremely modular; large chunks of Graphics Core Next were actually untouched until Vega, half a decade later.

fatslob-:O said:

It's not as if end users are going to directly use the functionality ... (there are other ways to save bandwidth, such as texture compression and Hi-Z) 

And no I'd argue it's high end GPUs that benefit the most from DCC considering they have the highest ALU/BW ratio ...

It's the low-end GPUs with DDR3 memory or GDDR5 memory on a 64-bit bus that are the most desperate for bandwidth; they can even struggle with 720p gaming.
Playing around with a Radeon R7 240... Increasing the DDR3 RAM from 800 MHz to 1 GHz (25%) was a sizable increase in performance; it was the difference between Overwatch being unplayable... And Overwatch being playable.
That is the kind of gain Delta Colour Compression can bring. Not something to be shoved aside.

Rebadges suck.
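
For context, the bandwidth arithmetic behind that overclock, as a back-of-envelope Python sketch (the 128-bit bus width is an assumption based on the common DDR3 R7 240 configuration):

# Peak bandwidth = effective transfers per second x bus width in bytes.
def bandwidth_gb_s(mem_clock_mhz, bus_bits, pumps=2):
    # DDR memory transfers twice per clock, hence pumps=2.
    return mem_clock_mhz * 1e6 * pumps * (bus_bits / 8) / 1e9

print(bandwidth_gb_s(800, 128))   # 25.6 GB/s at stock
print(bandwidth_gb_s(1000, 128))  # 32.0 GB/s after the 25% bump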

fatslob-:O said:
If it's never used it may as well not be there to begin with ...


But they are used. Perhaps not in numbers that please you. But still used.
Heck, AMD found it useful enough to make an iterative update to True Audio with Polaris by bringing us True Audio Next.

fatslob-:O said:
That's technically what '10-bit colour' is, just HDR10 ... (10 as in 10-bit)

I am referring to colour depth: the amount of distinct colours that can be portrayed on screen.

HDR or High-Dynamic Range is a technology meant to capture details in the darkest and lightest parts of an image simultaneously.

You can have 10-bit colour and no HDR, you can have HDR and only 8-bit colour.
HDR10 is in reference to HDR AND 10-bit colour.
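
The arithmetic behind that distinction, as a quick Python sketch:

# Distinct colours = (levels per channel) ^ 3 for an RGB signal.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels/channel, {levels ** 3:,} colours")
# 8-bit:  256 levels/channel, 16,777,216 colours
# 10-bit: 1024 levels/channel, 1,073,741,824 colours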

Nor am I talking about the GPU's actual output anyway. I was referring to the video engine specifically, which doesn't support the necessary capabilities for HEVC, Main 10, Rec. 2020 etc.

https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
https://en.wikipedia.org/wiki/Rec._2020

fatslob-:O said:
Then demand that the devs have it programmed in whatever app that has 3D gfx ... (VSR isn't anything special either since it's just another method for supersampling.)

That is a messy way of doing things. It's much cleaner having such controls in a single unified place rather than spotty support between games.

Virtual Super Resolution is also done differently on AMD and nVidia's hardware.

nVidia for example implemented it as a shader program with a Gaussian blur filter on top for good measure.

AMD's implementation is superior. They have implemented it directly in hardware on the display controllers, resulting in no performance penalty.
The downside to AMD's approach however is hardware support. If the hardware doesn't support it, then you aren't getting it.

fatslob-:O said:

Then just get an RX 550 (50 watts) or wait for Raven Ridge/Vega 12 and be done with it ...

AMD just isn't obligated to update their entire product stack every time a new microarchitecture releases ... (AMD is probably sticking with Vega as a new baseline since it's their biggest change yet in terms of feature sets.)

The RX 550 is a shit HTPC card.

It has no low-profile, single slot variants in wide availability here. Let alone passively cooled ones.

The HTPC crowd does tend to be attracted to a very specific set of GPUs you know.

fatslob-:O said:

Or it could have none of the side effects once we consider that developers can control it ... (the only reason half precision got a bad rep was because there were no good tools to use it in the past, but that's changed now) 

Impacting image quality is not a real argument against using FP16 since the above is in play ...

I have already gone into great depth on the extensive negative impacts FP16 has on image quality and the advantages it brings in regards to performance and power consumption before. So I would rather not go over it here again.

fatslob-:O said:
But Mac IS relevant in the gaming industry, since lots of developers are willing to port games using the Metal gfx API ...

The API is irrelevant.
Mac has always had a sizable number of developers willing to throw them their support, but the majority of games just don't end up on the platform.

Their market size in AAA gaming does make them irrelevant. That's never going to change. Just like fetch, it's just never going to happen.



--::{PC Gaming Master Race}::--

Pemalite said:

Ageia Physics is covered under patent #20050075849.

https://www.google.com/patents/US7895411

I thought we were talking about software patents?

Pemalite said:

It's the low-end GPUs with DDR3 memory or GDDR5 memory on a 64-bit bus that are the most desperate for bandwidth; they can even struggle with 720p gaming.

Playing around with a Radeon R7 240... Increasing the DDR3 RAM from 800 MHz to 1 GHz (25%) was a sizable increase in performance; it was the difference between Overwatch being unplayable... And Overwatch being playable.
That is the kind of gain Delta Colour Compression can bring. Not something to be shoved aside.

Rebadges suck.

The lowest-end discrete AMD GPU (Radeon 520) offers 48 GB/s of bandwidth. That gives you a peak 13.75 Flops/Byte ratio, which is very comparable to the PS4's 10.45 Flops/Byte ratio. The X1 on the other hand has a 19.26 Flops/Byte ratio, while the X1X will sport an 18.40 Flops/Byte ratio, so I do not believe for one second that low-end GPUs desperately need this DCC feature when they have a relatively healthy Flops/Byte ratio in comparison to the consoles, of which I'd argue the PS4 has a slightly overengineered memory system ... 

The RX Vega 64 liquid cooled will have a whopping 28.38 Flops/Byte ratio ... (It's pretty clear who needs this DCC feature and who doesn't and I bet AMD could've turned off DCC for Polaris 12 chips without ANY decrease in game performance.) 

Growth in ALU has outstripped BW, and it affects the high end the most since they'll arguably be the first ones to experience the shift in bottlenecks, and HBM isn't going to save them either ... 
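
Those ratios fall straight out of peak FP32 throughput over peak memory bandwidth. A quick Python sketch using roughly the figures quoted above (exact ratios depend on which peak clocks you assume):

# Flops/Byte = peak FP32 throughput / peak memory bandwidth.
parts = {
    "Radeon 520":        (0.66,  48.0),   # TFLOPS, GB/s
    "PS4":               (1.84, 176.0),
    "Xbox One":          (1.31,  68.0),
    "Xbox One X":        (6.00, 326.0),
    "RX Vega 64 Liquid": (13.7, 483.8),
}
for name, (tflops, gb_s) in parts.items():
    print(f"{name}: {tflops * 1000 / gb_s:.2f} Flops/Byte")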

Pemalite said:


I am referring to colour depth: the amount of distinct colours that can be portrayed on screen.


HDR or High-Dynamic Range is a technology meant to capture details in the darkest and lightest parts of an image simultaneously.

You can have 10-bit colour and no HDR, you can have HDR and only 8-bit colour.
HDR10 is in reference to HDR AND 10-bit colour.

Nor am I talking about the GPU's actual output anyway. I was referring to the video engine specifically, which doesn't support the necessary capabilities for HEVC, Main 10, Rec. 2020 etc.

https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
https://en.wikipedia.org/wiki/Rec._2020

That's exactly what HDR10 aims to provide: more colour depth. With HDR10 the colour bit depth per channel increases from 8 bits to 10 bits (1,024 levels per channel), which consequently increases the colour space, contrast and peak luminance ... (You need not worry about whether GPUs can support a higher colour depth, since practically any GPU that supports the D3D FL 10_0 feature set can output 16 bits per channel for render targets. As for video engines, that may be true, but it only matters for playback of compressed video formats, and by then AMD will have a permanent solution in place with Vega 12 and Raven Ridge giving 10-bit HEVC across their full stack long before consumption of HDR10 content and displays takes off.) 

Pemalite said:


That is a messy way of doing things. It's much cleaner having such controls in a single unified place rather than spotty support between games.


Virtual Super Resolution is also done differently on AMD and nVidia's hardware.

nVidia for example implemented it as a shader program with a Gaussian blur filter on top for good measure.

AMD's implementation is superior. They have implemented it directly in hardware on the display controllers, resulting in no performance penalty.
The downside to AMD's approach however is hardware support. If the hardware doesn't support it, then you aren't getting it.

Actually, I'd prefer it if the feature was supported in the app layer rather than the driver layer for reasons such as compatibility and content handling. Plus, a driver interfering with app logic, as VSR does, defeats the spirit of explicit APIs such as DX12 and Vulkan ... (specialization is the way to go for high end) 

I doubt AMD would waste silicon on a feature that could easily be done in software and is just another form of dynamic resolution technology. Face it, there's absolutely no good reason why VSR couldn't be supported on the original GCN architecture. AMD's just being negligent since they don't think it's worth adding support for 5 families of chips, of which the strongest one (Tahiti) was falling behind in newer games ...
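
As a sketch of what "just another form of supersampling" means in practice: render at a higher resolution, then resolve down to native. A plain box filter in numpy stands in for the resolve step here (real implementations, like NVIDIA's, use fancier filtering):

import numpy as np

def resolve(frame, factor):
    # Average each factor x factor block: the supersampling resolve.
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Stand-in for a frame rendered at 4x the pixel count of 1080p.
hi_res = np.random.rand(2160, 3840, 3).astype(np.float32)
native = resolve(hi_res, 2)
print(native.shape)  # (1080, 1920, 3)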

Pemalite said:


The RX 550 is a shit HTPC card.


It has no low-profile, single slot variants in wide availability here. Let alone passively cooled ones.

The HTPC crowd does tend to be attracted to a very specific set of GPUs you know.

If that specific set happens to obsess over the smallest possible form factor, then they might as well forgo discrete GPUs altogether and just buy Kaby Lake based NUCs, or whatever AMD rolls out with Raven Ridge, since those systems give customers the most features for a given space ... (the only good thing the GT 1030 offers is a low-cost option for old-system users who don't want to buy a new system altogether, but nothing stops single-slot RX 550s from being equally valid solutions even if they have a fan)

The HTPC crowd only cares about the features and the form factor, not the cooling solution ... 

Pemalite said:


I have already gone into great depth of the extensive negative impacts FP16 has on image quality
and the advantages it brings with it in regards to performance and power consumption before. So I would rather not go over it here again.

Well sure, but that doesn't mean those impacts automatically materialize like you insinuate ... 

Pemalite said:


The API is irrellevant.

Mac has always had a sizable amount of developers willing to throw them their support, but the majority of games just don't end up on the platform.

Their market size in AAA gaming does make them irrellevant. That's never going to change. Just like fetch, it's just never going to happen.

Not really, it's very relevant, since the API can affect game engine design, and I believe the stars are aligning for Apple ever since they rightfully ditched OpenGL for Metal ... (I think Metal is an incentive to port AAA games to Mac since OpenGL sucks so badly, and that's one reason devs didn't want to deal with Mac in the first place when DirectX ended up being better by design.) 



Waiting is boring. Someone tell me something new about Volta and Navi.



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

After the Acer announcement, this was bound to happen:

 

ASUS Now Also Delays its 4k Swift PG27UQ Ultra HD G-Sync Monitor to 2018
http://www.guru3d.com/news-story/asus-now-delays-its-4k-swift-pg27uq-ultra-hd-g-sync-monitor.html
Last week we reported that Acer has pushed back its 4K HDR G-Sync Predator screen to 2018. Something is going on alright, as now ASUS as well is pushing back the release of the ASUS ROG Swift PG27UQ.

Both monitors have nearly the same specs and use the same panel, so the panel itself seems to be the reason. The panel in question is an AU Optronics M270QAN02.2 AHVA IPS panel. These puppies offer a 3840×2160 resolution at a 144 Hz refresh rate and an LED backlighting system with 384 zones.

 

Sorry viv. I know you were interested in the ASUS monitor.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.