Pemalite said:

It is as bad as I make it out to be.
The Radeon VII is packaged with far more expensive HBM2 memory... And despite being built at 7nm, it still consumes 40W+ more power during gaming.

Not an ideal scenario for AMD to be in... Which is why they couldn't undercut nVidia's already high-priced 2080 to lure gamers in. In short... It's a bad buy.

In saying that, I have to give credit where credit is due... Radeon VII is an absolute compute monster.

I'm not sure memory bandwidth is all that much of an issue with the Radeon VII? Maybe it has an overabundance of memory bandwidth? Honestly, all that bandwidth is probably more useful for machine learning frameworks like TensorFlow or PyTorch ... 
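The "overabundance of bandwidth" point can be made concrete with a back-of-envelope roofline calculation. The peak figures below are approximate published Radeon VII specs and the per-kernel intensities are illustrative guesses, not measurements:

```python
# Roofline-style sketch: is a GPU kernel compute-bound or bandwidth-bound?
# Peak figures are approximate Radeon VII specs, used purely for illustration.

PEAK_FLOPS = 13.8e12  # ~13.8 TFLOPS FP32
PEAK_BW = 1.0e12      # ~1 TB/s HBM2 bandwidth

def attainable_flops(arithmetic_intensity):
    """Roofline model: attainable throughput is capped either by compute
    or by memory traffic. arithmetic_intensity = FLOPs per byte moved."""
    return min(PEAK_FLOPS, PEAK_BW * arithmetic_intensity)

# Element-wise op (e.g. a float32 ReLU): ~1 FLOP per 8 bytes (read + write)
elementwise = attainable_flops(1 / 8)
# Large tiled matrix multiply: far more FLOPs per byte moved, say ~64
matmul = attainable_flops(64)

print(f"element-wise kernel: {elementwise / PEAK_FLOPS:.1%} of peak (bandwidth-bound)")
print(f"matmul kernel:       {matmul / PEAK_FLOPS:.1%} of peak (compute-bound)")
```

Low-intensity kernels, which ML training pipelines are full of, run straight into the bandwidth ceiling, which is why 1 TB/s matters more for TensorFlow/PyTorch workloads than for games.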

Pemalite said:

Things are always changing. The Xbox One has DirectX 12, and some developers use it and its features... But any serious developer will target the low-level APIs anyway.

The X1 runs a pretty customized version of DirectX altogether, so that work is definitely a no-go on PC. But all those serious developers don't want to lose Windows 7 compatibility either, so they delay making changes to their code bases, and AMD graphics hardware suffers for it ... 

Things need to change faster ... 

Pemalite said:

The Switch isn't as fixed as we think it is... Considering its plethora of performance states... But I digress. Maxwell is pretty easy to target anyway.

nVidia has most engines onboard... And this has been a long historical trend, hence their "nVidia, the way it's meant to be played" campaign, CryEngine, Unreal Engine, Unity... The list goes on.
They work closely with a lot of industry bodies, more so than AMD has historically done... Which has both its pros and cons.

It does mean that AMD is less likely to engage in building up technologies which are exclusive to their hardware.

It has different performance states, but it's still pretty much fixed hardware, just like how "boost clocks" on PC don't change the underlying hardware components. Rest assured, on a console like the Switch developers can afford to keep Maxwell-specific shader optimizations ... 

Nvidia may be able to spend more on optimizations, but they can't afford to control every engine developer out there. So more often than not, engine developers will prioritize consoles, and to a lesser extent AMD hardware as well, as we see with Snowdrop, Frostbite (despite RTX integration), RE Engine, Dunia, and ForzaTech ... (Nvidia literally cannot control every aspect of engine design, because there are thousands of employees across the industry contributing here) 

Pemalite said:

AMD does the same, hence why they cut off TeraScale support in their drivers a couple of years after they were still releasing TeraScale-based APUs.
There are obviously pros and cons to each company's approach.

I think you took that out of context. AMD's driver stack is nowhere near as extensive as Nvidia's ... (no way in hell does AMD spend the amount of time Nvidia does on upkeep)

An Nvidia 'driver' can feel like discovering a new operating system altogether compared to AMD's drivers. Nvidia maintains a nauseating number of hacks in their drivers just to keep their performance wins ... 

I don't think you understand just how much of a maintenance burden Nvidia's approach is compared to AMD's ... (keeping that entire evolving software stack up to date with new exotic hardware releases EVERY 15 MONTHS is utter insanity)

Pemalite said:

The recycling results in stagnation... It's as simple as that. AMD has stagnated for years, and nVidia stagnated when they were recycling hardware.
The other issue is... It's not a good thing for the consumer. When you buy a new series of GPUs, you are hoping for something new, not something old with a different sticker... It's far from a good thing.

It's really not, since AMD has to meet its end of its obligations to other partners, and I don't believe Nvidia has ever stagnated a whole lot, even with releases of less-than-desirable architectures ... 

It's for consumers to decide what they want, whether that's something totally new or "old with a different sticker" (a massive amount of hyperbole), so it's not our business to dictate what they truly desire ... 

Pemalite said:

Agreed. There is still room for things to become disrupted in the console space though, if IBM, Intel or nVidia etc. offer a compelling solution to Sony or Microsoft, but the chances of that are pretty slim to nonexistent anyway.
No one is able to offer such high-performing graphics with a capable CPU other than nVidia... And nVidia is expensive, meaning not ideal for a cost-sensitive platform.

I doubt Nvidia can meet the needs of either Sony or Microsoft, and Nvidia is the second-worst, if not the worst, CPU designer of all the major corporations ... (their ARM cores, just like their GPUs in the mobile space, guzzle a lot of power, and what's more, they are buggy as well)

Pemalite said:

nVidia does have some options. They don't need x86 or POWER to remain relevant; ARM is making inroads into the cloud computing/server space, albeit slowly.
I mean, ARM was enough of a serious threat that AMD even invested in it.
https://www.amd.com/en/amd-opteron-a1100

nVidia is also seeing substantial growth in the Datacenter environment with increases of 85% in revenue.
https://www.anandtech.com/show/13235/nvidia-announces-q2-fy-2019-results-record-revenue

So I wouldn't discount them just yet... They have some substantial pull.

Most of Nvidia's YoY revenue growth comes down to raising their prices ... 
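The price-versus-volume claim is easy to sanity-check with simple arithmetic: revenue is units times average selling price, so growth can be split into a volume term and a price term. The numbers below are hypothetical placeholders, not Nvidia's actual figures:

```python
# Back-of-envelope decomposition of revenue growth into price vs. volume.
# All input numbers are hypothetical, for illustration only.

def growth_decomposition(units_old, asp_old, units_new, asp_new):
    """Split year-over-year revenue growth into its volume and price parts.
    asp = average selling price. Returns (total, volume, price) growth rates."""
    rev_old = units_old * asp_old
    rev_new = units_new * asp_new
    total_growth = rev_new / rev_old - 1
    volume_growth = units_new / units_old - 1
    price_growth = asp_new / asp_old - 1
    return total_growth, volume_growth, price_growth

# Hypothetical: units up only 5%, but average selling price up 30%
total, vol, price = growth_decomposition(1_000_000, 400, 1_050_000, 520)
print(f"revenue +{total:.1%} = volume +{vol:.1%} compounded with price +{price:.1%}")
```

If most of the growth shows up in the price term, revenue looks healthy even while the customer base barely expands, which is the point being made here.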

They are not growing their customer base nearly as much as they used to. Without x86 or POWER, Nvidia has no future ... (they should've settled with Intel for x86 patents instead of cash, because now they're locked out of a race where only AMD and Intel may compete) 

ARM is only ever truly competitive in the mobile space and we all know how that turned out ... 

Pemalite said:

Indeed. Although parts like the MX110/MX150 got a TON of design wins in notebooks, which were devices that went up against AMD's Ryzen APUs and often had the advantage in terms of graphics performance.

Mobile is a very fickle space... You have Qualcomm, and that is it... Apple, Huawei, and Samsung all build their own SoCs, so there is very little market for nVidia to latch onto... I guess AMD made the right decision years ago to sell Adreno off to Qualcomm.

And even Chinese manufacturers like Xiaomi are entering the SoC game for their budget handsets... Meaning the position of the likes of MediaTek probably looks tenuous over the long term.

However, Tegra isn't done and dusted just yet; nVidia is seeing growth in vehicles, IoT and so on.

It was probably for the best that AMD sold off Adreno, because Nvidia figured out the hard way that just having good graphics technology is not enough to succeed. Having to design ARM cores AND modems was more than either AMD or Nvidia was willing to chew on ... (Qualcomm instills a massive amount of fear in even giants like Intel)

Tegra is done for, man. It was initially branded for mobile, but plans changed since Nvidia couldn't design LTE modems for crap like Qualcomm did, so Nvidia rebranded it for automotive. But if a struggling auto manufacturer like Tesla is able to competently source its own chip designs, then I have no doubt in my mind that Nvidia can easily be dumped in this sector as well ... 

IoT? Nvidia's solutions are even less compelling there, ever since they killed off Icera and stopped developing wireless technology like 5G ... 

Pemalite said:

Apple not only has impressive graphics technology... But equally as impressive energy efficiency.

Even their CPU cores tend to be extremely efficient... But they also have substantial performance ceilings; it's actually impressive what they achieve.

In saying that... They do own everything from top to bottom, so they are able to garner some efficiency advantages that Android just cannot match.

Except for wireless technology like modems (Qualcomm), image sensors/cameras (Sony), and flash memory and DRAM (Samsung) ... (a lot of corporations would kill to be in Apple's enviable position of strong control over their own ecosystem)

Apple is what you could call a semi-vertically integrated business, since they don't own all of the supply chain ... 

Pemalite said:

Intel's graphics have historically been shit as well.

Even when things played out in Intel's favour and it had optimized its graphics for games like Half-Life... They still trailed the likes of ATI/AMD/nVidia.

Even back in the late 90's/early 2000's I would have opted for an S3/Matrox part over an Intel solution... And that says something... And S3/Matrox were arguably more competitive back then!

But drivers are probably Intel's biggest Achilles' heel. They are investing more on that front... And they absolutely must if they wish to be a force in the PC gaming market.

I'd be okay with Intel graphics for low end gaming if their drivers weren't so bad ...

Pemalite said:

Haswell was a big step up, but still pretty uninspiring... Haswell's Iris Pro did manage to double the performance of AMD's Trinity mobile APUs in some instances... But you would hope so, with a chunky amount of eDRAM and without the TDP restrictions.

A large portion of Haswell's advantage in the integrated graphics space back then was also partly attributable to Intel's vastly superior CPU capability... Which is partly why the 5800K was starting to catch the Haswell Iris Pro, thanks to a dramatic uplift in CPU performance.

However, AMD then pretty much left Intel's decelerator graphics in the dust going forward... Not to mention the better 99th percentiles, frame pacing and game compatibility of AMD's solutions.

I would take Vega 10/Vega 11 integrated graphics over any of Intels efforts currently.

Haswell was a really good improvement IMO, since it was a first for Intel to come in with its own unique advantages compared to either AMD or Nvidia, and they managed to standardize some of that stuff in DX12 as well! (particularly their work on ROVs and conservative rasterization) 

Skylake went to a whole new level and Gen 11/Xe will only take that further ... 

Pemalite said:

They are working on it!
https://www.anandtech.com/show/14117/intel-releases-new-graphics-control-panel-the-intel-graphics-command-center

They have years worth of catching up to do, but they are making inroads... If anyone can do it though, Intel probably can.

Not exactly what I'm looking for. Intel needs a catchy graphics-optimized platform 'slogan' like 'GE' or 'TIMTBP', and I want Intel-specific graphics solutions like Nvidia's GameWorks library, with things such as HairWorks etc ... 

Pemalite said:

Depends on the 470... The 1060 is superior in every meaningful metric, with an advantage of upwards of 50%.
https://www.anandtech.com/bench/product/1872?vs=1771

Anandtech does need to update its benchmark suite... But even with dated titles like Grand Theft Auto V... That game is still played heavily by millions of gamers, so I suppose it's important to retain it for a while longer yet... Plus it's still a fairly demanding title at 4K, all things considered.

At the end of the day, a GeForce 1060 is a superior choice for gaming over a Radeon RX 470 or 570, unquestionably.

GTA V is an awful benchmark since it doesn't match up to the capabilities of modern hardware or modern games ... 

Pemalite said:

By then, the RX 580 and Xbox One X will be irrelevant anyway, with next-gen GPUs and consoles in our hands.

No point playing "what-ifs"; we can only go by the information we have today.

Touché!