
Radeon RX Vega revealed

Alby_da_Wolf said:
Bofferbrauer2 said:

So do I

That said, I fear we will have to wait a long while, at least on desktop PCs. AMD only just released the Bristol Ridge APUs on the consumer desktop market, so Raven Ridge certainly won't come out there until spring next year at the earliest. At least its mobile version is supposed to come out during the holiday season.

Yeah. The good news is that it looks like AMD decided to skip Polaris for the next APU GPUs and go directly to Vega, as it gives both AMD and users better performance and power consumption, and for AMD it probably means saving silicon for the same desired performance, or getting more performance from the same silicon, depending on the model.
Or maybe AMD will make us wait even longer, skipping Vega too, to use Navi and the new 7nm process and save even more silicon.

Polaris is used in Carrizo/Bristol Ridge; they didn't skip it.

No need to be so sarcastic though; pretty sure nobody at AMD expected Vega to be this late. And to me, the reason is actually not the chip itself but the drivers, because many of Vega's new features are still not unlocked in the driver.




Vega? I'll wait for the Vegeta model



“It appeared that there had even been demonstrations to thank Big Brother for raising the chocolate ration to twenty grams a week. And only yesterday, he reflected, it had been announced that the ration was to be reduced to twenty grams a week. Was it possible that they could swallow that, after only twenty-four hours? Yes, they swallowed it.”

- George Orwell, ‘1984’

fatslob-:O said:
Pemalite said:

nVidia does not have APUs. It's a marketing term strictly limited to AMD.
nVidia does not have x86 SoCs, as nVidia does not have an x86 license.

You don't exactly need an x86 patent, but you might need the patents for the x86 extensions for it to be useful ... (someone needs to remind me when the patents for x86-64 and SSE2 expire, as patents only last for two decades)

Nvidia has designed an x86 processor before, but so have tons of others ...

You need an x86 license that allows you to use x86 extensions.
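For a rough sense of the timing question raised above: US utility patents run 20 years from filing. Here's a minimal back-of-the-envelope sketch, assuming filing years around when each extension was published (SSE2 shipped with the Pentium 4 in 2000; AMD published the x86-64 spec in 2000), so the dates are assumptions, not actual patent records:

```cpp
#include <cstdio>

// Back-of-the-envelope patent expiry estimate.
// The filing years are assumptions based on when each ISA extension
// was published/shipped, not looked-up patent records.
int main() {
    const int term_years  = 20;    // US utility patent term from filing date
    const int sse2_filed  = 2000;  // assumption: around the Pentium 4 launch
    const int amd64_filed = 2000;  // assumption: around the x86-64 spec release

    std::printf("SSE2 patents expire around %d\n",   sse2_filed  + term_years);
    std::printf("x86-64 patents expire around %d\n", amd64_filed + term_years);
    return 0;
}
```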

fatslob-:O said:
It should be, but since graphics programmers are obsessed with achieving real-time physically based ray tracing, we probably won't see a resolution increase ...

I think we might still be a couple of generations away from that.
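To put a rough number on the resolution-versus-ray-tracing tension: the ray budget scales linearly with pixel count, frame rate, and rays per pixel. A quick sketch, where the rays-per-pixel figure is an illustrative assumption rather than a measured workload:

```cpp
#include <cstdio>

// Rough ray-budget arithmetic: rays/second needed to trace every pixel
// at a given resolution and frame rate. The rays-per-pixel count is an
// illustrative assumption (a primary ray plus a few bounce/shadow rays).
int main() {
    const double width  = 3840.0;  // 4k
    const double height = 2160.0;
    const double fps    = 60.0;
    const double rays_per_pixel = 10.0;  // assumption

    const double rays_per_second = width * height * fps * rays_per_pixel;
    std::printf("~%.1f billion rays/s required\n", rays_per_second / 1e9);
    return 0;
}
```

That's on the order of 5 billion rays per second just for 4k60 at a modest sample count, which is why resolution (or sample count) is the usual thing to sacrifice.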


fatslob-:O said:
Hopefully GCN is amenable to ISA extensions. It's almost as though Sony accounted for the fact that newer GCN microarchitectures would have changed microcode in their shader compiler; thus they got double-rate FP16 in the end with an ISA between GCN3/5 ... (It's good that Sony didn't lock their shader compiler down to a specific GPU ISA microcode, but they're still probably stuck with AMD GPUs that offer similar functionality)


Graphics Core Next is an extremely modular design anyway, which is why AMD has been able to make constant iterative updates to parts of the architecture with minimal effort/cost whilst leaving the rest of the architecture identical to its prior versions.
It's also why they can take parts of newer architectures and add them to older designs, as they did with the Xbox One X and Playstation 4 Pro.

Vega is supposed to be the largest deviation from prior Graphics Core Next designs, with massive backend overhauls.

So if Sony wanted to, they could take a theoretical Graphics Core Next 6/7 design and regress, say, the shaders to be compliant at a hardware level with Graphics Core Next 1.0.

But I believe that would be a silly approach to take. There are ways around such hardware incompatibilities using software-based approaches.
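As a toy illustration of such a software-based approach, the sketch below probes the GPU generation at runtime and picks a shader path accordingly; the generation enum and feature check are hypothetical placeholders, not a real driver API:

```cpp
#include <cstdio>

// Hypothetical GCN generation tag; in a real engine this would come from
// the driver or platform SDK, not a hard-coded value.
enum class GcnGen { Gcn1, Gcn2, Gcn3, Gcn4, Gcn5 };

// Placeholder feature check: double-rate packed FP16 arrived with Vega (GCN5).
bool supportsDoubleRateFp16(GcnGen gen) {
    return gen >= GcnGen::Gcn5;
}

void dispatchFp16Variant() { std::puts("using packed-FP16 shader variant"); }
void dispatchFp32Variant() { std::puts("using FP32 fallback variant"); }

int main() {
    const GcnGen gen = GcnGen::Gcn4;  // assumption: the platform reports GCN4
    if (supportsDoubleRateFp16(gen))
        dispatchFp16Variant();
    else
        dispatchFp32Variant();
    return 0;
}
```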

Bofferbrauer2 said:

1. The Xbox One X is still using Jaguar because it's an incremental update of the Xbox One. While I was hoping for Ryzen in the Xbox, I certainly didn't expect it. The next-gen consoles certainly won't use this Ryzen but a future generation of Ryzen; it will nonetheless be Ryzen, as AMD won't have a new architecture anytime soon.

Clearly then my statement wasn't directly applicable to you. But you cannot deny that there were overwhelming numbers of people fapping at the idea of Ryzen in the Xbox One X.

Bofferbrauer2 said:

4. Yes, Moore's Law applies to any kind of microchip. But at the time of the PS360's release, 1GiB of RAM was the usual amount on a gaming PC. I can still remember the outcry when it was revealed that the PS3 would only come with 256 MiB of RAM and how that wouldn't be enough to fuel Cell. The desaturated, gray-brownish colored games that followed are a testimony that this just wasn't enough even at its inception. The same happened when the current gen launched (though to a much lesser degree) and again with their upgrades.

The Playstation 3 didn't come with 256MB of RAM. It came with 512MB. Same with the Xbox 360.

Remember, the Xbox 360 released in 2005. Windows XP reigned supreme, and Pentium 4/Pentium D/Athlon XP/Athlon 64 X2 were the CPUs of choice.

Low-end gaming rigs back then came with 512MB of RAM, mid-range 1GB, high-end 2GB. And games of that era reflected that, with titles like Oblivion, Half-Life 2, and Doom 3 running on 512MB systems.

But that is all irrelevant. It isn't representative of what we have today.

Today a low-end gaming rig would come with 8GB of RAM. That's system RAM. Then you can add the gigabytes of GDDR5 memory on the graphics card to that pool too.

Bofferbrauer2 said:

You need more RAM for 4k? Really? Tell that to the console manufacturers, because the PS4 Pro doesn't have more RAM and the additional RAM on the X is eaten up by its gargantuan OS (5 GiB for the OS on a console? Really, Microsoft???). (Before you make a comment about it, yes, I know higher screen resolutions need more (V)RAM.)

Yes, really. Higher resolutions, higher-quality textures, more and higher-quality meshes, and so on require more RAM.

There are multiple reasons why the Playstation 4 Pro and Xbox One X cannot reliably achieve 4k. RAM capacity, memory bandwidth, CPU performance and GPU performance are all factors.
The Xbox One X has a massive increase in available memory for games over the Xbox One, Playstation 4 and Playstation 4 Pro for a damn specific reason.

And it's great that you recognize that higher screen resolutions need more VRAM, because VRAM and system RAM are the same on a console. ;)
Are you just arguing with me for the sake of arguing?

Bofferbrauer2 said:

The next-gen consoles won't be able to push 4k much better than the current upgrades; most games will just be upscaled or checkerboarded in some fashion unless they turn down the details compared to the PC releases.


I cannot in good conscience agree with this. If you think the capability of graphics hardware is going to remain constant for the next several years when there is industry-wide pressure to push for 4k, then you are highly mistaken.

Bofferbrauer2 said:

This is due to consoles having to choose mid-range graphics chips due to price and TDP constraints, and I doubt those will be much better at 4k than they are right now. Worse, as the consoles start to age they will inevitably drop back down to 1080p or even less in the most demanding third-party titles.

Correct. Consoles do have to choose cost-sensitive parts for obvious reasons.
However, today's high-end level of performance becomes tomorrow's mid-range, not always in sheer theoretical numbers like flops, but through the efficiency gains of reworked/new architectures that bring improvements such as compression and culling. Which is why the Radeon RX 480, despite having slightly fewer flops than the R9 390X, can beat it, with roughly 200mm2 shaved off the die size, two-thirds of the memory bandwidth, and 125W less power consumption.
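The theoretical side of that comparison is easy to reproduce: FP32 throughput for a GCN part is shader count × clock × 2, since each ALU can issue one fused multiply-add (two operations) per clock. A quick sketch using both cards' public spec-sheet numbers:

```cpp
#include <cstdio>

// Theoretical FP32 throughput for a GCN GPU:
// FLOPS = shader ALUs * clock * 2 (one FMA counts as two operations).
double tflops(int shaders, double clock_mhz) {
    return shaders * clock_mhz * 1e6 * 2.0 / 1e12;
}

int main() {
    // Public spec-sheet figures for each card.
    std::printf("RX 480:  %.2f TFLOPS, 150W, 232mm2, 256 GB/s\n", tflops(2304, 1266.0));
    std::printf("R9 390X: %.2f TFLOPS, 275W, 438mm2, 384 GB/s\n", tflops(2816, 1050.0));
    return 0;
}
```

Near-identical theoretical flops, yet the newer architecture wins in practice; the gap is all in the efficiency work.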

GDDR6 should help bring such capabilities to the mid-range. I don't think you fully comprehend how long 3 years is in GPU development and what it means for GPU gains, especially as we look towards 12nm/10nm/7nm/5nm fabrication processes.



--::{PC Gaming Master Race}::--

Bofferbrauer2 said:
Alby_da_Wolf said:

Yeah. The good news is that it looks like AMD decided to skip Polaris for the next APU GPUs and go directly to Vega, as it gives both AMD and users better performance and power consumption, and for AMD it probably means saving silicon for the same desired performance, or getting more performance from the same silicon, depending on the model.
Or maybe AMD will make us wait even longer, skipping Vega too, to use Navi and the new 7nm process and save even more silicon.

Polaris is used in Carrizo/Bristol Ridge; they didn't skip it.

No need to be so sarcastic though; pretty sure nobody at AMD expected Vega to be this late. And to me, the reason is actually not the chip itself but the drivers, because many of Vega's new features are still not unlocked in the driver.

They already use the GCN architecture, but the 3rd generation, the one before Polaris.
No sarcasm intended; I'm fine with AMD even when it's late, and it saddens me that the market didn't reward it as much as it deserved during the six years it constantly surpassed Intel on desktop CPUs.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


Alby_da_Wolf said:
Bofferbrauer2 said:

Polaris is used in Carrizo/Bristol Ridge; they didn't skip it.

No need to be so sarcastic though; pretty sure nobody at AMD expected Vega to be this late. And to me, the reason is actually not the chip itself but the drivers, because many of Vega's new features are still not unlocked in the driver.

They already use the GCN architecture, but the 3rd generation, the one before Polaris.
No sarcasm intended; I'm fine with AMD even when it's late, and it saddens me that the market didn't reward it as much as it deserved during the six years it constantly surpassed Intel on desktop CPUs.

This. Stoney Ridge, Bristol Ridge, and Carrizo use Graphics Core Next 1.3/3.0.
Polaris is Graphics Core Next 1.4/4.0.

Carrizo-L actually uses Graphics Core Next 1.2/2.0.

There will not be a Polaris-based APU.

The next APU, Raven Ridge, will be based on Graphics Core Next 1.5/5.0/NCU (aka Vega).



--::{PC Gaming Master Race}::--

thismeintiel said:

If the PS5 were to keep the same jump as previous gens, compared to the PS4's actual numbers, we would be looking at a jump to 55+ TFLOPS, with 96+ GB of RAM. Now, who seriously thinks that is going to happen? No one who is sane. We are going to get a jump in power more in line with the jump from PS3 to PS4, with a much smaller jump in RAM. 16GB is very realistic, with 32GB being the top realistic pick on a wishlist.

 

Just thinking of 96GB of RAM.  You might as well just start running servers at that point.  I'd say 16GB would be top.  16GB is pretty much fine for PCs these days.  32GB is overkill.



sethnintendo said:
thismeintiel said:

If the PS5 were to keep the same jump as previous gens, compared to the PS4's actual numbers, we would be looking at a jump to 55+ TFLOPS, with 96+ GB of RAM. Now, who seriously thinks that is going to happen? No one who is sane. We are going to get a jump in power more in line with the jump from PS3 to PS4, with a much smaller jump in RAM. 16GB is very realistic, with 32GB being the top realistic pick on a wishlist.

 

Just thinking of 96GB of RAM.  You might as well just start running servers at that point.  I'd say 16GB would be top.  16GB is pretty much fine for PCs these days.  32GB is overkill.

Consoles could use 16GB today thanks to their bloated OS. 5GB available for games is very limiting atm. The number one complaint today is blurry textures. However, next gen comes out in 2019 at the earliest, 2021 at the latest, and will stay relevant for another 6 or 7 years. 16GB in 2027 will be a big bottleneck again. 32GB should be the minimum.
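Taking the thread's own figures at face value (these are the posters' estimates, not official breakdowns), the game-visible budget works out as below:

```cpp
#include <cstdio>

// Game-available RAM = total RAM - OS reserve.
// All figures are the thread's own estimates, not official numbers.
int main() {
    struct Console { const char* name; double total_gb; double os_gb; };
    const Console systems[] = {
        {"PS4 / Xbox One", 8.0,  3.0},  // leaves the ~5GB for games cited above
        {"Xbox One X",     12.0, 5.0},  // the "5 GiB for the OS" figure quoted earlier
    };
    for (const Console& c : systems)
        std::printf("%-14s %5.1f GB total -> %4.1f GB for games\n",
                    c.name, c.total_gb, c.total_gb - c.os_gb);
    return 0;
}
```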



Pemalite said:
thismeintiel said:

Is there a reason you keep saying Navi is coming in 2018? It is being reported now that it is coming out in 2019, which is the year I expect the PS5, or at least its announcement. If somehow it does hit 2018, it'll be very late 2018 and in very limited quantities. 2018 is when they are going to be focusing on Vega 20 (which is being made on 7nm rather than 14nm+, so maybe that will be better for the PS5), or whatever that will be called, and the Vega 64 Pro Duo. And since Vega was delayed, what makes you think Navi won't be? I guess keep the hope alive.

Because of AMD's roadmaps. There are some fake roadmaps floating around, so stick to legitimate sources if you can.

http://www.anandtech.com/show/11404/amd-updates-gpu-architecture-roadmap-after-navi-comes-next-gen
http://www.anandtech.com/show/10145/amd-unveils-gpu-architecture-roadmap-after-polaris-comes-vega

And the reason why Navi won't be delayed is simple.

AMD has more than one team. One team was responsible for Vega, another for Navi. Just because one team might be slightly slower out of the gate doesn't mean the next one is.
nVidia has done the same thing in the past: one team was slower with one GPU release, the next team was on time, so you had a gap of less than ~12 months between entire product launches.

thismeintiel said:

Manufacturing costs always come down over time, even if not greatly after the first year or two. Of course, at first, a lot of the cost is the manufacturer trying to recoup R&D costs. AMD is not going to ignore those costs for Sony or MS, either.


Sure. They do. But there comes a point where you are better off overhauling your design.

You aren't going to get a Vega chip that costs only $100 in 3 years' time.

Haven't seen any reports saying they are fake. Either way, one of your articles is from over a year ago, so it is outdated. The first is newer, so more accurate. Care to point out where it says 2018? It just says sometime before 2020, with Navi's successor coming out in 2020, not 2019 as you stated earlier. And with Volta coming out, I would think AMD would want to correct all of the tech sites reporting that Navi isn't coming out until 2019.

And a $100 Vega for consumers? Nope. But a $100 Vega for a company wanting to buy millions of them? Most likely. At most, it'll be $150. Easily cheap enough to throw into a $399 box.



Pemalite said:

You need an x86 license that allows you to use x86 extensions.

A license is just an agreement between participating parties; what you want is access to the patents themselves ...

Pemalite said:


I think we might be a couple generations away from that still yet.

I think we're on the cusp of it, and I imagine Sony and Microsoft will collaborate with AMD on demands for hardware features in GCN that accelerate ray intersection tests ...
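For context on what "ray intersection tests" means in practice: the hot loop of any ray tracer is billions of checks like the classic ray-versus-bounding-box slab test below, which is exactly the kind of operation fixed-function hardware would accelerate. This is the textbook algorithm, not any vendor's actual implementation:

```cpp
#include <algorithm>
#include <cstdio>
#include <utility>

struct Vec3 { float x, y, z; };

// Classic slab test: does a ray (origin o, inverse direction invD) hit the
// axis-aligned box [lo, hi]? invD holds 1/direction per component so the
// inner loop needs only multiplies, which is what hardware likes.
bool rayHitsAabb(const Vec3& o, const Vec3& invD, const Vec3& lo, const Vec3& hi) {
    float tmin = 0.0f, tmax = 1e30f;
    const float os[3]  = {o.x, o.y, o.z};
    const float ids[3] = {invD.x, invD.y, invD.z};
    const float ls[3]  = {lo.x, lo.y, lo.z};
    const float hs[3]  = {hi.x, hi.y, hi.z};
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (ls[axis] - os[axis]) * ids[axis];
        float t1 = (hs[axis] - os[axis]) * ids[axis];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;  // slab intervals don't overlap: miss
    }
    return true;
}

int main() {
    const Vec3 origin{0, 0, 0}, invDir{1, 1, 1};  // ray along (1,1,1)
    const Vec3 lo{1, 1, 1}, hi{2, 2, 2};
    std::printf("hit: %s\n", rayHitsAabb(origin, invDir, lo, hi) ? "yes" : "no");
    return 0;
}
```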

Pemalite said:


Graphics Core Next is an extremely modular design anyway, which is why AMD has been able to make constant iterative updates to parts of the architecture with minimal effort/cost whilst leaving the rest of the architecture identical to its prior versions.

It's also why they can take parts of newer architectures and add them to older designs, as they did with the Xbox One X and Playstation 4 Pro.

Vega is supposed to be the largest deviation from prior Graphics Core Next designs, with massive backend overhauls.

So if Sony wanted to, they could take a theoretical Graphics Core Next 6/7 design and regress, say, the shaders to be compliant at a hardware level with Graphics Core Next 1.0.

But I believe that would be a silly approach to take. There are ways around such hardware incompatibilities using software-based approaches.

The most realistic software-based approach would be to just expose the microcode functionality as intrinsics, so that you don't introduce incompatibilities that break the API specs ...
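A hedged sketch of what exposing microcode functionality as an intrinsic can look like from the caller's side: the engine calls one named operation, and the shader compiler either lowers it to the native packed instruction (Vega's GCN ISA has v_pk_add_f16) or emits an equivalent scalar sequence, so nothing API-visible changes. The wrapper below is a hypothetical illustration, with floats standing in for FP16 lanes; it is not AMD's or Sony's actual interface:

```cpp
#include <cstdio>

// Two FP16 lanes; real hardware would pack these into one 32-bit register.
// Floats are used here as host-side stand-ins for half-precision values.
struct half2 { float lo, hi; };

// Hypothetical intrinsic wrapper. A shader compiler that recognises it on a
// Vega-class target would emit a single packed v_pk_add_f16 instruction;
// this portable fallback does two scalar adds instead, so callers observe
// identical behaviour either way and no API spec is broken.
half2 pk_add_f16(half2 a, half2 b) {
    return {a.lo + b.lo, a.hi + b.hi};
}

int main() {
    const half2 r = pk_add_f16({1.0f, 2.0f}, {0.5f, 0.25f});
    std::printf("%.2f %.2f\n", r.lo, r.hi);
    return 0;
}
```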



SvennoJ said:

Consoles could use 16GB today thanks to their bloated OS. 5GB available for games is very limiting atm.

I swear to god, the amount of RAM current-gen systems waste on the OS is ridiculous. Bloated is an understatement.