Straffaren666 said:
I believe we both want to see a powerful PS5. I can understand the gloom, though, and I suspect it depends a lot on expectations. I didn't expect Navi to be a Turing killer; AMD is/was just too far behind nVidia for that. So compared to RTX, Navi is somewhat of a disappointment, since it doesn't seem to beat an eight-month-old architecture, not even with a process node advantage. However, compared to Vega it clearly seems to be a step forward and indicates the entire PS5 GPU will be beefed up, not just some parts of it, and combined with the performance/watt improvements it tickles my mind to see how far they will be able to push the PS5 GPU.
Unfortunately, AMD is very vague about Navi/RDNA and I suspect there is a lot of misinformation going around, even from usually reliable sources like AnandTech. For instance, I've not found any indication that the IPC improvement is as high as 25%, which would be quite remarkable. AMD claims a 1.25x performance/clock improvement compared to GCN, but that's a different metric than IPC. So either the journalists at AnandTech spoke to AMD at Computex and the IPC improvement happens to coincide with the 1.25x performance/clock figure, or their information isn't based on facts but (at least in part) on incorrect assumptions.
What is more worrisome is that AMD doesn't disclose which GCN-based card they used to arrive at the 1.25x performance/clock improvement. In a footnote they state it's the geometric mean over a benchmark suite of 30 games rendered at 4K with 4xAA. If they compared Navi to a Vega card, then a 1.25x performance/clock improvement is exceptionally good, but it's more likely they compared it to a midrange Polaris card, in which case a big portion of the improvement simply comes from Navi's increased memory bandwidth and not from the RDNA architecture. They are similarly vague about how they measured the 1.5x performance/watt improvement.
Is that compared to Vega on 14nm, Vega on 7nm, Polaris on 12nm or Polaris on 14nm? I believe we'll get a lot more info at E3, and until then there will probably be a lot of wild speculation, both optimistic and pessimistic. |
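For what it's worth, the geometric mean AMD cites in that footnote is easy to reproduce. The per-game ratios below are made-up placeholders purely for illustration; AMD only publishes the aggregate 1.25x figure, not per-game data:

```python
import math

def geomean(ratios):
    """Geometric mean of per-game performance ratios (e.g. Navi / GCN)."""
    return math.prod(ratios) ** (1.0 / len(ratios))

# Hypothetical per-game perf/clock ratios -- AMD's footnote covers 30 games
# at 4K with 4xAA, but no per-game numbers were released.
sample = [1.18, 1.31, 1.22, 1.27, 1.29]
print(round(geomean(sample), 3))
```

The geometric mean is the standard choice for averaging ratios, since it doesn't let one outlier title dominate the way an arithmetic mean would.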
If they compared it against Vega with Draw Stream Binning Rasterization and Primitive Shaders active... A 1.25x increase in IPC would indeed be exceptionally good.
But if it's word play like when AMD stated that Zen 2 was 30% better than Zen 1, but 15% of that was due to frequency... It's far less impressive.
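A quick back-of-the-envelope check of that Zen 2 example, treating total speedup as IPC gain times clock gain (a simplification, but it shows why the framing matters):

```python
# Rough multiplicative model: total speedup = IPC gain * clock gain.
# Numbers from the post: ~30% total for Zen 2 over Zen 1, ~15% from frequency.
total_gain = 1.30   # claimed overall performance uplift
clock_gain = 1.15   # portion attributable to higher clocks
ipc_gain = total_gain / clock_gain
print(f"Implied IPC gain: {ipc_gain:.3f}x")  # ~1.13x, not 1.30x
```

So if Navi's 1.25x figure bundles clock gains the same way, the architectural improvement alone would be considerably smaller.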
At 4K with 4xAA... Fillrate and bandwidth would certainly become a bit of a limitation for Polaris, so a 1.5x increase over those parts would probably be expected considering the gains to be had with GDDR6.
I think the gains are to be expected, though; it still keeps Vega as AMD's high-end offering and lets Navi fill out the lower price points... Which is something I have been asserting all along, despite people thinking it was going to be AMD's competitive return to the high-end.
JEMC said: Well, so far I'm pleased with the provided numbers. They're not great or mind-blowing, but they offer a nice performance increase over AMD's current parts. We have to remember that Navi is replacing AMD's current mid-range cards, the RX 480/580/590s, which competed with Nvidia's GTX 1060. It was never meant to be a high-end part. And yet we see the 5700 being compared with the RTX 2070! Even if the end product ends up within 10% of the Nvidia card in real-world benchmarks, it will already be a massive improvement over what AMD has now. And there's also an interesting question: this is the 5700 chip/card, will there be a 5800 part? By the way, regarding power consumption, there will be two cards that will use 180 and 225W: https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/ |
The 2070 is a mid-range part. An impressive one mind you. - Even though it's built on an older process.
However... The thing we need to keep in mind is that nVidia is potentially going to refresh its Turing lineup with faster GDDR6... A 15% or greater increase in bandwidth for Turing might be enough to keep nVidia slightly ahead.
https://www.tweaktown.com/news/65955/nvidia-expected-unveil-faster-geforce-cards-fight-navi/index.html
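Rough bandwidth math for such a refresh, assuming a 2070-style 256-bit bus and a bump from 14 Gbps to 16 Gbps GDDR6 (the 16 Gbps refresh figure is speculation from that report, not anything announced):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 2070-class card today: 256-bit bus, 14 Gbps GDDR6
current = bandwidth_gb_s(256, 14.0)   # 448 GB/s
# Hypothetical refresh with 16 Gbps GDDR6 chips
refresh = bandwidth_gb_s(256, 16.0)   # 512 GB/s
print(current, refresh, f"+{(refresh / current - 1) * 100:.1f}%")
```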
haxxiy said: 225W would be pretty bad. The card they've shown should have been 150W at most if they're willing to catch up with Nvidia. 180W and it's just back to the Polaris, pre-Turing status quo. |
225W would necessitate 2x 6-pin PCIe connectors or a single 8-pin... Or perhaps 2x 8-pin if AMD is going to give some extra headroom for overclocking and stuff.
I wouldn't say it's bad though... But it's not energy-sipping either.
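Sanity-checking that connector math against the standard PCIe power limits (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Spec-compliant PCIe power delivery limits, in watts
SLOT_W = 75        # motherboard PCIe slot
SIX_PIN_W = 75     # each 6-pin auxiliary connector
EIGHT_PIN_W = 150  # each 8-pin auxiliary connector

def board_budget(six_pins=0, eight_pins=0):
    """Maximum spec-compliant board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(six_pins=2))    # 225 W -> exactly covers a 225 W card
print(board_budget(eight_pins=1))  # 225 W -> same budget with one 8-pin
print(board_budget(eight_pins=2))  # 375 W -> headroom for overclocking
```

Either layout lands exactly on 225W, which is why a 2x 8-pin variant would be the natural choice if any overclocking headroom is intended.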
Straffaren666 said: Yeah, I attended a GDC lecture where nVidia showed it off for Wolfenstein 2. They were able to halve the shading rate by using VRS when rendering in 4K and I couldn't see any noticeable artifacts. It was shown on a big projector and I sat 7-8 meters away, so there might have been artifacts that I missed. I don't know how well it scales to other types of games, but I couldn't think of anything special that would make it more suitable for just Wolfenstein. Looks like a very promising technique. |
Would love to go to a show like that, living in the country doesn't afford me such opportunities though. Jealous.
eva01beserk said: So much negativity and we don't even know the prices yet. AMD just trounced Intel, can't we have a little more faith? Just 3 more weeks till E3 and we'll get the final details. |
A lot of people had unrealistic expectations that this would somehow be AMD's return to the high-end, which was never Navi's goal... And because those expectations weren't met, there is a bit of a kerfuffle around various circles.
Dulfite said: I'd ask for more specifics, but I suspect we can't really do benchmark comparison tests yet? In general, I'd like to know how much better. My PC is about 4 years old now, though I've upgraded it a decent amount over the years (including the 980 Ti). Trying to decide whether I should continue to upgrade it, build a new PC in the nearish future (a few years from now), or just leave it as it is and simply get an Xbox for 3rd-party games and some of their 1st-party ones. |
Just upgrade your GPU in a few years, the CPU is still extremely capable and will be for a long time to come.
--::{PC Gaming Master Race}::--