

Pemalite said:

But when they were competitive price/performance wise, they took marketshare, they just never maintained that momentum.
For example... AMD's marketshare pretty much imploded when they kept rebadging the Radeon 7000 series into the 8000/R200/R300 series.

Some highlights of AMD's marketshare were the Radeon 9000 series, x850 series and x1950 series. - They dropped a ton of marketshare with the Radeon 2900 series, deservedly so. - Clawed some back with the Radeon 5000 series, deservedly so. - But then the 6000 series was just a refinement and didn't push boundaries...
Radeon 7000 series was tarnished due to frame pacing drivers and a focus on compute.

Radeon RX 6000 series marketshare is at an all-time low because Ray Tracing and DLSS are overshadowing everything AMD has... Deservedly so.
But even when nVidia dropped the ball with the 4000 series, AMD had the potential to release a very solid, much higher clocked 12GB/14GB/16GB Radeon 7600 and obliterate nVidia... But didn't.

So historically, when AMD releases a solid product lineup (i.e. a top-to-bottom stack), their marketshare increases; they just never kept that going for a second generation, as nVidia's leaps (i.e. Maxwell) have been significantly more impressive as of late.

Going by the chart data, they kept their momentum up for around 5-6 GPU gens (Nvidia gens, not AMD), and then they just gave up, and it seems they gave up big time around the GTX 900 series, which is nearly a decade ago now, and that's a hella looooooong time in the tech space, let alone in console/hardware years.

Nvidia are technically fucking themselves atm by adding extras to their branding (like RTX or RTX IO, or how this gen's number line is completely muddled up), but they still hold general consumer mindshare, just not the hardcore dedicated crowd (since most of us seem to be aware that it's 4090 or nothing this gen).

I just don't see AMD bothering at this point tbh. I mean, look at the chart: they haven't cared about keeping anything up for nearly a decade now. That's just bad, dire even, to see AMD go on that long and not do shit about it.

If Intel doesn't do something, we're really going to get fucked hard over the next 5+ years.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"


I think AMD's current issue is they are resource-constrained due to AI hype in the data center. At least that's the only explanation I can think of as to why they basically skimped out on a generation that should be an easy win for them in virtually every area under a 4090. I don't think the industry was prepared for the AI boom to come so quickly, and because of this, AMD and Intel are trying to play catch-up against Nvidia in the datacenter.

You look at something like ROCm for CDNA. For like the past decade or so, that software stack has largely been a shit pickle that barely worked properly. In the past year or so, the advancements have been so quick that while it hasn't caught up to Nvidia's software stack quite yet, it is at a point where you could use it, and it's competitively good enough as long as you are using it with CDNA chips. AMD's and Intel's presentations over the last year have largely been about AI as well, similar to Nvidia's. You don't hear about it quite as much as with Nvidia because Nvidia loves their AI in the gaming space with DLSS, VSR, etc, while Radeon has largely ignored AI in gaming; but in the datacenter they have been getting increasingly focused on it, and rightfully so. That, along with continued advancements in Epyc to make sure Intel isn't catching up anytime soon, means that something has gotta give.

And what gave is Radeon. I think Radeon is waving the white flag this generation and not really fighting at all, because AMD needs to catch up in AI against Nvidia in the datacenter space. MI300X is going to come out like next year, while H100s have been selling in droves this year. Nvidia's roadmap says they will be releasing the next generation of their datacenter GPUs next year, so Radeon needs to start advancing quickly in that space. But in gaming it's a stark contrast to RDNA 2 and RDNA 1, where they were very competitive in their respective categories.

Last edited by Jizz_Beard_thePirate - on 30 July 2023

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Pemalite said:
hinch said:

Oof, Navi 31 on the 7900 GRE really got gimped with that 256-bit bus configuration. Should be $600 max imo.

Shows how far AMD/RTG and RDNA 3 are behind Nvidia with those specs vs their counterparts on Ada Lovelace. And they still chose to sell at over $650... what a joke. If it were $550 it would be a compelling buy vs a 4070, seeing as it has more VRAM and performs slightly better. But yeah, judging from this and the 7000 series so far, it doesn't bode well for the 7800/7700.

It's frustrating because nVidia sets the price and AMD tries to emulate it.

...When AMD could be undercutting nVidia, selling higher volume and taking more marketshare.

Yeah, it's also a bad look because they're doing the same thing as Nvidia with the naming shenanigans and keeping the status quo of high prices. Only they aren't really competitive with Nvidia this gen in the dGPU space when it comes to overall performance, efficiency and supported feature sets. Had they been aggressive with pricing on the RX 7000 series cards and marketed them right - "look, we're cheaper (by some margin) and offer way more VRAM" - people would sweep that up. Much like RDNA 2 is doing now in the lower end/mid range. But nope. They'd rather be greedy and have their cake and eat it too lol.



Chazore said:

Going by the chart data, they kept their momentum up for around 5-6 GPU gens (Nvidia gens, not AMD), and then they just gave up, and it seems they gave up big time around the GTX 900 series, which is nearly a decade ago now, and that's a hella looooooong time in the tech space, let alone in console/hardware years.

Nvidia are technically fucking themselves atm by adding extras to their branding (like RTX or RTX IO, or how this gen's number line is completely muddled up), but they still hold general consumer mindshare, just not the hardcore dedicated crowd (since most of us seem to be aware that it's 4090 or nothing this gen).

I just don't see AMD bothering at this point tbh. I mean, look at the chart: they haven't cared about keeping anything up for nearly a decade now. That's just bad, dire even, to see AMD go on that long and not do shit about it.

If Intel doesn't do something, we're really going to get fucked hard over the next 5+ years.

I blame a lot of AMD's woes on Graphics core next.

When you release the Radeon 7870 and then re-release it as the 8870 and then re-release it as the 270X and then re-release it as the 370X... You know you are just being shafted.

Or take the Radeon 7510 > R7 240 > R5 330 > R5 435 > Radeon 520... That is the same GPU being on the market for 5+ years.

They did get a *really* good return on investment out of it, as one design could last half a decade, but they lost revenue, profit and marketshare because of it.

Jizz_Beard_thePirate said:

I think AMD's current issue is they are resource-constrained due to AI hype in the data center. At least that's the only explanation I can think of as to why they basically skimped out on a generation that should be an easy win for them in virtually every area under a 4090. I don't think the industry was prepared for the AI boom to come so quickly, and because of this, AMD and Intel are trying to play catch-up against Nvidia in the datacenter.

You look at something like ROCm for CDNA. For like the past decade or so, that software stack has largely been a shit pickle that barely worked properly. In the past year or so, the advancements have been so quick that while it hasn't caught up to Nvidia's software stack quite yet, it is at a point where you could use it, and it's competitively good enough as long as you are using it with CDNA chips. AMD's and Intel's presentations over the last year have largely been about AI as well, similar to Nvidia's. You don't hear about it quite as much as with Nvidia because Nvidia loves their AI in the gaming space with DLSS, VSR, etc, while Radeon has largely ignored AI in gaming; but in the datacenter they have been getting increasingly focused on it, and rightfully so. That, along with continued advancements in Epyc to make sure Intel isn't catching up anytime soon, means that something has gotta give.

And what gave is Radeon. I think Radeon is waving the white flag this generation and not really fighting at all, because AMD needs to catch up in AI against Nvidia in the datacenter space. MI300X is going to come out like next year, while H100s have been selling in droves this year. Nvidia's roadmap says they will be releasing the next generation of their datacenter GPUs next year, so Radeon needs to start advancing quickly in that space. But in gaming it's a stark contrast to RDNA 2 and RDNA 1, where they were very competitive in their respective categories.

Considering CDNA is based on Graphics Core Next, which has been around for 10+ years, you would think the software stack would be very mature for that architecture by now.

In saying that, GCN was always stupidly good at compute, so that architecture is well positioned to capitalise on AI. But AMD is fumbling the software ball.

They just don't have the engineers to develop two architectures in tandem (even if RDNA is regressing and adopting VLIW-like design philosophies), plus the software stacks to go with them.

In saying that... I do prefer AMD's driver control panel over nVidia's; nVidia's reminds me of something from the 90s.

hinch said:

Yeah, it's also a bad look because they're doing the same thing as Nvidia with the naming shenanigans and keeping the status quo of high prices. Only they aren't really competitive with Nvidia this gen in the dGPU space when it comes to overall performance, efficiency and supported feature sets. Had they been aggressive with pricing on the RX 7000 series cards and marketed them right - "look, we're cheaper (by some margin) and offer way more VRAM" - people would sweep that up. Much like RDNA 2 is doing now in the lower end/mid range. But nope. They'd rather be greedy and have their cake and eat it too lol.

The Radeon 7600 is just as bad a product as the GeForce 4060.

It cannot soundly beat the Radeon 6650XT, it still only has 8GB of RAM, and it's more expensive... Compared to the vanilla 6600 it's a better product, but also $100 AUD more expensive, so the price/performance goal post never got shifted.

If they had released the 7600 on a 96-bit memory bus, they *could* have had 12GB of RAM with 250GB/s of bandwidth or more easily enough. - Throw in an extra 16-32MB of Infinity Cache to make up for it, come in at $199, and you would have sold GPUs like hotcakes.
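The arithmetic behind that claim is easy to sanity-check: GDDR6 peak bandwidth is just bus width (in bytes) times per-pin data rate, and capacity is chip count times chip density. A minimal Python sketch - the 21Gbps speed for the hypothetical card is an illustrative assumption, not a quoted spec:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps)
    return (bus_width_bits / 8) * data_rate_gbps

def capacity_gb(num_chips, chip_density_gbit):
    # Total VRAM (GB) = chip count * per-chip density (Gbit) / 8 bits per byte
    return num_chips * chip_density_gbit / 8

# Shipping RX 7600: 128-bit bus, 18Gbps GDDR6, 4x 16Gbit chips
print(bandwidth_gb_s(128, 18.0), capacity_gb(4, 16))  # 288.0 GB/s, 8.0 GB

# Hypothetical 96-bit 7600: 6x 16Gbit chips in clamshell, assumed 21Gbps GDDR6
print(bandwidth_gb_s(96, 21.0), capacity_gb(6, 16))   # 252.0 GB/s, 12.0 GB
```

So 12GB on a 96-bit bus does clear 250GB/s, provided the memory runs at 21Gbps or faster.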



--::{PC Gaming Master Race}::--

Pemalite said:


In saying that... I do prefer AMD's driver control panel over nVidia's; nVidia's reminds me of something from the 90s.

hinch said:

Yeah, it's also a bad look because they're doing the same thing as Nvidia with the naming shenanigans and keeping the status quo of high prices. Only they aren't really competitive with Nvidia this gen in the dGPU space when it comes to overall performance, efficiency and supported feature sets. Had they been aggressive with pricing on the RX 7000 series cards and marketed them right - "look, we're cheaper (by some margin) and offer way more VRAM" - people would sweep that up. Much like RDNA 2 is doing now in the lower end/mid range. But nope. They'd rather be greedy and have their cake and eat it too lol.

The Radeon 7600 is just as bad a product as the GeForce 4060.

It cannot soundly beat the Radeon 6650XT, it still only has 8GB of RAM, and it's more expensive... Compared to the vanilla 6600 it's a better product, but also $100 AUD more expensive, so the price/performance goal post never got shifted.

If they had released the 7600 on a 96-bit memory bus, they *could* have had 12GB of RAM with 250GB/s of bandwidth or more easily enough. - Throw in an extra 16-32MB of Infinity Cache to make up for it, come in at $199, and you would have sold GPUs like hotcakes.

It's $100 AUD more expensive right now, but its MSRP is actually down from the RX 6600's ($329 for the RX 6600, $269 for the 7600) and more in line with the RX 5600's (that one was OEM-only, but the 7600's price lies between the 5600 XT's and the 5500 XT's). The performance uptick of ~30% over the RX 6600 is also rather decent.

In other words, its price or performance isn't the problem, but rather its value proposition, since RX 6000 series prices dropped so much that they make the 7600 look expensive by comparison. And since there hasn't been a successor to the 6600XT yet, the 7600 must pull double duty and replace the 6600XT/6650XT for now, against which it's practically no upgrade at all, making its value look even worse.

12GB would have been nice, no question. But since the memory bus is already rather narrow, 96-bit could have been too low even with more Infinity Cache and extra-fast memory chips - though I think the problem then would have been that the price would have needed to increase quite a bit (more memory, more cache, faster VRAM chips), destroying its value proposition even more just to get to 12GB.

Instead, if AMD had opted for a 160-bit bus, they could have gone with 10GB and used slightly slower VRAM chips, balancing the price of the thicker board and extra memory against somewhat cheaper chips. 10GB might not look like a big increase, but it's definitely better and less limiting than just 8GB, and considering the performance, it should also be enough for the card.
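To put rough numbers on that 160-bit alternative (a sketch under assumptions: 16Gbps for the "slightly slower" chips is an illustrative pick, and capacities assume standard 16Gbit GDDR6 packages):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps)
    return (bus_width_bits / 8) * data_rate_gbps

# (name, bus width in bits, assumed data rate in Gbps, chips x 16Gbit -> GB)
configs = [
    ("RX 7600 as shipped", 128, 18.0, 4 * 16 / 8),  # 4 chips -> 8GB
    ("160-bit proposal",   160, 16.0, 5 * 16 / 8),  # 5 chips -> 10GB
]
for name, bus, rate, vram in configs:
    print(f"{name}: {vram:.0f}GB, {bandwidth_gb_s(bus, rate):.0f}GB/s")
# RX 7600 as shipped: 8GB, 288GB/s
# 160-bit proposal: 10GB, 320GB/s
```

Even with slower chips, the wider bus ends up with more capacity and more bandwidth than the shipping card.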

As for the driver control panel... yeah, NVidia's always makes me think I'm back on Win95; I wonder why they still stick to that carbon-dated design.

Last edited by Bofferbrauer2 - on 31 July 2023

Bofferbrauer2 said:
Pemalite said:


In saying that... I do prefer AMD's driver control panel over nVidia's; nVidia's reminds me of something from the 90s.

The Radeon 7600 is just as bad a product as the GeForce 4060.

It cannot soundly beat the Radeon 6650XT, it still only has 8GB of RAM, and it's more expensive... Compared to the vanilla 6600 it's a better product, but also $100 AUD more expensive, so the price/performance goal post never got shifted.

If they had released the 7600 on a 96-bit memory bus, they *could* have had 12GB of RAM with 250GB/s of bandwidth or more easily enough. - Throw in an extra 16-32MB of Infinity Cache to make up for it, come in at $199, and you would have sold GPUs like hotcakes.

It's $100 AUD more expensive right now, but its MSRP is actually down from the RX 6600's ($329 for the RX 6600, $269 for the 7600) and more in line with the RX 5600's (that one was OEM-only, but the 7600's price lies between the 5600 XT's and the 5500 XT's). The performance uptick of ~30% over the RX 6600 is also rather decent.

In other words, its price or performance isn't the problem, but rather its value proposition, since RX 6000 series prices dropped so much that they make the 7600 look expensive by comparison. And since there hasn't been a successor to the 6600XT yet, the 7600 must pull double duty and replace the 6600XT/6650XT for now, against which it's practically no upgrade at all, making its value look even worse.

12GB would have been nice, no question. But since the memory bus is already rather narrow, 96-bit could have been too low even with more Infinity Cache and extra-fast memory chips - though I think the problem then would have been that the price would have needed to increase quite a bit (more memory, more cache, faster VRAM chips), destroying its value proposition even more just to get to 12GB.

Instead, if AMD had opted for a 160-bit bus, they could have gone with 10GB and used slightly slower VRAM chips, balancing the price of the thicker board and extra memory against somewhat cheaper chips. 10GB might not look like a big increase, but it's definitely better and less limiting than just 8GB, and considering the performance, it should also be enough for the card.

As for the driver control panel... yeah, NVidia's always makes me think I'm back on Win95; I wonder why they still stick to that carbon-dated design.

But we need to compare it to products and their pricing on the market as they stand currently.
MSRP hasn't really held much meaning over the last several years.

96-bit would have been fine if they used faster memory; bandwidth wouldn't have moved much from where it is now if they paired a 96-bit bus with 22,400MT/s (22.4Gbps) DRAM.
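Checking that claim with the same bandwidth formula as before (22,400MT/s is 22.4Gbps per pin):

```python
# Shipping 7600: 128-bit @ 18Gbps vs hypothetical 96-bit @ 22.4Gbps
print((128 / 8) * 18.0)  # 288.0 GB/s
print((96 / 8) * 22.4)   # 268.8 GB/s, only ~7% below the shipping card
```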



--::{PC Gaming Master Race}::--

Pemalite said:
Bofferbrauer2 said:

It's $100 AUD more expensive right now, but its MSRP is actually down from the RX 6600's ($329 for the RX 6600, $269 for the 7600) and more in line with the RX 5600's (that one was OEM-only, but the 7600's price lies between the 5600 XT's and the 5500 XT's). The performance uptick of ~30% over the RX 6600 is also rather decent.

In other words, its price or performance isn't the problem, but rather its value proposition, since RX 6000 series prices dropped so much that they make the 7600 look expensive by comparison. And since there hasn't been a successor to the 6600XT yet, the 7600 must pull double duty and replace the 6600XT/6650XT for now, against which it's practically no upgrade at all, making its value look even worse.

12GB would have been nice, no question. But since the memory bus is already rather narrow, 96-bit could have been too low even with more Infinity Cache and extra-fast memory chips - though I think the problem then would have been that the price would have needed to increase quite a bit (more memory, more cache, faster VRAM chips), destroying its value proposition even more just to get to 12GB.

Instead, if AMD had opted for a 160-bit bus, they could have gone with 10GB and used slightly slower VRAM chips, balancing the price of the thicker board and extra memory against somewhat cheaper chips. 10GB might not look like a big increase, but it's definitely better and less limiting than just 8GB, and considering the performance, it should also be enough for the card.

As for the driver control panel... yeah, NVidia's always makes me think I'm back on Win95; I wonder why they still stick to that carbon-dated design.

But we need to compare it to products and their pricing on the market as they stand currently.
MSRP hasn't really held much meaning over the last several years.

96-bit would have been fine if they used faster memory; bandwidth wouldn't have moved much from where it is now if they paired a 96-bit bus with 22,400MT/s (22.4Gbps) DRAM.

But those memory chips are more expensive than the 18,000MT/s DRAMs, and you would have needed more of them for 12GB, so there's a substantial risk it would have made the card more expensive. I don't think AMD would have sold such a card for under $300 MSRP to compensate for the extra costs, which, considering its performance, would have hurt its value at least as much as going with just 8GB.

The problem is not directly MSRP; it's more that this time, last-gen GPUs had much bigger stocks due to the crypto craze. Normally, by this point after the launch of a GPU generation, all the old stock has long been sold, negating the problem at hand - but not this time.



Jizz_Beard_thePirate said:

And what gave is Radeon. I think Radeon is waving the white flag this generation and not really fighting at all, because AMD needs to catch up in AI against Nvidia in the datacenter space. MI300X is going to come out like next year, while H100s have been selling in droves this year. Nvidia's roadmap says they will be releasing the next generation of their datacenter GPUs next year, so Radeon needs to start advancing quickly in that space. But in gaming it's a stark contrast to RDNA 2 and RDNA 1, where they were very competitive in their respective categories.

Whoever was running the multiple wings at AMD (not Lisa, I'm talking below her) really, really let their fucking eyes get clouded, because AMD have catching up to do in multiple fields compared to what Nvidia is doing. And considering they've waved the white flag this gen, it tells us that next gen will be a repeat: Nvidia will just come up with something again that AMD needs to catch up on, or Nvidia will make some stupidly massive breakthrough that AMD doesn't want to follow (for stupid reasons, I guess), meaning they will have to find an alternative route (like with FSR), which will likely take them longer and have them falling behind again, thus raising yet another white flag.

I feel like some folks at AMD need to be let go, because this feels oddly like the Raja scenario, where someone is making the bad calls, far too late and at the worst possible time. It may be Lisa, it may be someone else, but someone is definitely not firing on all cylinders in their noggin this gen, and they need to be weeded out and removed asap.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Bofferbrauer2 said:
Pemalite said:

But we need to compare it to products and their pricing on the market as they stand currently.
MSRP hasn't really held much meaning over the last several years.

96-bit would have been fine if they used faster memory; bandwidth wouldn't have moved much from where it is now if they paired a 96-bit bus with 22,400MT/s (22.4Gbps) DRAM.

But those memory chips are more expensive than the 18,000MT/s DRAMs, and you would have needed more of them for 12GB, so there's a substantial risk it would have made the card more expensive. I don't think AMD would have sold such a card for under $300 MSRP to compensate for the extra costs, which, considering its performance, would have hurt its value at least as much as going with just 8GB.

The problem is not directly MSRP; it's more that this time, last-gen GPUs had much bigger stocks due to the crypto craze. Normally, by this point after the launch of a GPU generation, all the old stock has long been sold, negating the problem at hand - but not this time.

The 96-bit bus would have reduced those costs.

6x 16Gigabit chips would do it.
Or 3x 32Gbit chips, which Samsung has - and that's actually fewer chips than the current 7600 uses.
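For reference, the chip-count math behind those two options: each GDDR6 package sits on a 32-bit channel, so a 96-bit bus takes 3 packages, or 6 in clamshell mode (the 32Gbit parts are as claimed above; a quick sketch):

```python
# (config, number of chips, density per chip in Gbit)
options = [
    ("RX 7600 today (128-bit)",     4, 16),  # one 16Gbit chip per 32-bit channel
    ("96-bit clamshell",            6, 16),  # two 16Gbit chips per channel
    ("96-bit with 32Gbit packages", 3, 32),  # one chip per channel, fewer than today
]
for name, chips, density in options:
    print(f"{name}: {chips} chips x {density}Gbit = {chips * density / 8:.0f}GB")
# RX 7600 today (128-bit): 4 chips x 16Gbit = 8GB
# 96-bit clamshell: 6 chips x 16Gbit = 12GB
# 96-bit with 32Gbit packages: 3 chips x 32Gbit = 12GB
```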



--::{PC Gaming Master Race}::--

Pemalite said:
Bofferbrauer2 said:

But those memory chips are more expensive than the 18,000MT/s DRAMs, and you would have needed more of them for 12GB, so there's a substantial risk it would have made the card more expensive. I don't think AMD would have sold such a card for under $300 MSRP to compensate for the extra costs, which, considering its performance, would have hurt its value at least as much as going with just 8GB.

The problem is not directly MSRP; it's more that this time, last-gen GPUs had much bigger stocks due to the crypto craze. Normally, by this point after the launch of a GPU generation, all the old stock has long been sold, negating the problem at hand - but not this time.

The 96-bit bus would have reduced those costs.

6x 16Gigabit chips would do it.
Or 3x 32Gbit chips, which Samsung has - and that's actually fewer chips than the current 7600 uses.

Not sure a 96-bit bus would be enough cheaper to counteract the higher price of the extra, faster memory, hence my 10GB middle-of-the-road suggestion earlier. But I may be wrong.