
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
Captain_Yuri said:

Well the 6800XT is 72 CU while the 6900XT/XTX is 80 CU. That's why the 7900XT is a 7800XT in disguise: the 7900XT is 84 CU vs the 7900XTX's 96 CU.

The 4080 is a 4070 disguised as a 4080 in paper specs. But since AMD themselves are saying that the 7900XTX is a 4080 competitor, it is a "4080" in the sense that it will realistically compete against the 7900XT, if that makes any sense. Cause the 7900XTX will no doubt outperform the 4080, cause the 4080 is just too slow. But the 7900XT will most likely be a close match against the 4080.

So both companies are doing shitty things. Nvidia outright said fu and priced the 4080 at $1200 vs the $700 of the previous 3080, while AMD is being slimy by increasing the price to $900 and calling it a 7900XT when the previous price of the 6800XT was $650. If Nvidia doesn't lower the price of the 4080, it will certainly be the shittiest GPU of the year. It will be a worse buy than the Intel A770 imo.
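To put rough numbers on that, here's a minimal sketch of the comparison, using the CU counts and prices quoted above (the two $999 entries are my own assumed launch MSRPs for the top cards, not figures from the post):

```python
# CU counts and prices as quoted above; the two $999 MSRPs are assumptions.
cards = {
    "6800 XT":  {"cus": 72, "price": 650},
    "6900 XT":  {"cus": 80, "price": 999},   # assumed launch MSRP
    "7900 XT":  {"cus": 84, "price": 900},
    "7900 XTX": {"cus": 96, "price": 999},   # assumed launch MSRP
}

# How far each second-tier card sits below the top card of its generation.
gap_rdna2 = 1 - cards["6800 XT"]["cus"] / cards["6900 XT"]["cus"]    # ~10%
gap_rdna3 = 1 - cards["7900 XT"]["cus"] / cards["7900 XTX"]["cus"]   # ~12.5%
price_jump = cards["7900 XT"]["price"] / cards["6800 XT"]["price"]   # ~1.38x

print(f"6800 XT trails the 6900 XT by {gap_rdna2:.1%} of CUs")
print(f"7900 XT trails the 7900 XTX by {gap_rdna3:.1%} of CUs")
print(f"XT-tier price went up ~{(price_jump - 1):.0%} between generations")
```

So the CU gap between the XT and the top card widened a bit while the XT-tier price went up roughly 38%, which is the crux of the "it's really a 7800XT" argument.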

Ok, so because the difference in CUs between the two 7900 cards is bigger than the difference between the 6900 and the 6800s, you've deduced that the 7900XT should be a lower tier card than its name says, and that's why it should be the 7800XT. That's it, right?

I don't think it's as easy as that, because the RDNA architecture only has three gens, and AMD has changed the naming and how they've used the different designs in each one of them, unlike Nvidia, which has a more reliable history of which tier uses which chips. But I see your point.

Well it's not like Nvidia was giving out top end chips to the 80 class either:

1080: GP104
2080: TU104
3080: GA102
4080: AD103

Hell, by Nvidia's own history, the 4080 would still be considered a higher tier class than in the Pascal days. The only reason anyone cares about this is because Nvidia underestimated AMD and went Samsung, so they were forced to give out high end chips. Now that they are back to TSMC, they are going back to their old ways... At least in terms of the tier of chips, that is...
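A quick illustration of that chip-class argument (a rough sketch; the card-to-chip mapping comes from the list above, and the idea that a lower class number means a bigger die within a generation is the usual reading of Nvidia's codenames):

```python
# Chip used by each x80 card, per the list above. Within a generation, a lower
# class number (x02 < x03 < x04) generally means a bigger, higher-tier die.
xx80_chips = {
    "GTX 1080": "GP104",
    "RTX 2080": "TU104",
    "RTX 3080": "GA102",
    "RTX 4080": "AD103",
}

for card, chip in xx80_chips.items():
    die_class = int(chip[-3:])  # e.g. "GP104" -> 104
    tier = {102: "top gaming die", 103: "second-tier die", 104: "third-tier die"}[die_class]
    print(f"{card}: {chip} ({tier})")

# By that naming alone, the 4080's AD103 sits a class above the 104-class dies
# that the Pascal and Turing x80 cards used.
```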

Last edited by Jizz_Beard_thePirate - on 05 November 2022

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
QUAKECore89 said:

Just in time, I was reading this article. It sure is chaos going on!

https://videocardz.com/newz/amd-confirms-rx-7900-xtx-is-rtx-4080-competitor-fsr3-may-be-supported-by-pre-rdna3-architectures

LOL

Zen 4 Flop
RDNA 3 Flop

No such thing as a bad product, only a bad price.

JEMC said:

Ok, now I'm a bit lost. I know the difference in shaders with the Ada GPUs is so big that we've made those kinds of comments, but is the difference between the XTX and the XT that big? I know one has 12 CUs less than the other, but the full Navi 32 GPU allegedly has 60 CUs, putting the 7900XT in the middle of both full chips.

The shader output difference between the XT and XTX is about 24%. It should be a lower tier part.
It's not just CUs we are looking at here, but clock reductions too.
It does come with 55W less power usage, though, so that might be the make-it-or-break-it decider for a lot of people.
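As a rough sketch of how a figure in that ballpark falls out of CUs times clock (the clock speeds below are my own assumed example values, not from the post, and the result shifts depending on whether you use game or boost clocks):

```python
def relative_throughput(cus_a, clock_a_mhz, cus_b, clock_b_mhz):
    """Ratio of peak shader throughput (CU count x clock), card A over card B."""
    return (cus_a * clock_a_mhz) / (cus_b * clock_b_mhz)

# 96 CU XTX vs 84 CU XT, with assumed clocks (MHz):
print(relative_throughput(96, 2300, 84, 2000))  # ~1.31 with assumed game clocks
print(relative_throughput(96, 2500, 84, 2400))  # ~1.19 with assumed boost clocks
```

So depending on which clocks you plug in, the paper gap lands somewhere between the high teens and low thirties percent, which is noticeably more than the ~14% the CU counts alone would suggest.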

Really depends on how AMD fills out the rest of the product stack, I guess, but considering anything below is going to be cut back by more than 24% relative to the highest tier part, there could be some significant jumps between mid-range and high-end.

Either way, I don't care much about the name of something; I do care about price/performance. It could be called the Radeon 6200, but if it performs brilliantly, they will get my coin.



--::{PC Gaming Master Race}::--

Pemalite said:
Captain_Yuri said:

LOL

Zen 4 Flop
RDNA 3 Flop

No such thing as a bad product, only a bad price.

I wouldn't say that. Gigabyte had power supplies that were exploding/catching on fire. I wouldn't use one even if it were free. But I get your point though.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850



                  


Captain_Yuri said:
Pemalite said:

No such thing as a bad product, only a bad price.

I wouldn't say that. Gigabyte had power supplies that were exploding/catching on fire. I wouldn't use one even if it were free. But I get your point though.

They were still just badly priced. They could have been used for something other than powering equipment.
Like landfill filler.

To be honest, I haven't touched anything Gigabyte since the AM3 990FX days. They denied that their boards were flaky and had extreme vdroop, even though it was documented rather heavily in enthusiast forums with tons of evidence, essentially calling us all liars.

Ended up returning it for a full refund and stuck with ASUS ever since. Screw 'em.



--::{PC Gaming Master Race}::--


The Ryzen 7950X in 105W ECO mode has ~94% of the performance for ~61% of the power consumption, and runs over 30ºC cooler.

It even makes me wonder why AMD went for such inefficient clock speeds. Is it just to try to remain atop the charts? Or some sort of market game vs. Intel and the 7000 series itself, to make a hypothetical 105W 7950X3D look like that much more of a leap?
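For what it's worth, a back-of-the-envelope take on those figures (just the two ratios quoted above, nothing else assumed):

```python
# Efficiency gain implied by the ECO-mode figures quoted above.
perf_ratio = 0.94    # ~94% of stock performance
power_ratio = 0.61   # ~61% of stock power draw

print(f"Performance per watt: ~{perf_ratio / power_ratio:.2f}x stock")  # ~1.54x
```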



 

 

 

 

 

So, for those thinking about making a system upgrade this holiday (you know, treating yourselves to a present), the guys at TechPowerUp have been doing a series of articles where they compare two CPUs across a larger number of games than usual, while using an RTX 4090 to avoid GPU bottlenecks as much as possible.

So far there are only three of those articles, and only one of them uses one of the newly released CPUs, which means there's room for more of these articles. But at least they can help you decide what's better: a whole system upgrade or, if you're on an AM4 system, just a CPU upgrade.

Here are the links:

RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review: https://www.techpowerup.com/review/rtx-4090-53-games-ryzen-7-5800x-vs-core-i9-12900k/
     >>The 12900K is up to 16% faster at 1080p, 14% at 1440p and 6.5% at 4K

RTX 4090 & 53 Games: Ryzen 7 5800X vs Ryzen 7 5800X3D Review: https://www.techpowerup.com/review/rtx-4090-53-games-ryzen-7-5800x-vs-ryzen-7-5800x3d/
     >>The 5800X3D is up to 18.5% faster at 1080p, 15% at 1440p and 6.8% at 4K, making it faster than the 12900K

RTX 4090 & 53 Games: Core i9-13900K vs Ryzen 7 5800X3D Review: https://www.techpowerup.com/review/rtx-4090-53-games-core-i9-13900k-vs-ryzen-7-5800x3d/
     >>The 13900K is up to 6% faster at 1080p, 5% at 1440p and not even 1.5% at 4K

So far, these articles only reinforce the point that Yuri has made countless times: those on an AM4 system should go for a 5800X3D. But I hope they make more of these articles and include CPUs like the 13600K and the 7700X or 7600X, to cover processors at other price points.
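Gathering the quoted deltas in one place makes the trend obvious: the CPU advantage shrinks as the resolution rises and the RTX 4090 becomes the bottleneck. The numbers are exactly as summarized above (the 4K figure for the 13900K comparison was quoted as "not even 1.5%"):

```python
# Relative performance deltas as quoted above, in percent.
results = {
    "12900K vs 5800X":   {"1080p": 16.0, "1440p": 14.0, "4K": 6.5},
    "5800X3D vs 5800X":  {"1080p": 18.5, "1440p": 15.0, "4K": 6.8},
    "13900K vs 5800X3D": {"1080p": 6.0,  "1440p": 5.0,  "4K": 1.5},  # 4K quoted as "not even 1.5%"
}

for matchup, deltas in results.items():
    print(matchup + ": " + " | ".join(f"{res} +{pct:g}%" for res, pct in deltas.items()))
```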



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Time for a controversial prediction: if AMD doesn't change their current trajectory, Intel may be the choice for next-gen consoles.

Over the years, we've had console hardware changes for a variety of reasons. I think in the current landscape, Intel has a good chance of getting design wins for next gen console hardware, and it comes down to a few key reasons, not all of which are AMD's fault. In the past, AMD was the only manufacturer of both the CPU and GPU, but that has now clearly changed.

1) Intel has their own fabs, which means they aren't susceptible to TSMC's price increases for their CPUs
2) Intel has better packaging with their CPUs. They are able to ship more cores (even if they are e-cores) for similar prices to their AMD counterparts
3) Arc is slow only due to driver problems. But for consoles, that doesn't matter, as Sony/MS have full control. And remember, this was a trait of AMD in the past which they have largely fixed, to a degree.
4) Arc as an architecture is better than RDNA. RT performance isn't a driver issue for AMD, it's a hardware issue. While AMD did finally add AI acceleration to RDNA 3, I get the feeling it will be similar to their RT performance and be behind.
5) Intel is desperate to get a win with Arc, so they will be willing to lower their prices to unusual degrees, as we've seen from their Arc GPUs already

With this generation of consoles, the biggest new tech that most games have is Ray Tracing and fast loading, but because of the hardware limitations of RDNA, it's not possible to do too much with RT. Next gen consoles will have two big pillars imo: Ray Tracing and AI Upscaling, and if Intel is able to give those advantages to Sony/MS for similar if not lower prices than AMD, they will switch.

Last edited by Jizz_Beard_thePirate - on 06 November 2022

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Time for a controversial prediction: if AMD doesn't change their current trajectory, Intel may be the choice for next-gen consoles.

Over the years, we've had console hardware changes for a variety of reasons. I think in the current landscape, Intel has a good chance of getting design wins for next gen console hardware, and it comes down to a few key reasons, not all of which are AMD's fault. In the past, AMD was the only manufacturer of both the CPU and GPU, but that has now clearly changed.

1) Intel has their own fabs, which means they aren't susceptible to TSMC's price increases for their CPUs
2) Intel has better packaging with their CPUs. They are able to ship more cores (even if they are e-cores) for similar prices to their AMD counterparts
3) Arc is slow only due to driver problems. But for consoles, that doesn't matter, as Sony/MS have full control. And remember, this was a trait of AMD in the past which they have largely fixed, to a degree.
4) Arc as an architecture is better than RDNA. RT performance isn't a driver issue for AMD, it's a hardware issue. While AMD did finally add AI acceleration to RDNA 3, I get the feeling it will be similar to their RT performance and be behind.
5) Intel is desperate to get a win with Arc, so they will be willing to lower their prices to unusual degrees, as we've seen from their Arc GPUs already

With this generation of consoles, the biggest new tech that most games have is Ray Tracing and fast loading, but because of the hardware limitations of RDNA, it's not possible to do too much with RT. Next gen consoles will have two big pillars imo: Ray Tracing and AI Upscaling, and if Intel is able to give those advantages to Sony/MS for similar if not lower prices than AMD, they will switch.

Man, if they were to switch, it'd be all over for AMD, since they have relied heavily on their console sales to keep their other sectors going (their CPU side has been picking up, but that alone will not sustain them indefinitely). If the consoles drop them, they won't stand much of a chance in the ongoing battle against Nvidia, and now Intel, in the GPU space. Though I don't see Intel magically releasing powerful hardware for next gen, let alone for the PC space; give them another decade.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
Captain_Yuri said:

Time for a controversial prediction: if AMD doesn't change their current trajectory, Intel may be the choice for next-gen consoles.

Over the years, we've had console hardware changes for a variety of reasons. I think in the current landscape, Intel has a good chance of getting design wins for next gen console hardware, and it comes down to a few key reasons, not all of which are AMD's fault. In the past, AMD was the only manufacturer of both the CPU and GPU, but that has now clearly changed.

1) Intel has their own fabs, which means they aren't susceptible to TSMC's price increases for their CPUs
2) Intel has better packaging with their CPUs. They are able to ship more cores (even if they are e-cores) for similar prices to their AMD counterparts
3) Arc is slow only due to driver problems. But for consoles, that doesn't matter, as Sony/MS have full control. And remember, this was a trait of AMD in the past which they have largely fixed, to a degree.
4) Arc as an architecture is better than RDNA. RT performance isn't a driver issue for AMD, it's a hardware issue. While AMD did finally add AI acceleration to RDNA 3, I get the feeling it will be similar to their RT performance and be behind.
5) Intel is desperate to get a win with Arc, so they will be willing to lower their prices to unusual degrees, as we've seen from their Arc GPUs already

With this generation of consoles, the biggest new tech that most games have is Ray Tracing and fast loading, but because of the hardware limitations of RDNA, it's not possible to do too much with RT. Next gen consoles will have two big pillars imo: Ray Tracing and AI Upscaling, and if Intel is able to give those advantages to Sony/MS for similar if not lower prices than AMD, they will switch.

Man, if they were to switch, it'd be all over for AMD, since they have relied heavily on their console sales to keep their other sectors going (their CPU side has been picking up, but that alone will not sustain them indefinitely). If the consoles drop them, they won't stand much of a chance in the ongoing battle against Nvidia, and now Intel, in the GPU space. Though I don't see Intel magically releasing powerful hardware for next gen, let alone for the PC space; give them another decade.

Well they don't really need powerful hardware, they need mid-range hardware. Remember that the PS4/Pro used AMD even though they were nowhere near the top. All Intel needs is pricing and tech. And yeah, if it happens, AMD will be in big trouble. AMD's current bread and butter is their CPUs, especially in the datacenter, so that is where I'd imagine all their efforts are going, while Intel is losing heavily in the datacenter on the CPU front and didn't have much presence on the GPU front to begin with. Nvidia is dominating both the client and datacenter GPU space, while AMD is dominating Intel in the datacenter and increasingly stealing laptop market share away from Intel. AMD is also taking share away in the desktop space, but I don't think they are fond of the fact that their new generation CPUs are selling horribly even while their Ryzen 5000 CPUs are still selling very well.

So if Intel doesn't win consoles, it will be a very dire situation for them too. Imo they will axe Arc if it doesn't gain market share or get a design win in consoles. And I doubt their GPUs will gain much market share in the PC space for a very long time.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850