
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Captain_Yuri said:
hinch said:

Curious to see how they plan on pricing these. And how the 7950 XTX performs.

Probably 5-10% faster at best, since the 7900 XTX is already the full die afaik, so the only thing they can do is raise clock speeds, similar to the 6950 XT.

Yeah, something close to a heavily OC'd AIB 7900 XTX is a good bet. Can't see them doing anything drastic, as we're already a good quarter of the way into this generation.

Might be a case of adding some tweaks here and there: strapping a larger cooler on the reference card, bumping up the TDP and clocks, and calling it a day. Won't cost them much to do that beyond the bigger cooler. Price it at $1000 or so, drop the 7900 XTX to $900 and the 7900 XT to $700 by the end of the year.
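As a quick sanity check on that 5-10% figure, here's a minimal back-of-the-envelope sketch, assuming raster performance scales at best linearly with core clock on an otherwise unchanged die (the refresh clock below is an illustrative guess, not a confirmed 7950 XTX spec):

```python
# Back-of-the-envelope refresh uplift: with the full die already in use,
# the best case is roughly linear scaling with core clock.
base_boost_mhz = 2500     # assumed 7900 XTX boost clock (~2500 MHz officially)
refresh_boost_mhz = 2700  # hypothetical refresh boost after a TDP bump

uplift = refresh_boost_mhz / base_boost_mhz - 1
print(f"Best-case raster uplift: {uplift:.1%}")  # ~8.0%
```

Which lands right in that 5-10% window.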



hinch said:
Captain_Yuri said:

Probably 5-10% faster at best, since the 7900 XTX is already the full die afaik, so the only thing they can do is raise clock speeds, similar to the 6950 XT.

Yeah, something close to a heavily OC'd AIB 7900 XTX is a good bet. Can't see them doing anything drastic, as we're already a good quarter of the way into this generation.

Might be a case of adding some tweaks here and there: strapping a larger cooler on the reference card, bumping up the TDP and clocks, and calling it a day. Won't cost them much to do that beyond the bigger cooler. Price it at $1000 or so, drop the 7900 XTX to $900 and the 7900 XT to $700 by the end of the year.

Yea, the main thing for AMD will be that they learned a lot from their first MCM design and will apply it to RDNA 4. And hopefully they launch RDNA 4 with less deceit during their press conferences, give people realistic expectations, and price their products more appropriately. Instead of rebranding the 800XT as the 900XT and pricing the 900XT at $900, do it the proper way: just call it the 800XT and price it at $700-$750. And now that they have the AI hardware, come out with an AI-based upscaler to compete against DLSS and XeSS. AMD's motto in the past has been that they're the good guys, and hopefully they get back on track, cause we all know Nvidia sure isn't.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

If I were to choose, I'd rather have RTG focus on ray tracing. High-end screens handle frame interpolation and upscaling increasingly well (especially the former), so I don't give much thought to those.




Captain_Yuri said:

Two and a half years since launch GeForce RTX 3070 price finally drops below MSRP

https://videocardz.com/newz/two-and-a-half-years-since-launch-geforce-rtx-3070-price-finally-drops-below-msrp

Took em long enough

It's about damn time!

Captain_Yuri said:

Not a good quarter for AMD. Not as bad as Intel's, but certainly not good.

hinch said:
JEMC said:

It's so weird to see AMD already talking about xx50 parts that I can't help going back to launch and the rumor about AMD finding a last-minute bug that prevented Navi 31 from clocking as high as it should, limiting its performance.

But that's just speculation. At least now we know what we can expect from them, even if we still don't know how well they'll perform.

True. The 7900 XTX has been selling quite well, so I can't see this releasing any time soon. Probably Q4 for the refreshed Navi 31. If they can get clocks up for the refresh and push the top Navi 31 to 3.2GHz+, that would make RDNA 3 much more competitive with the 4090, at least in raster, though it still won't be anywhere near as efficient. Maybe they'll do a new stepping with a properly fixed design, closer to what they showed (the ~50% performance-per-watt gain), that will replace production of the current 7900 XT/XTXs?

Speculation again. Might also be why AMD is taking so long with Navi 32. That, and them clearing out old RX 6000 series stock.

Launching the xx50s a year later makes a bit more sense, as a mid-cycle refresh. But we'll see.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

One thing that's a bit annoying about the recent trend in games is that, yes, they do require copious amounts of VRAM, but even when you have the VRAM, $1000-$1200 GPUs still aren't cutting it for native 4K 60fps, even in raster.

For example, look at Jedi Survivor and The Last of Us.

You can see that the $1000 7900 XTX barely hits a 60fps average, which means it can't hold 60fps, and the $1200 4080 sits below it. It's like they're using 4090s as the test bench. Kind of crazy, honestly, that you need a 4090 to stay above a stable 4K 60fps; that really shouldn't be the case.
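To make the "60fps average doesn't mean it holds 60fps" point concrete, here's a minimal sketch with made-up frame times (illustrative numbers, not benchmark data) showing how an average can sit near 60 while the 1% lows crater:

```python
# Made-up frame times (ms): 95 smooth frames plus 5 hitches.
frame_times_ms = [16.0] * 95 + [35.0] * 5

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# "1% lows": fps over the slowest ~1% of frames.
n_worst = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-n_worst:]
low_fps = 1000 / (sum(worst) / len(worst))

print(f"average: {avg_fps:.0f} fps")  # ~59 fps
print(f"1% lows: {low_fps:.0f} fps")  # ~29 fps
```

Same run, "60fps average" on the chart, but those hitches are exactly what you feel in-game.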




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

So I have been playing Star Wars Jedi Survivor since yesterday, after their patch, and it's like... why not launch with this patch? Most of the stutters are gone and performance has increased... There are still some minor stutters, but it is 100% playable. If it had launched like this, most people wouldn't complain. SMH EA. At least I got it for free.

Seeing as the patch came barely a month after release, it makes me think they knew they were up shit creek, rushed this patch through cert, and got it out the door within the week.

Either way it's bad.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Captain_Yuri said:

One thing that's a bit annoying about the recent trend in games is that, yes, they do require copious amounts of VRAM, but even when you have the VRAM, $1000-$1200 GPUs still aren't cutting it for native 4K 60fps, even in raster.

For example, look at Jedi Survivor and The Last of Us.

You can see that the $1000 7900 XTX barely hits a 60fps average, which means it can't hold 60fps, and the $1200 4080 sits below it. It's like they're using 4090s as the test bench. Kind of crazy, honestly, that you need a 4090 to stay above a stable 4K 60fps; that really shouldn't be the case.

There was a video from skill-up that I watched today, where they were talking about the state of AAA PC ports, and one guy brought up something I've been bitching about for some time now:

"on PC, specifically Nvidia, we're seeing a lot more reliance on DLSS to get us these high frame rates, whilst going native, we're seeing crippling frame rates or 60ish at times".


He also said that, in the past, we always relied on simply brute-forcing our way through games, but ever since DLSS/FSR showed up, we're seeing GPUs suffer in games that lack either tech. I've come to realize over time that, while these are some nifty bits of tech, they're ultimately artificial "band-aids": rip them off and performance drops, meaning we are absolutely relying on the tech to reach stable frame rates, versus what we used to do, which was sheer brute force, and for the most part that worked.

And while DLSS/FSR support is picking up, not every game supports them, which makes me want to just say "drop it, let's go back to the brute-force days". I say this because I'm kinda getting fed up with vendor tech only getting half-baked into games (HairWorks, wtf happened to that? Nvidia's sand and fire tech, where are those in today's games? Or AMD's TressFX?), and now we're seeing said games suffer without said tech. I strongly believe we shouldn't become reliant on them either.

Basically, DLSS/FSR is really neato and all that jazz, but I'd like to see them phased out over the next two gens, instead of "this is our future from now on, and games will be crippled without either tech", because that would set us up for a very bad long-term performance road (on top of devs still not fully getting to grips with DX12/UE5).



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Yea, I don't think that's happening. I think AI-based upscaling and AI-enhanced frame generation are going to play a big role going forward. With RDNA 3 having pretty good AI accelerators, I wouldn't be surprised if the PS5 Pro and Xbox Series X Pro come out with AI upscaling that also feeds back into PC. There are simply too many hardware savings in AI-based upscaling as opposed to brute-forcing, and especially with Nvidia, they can lock it down behind their platform.
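For a rough sense of why the hardware savings are so big, here's a minimal sketch of the pixel math at a 4K output, using per-axis render scales close to the published DLSS/FSR Quality/Balanced/Performance presets (treat the exact factors as approximations):

```python
# Pixels actually shaded per frame at 4K output for common upscaler presets.
out_w, out_h = 3840, 2160
native = out_w * out_h

for mode, scale in [("Quality", 0.67), ("Balanced", 0.59), ("Performance", 0.50)]:
    rendered = int(out_w * scale) * int(out_h * scale)
    print(f"{mode:<11} shades {rendered / native:.0%} of native pixels "
          f"(~{native / rendered:.1f}x fewer)")
# Quality     shades 45% of native pixels (~2.2x fewer)
# Balanced    shades 35% of native pixels (~2.9x fewer)
# Performance shades 25% of native pixels (~4.0x fewer)
```

Quality mode alone shades well under half the pixels of native 4K, which is why vendors would much rather sell an upscaler than 2x the silicon.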




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Yea, the main thing for AMD will be that they learned a lot from their first MCM design and will apply it to RDNA 4. And hopefully they launch RDNA 4 with less deceit during their press conferences, give people realistic expectations, and price their products more appropriately. Instead of rebranding the 800XT as the 900XT and pricing the 900XT at $900, do it the proper way: just call it the 800XT and price it at $700-$750. And now that they have the AI hardware, come out with an AI-based upscaler to compete against DLSS and XeSS. AMD's motto in the past has been that they're the good guys, and hopefully they get back on track, cause we all know Nvidia sure isn't.

It just seemed like AMD weren't ready this generation, from the botched reveal to the actual launch. Then you have the software side, where 6000 series owners went months without driver updates. Granted, it was their first attempt at MCM, and it's not like they regressed in performance, but I think they largely underdelivered. Hopefully we'll see some big changes and improvements with RDNA 4.

One way to get some goodwill back is to price the cards reasonably, and to push on with software development like you said. They have decent hardware, but the software is way behind. Hopefully we'll get to see an improved version of FSR that's actually comparable to DLSS, and FSR 3.0 actually delivering.

Even better if AMD and RTG have the initiative to work on their own unique features and selling points for their GPUs.

JEMC said:

Launching the xx50s a year later makes a bit more sense, as a mid-cycle refresh. But we'll see.

Yeah, it might garner more interest, especially if Nvidia does a refresh as well going into next year.

Captain_Yuri said:

One thing that's a bit annoying about the recent trend in games is that, yes, they do require copious amounts of VRAM, but even when you have the VRAM, $1000-$1200 GPUs still aren't cutting it for native 4K 60fps, even in raster.

You can see that the $1000 7900 XTX barely hits a 60fps average, which means it can't hold 60fps, and the $1200 4080 sits below it. It's like they're using 4090s as the test bench. Kind of crazy, honestly, that you need a 4090 to stay above a stable 4K 60fps; that really shouldn't be the case.

Really goes to show how messed up GPUs are this generation. Extreme price hikes, barring the 4090, and needing to spend so much to get a decent experience playing the latest games in 4K, in large part thanks to poor optimisation. Really, we should easily be able to handle 4K ultra at 60FPS or over, no issues, on higher-end GPUs, especially when spending $1000 and over.

Honestly, I really hope Intel delivers with Battlemage. Maybe that will wake AMD and Nvidia up a bit moving forward.

Last edited by hinch - on 03 May 2023

hinch said:
Captain_Yuri said:

Yea, the main thing for AMD will be that they learned a lot from their first MCM design and will apply it to RDNA 4. And hopefully they launch RDNA 4 with less deceit during their press conferences, give people realistic expectations, and price their products more appropriately. Instead of rebranding the 800XT as the 900XT and pricing the 900XT at $900, do it the proper way: just call it the 800XT and price it at $700-$750. And now that they have the AI hardware, come out with an AI-based upscaler to compete against DLSS and XeSS. AMD's motto in the past has been that they're the good guys, and hopefully they get back on track, cause we all know Nvidia sure isn't.

It just seemed like AMD weren't ready this generation, from the botched reveal to the actual launch. Then you have the software side, where 6000 series owners went months without driver updates. Granted, it was their first attempt at MCM, and it's not like they regressed in performance, but I think they largely underdelivered. Hopefully we'll see some big changes and improvements with RDNA 4.

One way to get some goodwill back is to price the cards reasonably, and to push on with software development like you said. They have decent hardware, but the software is way behind. Hopefully we'll get to see an improved version of FSR that's actually comparable to DLSS, and FSR 3.0 actually delivering.

Even better if AMD and RTG have the initiative to work on their own unique features and selling points for their GPUs.

Captain_Yuri said:

One thing that's a bit annoying about the recent trend in games is that, yes, they do require copious amounts of VRAM, but even when you have the VRAM, $1000-$1200 GPUs still aren't cutting it for native 4K 60fps, even in raster.

You can see that the $1000 7900 XTX barely hits a 60fps average, which means it can't hold 60fps, and the $1200 4080 sits below it. It's like they're using 4090s as the test bench. Kind of crazy, honestly, that you need a 4090 to stay above a stable 4K 60fps; that really shouldn't be the case.

Really goes to show how messed up GPUs are this generation. Extreme price hikes, barring the 4090, and needing to spend so much to get a decent experience playing the latest games in 4K, in large part thanks to poor optimisation. Really, we should easily be able to handle 4K ultra at 60FPS or over, no issues, on higher-end GPUs, especially when spending $1000 and over.

Honestly, I really hope Intel delivers with Battlemage. Maybe that will wake AMD and Nvidia up a bit moving forward.

Yea, the good news is that RDNA 3 lays a lot of good groundwork to improve from. I always felt RDNA 2 lacked RT performance, AI performance, and some compute power, whereas RDNA 3 largely fixes all of that. It might not be on par with Nvidia, but it also costs less, so who cares. I think that if the PS5 Pro or Series X Pro uses RDNA 3 or newer, Microsoft might come up with an AI-based upscaling tech, like what we saw with DirectML back in the day, to use with their games. That would likely see wide adoption and work across Nvidia/Radeon/Intel GPUs. We'll see, though.

And yea, it logically makes no sense why we're seeing such awful performance numbers in raster. RT I get, but raster with no RT? Makes little to no sense. And raising game prices to $70 adds that extra bit of salt to the wound. Meanwhile, I've been playing The Legend of Zelda: Tears of the Kingdom, which leaked via emulation, and that game just might be the best game to come out this decade. I feel like buying another Switch, cause between me and my sister we won't be able to share the game, that's for sure lol. It's incredible despite some minor visual glitches in emulation, and that thing runs on a goddamn 0.4 TF GPU, while modern AAA games running on a 100 TF 4090 feel largely sleep-worthy.
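Just to put a number on that comparison, taking the post's figures at face value (raw TFLOPS is a very crude cross-architecture proxy, so this is only illustrative):

```python
# Crude raw-compute ratio from the ballpark figures above.
switch_tflops = 0.4    # docked Switch GPU, roughly
rtx_4090_tflops = 100  # the round number quoted above (~83 TF FP32 officially)

print(f"~{rtx_4090_tflops / switch_tflops:.0f}x raw compute")  # ~250x
```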




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850