
Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

300 games milestone has been achieved!




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

AMD Radeon RX 7600 GPU specs confirmed, Navi 33 XL with 2048 Stream Processors and 8GB VRAM

https://videocardz.com/newz/amd-radeon-rx-7600-gpu-specs-confirmed-navi-33-xl-with-2048-stream-processors-and-8gb-vram

"AMD is set to launch its Radeon RX 7600 non-XT GPU on May 25th. The fact that this card uses ‘XL’ GPU variant strongly suggests that we might see ‘Navi 33 XT’ for the RX 7600 XT later, with more memory maybe?"

I would expect the 7600XT to come with 16GB of VRAM to compete with the 4060Ti 16GB.

It's going to get competitive, I think, and that's a boon for us.

Captain_Yuri said:

But the bigger question I have is: where is the rest of the RDNA 3 lineup? All Radeon has released is the 7900 XTX/XT and, soon, the 7600. From providing an alternative to Nvidia at every GPU class with RDNA 2, they seem to be picking only certain categories while continuing to produce RDNA 2 to fill in the gaps for the rest. I am not much of a power guy, but there is something to be said for a mid-range buyer choosing between, say, a 6950 XT, which is a really big and power-hungry GPU with triple 8-pin, and a small 4070 powered by a single 8-pin. If AMD would release a 7700 XT with 16GB of VRAM that is similar in power and size to a 4070, along with the benefits of RDNA 3, that would make the decision easier imo.

AMD has a capacity issue at the moment. Plus, the driver situation for the 7000 series has been a shit-show; I'm assuming they're trying to refine that side of the equation before they start pushing higher-volume parts, which will bring in more complaints.
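For anyone wondering about the connector math in that triple-8-pin vs single-8-pin comparison, here's a quick sketch using the standard spec ratings (75W from the PCIe slot, 150W per 8-pin connector); the card classes in the comments are just the examples from the quote above:

```python
# Rough power-budget math for the triple-8-pin vs single-8-pin comparison.
# Spec ratings: PCIe slot = 75 W, each 8-pin PCIe connector = 150 W.

SLOT_W = 75        # PCIe x16 slot delivery (spec)
EIGHT_PIN_W = 150  # per 8-pin PCIe power connector (spec)

def max_board_power(num_8pin: int) -> int:
    """Maximum in-spec power draw for a card with the given connectors."""
    return SLOT_W + num_8pin * EIGHT_PIN_W

print(max_board_power(3))  # triple 8-pin (6950 XT class) -> 525 W ceiling
print(max_board_power(1))  # single 8-pin (4070 class)    -> 225 W ceiling
```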

Mummelmann said:

VRAM is now being sold like storage in Apple devices; insane upcharge for features/capabilities that should be standard.

And the market is lapping it up...






--::{PC Gaming Master Race}::--

JEMC said:

And don't forget that those results are with DLSS3.

Captain_Yuri said:

Also don't forget that it's with ray tracing on, which the Witcher 3 now has.

Yeah, that's what makes this whole shebang really terrible to look at.

Not only is it doing RT at 1080p, but it's using DLSS 3, and, well, it's not Cyberpunk with a massive city, path tracing, and up-to-date tech or textures (I know full well they're using Halk's optimised textures, since they hired him to work with them on the remaster), so seeing these results makes no sense, other than Nvidia relying far too heavily on that band-aid.

Think of it this way: remove DLSS 3, and you'd think brute force would save the day, right? Except this is WITH DLSS 3, not without, which means brute-forcing with RT at 10-80 pee (I'ma stress 1080p so damn hard, because we're in 2023 and 1080p is ancient as hell now; why we'd even have problems at that res is beyond me) nets us barely 80fps.
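To put rough numbers on that, a minimal sketch, assuming DLSS 3 frame generation presents about two frames per rendered frame (the usual upper bound); the 80fps figure is the one above, everything else is illustrative:

```python
# Back-of-envelope: what is the card actually rendering if 1080p + RT
# lands at ~80 fps WITH DLSS 3 frame generation enabled?
# Assumption: FG presents ~2 frames per rendered frame (upper bound).

def rendered_fps_estimate(presented_fps: float, fg_factor: float = 2.0) -> float:
    """Estimate internally rendered fps from the presented (FG) fps."""
    return presented_fps / fg_factor

print(rendered_fps_estimate(80.0))  # -> 40.0 fps actually rendered, roughly
```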

I'd like to see what it can do in Witcher 3 without RT and without DLSS 3, just so I can laugh a little at the gains (the gains we should be getting in the first place).

I feel like, looking back on all the GPU releases this gen so far, Nvidia was just banking on the 4090 as the only card actually worth using, weren't they? Like, the 4080 is iffy, but again, it's no 4090, and going below that only nets you less and less along the way. I swear the 4090 was pitched in too good a way, so that everything else is now marginally worse just to make the 4090 look good (especially when we all know it's really meant to be a 4080, not the recently made-up 90 series).



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:

Yeah, that's what makes this whole shebang really terrible to look at. [...] I feel like, looking back on all the GPU releases this gen so far, Nvidia was just banking on the 4090 as the only card actually worth using, weren't they? [...]

Well, visually it may not be as amazing as Cyberpunk, but it does have RTGI and all that, so it still runs very heavy. Remember that path tracing can kill a 4090 even in a game like Portal. And if you want to see how it performs without ray tracing, all you've got to do is look at 3070 performance, because that's all it is, at lower power consumption and with Ada features.

https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/29.html

I think Nvidia is just largely banking on the fact that Radeon largely isn't competing this gen so if a person wants to upgrade to something new and shiny, Nvidia is essentially the only choice sub $800. You can get RDNA 2 but a lot of the stack is similar in Raster performance to Lovelace with nearly double the power consumption and worse Ray Tracing and less features. Even the vram argument is iffy now since a 4060 Ti now has 16GB. And if the GPUs don't sell, they will allocate the stock towards the enterprise Ai craze where Microsoft and other companies keeps ordering their gpus in packs of 10,000.
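A toy sketch of that raster-per-watt point; the board-power figures are ballpark TBP numbers, and the raster score is normalized to 100 for both cards purely as an assumption:

```python
# Toy raster-per-watt comparison for "similar raster, nearly double the power".
# TBP figures are ballpark; raster is normalized to 100 for both cards
# purely as an illustrative assumption.

cards = {
    "RX 6950 XT (RDNA 2)": {"raster": 100, "watts": 335},
    "RTX 4070 (Lovelace)": {"raster": 100, "watts": 200},
}

for name, spec in cards.items():
    print(f"{name}: {spec['raster'] / spec['watts']:.2f} raster points/W")

# At equal raster, the 200 W card works out ~1.7x more efficient (335/200).
```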




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

End of an era...

He's not totally going away, but LTT will have a new CEO who ran NCIX, Corsair and Dell. We'll see how that goes.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:


But the bigger question I have is: where is the rest of the RDNA 3 lineup? [...]

My guess is that AMD was forced to launch the 7900 GPUs way too early, in an unfinished state, to be able to compete with Nvidia, hence all its problems in both hardware and software, and why there's something like an RDNA3+. So they are refining the chips while they still can (as in, while they still have RDNA2 GPUs of similar performance in stock) to iron out the problems and possibly gain an edge on Nvidia. I'm sure 7500/7700/7800-branded cards will also come at a later date, but as long as they have the corresponding RDNA2 GPUs in stock, it's probably not in AMD's best interest to launch them too soon.



JEMC said:

Oh, by the way, AMD confirmed they're working on hybrid CPUs:

https://www.tomshardware.com/news/amd-to-make-hybrid-cpus-using-ai-for-chip-design-cto-papermaster-at-itf-world

With those, they could finally make Athlons with more than just 2 Zen 2 cores (Mendocino, the 72xxU chips; its predecessors Dali and Pollock were still on Zen 1...).

I doubt that AMD will use them nearly as much in higher-performance chips as Intel does. I expect the Ryzen 9 8950 or the 9950 (not sure if the former will already get hybrid cores) to use 16p/8e cores, basically reversing the numbers from Intel's chips. The monolithic mobile chips, however, will certainly get a lot of mileage out of them, and since AMD's e-cores support hyperthreading, unlike Intel's, this should also vastly improve multicore performance even with fewer physical cores added.
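The thread-count math behind that, as a quick sketch; the assumption is that AMD's e-cores keep SMT (as Zen 4c does) while Intel's E-cores stay single-threaded, and the 16p/8e and 8p/16e configurations are the hypothetical ones above:

```python
# Thread counts for the hybrid configs discussed above.
# Assumption: AMD's e-cores support SMT (2 threads each); Intel's E-cores
# run a single thread each. P-cores have SMT in both cases.

def total_threads(p_cores: int, e_cores: int, e_smt: bool) -> int:
    return p_cores * 2 + e_cores * (2 if e_smt else 1)

print(total_threads(16, 8, e_smt=True))   # hypothetical AMD 16p/8e -> 48 threads
print(total_threads(8, 16, e_smt=False))  # Intel-style 8p/16e      -> 32 threads
```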



Captain_Yuri said:

Well, visually it may not be as amazing as Cyberpunk, but it does have RTGI and all that, so it still runs very heavy. [...] And if you want to see how it performs without ray tracing, all you've got to do is look at 3070 performance, because that's all it is, at lower power consumption and with Ada features. [...]

Not really that surprised, but also not really that "wow" at the 3070, when I just noticed my 1080 Ti there, a three-gen-old card, managing 132fps (I was expecting a card three generations newer to do better in a game like Witcher 3 without the RT).

I've had this thought for some years now that Nvidia simply doesn't want to be in the consumer GPU market much anymore (we've seen this with AI, self-driving cars, the cloud and other tech firms, hell, even Nintendo), so it's not all that surprising to see them bank on crypto bros, then immediately jump to banking on AI bros and MS, instead of coming back to us.

I don't see AMD competing in a way that will get Nvidia to suddenly care again; they showed us some time ago that they've stopped. They'll pull a Konami and leave one day, just watch (and honestly, good, because I've grown to dislike Nvidia and Jensen's shitty mentality over the years anyway).

Captain_Yuri said:

End of an era...

He's not totally going away, but LTT will have a new CEO who ran NCIX, Corsair and Dell. We'll see how that goes.

A CEO who's come from corporate tells me that LTT is going to go full corpo in the future. Why didn't he hire someone from somewhere smaller? Every time in history I've seen someone brought in from big corpo, it has resulted in said company growing, for sure, but also becoming more corpo (look at what Michael Eisner and Katzenberg did to Disney over the years; they made it worse, despite giving us the decade-long Disney Renaissance).



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Bofferbrauer2 said:

My guess is that AMD was forced to launch the 7900 GPUs way too early, in an unfinished state, to be able to compete with Nvidia. [...] as long as they have the corresponding RDNA2 GPUs in stock, it's probably not in AMD's best interest to launch them too soon.

Well, if that's the case, then the clock is ticking, because Blackwell could come out next year.

Chazore said:

Not really that surprised, but also not really that "wow" at the 3070, when I just noticed my 1080 Ti there, a three-gen-old card, managing 132fps. [...] I don't see AMD competing in a way that will get Nvidia to suddenly care again; they showed us some time ago that they've stopped. They'll pull a Konami and leave one day, just watch. [...]

Yeah, the lower down the stack you go, the less impressive the gains have been since the Turing era. Quite the shame, really, considering how the 60 series once used to beat the previous gen's 80 series.

Idk why people keep suggesting Nvidia will leave the gaming market when they hold over 80% of the market share. Has there ever been a case of a business saying, "Welp, we hold 80%+ market share of a highly profitable industry, let's just leave for the lulz"? Nvidia made $1.8 billion in gaming vs $3.6 billion in datacenter in Q4 FY23. That's no small chunk of revenue to walk away from, in a segment where they have a guaranteed win every generation.

Plus, if Nvidia leaves and Radeon becomes the dominant player, do you think the industry will be any better? If anything, the GPU market will be significantly worse: when AMD had market dominance over Intel's CPUs with Zen 3, they refused to launch proper budget CPUs for nearly 1.5 years after Zen 3's initial release. It took Intel launching Alder Lake before AMD bothered to release a proper budget-segment CPU, as they were riding the success of Zen 3 to its fullest extent.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:

Yeah, the lower down the stack you go, the less impressive the gains have been since the Turing era. [...] Idk why people keep suggesting Nvidia will leave the gaming market when they hold over 80% of the market share. [...]

I don't see Nvidia leaving the gaming market either. What could happen somewhere down the line is that they put less focus on gaming and more on Tegra/Quadro, giving both AMD and Intel a chance to catch up to them in market share and capabilities (as in, things like DLSS or CUDA). If that were to happen, the market would be much more open, without any dominant player, and prices would probably be much more competitive too. But that's really a best-case scenario for consumers, and I doubt it will happen anytime soon.