Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

JEMC said:
Bofferbrauer2 said:

So in short, the 4070 is a 3080 with an efficiency improvement and $100 off its MSRP. With just 12GB of VRAM, however, I fear it will run into a bottleneck even faster than the 3070/Ti did - probably even before any next-gen GPUs get released.

I checked in Europe at Mindfactory, and you can get a 6800 for 529€, a 6800XT for 569€ and a 6950XT for 649€. And if you add 20€ to the latter, you even get the PowerColor 6950XT Red Devil, which is pretty much the best 6950XT there is. So the 4070 really needs to be available at or very close to MSRP, or else the AMD cards will all be better value than it. And if early price leaks from Asia are anything to go by, it doesn't look like the 4070 will be available at that price, at least not from the board partners.

Speaking of AMD, the 6700XT is already available for 379€, which will put the upcoming 4060/Ti into a tough spot in terms of pricing.

I agree with you about the 4070. And it's interesting how its performance relative to the 3080 flips from faster to slower as the resolution goes up.
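To put rough numbers on that value comparison, here's a quick back-of-the-envelope script. The prices are the ones quoted above; the relative performance figures (and the 4070 street price) are placeholder assumptions for illustration, not benchmark data:

```python
# Rough price/performance ranking for the cards mentioned above.
# Prices are the Mindfactory/MSRP figures from the post; the perf
# numbers (RTX 4070 = 100) are placeholder estimates, NOT benchmarks.
cards = {
    "RX 6700 XT": (379, 75),
    "RX 6800":    (529, 90),
    "RX 6800 XT": (569, 105),
    "RX 6950 XT": (669, 120),  # Red Devil: 649€ + 20€
    "RTX 4070":   (659, 100),  # assumed street price above the $599 MSRP
}

# Sort by euros per performance point (lower = better value).
for name, (price, perf) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:<10}  {price:>4}€  ~{perf:>3}%  ->  {price / perf:.2f} €/point")
```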

Mummelmann said:

Guys; quick question as I'm sifting through the jungle of motherboards and compatibility issues. How long will LGA 1700 remain relevant/viable for upgrades? I understand it's quite new, but I've heard that it might be short-lived, causing potential issues 4-5 years down the line if I wish to upgrade my CPU. I mostly want PCIe 5.0 support and DDR5, but that's not hard to find. I think perhaps AMD is a safer bet, with them usually keeping their sockets for longer (AM5 will more than likely be viable in 5 years' time).

As I've mentioned before, I have no loyalty. I'm considering either a 13600K or perhaps the 7800X3D; the latter seems like crazy value in comparison, especially given its low power consumption. A 4090 combo with the 7800X3D is looking mighty sweet.

As Yuri said, LGA 1700 may get a refresh based on the current architecture, but we don't know what that may look like. Too many rumors about it.

And yeah, AMD said it would support AM5 until 2025, which probably means a couple of new CPU releases, but you should still be able to find a compatible processor in 5 years. Maybe not the latest or the best, but still a good upgrade (and also cheaper, as it won't be new anymore).

Yeah, sounds about right. Also, I might not need a better CPU for a long time, seeing as I'll be gaming mostly in 4K; it's hard to choke a CPU there, as far as sheer frames are concerned anyway. I think I'll have a closer look at Ryzen for my build, it's simply better value. And, in a most unexpected turn of events, it'll make sure my room doesn't get too hot! Unexpected because the last AMD product I owned was an HD 7850, and before that the X1950XTX, and those things could crank out some heat (and fan noise to boot).



Mummelmann said:
Captain_Yuri said:

Well, AMD officially said 2025+ in their slides, so at the very least I'd expect support until 2025. According to the rumour mill, Raptor Lake could potentially get one more CPU generation with a Raptor Lake refresh, but in my view, going AMD is the better bet in terms of CPU and platform. X670E is simply a superior platform to Z790 with all of its PCI-E 5 capabilities and IO and such. But you can always pair Raptor Lake with cheaper Z690 boards, so the platform cost could be lower overall depending on a few factors - with basically little to no upgrade path, though.

Personally speaking, I am going with the 7800X3D + Asus ProArt Creator X670E + 32GB of 6000 MHz CL30 DDR5. Platform longevity + PCI-E 5.0 for GPU and storage + a fast and efficient CPU.

Sounds smart. I'm also entertaining the idea that there will be actually worthwhile M.2 drives on PCIe 5.0 within a few years. For now, the performance increase isn't worth the extra premium, not by a long shot. I'll be going with a 1TB drive for the OS and doodles, something around the 7000/7000 MB/s mark in read/write speeds. And for gaming/various stuff, I'll be getting either a 2TB or 4TB drive with roughly the same speed. Apparently, the larger the drive, the more writes it can take before it says goodnight (which is important if I plan on keeping it for many years).
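That endurance intuition checks out: rated endurance (TBW, terabytes written) scales roughly with capacity, since writes get spread across more NAND. A quick sketch with generic ballpark ratings - illustrative figures, not any specific model's spec sheet:

```python
# Rough SSD lifetime from rated endurance (TBW = terabytes written).
# The TBW ratings below are generic TLC-class ballpark figures,
# not the specs of any particular drive.
drives_tbw = {"1TB": 600, "2TB": 1200, "4TB": 2400}

daily_writes_gb = 50  # assumed average host writes per day

for size, tbw in drives_tbw.items():
    years = tbw * 1000 / (daily_writes_gb * 365)
    print(f"{size}: {tbw} TBW -> ~{years:.0f} years at {daily_writes_gb} GB/day")
```

Even with heavy consumer workloads, the bigger drives have endurance headroom measured in decades, so the extra capacity is doing most of the work there.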

I generally keep my rigs for a long time, like my other electronics, which is why I need something powerful and stable, as well as something that can be improved in increments. I had my last phone for 5 years, and I still have every single console I ever owned. I think my average time between rigs is around 6.5-7 years or so.

Yea, I won't be putting any Gen 5 SSDs in my PC, at least not for a long time. The Gen 4 SSD deals are insane. I have seen 4TB Gen 3 SSDs for as low as $250 USD (that's about $62/TB) and 4TB Gen 4 SSDs for as low as $350-$400 USD when they are on sale, which is crazy. They won't be the fastest Gen 4/Gen 3 drives, but they will be more than fast enough for gaming.

And yea, if you want a long-term PC, the 7800X3D + 4090 is certainly the way to go, unless you do CPU-heavy production tasks. The 4090 has all the VRAM that you need, along with all the premium features and the long-term driver support that Nvidia is known for. Plus emulators (especially Switch emulators) work a lot better with Nvidia than Radeon.

And for me, there are too many scheduling issues with the 7900X3D and 7950X3D based on the reviews that I have seen, otherwise I may have gone with them. So my plan is to get the 7800X3D, and if I later want to upgrade to a Zen 5 X3D or Zen 6 X3D with a higher core count, I can do so, since by then they should have either implemented a hardware scheduler like Intel or fixed the bugs.



                  


Captain_Yuri said:

Honestly, that is still my recommendation for everyone outside of 4090 buyers. Neither Nvidia nor Radeon feels good to buy right now - Nvidia because of how greedy they are, and AMD because of a lack of features, among other issues like emulation.

There are a good number of reasons to wait for next gen, imo. GPU and PC shipments in general have massively gone down, similar to Turing. The price and performance outside of the flagship are largely unappealing, like Turing. Nvidia should be switching to Samsung, similar to Ampere. Nvidia has been increasing the VRAM for the mid-range every generation. And there are new features like DLSS 3 and VSR that need some more updates to get their kinks worked out, similar to Turing.

So I think there's a good chance that Blackwell will be similar to Ampere in terms of performance uplift for the price you pay. $600 could be the new price for the 5070 regardless, but imo the performance uplift would be worth the price by then. And they could potentially add 16GB of VRAM for the 70 class as well.

Yeah, I mean if this were a traditional generation, this would be a 4060 Ti tier card priced at $450-$500. And waiting nearly three years for this is just super underwhelming, like the rest of the lower-tier GPUs. To put this into perspective, the 3060 Ti was on par with a 2080 Super and launched at $399, or £369. This gen costs 50% more than the last, and we're getting performance on par with, if not a little below (DLSS 3 notwithstanding), a regular 80-tier card of the previous generation.
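The 50% figure is straight arithmetic on the launch MSRPs; fold in an assumed performance uplift and you can see why perf-per-dollar barely moves. (The 1.45x uplift below is an illustrative guess, not a measured number.)

```python
# Gen-on-gen value check behind the "50% inflated cost" point.
msrp_3060ti = 399  # USD, launched Dec 2020
msrp_4070   = 599  # USD, launched Apr 2023

price_increase = (msrp_4070 / msrp_3060ti - 1) * 100
print(f"MSRP increase: +{price_increase:.0f}%")  # ~ +50%

perf_ratio = 1.45  # assumed raster uplift, 4070 vs 3060 Ti (illustrative)
value_change = (perf_ratio / (msrp_4070 / msrp_3060ti) - 1) * 100
print(f"Perf-per-dollar change: {value_change:+.0f}%")  # roughly flat
```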

While I do think what Nvidia is doing with software is very cool and innovative, their GPU pricing and naming shenanigans over the last couple of years have been total garbage. Ampere prices at least stayed the same. The only other alternative is AMD with RDNA cards, but they miss big features like Reflex and other really useful Nvidia proprietary features.

Not sure if it's worth looking out for a sub-$300 deal now as a stopgap. I feel we might see some major price drops for the 7900XT in the upcoming months. And perhaps the 7800XT could be sub-$500.

And true, moving the price points up, I think, sets them up for next gen's 3nm costs. Though (like you said) we should at least see some good generational gains with an overhauled architecture. And by then RT should be viable on lower-end cards.


hinch said:

For a stopgap, if you were to need one, I'd suggest the 6700XT, as it can be found for about $350 right now. Nothing else comes really close in price/performance in that price range. Sure, it's a bit above $300, but it also might allow you to wait for the next gen if nothing else comes out this gen that sparks your interest.



Bofferbrauer2 said:


I was looking at getting one but decided to wait for the 4070 reviews. There was a PowerColor Red Devil version going for £350, and they seem to go on sale from time to time, so that's something I might seriously consider getting.

It's about 2x the performance of my 1070, which is pretty nice. And it does seem to fare quite well with even the most demanding games right now at high settings 1440p. And the resale value is still pretty high, so you can easily recoup most of the cost on eBay.



hinch said:

Yea, it's basically another Pascal-to-Turing situation, where Pascal was god tier and Turing was hella lame. Ampere could have been another Pascal, but we all know what happened with crypto, and now inflation, etc. Now, I am not excusing Nvidia, cause they are absolutely taking advantage of a shitty situation. But it is what it is, because Nvidia is stupidly greedy, so the GPU choices are once again horrid.

As far as a stopgap goes, it depends on your situation, really. RDNA 2 has depreciated quite heavily, which is a good thing for a stopgap solution, because when it comes time to sell the 6700XT, you should get a good amount of money back, especially if you time it right. But it also depends on the games that you would be playing. Cause all a 6700XT is, is a faster raster GPU. It won't give you much in terms of ray tracing, FSR is terrible at anything lower than 4K, Switch emulation (if you are into that) can be a bit wonky with Radeon, and there's no Reflex either.

So if you want a stopgap because your 1070 simply can't keep up with the games you want to play, then imo the 6700XT is a great stopgap solution. But if all you play is games like Overwatch, where a 1070 is still plenty capable, then it might not be worth getting, cause a 1070 is already a "stopgap" solution, if you get what I am saying.

But what you could do is buy a 6700XT from Amazon, use it for a week or two, and see how you like it; if it's a worthy upgrade for the games that you play or want to play, keep it. Otherwise you can return it.



                  


The 4070 is a bit of a letdown with only a paltry 12GB; I can see it hitting a VRAM wall later on... It also doesn't bode well for the mid-range chips like the 4060, which will most likely come up even shorter.

If cost is the issue, then perhaps using slower memory chips, but more of them, and bolstering that with even better caches, is the approach to take.

Mummelmann said:


The 7800X3D is the chip/platform to go for right now.
It will have longer platform longevity.




hinch said:

Skimmed through a few reviews and idk man... might sit out this gen at this rate.

What's disappointing me the most is that as the silicon gets cut down more, all the benefits gained from going to a massively improved node get lost. Essentially the same RT performance as last gen, with a slight price improvement and better efficiency, and just AI as the big seller. Shit's so bad that even RDNA 2 cards compete in RT with these cards lol.

Chazore said:

Sounds like the 4070 is basically the slightly better 3080, and by that I mean close to what the Ti model would've been, just like the 1080 Ti was to the 1080 (not a huge gap between the two, but close enough).

$600 isn't as bad as I'd thought; let's see if it also sticks to that in the GBP region, and then I'll be thinking about it (since I paid around £600ish for my 1080 Ti with 11GB, that's about as much as I'd be willing to pay, but for a 70 series that still seems a tad steep).

Yeah, I mean it's not a bad card. £600, too, is my upper limit for buying a new graphics card. Sadly, the uplift in performance has stagnated a lot this generation. Still a big jump from our old Pascal cards, though, and decent performance. Okay at today's inflated prices, but that's not saying much.

See, to me AI isn't even the big seller, because I see it as a mere band-aid, needed in order for those cut-down cards to reach the "impressive" 100fps mark, when before we used to just brute-force it or simply turn down settings.

Take away the AI this-or-that and the cards do a measly 50-60fps with RT, so basically the same as the first and partly the second gen, which feels like a comedic joke at this point with the years that have passed.

I don't mind AI tech, but I wish AI GPU tech would stop being so heavily relied upon, because we're still stuck paying more money for hardware when really we're relying ever more heavily on the software side of things to do something our cards were able to do on their own for decades.

I'm prob gonna sit out this gen myself, because it's been pricey as hell, limited in stock, and most benches either have to rely on AI to make the results look decent, or skip the AI and make the results look piddling as hell. Either way I'm not winning.

Notice how Guru brought up the 1080 Ti multiple times in his vid? I haven't seen many benchers bring up that card in years, so it's interesting to see him bring it up now of all times, with the 4000 series.




I can't believe that Redfall article got that many comments about there being no 60fps mode at launch, followed by the heapful of comments saying that we still don't need 60fps in 2023...

I just can't. I'm sorry, but with all the excuses going around this far, I just can't.




Captain_Yuri said:


True, it's Turing (2.0) all over again. I really wanted the 4070 to be a great card, as the 3070 and other Ampere cards were simply not available, or so hard to get hold of, for most of their lives. But greed got in the way for Nvidia, and for AMD to a lesser extent. And now we're resorting to comparing brand-new cards on a linear scale with the previous gen, instead of looking at generational leaps at the same price point. The only improvements mostly being Nvidia's bank balance and the 4090.

Personally, I only really play older competitive games on PC, though I did complete TLOU recently. So it's not exactly that I need a new GPU... plus I have a PS5 to play the latest games. It's just that it would be nice to play some of my older games with mods, and new stuff like Starfield, without some heavy compromises.

If another 3-fan model drops again, I'll probably end up getting one, since I don't hold much hope for the 7800XT unless we get leaks/news any time soon. Get that and hold out for Battlemage next year, or Blackwell/RDNA 4.

Chazore said:


Yep, the raster performance hasn't moved much this generation outside of the flagship cards. And AMD is still nowhere to be seen with their higher-end/midrange cards. I think AI to enhance performance without affecting image quality too much is cool, but yeah, it shouldn't be needed to prop up a graphics card. And a $600 one at that. Plus, the 12GB of VRAM is going to rear its ugly head in a couple of years' time when more and more games use more memory - and that's before counting crappy optimisation.

Really, looking at these cut-down specs, it's Nvidia charging us more for less, and then leaning on DLSS 3 and other technologies to claim they're giving us better performance. When it's not, really.
