
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Chicho said:

Arc prices have been reduced again. You can get an A750 shipped from Newegg for $229 and an A770 for $269. How do you guys see them as entry/mid-level cards against the Nvidia and AMD options?

They still need to iron out their drivers. They've gotten much better, but I wouldn't recommend them to anyone as their primary GPU. If you're buying one just to play around with in a secondary system, sure. I hope they get their drivers ironed out. The more competition, the better.

Last edited by Darc Requiem - on 11 March 2023

Chicho said:

Arc prices have been reduced again. You can get an A750 shipped from Newegg for $229 and an A770 for $269. How do you guys see them as entry/mid-level cards against the Nvidia and AMD options?

As Darc Requiem has said, they're better than they were at launch, but they're still behind AMD and Nvidia in terms of ease of use and support for older games.

They're no longer at a point where buying one makes no sense, but there may still be situations where an average consumer won't know what to do.

But who knows. Given how much they've improved in such a short amount of time, maybe in a couple of months they'll be able to iron out most of those problems and be a viable option for everyone.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Chicho said:

Arc prices have been reduced again. You can get an A750 shipped from Newegg for $229 and an A770 for $269. How do you guys see them as entry/mid-level cards against the Nvidia and AMD options?

As Darc Requiem has said, they're better than they were at launch, but they're still behind AMD and Nvidia in terms of ease of use and support for older games.

They're no longer at a point where buying one makes no sense, but there may still be situations where an average consumer won't know what to do.

But who knows. Given how much they've improved in such a short amount of time, maybe in a couple of months they'll be able to iron out most of those problems and be a viable option for everyone.

Honestly, at the rate Intel is improving, Arc may soon (6 months or so) be the superior choice at the low end if you only play newer games. Intel's ray tracing implementation is much better than AMD's; it's actually competitive with Nvidia's. However, when you can get an RX 6600 for roughly $225 or an RX 6600 XT for around $275, it's hard to recommend Arc to someone on a budget. Personally, I'd take an Arc A750 or A770 over anything Nvidia has at the low end. Their prices are absurd, and the RTX 3050 should never be bought by anyone, ever. Even with mature drivers, the performance of that card is just terrible.



Darc Requiem said:
JEMC said:

As Darc Requiem has said, they're better than they were at launch, but they're still behind AMD and Nvidia in terms of ease of use and support for older games.

They're no longer at a point where buying one makes no sense, but there may still be situations where an average consumer won't know what to do.

But who knows. Given how much they've improved in such a short amount of time, maybe in a couple of months they'll be able to iron out most of those problems and be a viable option for everyone.

Honestly, at the rate Intel is improving, Arc may soon (6 months or so) be the superior choice at the low end if you only play newer games. Intel's ray tracing implementation is much better than AMD's; it's actually competitive with Nvidia's. However, when you can get an RX 6600 for roughly $225 or an RX 6600 XT for around $275, it's hard to recommend Arc to someone on a budget. Personally, I'd take an Arc A750 or A770 over anything Nvidia has at the low end. Their prices are absurd, and the RTX 3050 should never be bought by anyone, ever. Even with mature drivers, the performance of that card is just terrible.

I agree with you.

It's interesting how AMD cards have fallen in price not only to MSRP but below it after the crypto crash, while Nvidia cards have barely dropped to MSRP, and some not even that. That's why it's very hard to recommend any Nvidia card at the mid and low end.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Captain_Yuri said:

Last of Us Part 1 PC Requirements:

*specs*

Yea, the 32GB of RAM era is upon us.

I feel like it's becoming artificial. I mean, it's mostly a tiny pocket of AAAs demanding this, yet once again, the gameplay, visuals and loading are almost the same as, say, the PS5.

What is me absolutely needing to dedicate 15GB of memory to my game going to get over what you can already play on a console that doesn't even have a full 16GB?

It just smells like laziness to me, like how some games asked for a freaking 1080 Ti and still ended up being stuttery as hell. I really wish devs would outright explain WHY they need to use 16GB. They don't even need to go all flux capacitor on our asses, just tell us in simple terms why we need to go out and buy 32GB kits now, when current gen consoles aren't loaded with 32GB themselves.
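For what it's worth, an engine can already ask the OS how much RAM is actually there and scale its streaming budget to match, instead of hard-requiring a 32GB kit up front. A minimal sketch of what that could look like on Windows; GlobalMemoryStatusEx is the real Win32 call, but the budget policy here is a made-up illustration:

```cpp
// Sketch: query physical RAM and derive a streaming budget from it.
// GlobalMemoryStatusEx is a real Win32 API; the policy is hypothetical.
#include <windows.h>
#include <cstdint>
#include <cstdio>

int main() {
    MEMORYSTATUSEX status{};
    status.dwLength = sizeof(status);   // must be set before the call
    GlobalMemoryStatusEx(&status);      // fills total/available RAM

    const uint64_t totalGiB = status.ullTotalPhys >> 30;

    // Hypothetical policy: reserve ~4 GiB for the OS and other apps,
    // then hand streaming half of what remains (minimum 1 GiB).
    const uint64_t budgetGiB = totalGiB > 6 ? (totalGiB - 4) / 2 : 1;

    std::printf("Physical RAM: %llu GiB -> streaming budget: %llu GiB\n",
                (unsigned long long)totalGiB,
                (unsigned long long)budgetGiB);
    return 0;
}
```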



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

JEMC said:
Captain_Yuri said:

Last of Us Part 1 PC Requirements:

*specs*

Yea, the 32GB of RAM era is upon us.

I've read a wild, but not too ridiculous, theory about those specs. On one hand, it's possible that Sony and the studios that handle the ports of its exclusive games simply suck when it comes to optimizing for PC, resulting in games with ridiculously high demands.

On the other hand, the theory is that Sony is pressuring its studios and partners to put up these stupidly high PC requirements to make the PS5 look more powerful than it really is (the old "look what kind of PC you need to play this game that runs perfectly fine on our cheaper console"), with the secondary intention of trying to attract PC gamers to its console.

It's wild, I know, but so are those requirements.

I mean, either of those sounds extremely plausible from Sony's side, because even though they now want that PC money, they still ultimately want you to buy a PS5 and lock yourself into their already dominant console ecosystem, so making either poor ports or stupidly high hardware demands ends up making their console specs more enticing for those undecided.

Like, all this really does for me is make me skip buying the game day one and wait another 5 yrs till it's dirt cheap and I have a new rig, because their hardware requirements are becoming whack over time. I waited nearly 2 yrs for HZD, primarily due to its pricing and poor perf metrics, and with Days Gone I waited half a year because of its pricing, but heard it wasn't as bad as HZD, perf-wise.

It's just weird seeing only a tiny handful asking for 16GB, when literally no one else out there, even indie-dev wise, is asking for a fat 16GB in its entirety.

I think Sony should have to bear the burden of proof and show us what that 16GB gets us, and not with some 30 sec fluff trailer video either.



Step right up come on in, feel the buzz in your veins, I'm like an chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
JEMC said:

I've read a wild, but not too ridiculous, theory about those specs. On one hand, it's possible that Sony and the studios that handle the ports of its exclusive games simply suck when it comes to optimizing for PC, resulting in games with ridiculously high demands.

On the other hand, the theory is that Sony is pressuring its studios and partners to put up these stupidly high PC requirements to make the PS5 look more powerful than it really is (the old "look what kind of PC you need to play this game that runs perfectly fine on our cheaper console"), with the secondary intention of trying to attract PC gamers to its console.

It's wild, I know, but so are those requirements.

I mean, either of those sounds extremely plausible from Sony's side, because even though they now want that PC money, they still ultimately want you to buy a PS5 and lock yourself into their already dominant console ecosystem, so making either poor ports or stupidly high hardware demands ends up making their console specs more enticing for those undecided.

Like, all this really does for me is make me skip buying the game day one and wait another 5 yrs till it's dirt cheap and I have a new rig, because their hardware requirements are becoming whack over time. I waited nearly 2 yrs for HZD, primarily due to its pricing and poor perf metrics, and with Days Gone I waited half a year because of its pricing, but heard it wasn't as bad as HZD, perf-wise.

It's just weird seeing only a tiny handful asking for 16GB, when literally no one else out there, even indie-dev wise, is asking for a fat 16GB in its entirety.

I think Sony should have to bear the burden of proof and show us what that 16GB gets us, and not with some 30 sec fluff trailer video either.

That's the job of the reviewers, not Sony. The likes of Digital Foundry or TechPowerUp, who check how the games perform, are the ones that have to verify whether those games really need that much RAM or not.
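And that check is entirely doable on Windows: sample how much RAM a process has committed (psapi) and how much local VRAM the OS says the GPU is using versus its budget (DXGI 1.4). A rough sketch using the standard SDK headers; a real reviewer tool would OpenProcess the game's PID rather than measure itself:

```cpp
// Sketch: report committed system RAM for a process (psapi) and the
// local VRAM usage/budget Windows tracks for the GPU (DXGI 1.4).
#include <windows.h>
#include <psapi.h>
#include <dxgi1_4.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main() {
    // RAM committed by this process (a real tool would OpenProcess a
    // game's PID and pass that handle instead of GetCurrentProcess()).
    PROCESS_MEMORY_COUNTERS_EX pmc{};
    GetProcessMemoryInfo(GetCurrentProcess(),
                         reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                         sizeof(pmc));

    // Local (on-card) VRAM currently in use vs. the budget Windows grants.
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);
    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(IID_PPV_ARGS(&adapter3));

    DXGI_QUERY_VIDEO_MEMORY_INFO vmem{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &vmem);

    const double GiB = 1024.0 * 1024.0 * 1024.0;
    std::printf("RAM committed: %.2f GiB | VRAM used: %.2f of %.2f GiB budget\n",
                pmc.PrivateUsage / GiB, vmem.CurrentUsage / GiB,
                vmem.Budget / GiB);
    return 0;
}
```

Run something like that while the game plays and you'd know whether the 16GB requirement is real or padding.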



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

hinch said:
Captain_Yuri said:

I wouldn't say a 6700 XT is aging better than a 3070 in a lot of current titles unless you have proof. Twitter drama has certainly been hammering the VRAM limitations, but you need to be sure they aren't just trying to mislead people, so we need to make sure there are facts to back it up.

If we look at various recent titles, the 6700 XT is in fact not aging better than the 3070/Ti:

https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html

https://www.techpowerup.com/review/forspoken-benchmark-test-performance-analysis/5.html

https://www.techpowerup.com/review/the-callisto-protocol-benchmark-test-performance-analysis/5.html

You can see that while VRAM usage is high, the 3070 is in fact not performing worse than the 6700 XT. A lot of the drama that I have seen surrounding the VRAM issue is that when you enable ray tracing, some of these VRAM-limited GPUs are not performing as well as they should, which is true. But that doesn't mean RDNA 2 is performing better in RT-enabled titles either, as the RT performance on RDNA 2 is generally bad.

I do think a 7900 XT for the same price as a 4070 Ti is a better buy, though I'd wait and see if Nvidia responds with price drops as well.

I think "ages better" isn't the right phrase here. What I'm saying is that having more VRAM is better than running out and taking huge performance penalties. With newer titles using more, higher-quality textures, 8GB isn't going to cut it at 1440p. It was barely scraping the minimum requirements in 2020, and consoles have way, way more RAM. We had games going over that on cross-gen stuff, like RE7 with high-quality textures. And it looks like the bar is going to get higher as we progress to newer engines and games.

What I'm saying is that you don't want to be close to the edge with VRAM, because once you're at the limit and go over, you're going to have to reduce settings to make the game playable. Granted, it's not a magic bullet for performance, but having more is better than running out and getting single-figure FPS or massive frame drops due to lack of VRAM.
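The back-of-the-envelope math backs this up. The per-texture sizes below follow from the formats themselves (BC7 is 1 byte per texel, RGBA8 is 4, and a full mip chain adds about a third), but the scene composition is a made-up illustration, not measured game data:

```cpp
// Rough VRAM arithmetic for resident textures. Format sizes are standard;
// the texture count (300) is a hypothetical scene, not measured data.
#include <cstdint>
#include <cstdio>

// Bytes for a mip-mapped 2D texture: the full mip chain adds ~1/3.
uint64_t textureBytes(uint64_t w, uint64_t h, double bytesPerTexel) {
    return static_cast<uint64_t>(w * h * bytesPerTexel * 4.0 / 3.0);
}

int main() {
    const double GiB = 1024.0 * 1024.0 * 1024.0;
    const uint64_t bc7  = textureBytes(4096, 4096, 1.0); // ~21 MiB each
    const uint64_t rgba = textureBytes(4096, 4096, 4.0); // ~85 MiB each

    // Hypothetical 1440p scene with 300 resident 4K material textures.
    std::printf("300 x 4K BC7:   %.1f GiB\n", 300 * bc7 / GiB);  // ~6.2 GiB
    std::printf("300 x 4K RGBA8: %.1f GiB\n", 300 * rgba / GiB); // ~25 GiB
    return 0;
}
```

Even fully compressed, those textures alone eat most of an 8GB card before render targets, geometry, and whatever the OS reserves.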

Another one is the 3080 with its 10GB of VRAM, which was and is low for its performance tier and is already problematic in a few select titles at higher quality settings and resolutions.

What we're saying is that Nvidia has skimped on VRAM for years on the lower end of the stack to get people to go higher end. It's market segmentation, but it also seems a little like planned obsolescence for mid-range buyers: offering as close to the bare minimum for each tier as possible every generation.

I do agree with what you are saying about VRAM, and Nvidia has done this for pretty much every generation other than Pascal. But the point I am trying to make is that going with Radeon just because of VRAM isn't the answer either, because they have a lot of other issues that people aren't mentioning.

I am not saying that Nvidia's relatively low VRAM is okay or anything of that sort, cause yes, as you said, it is effectively planned obsolescence. But going into Radeon's camp, where the experience is bad in different ways, is not a good option either, imo.

We also don't know what next-gen engines will do, as we have yet to see any games with next-gen tech. Things like Sampler Feedback and DirectStorage for GPUs are supposed to significantly lower VRAM usage. But there's also a chance that devs won't use them, or will use them but spend the savings on even more enhancements. We won't know until games actually ship with that tech. Not to say we should buy VRAM-limited GPUs and put our hopes on them, but I also wouldn't buy Radeon and put our hopes on them providing a good experience either. It's just better to wait and see if Nvidia discounts their 4080s and see what next-gen engines do with VRAM-limited GPUs.
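For reference, this is roughly what the DirectStorage path looks like in code: a request streams compressed data from disk into a GPU buffer and has the GPU itself do the GDeflate decompression, which is part of where the hoped-for savings come from. A minimal sketch assuming the dstorage.h SDK and an existing D3D12 device; the file name and sizes are placeholders, and real code would also signal a fence to know when the load finishes:

```cpp
// Sketch: DirectStorage load with GPU (GDeflate) decompression.
// Assumes the DirectStorage SDK (dstorage.h, dstorage.lib) and an
// already-created D3D12 device; "textures.bin" is a placeholder.
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void loadTexture(ID3D12Device* device, ID3D12Resource* destBuffer,
                 uint32_t compressedSize, uint32_t uncompressedSize) {
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"textures.bin", IID_PPV_ARGS(&file));

    DSTORAGE_REQUEST req{};
    req.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    req.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    req.Source.File.Source          = file.Get();
    req.Source.File.Offset          = 0;
    req.Source.File.Size            = compressedSize;   // bytes on disk
    req.Destination.Buffer.Resource = destBuffer;       // VRAM destination
    req.Destination.Buffer.Offset   = 0;
    req.Destination.Buffer.Size     = uncompressedSize; // after decompress
    req.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&req);
    queue->Submit(); // decompression happens on the GPU, not the CPU
}
```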



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Chicho said:

Arc price has been reduced again. You can get  A750 shipped fron new egg for  $229 and a A770 for $269. How do you guys see them as entry mid level against the Nvidia and AMD options?

I do think that if Intel continues to support their GPUs for a long time, Arc will age better than both its Nvidia and AMD counterparts. Arc essentially has the feature set of Nvidia with 16GB of VRAM if you get the A770. It's still hard to recommend because of the drivers, but man, does it have big potential.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

PS: I am currently staying in a village where they use the rooster 🐓 to know when to wake up, with no Wi-Fi, and getting internet connectivity via phone ain't easy. 3G is essentially a luxury.

A far cry from the tech heaven that I call my home, but it has its own niceties. Suffice to say, along with time zone differences, I won't be able to reply within meaningful time frames.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850