
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Chazore said:
Jizz_Beard_thePirate said:

Yea, it's very hard to tell what to do with the brand at this point. Fighting against Nvidia is hard, especially when they have the same advantages you do, such as being able to use the same fabs, whereas in Intel vs Ryzen, Intel is forced to rely on its own fabs. But even if we take the architecture out of the equation, the leadership at Radeon continues to be terrible and endlessly makes bad decisions. It really feels like they need to get off social media and understand their audience.

Like this generation: yea, the 7900XTX is as fast as a 4080 and not a 4090. Fair enough. But what's the excuse for pricing the 7900XT at $900? It was brutally shat on by pretty much every reviewer out there. But alright, maybe that was some crypto nonsense; fast forward to today, though, and it's like... why is the 7700XT priced at $450 vs the 7800XT at $500? Every reviewer just compares the two and says: buy the 7800XT. Imagine the good press Radeon could have gotten if they had priced the lineup like this:

7900XTX - $1000
7900XT - $700 instead of $900
7800XT - $500
7700XT - $400 instead of $450

They would have basically had 4 very competitive and well-received GPUs. Instead the 7900XTX was mixed, the 7900XT was shat on, the 7800XT was well received, and the 7700XT was "just buy the 7800XT". Yea, Nvidia prices their GPUs pretty shittily, but they are the default choice and they have software advantages. Radeon's big failing is thinking they could sway buyers by using Nvidia's tactics, but Nvidia's tactics only work if you've had 80% of the market share for over a decade. They needed to be the "good guys" for at least 3-4 generations to build that customer base. Instead they are a laughing stock because the entire Radeon division cucked themselves every chance they got, outside of the 7800XT.

I know it may sound insane, but if I was running Radeon, I'd just keep on the straight and narrow: price my cards at least $200 cheaper than each Nvidia model, and focus on actually catching up on those missing features (also some new cooling solution, because the whole three-fans-facing-down design is getting pretty stale, whilst Nvidia for a few gens now has had that interesting "one fan up, one fan down" layout, which got some ppl actually clamouring for the Founders Editions).

Much as I'm really not a fan of the band-aid AI rigmarole, I feel like AMD really has no choice but to go that route, since Nvidia and Intel are doing so; just relying on FSR, being what it is, isn't cutting it (even Intel's own version is said to be better than FSR in some instances, and that sounds dire to me on AMD's side, who has spent longer at this than Intel has).

I could imagine next gen will be yet another easy win for Nvidia, but they could seriously crush the hell out of AMD if they decided to have a conscience and maybe lower their prices a bit, which would pretty much kill AMD's cheaper price point advantage (but I don't see that happening, since any market leader hardly bothers to do that). 

Pretty much. Nvidia's lineup has very glaring weaknesses as you go further down the stack that Radeon could exploit if they really wanted to. Nvidia really doesn't like giving their mid-to-low range a decent amount of VRAM. That range is also not very upscaling-friendly whether you use DLSS or FSR. Yea, DLSS is better than FSR, but both are notably worse than native at low resolutions, whereas at 1440p-4K, DLSS/FSR get massively better. So all Radeon needs to do is come out with an 8600XT with 12GB of VRAM for $250, an 8700XT with 16GB at $350, and an 8800XT with 20GB at $500. Leave anything higher than $500 to Nvidia, because the volume of sales is sub-$500. Then once you build a customer base and gain market share, you can use the added R&D for software or added features.

Because while the margins are at the high end, brand loyalty is in the low-to-mid end. Have 3-4 generations of excellent sub-$600 GPUs and that brand loyalty and market share will increase. This is how Ryzen has been so successful over the years: people got tired of Intel's shit and their dumb pricing. But who knows at this point.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:

Pretty much. Nvidia's lineup has very glaring weaknesses as you go further down the stack that Radeon could exploit if they really wanted to. Nvidia really doesn't like giving their mid-to-low range a decent amount of VRAM. That range is also not very upscaling-friendly whether you use DLSS or FSR. Yea, DLSS is better than FSR, but both are notably worse than native at low resolutions, whereas at 1440p-4K, DLSS/FSR get massively better. So all Radeon needs to do is come out with an 8600XT with 12GB of VRAM for $250, an 8700XT with 16GB at $350, and an 8800XT with 20GB at $500. Leave anything higher than $500 to Nvidia, because the volume of sales is sub-$500. Then once you build a customer base and gain market share, you can use the added R&D for software or added features.

Because while the margins are at the high end, brand loyalty is in the low-to-mid end. Have 3-4 generations of excellent sub-$600 GPUs and that brand loyalty and market share will increase. This is how Ryzen has been so successful over the years: people got tired of Intel's shit and their dumb pricing. But who knows at this point.

I think my biggest gripe with Nvidia the past 2 gens has gotta be their focus on AI to carry the workload, and the fact that they only have two high-end cards that can really tackle modern AAA games; the rest either can't perform very well, or absolutely need DLSS just to keep up slightly with the two highest-end models (it really did not help that this gen they fucked around with the numbering, like the 80 not actually being an 80 and the 90 being the real 80 series).

Yeah, looking at that recent PL benchmark testing from Daniel, anything below 1440p using DLSS or FSR just doesn't look all that great, and well, even with 2077 the gains aren't that grand either; only when you see him use FG+RR with the 4080/4090 do you see the stupidly large gains, which means AMD/Intel are up shit creek in that arena.

(His benches are great, but seriously depressing to see only the 4080/90 being the ultimate stand-out cards in his entire test run).

Yeah, I do wish AMD would try emulating what they did with Ryzen, because hot damn did that turn the tide and earn a load of goodwill (I mean it's working on me, I absolutely want my next CPU to be an AMD one, because I'm just done with Intel's prices, especially their dumbass naming schemes and numbering).



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
Jizz_Beard_thePirate said:

Pretty much. Nvidia's lineup has very glaring weaknesses as you go further down the stack that Radeon could exploit if they really wanted to. Nvidia really doesn't like giving their mid-to-low range a decent amount of VRAM. That range is also not very upscaling-friendly whether you use DLSS or FSR. Yea, DLSS is better than FSR, but both are notably worse than native at low resolutions, whereas at 1440p-4K, DLSS/FSR get massively better. So all Radeon needs to do is come out with an 8600XT with 12GB of VRAM for $250, an 8700XT with 16GB at $350, and an 8800XT with 20GB at $500. Leave anything higher than $500 to Nvidia, because the volume of sales is sub-$500. Then once you build a customer base and gain market share, you can use the added R&D for software or added features.

Because while the margins are at the high end, brand loyalty is in the low-to-mid end. Have 3-4 generations of excellent sub-$600 GPUs and that brand loyalty and market share will increase. This is how Ryzen has been so successful over the years: people got tired of Intel's shit and their dumb pricing. But who knows at this point.

I think my biggest gripe with Nvidia the past 2 gens has gotta be their focus on AI to carry the workload, and the fact that they only have two high-end cards that can really tackle modern AAA games; the rest either can't perform very well, or absolutely need DLSS just to keep up slightly with the two highest-end models (it really did not help that this gen they fucked around with the numbering, like the 80 not actually being an 80 and the 90 being the real 80 series).

Yeah, looking at that recent PL benchmark testing from Daniel, anything below 1440p using DLSS or FSR just doesn't look all that great, and well, even with 2077 the gains aren't that grand either; only when you see him use FG+RR with the 4080/4090 do you see the stupidly large gains, which means AMD/Intel are up shit creek in that arena.

(His benches are great, but seriously depressing to see only the 4080/90 being the ultimate stand-out cards in his entire test run).

Yeah, I do wish AMD would try emulating what they did with Ryzen, because hot damn did that turn the tide and earn a load of goodwill (I mean it's working on me, I absolutely want my next CPU to be an AMD one, because I'm just done with Intel's prices, especially their dumbass naming schemes and numbering).

Yea, I personally don't use DLSS that often. I'll use it on a game like Cyberpunk cause it looks great, but for the majority of games I leave DLSS off and play native at 4K. But honestly, back in the good old days, I never really felt like I needed a "Halo Product" to get everything from a generation. When I got my GTX 970 for $350, I was blown away by the performance. Yea, you had the 980 and the Titan, but I was like, meh, who needs that when my 970 can play 1440p easily. I eventually upgraded to a 1080, and even that wasn't too expensive unless you got the Founders Edition.

Now it feels like unless you get a 4090, you get cucked in some way. Whether it's VRAM from Nvidia or software features from AMD, it just feels lackluster. A GTX 780 had 3GB of VRAM, but a 970 was not only faster than the 780 Ti, it had 3.5GB of (usable) VRAM. A 1070 was not only as fast if not faster than a 980 Ti, it had 8GB of VRAM instead of the 980 Ti's 6GB. Now it's like: sure, a 3070 was competitive against the 2080 Ti, but it had 8GB of VRAM to the 2080 Ti's 11GB. And a 4070 is barely as fast as a 3080, and while it technically has more VRAM, that only counts if you compare it against the 10GB version and not the 12GB version.

It all just feels lame unless you spend the big bucks. And going AMD feels like you aren't getting the experience a modern PC GPU should be giving you when you look at Nvidia's feature sets. And yea, I have been a Ryzen owner since Zen 1 because I got ticked off by what Intel has been doing. I might switch back to Intel if Zen 5 doesn't cut it, but I'll stay AMD if Zen 5 is very close.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

After driving all the goodwill Radeon has accrued over the years into the ground:




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Does a PSU rating actually make any difference in the electricity bill? As far as I know, most countries charge residences based on real power, not apparent power. So in theory your PC can be as wildly power-draw inefficient as you like without paying more for it.

(Although the rating should work as a more general indicator of the quality of the PSU's components, so don't skimp on that.)




haxxiy said:

Does a PSU rating actually make any difference in the electricity bill? As far as I know, most countries charge residences based on real power, not apparent power. So in theory your PC can be as wildly power-draw inefficient as you like without paying more for it.

(Although the rating should work as a more general indicator of the quality of the PSU's components, so don't skimp on that.)

I think you have a misunderstanding of what PSU power efficiency means. If your computer requires 500 watts and your PSU is 80% efficient at 500 watts, then the PSU needs to draw 625 watts from the wall to deliver 500 watts to your computer. If it's 90% efficient, it only needs to draw about 556 watts from the wall. Your real power draw from the wall is therefore about 70 watts lower, reducing your power bill slightly.
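To put numbers on it, here's a minimal sketch of that calculation (the load, hours of use, and electricity price are placeholder assumptions, not anyone's actual figures):

```python
def wall_draw(load_w: float, efficiency: float) -> float:
    """Watts drawn from the wall for a given component load and PSU efficiency."""
    return load_w / efficiency

load = 500.0                     # watts the components need (assumed)
draw_80 = wall_draw(load, 0.80)  # 625 W from the wall
draw_90 = wall_draw(load, 0.90)  # ~556 W from the wall
saved_w = draw_80 - draw_90      # ~69 W difference

# Rough yearly cost of that difference, assuming 4 hours/day at full load
# and $0.15/kWh (both placeholder figures):
kwh_saved = saved_w / 1000 * 4 * 365
print(f"wall draw at 80%: {draw_80:.0f} W, at 90%: {draw_90:.0f} W")
print(f"saved: {saved_w:.0f} W, roughly ${kwh_saved * 0.15:.2f} per year")
```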



Jizz_Beard_thePirate said:

It all just feels lame unless you spend the big bucks. And going AMD feels like you aren't getting the experience a modern PC GPU should be giving you when you look at Nvidia's feature sets. And yea, I have been a Ryzen owner since Zen 1 because I got ticked off by what Intel has been doing. I might switch back to Intel if Zen 5 doesn't cut it, but I'll stay AMD if Zen 5 is very close.

Oh god yeah, especially when I looked at those PL benchmarks, you were just not getting any of that with AMD. They looked so far behind it's embarrassing, and hearing that next gen won't be featuring a high-end model is just going to make next gen's benches look far worse for them.

That being said, without a new high-end model, I can definitely see them going all-in on FSR 3.5 and the rest of their kit with their new "bundled" mode (the one that'll turn on all the other features like Anti-Lag), and hoping the bench gods give them something of a result. They won't be able to claim a 40-50% performance uplift next gen though, not without a high-end card to prove it, and no, upscaling shouldn't even remotely count, because we know that when you turn it off, your frames go to shit (we've seen numerous games do this even without RT, even the bad ports).

Personally I'm stuck between choosing the 5800X3D or the 7800X3D. I know with the latter I'm paying that little bit more, but hey, the benches also look that bit more favourable for the 7800X3D than for the former (which seems to be ageing a bit now). I just wish Intel would go back to normal naming and keep the line simple (low end, mid-range, high end, and a few K variants per line, that's it).



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

haxxiy said:

Does a PSU rating actually make any difference in the electricity bill? As far as I know, most countries charge residences based on real power, not apparent power. So in theory your PC can be as wildly power-draw inefficient as you like without paying more for it.

(Although the rating should work as a more general indicator of the quality of the PSU's components, so don't skimp on that.)

For me personally, no. I've gone through it with my local electricity company, and it turns out that devices like my washing machine/tumble dryer, electric heater and microwave combined use up more than my desktop powering my CPU/GPU/monitor.

I'm UK-based, so I'm not sure about the rest of the world, but my desktop isn't the electricity guzzler I thought it would be.

Last edited by Chazore - on 26 September 2023

Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Cyran said:
haxxiy said:

Does a PSU rating actually make any difference in the electricity bill? As far as I know, most countries charge residences based on real power, not apparent power. So in theory your PC can be as wildly power-draw inefficient as you like without paying more for it.

(Although the rating should work as a more general indicator of the quality of the PSU's components, so don't skimp on that.)

I think you have a misunderstanding of what PSU power efficiency means. If your computer requires 500 watts and your PSU is 80% efficient at 500 watts, then the PSU needs to draw 625 watts from the wall to deliver 500 watts to your computer. If it's 90% efficient, it only needs to draw about 556 watts from the wall. Your real power draw from the wall is therefore about 70 watts lower, reducing your power bill slightly.

That's exactly what I meant.

Most people *think* they would incur the cost of that extra wattage, but that's just apparent power, which residences don't pay for in most places.

So unless you're running your PC on the site of some industrial enterprise (which would have to pay for that apparent power), it shouldn't reduce your power bill.
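For what it's worth, a minimal sketch of the two quantities being debated here, with purely illustrative numbers: as I understand it, PSU conversion losses are real power (they leave the PSU as heat), while power factor is what separates real power from apparent power.

```python
# Illustrative numbers only, not anyone's actual setup.
load_w = 500.0          # real power the components consume
efficiency = 0.80       # PSU efficiency at that load
power_factor = 0.95     # typical-ish for a PSU with active PFC

real_draw_w = load_w / efficiency         # 625 W of real power at the wall
loss_w = real_draw_w - load_w             # 125 W dissipated as heat in the PSU
apparent_va = real_draw_w / power_factor  # ~658 VA of apparent power

# A residential meter records real energy (kWh), so the full 625 W,
# conversion losses included, shows up on the bill; the extra volt-amperes
# from a power factor below 1 typically don't.
print(f"real: {real_draw_w:.0f} W, loss: {loss_w:.0f} W, apparent: {apparent_va:.0f} VA")
```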




The Tuesday news:

SALES / PLAYER COUNT & DEALS

Steam's Top 10

(Click here for the top 100)

I didn't see it yesterday, but GOG has a new Weekly Sale with discounts of up to 90% for 6 days: https://www.gog.com/en/promo/20230925_weekly_sale.

Steam has new sales and weeklong deals:

The Humble Store has two new sales:

And Fanatical hasn't launched its new Star Deal yet, but at least they have a new bundle: the Mystery Vault Bundle, with up to 20 unknown games to get: https://www.fanatical.com/en/bundle/mystery-vault-bundle.

SOFTWARE & DRIVERS

Valve launches SteamVR 2.0 beta with 'goal of bringing all of what's new on the Steam platform into VR'
https://www.pcgamer.com/valve-launches-steamvr-20-beta-with-goal-of-bringing-all-of-whats-new-on-the-steam-platform-into-vr/
Great news, VR fans. Valve has just released SteamVR 2.0 in beta. This is the software that powers the entire virtual reality experience on Steam, from hooking up to hardware to the interface you use to whizz around virtual worlds, and it just got a ton more features.
"Today we are shipping SteamVR 2.0 in beta," a Steam representative says. "We see this as the first major step toward our goal of bringing all of what's new on the Steam platform into VR."
Today's beta release adds Steam Chat and Voice Chat, an improved Steam store with VR front and centre, and an updated keyboard. It also adds "most of" the current features from Steam and Steam Deck. That's potentially pretty big: since the launch of the Steam Deck, we've seen both the Steam Deck and Steam receive heaps of improvements, most notably to Big Picture mode.

MODS, EMULATORS & FAN PROJECTS

Disk Cache Enabler Mod released for Skyrim, Fallout 4 & Starfield
https://www.dsogaming.com/pc-performance-analyses/disk-cache-enabler-mod-released-for-skyrim-fallout-4-starfield/
Modder ‘Archost’ has released a must-have mod for everyone who currently plays Fallout 4, The Elder Scrolls V: Skyrim Special Edition or Starfield.
This plugin basically makes these three Bethesda games use the OS's file cache, which leads to less disk access over time. According to the modder, owners of both SSDs and HDDs will benefit from it. Since the plugin reduces disk calls, it can lead to less freezing and fewer sound drops.
(...)
Since this plugin can improve overall performance, we highly recommend using it. In theory, it could resolve or at least minimize Starfield’s stuttering issues. So, those interested can go ahead and download it from the following links.
>> There are no screenshots or videos of these mods.
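>> For the curious, the mechanism the modder is leaning on is the OS page cache: the first read of a file comes off the disk, but repeat reads are usually served straight from RAM. A minimal sketch to see the effect yourself (the file path is a placeholder):

```python
import time

def timed_read(path: str) -> float:
    """Read a file in 1 MiB chunks and return the elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(1 << 20):
            pass
    return time.perf_counter() - start

path = "some_large_file.bin"  # placeholder; point this at any big local file
cold = timed_read(path)       # first pass may actually hit the disk
warm = timed_read(path)       # repeat pass is usually served from the page cache
print(f"cold: {cold:.3f}s, warm: {warm:.3f}s")
```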

A nasty Johnny Silverhand bug nearly derailed my Cyberpunk 2077 playthrough, but this mod came to the rescue
https://www.pcgamer.com/cyberpunk-johnny-blue-glitch-how-to-fix/
When V started coughing up blood during a conversation with Johnny Silverhand, I thought Cyberpunk 2077 was really trying to drive home the urgency of my ticking time bomb of a brain. Throughout the next quest, and the one after that, my entire field of vision kept pulsing with the glitchy blue effect that highlighted Johnny popping into a conversation. It felt like a bit much, but I was really getting into the roleplaying, so I figured heading home to V's apartment for a night's rest would set me right. When it didn't, I asked my colleagues how they got rid of "Johnny vision" because clearly I'd missed something.
That's when I got the bad news from Phantom Liberty reviewer Ted Litchfield: "I don't think I've ever seen that happen before."
Uh oh.
(...)
Unfortunately, that solution was a no-go for me: I'd stubbornly played about three hours of Cyberpunk 2077 with "Johnny vision" slowly driving me mad, and I wasn't about to throw away that progress and redo all of it. So I sought salvation from the only logical source: the mod community.
I found it almost immediately. The mod FX Begone, which supports the 2.0 update, targets and removes a ton of Cyberpunk's visual effects. It can disable the green hue of your scanner, the red glitchiness that clouds your screen when you take big damage, and, crucially, the "Johnny glitch," which removed the constant blue effect from my screen.
>> The article has a video of the glitch.

Starfield mod lets you zip between planets seamlessly
https://www.pcgamer.com/starfield-mod-lets-you-zip-between-planets-seamlessly/
Starfield's space travel isn't exactly satisfying: no matter how much the music swells and the engines roar, travel itself is just a series of short cinematic sequences and loading screens. No one was expecting Starfield to be a sim like Elite Dangerous or a seamless experience like No Man's Sky, but I think a lot of players agree Starfield's travel system is just a little… meh. You never feel like you're really flying anywhere.
A new Starfield mod addresses this issue, sorta, and fixes it, kinda. At the very least, it gives you a different way to fly your ship. As spotted by PCGamesN, the mod is called Slower Than Light - Fly in a Star System. Essentially if you're in orbit around one planet or moon in a star system, you can now fly directly to another planet or moon in the same system. Seamlessly. No cinematics or loading screens at all. It's pretty cool.
>> There's a video of the mod that's over 7 minutes long.

GAMING NEWS

Starfield Update 1.7.33 released, fixes blurry textures & AMD star lens flares, improves stability, full patch notes
https://www.dsogaming.com/patches/starfield-update-1-7-33-released-fixes-blurry-textures-amd-star-lens-flares-improves-stability-full-patch-notes/
Bethesda has just released Patch 1.7.33 for Starfield, and shared its complete list of changes and improvements. According to the changelog, Update 1.7.33 packs various stability and performance improvements to address crashing and freezes.

Final Fantasy 7: Ever Crisis is officially coming to PC via Steam
https://www.dsogaming.com/news/final-fantasy-7-ever-crisis-is-officially-coming-to-pc-via-steam/
Square Enix has announced that Final Fantasy 7: Ever Crisis will be officially coming to PC via Steam. Although there isn’t any ETA on when it will come out, the company claimed that it has already started working on its PC version.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K@stock (for now), 16GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.