
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Zkuq said:

So, I assembled my new PC last Friday, but the front audio jacks weren't working. I even disconnected and reconnected the connector once, and it didn't do anything. Well, I did it once more, this time trying to position the cable differently. It worked, so I put it where it was originally, and it was still working (and still is). It's a bit of a tight fit, so the cable's a bit twisted, but it's twisted a bit differently now, so maybe that was the cause. I kind of suspect the issue is actually that I didn't line up the connector correctly and missed the other row of pins entirely... twice, since I had one earlier attempt at fixing the issue. I almost managed to do that now too, which leads me to think this is what was causing the issue. Definitely not my proudest moment if that's the case, and also not something I expected to go wrong.

It's always the little things that end up causing the most annoying troubles. But at least you managed to fix it.

haxxiy said:

The case-to-mobo connectors are a nightmare to get right every time.

I only know three things for certain when I get a new case: it will have good airflow, the bottom dust filter will be removable from the front (or side), and it will come with this:

And I'm willing to give up the bottom filter requirement to get a case with one of those.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

hinch said:

Hardware Unboxed deep dive into FSR 3.0

Not bad. A slightly scuffed launch for the tech, but there are positives: less CPU overhead than DLSS 3, and it works decently for generating frames with competitive input lag (though higher than DLSS 3). Downsides: image quality still leaves a lot to be desired due to FSR's current upscaling capabilities, and there are frame-pacing issues. Like DLSS 3 it needs a good base framerate, and there are caveats like no FreeSync support and needing to turn on Vsync for a smoother experience. They don't really recommend it in its current state.

They'll improve on it, but meh, a lot of things to give up to use it right now. Plus you need to have upscaling turned on for FG to work properly. Then there's no access to Reflex for Nvidia users and no FreeSync/G-Sync support. Should've just delayed it to fix those issues tbh, especially frame pacing, as that can't make for a good experience.

This has "don't bother with public release, just let it stew in the office" written all over it. All those caveats and downsides just continue to make FSR not all that worth it. Frame timing has gone from a lesser concern to a top priority for me in my games.



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

AMD Might Launch Radeon RX 7600 XT Graphics Card With Massive 16 GB Memory

https://wccftech.com/amd-might-launch-radeon-rx-7600-xt-graphics-card-with-massive-16-gb-memory/

Nvidia: Let's launch a 3050 with 6GB
AMD: Let's launch a 7600 XT with 16GB

Nintendo Switch 2 to Release on September 24th, 2024 With Two Models Priced at and Above $400 – Rumor

https://wccftech.com/nintendo-switch-2-september-release-two-models/

Take it with a big grain of salt.

Intel Demos Meteor Lake iGPU 8K60 & SOC Tile E-Core Only 1080P Video Playback Capabilities

https://wccftech.com/intel-demos-meteor-lake-igpu-8k60-soc-tile-e-core-only-1080p-video-playback/

Ranking All Current GPUs from Worst to Best

A fun little thing they did that's not serious. The list does make sense, even though it feels a bit ludicrous that the #1 GPU costs $1,600. But the 4090 is the only GPU this generation that gets a lot of things right and truly moves things forward, even if it comes at a hefty price. And the 7800 XT being second makes sense, as that GPU really disrupted the mid-range segment, forcing Nvidia to lower both 4070 and 4060 Ti prices multiple times.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

hinch said:

Hardware Unboxed deep dive into FSR 3.0

Not bad. A slightly scuffed launch for the tech, but there are positives: less CPU overhead than DLSS 3, and it works decently for generating frames with competitive input lag (though higher than DLSS 3). Downsides: image quality still leaves a lot to be desired due to FSR's current upscaling capabilities, and there are frame-pacing issues. Like DLSS 3 it needs a good base framerate, and there are caveats like no FreeSync support and needing to turn on Vsync for a smoother experience. They don't really recommend it in its current state.

They'll improve on it, but meh, a lot of things to give up to use it right now. Plus you need to have upscaling turned on for FG to work properly. Then there's no access to Reflex for Nvidia users and no FreeSync/G-Sync support. Should've just delayed it to fix those issues tbh, especially frame pacing, as that can't make for a good experience.

There is no better sales team for Nvidia products than Radeon themselves. Nvidia makes something proprietary, claiming it requires specialized hardware to work. AMD announces they are making an open version that works on every GPU, thus proving Nvidia is a big bad evil corp. AMD's solution releases and is objectively much worse than Nvidia's. Customers see the reviews and justify spending the premium on Nvidia products, as AMD just proved Nvidia's claims correct.

Honestly, AMD's tactics need to change. Their idea this entire generation seems to be "FreeSync and VRR killed G-Sync, so if we make an open standard for every Nvidia technology, it should kill off the adoption of and need for their tech." But the thing is, FreeSync and VRR aren't significantly worse than G-Sync. The G-Sync module has its advantages, such as being able to go down to 1 Hz and allowing for ULMB, but FreeSync/VRR can do variable refresh just as well as G-Sync if you stay within its window, and it allows for HDMI 2.1 support, which the G-Sync module still doesn't.

FSR 1/2/3, on the other hand, effectively makes your games look and play much worse than DLSS. FSR has little to no advantage over upscaling solutions that already exist, and the thing is that PC gamers do not want that. Why? Because PC gamers shat on console gamers for generations for using upscaling tech similar to FSR. DLSS is different because it really is the next generation of upscaling tech, one that consoles and Radeon products simply do not have access to. And it's like, well, if I am going to pay more than a console costs for a GPU, why wouldn't I get one with upscaling tech that is better than the rest? Cause god knows game devs aren't optimizing shit this generation.

Hopefully Herkelman's successor finds a way for Radeon to start innovating, because playing second fiddle is not working for them in the PC space. The marketing does not work because Nvidia is the default choice, and it's known that people do more research before buying the "second choice"; if reviewers keep saying Nvidia has the better tech, people are generally willing to pay the premium for it.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

The one good thing about all this is Anti-Lag+.

While it doesn't work with Frame Gen on as of yet, the latency reduction is insane, similar to what Reflex achieves. AMD just needs to make it work with their older Radeon GPUs instead of only RDNA 3, similar to how Reflex works all the way back to Maxwell GPUs.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


Dear God, $400 for a Nintendo console... and that's the US price; here in Europe it will end up being €440-450, if not even more.

If true, I'll be curious to see the sales numbers after launch.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

By the way, talking about handhelds:

Twitter sleuths suggest a new Steam Deck is on its way, but any updates are likely to be for Valve's benefit not ours
https://www.pcgamer.com/twitter-sleuths-suggest-a-new-steam-deck-is-on-its-way-but-any-updates-are-likely-to-be-for-valves-benefit-not-ours/
Valve is well known for keeping any information about its hardware plans very tightly under wraps, to the point where you might as well hire the world's best psychic to forecast when something new is on its way. But never underestimate the sheer diligence of tech enthusiasts, especially the likes of Brad Lynch, who's spotted some interesting snippets of potential changes or updates for the Steam Deck.

His recent posts (as reported by The Verge) revolve around the so-called Model 1030 which looks to be a clear update of some kind for Valve's handheld gaming PC. It's worth pointing out that no matter what—if any—changes are forthcoming, 1030 is absolutely not a Steam Deck 2. Valve has been crystal clear on this matter, so if you're hoping that something with more performance is just around the corner, you're in for a long wait.

And, for those looking for OLED monitors:

Rtings latest OLED monitor burn-in tests are not good news for Samsung
https://www.pcgamer.com/rtings-latest-oled-monitor-burn-in-tests-are-not-good-news-for-samsung/
Rtings is four months and about 2,000 hours into its OLED PC monitor burn-in testing and the results so far do not look good for monitors with Samsung's QD-OLED panels.

Admittedly, Rtings only has three monitors on test, two with Samsung QD-OLED tech and one with LG WOLED tech. But both of the Samsung-equipped monitors are showing signs of burn-in, while the LG model appears to have avoided any image retention.

(...)

For the record, the models in the test are the Alienware 34 AW3423DWF, the LG Ultragear 27GR95QE-B and the Samsung Odyssey OLED G8 G85SB.(...)



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Jizz_Beard_thePirate said:
hinch said:

Hardware Unboxed deep dive into FSR 3.0

Not bad. A slightly scuffed launch for the tech, but there are positives: less CPU overhead than DLSS 3, and it works decently for generating frames with competitive input lag (though higher than DLSS 3). Downsides: image quality still leaves a lot to be desired due to FSR's current upscaling capabilities, and there are frame-pacing issues. Like DLSS 3 it needs a good base framerate, and there are caveats like no FreeSync support and needing to turn on Vsync for a smoother experience. They don't really recommend it in its current state.

They'll improve on it, but meh, a lot of things to give up to use it right now. Plus you need to have upscaling turned on for FG to work properly. Then there's no access to Reflex for Nvidia users and no FreeSync/G-Sync support. Should've just delayed it to fix those issues tbh, especially frame pacing, as that can't make for a good experience.

There is no better sales team for Nvidia products than Radeon themselves. Nvidia makes something proprietary, claiming it requires specialized hardware to work. AMD announces they are making an open version that works on every GPU, thus proving Nvidia is a big bad evil corp. AMD's solution releases and is objectively much worse than Nvidia's. Customers see the reviews and justify spending the premium on Nvidia products, as AMD just proved Nvidia's claims correct.

Honestly, AMD's tactics need to change. Their idea this entire generation seems to be "FreeSync and VRR killed G-Sync, so if we make an open standard for every Nvidia technology, it should kill off the adoption of and need for their tech." But the thing is, FreeSync and VRR aren't significantly worse than G-Sync. The G-Sync module has its advantages, such as being able to go down to 1 Hz and allowing for ULMB, but FreeSync/VRR can do variable refresh just as well as G-Sync if you stay within its window, and it allows for HDMI 2.1 support, which the G-Sync module still doesn't.

FSR 1/2/3, on the other hand, effectively makes your games look and play much worse than DLSS. FSR has little to no advantage over upscaling solutions that already exist, and the thing is that PC gamers do not want that. Why? Because PC gamers shat on console gamers for generations for using upscaling tech similar to FSR. DLSS is different because it really is the next generation of upscaling tech, one that consoles and Radeon products simply do not have access to. And it's like, well, if I am going to pay more than a console costs for a GPU, why wouldn't I get one with upscaling tech that is better than the rest? Cause god knows game devs aren't optimizing shit this generation.

Hopefully Herkelman's successor finds a way for Radeon to start innovating, because playing second fiddle is not working for them in the PC space. The marketing does not work because Nvidia is the default choice, and it's known that people do more research before buying the "second choice"; if reviewers keep saying Nvidia has the better tech, people are generally willing to pay the premium for it.

The problem is, what other choice do they have?

Nvidia doesn't share its technology, so AMD needs to make its own version every single time, otherwise they won't be competitive anymore. But since AMD is much smaller than Nvidia, an open standard is the only way they can do it without falling too far behind, as other companies and people can then help them catch up again.

AMD also can't really innovate right now, as they simply have their hands full with catching up. AMD would need to massively expand their GPU department to be able to innovate in the current situation, and I'm not sure there's even enough talent on the market (good or bad) to achieve that.

According to Zippia, AMD has around 15,500 employees, a big part of which is of course in the CPU department and semi-custom, and has about 1,260 job openings right now. Nvidia, on the other hand, has over 26,000 employees, most of whom are working on GPUs. Nvidia has also massively expanded its workforce, basically doubling it since 2020 and more than tripling it since 2018, so there won't be much left for AMD to hire either way.



hinch said:

Hardware Unboxed deep dive into FSR 3.0

Not bad. A slightly scuffed launch for the tech, but there are positives: less CPU overhead than DLSS 3, and it works decently for generating frames with competitive input lag (though higher than DLSS 3). Downsides: image quality still leaves a lot to be desired due to FSR's current upscaling capabilities, and there are frame-pacing issues. Like DLSS 3 it needs a good base framerate, and there are caveats like no FreeSync support and needing to turn on Vsync for a smoother experience. They don't really recommend it in its current state.

They'll improve on it, but meh, a lot of things to give up to use it right now. Plus you need to have upscaling turned on for FG to work properly. Then there's no access to Reflex for Nvidia users and no FreeSync/G-Sync support. Should've just delayed it to fix those issues tbh, especially frame pacing, as that can't make for a good experience.

At 19:20: these latency numbers are terrible overall (between 104 and 163 ms). I believe "Vsync on" is partly to blame here.



Jizz_Beard_thePirate said:

Timmy bit off more than he could chew. He tried to go up against Valve, and buying up all those exclusives for EGS has been a huge money sink. The theory was that if people simply used EGS, they would stick around and not bother with Valve. But that backfired very hard, as all it did was waste hundreds of millions of dollars with minimal user retention. Then they tried to go up against Apple, which ended up with them getting cucked in court. Then with UE5, the engine has largely been a shit show thus far, and the games coming out on it are flopping, which means little to no revenue from game sales. And of course, Fortnite has been declining year over year, if I remember correctly.

So now Timmy is at a point where he can no longer rely on Fortnite and Unreal Engine money to fuel his projects. All his prospects have gone to shit, and now he is firing people in the hopes that things will change. Some things might get better, like UE5, while other things will continue to go down, like EGS and Fortnite. Hopefully he learned his lesson, but I doubt it.

I sure did get a nice collection of games out of it.