
Forums - PC Discussion - Carzy Zarx’s PC Gaming Emporium - Catch Up on All the Latest PC Gaming Related News

Friday news, part two:

Blood Bowl 3 has orc nipple tassels, ultraviolence, and a proper release date
https://www.pcgamer.com/blood-bowl-3-has-orc-nipple-tassels-ultraviolence-and-a-proper-release-date/
After many delays, Blood Bowl 3 has a proper release date. First announced in 2019 with a launch slated for the following year, a new trailer has announced that Cyanide Studio's absurd death sport will finally return to us on February 23, 2023. Talk about overtime.

CD Projekt's next Witcher trilogy has a new game director
https://www.pcgamer.com/cd-projekts-next-witcher-trilogy-has-a-new-game-director/
Cyberpunk 2077 animation director Sebastian Kalemba has revealed on Twitter that he's taken the role of game director in CD Projekt's next Witcher game.
"Career news: I’m directing the new Witcher Saga," Kalemba tweeted. "Since joining @CDPROJEKTRED I believe nothing is impossible and raising the bar, telling emotional stories & creating worlds is what we’re here for. I’m proud to be part of CDPR and work with such a talented and passionate team."

PC Gamer readers can get special access to this PvP strategy game from ex-Blizzard veterans
https://www.pcgamer.com/spellcraft-alpha-access/
Spellcraft, the tabletop-minis-inspired PvP strategy game from ex-Blizzard studio One More Game, is entering a closed gameplay alpha on Friday, November 4. While most players will have to go through the standard closed-alpha selection process, PC Gamer readers have the opportunity for a guaranteed spot in the preview build.

This new multiplayer survival RPG will scratch your Valheim itch this month
https://www.pcgamer.com/this-new-multiplayer-survival-rpg-will-scratch-your-valheim-itch-this-month/
Open world fantasy survival RPG Frozen Flame has given players a few chances to check it out recently, with a popular demo during Steam Next Fest and an open beta over Halloween weekend. If you played and enjoyed it (or are bummed you missed out) you won't have to wait much longer to dive into the colorful fantasy world. Frozen Flame is launching into Steam Early Access on November 17, which is two weeks from now.

Modern Warfare 2 players are already taking sides in the battle over aim assist
https://www.pcgamer.com/modern-warfare-2-players-are-already-taking-sides-in-the-battle-over-aim-assist/
Voodoo PC founder Rahul Sood once claimed, many years ago, that Microsoft had hastily abandoned early experimentation with crossplay because PC players, armed with the precision and speed of mouse and keyboard, routinely "destroyed" controller-equipped console players (https://www.rahulsood.com/2010/07/console-gamers-get-killed-against-pc.html). Things have changed a great deal since then thanks to the advent of aim assist, and now we're at a point where some Modern Warfare 2 players on PC are complaining that their console counterparts are simply too dominant.
A number of threads have appeared recently in the Modern Warfare 2 subreddit, many accompanied by videos purporting to show that aim assist is grossly overpowered—to the point of being outright broken, in the opinion of some. There are even complaints that aim assist tracks targets through smoke and fire, when enemy targets are partially obscured.

Ghost Trick, which has the best dog in videogames, has been rated for PC in Korea
https://www.pcgamer.com/ghost-trick-which-has-the-best-dog-in-videogames-has-been-rated-for-pc-in-korea/
Once again the Game Rating and Administration Committee of Korea seems to have come through with a hint about an upcoming game. It was thanks to them we got an early clue about the Mass Effect remaster, as well as Sunset Overdrive's PC port, and a Silent Hill game was classified there a month ahead of the series' revival being made public. Now a PC version of Ghost Trick: Phantom Detective—a Capcom puzzle adventure developed for the Nintendo DS and released outside Japan in 2011—has been rated in Korea, as spotted by Gematsu.

Modern Warfare 2 composer departs project and disowns soundtrack over 'challenging' work dynamic with audio director
https://www.pcgamer.com/modern-warfare-2-composer-departs-project-and-disowns-soundtrack-over-challenging-work-dynamic-with-audio-director/
The composer behind Modern Warfare 2 has announced that she's terminating her involvement with the game. In a statement posted to Twitter, Sarah Schachner said that, owing to an "increasingly challenging" working dynamic with MW2's audio director, she's unable to continue working on the music and soundtrack release for MW2 and Warzone, so she's washing her hands of the whole thing.

Gears of War was sold to Microsoft because 'Epic didn't really know what to do' with the series
https://www.pcgamer.com/gears-of-war-was-sold-to-microsoft-because-epic-didnt-really-know-what-to-do-with-the-series/
Cliff Bleszinski, formerly of Epic and lead designer on the first three Gears of War games, has shared his opinion on why the series was subsequently sold to Microsoft. In short, Epic thought it had done everything it could with the games. After the spinoff Gears of War: Judgment (co-developed with People Can Fly), the series was sold to Microsoft in 2014, which then formed The Coalition to develop future entries.
>> Keep in mind that this is Cliff's opinion, which, at this point, doesn't mean a lot.

Disney Dreamlight Valley 'fixes' Moana dress by making it look nothing like Moana
https://www.pcgamer.com/disney-dreamlight-valley-fixes-moana-dress-by-making-it-look-nothing-like-moana/
Gameloft pushed out a confusing "bug fix" for Disney Dreamlight Valley earlier this week, making a major change to the final reward gifted to you from seafaring princess Moana.

Hideo Kojima says the Abandoned conspiracies are 'a nuisance'
https://www.pcgamer.com/hideo-kojima-says-the-abandoned-conspiracies-are-a-nuisance/
Hideo Kojima has finally addressed the never-ending Abandoned conspiracy theories, 18 months after fans became convinced that he was linked to Blue Box Studio's survival horror.

And now, the weekend deals at GOG and Steam:

+GOG

+Steam

And that's it. Until next time, I wish you a happy and gaming weekend.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670K@stock (for now), 16GB RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

Chazore said:
QUAKECore89 said:

It looks like he was having a hard time with almost everything, not just AMD's misleading marketing: you have Nvidia's low-quality adapter, the RTX 4090 being an oversized card for compact cases, Intel Arc GPUs' driver/control panel problems, overpriced AMD motherboards and CPUs, and on and on and on lol

Let's be honest here lads, the GPU market has been a complete shitshow these past 2 yrs lol.

Yeah, the 4090 is the fuckin' bees knees, everyone knows that, but we can't all afford it, it's fucking chonkers, and it drains a lot of power to top it all off.

We're also entering a new eco crisis and power bills are going up and up, so I don't know why we should see the 4090 as some beacon of a win when everything else has been such shit. I consider the 4090 a fuckup if anything, so I can get why Steve would be assmad, because everything imo has been shit. 

The 4090 has been a shitshow no doubt, but it is a beacon of a win because it's the product with the least amount of compromises. It will likely beat the 7900XTX in raster by 15-20%, it will beat the 7900XTX in ray tracing by 80+%, it has industry-leading AI upscaling that neither Intel nor AMD can match, and it has Reflex, which no one has an alternative to or will have for some time. It has industry-leading encoders, it has CUDA acceleration for workstations, it has support for OptiX, which single-handedly beats 64-core Threadripper CPUs and dual Radeon workstation GPUs in Blender, and the list goes on.

Now, you may not care about a lot of that, and that's fine. But that doesn't mean others don't. And that doesn't mean it doesn't come at a cost. The 7900XTX costs $1000 and not $1600 not because AMD is doing you a favor; it's because the 7900XTX is inferior in virtually every area, and most of the time by a lot. The only area it's good in is raster gaming; if that's all you care about, great. But the 4090 is no doubt a win for Nvidia, because there will be nothing like it from AMD or Intel for the rest of the generation at any price point, and for that, there is a cost.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

QUAKECore89 said:

I still think the 4090 is one of the worst hardware products ever produced, next to the RX 6500 XT, in terms of product quality and out-of-box experience. Users returned their custom rigs to PC retailers after the adapters melted and were offered 100% refunds due to the lack of aftermarket cables on the market.

Yeah, I know this sounds extreme as fuck, but lol... hear me out: we didn't even get an aftermarket cable solution for the RTX 4090/4080 before the cards arrived.

I feel like after the past 2 yrs we've gone through, the fact that we're treated to a higher price markup, burning cables, having to rely on an aftermarket solution, and having to accept injected extra frames as a new-form solution to perf gains is kinda disgraceful to the consumer market. Like, we've been dicked over for 2 yrs by up to 3 different parties, one of them being our supplier and part of the problem. 

I don't think you're being extreme, you're being honest and real about the current scenario we're in. 2 yrs later we should be treated with apologies, respect and the diligence to do us right by selling us what we were originally asking for: a better perf/price ratio. So far we're not getting much of that, and instead we're being offered stupid gimmicks like injecting non-existent frames, something most animators would frown at (I love 60fps as much as the next guy, but injecting frames that were never there is just not kosher and feels like a band-aid short-term solution).

Yuri has a point about AMD though: they aren't doing this with their pricing to appear charitable, otherwise they would have lowered it by another 100-150. They're holding their prices until they see Nvidia draw their next hand, so we'll see just what happens when they do.


We desperately need a price war this gen, not another triumph for Nvidia, setbacks for AMD and a stalemate on pricing, because that is only ever going to result in the 99% buggering off to hardware like consoles, even older hw, or the Steam Deck (last gen, AMD/Nvidia weren't exactly kind to the mid/low tier either). 



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Captain_Yuri said:

The 4090 has been a shitshow no doubt, but it is a beacon of a win because it's the product with the least amount of compromises. It will likely beat the 7900XTX in raster by 15-20%, it will beat the 7900XTX in ray tracing by 80+%, it has industry-leading AI upscaling that neither Intel nor AMD can match, and it has Reflex, which no one has an alternative to or will have for some time. It has industry-leading encoders, it has CUDA acceleration for workstations, it has support for OptiX, which single-handedly beats 64-core Threadripper CPUs and dual Radeon workstation GPUs in Blender, and the list goes on.

Now, you may not care about a lot of that, and that's fine. But that doesn't mean others don't. And that doesn't mean it doesn't come at a cost. The 7900XTX costs $1000 and not $1600 not because AMD is doing you a favor; it's because the 7900XTX is inferior in virtually every area, and most of the time by a lot. The only area it's good in is raster gaming; if that's all you care about, great. But the 4090 is no doubt a win for Nvidia, because there will be nothing like it from AMD or Intel for the rest of the generation at any price point, and for that, there is a cost.

That it may be, but if I and many others cannot afford it, or would only use it for 4K, then that just becomes an extra added cost, and that alone comes with its own set of compromises. Top it off with needing those aftermarket cables and you're kinda seeing a Formula One car boasting the latest engine while the rest of the body needs duct tape and 3rd-party parts to keep it from falling apart at the next turn.

Look at it this way: Musk wants to set up flights to the moon for mankind, but you and I both know 99% of us aren't going to afford that. Yes, it's a win for humanity to be able to take frequent trips to the moon, but if 99% of us cannot afford it, then what is the point in it even existing?

I get that the rich have these nice ultra-luxuries, but from an outsider's POV, they are pretty much useless to the rest of us, because they either do so little vs their price points or are not available to everyone. It's why I find most rich people's lifestyles extremely wasteful, pointless and vain. 

I fully expect the 4090 to remain unchallenged throughout this entire gen, simply because AMD has been playing catch-up for a gen and a half now, and they seem more focused on FSR than anything else currently.

I do care about RT in the future, but atm not many devs are implementing it in ways that make it a must-have for virtually everyone, myself included. Like, look at DL2: with RT off the game looks like shit, but with RT on it looks mediocre, and that's bad form in my eyes. At least Cyberpunk looks semi-decent without RT and with other settings on high to ultra, but DL2 is an example of not doing a good job (also DX12: devs still haven't gotten to grips with that, and UE is still giving us stutters, so I'm not as hopeful for RT getting executed properly if devs cannot resolve the other two long-standing issues still plaguing us). 

I'm just annoyed that Nvidia gets to boast about what they have but still keeps the gates locked and the toll price high. That does nothing for me and makes the boasting insanely vain and arrogant, to the point that I actually want to kick Jensen so hard in the nuts that he has a heart attack and passes on the spot. It's like we're reaching last-gen levels of Sony arrogance, and I hate seeing a company get so smug and comfortable in that sort of position. 



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"

Chazore said:
Captain_Yuri said:

The 4090 has been a shitshow no doubt, but it is a beacon of a win because it's the product with the least amount of compromises. It will likely beat the 7900XTX in raster by 15-20%, it will beat the 7900XTX in ray tracing by 80+%, it has industry-leading AI upscaling that neither Intel nor AMD can match, and it has Reflex, which no one has an alternative to or will have for some time. It has industry-leading encoders, it has CUDA acceleration for workstations, it has support for OptiX, which single-handedly beats 64-core Threadripper CPUs and dual Radeon workstation GPUs in Blender, and the list goes on.

Now, you may not care about a lot of that, and that's fine. But that doesn't mean others don't. And that doesn't mean it doesn't come at a cost. The 7900XTX costs $1000 and not $1600 not because AMD is doing you a favor; it's because the 7900XTX is inferior in virtually every area, and most of the time by a lot. The only area it's good in is raster gaming; if that's all you care about, great. But the 4090 is no doubt a win for Nvidia, because there will be nothing like it from AMD or Intel for the rest of the generation at any price point, and for that, there is a cost.

That it may be, but if I and many others cannot afford it, or would only use it for 4K, then that just becomes an extra added cost, and that alone comes with its own set of compromises. Top it off with needing those aftermarket cables and you're kinda seeing a Formula One car boasting the latest engine while the rest of the body needs duct tape and 3rd-party parts to keep it from falling apart at the next turn.

Look at it this way: Musk wants to set up flights to the moon for mankind, but you and I both know 99% of us aren't going to afford that. Yes, it's a win for humanity to be able to take frequent trips to the moon, but if 99% of us cannot afford it, then what is the point in it even existing?

I get that the rich have these nice ultra-luxuries, but from an outsider's POV, they are pretty much useless to the rest of us, because they either do so little vs their price points or are not available to everyone. It's why I find most rich people's lifestyles extremely wasteful, pointless and vain. 

I fully expect the 4090 to remain unchallenged throughout this entire gen, simply because AMD has been playing catch-up for a gen and a half now, and they seem more focused on FSR than anything else currently.

I do care about RT in the future, but atm not many devs are implementing it in ways that make it a must-have for virtually everyone, myself included. Like, look at DL2: with RT off the game looks like shit, but with RT on it looks mediocre, and that's bad form in my eyes. At least Cyberpunk looks semi-decent without RT and with other settings on high to ultra, but DL2 is an example of not doing a good job (also DX12: devs still haven't gotten to grips with that, and UE is still giving us stutters, so I'm not as hopeful for RT getting executed properly if devs cannot resolve the other two long-standing issues still plaguing us). 

I'm just annoyed that Nvidia gets to boast about what they have but still keeps the gates locked and the toll price high. That does nothing for me and makes the boasting insanely vain and arrogant, to the point that I actually want to kick Jensen so hard in the nuts that he has a heart attack and passes on the spot. It's like we're reaching last-gen levels of Sony arrogance, and I hate seeing a company get so smug and comfortable in that sort of position. 

The 4090 isn't a product for the masses; it's a halo product. If, say, Honda launches a new NSX supercar that costs $150,000, they aren't launching it for a Civic buyer. They are launching it for those who like supercars. That doesn't mean Honda isn't going to launch a new Civic for the masses, because they will. Similarly, Nvidia will launch a new product for the masses. If you want to hate on a product, the 4080 is what everyone should be hating on, because in the same analogy, the 4080 is as if Honda were charging close to NSX price for a Civic Type R. If Nvidia doesn't cut the price, the 7900XTX is an easy win against the 4080. Hating on the 4090 because it's out of most people's price range is like saying I hate Lambos because I can't buy them.

The thing with RT is that while it's not transformative in every game, virtually every new big game is coming out with it, and we are still in the cross-gen period. One of the big staples of UE5 is Lumen, which has software-based and hardware-based modes. Hardware-based Lumen is accelerated on the RT cores, and considering how many companies are using UE5, it will certainly become a staple. But the question you gotta ask yourself is: when you are spending money on a new product, do you really want one that's a one-trick pony, or do you want a product that, while costing more, gives you all the options? That's why it's hard to recommend AMD, because even Intel, who is new to the GPU market, has a better feature set than AMD does, and that's saying something. (It's just that Intel's drivers are shit.)



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850


Third Swing...

Third miss

Last edited by Jizz_Beard_thePirate - on 04 November 2022

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

I hope you guys notice what AMD and Intel are up to and why they're behind: they're letting Nvidia innovate their technologies first and waiting for the media to expose the critical technical problems and controversies before they start actually competing with Nvidia.

This has to explain why AMD and Intel have been so slow to produce value GPUs. They're dodging the fight; we can't have proper competition when their GPUs are such niche options, so GeForce remains the popular GPU choice lmao.

Last edited by QUAKECore89 - on 04 November 2022

QUAKECore89 said:

I hope you guys notice what AMD and Intel are up to and why they're behind: they're letting Nvidia innovate their technologies first and waiting for the media to expose the critical technical problems and controversies before they start actually competing with Nvidia.

This has to explain why AMD and Intel have been so slow to produce value GPUs. They're dodging the fight lmao.

I would agree if they had caught up when the tech matured, but they haven't.

Turing came out and AMD didn't have RT/AI upscaling with RDNA 1, so people made the excuse that Turing was a beta test and those features were not needed. Alright, fair enough. Then AMD came out with RDNA 2, which performed worse than a 2080 Ti in many RT tests while Ampere far exceeded RDNA 2 in RT. People made the excuse that it was AMD's first attempt. Now RDNA 3 comes out and not only is it slower than the 4090 in raster (while the 6900XT was competitive against the 3090), but it once again has last-gen RT performance.

See, the reality is that AMD isn't letting Nvidia innovate their technologies first as some 4D chess move. The reality is that AMD simply has inferior engineers, and it's time to stop making excuses for them. AMD has had three generations to catch up, yet the pattern is obvious: they will always be a generation behind in ray tracing, just like they were behind in DX11 during the GCN era.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
QUAKECore89 said:

I hope you guys notice what AMD and Intel are up to and why they're behind: they're letting Nvidia innovate their technologies first and waiting for the media to expose the critical technical problems and controversies before they start actually competing with Nvidia.

This has to explain why AMD and Intel have been so slow to produce value GPUs. They're dodging the fight lmao.

I would agree if they had caught up when the tech matured, but they haven't.

Turing came out and AMD didn't have RT/AI upscaling with RDNA 1, so people made the excuse that Turing was a beta test and those features were not needed. Alright, fair enough. Then AMD came out with RDNA 2, which performed worse than a 2080 Ti in many RT tests while Ampere far exceeded RDNA 2 in RT. People made the excuse that it was AMD's first attempt. Now RDNA 3 comes out and not only is it slower than the 4090 in raster (while the 6900XT was competitive against the 3090), but it once again has last-gen RT performance.

See, the reality is that AMD isn't letting Nvidia innovate their technologies first as some 4D chess move. The reality is that AMD simply has inferior engineers, and it's time to stop making excuses for them. AMD has had three generations to catch up, yet the pattern is obvious: they will always be a generation behind in ray tracing, just like they were behind in DX11 during the GCN era.

I wouldn't say that they are necessarily inferior, but that they have a different focus. For instance, it seems that AMD focused more on efficiency this time around, both in terms of power efficiency and in terms of production, by turning to a chiplet-based design.

Also, they probably have far fewer resources, as AMD needs to split theirs between their CPU and GPU markets, sometimes even having to heavily favor one over the other. This can make a big difference when it comes to coming up with new stuff if you don't have the resources to make it work in due time.

Finally, it's possible that AMD just introduced RT as a counter to Nvidia but didn't fully believe it would get so big, and thus didn't grant it the resources it would have needed during the design phase of RDNA3, which was years in the past, and couldn't rectify this anymore once they saw where the market was going. So let us hope that next time AMD doesn't make this mistake and gives RT the full attention it needs to keep up with Nvidia from here on out.



Bofferbrauer2 said:
Captain_Yuri said:

I would agree if they had caught up when the tech matured, but they haven't.

Turing came out and AMD didn't have RT/AI upscaling with RDNA 1, so people made the excuse that Turing was a beta test and those features were not needed. Alright, fair enough. Then AMD came out with RDNA 2, which performed worse than a 2080 Ti in many RT tests while Ampere far exceeded RDNA 2 in RT. People made the excuse that it was AMD's first attempt. Now RDNA 3 comes out and not only is it slower than the 4090 in raster (while the 6900XT was competitive against the 3090), but it once again has last-gen RT performance.

See, the reality is that AMD isn't letting Nvidia innovate their technologies first as some 4D chess move. The reality is that AMD simply has inferior engineers, and it's time to stop making excuses for them. AMD has had three generations to catch up, yet the pattern is obvious: they will always be a generation behind in ray tracing, just like they were behind in DX11 during the GCN era.

I wouldn't say that they are necessarily inferior, but that they have a different focus. For instance, it seems that AMD focused more on efficiency this time around, both in terms of power efficiency and in terms of production, by turning to a chiplet-based design.

Also, they probably have far fewer resources, as AMD needs to split theirs between their CPU and GPU markets, sometimes even having to heavily favor one over the other. This can make a big difference when it comes to coming up with new stuff if you don't have the resources to make it work in due time.

Finally, it's possible that AMD just introduced RT as a counter to Nvidia but didn't fully believe it would get so big, and thus didn't grant it the resources it would have needed during the design phase of RDNA3, which was years in the past, and couldn't rectify this anymore once they saw where the market was going. So let us hope that next time AMD doesn't make this mistake and gives RT the full attention it needs to keep up with Nvidia from here on out.

Yeah, I wouldn't believe that marketing nonsense about focusing on efficiency. The only reason they said they focused on efficiency is that they created an arch that couldn't compete. I do agree with you on the other two points, though.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850