

Sad to hear that about Arkane, they were really good at what they were doing. The last game I played from them was Prey, and I thoroughly enjoyed it (though it has nothing to do with the original Prey). I still think Arx Fatalis is the best game they ever made, and I wish they'd made more games like that.




AMD exclusive sponsorship apparently means every other vendor gets gimped. Not only does a 6800XT perform 40% faster than a 3080 while an A770 performs worse than a 5700 (non-XT), but if you have an Intel CPU with Hyper-Threading and E-cores enabled, you get worse performance than if you disable them, whereas AMD CPUs with SMT enabled scale just fine. DLSS (via mod) vs. FSR is the usual story: worlds better with DLSS.

AMD has truly become a cancer of PC gaming and they have gone down to EGS levels of scumminess, where both of them lack innovation and features vs. their competitors, but they can pay developers and force your experience to be subpar.

PC is still the best way to play, though, if you have modern hardware, and I'd take Starfield's largely stutter-free experience (other than traversal) over Jedi's stutter fest. Especially if you enable DLSS via mod.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Chazore said:
Jizz_Beard_thePirate said:

Someone should ask Todd how one upgrades from a 4090, since there is nothing above that. What a shat answer.

At least Rift Apart is getting tons of patches, as is the trend with a lot of modern games: ship in a bad state, then fix the game after 3-4 months of patching.

I love how he says this while completely ignoring the performance of the Xbox systems, both of which have far, far lower power output than the high-end cards on the current market, even the 4090 itself.

Is the performance optimization on Xbox really better? It is stuck at 30 fps, even on Xbox Series X, and of course they aren't using ultra settings on Xbox.

With similar settings on PC with similar hardware (Ryzen 3600, GeForce 2070 Super), according to the Digital Foundry video the fps are in the 40s. On a 120 Hz TV/monitor, that should give the option of a stable 40 fps/120 Hz lock... much better than a 30 fps lock.
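(The arithmetic behind why that lock works: 120 Hz / 40 fps = 3, so each frame is held for exactly three refresh cycles and the frame pacing stays perfectly even, just as 30 fps maps cleanly onto four cycles of a 120 Hz panel. On a plain 60 Hz display, 60 / 40 = 1.5 cycles per frame, which is why a 40 fps cap judders there.)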



hinch said:
JEMC said:

Well, the Steam Deck also has a 7" screen with an 800p resolution and few people complained about it, so I'd say that there's still a market for this size and resolution.

Besides, we're talking about Nintendo. They don't make a lot of money from their hardware, most of it comes from their software, and if they can save a few pennies here and there, you can bet that they'll do it.

Yeah, I know it's a perfectly serviceable resolution for the size. It's just that I'm wishing for progress from the original NS lol. We'll see either way, but yeah, you're probably right: 720p is the more sensible option considering its size and form factor, and the cost savings and revenue for Nintendo.

If there's something I've learned from modern Nintendo (from the Wii until now), it's that you need to keep expectations in check, and usually low. This is no longer the SNES, N64 or GameCube Nintendo that tried to make systems as powerful as possible. That time has passed. Now it's about making hardware that's good enough for what they want to do with it and easy to make a profit from.

Jizz_Beard_thePirate said:

AMD exclusive sponsorship apparently means every other vendor gets gimped. Not only does a 6800XT perform 40% faster than a 3080 while an A770 performs worse than a 5700 (non-XT), but if you have an Intel CPU with Hyper-Threading and E-cores enabled, you get worse performance than if you disable them, whereas AMD CPUs with SMT enabled scale just fine. DLSS (via mod) vs. FSR is the usual story: worlds better with DLSS.

AMD has truly become a cancer of PC gaming and they have gone down to EGS levels of scumminess, where both of them lack innovation and features vs. their competitors, but they can pay developers and force your experience to be subpar.

PC is still the best way to play, though, if you have modern hardware, and I'd take Starfield's largely stutter-free experience (other than traversal) over Jedi's stutter fest. Especially if you enable DLSS via mod.

*table*

I understand your anger, but I think you're directing it at the wrong place.

There are lots of new games that even today perform worse with E-cores enabled, and AMD has nothing to do with it. There are also games out there that perform worse with HT on, and Bethesda's engine is far from new. And yes, I know that HT does work on AMD processors, it's true, but have you stopped to think that the Xbox consoles have AMD CPUs and that, therefore, it makes sense that Bethesda has spent more time optimizing their code to take better advantage of that, and that this work has translated to the PC version of the game?

After all, if AMD had gone that far to make the Intel CPUs look worse, why are the 13900K, 13700K and even the 13600K the processors that run the game best, better than even AMD's premium X3D ones? It doesn't make sense.

And calling AMD a cancer to PC gaming... well, I don't really know how to respond to that.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Jizz_Beard_thePirate said:

AMD exclusive sponsorship apparently means every other vendor gets gimped. Not only does a 6800XT perform 40% faster than a 3080 while an A770 performs worse than a 5700 (non-XT), but if you have an Intel CPU with Hyper-Threading and E-cores enabled, you get worse performance than if you disable them, whereas AMD CPUs with SMT enabled scale just fine. DLSS (via mod) vs. FSR is the usual story: worlds better with DLSS.

AMD has truly become a cancer of PC gaming and they have gone down to EGS levels of scumminess, where both of them lack innovation and features vs. their competitors, but they can pay developers and force your experience to be subpar.

PC is still the best way to play, though, if you have modern hardware, and I'd take Starfield's largely stutter-free experience (other than traversal) over Jedi's stutter fest. Especially if you enable DLSS via mod.

*table*

I understand your anger, but I think you're directing it at the wrong place.

There are lots of new games that even today perform worse with E-cores enabled, and AMD has nothing to do with it. There are also games out there that perform worse with HT on, and Bethesda's engine is far from new. And yes, I know that HT does work on AMD processors, it's true, but have you stopped to think that the Xbox consoles have AMD CPUs and that, therefore, it makes sense that Bethesda has spent more time optimizing their code to take better advantage of that, and that this work has translated to the PC version of the game?

After all, if AMD had gone that far to make the Intel CPUs look worse, why are the 13900K, 13700K and even the 13600K the processors that run the game best, better than even AMD's premium X3D ones? It doesn't make sense.

And calling AMD a cancer to PC gaming... well, I don't really know how to respond to that.

The thing is that we have had countless examples of console "optimization" not translating to PC gaming, because too many things are different. We have seen this all the way back to the GCN era: even though consoles have had AMD hardware, it has largely never translated to PC, because PC requires a different set of optimizations, as consoles are semi-custom. We also know for a fact that AMD calls their sponsorship for Starfield an AMD Exclusive partnership, and we also know for a fact that AMD sent their engineers to Bethesda for this game. So no, I highly doubt it is console optimization translating to PC.

And yeah, those Raptor Lake CPUs might run better than the X3Ds, but the behaviour is strange and exclusive to Intel. It's one thing to say okay, E-cores don't do much, so turn them off, but the Hyper-Threading scaling makes zero sense.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:
JEMC said:

I understand your anger, but I think you're directing it at the wrong place.

There are lots of new games that even today perform worse with E-cores enabled, and AMD has nothing to do with it. There are also games out there that perform worse with HT on, and Bethesda's engine is far from new. And yes, I know that HT does work on AMD processors, it's true, but have you stopped to think that the Xbox consoles have AMD CPUs and that, therefore, it makes sense that Bethesda has spent more time optimizing their code to take better advantage of that, and that this work has translated to the PC version of the game?

After all, if AMD had gone that far to make the Intel CPUs look worse, why are the 13900K, 13700K and even the 13600K the processors that run the game best, better than even AMD's premium X3D ones? It doesn't make sense.

And calling AMD a cancer to PC gaming... well, I don't really know how to respond to that.

The thing is that we have had countless examples of console "optimization" not translating to PC gaming, because too many things are different. We have seen this all the way back to the GCN era: even though consoles have had AMD hardware, it has largely never translated to PC, because PC requires a different set of optimizations, as consoles are semi-custom. We also know for a fact that AMD calls their sponsorship for Starfield an AMD Exclusive partnership, and we also know for a fact that AMD sent their engineers to Bethesda for this game. So no, I highly doubt it is console optimization translating to PC.

And yeah, those Raptor Lake CPUs might run better than the X3Ds, but the behaviour is strange and exclusive to Intel. It's one thing to say okay, E-cores don't do much, so turn them off, but the Hyper-Threading scaling makes zero sense.

Nvidia has also sent their own engineers to developers dozens of times to help with Nvidia-sponsored games, to make sure that the games run as well as possible on their cards, and no one said that they were a cancer or were destroying the PC market.

As for console optimizations that haven't made it to PC, yes, that's true. But this is also Bethesda's first new game this gen, and who knows what they've done to their engine to make it work with those Zen 2 cores.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Jizz_Beard_thePirate said:

The thing is that we have had countless examples of console "optimization" not translating to PC gaming, because too many things are different. We have seen this all the way back to the GCN era: even though consoles have had AMD hardware, it has largely never translated to PC, because PC requires a different set of optimizations, as consoles are semi-custom. We also know for a fact that AMD calls their sponsorship for Starfield an AMD Exclusive partnership, and we also know for a fact that AMD sent their engineers to Bethesda for this game. So no, I highly doubt it is console optimization translating to PC.

And yeah, those Raptor Lake CPUs might run better than the X3Ds, but the behaviour is strange and exclusive to Intel. It's one thing to say okay, E-cores don't do much, so turn them off, but the Hyper-Threading scaling makes zero sense.

Nvidia has also sent their own engineers to developers dozens of times to help with Nvidia-sponsored games, to make sure that the games run as well as possible on their cards, and no one said that they were a cancer or were destroying the PC market.

As for console optimizations that haven't made it to PC, yes, that's true. But this is also Bethesda's first new game this gen, and who knows what they've done to their engine to make it work with those Zen 2 cores.

Show me an AAA game where a 3080 is 40% faster in raster than a 6800XT, because I sure haven't seen any.



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Jizz_Beard_thePirate said:
JEMC said:

Nvidia has also sent their own engineers to developers dozens of times to help with Nvidia-sponsored games, to make sure that the games run as well as possible on their cards, and no one said that they were a cancer or were destroying the PC market.

As for console optimizations that haven't made it to PC, yes, that's true. But this is also Bethesda's first new game this gen, and who knows what they've done to their engine to make it work with those Zen 2 cores.

Show me an AAA game where a 3080 is 40% faster in raster than a 6800XT, because I sure haven't seen any.

Have I denied that AMD GPUs perform better than Nvidia's with today's drivers? No, I haven't. Why don't we wait until Nvidia launches another driver with further optimizations before we start losing our minds?

Also, I'm not sure if you've written that the way you wanted, but you're right, I haven't been able to find a game where the 3080 is 40% faster than a 6800XT. The max I've found is 26%. Without RT, of course.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

JEMC said:
Jizz_Beard_thePirate said:

Show me an AAA game where a 3080 is 40% faster in raster than a 6800XT, because I sure haven't seen any.

Have I denied that AMD GPUs perform better than Nvidia's with today's drivers? No, I haven't. Why don't we wait until Nvidia launches another driver with further optimizations before we start losing our minds?

Also, I'm not sure if you've written that the way you wanted, but you're right, I haven't been able to find a game where the 3080 is 40% faster than a 6800XT. The max I've found is 26%. Without RT, of course.

The point is that when a game is Radeon-sponsored (and this isn't just me ranting, this view is shared by many tech journalists out there), AMD doesn't want to play nice with others. If we look at various Nvidia-sponsored titles, the real difference stems from when ray tracing is involved, but as we saw with Intel launching their cards and AMD launching RDNA 3, ray tracing isn't Nvidia gimping their competitors but rather a showing of the ray tracing capabilities of each GPU. Nvidia-sponsored titles also have a trend of allowing all upscaling methods and not "missing" certain ones. There are some exceptions, but there is a pretty big gap between the sponsorships. The reason is that Nvidia has an open-source upscaling integration API called Streamline, which Intel is part of, hence why we typically see XeSS integration, and it also allows FSR to be easily implemented too. AMD, on the other hand, refused to be part of it.
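To make the Streamline point concrete, here's a minimal sketch of the plugin idea, assuming a hypothetical interface. The names below are invented for illustration only; the real Streamline API has its own entry points and is considerably more involved:

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// Hypothetical vendor-agnostic upscaler interface. Illustrative only:
// this is NOT the real Streamline API, just the shape of the idea --
// the engine renders through one abstraction, vendors supply backends.
class Upscaler {
public:
    virtual ~Upscaler() = default;
    virtual bool supported() const = 0;   // e.g. DLSS requires an RTX GPU
    virtual const char* name() const = 0;
    virtual void evaluate(int renderW, int renderH, int outW, int outH) = 0;
};

// Stub DLSS backend; in practice this would wrap the vendor SDK.
class DlssBackend : public Upscaler {
public:
    bool supported() const override { return false; }  // pretend: no RTX GPU found
    const char* name() const override { return "DLSS"; }
    void evaluate(int rw, int rh, int ow, int oh) override {
        std::printf("%s: %dx%d -> %dx%d\n", name(), rw, rh, ow, oh);
    }
};

// Stub FSR backend; FSR is cross-vendor, so it's always available.
class FsrBackend : public Upscaler {
public:
    bool supported() const override { return true; }
    const char* name() const override { return "FSR"; }
    void evaluate(int rw, int rh, int ow, int oh) override {
        std::printf("%s: %dx%d -> %dx%d\n", name(), rw, rh, ow, oh);
    }
};

int main() {
    // The engine integrates the interface once, registers every backend,
    // and picks the first one the user's hardware supports.
    std::vector<std::unique_ptr<Upscaler>> backends;
    backends.push_back(std::make_unique<DlssBackend>());
    backends.push_back(std::make_unique<FsrBackend>());

    for (auto& b : backends) {
        if (b->supported()) {
            b->evaluate(1440, 810, 2560, 1440);  // ~56% internal res -> 1440p output
            break;
        }
    }
}
```

Once a game talks to one interface like this, every extra upscaler is just another backend rather than a separate engine integration, which is why leaving one out of a sponsored title reads as a choice rather than an engineering cost.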

So imo, AMD is very much acting like EGS acted against Steam, where Steam has all these nice features that people like, such as Family Sharing, Remote Play and so on. But instead of competing on the feature set, EGS would rather have a barebones feature set and bribe developers to be EGS exclusive. Because if I want to play Starfield at 90 fps on my 4090, my choices are:

a) Use FSR, which makes my game look terrible
b) Use DLSS mods, which can crash the game from time to time and corrupt save files
c) Just play the game as it is, with sub-60 fps drops in big cities

It's annoying, because generally when it's an Nvidia-sponsored title, Nvidia users can have their DLSS, AMD users can have their FSR and Intel users can have their XeSS. But with AMD-sponsored titles, it's just raster or light ray tracing and everyone has to use FSR. And sometimes, after 4 months, there is an official DLSS update.

Last edited by Jizz_Beard_thePirate - on 09 September 2023

                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Conina said:

Is the performance optimization on Xbox really better? It is stuck at 30 fps, even on Xbox Series X, and of course they aren't using ultra settings on Xbox.

With similar settings on PC with similar hardware (Ryzen 3600, GeForce 2070 Super), according to the Digital Foundry video the fps are in the 40s. On a 120 Hz TV/monitor, that should give the option of a stable 40 fps/120 Hz lock... much better than a 30 fps lock.

I meant that he's going around telling us to "upgrade", whilst he can't in the same sentence tell the console side to do the same, because they're already locked into this current gen. And since they've already done what they could for the Xbox systems, it tells us that even if we threw more power at the game on PC, he'd still come back with that faulty logic instead of admitting he screwed the pooch.

Yuri's post before, with the DF video, sums up that all sides fucked this up, not just AMD, and Todd is just going around telling us to upgrade to non-existent HW, which kinda comes off as ignorant and insulting if you look at it from a performance and realistic perspective.



Step right up come on in, feel the buzz in your veins, I'm like a chemical electrical right into your brain and I'm the one who killed the Radio, soon you'll all see

So pay up motherfuckers you belong to "V"