

Conina said:
goopy20 said:

Well, it's true that everybody has different standards. However, console level is pretty much how developers meant their games to be played. If you have a PC that can run them at better settings and resolution, then that's great. But developers in general try to keep multiplatform games as equal as possible. That's why high-end GPUs are overkill for anyone but a few of the gaming royalty, who can't enjoy a game unless it runs at 120fps in native 4K.

goopy20 said:

This isn't rocket science, people: the lead platform is PS4/XBO (fixed it for you) for most major AAA developers.

So to recap that:

Even if the base PS4/XBO version can't deliver stable 30 fps (or in other cases stable 60 fps) in some games, it is nevertheless pretty much how developers meant their games to be played.

If someone has a PC to run these games better, it is overkill for anyone but a few of the gaming royalty.

But if someone has a PC that runs games slightly worse than on PS4 or Xbox One (or slightly better than Xbox One but slightly worse than PS4), then it is a huge problem, even if the performance can be fixed by moving some sliders a notch to the left.

Oh, and of course developers in general try to keep multiplatform games as equal as possible, so they focus on the XBO as the (less powerful) lead platform and aim for parity between the XBO and PS4, ignoring the extra performance of the PS4, instead of trying to get the best out of both systems (or all five systems, including the XBO S, PS4 Pro and XBO X)... because parity is more important and anything above it is "overkill for anyone but a few of the console gaming royalty".

Got it.

That is actually pretty much correct. There's a big difference between console games dropping below 30fps and the same thing happening on PC. On consoles it's a lot rarer and far less noticeable, because developers playtest the games to make sure it isn't game-breaking when it does happen. On PC, framerates can be all over the place depending on your hardware, and that can totally ruin the experience. That's simply because developers aren't going to test and optimize their games for the 200 different GPUs on the market.

I'm not saying it's a huge problem if a game runs slightly worse than the console version. It all depends on what you find acceptable, and slightly worse should be acceptable for most people. But yeah, running AC Origins at 720p on the lowest settings at 20fps on a 560 Ti does look like a completely different game than the PS4 counterpart, and it's not how the developers meant the game to be experienced.

Here it is running on a 560 Ti at the lowest settings in 720p:

https://www.youtube.com/watch?v=Z4-R7agVn1c

Here's the PS4 version:

https://www.youtube.com/watch?v=3w-3qINOSMY

Of course, it looks even better on a gaming rig in native 4K at ultra settings. But if you play on a TV, you're really not going to notice that much of a difference. You're sure as shit going to see a difference with the example I posted above, though. Don't get me wrong, ultra settings are nice if your rig can handle them. But how often have you seen your framerate get butchered by ultra settings and had to look really closely to notice the difference? They hardly ever make a game look that much better compared to the extra horsepower required to run it.

Last edited by goopy20 - on 04 October 2019


Sorry, did someone just say console games rarely drop below 30 FPS? LOL, 95% (my guess) of console gaming is sub-30. The best optimised console game this gen is HZD, and that drops to 29 on rare occasions.



Random_Matt said:
Sorry, did someone just say console games rarely drop below 30 FPS? LOL, 95% (my guess) of console gaming is sub-30. The best optimised console game this gen is HZD, and that drops to 29 on rare occasions.

Yes, there are drops, but like I said, they are far less noticeable. I know this is hard to grasp for some PC gamers who are used to playing games at 360fps and native 4K. But console games that were designed to run at 30fps in 1080p actually play and look pretty smooth on a TV. It's also the reason why console games look as good as they do on such old hardware. And why I personally hope 30fps (60fps for online shooters and racing games) and 1080p will still be the standard when these next-gen consoles come out.



goopy20 said:

That is actually pretty much correct. There's a big difference between console games dropping below 30fps and the same thing happening on PC. On consoles it's a lot rarer and far less noticeable, because developers playtest the games to make sure it isn't game-breaking when it does happen. On PC, framerates can be all over the place depending on your hardware, and that can totally ruin the experience. That's simply because developers aren't going to test and optimize their games for the 200 different GPUs on the market.

It has nothing to do with "developers playtesting the games on consoles". Of course you can also set 30 fps as an upper limit in PC games if you don't want "framerates to be all over the place"; many PC games offer that in the settings.

Or you can use a frame limiter to get the same effect in almost any PC game, and set the cap in fine steps (e.g. 25, 28, 30, 32, 35, 36, 40, 50... fps). 40 fps is great for 120 Hz displays, and 36 or 48 fps is great for 144 Hz displays, since those divide evenly into the refresh rate. And with G-Sync or FreeSync displays you don't have to care about fps and frametimes at all in most games.
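
There's nothing magic about it either: roughly speaking, a limiter just measures how long the frame took and sleeps off whatever is left of the target frame budget. A rough C++ sketch of the idea (illustrative only, not how RTSS or any particular engine actually does it; update_and_render() is a made-up stand-in for the real game work):

// Rough frame-limiter sketch: cap a render loop at a chosen fps by sleeping off
// whatever is left of the frame budget each frame. Illustrative only.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 40.0;  // 40 fps sits evenly on a 120 Hz panel (3 refreshes per frame);
                                     // 36 or 48 fps divide 144 Hz the same way.
    const std::chrono::duration<double> frame_budget(1.0 / target_fps);

    for (int frame = 0; frame < 300; ++frame) {  // stand-in for the real game loop
        const auto frame_start = clock::now();

        // update_and_render();  // hypothetical: the actual game work would go here

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget) {
            // Sleep away the remaining budget so frames go out ~25 ms apart (at 40 fps)
            // instead of whenever the GPU happens to finish.
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
    return 0;
}

Real limiters typically spin-wait the last fraction of a millisecond because sleep_for isn't that precise, but the principle is the same: an even cadence of frames rather than whatever the hardware happens to deliver.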



goopy20 said:

What I'm trying to say is that almost no major developer is focusing on PC gaming anymore.

For crying out loud. I just gave you a long-ass list of developers that do focus on PC.

goopy20 said:

Meaning, we are no longer seeing games that are designed from the ground up with high-end PC specs in mind and that can't be ported to consoles anymore. A couple of generations ago this was different, when you had games like Half-Life, Crysis, Morrowind etc. that were a big leap beyond anything that was out on consoles at the time. I loved those days and I remember upgrading my GPU every two years back then.

Star Citizen.
Ashes of the Singularity.

Neither is possible on console, which makes your assertion false... just like most of the others you have put forth so far...

goopy20 said:

However, now that consoles are basically PCs with fixed specs, every major developer's top priority is making sure their games run as well as possible on consoles.

And yet... So many games end up as sub-30fps experiences. *Cough*Control*Cough*


Or what about an older console-exclusive like the much coveted... Zelda: Breath of the Wild?


Again. You have been proven wrong.

goopy20 said:

You say the PC is setting the standard with things like Ray Tracing, but right now it's just a handful of patches on PC that only very few people actually get to enjoy. Next gen, however, we will see Ray Tracing really become the standard as games will be designed from the ground up to take advantage of it. And if that means the 95% of Steam users who don't own an RTX card can't play at settings comparable to the console version, then the developers simply won't care.

You have the choice to run with Ray Tracing on PC. Next-gen if a developer doesn't wish to implement Ray Tracing in their games, then stiff shit, you don't get Ray Tracing on console, but you do get it on PC.

Clearly you don't understand what Ray Tracing is if you think it's "Just a patch". - I think you need to do some research on what Ray Tracing is all about and the approaches that nVidia, AMD, Intel and Microsoft are taking with it... And how Ray Tracing has been implemented over the last 10 years.

It hasn't just happened overnight you know.

5% of Steam users is still millions of gamers; PC is bigger than console, remember.

goopy20 said:

I'm not denying there's a huge market for PC and Switch games, and I already said RTS games are best played on PC. But I'm talking about the $100m-budget blockbuster games like the next GTA. Games that will be pushing these new consoles to their limits and that'll need to sell millions just to break even. I mean, why do you think Rockstar is taking so long to release their games on PC?

You either have a short memory or an extreme confirmation bias.

Either way, there is a game that is pushing the PC to its limits; it's called Star Citizen.

Many AAA multiplats are built with the PC as the lead platform, leveraging cutting-edge techniques and effects like Ray Tracing, with the games severely dumbed down visually for consoles, where they only get medium-quality settings or lower.

Rockstar have a history of shitty PC support, so using them as an example doesn't really give you any credibility...

goopy20 said:

That is actually pretty much correct. There's a big difference between console games dropping below 30fps and the same thing happening on PC. On consoles it's a lot rarer and far less noticeable, because developers playtest the games to make sure it isn't game-breaking when it does happen. On PC, framerates can be all over the place depending on your hardware, and that can totally ruin the experience. That's simply because developers aren't going to test and optimize their games for the 200 different GPUs on the market.

No. The difference is the same: they are both dropping below 30fps.
The panel technology is identical, so they tend to exhibit the same behavioral issues.

Framerate issues are common on console, that's a fact... But unlike on PC, you can't turn the settings down, overclock, or upgrade your internal hardware to solve the issue permanently; you are stuck with it.

goopy20 said:

I'm not saying it's a huge problem if a game runs slightly worse than the console version. It all depends on what you find acceptable, and slightly worse should be acceptable for most people. But yeah, running AC Origins at 720p on the lowest settings at 20fps on a 560 Ti does look like a completely different game than the PS4 counterpart, and it's not how the developers meant the game to be experienced.

Here it is running on a 560 Ti at the lowest settings in 720p:

https://www.youtube.com/watch?v=Z4-R7agVn1c

Here's the PS4 version:

https://www.youtube.com/watch?v=3w-3qINOSMY

The GeForce 560 Ti is a weaker GPU than the Radeon 7850/7870-derived GPU in the PlayStation 4, so obviously anyone with a clue will expect it to be less performant.

The 560 Ti will be a closer comparison when pitted against the Xbox One.

Just remember the 560 Ti is 8 years old and was never super fast even when it was released; the fact that it runs newer games at all just reinforces that you don't need to upgrade your PC every year or two.

goopy20 said:

Of course, it looks even better on a gaming rig in native 4K at ultra settings. But if you play on a TV, you're really not going to notice that much of a difference. You're sure as shit going to see a difference with the example I posted above, though. Don't get me wrong, ultra settings are nice if your rig can handle them. But how often have you seen your framerate get butchered by ultra settings and had to look really closely to notice the difference? They hardly ever make a game look that much better compared to the extra horsepower required to run it.

You don't need ultra settings to beat the consoles, though; high settings are often a substantial step up over the low/medium console settings that a lot of games use.

On PC, games like Battlefield have always shown a big jump over the console releases, as Frostbite runs and looks best on PC.

goopy20 said:

Yes, there are drops, but like I said, they are far less noticeable. I know this is hard to grasp for some PC gamers who are used to playing games at 360fps and native 4K. But console games that were designed to run at 30fps in 1080p actually play and look pretty smooth on a TV. It's also the reason why console games look as good as they do on such old hardware. And why I personally hope 30fps (60fps for online shooters and racing games) and 1080p will still be the standard when these next-gen consoles come out.

Nah. They are noticeable. It's even worse when there is shit frame pacing, which makes the controls feel floaty... like Halo 3 on Xbox 360.
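
That floaty feeling is also why an average fps figure on its own doesn't tell you much; it's the spread of the individual frame times that you actually feel. A toy C++ example with made-up numbers (not measurements of any real game), just to show two runs with roughly the same average but very different pacing:

// Toy frame-pacing illustration: two frame-time sequences with a similar average
// fps but very different consistency. All numbers are invented for the example.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

static void report(const char* label, const std::vector<double>& frame_ms) {
    double sum = 0.0, worst = 0.0;
    for (double ms : frame_ms) { sum += ms; worst = std::max(worst, ms); }
    const double avg_ms = sum / frame_ms.size();

    double var = 0.0;
    for (double ms : frame_ms) var += (ms - avg_ms) * (ms - avg_ms);
    const double jitter_ms = std::sqrt(var / frame_ms.size());  // std. deviation of frame times

    std::printf("%-13s avg %.1f fps, worst frame %.1f ms, jitter %.1f ms\n",
                label, 1000.0 / avg_ms, worst, jitter_ms);
}

int main() {
    // Both runs average roughly 33 ms per frame (about 30 fps)...
    const std::vector<double> even_run   {33, 34, 33, 33, 34, 33, 33, 34};
    // ...but this one alternates fast and slow frames, which is what reads as stutter.
    const std::vector<double> uneven_run {20, 47, 22, 46, 21, 47, 23, 45};
    report("evenly paced:", even_run);
    report("badly paced:",  uneven_run);
    return 0;
}

Same ballpark average, but the second run is the one that feels like stutter; consistent frame delivery matters as much as the headline framerate, on console and PC alike.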

PC gamers tend to be more technically knowledgeable about hardware and gaming technology overall; they tend to build their own rigs and invest a lot of time and money. Thus they will pick out the flaws far more readily than your average console gamer, but the flaws still exist on console; console gamers just don't care as much.

I mean, last gen console gamers were promoting the visuals of Halo 4 and Uncharted 3... But from a PC gamer's perspective those were still only 720p, 30fps titles... and often dropped below 30fps. But you know, console gamers didn't actually care; they still (wrongly) thought it was better than anything the PC had to offer (I was gaming at higher-than-4K resolution at the time!)



--::{PC Gaming Master Race}::--

Pemalite said:
goopy20 said:

What I'm trying to say is that almost no major developer is focusing on PC gaming anymore.

For crying out loud. I just gave you a long-ass list of developers that do focus on PC.

goopy20 said:

Meaning, we are no longer seeing games that are designed from the ground up with high-end PC specs in mind and that can't be ported to consoles anymore. A couple of generations ago this was different, when you had games like Half-Life, Crysis, Morrowind etc. that were a big leap beyond anything that was out on consoles at the time. I loved those days and I remember upgrading my GPU every two years back then.

Star Citizen.
Ashes of the Singularity.

Neither is possible on console, which makes your assertion false... just like most of the others you have put forth so far...

goopy20 said:

However, now that consoles are basically PCs with fixed specs, every major developer's top priority is making sure their games run as well as possible on consoles.

And yet... So many games end up as sub-30fps experiences. *Cough*Control*Cough*


Or what about an older console-exclusive like the much coveted... Zelda: Breath of the Wild?


Again. You have been proven wrong.

goopy20 said:

You say the PC is setting the standard with things like Ray Tracing, but right now it's just a handful of patches on PC that only very few people actually get to enjoy. Next gen, however, we will see Ray Tracing really become the standard as games will be designed from the ground up to take advantage of it. And if that means the 95% of Steam users who don't own an RTX card can't play at settings comparable to the console version, then the developers simply won't care.

You have the choice to run with Ray Tracing on PC. Next-gen if a developer doesn't wish to implement Ray Tracing in their games, then stiff shit, you don't get Ray Tracing on console, but you do get it on PC.

Clearly you don't understand what Ray Tracing is if you think it's "Just a patch". - I think you need to do some research on what Ray Tracing is all about and the approaches that nVidia, AMD, Intel and Microsoft are taking with it... And how Ray Tracing has been implemented over the last 10 years.

It hasn't just happened overnight you know.

5% of Steam users is still millions of gamers; PC is bigger than console, remember.

goopy20 said:

I'm not denying there's a huge market for PC and Switch games, and I already said RTS games are best played on PC. But I'm talking about the $100m-budget blockbuster games like the next GTA. Games that will be pushing these new consoles to their limits and that'll need to sell millions just to break even. I mean, why do you think Rockstar is taking so long to release their games on PC?

You either have a short memory or an extreme confirmation bias.

Either way, there is a game that is pushing the PC to its limits; it's called Star Citizen.

Many AAA multiplats are built with the PC as the lead platform, leveraging cutting-edge techniques and effects like Ray Tracing, with the games severely dumbed down visually for consoles, where they only get medium-quality settings or lower.

Rockstar have a history of shitty PC support, so using them as an example doesn't really give you any credibility...

goopy20 said:

That is actually pretty much correct. There's a big difference between console games dropping below 30fps and the same thing happening on PC. On consoles it's a lot rarer and far less noticeable, because developers playtest the games to make sure it isn't game-breaking when it does happen. On PC, framerates can be all over the place depending on your hardware, and that can totally ruin the experience. That's simply because developers aren't going to test and optimize their games for the 200 different GPUs on the market.

No. The difference is the same: they are both dropping below 30fps.
The panel technology is identical, so they tend to exhibit the same behavioral issues.

Framerate issues are common on console, that's a fact... But unlike on PC, you can't turn the settings down, overclock, or upgrade your internal hardware to solve the issue permanently; you are stuck with it.

goopy20 said:

I'm not saying it's a huge problem if a game runs slightly worse than the console version. It all depends on what you find acceptable, and slightly worse should be acceptable for most people. But yeah, running AC Origins at 720p on the lowest settings at 20fps on a 560 Ti does look like a completely different game than the PS4 counterpart, and it's not how the developers meant the game to be experienced.

Here it is running on a 560 Ti at the lowest settings in 720p:

https://www.youtube.com/watch?v=Z4-R7agVn1c

Here's the PS4 version:

https://www.youtube.com/watch?v=3w-3qINOSMY

The GeForce 560 Ti is a weaker GPU than the Radeon 7850/7870-derived GPU in the PlayStation 4, so obviously anyone with a clue will expect it to be less performant.

The 560 Ti will be a closer comparison when pitted against the Xbox One.

Just remember the 560 Ti is 8 years old and was never super fast even when it was released; the fact that it runs newer games at all just reinforces that you don't need to upgrade your PC every year or two.

goopy20 said:

Of course, it looks even better on a gaming rig in native 4K at ultra settings. But if you play on a TV, you're really not going to notice that much of a difference. You're sure as shit going to see a difference with the example I posted above, though. Don't get me wrong, ultra settings are nice if your rig can handle them. But how often have you seen your framerate get butchered by ultra settings and had to look really closely to notice the difference? They hardly ever make a game look that much better compared to the extra horsepower required to run it.

You don't need ultra settings to beat the consoles, though; high settings are often a substantial step up over the low/medium console settings that a lot of games use.

On PC, games like Battlefield have always shown a big jump over the console releases, as Frostbite runs and looks best on PC.

goopy20 said:

Yes, there are drops, but like I said, they are far less noticeable. I know this is hard to grasp for some PC gamers who are used to playing games at 360fps and native 4K. But console games that were designed to run at 30fps in 1080p actually play and look pretty smooth on a TV. It's also the reason why console games look as good as they do on such old hardware. And why I personally hope 30fps (60fps for online shooters and racing games) and 1080p will still be the standard when these next-gen consoles come out.

Nah. They are noticeable. It's even worse when there is shit frame pacing, which makes the controls feel floaty... like Halo 3 on Xbox 360.

PC gamers tend to be more technically knowledgeable about hardware and gaming technology overall; they tend to build their own rigs and invest a lot of time and money. Thus they will pick out the flaws far more readily than your average console gamer, but the flaws still exist on console; console gamers just don't care as much.

I mean, last gen console gamers were promoting the visuals of Halo 4 and Uncharted 3... But from a PC gamer's perspective those were still only 720p, 30fps titles... and often dropped below 30fps. But you know, console gamers didn't actually care; they still (wrongly) thought it was better than anything the PC had to offer (I was gaming at higher-than-4K resolution at the time!)

I have to admit that Control was a total mess on consoles when it was released, and I'm surprised they were able to release it in a barely playable state. It did get patched, though, and performance is now much better across the board: https://www.youtube.com/watch?v=YOJFnjr5yYE Besides, the PC version had the same issues, and even a beastly 1080 Ti runs that game at 20fps in 4K.

Just because we were already gaming in 4K on PC doesn't mean Uncharted, running at 720p/30fps on PS3, wasn't one of the best-looking games I played back then. If you only look at resolution and framerate numbers, then maybe not. But if you look at the total visual package, that game basically set a new standard. Sure, you can say it was a corridor shooter, but that's the beauty of console optimization: developers can make their games look as good as possible while building them around the hardware's limitations. This is why the PS4 has games like GOW, Horizon Zero Dawn, TLOU 2 etc. Games that, in my humble opinion, look at least as good as anything I've seen on PC today, and that's with the PS4's aging, Radeon 7870-class GPU. Next year that hardware limit will be something comparable to an RTX 2080, so excuse me for being excited to see what that will look like, but it should be a pretty big leap.

From what I've seen, Star Citizen actually does look like a game that gives an idea of what next-gen games will look like, and I'm definitely going to check it out when they release Squadron 42.

Last edited by goopy20 - on 05 October 2019

Seriously why is this thread still going? It has been ruined.



Yea.. not paying much attention to the bickering between PC and consoles myself.. I consider myself a gamer first.. now with a gaming PC and the Vive, it's just something else to play alongside my PS4 and PSVR.

As long as it's fun.. I don't even pay much attention to graphics or frames.. the whole point of the gaming PC was just so I could play the games that were stuttering on my laptop.. thus the build.



Man.. I hate it when your girl has to leave my place to come back to you..

BillyBong said:

Yea.. not paying much attention to the bickering between PC and consoles myself.. I consider myself a gamer first.. now with a gaming PC and the Vive, it's just something else to play alongside my PS4 and PSVR.

As long as it's fun.. I don't even pay much attention to graphics or frames.. the whole point of the gaming PC was just so I could play the games that were stuttering on my laptop.. thus the build.

For me, PC is just about the massive catalogue of old games as well as the new stuff. For instance, I can't play Bloodlines or BG or whatever on consoles.



BillyBong said:

Yea.. not paying much attention to the bickering between PC and consoles myself.. I consider myself a gamer first.. now with a gaming PC and the Vive, it's just something else to play alongside my PS4 and PSVR.

As long as it's fun.. I don't even pay much attention to graphics or frames.. the whole point of the gaming PC was just so I could play the games that were stuttering on my laptop.. thus the build.

Same here, man, and I didn't mean to turn this thread into a console vs PC thing. As you're into horror games and already have PSVR, I'm assuming you've already played RE7 on PSVR. That was the most terrifying gaming experience I've ever had and definitely a game I would love to play on a high-end PC with the Vive. Hunt: Showdown also seems pretty good. I haven't tried it myself, but it's one of the few horror games that should make good use of your 1080 Ti.