goopy20 said:

I wouldn't exactly call that playable. The framerate is all over the place, and it drops to 10 fps in the more crowded scenes on a 6770. We can debate this all day, but just answer me one simple question.

Did or didn't PC games become a lot more demanding as soon as developers stopped supporting the PS3/360? That was a rhetorical question, and we both know that a 5770 could run AC Black Flag, which was still a cross-gen game, at ultra settings at 60 fps. PC gamers were outraged when the PC requirements were announced for AC Unity. Just look at the comments section of this article:

https://www.gamespot.com/articles/assassins-creed-unity-pc-specs-require-a-lot-of-yo/1100-6423137/

And yes, people could turn down settings, play at 720p and whatnot on lower-than-minimum specs, but which self-respecting PC gamer would want to play like that? The fact is that anyone who wanted performance similar to the consoles had to upgrade if they still had something like a 5770. I mean, do you honestly believe it's a coincidence that all games released after developers moved on from the PS3 to the PS4 required a GTX 660 or higher as the minimum spec?

Look, I'm not ignoring facts, nor am I misstating them. But I will give you a prediction, and you tell me if it sounds about right.

  • 2020 - the PS5 and the GTX 3xxx series come out.
  • 2021 - developers move away from the PS4/Xbox One.
  • Minimum PC requirements in 2021 for all major multiplatform games will be an RTX 2080.
  • An RTX 3080 will be required to play multiplatform games at native 4K.
  • You can still play some of those games on a GTX 1060 as long as you turn the graphics and resolution down, effectively making them look like PS4 ports in the process.
  • 2023 - the GTX 4xxx series gets released, and for a measly $2000 you have full bragging rights that you can play all console games at 120 fps, native 4K, with full Nvidia HairWorks enabled.
  • 2027 - the PS6 comes out and the cycle starts again.
  • 2028 - Starship Citizen gets an official release and ends up on the PS6 as well.

You still haven't learned anything, have you?

Just looking at your timetable, I can tell you are waaaaaaaaaaaay off.

  1. Developers moving away from the PS4/XBO in 2021 already? They took over 4 years to do so with the last-gen consoles, and those didn't have anywhere near the success of the PS4, nor any mid-gen refreshes that remain easy to port to from the next gen.
  2. You seem to equate the next generation of NVidia GPUs with the next console generation, and do so in terms of performance. This is where you get it all wrong: consoles can only use more mainstream hardware, not high-end. That's due to both thermals (the RTX 3080 will most probably be around 300W TDP, more than the TDP budget of an entire console) and price (while console makers don't pay consumer prices, NVidia certainly wouldn't sell its GPUs at 80% off to them; with something in the performance range of an RTX 3080, you couldn't expect a console price tag below $1200).
  3. We just showed you over several posts that minimum requirements didn't grow anywhere near that fast. An RTX 2080 may be the minimum in 2025, but certainly not in 2021 already. Like I explained, the next gen will use more mainstream hardware, which performance-wise will be closer to the RX 5700 or RTX 2060.
  4. Of course you will be able to keep playing on a GTX 1060, and without turning the graphics or resolution down, or only just slightly. It will still look better than on a PS4 Pro. As explained above, the next-gen consoles won't be that much more powerful than a GTX 1060, at least not at release. Maybe after a mid-gen upgrade, but certainly not early on.
  5. The GTX 4xxx series only in 2023? Not with AMD banging at their door. And if they did wait that long, they would be selling those cards for $200, not $2000, because they'd have fallen too far behind. Really, your 2023 point just shows that you understand neither the economics nor how the console market, the PC market, or PC hardware work at all.

And if the world really ran on the rules you posted, do you really think anybody would buy anything other than the top-end GPU? Or that AMD and NVidia would even make such lower-end GPUs? If so, then you don't seem to grasp your own argument, because it would make everything below the highest end useless and unviable for anybody.

Oh, and about Batman: Arkham Knight, you didn't hear the outcry back then, did you? Because PCs, even older ones, were perfectly able to run PS4 settings. The executives didn't understand that you need more than a small studio of 12 people and 8 weeks to port a game to PC, which is why graphical features were cut: the porting team just didn't have the time to implement them, and the rest was cobbled together so badly that it needed a total overhaul to run properly on anything that wasn't the PS4 from which the code was ported.

The Arkham Knight code doesn't account for the fact that a PC has separate RAM and VRAM, and tries to treat both as one unified pool. As a result, VRAM consumption is monstrous. They also didn't have the time to optimize the code for NVidia one bit, which is why the game runs much smoother on AMD GPUs: an RX 480 amply suffices for 1440p at 60 fps, while a GTX 1060 gets stutters from texture streaming. You can even run the game on a Ryzen 5 2400G at 1080p/60 without a dedicated GPU. Oh, and the missing features got patched in a couple of months later, once the team that ported the game actually got the time to implement them.

Warner Brothers have a history of shoddy PC ports; Arkham Knight was only the tip of the iceberg. Mortal Kombat X was already a mess, but nothing as bad as this one. Even Ubisoft didn't fuck up that hard - and they fucked up Tetris of all things!

Last edited by Bofferbrauer2 - on 26 September 2019