Bofferbrauer2 said:
You still haven't learned anything, have you? Just looking at your timetable, I know that you are waaaaaaaaaaaay off.
And if the world really ran on the rules you posted, do you really think anybody would buy anything other than the top-end GPU? Or that AMD and NVidia would even bother making lower-end GPUs? If so, then you don't seem to understand what you're saying in your own post, because those rules would make anything other than the highest-end card useless and non-viable for everybody.

Oh, and about Batman: Arkham Knight, you didn't hear the outcry back then, did you? PCs were perfectly able to run PS4 settings, even older ones. The executives didn't understand that you need more than a small studio of 12 people and 8 weeks to port a game to PC, which is why graphical features were cut - the porting team just didn't have the time to implement them, and the rest was cobbled together so badly that it needed a total overhaul to run properly on anything that wasn't the PS4 the code was ported from. Arkham Knight's code doesn't account for a PC having separate RAM and VRAM and tries to treat both as unified memory; as a result, VRAM consumption is monstrous. They also didn't have time to optimize the code for NVidia one bit, which is why the game runs much smoother on AMD GPUs: an RX 480 is amply sufficient for 1440p at 60 FPS, while a GTX 1060 gets stutters from texture streaming. You can even run the game at 1080p60 on a Ryzen 5 2400G without a dedicated GPU. And the missing features got patched in a couple of months later, once the team that ported the game actually got the time to implement them.

Warner Bros. has a history of shoddy PC ports; Arkham Knight was only the tip of the iceberg. Mortal Kombat X was already a mess, though nothing as bad as this one. Even Ubisoft didn't fuck up that hard - and they fucked up Tetris of all things!
Look, I'm only making a prediction here, and we'll have to wait and see how much of it plays out. However, it is based on factual evidence. First off, it didn't take 4 years before developers ditched the PS3/360. The PS4 came out in November 2013, and exactly a year later games like AC Unity came out that weren't released on last gen anymore. Also, isn't it confirmed that the PS5/Scarlett will have a Navi GPU that sits somewhere between an RTX 2070 and an RTX 2080?
And you're right, maybe Batman was a shoddy port. But isn't it weird how PC gamers were blaming developers for sucky programming on most games released during that period? It could be a case of mass bad programming, or it could be a case of developers spending years making games designed from the ground up to run on PS4-spec hardware, with PC gamers as an afterthought. I mean, they simply didn't care if PC gamers couldn't play their game because they didn't have a GPU with 4GB of VRAM, a quad-core CPU and at least a GTX 660. They didn't care because they knew console gaming is where 98% of their sales would be coming from.