goopy20 said:
Well, at least you're willing to admit that minimum requirements will go up next gen. I've changed the minimum requirements a bit because we simply don't know yet what their real-life performance will be. Some are saying RTX 2080 level, but realistically I think it will be more comparable to an RX 5700 and an 8-core Ryzen CPU. Whatever the case may be, those will be the exact minimum requirements to play these games the way the developers intended them to be played. And that doesn't mean 360x360 at the lowest settings on a toaster from 2009. Yes, we will see some cross-gen titles that won't require those kinds of specs, but about a year after launch developers will move away from the PS4, and then those specs will become the real minimum.

Just look at the PS4. It came out in late 2013, and by 2015 we already had major AAA games coming out that simply couldn't run on a PS3 (or a PS3-equivalent GPU) anymore: Batman: Arkham Knight, Infamous, Fallout 4, Rise of the Tomb Raider, AC Unity, The Witcher 3, Bloodborne, Battlefront, etc. Those were all games that pushed the budget GPUs of that time (like the 750 Ti) to their limits, and if you wanted to play them at PC-master-race settings, you needed to upgrade to something like a GTX 970. Now, obviously GPUs like the GTX 970 and up are still perfectly fine today, when all games are designed around a GTX 660. But it's still a scientific fact that a 2060 or 2070 will be far less capable two years from now than they are today. Basically, they will be what the 750 Ti was when the PS4 came out: the bare minimum to play PS5 titles at similar graphics settings.
I genuinely think the "sour" attitude towards your PC thoughts and recommendations comes from the fact that you said earlier in the thread that you recently bought a 1060-based gaming PC. The 1060 was a mid-range card when it launched in 2016, a bit over three years ago. Back then it was there for people upgrading from power-hungry cards like the 780 Ti to a 10-series card with around the same performance, but without consuming every watt in the house. What people have been trying to get through to you is that a mid-range card, years after launch, is not a mid-range card anymore. You can check the benchmarks: the 780 Ti tops the 3GB version of the 1060 by a few % in performance, and the 6GB version barely passes the 780 Ti (https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-Nvidia-GTX-780-Ti/3639vs2165). That's a GPU from 2013 which can be had (in some regions) for under £100. You're looking at the PC you just bought and thinking "damn... I just got this and already need to upgrade", but that's because the 1060 wasn't a top-of-the-line card even when it launched. If you had gone high end in 2013 and bought the 780 Ti, it would be giving you roughly the same performance you get from your recently bought machine today. The only difference is that you would have had that performance for seven years, rather than the one or two you'll get if you replace your PC/GPU next year.
That's the mistake I think Pemalite is trying to open your eyes to in this thread: high-end cards from those various generations will not have to reduce resolution to hit 1080/60 in modern games. As scenes get more and more complicated at that resolution, the extra cost tends to come from particles or AI-based improvements, most of which put a heavier toll on the CPU than on the graphics card's ability to draw the frame. Sure, a toaster from 2009 won't run games well, but neither will a toaster from 2019 that someone built around a three-year-old mid-range GPU. Anyone buying a toaster for gaming will always be let down by the gaming it provides, because they bought a poor PC. If you invest in high-end PC hardware (or even older, higher-end stuff), you won't run into medium settings and lowered resolutions for a lot longer.
Here, for argument's sake on the "old = replace it" idea: https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-3GB-vs-Nvidia-GTX-980-Ti/3646vs3439 is a comparison with the 980 Ti, which came out over a year before your 1060 and is about 1.5x as capable as the GPU you bought roughly three years after it launched. Again, you could have had a 980 Ti in your PC since 2015, and all that time you would have been enjoying performance one and a half times better than your current card gives you. To put that 1.5x in concrete terms: a scene your 1060 renders at 40 fps would land around 60 fps on the 980 Ti, all else being equal. That same headroom is what will keep the 980 Ti a competent card against the recommended specs of new titles for longer than your 1060.
@Bolded No, those cards will have the exact same capability in 30 years' time as they do today... GPUs do not lose power as time passes. You are talking about comparative power, not the scientifically measurable power of the card. Also, you're holding up the 2060 as a powerful card, and that again is the mid-range version of the 20 series. It's not a bad card, but https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2060-vs-Nvidia-GTX-980-Ti/4034vs3439 is how it stands up next to a 2015 GPU. That is to say, it's a mid-range card today; do not buy it if you want 4K/ultra settings. Also, the PS5 will not be 4K/60/ultra, not a hope in hell, outside of 2D sprite-based games, which even a 750 Ti would have a crack at running in 4K.
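Since "losing power" keeps coming up, here's a quick toy sketch in Python of what's actually happening (every number below is invented purely for illustration; none of them are real benchmark scores). The card's measurable throughput stays fixed; only its standing relative to newer hardware shrinks:

```python
# Toy illustration of absolute vs. comparative GPU performance.
# All numbers are made up for illustration -- not real benchmark scores.

card_score = 100.0  # a GPU's measurable throughput: this never changes

# Hypothetical score of a "typical new mid-range card" in each year
baseline_by_year = {2016: 100, 2018: 140, 2020: 190}

for year, baseline in baseline_by_year.items():
    relative = card_score / baseline * 100
    print(f"{year}: absolute score {card_score:.0f}, "
          f"{relative:.0f}% of that year's mid-range baseline")

# Output:
# 2016: absolute score 100, 100% of that year's mid-range baseline
# 2018: absolute score 100, 71% of that year's mid-range baseline
# 2020: absolute score 100, 53% of that year's mid-range baseline
```

Same card, same measurable power every year; the only thing that changes is the baseline you compare it against. That's all "losing capability" ever means here.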