
Finally joined the PC Master Race

Pemalite said:
goopy20 said:

I didn't say anyone should just chuck their PC in the bin if they don't have a 2700RTX next gen. I'm saying the minimum requirements will go up to match exactly what will be in these consoles for most major releases in 2021. If you want to play those games in 720p at the lowest settings, then fine. But after all the talk about 120FPS and 4k resolution on triple monitors, it doesn't sound like that is a viable option.

So it's an RTX 2700 now? One thing is for sure... You are consistent at being inconsistent.
Yes, minimum requirements will go up next gen.

No, you will not need an RTX 2080.
No, you don't have to game at 120fps, 4k resolution or triple monitors, that isn't what "minimum" means.

Clearly you haven't been reading or paying attention to the evidence presented in this thread if those are the false conclusions you have come up with.

goopy20 said:

Of course, a RX590 is still a very capable mid-range card; it's basically the gpu that sits inside the Xbox One X. What you seem to be missing, however, is that the ps4 came out 7 years ago and all major games we're seeing right now are designed for the ps4. Keep in mind that the ps4 has a gpu that came out 5 years before the RX590. We will have to see how a RX590 compares to the ps5's gpu, but do you honestly believe the RX590 will be able to keep up? If so, then I will be seriously disappointed, and the OP still won't have any games that'll really make use of his 2080ti.

What you seem to be forgetting is that the Radeon 7870, which is the GPU found in the Playstation 4, is able to play the bulk of PC games at Playstation 4 levels of image quality.

I was using the Radeon 5870 as an example of old PCs playing newer games just fine, a GPU that predates the Playstation 4 and Radeon 7870. Please read the information and evidence presented instead of blatantly ignoring it.

The Radeon RX 590 is a very capable mid-range card, like you said. - And it's very comparable to the GPU in the Xbox One X, but it's also a GPU that is significantly inferior to a Geforce RTX 2080, which you claim will be the minimum requirement when next-gen consoles launch.

The RX 590 isn't going to become suddenly useless overnight because next-gen consoles launched; it will still be capable of running the latest games, just at reduced visual settings. That is the entire point of this discussion: you don't need to upgrade to the latest and greatest to have a good gaming experience.

goopy20 said:

Let me just ask you this. What exactly are you expecting from these next gen consoles?  

I have already answered this question, I suggest you go back and re-read my statements.

goopy20 said:

I mean, if a 590RX will be fine next gen, then please explain to me why anyone would wanna buy the next Xbox when the Xbox One X already has a RX590 in it? You're basically saying that MS doesn't even need to release a new console, as they've already had the Xbox X on the market for 2 years now...

Next-gen's biggest benefits aren't all tied up in the shiny new GPU; the current generation consoles launched with terrible CPUs and only average amounts of RAM.

But Ray Tracing is next-gen's trump card on the visual front; people will buy the latest and greatest console just for that.

goopy20 said:

It's what game publishers write on the back of the box. You honestly believe they would lock out a huge slice of potential customers just for fun and giggles? Like I said, they write it so customers don't cry in outrage about lackluster porting and demand refunds if their game runs like donkey poo on anything below the minimum requirements.

Funny how you didn't reply to the rest, though, as I'm actually dying to know if you expect any leap from these next gen consoles at all. Also, please explain to me why anyone would want to buy the Scarlett when the Xbox One X already has a RX590, a gpu you say is more than capable of running these next gen games.

No game publisher has written the minimum hardware requirements for 4k, 120fps on the back of a PC box, ignoring the fact that PC game boxes generally don't exist anymore anyway... Haha

The RX 590 and Xbox One X will both be capable of running next-gen games for several years while we transition from 8th gen to 9th, just like it took a few years to transition from 7th gen to 8th, and from 6th gen to 7th.


Well at least you're willing to admit that minimum requirements will go up next gen. I've changed the minimum requirements a bit because we simply don't know yet what exactly their real-life performance will be. Some are saying 2080GTX level but realistically speaking I think it will be leaning more towards a RX5700 and 8-core Ryzen cpu. Whatever the case may be, that will be the exact minimum requirements to play these games in a way the developers intended their games to be played. And that doesn't mean 360x360 at the lowest settings on a toaster from 2009.

Yes, we will see some cross-gen titles that won't require those kinds of specs right away. But about a year after launch, developers will move away from the ps4 and then it will be a different story. Just look at the ps4. It came out in 2013 and in 2015 we already had major AAA games coming out that weren't possible to run on a ps3 (or ps3-equivalent gpu) anymore. Huge games like Batman AK, Infamous, Fallout 4, Rise of the Tomb Raider, AC Unity, Witcher 3, Bloodborne, Battlefront etc. Those were all games that pushed modern budget gpu's of that time (like a 750Ti) to their limits. And if you wanted to play them at pc-master-race settings, you needed to upgrade to something like a 970GTX.

Now, obviously gpu's like a 970GTX and above are still perfectly fine nowadays, when all games are designed around a 660GTX. But it's still a scientific fact that a 2060 or 2070RTX will be far less capable 2 years from now than they are now. Basically they will be what the 750Ti was when the ps4 came out: the bare minimum to play ps5 titles at similar graphics settings. And yes, recommended settings will probably be a 2080RTX or higher. It's that simple, and even if those games can be made to run on lower spec pc's, that's beside the point. Again, my point is that cards like a 1080ti or higher will finally be put to proper use.


goopy20 said:
Pemalite said:

So it's an RTX 2700 now? One thing is for sure... You are consistent at being inconsistent.
Yes, minimum requirements will go up next gen.

No, you will not need an RTX 2080.
No, you don't have to game at 120fps, 4k resolution or triple monitors, that isn't what "minimum" means.

Clearly you haven't been reading or paying attention to the evidence presented in this thread if those are the false conclusions you have come up with.

What you seem to be forgetting is that the Radeon 7870, which is the GPU found in the Playstation 4, is able to play the bulk of PC games at Playstation 4 levels of image quality.

I was using the Radeon 5870 as an example of old PCs playing newer games just fine, a GPU that predates the Playstation 4 and Radeon 7870. Please read the information and evidence presented instead of blatantly ignoring it.

The Radeon RX 590 is a very capable mid-range card, like you said. - And it's very comparable to the GPU in the Xbox One X, but it's also a GPU that is significantly inferior to a Geforce RTX 2080, which you claim will be the minimum requirement when next-gen consoles launch.

The RX 590 isn't going to become suddenly useless overnight because next-gen consoles launched; it will still be capable of running the latest games, just at reduced visual settings. That is the entire point of this discussion: you don't need to upgrade to the latest and greatest to have a good gaming experience.

I have already answered this question, I suggest you go back and re-read my statements.

Next-gen's biggest benefits aren't all tied up in the shiny new GPU; the current generation consoles launched with terrible CPUs and only average amounts of RAM.

But Ray Tracing is next-gen's trump card on the visual front; people will buy the latest and greatest console just for that.

No game publisher has written the minimum hardware requirements for 4k, 120fps on the back of a PC box, ignoring the fact that PC game boxes generally don't exist anymore anyway... Haha

The RX 590 and Xbox One X will both be capable of running next-gen games for several years while we transition from 8th gen to 9th, just like it took a few years to transition from 7th gen to 8th, and from 6th gen to 7th.


Well at least you're willing to admit that minimum requirements will go up next gen. I've changed the minimum requirements a bit because we simply don't know yet what exactly their real-life performance will be. Some are saying 2080GTX level but realistically I think it will be more comparable to a RX5700 and 8-core Ryzen cpu. Whatever the case may be, that will be the exact minimum requirements to play these games in a way the developers intended their games to be played. And that doesn't mean 360x360 at the lowest settings on a toaster from 2009.

Yes, we will see some cross-gen titles that won't require those kinds of specs, but about a year after launch, developers will move away from the ps4 and then it will be a different story. Just look at the ps4. It came out in 2013 and in 2015 we already had major AAA games coming out that weren't possible to run on a ps3 (or ps3-equivalent gpu) anymore, like Batman AK, Infamous, Fallout 4, Rise of the Tomb Raider, AC Unity, Witcher 3, Bloodborne, Battlefront etc. Those were all games that pushed modern budget gpu's of that time (like a 750Ti) to their limits. And if you wanted to play them at pc-master-race settings, you needed to upgrade to something like a 970GTX.

Now, obviously gpu's like a 970GTX and up are still perfectly fine nowadays, when all games are designed around a 660GTX. But it's still a scientific fact that a 2060 or 2070GTX will be far less capable 2 years from now than they are now. Basically they will be what the 750Ti was when the ps4 came out: the bare minimum to play ps5 titles at similar graphics settings.

I genuinely think the reason for the "sour" attitude towards your PC thoughts and recommendations comes from the fact that you said earlier in the thread that you bought a 1060-based gaming PC recently. That was a mid-range card in 2016 when it launched, so 3-and-a-bit years ago; it was there for people to upgrade to from the likes of the more power-hungry 780TI, getting a 10-series card with around the same power but without consuming every watt in the house. The thing people have been trying to get through to you is that a mid-range card, years after it was launched... is not a mid-range card anymore. You can check benchmarks for it: against the 3GB version of the 1060, the 780TI tops it by a few % in performance, and the 6GB version barely passes the 780TI (https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-6GB-vs-Nvidia-GTX-780-Ti/3639vs2165). That is a GPU from 2013 which can be gotten (in some regions) for under £100. You are looking at the PC you recently bought and thinking "damn... I just got this and need to upgrade soon", but that's because the 1060 wasn't a top-of-the-line card when it launched. If you had gone high end in 2013 and got the 780TI, it would be giving you around the same performance you get from your recently bought machine; the only difference is you would have had that performance for 7 years rather than 1-2, if you aim to replace your PC/GPU next year.

That's the mistake which I think Pemalite is trying to open your eyes to in this thread: high-end cards from their various ranges will not have to reduce resolution to get modern games to 1080/60, because as scenes get more and more complicated, the improvements at that resolution are going to be particle or AI based, and most of those take a heavier toll on the CPU than on the graphics card's ability to draw the game out. Sure... a toaster from 2009 won't run games great... but a toaster from 2019 that someone bought with a 3-year-old mid-range GPU in it won't run games great either. Anyone buying a toaster for gaming will always be let down by the gaming it provides, because they have bought a poor PC. If you invest in high-end PC hardware (or even older, higher-end stuff), you will not run into the issue of playing games on medium settings/lower resolutions for a lot longer.

Here, for argument's sake of "old = replace it": https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-3GB-vs-Nvidia-GTX-980-Ti/3646vs3439 is a comparison with the 980TI, which came out over a year before your 1060 and in terms of power is about 1.5x the capability of the GPU you bought roughly 3 years after it launched. Again... you could have had the 980TI in your PC since 2015, and all that time you would have been enjoying performance one and a half times better than your current card gives you. The 980TI will also be able to stay a competent card for that extra margin over your 1060 when it comes to reaching the recommended specs for new titles.

@Bolded: No, those cards will be the exact same capability in 30 years' time as they are today... GPUs do not lose power as time passes; you are talking about comparative power, not the scientifically measurable power of the card. Also... you are holding up the 2060 there as a powerful card, when that again is a mid-range version of the 20 series. It's not a bad card, but https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2060-vs-Nvidia-GTX-980-Ti/4034vs3439 is how it stands up next to a 2015 GPU. That is to say, it's a mid-range card today; do not buy it if you are wanting 4k/ultra settings. Also, the PS5 will not be 4k/60/Ultra, not an utter hope in hell outside of 2d sprite-based games, which a 750TI would have a cut at running in 4k.
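To make that distinction concrete: a card's measurable performance is a fixed number, and only its standing relative to what new games demand changes over time. Below is a minimal sketch of that idea in Python, using entirely made-up benchmark and requirement figures (no real data):

# A minimal sketch of the point above: a GPU's raw score never changes;
# what changes is its ratio to rising game requirements.
# All numbers are made-up placeholders, not real benchmark data.

CARD_SCORE = 100        # the card's fixed, measurable performance
requirement = 60        # assumed "minimum spec" score at purchase time
YEARLY_GROWTH = 1.15    # assumed 15% yearly rise in requirements

for year in range(2019, 2025):
    print(f"{year}: card is {CARD_SCORE / requirement:.2f}x the minimum spec")
    requirement *= YEARLY_GROWTH

# The score stays 100 every year; only the ratio shrinks, which is
# "comparative power" falling while measurable power stays put.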




goopy20 said:

Well at least you're willing to admit that minimum requirements will go up next gen. I've changed the minimum requirements a bit because we simply don't know yet what exactly their real-life performance will be. Some are saying 2080GTX level but realistically speaking I think it will be leaning more towards a RX5700 and 8-core Ryzen cpu. Whatever the case may be, that will be the exact minimum requirements to play these games in a way the developers intended their games to be played. And that doesn't mean 360x360 at the lowest settings on a toaster from 2009.

If you go back several pages you will see that I made the statement that minimum requirements will go up next gen, as they go up every gen.

But an RX 580 will be gaming for years to come, not at 360x360.

goopy20 said:

Yes, we will see some cross-gen titles that won't require those kinds of specs right away. But about a year after launch, developers will move away from the ps4 and then it will be a different story. Just look at the ps4. It came out in 2013 and in 2015 we already had major AAA games coming out that weren't possible to run on a ps3 (or ps3-equivalent gpu) anymore. Huge games like Batman AK, Infamous, Fallout 4, Rise of the Tomb Raider, AC Unity, Witcher 3, Bloodborne, Battlefront etc. Those were all games that pushed modern budget gpu's of that time (like a 750Ti) to their limits. And if you wanted to play them at pc-master-race settings, you needed to upgrade to something like a 970GTX.

Developers were building games that demanded more from a PC than Xbox 360/Playstation 3 equivalent specs when the Xbox One and Playstation 4 launched. - The PC had moved on from the DirectX 9 era onto a DirectX 10 and then a DirectX 11 one, and that included an entirely new rendering paradigm being pushed in titles at the time like Crysis, Metro, Alien vs Predator with its crazy tessellation, Battlefield and so on... But that also meant those games were definitively the visual showpiece on the PC, looking almost next-gen compared to the Xbox 360 and Playstation 3.

That hasn't happened this time. A Radeon 7870 can still run the majority of Playstation 4 games at Playstation 4 levels of quality just fine, and older GPUs (which I provided substantial evidence for) like the Radeon 5870 and 6970 can also play the majority of Playstation 4 games just fine.

Technology has started to stagnate; we aren't doubling performance every year anymore, so hardware can last longer, which is why a CPU from 2011 can still max out every game in 2019.

goopy20 said:

Now, obviously gpu's like a 970GTX and above are still perfectly fine nowadays, when all games are designed around a 660GTX. But it's still a scientific fact that a 2060 or 2070RTX will be far less capable 2 years from now than they are now. Basically they will be what the 750Ti was when the ps4 came out: the bare minimum to play ps5 titles at similar graphics settings. And yes, recommended settings will probably be a 2080RTX or higher. It's that simple, and even if those games can be made to run on lower spec pc's, that's beside the point. Again, my point is that cards like a 1080ti or higher will finally be put to proper use.

Think older than a 660GTX. A GTX 580 can smash most games at 1080P just fine.

So now you have shifted the goalpost of the RTX 2080 to being recommended rather than minimum? I have to ask if you are just trolling at this point. You aren't being consistent in your assertions.









Pemalite said:
goopy20 said:

Well at least you're willing to admit that minimum requirements will go up next gen. I've changed the minimum requirements a bit because we simply don't know yet what exactly their real-life performance will be. Some are saying 2080GTX level but realistically speaking I think it will be leaning more towards a RX5700 and 8-core Ryzen cpu. Whatever the case may be, that will be the exact minimum requirements to play these games in a way the developers intended their games to be played. And that doesn't mean 360x360 at the lowest settings on a toaster from 2009.

If you go back several pages you will see that I made the statement that minimum requirements will go up next gen, as they go up every gen.

But an RX 580 will be gaming for years to come, not at 360x360.

goopy20 said:

Yes, we will see some cross-gen titles that won't require those kinds of specs right away. But about a year after launch, developers will move away from the ps4 and then it will be a different story. Just look at the ps4. It came out in 2013 and in 2015 we already had major AAA games coming out that weren't possible to run on a ps3 (or ps3-equivalent gpu) anymore. Huge games like Batman AK, Infamous, Fallout 4, Rise of the Tomb Raider, AC Unity, Witcher 3, Bloodborne, Battlefront etc. Those were all games that pushed modern budget gpu's of that time (like a 750Ti) to their limits. And if you wanted to play them at pc-master-race settings, you needed to upgrade to something like a 970GTX.

Developers were building games that demanded more from a PC than Xbox 360/Playstation 3 equivalent specs when the Xbox One and Playstation 4 launched. - The PC had moved on from the DirectX 9 era onto a DirectX 10 and then a DirectX 11 one, and that included an entirely new rendering paradigm being pushed in titles at the time like Crysis, Metro, Alien vs Predator with its crazy tessellation, Battlefield and so on... But that also meant those games were definitively the visual showpiece on the PC, looking almost next-gen compared to the Xbox 360 and Playstation 3.

That hasn't happened this time. A Radeon 7870 can still run the majority of Playstation 4 games at Playstation 4 levels of quality just fine, and older GPUs (which I provided substantial evidence for) like the Radeon 5870 and 6970 can also play the majority of Playstation 4 games just fine.

Technology has started to stagnate; we aren't doubling performance every year anymore, so hardware can last longer, which is why a CPU from 2011 can still max out every game in 2019.

goopy20 said:

Now, obviously gpu's like a 970GTX and above are still perfectly fine nowadays, when all games are designed around a 660GTX. But it's still a scientific fact that a 2060 or 2070RTX will be far less capable 2 years from now than they are now. Basically they will be what the 750Ti was when the ps4 came out: the bare minimum to play ps5 titles at similar graphics settings. And yes, recommended settings will probably be a 2080RTX or higher. It's that simple, and even if those games can be made to run on lower spec pc's, that's beside the point. Again, my point is that cards like a 1080ti or higher will finally be put to proper use.

Think older than a 660GTX. A GTX 580 can smash most games at 1080P just fine.

So now you have shifted the goalpost of the RTX 2080 to being recommended rather than minimum? I have to ask if you are just trolling at this point. You aren't being consistent in your assertions.

Ok, so you're basically saying that the Xbox One X/ ps4 pro will run all of the next gen games fine? I'll just be over here, crying in a corner as that kinda sucks.

Look, a 580GTX was a $499 high-end gpu when it was released and is basically the same as a 660GTX. So yeah that would also run any ps4 game just fine. Like I said, anything comparable or higher than the 660GTX that's inside the ps4 will run fine. However, anyone with a lower spec gpu of that time had to upgrade. You have to take things a bit in perspective here. In 2012 the $299 560GTX came out and it was a great card that could run Crysis 2 at max settings at 60fps/ 1080p.

But when the ps4 came out 2 years later, it became pretty much useless overnight. Yes it could still run AC Unity but at 12fps in 720p at the lowest settings: https://www.youtube.com/watch?v=B4xQD7AeM2o

Like I said a million times, we don't know exactly how these next gen consoles will perform. But yes, if it's 2060/2070 RTX level, then a 2060/2070RTX will be the minimum requirement and a 2080 or higher will probably be recommended to play them at the highest settings. This really isn't rocket science, man. Next gen, a 1060/RX580 will be what the 560GTX was when this console generation started. Of course you can always dial down resolution, but what you seem to be missing is that console games are not optimized for 4k; they're optimized for 1080p or even 900p in some cases. Meaning, there won't be much headroom to play them at lower resolutions on older hardware. If you think a next gen GTA6, looking like minecraft because you have to play it in 460p at the lowest settings, is still playable, then fine man. But for me that is not an acceptable way to play anything.
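For what it's worth, the headroom argument can be sanity-checked with naive arithmetic: frame rate scales at best roughly linearly with pixel count, so dropping resolution buys only a bounded amount of performance. Here is a rough Python sketch using illustrative numbers only (real games scale worse than linearly, since CPU and per-frame costs don't shrink with resolution):

# Naive resolution-headroom arithmetic; purely illustrative numbers.
# Real games do not scale linearly with pixel count, so treat these
# figures as optimistic upper bounds, not measurements.

RESOLUTIONS = {
    "4k":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
    "900p":  (1600, 900),
    "720p":  (1280, 720),
}

BASE_RES, BASE_FPS = "1080p", 30   # assumed console target and frame rate

base_pixels = RESOLUTIONS[BASE_RES][0] * RESOLUTIONS[BASE_RES][1]
for name, (w, h) in RESOLUTIONS.items():
    scale = base_pixels / (w * h)
    print(f"{name}: ~{BASE_FPS * scale:.0f} fps at best (linear scaling)")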

@Ganoncrotch

I am not sour at all about buying my pc. I actually had a 970GTX that broke down on me, and I knew perfectly well that a 1060GTX wasn't much, if any, of an upgrade. There was a time when I did upgrade my gpu every 2 years or so and sure, I wanted to get a 2070GTX instead. But here's the thing: a 2070GTX would have cost me about $450 extra, and to me that just wasn't worth it. I mean, why would I, when a 1060GTX can run anything at max settings in 1440p already? I am sure a lot of pc gamers who spent $500 or more on a gpu feel some buyer's remorse like I did. It's like owning a supercar that can go 300mph, but you have nowhere to drive it. I'm not saying they are completely useless, and if you're into VR or want to play with triple monitors at 120FPS, then I'm sure it's great. But for the average gamer like me, a 2070RTX would be complete overkill at the moment.

Now, unless Pemalite is correct and the 1060GTX will run all games fine next gen, I'm aware that I will have to upgrade when the next gen kicks off. I'm not sour about that; in fact, I would love to see graphics take a leap that forces me to upgrade, instead of playing ps4 games in 4k at 300fps. However, by then a 2070GTX will probably cost me around $100 on ebay. That's the beauty of pc: you can always upgrade when it's needed. The whole point I'm trying to make here, and why nobody could name the OP a single game that really takes advantage of his 1080TI, is that we'll have to wait till next gen before we'll actually need cards like that.


goopy20 said:

Ok, so you're basically saying that the Xbox One X/ ps4 pro will run all of the next gen games fine? I'll just be over here, crying in a corner as that kinda sucks.

Look, a 580GTX was a $499 high-end gpu when it was released and is basically the same as a 660GTX. So yeah that would also run any ps4 game just fine. Like I said, anything comparable or higher than the 660GTX that's inside the ps4 will run fine. However, anyone with a lower spec gpu of that time had to upgrade. You have to take things a bit in perspective here. In 2012 the $299 560GTX came out and it was a great card that could run Crysis 2 at max settings at 60fps/ 1080p.

But when the ps4 came out 2 years later, it became pretty much useless overnight. Yes it could still run AC Unity but at 12fps in 720p at the lowest settings: https://www.youtube.com/watch?v=B4xQD7AeM2o

"Ok, so you're basically saying that the Xbox One X/ ps4 pro will run all of the next gen games fine?"

No. Nobody is saying that the Xbox One X/ ps4 pro will run all of the next gen games fine. Because the Jaguar CPU cores aren't up to the task, even if the GPU part were strong enough.

"In 2012 the $299 560GTX came out and it was a great card that could run Crysis 2 at max settings at 60fps/ 1080p."

So many errors (lies? fake news?) in such a short sentence. I fixed it for you:

In 2011 the $199 560GTX came out and it was an okay card that could run Crysis 2 at very high settings (which weren't max settings) with no AA at 60fps/ 1080p.

"But when the ps4 came out 2 years later..."

Two and a half years later...

"...it became pretty much useless overnight. Yes it could still run AC Unity..."

...which ran shitty on PCs AND consoles and was in no way representative of the average game performance in 2014/2015. Oh, and when the game was released in November 2014, the GTX 560 was already three and a half years old.

"...but at 12fps in 720p at the lowest settings."

I watched the linked video. It showed 14 - 30 fps (depending on the area), and those weren't the lowest settings but custom settings.


Random_Matt said:
Are some people still thinking PS5 will compete with a RTX 2080?

I would be amazed if any rationally thinking person thought that.

I just spent £1600 on a PC and I don't have an RTX 2080 (although the 2070 Super is basically a 2080, but you know what I mean). I'm going to buy the PS5 next year and I'm expecting it to cost me around £500, which is a lot more than the PS4 at £350 (it would've been £400 in post-Brexit UK), a big price jump, and I would be VERY surprised if it even touched the performance of my PC in any way. It's not a fair comparison at all, console optimisation included.




goopy20 said:

@Ganoncrotch

I am not sour at all about buying my pc. I actually had a 970GTX that broke down on me, and I knew perfectly well that a 1060GTX wasn't much, if any, of an upgrade. There was a time when I did upgrade my gpu every 2 years or so and sure, I wanted to get a 2070GTX instead. But here's the thing: a 2070GTX would have cost me about $450 extra, and to me that just wasn't worth it. I mean, why would I, when a 1060GTX can run anything at max settings in 1440p already? I am sure a lot of pc gamers who spent $500 or more on a gpu feel some buyer's remorse like I did. It's like owning a supercar that can go 300mph, but you have nowhere to drive it. I'm not saying they are completely useless, and if you're into VR or want to play with triple monitors at 120FPS, then I'm sure it's great. But for the average gamer like me, a 2070RTX would be complete overkill at the moment.

Now, unless Pemalite is correct and the 1060GTX will run all games fine next gen, I'm aware that I will have to upgrade when the next gen kicks off. I'm not sour about that; in fact, I would love to see graphics take a leap that forces me to upgrade, instead of playing ps4 games in 4k at 300fps. However, by then a 2070GTX will probably cost me around $100 on ebay. That's the beauty of pc: you can always upgrade when it's needed. The whole point I'm trying to make here, and why nobody could name the OP a single game that really takes advantage of his 1080TI, is that we'll have to wait till next gen before we'll actually need cards like that.

A 1060 can run anything at max settings in 1440p?

When you say "max settings" are you saying.... Max Settings* meaning Max but not with max AA or lighting... shadows on medium, particles on low?

That's 1080p ultra, and the card isn't reaching 60fps there (unlike the older 980TI, which again... was designed to be high end), unless you're suggesting that your graphics card somehow renders 1440p faster than it does 1080p? I've seen some benches for the 1060 at 1440p and they're not pretty, but... I think you have to know that, right? You can't think that card can do something like Metro Exodus at 1440p/ultra?

Metro Exodus (2019) benchmark: 21 - 27.3 fps.

The Switch would have a better cut at running the game if optimized right. But then the Switch would know not to try running the game at 1440p/Ultra on hardware that wasn't designed for it.




Conina said:
goopy20 said:

Ok, so you're basically saying that the Xbox One X/ ps4 pro will run all of the next gen games fine? I'll just be over here, crying in a corner as that kinda sucks.

Look, a 580GTX was a $499 high-end gpu when it was released and is basically the same as a 660GTX. So yeah that would also run any ps4 game just fine. Like I said, anything comparable or higher than the 660GTX that's inside the ps4 will run fine. However, anyone with a lower spec gpu of that time had to upgrade. You have to take things a bit in perspective here. In 2012 the $299 560GTX came out and it was a great card that could run Crysis 2 at max settings at 60fps/ 1080p.

But when the ps4 came out 2 years later, it became pretty much useless overnight. Yes it could still run AC Unity but at 12fps in 720p at the lowest settings: https://www.youtube.com/watch?v=B4xQD7AeM2o

"Ok, so you're basically saying that the Xbox One X/ ps4 pro will run all of the next gen games fine?"

No. Nobody is saying that the Xbox One X/ ps4 pro will run all of the next gen games fine. Because the Jaguar CPU cores aren't up to the task, even if the GPU part were strong enough.

"In 2012 the $299 560GTX came out and it was a great card that could run Crysis 2 at max settings at 60fps/ 1080p."

So many errors (lies? fake news?) in such a short sentence. I fixed it for you:

In 2011 the $199 560GTX came out and it was an okay card that could run Crysis 2 at very high settings (which weren't max settings) with no AA at 60fps/ 1080p.

"But when the ps4 came out 2 years later..."

Two and a half years later...

"...it became pretty much useless overnight. Yes it could still run AC Unity..."

...which ran shitty on PCs AND consoles and was in no way representative of the average game performance in 2014/2015. Oh, and when the game was released in November 2014, the GTX 560 was already three and a half years old.

Ok my bad, it was $200 and came out 2.5 years before the ps4. Still doesn't change the fact that it was a pretty common gpu before the ps4 came out, just like the 1060GTX is the most common gpu among gamers right now.

The 560GTX was a capable enough gpu back then, until the ps4 came out and games started running at 12fps in 720p at the lowest settings. And no, that wasn't just AC Unity; that happened with almost all major games that weren't cross-platform anymore.

Here's what BF1 looks like on a 560GTX: https://www.youtube.com/watch?v=qzppk-pZsIc

Or Batman AK: https://www.youtube.com/watch?v=9nTmKuGUYgM

FF15 hitting 20fps on a 560 Ti: https://www.youtube.com/watch?v=V3c0HG3gF_g

So looking back, doesn't it make sense that the same thing will happen with the 1060GTX?



Ganoncrotch said:

goopy20 said:

@Ganoncrotch

I am not sour at all about buying my pc. I actually had a 970GTX that broke down on me, and I knew perfectly well that a 1060GTX wasn't much, if any, of an upgrade. There was a time when I did upgrade my gpu every 2 years or so and sure, I wanted to get a 2070GTX instead. But here's the thing: a 2070GTX would have cost me about $450 extra, and to me that just wasn't worth it. I mean, why would I, when a 1060GTX can run anything at max settings in 1440p already? I am sure a lot of pc gamers who spent $500 or more on a gpu feel some buyer's remorse like I did. It's like owning a supercar that can go 300mph, but you have nowhere to drive it. I'm not saying they are completely useless, and if you're into VR or want to play with triple monitors at 120FPS, then I'm sure it's great. But for the average gamer like me, a 2070RTX would be complete overkill at the moment.

Now, unless Pemalite is correct and the 1060GTX will run all games fine next gen, I'm aware that I will have to upgrade when the next gen kicks off. I'm not sour about that; in fact, I would love to see graphics take a leap that forces me to upgrade, instead of playing ps4 games in 4k at 300fps. However, by then a 2070GTX will probably cost me around $100 on ebay. That's the beauty of pc: you can always upgrade when it's needed. The whole point I'm trying to make here, and why nobody could name the OP a single game that really takes advantage of his 1080TI, is that we'll have to wait till next gen before we'll actually need cards like that.

A 1060 can run anything at max settings in 1440p?

When you say "max settings" are you saying.... Max Settings* meaning Max but not with max AA or lighting... shadows on medium, particles on low?

That's 1080p ultra, and the card isn't reaching 60fps there (unlike the older 980TI, which again... was designed to be high end), unless you're suggesting that your graphics card somehow renders 1440p faster than it does 1080p? I've seen some benches for the 1060 at 1440p and they're not pretty, but... I think you have to know that, right? You can't think that card can do something like Metro Exodus at 1440p/ultra?

Metro Exodus (2019) benchmark: 21 - 27.3 fps.

The Switch would have a better cut at running the game if optimized right. But then the Switch would know not to try running the game at 1440p/Ultra on hardware that wasn't designed for it.

Ouch, no need to insult my 1060. I know it's not great, but it works well enough for me right now. Next gen, I'm sure I will get me a 2070GTX somewhere down the road.



I'm sitting here with Hunt: Showdown taking full advantage of my GPU, so I'm not sure why I'm something of a myth or a ghost, when the reality is already there, with the game making absolute use of my GPU.


