Pemalite said:
goopy20 said:

But not everyone is willing to spend $1000 or more on a GPU. I bought my PC to have the best bang for my buck, and a GTX 1060 and an i5-8400 are pretty decent specs for today's games.

No one is telling you that you have to.

goopy20 said:

I bought my PC to have the best bang for my buck, and a GTX 1060 and an i5-8400 are pretty decent specs for today's games.

You could have gotten better hardware for the same price though.

goopy20 said:

I'm probably alone in this, but I do not own three displays, nor do I think that triple-display and 120 fps provide the absolute best in gaming immersion.

I used to run Eyefinity... and was thus gaming at 5760x1080 and later 7680x1440 (which is more pixels than 4K).
I can assure you the immersion is there and it is real... I just got tired of spending upwards of $4,000 on graphics processors every year to power it at the time.

Now I happily game on a single 120Hz, 1440p, 32" display.
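
For reference, a quick back-of-the-envelope check of those pixel counts (just a sketch in Python; the resolutions are the ones quoted above):

```python
# Simple width x height pixel counts for the resolutions mentioned above.
resolutions = {
    "Eyefinity 5760x1080": (5760, 1080),
    "Eyefinity 7680x1440": (7680, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
    "1440p (2560x1440)": (2560, 1440),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# Eyefinity 5760x1080: 6.2 megapixels
# Eyefinity 7680x1440: 11.1 megapixels  <- indeed more pixels than 4K
# 4K UHD (3840x2160): 8.3 megapixels
# 1440p (2560x1440): 3.7 megapixels
```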

goopy20 said:

It's funny, though, how at first people were trying so hard to convince me that an AMD Radeon 4870 is still fine for playing modern games, and now my GTX 1060 and i5-8400 are labelled as crap and I'm buying the wrong gear...

You aren't getting it. You are complaining about the lack of component longevity, yet you buy average hardware.
The Radeon 4870 was the fastest GPU available when it released... and thus stood the test of time because of it.

Your hardware isn't "crap"; it's just average.

goopy20 said:

And you're right, releasing a game now with an RTX 2080 as the minimum requirement would end in financial disaster. That is why those games won't exist until the next-gen consoles hit the market. My whole point is that when the next-gen consoles come out, the hardware in them will set the new minimum requirements for PC. Looking back and thinking about power usage and heat dissipation, I'm guessing the PS5's GPU will lean more towards an RX 5700 / RTX 2060.

The point you are missing is that when next-gen consoles release, developers will continue to build games with low-end PCs in mind. That means hypothetical GPUs like a GeForce RTX/GTX 3030 or RX 6300... which would definitely have less performance than an RTX 2080 or RX 5700.

Conina said:

And the beauty of it... it works "out of the box" for thousands of PC games: insert a new graphics card -> crank up the resolution/texture/post-processing settings and/or enjoy higher framerates and better frametimes.

The PS4 Pro and Xbox One X only run better than the base systems for a small part of their game libraries.

Earlier this month I upgraded from an Xbox One to an Xbox One X, and the "enhanced" games look awesome, even original Xbox games like "Conker: Live & Reloaded" or 360 games like "Red Dead Redemption 1".

Then I wanted to continue the complete edition of "Forza Motorsport 6" and was disappointed that it looked the same as on my old Xbox One... it wasn't deemed worthy of the "X enhancement treatment", so it only uses a fraction of the hardware's power.

Yeah. When I went back to replay Dragon Age: Inquisition on the Xbox One X, I was extremely disappointed that it was still stuck at 900p with medium settings and thus looked extremely soft on the "most powerful console ever". On PC I can dial it up to 4K and downsample to 1440p with no problem.

It's great when games get enhanced, but it's essentially a gamble.

Somehow it feels like you're missing the point of what we are arguing here. I never once complained about PC component longevity. In fact, I said I will be disappointed if I can still play all of the next-gen AAA games on my GTX 1060. I'm sounding like a broken record, but literally the only thing I'm saying is that minimum PC requirements will go up next gen. I know I will not be able to get the same experience on my GTX 1060 compared to the next-gen console versions, and I'm perfectly fine with that. Maybe some games will still run (some will no doubt run better than others), but who wants to be gaming like that? If I spend $60 on a game and have it running like this (AC Odyssey running on a Radeon HD 5850): https://www.youtube.com/watch?v=YLZiJcmi1z0, then for me it would be a clear sign that I'm in need of an upgrade, even if some games still run fine.

Obviously, next-gen games will work fine on lower hypothetical future cards like an RTX 3060, which will likely perform about the same as an RTX 2080. But what does that have to do with anything? I already said that a minimum requirement of an RX 5700 / RTX 2070 is pretty steep right now, but 4 years from now those cards will be relics on PC and you can probably buy a used RTX 2070 for like $99.

It's also irrelevant whether PC has, or hasn't got, a better SSD. What's important is that next-gen games will be designed from the ground up to take full advantage of SSDs, meaning bigger streaming worlds and basically zero load times. This also means that a 1TB SSD will be the bare minimum in the near future.
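
As a rough illustration of why that changes game design, here's a back-of-the-envelope sketch; the throughput figures are ballpark assumptions for a typical HDD, SATA SSD and NVMe SSD, not specs of any particular console:

```python
# Approximate time to stream a chunk of assets at assumed sustained throughputs.
# The throughput values are rough, typical figures, not measurements.
CHUNK_GB = 8  # e.g. refilling most of a console's usable RAM with new assets

assumed_throughput_mb_s = {
    "HDD (~100 MB/s)": 100,
    "SATA SSD (~500 MB/s)": 500,
    "NVMe SSD (~3000 MB/s)": 3000,
}

for drive, mb_s in assumed_throughput_mb_s.items():
    seconds = CHUNK_GB * 1024 / mb_s
    print(f"{drive}: ~{seconds:.0f} s to stream {CHUNK_GB} GB")

# HDD: ~82 s, SATA SSD: ~16 s, NVMe SSD: ~3 s -- the gap is why games built
# around an SSD can stream far bigger worlds with essentially no load screens.
```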

Also, things like the PS4 Pro and Xbox One X have the same problem that high-end PCs are having: nobody is going to make a game that takes full advantage of the hardware. Instead you get the same games as on the base consoles with just some added resolution, which is nice but really not that noticeable when you're gaming on a TV. That is why I'm hoping that next-gen console games will not target native 4K as the new standard; that would be a massive waste of resources. Instead I would much rather see 1080p with a huge boost in overall graphical fidelity.
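
To put the resolution cost in rough numbers (simple pixel arithmetic, assuming rendering cost scales roughly with pixels shaded, which is a simplification):

```python
# How much of the per-frame budget goes to resolution alone.
# Real cost doesn't scale perfectly with pixel count, but the ratio shows
# the order of magnitude.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400

print(f"Native 4K shades {pixels_4k / pixels_1080p:.0f}x the pixels of 1080p")
# -> 4x, so targeting 1080p (or 1440p) instead of native 4K frees most of
#    that budget for geometry, lighting and other fidelity improvements.
```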

Last edited by goopy20 - on 28 September 2019