goopy20 said:

But not everyone is willing to spend $1000 or more on a gpu. I bought my pc to have the best bang for my buck and a 1060GTX and i5 8400 are pretty decent specs for today's games.

No one is telling you that you have to.

goopy20 said:

I bought my pc to have the best bang for my buck and a 1060GTX and i5 8400 are pretty decent specs for today's games.

You could have gotten better hardware for the same price though.

goopy20 said:

I'm probably alone in this but I do not own 3 displays, nor do I think that triple-display and 120fps provides the absolute best in gaming immersion.   

I used to run Eyefinity... That was gaming at 5760x1080 and later 7680x1440 (which is more pixels than 4K).
I can assure you the immersion is there and it is real... I just got tired of spending upwards of $4,000 on graphics processors every year to power it at the time.
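(Quick sanity check on that pixel-count claim; a minimal sketch in plain Python, assuming nothing beyond the resolutions already mentioned.)

```python
# Compare total pixels: triple-1440p Eyefinity spread vs. 4K UHD.
eyefinity = 7680 * 1440   # three 2560x1440 panels side by side
uhd_4k = 3840 * 2160      # standard 4K UHD

print(f"7680x1440: {eyefinity:,} pixels")    # 11,059,200
print(f"3840x2160 (4K): {uhd_4k:,} pixels")  # 8,294,400
print(f"Ratio: {eyefinity / uhd_4k:.2f}x")   # ~1.33x the pixels of 4K
```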

Now I happily game on a single 120Hz, 1440P, 32" display.

goopy20 said:

It's funny, though, how first people are trying so hard to convince me that a 4870 AMD is still fine to play modern games. And now my 1060GTX and i5 8400 are labelled as crap and I'm buying incorrect gear... 

You aren't getting it. You are complaining about the lack of component longevity, yet you bought average hardware.
The Radeon 4870 was the fastest GPU available when it released... and thus stood the test of time because of it.

Your hardware isn't "crap"; it's just average.

goopy20 said:

And you're right, releasing a game now with a 2080RTX as the minimum requirement will end up in financial disaster. That is why those games don't exist until the next gen consoles hit the market. My whole point is that when the next gen consoles come out, the hardware that will be in them will be the new minimum requirements for pc. Looking back and thinking about power usage and heat displacement, I'm guessing the ps5's GPU will be leaning more towards a RX5700/ 2060RTX.

The point you are missing is that when the next-gen consoles release, developers will continue to build games with low-end PCs in mind. That means hypothetical GPUs like a GeForce RTX/GTX 3030 or RX 6300... which would definitely have less performance than the RTX 2080 or RX 5700.

Conina said:
Pemalite said:

You don't need a $2000 GPU, shit even a $2000 PC to get a similar output as a Playstation 4 Pro though.
And in a few years when you upgrade/buy a new PC, you can dial up the settings on those older games and revisit them in essentially what becomes a free remaster.

And the beauty of it... it works "out of the box" for thousands of PC games: insert new graphics card -> crank up the resolution/texture/post-processing and/or enjoy higher framerates and better frametimes.

The PS4 Pro and Xbox One X only run a small part of their game libraries better than the base systems do.

Earlier this month I upgraded from the Xbox One to the Xbox One X and the "enhanced" games look awesome, even original Xbox games like "Conker: Live & Reloaded" or 360 games like "Red Dead Redemption 1".

Then I wanted to continue the complete edition of "Forza Motorsport 6" and was disappointed that it looked the same as on my old Xbox One... it wasn't deemed worthy of the "X enhancement treatment", so it only uses a fraction of the hardware's power.

Yeah. When I went back to replay Dragon Age: Inquisition on the Xbox One X, I was extremely disappointed that it was still stuck at 900P with medium settings and thus looked extremely soft on the "most powerful console ever". On PC I can dial it up to 4K and downsample it to 1440P, no problem.

It's great when games get enhanced, but it's essentially a gamble.



--::{PC Gaming Master Race}::--