goopy20 said:
Pemalite said:

Benchmarks are evidence.
So what you are saying is that you are against any evidence provided? Wow.

Either way, I am refuting your comments not to change your mind, but for others who peruse the forums... The evidence just makes it look like you have an extreme confirmation bias and no real argument to present.

In short, those videos prove that a Radeon 5870 can run:
* Overwatch. - Perfectly playable, 1080P, High Settings.
* Sea of Thieves. - Perfectly playable, 1080P, 30fps.
* Fortnite. - 1080P, 60fps.
* For Honor. - 1080P, 60fps.
* Battlefield 1/5. - 1080P, 30fps.

GTA 5, Far Cry 5, Dirt 4, Rainbow Six: Siege, Witcher 3... Again, all playable on a 10-year-old Radeon 5870.


Meaning your argument that you need the latest and greatest GPUs on PC "because of the consoles" is actually redundant.


120fps is overkill? Clearly you have never used a 120Hz monitor, otherwise you wouldn't be saying that.

You should do some research on refresh rates: why they matter and why you need a framerate to match.

http://gaminghardwarereviews.com/monitors/monitor-refresh-rate/
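If you want the quick math rather than an article: the refresh rate sets the frame-time budget. At 60Hz the GPU has roughly 16.7ms to finish a frame, at 120Hz only about 8.3ms, so the framerate has to keep up for the higher refresh rate to actually show you anything new. A tiny illustrative snippet (my own example numbers, not taken from the linked page):

```python
# Frame-time budget for common refresh rates (illustrative only).
refresh_rates_hz = [60, 120, 144, 240]

for hz in refresh_rates_hz:
    frame_budget_ms = 1000.0 / hz  # time available to render one frame
    print(f"{hz:>3} Hz -> {frame_budget_ms:5.2f} ms per frame")

# Example: a frame rendered in 12 ms (~83 fps) keeps up with a 60 Hz display
# (16.67 ms budget) but misses the 8.33 ms budget of a 120 Hz display.
render_time_ms = 12.0
for hz in (60, 120):
    budget = 1000.0 / hz
    status = "meets" if render_time_ms <= budget else "misses"
    print(f"A {render_time_ms} ms frame {status} the {hz} Hz budget of {budget:.2f} ms")
```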

The majority of multiplats take advantage of cutting-edge PC technology. Control is the latest example.


PC also has exclusives that are visual showpieces, such as Star Citizen, which I listed prior.

Thus your argument is entirely without merit and can be dropped into the "fake news" category; it's been proven otherwise.

Can you see the contradiction in your statements?
Let me point it out:

And:

As for Ray Tracing, it can most certainly become the norm on PC. - There is an application that uses the depth buffer to enable Ray Tracing, which is why we can have Ray Tracing in any game, even Crysis from 2007, twelve years ago.

See here:
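In rough terms, that is how depth-buffer ("screen-space") approaches work: the game has already written a depth value for every pixel, so a post-process effect can march a ray across the screen and check at each step whether it has passed behind the surface stored in the depth buffer. This is just my own illustrative sketch of the general idea (the function name and numbers are made up, it is not the actual tool's code):

```python
import numpy as np

def screen_space_trace(depth, start_px, dir_px, start_depth, depth_step,
                       max_steps=64, thickness=0.05):
    """Minimal sketch of depth-buffer ('screen-space') ray marching.

    depth       : 2D array of per-pixel depth values already rendered by the game
    start_px    : (x, y) pixel the ray starts from
    dir_px      : (dx, dy) step direction in pixels per iteration
    start_depth : depth of the ray at the starting pixel
    depth_step  : how much the ray's depth changes per iteration
    Returns the (x, y) pixel the ray hits, or None if it leaves the screen.
    """
    h, w = depth.shape
    x, y = start_px
    ray_depth = start_depth
    for _ in range(max_steps):
        x += dir_px[0]
        y += dir_px[1]
        ray_depth += depth_step
        xi, yi = int(round(x)), int(round(y))
        if not (0 <= xi < w and 0 <= yi < h):
            return None  # ray left the screen: no depth information available
        scene_depth = depth[yi, xi]
        # If the ray has just passed behind the surface stored in the depth
        # buffer, treat that surface as the intersection point.
        if scene_depth <= ray_depth <= scene_depth + thickness:
            return (xi, yi)
    return None

# Toy usage: a flat "wall" at depth 0.5 covering the right half of a 64x64 buffer.
depth = np.ones((64, 64))
depth[:, 32:] = 0.5
hit = screen_space_trace(depth, start_px=(10, 32), dir_px=(1.0, 0.0),
                         start_depth=0.2, depth_step=0.01)
print("hit pixel:", hit)
```

Because it only uses what is already on screen, it works on any game with a readable depth buffer, which is the whole appeal, but it also can't "see" anything off-screen, which is the usual limitation of the technique.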

The rumors? Microsoft has outright stated that their console will have hardware-accelerated Ray Tracing; Sony hasn't made such a confirmation AFAIK, but they have stated they will support Ray Tracing.

And no, not everyone who doesn't own a 2000-series RTX GPU will need to upgrade to play most multiplats. - You do know you can turn visual settings on and off, right?

It is most certainly a proper game. - And it's being released in "modules".

All I'm saying is that benchmarks of which 10-year-old GPUs can still run PS4 titles nowadays are pointless. The point is that the limits of the current-gen consoles have been reached, hence why new consoles will come out next year. We still have to wait for the exact specs, but yes, Ray Tracing will no doubt be the standard. So how can you then still say something like a GTX 1060 will be able to run these next-gen games, when a 2080 Ti can't even chug out 20 fps with ray tracing enabled? https://www.youtube.com/watch?v=wmleyuN7-Ew

Of course, you can turn down graphics settings. But if you want equal or even better graphics compared to the next-gen consoles, you will simply need an RTX 2080 or better and an 8-core CPU to run them.

Because if ray tracing really does come with the new consoles, they won't be pushing any more FPS either, unless they come with a drastically different, simplified implementation or just use ray casting instead of ray tracing and call it a day. There's a reason no games in the past have either used ray tracing or relied exclusively on volumetric elements (a.k.a. voxels; and no, Minecraft doesn't use voxels despite calling its blocks that).
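For what it's worth, here is the difference between those two terms in toy form (my own Python illustration with made-up scene data, not anyone's engine code): ray casting fires one primary ray per pixel and stops at the first hit, while ray tracing spawns further rays from every hit (reflections, shadows, and so on), which is exactly where the extra cost comes from.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Distance to the nearer intersection in front of the ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction assumed normalised, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# Two spheres: a small one in front of the camera and a huge "ground" sphere.
SPHERES = [((0.0, 0.0, -3.0), 1.0), ((0.0, -101.0, -3.0), 100.0)]

def closest_hit(origin, direction):
    best = None
    for center, radius in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, center)
    return best

def ray_cast(origin, direction):
    # One primary ray, one shading decision: cheap, no bounces.
    return 1.0 if closest_hit(origin, direction) else 0.0

def ray_trace(origin, direction, depth=2):
    # Same primary ray, but each hit spawns another ray: the cost multiplies.
    hit = closest_hit(origin, direction)
    if hit is None or depth == 0:
        return 0.0 if hit is None else 1.0
    t, center = hit
    point = [o + t * d for o, d in zip(origin, direction)]
    normal = [p - c for p, c in zip(point, center)]
    n_len = math.sqrt(sum(n * n for n in normal))
    normal = [n / n_len for n in normal]
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    reflected = [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]
    # Crude mix of local shading and the reflection bounce.
    return 0.5 + 0.5 * ray_trace(point, reflected, depth - 1)

eye, forward = (0.0, 0.0, 0.0), (0.0, 0.0, -1.0)
print("ray casting :", ray_cast(eye, forward))
print("ray tracing :", ray_trace(eye, forward))
```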