
Forums - PC - Finally joined the PC Master Race

Bofferbrauer2 said:

Give me a week and I'll be back home. My old PC runs on an Athlon X4 630 and a Radeon HD 5770, so relatively similar to the hardware Perm touted before. I'm not very big on modern games from AAA studios since they've become increasingly exploitative and formulaic, but I could give it a try. Age of Wonders III, Galactic Civilizations III, South Park: The Fractured but Whole and DiRT Rally are a few games that run on my PC just fine in 1080p, some with reduced details. I can try Kingdom Come: Deliverance on it when I'm back home since I have that game in my library, but haven't gotten around to playing it yet.

I have a Radeon 6950 available, which is in a similar ballpark, albeit it uses VLIW4, so it's not apples to apples with the older TeraScale architectures.

However, a Radeon 5770 actually comes up short against a Radeon 5870, by half in many instances, but if you decrease the resolution substantially to make up for the halved fillrate, you should be able to run many newer titles.
https://www.anandtech.com/bench/product/538?vs=511
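To put rough numbers on that (just back-of-the-envelope pixel counting, not a benchmark; real performance also depends on bandwidth, geometry and the CPU):

    # Rough pixel-count arithmetic only; illustrative, not measured data.
    resolutions = {
        "1080p": (1920, 1080),
        "900p":  (1600, 900),
        "720p":  (1280, 720),
    }

    base = resolutions["1080p"][0] * resolutions["1080p"][1]

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.2f} MPix per frame, "
              f"{pixels / base:.0%} of the 1080p pixel load")

    # 720p is ~44% of the 1080p pixel load, so a card with roughly half the
    # fillrate of a 5870 (like the 5770) lands back in the same ballpark once
    # you drop the resolution that far.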

goopy20 said:

All I'm saying is that benchmarking which 10-year-old GPU can still run PS4 titles nowadays is pointless.

Just because you deem it as such, doesn't make it so.
But hey! At least I am providing evidence to back up my assertions; why not try doing the same?  ;)

goopy20 said:

The point is that the limits of current gen consoles have been reached, hence why new consoles will come out next year.

And yet, games still look better every year.

goopy20 said:

We still have to wait for the exact specs but yes, Ray Tracing will no doubt be the standard. So how can you then still say something like a 1060 GTX will be able to run these next gen games, when a 2080ti can't even chug out 20 fps with ray tracing enabled? https://www.youtube.com/watch?v=wmleyuN7-Ew

Anyone who buys a Geforce 1060 expecting RTX Ray Tracing is living in a fantasy land... They can still run the same games, just without RTX Ray Tracing.

It's actually that simple... And that isn't going to change for years to come. Ray Tracing will be a toggled option just like Tessellation, Screen Space Reflections, Ambient Occlusion, Anti-Aliasing, Anisotropic Filtering, the list goes on. Starting to get the gist?
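From the player's side it really is just another entry in the options menu. A purely hypothetical settings block (every name and value here is made up for illustration) would look something like this:

    # Hypothetical options-menu state; all keys and values are invented.
    graphics_settings = {
        "tessellation": True,
        "screen_space_reflections": True,
        "ambient_occlusion": "HBAO",
        "anti_aliasing": "TAA",
        "anisotropic_filtering": 16,
        "ray_tracing": False,   # simply left off on cards without RT support
    }

    # The renderer checks each toggle like any other; RT is not special here.
    for setting, value in graphics_settings.items():
        print(f"{setting}: {value}")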

Plus, not all Ray Tracing implementations are made equal; developers are still coming to terms with the best approaches to implementing Ray Tracing. (In fact, many older engines, such as CryEngine, used Path Tracing years ago anyway, which is an early/simpler variant of Ray Tracing.)

https://en.wikipedia.org/wiki/Path_tracing

In fact, many implementations of subsurface scattering (a technique showcased late in the 7th gen, such as in Halo 4) used Path Tracing.

goopy20 said:

Of course, you can turn down graphics settings. But if you want equal or even better graphics compared to the next gen consoles, you will simply need a 2080RTX or better and an 8 core CPU to run them.

Prove it.

Fact is, you are using the exact same arguments people used when the 8th gen consoles launched. They were wrong then, so what makes you right now?




www.youtube.com/@Pemalite

goopy20 said:
Pemalite said:

Benchmarks are evidence.
So what you are saying is that you are against any evidence provided? Wow.





Either way, I am refuting your comments not to change your mind, but for others who peruse the forums... The evidence just makes it look like you have an extreme confirmation bias and don't have any real argument to present.

In short, those videos prove that a Radeon 5870 can run:
* Overwatch - 1080p, High settings, perfectly playable.
* Sea of Thieves - 1080p, 30fps, perfectly playable.
* Fortnite - 1080p, 60fps.
* For Honor - 1080p, 60fps.
* Battlefield 1/V - 1080p, 30fps.

GTA 5, Far Cry 5, DiRT 4, Rainbow Six: Siege, The Witcher 3... Again, all playable on a 10-year-old Radeon 5870.


Meaning your argument that you need the latest and greatest GPUs on PC "because of the consoles" is actually redundant.


120fps is overkill? Clearly you have never used a 120Hz monitor, otherwise you wouldn't be saying that.

You should do some research on refresh rates, why they are important, and why you need a framerate to match.

http://gaminghardwarereviews.com/monitors/monitor-refresh-rate/
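If you want the quick arithmetic behind it (illustrative numbers only): the refresh rate sets the time budget you have to render each frame, and the framerate has to keep up with it or frames get repeated or torn.

    # Per-frame time budget at common refresh rates; purely illustrative.
    for hz in (60, 120, 144):
        budget_ms = 1000 / hz
        print(f"{hz} Hz -> {budget_ms:.2f} ms per frame")

    # 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms.
    # Miss the budget and the panel re-shows the previous frame (judder) or
    # shows a partial new one (tearing), which is why a 120Hz monitor only
    # feels like 120Hz when the framerate actually matches it.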

The majority of multiplats take advantage of cutting-edge PC technology. Control is the latest example.

PC also has exclusives that are visual showpieces, such as Star Citizen, which I listed prior.

Thus your argument is entirely without merit and can be dropped into the "fake news" category; it's been proven otherwise.

Can you see the contradiction in your statements?
Let me point it out:

And:

As for Ray Tracing, it is most certainly able to become the norm on PC. - There is an application that uses the depth buffer to enable Ray Tracing, which is why we can have Ray Tracing in any game, even Crysis, which released 12 years ago, in 2007.
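I don't have that tool's source, so take this as nothing more than a sketch of the general idea behind depth-buffer ("screen-space") ray tracing: march a ray a few steps and call it a hit once it falls behind the depth the buffer already stores for that pixel. The scene, buffer size and step count below are all made up for illustration.

    import numpy as np

    # Toy 64x64 depth buffer: a flat "floor" at depth 10 with a closer box
    # (depth 5) in the middle. Units are arbitrary.
    H, W = 64, 64
    depth = np.full((H, W), 10.0)
    depth[20:40, 20:40] = 5.0

    def screen_space_trace(x, y, dx, dy, dz, steps=48):
        """March a ray starting on the surface visible at pixel (x, y),
        stepping by (dx, dy) in screen space and dz in depth. Returns the
        pixel where the ray first ends up behind the stored depth (a hit),
        or None if it leaves the screen without hitting anything."""
        z = depth[y, x]
        for _ in range(steps):
            x, y, z = x + dx, y + dy, z + dz
            xi, yi = int(round(x)), int(round(y))
            if not (0 <= xi < W and 0 <= yi < H):
                return None          # ray left the screen: no information
            if z >= depth[yi, xi]:
                return (xi, yi)      # ray passed behind known geometry
        return None

    # A ray leaving the floor and heading towards the box.
    print(screen_space_trace(10, 30, dx=1.0, dy=0.0, dz=-0.05))
    # Real implementations reconstruct view-space positions and use a
    # thickness threshold, but the depth-buffer comparison is the core trick.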

See here:

The rumors? Microsoft have outright stated that their console will have hardware accelerated Ray Tracing, Sony hasn't made such a confirmation AFAIK, but they have stated they will support Ray Tracing.

And no, not everyone who doesn't own a 2000 series RTX GPU will need to upgrade to play most multiplats. - You do know you can turn visual settings on and off, right?

It is most certainly a proper game. - And it's being released in "modules".






All I'm saying is that benchmarking which 10-year-old GPU can still run PS4 titles nowadays is pointless. The point is that the limits of current gen consoles have been reached, hence why new consoles will come out next year. We still have to wait for the exact specs but yes, Ray Tracing will no doubt be the standard. So how can you then still say something like a 1060 GTX will be able to run these next gen games, when a 2080ti can't even chug out 20 fps with ray tracing enabled? https://www.youtube.com/watch?v=wmleyuN7-Ew

Of course, you can turn down graphics settings. But if you want equal or even better graphics compared to the next gen consoles, you will simply need a 2080RTX or better and an 8 core CPU to run them.

Because if Raytracing really comes to the new consoles, they also won't get any more FPS unless they come with a drastically different, simplified implementation, or just use Raycasting instead of Raytracing and call it a day. There's a reason there haven't been any games in the past that use either Raytracing or rely exclusively on volumetric elements (aka voxels. And no, Minecraft doesn't use voxels despite calling its blocks that).



Pemalite said:

OK, if you want facts, just look at the early PS4 titles that weren't cross-platform anymore, like AC Unity. Black Flag, which also came out on 360 and PS3, ran beautifully on almost any GPU. However, as soon as developers ditched the older gen, this happened...

Assassin's Creed Unity PC Specs Require a Lot of Your Rig

You might be forced to upgrade if planning to play Unity on PC.

Ubisoft today announced the minimum and recommended PC specs for Assassin's Creed Unity, and let's just say it's going to generate Nvidia and AMD some new business.

A 64-bit operating system is required in order to play Unity, and you'll need a whopping 50 GB of hard drive space to install it. The processor and RAM requirements aren't especially noteworthy, but what stands out most are the video card requirements.

A GTX 680 or HD 7970 is the bare minimum for what will run the game. The only video cards supported at release are the GTX 680 or better; the GTX 700 series; the HD 7970 or better; and the R9 200 series. Laptop versions of these "may work but are not officially supported."

By comparison, the last Assassin's Creed game, Black Flag, recommended a GTX 470 or HD 5850 (GTX 260/HD 4870 required), both of which are significantly older than what's being asked for by Unity. Even looking at other games that have been or will be released this fall--Alien: Isolation (GT 430/HD 5550), The Evil Within (GTX 460), and Call of Duty: Advanced Warfare (GTS 450/HD 5870)--shows how Unity's requirements blow them all away.

https://www.gamespot.com/articles/assassins-creed-unity-pc-specs-require-a-lot-of-yo/1100-6423137/

And yes, maybe the minimum requirements aren't 100% accurate and you can get the game running on a 660GTX as well. But anyone who wanted similar or better graphics compared to the console versions had to upgrade back then to at least a 670GTX. Also, as soon as all of the major developers stopped supporting the PS3/360, minimum PC requirements went up big time and the GTX 200/400 series became pretty much useless overnight.

If you don't believe me, I double dare you to try and get AC Unity running on that Radeon HD 5770 without setting your house on fire. 

Last edited by goopy20 - on 25 September 2019

Two people in this thread are bringing benches, and other undeniable facts to the table. One of them is highly knowledgeable of PC hw and tech in general, while a third person brings their "opinion" to the table, asserting it as fact.

This is going to go around in circles, because one person refuses to yield to the textbook facts.



Mankind, in its arrogance and self-delusion, must believe they are the mirrors to God in both their image and their power. If something shatters that mirror, then it must be totally destroyed.

Bofferbrauer2 said:

Because if Raytracing really comes to the new consoles, they also won't get any more FPS unless they come with a drastically different, simplified implementation, or just use Raycasting instead of Raytracing and call it a day. There's a reason there haven't been any games in the past that use either Raytracing or rely exclusively on volumetric elements (aka voxels. And no, Minecraft doesn't use voxels despite calling its blocks that).

Sure.
Ray Tracing is still stupidly demanding and still in its relative infancy. It will become progressively more demanding and more prominent as time goes on; it hasn't just happened overnight, it's been years of progress. Ray Tracing cores did allow us to take a big leap forward, but we are far, far away from living in a fully ray traced world. It will be a hybrid rasterization + ray tracing approach for years to come.
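Some very rough numbers on why it is so demanding (every figure below is an assumption I'm picking for illustration, not a spec):

    # Back-of-the-envelope ray budget at 1080p/60; all numbers are assumptions.
    width, height, fps = 1920, 1080, 60
    pixels_per_second = width * height * fps

    for rays_per_pixel in (1, 4, 16):    # a single effect vs. heavier hybrid use
        rays_per_second = pixels_per_second * rays_per_pixel
        print(f"{rays_per_pixel:>2} rays/pixel -> {rays_per_second / 1e9:.2f} billion rays/s")

    # One ray per pixel already needs ~0.12 billion rays every second and 16
    # need ~2 billion, before any shading or denoising. That's why current
    # games ray trace a few effects and rasterize everything else.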

goopy20 said:


Nice cut and paste job.
But you are missing the point. Still. You are still clinging to developer/publisher recommended/minimum specs as some kind of gospel; we have already proven they are inaccurate.

Here is Assassin's Creed Unity running on a Radeon 5870: 1080p, 30-40+ FPS.

As you can see, you don't need a Radeon 7970/Radeon R9 200 GPU.
And to show that a GeForce GTX 480 didn't "become useless overnight", here it is running Unity at 1080p with some ultra settings at 30fps.



Again, you have been proven wrong.

goopy20 said:

If you don't believe me, I double dare you to try and get AC Unity running on that Radeon HD 5770 without setting your house on fire. 

Don't shift the goalposts. We were talking about the Radeon 5870, not the 5770.
But you know what? I'll still prove you wrong anyway, just for kicks and giggles.

Here it is running on a Radeon 6770, which is a rebadged Radeon 5770.
https://www.anandtech.com/show/4296/amds-radeon-hd-6770-radeon-hd-6750-the-retail-radeon-5700-rebadge

Chazore said:
Two people in this thread are bringing benches, and other undeniable facts to the table. One of them is highly knowledgeable of PC hw and tech in general, while a third person brings their "opinion" to the table, asserting it as fact.

This is going to go around in circles, because one person refuses to yield to the textbook facts.

It's certainly been entertaining, that's for sure.




www.youtube.com/@Pemalite

Pemalite said:

I think that one was pointed at me. And guess what: I have Assassin's Creed Unity (got it for free on Uplay a couple of months ago), and it works, even though my cooler isn't happy (I had to underclock and undervolt the CPU by 200 MHz/0.115 V for some games because the cooler has some issues, but I don't see why I should get a new one for such an old build). I didn't mention the game before because it's from 2014, so already somewhat older.
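For what it's worth, the reason a small underclock/undervolt takes so much load off the cooler is that dynamic CPU power scales roughly with frequency times voltage squared. A rough sketch (the stock values below are assumptions, around 2.8 GHz / 1.40 V for an Athlon X4 630; only the -200 MHz / -0.115 V offsets come from my actual settings):

    # Dynamic power scales roughly as P ~ f * V^2; stock figures are assumed.
    stock_freq, stock_volt = 2.8e9, 1.40      # assumed stock: 2.8 GHz, 1.40 V
    new_freq = stock_freq - 200e6             # -200 MHz underclock
    new_volt = stock_volt - 0.115             # -0.115 V undervolt

    ratio = (new_freq / stock_freq) * (new_volt / stock_volt) ** 2
    print(f"~{ratio:.0%} of stock dynamic power, {1 - ratio:.0%} less heat to dump")
    # Roughly a fifth less heat for only a ~7% frequency drop, which is why the
    # old cooler copes again.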



Pemalite said:

I wouldn't exactly call that playable. The framerate is all over the place and it goes down to 10 fps in the more crowded scenes on a 6770. We can debate this all day, but just answer me one simple question.

Did or didn't PC games become a lot more demanding as soon as developers stopped supporting the PS3/360? That was a rhetorical question, and we both know that a 5770 could run AC Black Flag, which was still a cross-gen game, at ultra settings and 60fps. PC gamers were outraged when the PC requirements were announced for AC Unity. Just look in the comments section of this article:

https://www.gamespot.com/articles/assassins-creed-unity-pc-specs-require-a-lot-of-yo/1100-6423137/

Or look at what happened when Batman Arkham Knight came out and people could barely run it on a 960 GTX, let alone a 660 or lower:

It gets worse - Batman: Arkham Knight on PC lacks console visual features

Performance is poor and specific graphical effects are missing too.

https://www.eurogamer.net/articles/digitalfoundry-2015-batman-arkham-knight-pc-lacks-console-visual-features

And yes, people could turn down settings, play it in 720p and whatnot on lower than the minimum required specs, but which self-respecting PC gamer would want to play like that? The fact is that anyone who wanted performance similar to the consoles had to upgrade if they still had something like a 5770. I mean, do you honestly believe it's a coincidence that all games released after developers moved on from PS3 to PS4 required a 660 GTX or higher as the minimum spec?

Look, I'm not ignoring facts, nor am I stating them. But I will give you a prediction and you tell me if it sounds about right.

  • 2020 - the PS5 and 3*** GTX come out.
  • 2021 - developers will move away from PS4/Xbox.
  • Minimum PC requirements in 2021 for all major multiplatform games will be a 2080RTX.
  • A 3080RTX will be required to play multiplatform games in native 4K.
  • You can still play some of those games on a 1060GTX as long as you turn graphics and resolution down, effectively making them look like PS4 ports in the process.
  • 2023 - The 4*** GTX gets released and, for a measly $2000, you have full bragging rights that you can play all console games at 120fps, in native 4K and with full Nvidia Hairworks enabled.
  • 2027 - the PS6 comes out and the cycle starts again.
  • 2028 - Starship Citizen gets an official release and ends up on the PS6 as well.
Last edited by goopy20 - on 26 September 2019

goopy20 said:

I wouldn't exactly call that playable. Framerate is all over the place and it will go down to 10 fps on the more crowded scenes on a 6770. We can debate this all day but just answer me one simple question.

Did or didn't pc games became a lot more demanding as soon as developers stopped supporting the ps3/360? That was a rhetorical question and we both know that a 5770 could run AC Black Flag, which was still a cross gen game, in ultra settings at 60fps. Pc gamers were outraged when the pc requirements were announced for AC Unity. Just look in the comments section of this article:

https://www.gamespot.com/articles/assassins-creed-unity-pc-specs-require-a-lot-of-yo/1100-6423137/

And yes people could turn down settings, play it in 720p and whatnot on lower than the minimum required specs, but which self respecting pc gamer would want to play like that? The fact is that anyone who wanted similar performance as the consoles had to upgrade if they still had something like a 5770. I mean do you honestly believe it's a coincidence that all games released after developers moved on from ps3 to ps4 required a 660 GTX or higher as the minimum specs?

Look, I'm not ignoring facts, nor am I stating them. But I will give you a prediction and you tell me if it sounds about right.

  • 2020 - the ps5 and 3*** GTX come out.
  • 2021 - developers will move away from ps4/ Xbox 
  • Minimum pc requirements in 2021 for all major multiplatform games will be a 2080RTX. 
  • 3080RTX required to play multiplatform games in native 4k    
  • You can still play some of those games on a 1060GTX as long as you turn graphics and resolution down, effectively making them look like ps4 ports in the process. 
  • 2023 - The 4*** GTX gets released and for a measly $2000, you have full bragging rights that you can play all console games at 120fps, native 4k and full Nvidia Hairworks enabled.
  • 2027 - ps6 comes out and the cycle start again
  • 2028 - Starship Citizen gets an official release and ends up on ps6 as well.

You still haven't learned anything, have you?

Just looking at your timetable, I know that you are waaaaay off.

  1. Developers moving away from PS4/XBO in 2021 already? They took over 4 years to do so with the last gen consoles, and those didn't have anywhere near the success of the PS4, or mid-gen refreshes that are still easy to port to from the next gen.
  2. You seem to equate the next gen of NVidia with the next console generation, and do so in terms of performance. This is where you get it all wrong, as the consoles can only use more mainstream hardware, not high-end. That's due to both thermals (the RTX 3080 will most probably be around 300W TDP, more than the TDP budget of an entire console; see the quick sketch after this list) and price (while they don't pay the same price as consumers, NVidia certainly wouldn't sell their GPUs 80% off to console makers. With something in the performance range of an RTX 3080, you couldn't expect a console price tag below $1200).
  3. We just showed you over several posts that the minimum requirements didn't grow anywhere near as fast. An RTX 2080 may be the minimum in 2025, but certainly not in 2021 already. Like I explained already, the next gen will use more mainstream hardware, which performance-wise will be closer to the RX 5700 or RTX 2060.
  4. Of course you will be able to continue playing on a GTX 1060. And you will be able to do so without turning the graphics or resolution down, or only slightly. It will still look better than the PS4 Pro. As explained above, the next gen consoles won't be that much more powerful than a GTX 1060, at least not at release. Maybe after a mid-gen upgrade, but certainly not early on.
  5. A GTX 4xxx series only in 2023? Not with AMD banging at their door. And if they waited that long, they would have to sell them for $200, not $2000, because they would have fallen too far behind. Really, your 2023 point just shows that you understand neither the economics nor how the console market, the PC market or PC hardware work at all.
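Here's the quick sketch I mentioned in point 2. Every number is an assumption for illustration, not a leaked spec:

    # Toy console power budget; all figures are assumptions.
    console_wall_power = 200                  # W, rough total draw of a console
    everything_else = 50                      # W for CPU, RAM, SSD, fans, PSU losses
    gpu_budget = console_wall_power - everything_else   # ~150 W left for the GPU

    desktop_board_power = {                   # rough desktop power classes
        "RX 5700 / RTX 2060 class": 170,
        "RTX 2080 Ti / rumoured '3080' class": 280,
    }

    for name, watts in desktop_board_power.items():
        print(f"{name}: ~{watts} W, about {watts / gpu_budget:.1f}x the console GPU budget")

    # The mainstream class is close enough that lower console clocks can bridge
    # the gap; the high-end class alone draws more than the entire box, which is
    # the thermal point in item 2.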

And if the world really ran on the rules that you posted, do you really think anybody would buy anything other than the top-end GPU? Or that AMD and NVidia would even make such lower-end GPUs? If yes, then you don't seem to understand what you're saying in your post, because it would make anything other than the highest end useless and not viable for anybody anymore.

Oh, and about Batman: Arkham Knight, you didn't hear the outcry back then, did you? Because PCs were perfectly able to run at PS4 settings, even older ones. The executives didn't understand that you need more than a small studio of 12 people and 8 weeks to port a game to PC, which was the reason why graphical features were cut - they just didn't have the time to implement them, and the rest was cobbled together so badly that it needed a total overhaul to run properly on anything that wasn't the PS4 from which the code was ported.

The code of Arkham Knight doesn't understand that a PC has separate RAM and VRAM, and tries to treat both as unified RAM. As a result, VRAM consumption is monstrous. Also, they didn't have the time to optimize the code for NVidia one bit, which is why the game runs much smoother on AMD GPUs. An RX 480 is amply sufficient for 1440p/60FPS, while a GTX 1060 gets some stutters from texture streaming. You can run the game on a Ryzen 5 2400G at 1080p60 without an additional GPU. Oh, and the missing features got patched in a couple of months later, when the team that ported the game actually got the time to implement them.
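To illustrate what treating RAM and VRAM as one pool does in practice, here's a toy budget with invented sizes (not the game's real numbers):

    # Toy streaming budget; every size here is invented for illustration.
    # On console the whole working set lives in one unified 8 GB pool, so this
    # problem never comes up there.
    pc_system_ram_gb = 16
    pc_vram_gb = 2                   # a common mid-range card in 2015
    texture_working_set_gb = 5       # what the console build keeps resident

    # Port logic that assumes one big unified pool thinks there is plenty of room:
    assumed_pool_gb = pc_system_ram_gb + pc_vram_gb
    print(f"assumed pool: {assumed_pool_gb} GB -> the working set 'fits'")

    # But the GPU can only sample textures that actually sit in VRAM, so the
    # real limit is the 2 GB card:
    overshoot_gb = texture_working_set_gb - pc_vram_gb
    print(f"VRAM overshoot: {overshoot_gb} GB constantly shuffled over PCIe -> stutter")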

Warner Brothers have a history of shoddy PC ports; Arkham Knight was only the tip of the iceberg. Mortal Kombat X was already a mess, but nothing as bad as this one. Even Ubisoft didn't fuck up that hard - and they fucked up Tetris of all things!

Last edited by Bofferbrauer2 - on 26 September 2019

Actually, Arkham Knight ran that way due to it being a shitty port from a console porting studio, Iron Galaxy, and they have a known history of this sort of bad performance. Blaming the bad performance of a port on hardware does not make a justified point at all.

Also:

"2028 - Starship Citizen gets an official release and ends up on ps6 as well"

That's Star Citizen, and no, CR stated that it wouldn't be made for consoles because he wanted to craft it for PC gamers, by PC gamers, for the hardware that platform constantly provides. So no, it won't be on console, let alone just a PS one.

Jesus, Kerotan stated this about Star Citizen as well as moving posts. Seriously hope he hasn't wormed his way back onto the site via an ancient old account.



Mankind, in its arrogance and self-delusion, must believe they are the mirrors to God in both their image and their power. If something shatters that mirror, then it must be totally destroyed.

Bofferbrauer2 said:

Look, I am only making a prediction here and we have to wait and see how much of it plays out. However, it is based on scientific, factual evidence. First off, it didn't take 4 years before developers ditched the PS3/360. The PS4 came out in November 2013, and exactly a year later games like AC Unity came out that weren't released on last gen anymore. Also, isn't it confirmed that the PS5/Scarlett will have a Navi GPU that sits somewhere between a 2070 and a 2080RTX?

And you're right, maybe Batman was a shoddy port. But isn't it weird how PC gamers were all blaming developers for sucky programming for most games released during that period? It could be a case of mass bad programming, or it could be a case of developers spending years making games that were designed from the ground up to run on PS4-spec hardware, with PC gamers as an afterthought. I mean, they simply didn't care if PC gamers couldn't play their game because they didn't have a GPU with 4GB of VRAM, a 4-core CPU and at least a 660GTX. They didn't care because they knew console gaming is where 98% of their sales would be coming from.

Last edited by goopy20 - on 26 September 2019