
AMD confirms the PS5 will use an RDNA 2 GPU (the same as the Xbox Series X)


 

Poll: What do you think?
Yes: 8 votes (33.33%)
I love this: 5 votes (20.83%)
Damn Corona Virus: 11 votes (45.83%)
Total: 24 votes

vivster said:
goopy20 said:

I'm not sure it will take years. Sony's exclusives should already make pretty good use of the PS5 from the start, and this gen we already had games like Batman: Arkham Knight, Witcher 3, Dying Light, AC Unity etc. pretty early on. Most of those games had a minimum requirement of a GTX 660 (which is roughly what's in the current-gen consoles) and some recommended a GTX 770.

When the PS4/Xone came out there were already a ton of budget options to get console-like performance on PC, with cards like the GTX 750 Ti or the $80 AMD 260X. Also, there was the $199 GTX 960, which pissed all over the GTX 660. Next gen it looks like we won't have those kinds of budget options, though. At least not for a while.

Thankfully we're here to be sure for you, because we've been on PC for multiple console generations.

Same here, buddy. But isn't it a scientific fact that we've never had consoles release with hardware that isn't even on the market yet for PC? When the PS4/Xone came out, the GTX 660 was already 2 years old.

Last edited by goopy20 - on 09 March 2020

JRPGfan said:

Pemalite, this is regarding the "12 TFLOP" number of the rumored Xbox Series X / Playstation 5, vs a GeForce 2080.

Obviously.

JRPGfan said:

1) yes, kinda.

No. It certainly is.

JRPGfan said:

2) This isn't relevant to the debate. Right now, it's GeForce 2080 Ti TFLOPs vs future console TFLOPs. I just pointed out RDNA 1 is roughly equal to Nvidia (performance per flop).

You brought up the point first; if it's irrelevant, why mention it in the first place?

Why not make an Apples to Apples comparison of Future GPU vs Future Console?

RDNA is only able to be competitive with nVidia (which still has a big advantage, don't forget) because it's on 7nm while nVidia is on 12nm.

JRPGfan said:

3) This isn't relevant to the debate. Right now, we're not debating how much, or to what degree, ray tracing will be shown in upcoming games.

You are debating about Teraflops in the context of performance, so Ray Tracing performance is most certainly relevant.

JRPGfan said:

4) Yes, next-gen PC graphics cards will once again be far ahead.

And they are coming before Next-Gen consoles.

JRPGfan said:

---> However, the original point Archangel came out with was that consoles will be drastically weaker than a GeForce 2080 Ti
(because the price tag says so).


Just wait until the hardware drops before making any assertions maybe?

JRPGfan said:


My logic says it won't be.
Consoles, if they're 12 TFLOPs, should be close to current GeForce 2080 Tis.

GCN -> RDNA 1 -> RDNA 2 should be enough to make that happen.

Flops are irrelevant.
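To put rough numbers on that: theoretical FP32 throughput is just 2 ops (one fused multiply-add) x shader count x clock, so two GPUs can land on similar peak TFLOPs and still deliver very different frame rates. A minimal back-of-the-envelope sketch follows; the 2080 Ti figures are its public reference specs, while the console shader count and clock are purely hypothetical placeholders picked only to land near the rumoured 12 TFLOPs:

```python
# Back-of-the-envelope theoretical FP32 throughput: 2 ops (FMA) * shaders * clock.
# The 2080 Ti figures are its public reference specs; the "console" shader count
# and clock are hypothetical placeholders chosen only to land near the rumoured
# 12 TFLOPs -- they are not confirmed hardware specs.

def fp32_tflops(shader_count: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs assuming one fused multiply-add (2 ops) per shader per cycle."""
    return 2 * shader_count * clock_ghz / 1000.0

rtx_2080_ti = fp32_tflops(4352, 1.545)          # ~13.4 TFLOPs at reference boost clock
hypothetical_console = fp32_tflops(3328, 1.8)   # ~12.0 TFLOPs (placeholder config)

print(f"RTX 2080 Ti (reference boost):  {rtx_2080_ti:.1f} TFLOPs")
print(f"Hypothetical 12 TFLOP console: {hypothetical_console:.1f} TFLOPs")

# Similar peak numbers, but delivered performance also depends on memory bandwidth,
# caches, drivers and how much of that peak the architecture actually sustains.
```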

goopy20 said:
Pemalite said:

Console games aren't going to suddenly become 10x more graphically demanding; it's a transitional process, so it will take a couple of years.
By then... the Xbox Series X and Playstation 5 will be on the lower end of the performance spectrum.

We could possibly even be looking at 5nm parts by then. (Even if 7nm and 5nm are just advertising numbers rather than actual geometry-size shrinks.)

I'm not sure it will take years. Sony's exclusives should already make pretty good use of the PS5 from the start, and this gen we already had games like Batman: Arkham Knight, Witcher 3, Dying Light, AC Unity etc. pretty early on. Most of those games had a minimum requirement of a GTX 660 (which is roughly what's in the current-gen consoles) and some recommended a GTX 770.

The average gaming PC caught up quickly with the PS4/Xone because there were a ton of budget options available that offered console-like performance on PC. There were cards like the GTX 750 Ti and the $80 AMD 260X, and the $199 GTX 960/970 pissed all over the current-gen consoles. Next gen it looks like we won't have those kinds of budget options, though. At least not for a while.

They are exclusives. They aren't going to be ported to PC, making that point redundant.

A few games like Batman, Dying Light and Assassin's Creed most certainly did require beefier hardware... However, all those games were still operational on a Radeon 7850, just less performant, which required a scaling back of visuals.

Such discrepancies will always happen when it comes to ports... The base Xbox One, for example, ran one of the Assassin's Creed games better than the Playstation 4 did... That didn't happen because the Xbox One has better hardware, far from it. The port just wasn't optimal.

One thing we need to keep in mind about minimum requirements is... they are a guideline only.
You most certainly can run on hardware inferior or different to the specifications listed; higher requirements than necessary are often listed to reduce the number of support requests, saving on costs.

I am not aware of any 8th generation title that cannot be run on a Radeon 7850 and a Core 2 Quad CPU... Low Res Gamer has even built up a following in running games on antiquated hardware.

In saying that... It is interesting you listed the GTX 660... That is roughly equivalent to a Radeon 7850 anyway.
https://www.anandtech.com/bench/product/783?vs=778




goopy20 said:
vivster said:

Thankfully we're here to be sure for you, because we've been on PC for multiple console generations.

Same here, buddy. But isn't it a scientific fact that we've never had consoles release with hardware that isn't even on the market yet for PC? When the PS4/Xone came out, the GTX 660 was already 2 years old.

Both AMD and Nvidia are gonna release new GPUs before the new consoles come out. And in Nvidia's case their new generation will put their last generation and the consoles to shame. Just a year later consoles will be your regular midrange again.

The only scientific fact I see here is that consoles won't have much of an effect on PC gaming, as usual. Even if some of the games come out with slightly elevated requirements that will not be true for the vast majority of PC games.






Pemalite said:
[Pemalite's full reply, quoted above; snipped for length]

Isn't a Radeon 7850 pretty much what's in the PS4/Xone? Wouldn't that also mean that any multi-platform game should be able to run on a GTX 660/7850 at similar graphics settings to the console versions? The difference with next gen compared to current gen is that the GTX 660/7850 were almost two-year-old budget GPUs when the PS4/Xone came out. Lots of people were disappointed when the current-gen specs were revealed, especially when they learned the CPU was pretty lackluster compared to the average PC CPU at that time.

Next gen things will be very different. I mean, who would've expected consoles to come out with specs that are comparable to a modern high-end PC GPU? Obviously, even an RTX 2080 will be dated eventually when low- to mid-range GPUs come out that offer that same kind of performance at much lower cost, but it will probably take a lot longer compared to current gen. Who knows, maybe we will get a $200 RTX 3060 and an AMD equivalent right before the new consoles launch. However, looking at current prices, the RTX 3060 will probably be around $350 or more, let alone the RTX 3080.

Maybe you're right and minimum PC requirements aren't going to be an RTX 2080, especially in the beginning. However, the recommended requirements for getting a similar gameplay experience to the console versions will be. Just like a GTX 660 and at least 2GB of VRAM were recommended for most games when the PS4/Xone came out. https://gadgets.ndtv.com/games/news/looking-to-play-batman-arkham-knight-on-pc-you-might-want-to-read-this-first-706702

Last edited by goopy20 - on 10 March 2020

Pemalite said:
[Pemalite's full reply, quoted above; snipped for length]

I just wanted to make a small correction on the AC case.

The X1 has a slightly faster CPU, plus a lower (or equal?) resolution in that game, so when both combine in an open-world game it is expected that the X1 would perform a little better than the PS4, even though the PS4 had faster RAM (not sure how much the ESRAM of the X1 compensated for the problem in this case) and a faster GPU.
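For a sense of the magnitudes involved, here's a minimal sketch. The CPU clocks are the well-known retail figures (1.75 GHz vs 1.6 GHz); the resolutions are illustrative only, since the post doesn't state what the AC game actually rendered at on each console:

```python
# Rough arithmetic behind the point above: the Xbox One's CPU clock advantage vs
# the pixel counts its GPU has to push. CPU clocks are the well-known retail figures;
# the resolutions are illustrative examples only, not the game's confirmed render targets.

X1_CPU_GHZ, PS4_CPU_GHZ = 1.75, 1.60

def pixels(width: int, height: int) -> int:
    return width * height

cpu_advantage = X1_CPU_GHZ / PS4_CPU_GHZ - 1
print(f"X1 CPU clock advantage: {cpu_advantage:.0%}")       # ~9%

# If the X1 version also ran at a lower resolution (e.g. 900p vs 1080p),
# its GPU would have ~31% fewer pixels to shade each frame:
pixel_saving = 1 - pixels(1600, 900) / pixels(1920, 1080)
print(f"900p vs 1080p pixel saving: {pixel_saving:.0%}")     # ~31%
```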



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

goopy20 said:

Isn't a Radeon 7850 pretty much what's in the ps4/Xone?

Yup. Although the Xbox One is more like a Radeon 7750, which is a bit of a step down from a 7850.

goopy20 said:

Wouldn't that also mean that any multi-platform game should be able to run on a GTX 660/7850 at similar graphics settings to the console versions?

Roughly. There is of course some deviation... Most PC gamers for instance will run their games at a native resolution rather than a half-assed 1600x900... And lower visual settings as a result.
That's the PC's true advantage... Choice.

goopy20 said:

The difference with next gen compared to current gen is that the GTX 660/7850 were almost two-year-old budget GPUs when the PS4/Xone came out.

The Playstation 4 and Xbox One GPUs had a "few enhancements" over the PC derivatives, the ESRAM and extra ACE units being prime examples.

But yes, they were out-dated core designs upon release... However, back then Graphics Core Next was still competitive with nVidia... That isn't the case this time around.
nVidia are a full generation ahead of AMD at the moment.

goopy20 said:

Lots of people were disappointed when the current gen specs were revealed, especially when they learned the cpu was pretty lackluster compared to the average pc cpu at that time.

CPUs were a sticking point, but to be fair... console manufacturers have generally prioritized graphics capabilities over CPU capabilities anyway... The Super Nintendo, for example, was heavily criticized for its anemic CPU relative to its graphics and audio capabilities.

The OG Xbox, despite having a Celeron/Pentium 3 hybrid... was not class-leading on its release either, as the PC had 2.8GHz-class CPUs vs the OG Xbox with its 733MHz chip.

And even the Playstation 3, with the heavily "advertised" Cell Broadband processor... wasn't that great of a development environment, plus it could only achieve decent performance with iterative-refinement floating point.

Developers just learned to build their games within the confines of the hardware they had available.
Even next gen, the 8-core Ryzen CPUs won't be class-leading; they will be decent, and probably one of the larger CPU performance jumps in console history.

goopy20 said:

Next gen things will be very different. I mean, who would've expected consoles to come out with specs that are comparable to a modern high-end PC GPU? Obviously, even an RTX 2080 will be dated eventually when low- to mid-range GPUs come out that offer that same kind of performance at much lower cost, but it will probably take a lot longer compared to current gen.

The RTX 2080 will be mid-ranged/upper-mid range performance tier once nVidia transitions to 7nm before the next-gen consoles release.

Even the RTX 2080 is beaten by the 2080 Super, 2080 Ti and Titan RTX anyway, it's far from the best GPU on the market...

Next-gen consoles will be releasing in the same performance tier, relative to the PC's high end, as the 8th-gen Playstation 4 did: the Radeon 7850 was upper-mid range, but it was superseded by the Radeon R9 290X before the consoles dropped anyway, meaning the consoles were a GPU generation behind.

Same scenario... Just replace Radeon 7850 in the Playstation 4 with RDNA 2 in the Playstation 5... And replace Radeon R9 290X with Geforce RTX 3080.

It really doesn't help that AMD is technologically a generation behind in the PC space either.

One thing is for sure, once RDNA 2 drops, nVidia cannot charge a premium for Ray Tracing unless AMD's Ray Tracing comes up short in performance and/or features.

goopy20 said:

Maybe you're right and minimum PC requirements aren't going to be an RTX 2080, especially in the beginning. However, the recommended requirements for getting a similar gameplay experience to the console versions will be. Just like a GTX 660 and at least 2GB of VRAM were recommended for most games when the PS4/Xone came out. https://gadgets.ndtv.com/games/news/looking-to-play-batman-arkham-knight-on-pc-you-might-want-to-read-this-first-706702

Anyone who thinks the RTX 2080 will be minimum requirements this year is smoking crack.

The Radeon 7850 wasn't minimum requirements when the Playstation 4 released.
The Radeon x1800 wasn't minimum requirements when the Xbox 360 released.
The Geforce 3 Ti 500 wasn't the minimum requirements when the Original Xbox released...

So I think we can comfortably say that the RTX 2080 is not going to be the minimum requirement when next gen releases; the historical precedent is pretty clear on this.

In a few years the requirements will creep up, naturally.

DonFerrari said:

I just wanted to make a small correction on the AC case.

The X1 has a slightly faster CPU, plus a lower (or equal?) resolution in that game, so when both combine in an open-world game it is expected that the X1 would perform a little better than the PS4, even though the PS4 had faster RAM (not sure how much the ESRAM of the X1 compensated for the problem in this case) and a faster GPU.

Noted. Still resulted in a better port on Xbox One.
I would assume there would be other similar cases on the console as well.








vivster said:
goopy20 said:

Same here, buddy. But isn't it a scientific fact that we've never had consoles release with hardware that isn't even on the market yet for PC? When the PS4/Xone came out, the GTX 660 was already 2 years old.

Both AMD and Nvidia are gonna release new GPUs before the new consoles come out. And in Nvidia's case their new generation will put their last generation and the consoles to shame. Just a year later consoles will be your regular midrange again.

The only scientific fact I see here is that consoles won't have much of an effect on PC gaming, as usual. Even if some of the games come out with slightly elevated requirements that will not be true for the vast majority of PC games.

Of course they will. On PC there's always new and better hardware around the corner. Like I said, the only difference this time is that the consoles won't have hardware that's completely outdated as soon as they come out. At least not if you would consider an RTX 2080 dated by the end of this year. Enthusiasts will be able to pick up an RTX 3070/3080, and I'm sure all next-gen games will run in native 4K on them, assuming developers are aiming for 1440p on the PS5/Series X.
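The pixel math behind that assumption is simple enough; a rough sketch (real performance doesn't scale perfectly linearly with pixel count, so treat this as a ballpark):

```python
# Rough pixel math behind the 1440p-console vs 4K-PC comparison above.
# Real performance doesn't scale perfectly linearly with pixel count, but this
# gives a ballpark for how much extra GPU grunt native 4K asks for.

def pixels(width: int, height: int) -> int:
    return width * height

qhd = pixels(2560, 1440)   # 1440p: ~3.7 million pixels
uhd = pixels(3840, 2160)   # 4K:    ~8.3 million pixels

print(f"4K has {uhd / qhd:.2f}x the pixels of 1440p")   # 2.25x

# So a PC GPU would need very roughly ~2.25x the shading throughput of the
# console to push the same game at native 4K with matched settings.
```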

It will be interesting to see what the "mainstream" RTX 3060 will cost and how it'll stack up against these next-gen consoles and the RTX 2080, though. I mean, a GTX 1080 may be a bit old, but it still outperforms an RTX 2060 in most games.

Last edited by goopy20 - on 10 March 2020

Pemalite said:
[Pemalite's full reply, quoted above; snipped for length]

Yep, there were probably several other cases, but AC is the first I can remember that got a lot of coverage for being better on Xbox, along with the rumors of MS paying Ubi to do it. After that we were over a year into the gen, and comparisons between ports on both systems had lost a lot of focus.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

goopy20 said:
vivster said:

Both AMD and Nvidia are gonna release new GPUs before the new consoles come out. And in Nvidia's case their new generation will put their last generation and the consoles to shame. Just a year later consoles will be your regular midrange again.

The only scientific fact I see here is that consoles won't have much of an effect on PC gaming, as usual. Even if some of the games come out with slightly elevated requirements that will not be true for the vast majority of PC games.

Of course they will. On PC there's always new and better hardware around the corner. Like I said, the only difference this time is that the consoles won't have hardware that's completely outdated as soon as they come out. At least not if you would consider an RTX 2080 dated by the end of this year. Enthusiasts will be able to pick up an RTX 3070/3080, and I'm sure all next-gen games will run in native 4K on them, assuming developers are aiming for 1440p on the PS5/Series X.

It will be interesting to see what the "mainstream" RTX 3060 will cost and how it'll stack up against these next-gen consoles and the RTX 2080, though. I mean, a GTX 1080 may be a bit old, but it still outperforms an RTX 2060 in most games.

I'm not sold yet on the consoles actually reaching the performance of a 2080. If the rumors about the new Nvidia gen are true, a 3060 might actually come pretty close to the consoles.


