
Forums - Gaming Discussion - Digital Foundry - Dirt 5

Azzanation said:

The circle of the console industry. Last gen, PS fans: "Our console is more powerful." XB fans: "Power doesn't matter." This gen, XB fans: "Our console is more powerful." PS fans: "Power doesn't matter." Curious to see the twist this gen if and when the Series X starts outperforming the PS5, and whether those same few hang around this topic. Things will never change.

I don't think a slight power gap has ever been a big deal. PS2 was the weakest out of the 6th gen consoles and ran circles around the competition.

People will primarily buy the systems that have most of their favorite games, or where most of their friends play, etc.
The reason the comparisons made headlines last gen was, imo, that the XBO commonly didn't reach the industry-standard 1080p. That was bad PR, because even people who don't understand frames and resolution could look at that and think it sounded bad.



Azzanation said:

The circle of the console industry. Last gen, PS fans: "Our console is more powerful." XB fans: "Power doesn't matter." This gen, XB fans: "Our console is more powerful." PS fans: "Power doesn't matter." Curious to see the twist this gen if and when the Series X starts outperforming the PS5, and whether those same few hang around this topic. Things will never change.

Nope. The point is the marketing and senior leadership, not the fanboys. Sony mentioned the TF of the PS4 and that was it; they didn't even talk about being the strongest, and they didn't formally acknowledge Xbox at all. The most they did was show how to share games, without mentioning that it would be any different elsewhere. MS tries to address PS all the time, and they have been openly gloating.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DroidKnight said:
SvennoJ said:

HDMI 2.1 didn't exist 6 years ago. There are plenty of 120 Hz TVs out there, yet none accept a 120 fps input unless they have HDMI 2.1.

It's another nice trip up for the casual consumer.

HDMI 2.1 is needed to push 8K 60FPS or 4K 120FPS.  Regular HDMI 2.0 is fine to push 1080p 120FPS.

I stand corrected. It's HDR that came later; no 1080p HDR TVs, afaik.

It should be fine, since PSVR pushes 120fps over HDMI 2.0.

Anyway, having seen 120fps in action on PSVR (in the few native games, Track Mania / Polybius), it's nice but not worth the visual downgrades to me.
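DroidKnight's numbers check out with some back-of-the-envelope math. A quick sketch (assuming 8-bit RGB with no chroma subsampling, and ignoring blanking intervals, which add real-world overhead):

```python
# Rough uncompressed video bandwidth vs. HDMI link capacity.
# Assumes 24 bits per pixel (8-bit RGB, 4:4:4) and ignores
# blanking overhead, so real figures run somewhat higher.

HDMI_2_0_GBPS = 14.4  # effective payload (18 Gbps raw, 8b/10b encoding)
HDMI_2_1_GBPS = 42.7  # effective payload (48 Gbps raw, 16b/18b encoding)

def bandwidth_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed pixel data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

modes = {"1080p120": (1920, 1080, 120),
         "4K120":    (3840, 2160, 120),
         "8K60":     (7680, 4320, 60)}

for label, (w, h, fps) in modes.items():
    need = bandwidth_gbps(w, h, fps)
    print(f"{label}: {need:5.1f} Gbps | fits HDMI 2.0: {need <= HDMI_2_0_GBPS} | "
          f"fits HDMI 2.1: {need <= HDMI_2_1_GBPS}")

# Note: 8K60 at full 4:4:4 slightly exceeds even HDMI 2.1's payload,
# which is why it relies on 4:2:0 subsampling or DSC compression.
```

1080p120 needs only about 6 Gbps, which is why HDMI 2.0 handles it comfortably, while 4K120 at ~24 Gbps is squarely HDMI 2.1 territory.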



Azzanation said:

The circle of the console industry. Last gen, PS fans: "Our console is more powerful." XB fans: "Power doesn't matter." This gen, XB fans: "Our console is more powerful." PS fans: "Power doesn't matter." Curious to see the twist this gen if and when the Series X starts outperforming the PS5, and whether those same few hang around this topic. Things will never change.

Last gen was more about who hit native 1080p, which was more relevant before better upscaling techniques and 4K screens. The difference between native and upscaled was very noticeable on a 1080p LCD at the start of last gen. By the end of the gen the difference was much smaller due to improved dynamic/temporal resolution scaling. On 4K it's far less noticeable, since the pixels are a quarter of the size.

Of course frame rate dips and screen tearing are still very much an issue. VRR will sort of solve that (hopefully it won't be used as a crutch to leave the frame rate all over the place). Last gen had the right priorities: no screen tearing, avoid judder, dynamic resolution. I wonder why we're back to screen tearing; is VRR to blame for that? VRR is a niche feature of (good) 4K HDR TVs, which are a small subset of 4K TVs in general, which are themselves still in the minority worldwide.
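The dynamic resolution scaling mentioned above is essentially a feedback loop on GPU frame time. A toy sketch with hypothetical numbers (real engines use smoothed timings and temporal upsampling on top of this):

```python
# Toy dynamic resolution controller: keep GPU frame time under
# the ~16.7 ms budget for 60 fps by scaling the render resolution.

TARGET_MS = 1000 / 60            # frame-time budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # allowed range of the per-axis scale

def next_scale(scale, last_frame_ms):
    # Frame cost is roughly proportional to pixel count (scale^2),
    # so correct the per-axis scale by the square root of the error.
    correction = (TARGET_MS / last_frame_ms) ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, scale * correction))

scale = 1.0
for frame_ms in (14.0, 19.0, 22.0, 16.0, 13.0):  # simulated GPU times
    scale = next_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render at {int(1920 * scale)}x{int(1080 * scale)}")
```

The clamp at the bottom is why heavy scenes bottom out at some minimum resolution instead of dropping frames forever.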



SvennoJ said:
DroidKnight said:

HDMI 2.1 is needed to push 8K 60FPS or 4K 120FPS.  Regular HDMI 2.0 is fine to push 1080p 120FPS.

I stand corrected. It's HDR that came later; no 1080p HDR TVs, afaik.

It should be fine, since PSVR pushes 120fps over HDMI 2.0.

Anyway, having seen 120fps in action on PSVR (in the few native games, Track Mania / Polybius), it's nice but not worth the visual downgrades to me.

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.




DonFerrari said:
SvennoJ said:

I stand corrected. It's HDR that came later; no 1080p HDR TVs, afaik.

It should be fine, since PSVR pushes 120fps over HDMI 2.0.

Anyway, having seen 120fps in action on PSVR (in the few native games, Track Mania / Polybius), it's nice but not worth the visual downgrades to me.

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.

I wouldn't expect much from it. My 1080p laptop screen also has 'some' HDR capabilities according to Windows 10, though only in videos. So far the only effect I've noticed is some HDR photos in Google going horribly overbright, an almost complete whiteout. First it loads correctly, then it turns the brightness up to 9000.

VRR isn't much of a standard either; what range is supported and what actually works all depends on the TV. HDMI really dropped the ball with HDR and VRR, no standards at all. It seems so simple: wait for frame, display, repeat.



SvennoJ said:
DonFerrari said:

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.

I wouldn't expect much from it. My 1080p laptop screen also has 'some' HDR capabilities according to Windows 10, though only in videos. So far the only effect I've noticed is some HDR photos in Google going horribly overbright, an almost complete whiteout. First it loads correctly, then it turns the brightness up to 9000.

VRR isn't much of a standard either; what range is supported and what actually works all depends on the TV. HDMI really dropped the ball with HDR and VRR, no standards at all. It seems so simple: wait for frame, display, repeat.

It's up to content makers to make use of the improved HDR specs of HDMI 2.1, which bring dynamic (lol) HDR into the mix, meaning metadata can be used in the same manner as Dolby Vision to adjust the image on a frame-by-frame basis.



Hiku said:

I don't think a slight power gap has ever been a big deal. PS2 was the weakest out of the 6th gen consoles and ran circles around the competition.

People will primarily buy the systems that have most of their favorite games, or where most of their friends play, etc.
The reason the comparisons made headlines last gen was, imo, that the XBO commonly didn't reach the industry-standard 1080p. That was bad PR, because even people who don't understand frames and resolution could look at that and think it sounded bad.

DonFerrari said:

Nope. The point is the marketing and senior leadership, not the fanboys. Sony mentioned the TF of the PS4 and that was it; they didn't even talk about being the strongest, and they didn't formally acknowledge Xbox at all. The most they did was show how to share games, without mentioning that it would be any different elsewhere. MS tries to address PS all the time, and they have been openly gloating.

SvennoJ said:

Last gen was more about who hit native 1080p, which was more relevant before better upscaling techniques and 4K screens. The difference between native and upscaled was very noticeable on a 1080p LCD at the start of last gen. By the end of the gen the difference was much smaller due to improved dynamic/temporal resolution scaling. On 4K it's far less noticeable, since the pixels are a quarter of the size.

Of course frame rate dips and screen tearing are still very much an issue. VRR will sort of solve that (hopefully it won't be used as a crutch to leave the frame rate all over the place). Last gen had the right priorities: no screen tearing, avoid judder, dynamic resolution. I wonder why we're back to screen tearing; is VRR to blame for that? VRR is a niche feature of (good) 4K HDR TVs, which are a small subset of 4K TVs in general, which are themselves still in the minority worldwide.

Fanboys continue to gloat when their plastic box outperforms the other, but when the tables turn and the other plastic box starts winning, the logic suddenly swaps.

Exactly what we saw with PS4 > XB1, then XB1X > Pro, and now PS5 > XSX; waiting on the switch again. Same old nonsense, same old mentality.

@SvennoJ Not sure why we have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes too: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

Last edited by Azzanation - on 23 November 2020

Hynad said:
Pemalite said:

Otherwise, in pure compute scenarios the Xbox Series X not only has the brute-strength advantage but also some efficiency advantages, such as Variable Rate Shading, which give it the edge for things like global illumination lighting and shading.

You are assuming, like DF did a while ago, that the PS5 doesn’t have Variable Rate Shading or their own similar solution, which could be worse, just as good, or even better than DX12U’s VRS.  It would be interesting if you could mention every feature the PS5 does and doesn’t support in comparison to the Series X. But I don’t think people should assume the PS5 doesn’t support certain features just because Sony isn’t openly bragging about them.

Even if the PlayStation 5 doesn't have Variable Rate Shading baked into hardware, developers can implement it in software using their own algorithms; I just don't see it happening, and it would come with overhead.

But I can only go by the features Microsoft and Sony have championed. If Sony or Microsoft hasn't advertised a certain *important* feature by now, can we assume they have it baked into hardware?
They are constantly trying to one-up each other in the console stakes... Sony has rightfully talked up the SSD and Microsoft has talked up its RDNA 2 advantages.

I wouldn't disregard Digital Foundry so readily, though; they tend to be right more often than not.

Trumpstyle said:

I don't think it's a software/tools issue at all that's causing the performance difference. We need to remember Eurogamer's preview: the Gears 5 devs got GeForce 2080 performance out of the XSX after just 2 weeks of optimization.

https://www.youtube.com/watch?v=qcY4nRHapmE&t
(between 8:00 and 9:00)

And Gears 5 doesn't favor AMD over Nvidia:

<SNIP>

On average, the Radeon 5700 XT is around 5% faster than the GeForce 2070 across other games. I'm pretty certain I know what's going on, but I'm gonna keep it a secret; it's just speculation anyway. If I'm right, tools won't help the XSX, but it's not so bad for Xbox fans: RTX performance between the consoles should be very close to each other.

The GeForce RTX 2080 isn't that impressive.
Keep in mind that's 2018 hardware and we're almost in 2021.

Interested to know your findings on "what is going on".

If it's anything like the Teraflops list where you pretty much outlined every single variable so you can claim you were "right"... Well. Don't need to remind you how fallacious that line of thinking is.

JRPGfan said:
OTBWY said:

At least the S allows for someone to put his/her game in rest mode and not crash.

Btw your sig is insane. Not positive insane, but insane insane. Assuming it's not a joke, you do know Mario is the foundation of basically every modern platformer, right?

It does go down to like 576p to run at 100-120fps, though.

Xbox Series S:

It is soooo blurry compared to PS5 or XSX, it's nuts.
If I was a dev, I woulda just dropped the 120fps mode on the Series S.
This is too big a drop in image quality, imo, to give up for a mode few TVs support.
Who has a brand new expensive TV with 120fps support... but buys a Series S anyway? It's a bad look.


I think it's fine. You don't have to use the 120fps mode, you can turn it off.
Eventually the Xbox Series S/X will get a hardware update and games like this will potentially operate at a higher resolution naturally without the need of a "remaster".

It also makes comparisons fun.

576p is a dog's breakfast though... but it would probably look good on a CRT at those refresh rates.

Azzanation said:

@SvennoJ Not sure why we have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes too: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

I tend to avoid vsync, so I am glad it's not there.
It introduces input lag... and if your framerate drops enough, it can take you from, say, 45fps down to 30 instantly, which can look jarring.
It's great if you are just hovering around 33fps and want it locked at 30.

My displays have freesync, so it's a non-issue.

I will personally opt for a little bit of screen tearing over vsync, but that's just me personally.
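The 45-to-30 drop described above follows from how double-buffered vsync quantizes frame rate on a 60 Hz panel: a frame that misses a refresh waits for the next one, so the displayable rates are 60/n for whole n. A simplified model (it ignores triple buffering and VRR, which behave differently):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

def vsynced_fps(render_ms):
    # With double-buffered vsync, a frame occupies a whole number of
    # refresh intervals, so the displayed rate snaps to a divisor of 60.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

for raw_fps in (60, 45, 33):
    displayed = vsynced_fps(1000 / raw_fps)
    print(f"GPU rendering at {raw_fps} fps -> displayed at {displayed:.0f} fps")
```

Anything between 30 and 60 fps collapses to 30, which is exactly the jarring step down Pemalite describes, and the gap freesync/VRR exists to fill.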



--::{PC Gaming Master Race}::--

Pemalite said:
Hynad said:

You are assuming, like DF did a while ago, that the PS5 doesn’t have Variable Rate Shading or their own similar solution, which could be worse, just as good, or even better than DX12U’s VRS.  It would be interesting if you could mention every feature the PS5 does and doesn’t support in comparison to the Series X. But I don’t think people should assume the PS5 doesn’t support certain features just because Sony isn’t openly bragging about them.

Even if the PlayStation 5 doesn't have Variable Rate Shading baked into hardware, developers can implement it in software using their own algorithms; I just don't see it happening, and it would come with overhead.

But I can only go by the features Microsoft and Sony have championed. If Sony or Microsoft hasn't advertised a certain *important* feature by now, can we assume they have it baked into hardware?
They are constantly trying to one-up each other in the console stakes... Sony has rightfully talked up the SSD and Microsoft has talked up its RDNA 2 advantages.

I wouldn't disregard Digital Foundry so readily, though; they tend to be right more often than not.

Trumpstyle said:

I don't think it's a software/tools issue at all that's causing the performance difference. We need to remember Eurogamer's preview: the Gears 5 devs got GeForce 2080 performance out of the XSX after just 2 weeks of optimization.

https://www.youtube.com/watch?v=qcY4nRHapmE&t
(between 8:00 and 9:00)

And Gears 5 doesn't favor AMD over Nvidia:

On average, the Radeon 5700 XT is around 5% faster than the GeForce 2070 across other games. I'm pretty certain I know what's going on, but I'm gonna keep it a secret; it's just speculation anyway. If I'm right, tools won't help the XSX, but it's not so bad for Xbox fans: RTX performance between the consoles should be very close to each other.

The GeForce RTX 2080 isn't that impressive.
Keep in mind that's 2018 hardware and we're almost in 2021.

Interested to know your findings on "what is going on".

If it's anything like the Teraflops list where you pretty much outlined every single variable so you can claim you were "right"... Well. Don't need to remind you how fallacious that line of thinking is.

JRPGfan said:

It does go down to like 576p to run at 100-120fps, though.

Xbox Series S:

It is soooo blurry compared to PS5 or XSX, it's nuts.
If I was a dev, I woulda just dropped the 120fps mode on the Series S.
This is too big a drop in image quality, imo, to give up for a mode few TVs support.
Who has a brand new expensive TV with 120fps support... but buys a Series S anyway? It's a bad look.


I think it's fine. You don't have to use the 120fps mode, you can turn it off.
Eventually the Xbox Series S/X will get a hardware update and games like this will potentially operate at a higher resolution naturally without the need of a "remaster".

It also makes comparisons fun.

576p is a dog's breakfast though... but it would probably look good on a CRT at those refresh rates.

Azzanation said:

@SvennoJ Not sure why we have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes too: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

I tend to avoid vsync, so I am glad it's not there.
It introduces input lag... and if your framerate drops enough, it can take you from, say, 45fps down to 30 instantly, which can look jarring.
It's great if you are just hovering around 33fps and want it locked at 30.

My displays have freesync, so it's a non-issue.

I will personally opt for a little bit of screen tearing over vsync, but that's just me personally.

Sony has been very quiet about the PS5. Besides the SSD and the Tempest chip, they were very brief on everything else during the reveal, so we can't really be sure what is or isn't there, though if they haven't specifically talked about it, it's fair to assume it isn't in the PS5. Although I think Oodle Texture was mentioned as an equivalent to VRS, right?


