
Digital Foundry - Dirt 5

SvennoJ said:
DroidKnight said:

HDMI 2.1 is needed to push 8K 60fps or 4K 120fps. Regular HDMI 2.0 is fine to push 1080p 120fps.

I stand corrected. It's HDR that came later; there are no 1080p HDR TVs, AFAIK.

It should be fine, since PSVR pushes 120fps over HDMI 2.0.

Anyway, having seen 120fps in action on PSVR (in the few native games, Track Mania / Polybius), it's nice but not worth the visual downgrades to me.

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.
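As a rough sanity check on the HDMI bandwidth point above, here's a back-of-the-envelope sketch in Python. The link rates are approximate effective data rates (18 Gbps and 48 Gbps raw for HDMI 2.0 and 2.1), and it ignores blanking overhead and chroma subsampling, so treat the numbers as ballpark only:

HDMI_2_0_GBPS = 14.4   # approx. effective data rate (18 Gbps raw)
HDMI_2_1_GBPS = 42.7   # approx. effective data rate (48 Gbps raw)

def raw_gbps(width, height, fps, bits_per_pixel):
    # Uncompressed pixel data per second, ignoring blanking overhead.
    return width * height * fps * bits_per_pixel / 1e9

for name, (w, h, fps, bpp) in {
    "1080p120 (8-bit)": (1920, 1080, 120, 24),
    "4K120 (10-bit)":   (3840, 2160, 120, 30),
    "8K60 (10-bit)":    (7680, 4320, 60, 30),
}.items():
    gbps = raw_gbps(w, h, fps, bpp)
    link = ("HDMI 2.0" if gbps <= HDMI_2_0_GBPS
            else "HDMI 2.1" if gbps <= HDMI_2_1_GBPS
            else "HDMI 2.1 + DSC / 4:2:0")
    print(f"{name}: ~{gbps:.1f} Gbps -> {link}")

That also shows why 8K60 generally needs chroma subsampling or Display Stream Compression even over an HDMI 2.1 link.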



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
SvennoJ said:

I stand corrected. It's HDR that came later; there are no 1080p HDR TVs, AFAIK.

It should be fine, since PSVR pushes 120fps over HDMI 2.0.

Anyway, having seen 120fps in action on PSVR (in the few native games, Track Mania / Polybius), it's nice but not worth the visual downgrades to me.

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.

I wouldn't expect much from it. My 1080p laptop screen also has 'some' HDR capability according to Windows 10, though only in videos. So far the only effect I've noticed is some HDR photos in Google going horribly overbright, almost a complete whiteout. First it loads correctly, then it turns the brightness up to 9000.

VRR isn't much of a standard either; what range is supported and what actually works depends entirely on the TV. HDMI really dropped the ball with HDR and VRR; there's no real standard at all. It seems so simple: wait for frame, display, repeat.
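For what it's worth, "wait for frame, display, repeat" is roughly what VRR does; the messy part is that each panel only supports it inside its own refresh window, and frame rates below that window have to be handled by repeating frames (low framerate compensation). A minimal sketch of the idea, with an assumed 48-120 Hz window rather than any particular TV's behaviour:

VRR_MIN_HZ, VRR_MAX_HZ = 48, 120   # assumed panel window; real sets vary

def refresh_for(frame_time_ms):
    # Pick how one rendered frame gets scanned out.
    fps = 1000.0 / frame_time_ms
    if fps > VRR_MAX_HZ:               # rendering faster than the panel
        return 1000.0 / VRR_MAX_HZ, 1
    if fps >= VRR_MIN_HZ:              # inside the window: scan out on demand
        return frame_time_ms, 1
    # Below the window: repeat the frame until the refresh rate lands back
    # inside it (roughly what low framerate compensation does).
    repeats = 2
    while fps * repeats < VRR_MIN_HZ:
        repeats += 1
    return frame_time_ms / repeats, repeats

for ms in (7.0, 12.0, 25.0, 40.0):     # ~143, ~83, 40 and 25 fps
    interval, repeats = refresh_for(ms)
    print(f"{1000 / ms:5.1f} fps -> panel refreshes at "
          f"{1000 / interval:5.1f} Hz, frame shown x{repeats}")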



SvennoJ said:
DonFerrari said:

Some 1080p screens have started getting HDR. Not sure if it's any good, but I've seen some models with it in Brazil.

I wouldn't expect much from it. My 1080p laptop screen also has 'some' HDR capability according to Windows 10, though only in videos. So far the only effect I've noticed is some HDR photos in Google going horribly overbright, almost a complete whiteout. First it loads correctly, then it turns the brightness up to 9000.

VRR isn't much of a standard either; what range is supported and what actually works depends entirely on the TV. HDMI really dropped the ball with HDR and VRR; there's no real standard at all. It seems so simple: wait for frame, display, repeat.

It's up to content makers to make use of the improved HDR specs of HDMI 2.1, which brings dynamic (lol) HDR into the mix, meaning it can use metadata in the same manner as Dolby Vision, adjusting the image on a frame-by-frame basis.
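To illustrate the static-vs-dynamic metadata difference in rough terms (the nit values and the tone-mapping curve below are made up for the sketch; real displays and the actual HDR10/HDR10+/Dolby Vision pipelines are far more involved):

DISPLAY_PEAK_NITS = 650.0   # what the panel can actually output

def tone_map(pixel_nits, metadata_max_nits):
    # Very crude roll-off: only compress if the metadata says content may
    # exceed what the display can show.
    if metadata_max_nits <= DISPLAY_PEAK_NITS:
        return pixel_nits
    return pixel_nits * DISPLAY_PEAK_NITS / metadata_max_nits

pixel = 400.0  # a mid-bright highlight in a dark scene
print("static metadata (title max 4000 nits):",
      round(tone_map(pixel, 4000.0)), "nits")  # crushed to ~65 nits
print("dynamic metadata (scene max 500 nits):",
      round(tone_map(pixel, 500.0)), "nits")   # left at 400 nits

With one static value for the whole title, dark scenes get tone-mapped as if a 4000-nit highlight could appear at any moment; per-scene or per-frame metadata avoids that.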



Hiku said:

I don't think a slight power gap has ever been a big deal. The PS2 was the weakest of the 6th gen consoles and still ran circles around the competition.

People will primarily buy the systems that have most of their favorite games, or where most of their friends play, etc.
The reason the comparisons last gen made headlines was, IMO, that the XBO commonly didn't reach the industry-standard 1080p. That was bad PR, because even people who don't understand frames and resolution could look at that and think it sounds bad.

DonFerrari said:

Nope. The point is the marketing and senior leadership, not the fanboys. Sony mentioned the TF count of the PS4 and that was it; they didn't even talk about being the strongest, and they didn't formally acknowledge Xbox at all. The most they did was show how to share games, without mentioning that it would be any different elsewhere. MS tries to address PS all the time, and they have been openly gloating.

SvennoJ said:

Last gen was more about who hit native 1080p, which was more relevant before better upscaling techniques and 4K screens. The difference between native and upscaled was very noticeable on a 1080p LCD at the start of last gen. By the end of the gen the difference was much smaller thanks to improved dynamic/temporal resolution scaling. On 4K it's far less noticeable, since the pixels are a quarter of the size.

Of course frame rate dips and screen tearing are still very much an issue. VRR will solve that, sort of (hopefully it won't be used as a crutch to leave the frame rate all over the place). Last gen had the right priorities: no screen tear, avoid judder, dynamic resolution. I wonder why we're back to screen tearing; is VRR to blame for that? VRR is a niche feature of (good) 4K HDR TVs, which are a small subset of 4K TVs in general, which are themselves still in the minority worldwide.

Fanboys continue to gloat when their plastic box outperforms the other, but when the tables turn and the other plastic box starts outperforming theirs, the logic suddenly switches.

Exactly what we saw going from PS4 > XB1 to XB1X > Pro, and now PS5 > XSX; waiting on the switch again. Same old nonsense, same old mentality.

@SvennoJ Not sure why we still have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

Last edited by Azzanation - on 23 November 2020
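On the dynamic resolution scaling SvennoJ mentions: it usually boils down to a small feedback loop on GPU frame time. A toy sketch, where the frame budget, scale bounds and step size are arbitrary assumptions rather than anything a real engine uses:

FRAME_BUDGET_MS = 16.6            # targeting 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0   # fraction of native resolution per axis
STEP = 0.05

def next_scale(scale, gpu_time_ms):
    # Simple feedback loop: drop resolution when over budget, claw it back
    # when comfortably under.
    if gpu_time_ms > FRAME_BUDGET_MS * 1.05:
        scale -= STEP
    elif gpu_time_ms < FRAME_BUDGET_MS * 0.90:
        scale += STEP
    return min(MAX_SCALE, max(MIN_SCALE, scale))

scale = 1.0
for gpu_ms in (18.5, 19.0, 17.2, 15.0, 13.8):   # simulated GPU frame times
    scale = next_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render at {int(3840 * scale)}x{int(2160 * scale)}")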

Hynad said:
Pemalite said:

Otherwise in pure compute scenarios the Xbox Series X not only has the brute-strength advantage but also some efficiency advantages such as Variable Rate Shading which gives it the edge for things like global illumination lighting and shading.

You are assuming, like DF did a while ago, that the PS5 doesn't have Variable Rate Shading or its own similar solution, which could be worse, just as good, or even better than DX12U's VRS. It would be interesting if you could mention every feature the PS5 does and doesn't support in comparison to the Series X. But I don't think people should assume the PS5 doesn't support certain features just because Sony isn't openly bragging about them.

Even if the PlayStation 5 doesn't have Variable Rate Shading baked into the hardware, developers can implement it in software using their own algorithms; I just don't see it happening, and it would come with overhead.

But I can only go by the features Microsoft and Sony have championed. If Sony or Microsoft hasn't advertised a certain *important* feature by now, can we assume they have it baked into the hardware?
They are constantly trying to one-up each other in the console stakes... Sony has rightfully talked up the SSD, and Microsoft has talked up its RDNA2 advantages.

I wouldn't disregard Digital Foundry so readily, though; they tend to be right more often than not.
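For context on the "in software" route: the basic idea of variable rate shading is to shade a coarser grid where detail is low and reuse the result, which hardware VRS does without the software overhead. A toy CPU-side sketch of that idea only, nothing to do with Sony's or Microsoft's actual implementations:

import numpy as np

def shade(x, y):
    # Stand-in for an expensive per-pixel shader.
    return (x * 0.01 + y * 0.02) % 1.0

def vrs_pass(luma, threshold=0.02):
    h, w = luma.shape
    out = np.zeros((h, w))
    invocations = 0
    for by in range(0, h, 2):
        for bx in range(0, w, 2):
            block = luma[by:by+2, bx:bx+2]
            if block.max() - block.min() < threshold:
                # Flat block: shade once at coarse rate, reuse for all 4 pixels.
                out[by:by+2, bx:bx+2] = shade(bx, by)
                invocations += 1
            else:
                # Detailed block: shade every pixel at full rate.
                for y in range(by, by + 2):
                    for x in range(bx, bx + 2):
                        out[y, x] = shade(x, y)
                        invocations += 1
    return out, invocations

luma = np.random.rand(64, 64) * 0.01          # mostly flat test image
luma[16:32, 16:32] = np.random.rand(16, 16)   # one high-detail region
_, cost = vrs_pass(luma)
print("shader invocations:", cost, "out of", 64 * 64, "pixels")

The saving comes from skipping shader work in low-detail regions; the overhead of deciding where to do that is exactly what the hardware path avoids.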

Trumpstyle said:

I don't think it's a software/tools issue at all that's causing the performance difference. Remember Eurogamer's preview: the Gears 5 devs got GeForce 2080 performance out of the XSX after just two weeks of optimization.

https://www.youtube.com/watch?v=qcY4nRHapmE&t
(between 8:00 and 9:00)

And Gears 5 doesn't favor AMD over Nvidia in this title:

<SNIP>

On average, the Radeon 5700 XT is around 5% faster than the GeForce 2070 when compared across other games. I'm pretty certain I know what is going on, but I'm going to keep it a secret, and it's just speculation anyway. If I'm right, tools won't help the XSX, but it's not so bad for Xbox fans; RTX performance between the consoles should be very close to each other.

The GeForce RTX 2080 isn't that impressive.
Keep in mind that it is 2018 hardware and we are almost in 2021.

I'd be interested to know your findings on "what is going on".

If it's anything like the teraflops list, where you pretty much outlined every single variable so you could claim you were "right"... Well, I don't need to remind you how fallacious that line of thinking is.

JRPGfan said:
OTBWY said:

At least the S allows for someone to put his/her game in rest mode and not crash.

Btw, your sig is insane. Not positive insane, but insane insane. Assuming it's not a joke, you know Mario is the foundation of basically every modern platformer, right?

It does go down to around 576p to run at 100-120fps, though.

Xbox Series S:

It is soooo blurry compared to the PS5 or XSX, it's nuts.
If I were a dev, I would have just dropped the 120fps mode on the Series S.
This is too big a drop in image quality, IMO, for a mode few TVs support.
Who has a brand new, expensive 120Hz TV... but buys a Series S anyway? It's a bad look.


I think it's fine. You don't have to use the 120fps mode; you can turn it off.
Eventually the Xbox Series S/X will get a hardware refresh, and games like this will potentially run at a higher resolution naturally, without the need for a "remaster".

It also makes comparisons fun.

576p is a dog's breakfast though... but it would probably look good on a CRT at those refresh rates.
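Some quick pixel arithmetic on why 576p looks so rough on a 4K panel (1024x576 is used here as the nominal 16:9 figure, not DF's exact pixel counts for Dirt 5):

native_4k = 3840 * 2160
for name, (w, h) in {
    "576p":  (1024, 576),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, "
          f"{native_4k / pixels:.1f}x fewer pixels than a 4K panel")

At 576p the upscaler is inventing roughly 14 pixels for every one that was rendered, which is where the blur comes from.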

Azzanation said:

@SvennoJ Not sure why we still have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

I tend to avoid vsync, so I am glad it's not there.
It introduces input lag... and if your framerate drops enough, it can take you from, say, 45fps down to 30 instantly, which can look jarring.
It's great if you are just hovering at around 33fps and want it locked at 30.

My displays have FreeSync, so it's a non-issue.

I will personally opt for a little bit of screen tearing over vsync, but that's just me.
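The "45 fps turning into 30" behaviour comes from vsync quantising the frame rate to divisors of the refresh rate. A quick sketch, assuming a plain double-buffered swap chain with no triple buffering or VRR:

import math

def vsync_fps(render_ms, refresh_hz=60):
    # With plain double-buffered vsync a frame can only be shown on a
    # vblank, so missing one interval means waiting for the next.
    interval_ms = 1000.0 / refresh_hz
    intervals_needed = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_needed

for ms in (15.0, 18.0, 22.2, 30.0, 34.0):
    print(f"{1000 / ms:5.1f} fps unlocked -> "
          f"{vsync_fps(ms):5.1f} fps with vsync at 60 Hz")

Anything between 30 and 60 fps unlocked snaps straight down to 30 with vsync, and just dipping under 30 snaps to 20.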



--::{PC Gaming Master Race}::--

Pemalite said:

<SNIP>

Sony have been very quiet about the PS5. Besides the SSD and the Tempest chip, they were very brief on everything else during the reveal. So we can't really be sure of what is there or not; sure, if they haven't specifically talked about it, then it's fair to assume it isn't in the PS5. Although I think Oodle Texture was mentioned as being equivalent to VRS, right?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Pemalite said:
Azzanation said:

@SvennoJ Not sure why we still have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes: last gen it was 1080p, midway it was 4K, and now it's 60fps. Whichever box does what better, the story always changes.

I tend to avoid vsync, so I am glad it's not there.
It introduces input lag... and if your framerate drops enough, it can take you from, say, 45fps down to 30 instantly, which can look jarring.
It's great if you are just hovering at around 33fps and want it locked at 30.

My displays have FreeSync, so it's a non-issue.

I will personally opt for a little bit of screen tearing over vsync, but that's just me.

I only turn vsync off if it affects mouse usage (like it did in The Witcher 2 on PC, for example; bad coding).

It's a personal preference; I guess console games could simply offer the option, preferably as a system-wide setting. I'd rather have a stable picture at a lower fps than parts of two different frames on screen. I play on a 144Hz screen, so the drop isn't that bad, and v-sync makes it either 72, 48, 36, 28.8 or 24 fps. Actually I play at 20.6, 18, 16 or 14.4 fps in FS 2020, maximizing fidelity. At 144Hz it's only a 6.9 ms wait for the next opportunity.

Input lag shouldn't be an issue with v-sync (I only had that problem with TW2); controller polling and engine updates should be (and usually are) independent of the output frame rate. But yes, you do have to 'wait' up to 16.7 ms longer to see the result if a 60fps game on a 60Hz screen drops a frame. That still looks less annoying to me than a big tear at the top of the screen.

Didn't most games have a soft lock last gen? Capped at 30, but allowed to tear briefly when going under? Is that the current problem, that the 'going under' isn't so brief anymore, or is v-sync completely off?
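The arithmetic checks out: with vsync on a fixed-refresh panel you only get integer divisors of the refresh rate, and a missed vblank costs at most one refresh interval:

refresh_hz = 144
print("v-sync lock points:",
      [round(refresh_hz / n, 1) for n in range(1, 11)])
# -> [144.0, 72.0, 48.0, 36.0, 28.8, 24.0, 20.6, 18.0, 16.0, 14.4]
print("worst-case wait at 144 Hz:", round(1000 / refresh_hz, 1), "ms")  # ~6.9
print("worst-case wait at 60 Hz: ", round(1000 / 60, 1), "ms")          # ~16.7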



DonFerrari said:

Sony have been very quiet about the PS5. Besides the SSD and the Tempest chip, they were very brief on everything else during the reveal. So we can't really be sure of what is there or not; sure, if they haven't specifically talked about it, then it's fair to assume it isn't in the PS5. Although I think Oodle Texture was mentioned as being equivalent to VRS, right?

Sony, or more specifically Cerny, gave out a ton of information, and we have had teardowns since.

SvennoJ said:

I only turn vsync off if it affects mouse usage (like it did in The Witcher 2 on PC, for example; bad coding).

It's a personal preference; I guess console games could simply offer the option, preferably as a system-wide setting. I'd rather have a stable picture at a lower fps than parts of two different frames on screen. I play on a 144Hz screen, so the drop isn't that bad, and v-sync makes it either 72, 48, 36, 28.8 or 24 fps. Actually I play at 20.6, 18, 16 or 14.4 fps in FS 2020, maximizing fidelity. At 144Hz it's only a 6.9 ms wait for the next opportunity.

Input lag shouldn't be an issue with v-sync (I only had that problem with TW2); controller polling and engine updates should be (and usually are) independent of the output frame rate. But yes, you do have to 'wait' up to 16.7 ms longer to see the result if a 60fps game on a 60Hz screen drops a frame. That still looks less annoying to me than a big tear at the top of the screen.

Didn't most games have a soft lock last gen? Capped at 30, but allowed to tear briefly when going under? Is that the current problem, that the 'going under' isn't so brief anymore, or is v-sync completely off?

Really game dependent.
We aren't in the era anymore where console gamers applauded the visuals of a game running at 25fps, like in the 7th gen.

Vsync introduces input lag because it delays frames from being shown on screen, increasing the time between when you input something and when it appears. The amount of input lag depends on the game engine, its performance and how the rendering pipeline is set up.

But without a doubt, vsync always introduces input lag.
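A rough latency model of where that extra time goes (the assumed pipeline is: sample input, simulate, render, then either scan out immediately or wait for the next vblank; real engines buffer differently, so the exact numbers are illustrative only):

import math

REFRESH_MS = 1000 / 60

def input_to_photon_ms(sim_ms, render_ms, vsync):
    # Crude model: sample input, simulate, render, then either scan out
    # immediately (tearing possible) or wait for the next vblank.
    t = sim_ms + render_ms
    if vsync:
        t = math.ceil(t / REFRESH_MS) * REFRESH_MS
    return t

for render_ms in (8.0, 14.0, 18.0):
    no_sync = input_to_photon_ms(2.0, render_ms, vsync=False)
    with_sync = input_to_photon_ms(2.0, render_ms, vsync=True)
    print(f"render {render_ms:4.1f} ms: ~{no_sync:4.1f} ms without vsync, "
          f"~{with_sync:4.1f} ms with vsync")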



--::{PC Gaming Master Race}::--

Pemalite said:
DonFerrari said:

Sony have been very quiet about the PS5. Besides the SSD and the Tempest chip, they were very brief on everything else during the reveal. So we can't really be sure of what is there or not; sure, if they haven't specifically talked about it, then it's fair to assume it isn't in the PS5. Although I think Oodle Texture was mentioned as being equivalent to VRS, right?

Sony, or more specifically Cerny, gave out a ton of information, and we have had teardowns since.

I'm sure you already know this, but since you keep arguing that we somehow already know everything about the console: Cerny did not mention every single feature the system is capable of, and the teardown didn't reveal much either.

Why you keep arguing this stance about the PS5 is peculiar.



Hynad said:
Pemalite said:

Sony, or more specifically Cerny, gave out a ton of information, and we have had teardowns since.

I'm sure you already know this, but since you keep arguing that we somehow already know everything about the console: Cerny did not mention every single feature the system is capable of, and the teardown didn't reveal much either.

Why you keep arguing this stance about the PS5 is peculiar.

I'm not arguing this "particular" stance only in regards to the PlayStation 5; it's the same stance regardless of console.
I am a PC gamer first and foremost; I have no brand loyalty to any console.

So by your logic... if, seven years into a console's cycle, there is still no mention of variable rate shading, can we still assume the console has it "hidden away"?
Where do we draw the line on when we can discount a certain feature being present in the hardware?
The PlayStation 3 was never said to have a tessellation unit, but the Xbox 360 did have that technology. Do we still assume the PlayStation 3 might have it because Sony never discussed it? No. No, we do not.

I get that not every feature will be mentioned, but you generally assume large and important features will be, and variable rate shading is one such feature, gaining prominence on PC due to the efficiency gains it brings.

Digital Foundry even goes on the record to assert that there is little to no evidence of variable rate shading:
https://www.eurogamer.net/articles/digitalfoundry-2020-ps5-reveal-does-it-deliver-the-next-gen-dream

The PlayStation 5's graphics APIs do actually support the feature as well (i.e. Vulkan and OpenGL).
RDNA1 doesn't support the feature; RDNA2 does. You need both software and hardware support to leverage a hardware feature.

The feature can be done entirely in software if a developer wished to go down that path outside of hardware/API. (Unlikely.)

That is my position... and that is to assume the feature isn't present, because there is no evidence of it and there has been no discussion from Sony about it.
And I am happy to change my view when new evidence or a statement from Sony regarding it comes forward.

I hold the same position in regard to the existence of God: because there is no evidence of God, I will discard the idea that God exists until such time as it can be proven empirically. It's logical.



--::{PC Gaming Master Race}::--