Forums - Sony Discussion - Unreal Engine 5 Announced + PS5 Demo

Didn't DLSS do wonders with Control?



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

kirby007 said:
Didn't DLSS do wonders with Control?

On performance, yes, but not without a price. It also hurts the image quality, making things look overly sharp and degraded (obviously what it has to do in order to aid performance). I'm sure it's great for plenty of gamers, but for me, visual quality has to come with a nice balance. I'll play Control at 50-60fps if I can maintain top quality (I know that sounds great, but not when you're sporting top-of-the-line hardware and a monitor north of 120Hz), and thus, that's what I do.




CGI-Quality said:
setsunatenshi said:

I do indeed, and that's a great point. My guess is that's exactly why Cerny was being very vague when discussing storage expansion and highlighted the certification process "future" drives will need to go through in order to be used. I think that's him opening the door to future SSDs being designed in a way that takes advantage of the PS5 architecture.

As a PC gamer as well, my biggest hope is that this turns out to be a great architectural design and that we see some of its benefits roll out to the PC space. I expect this to be a game changer, but am I sure that will be the case? Not yet. The games will be the proof at the end of the day.

As a side note, I'm extremely curious to see whether things like DLSS 2.0 (or 3.0) deliver on their promise, and what sort of AMD equivalent there will be, if any.

DLSS would have to improve greatly before I commit 100%. With every game I’m offered it with, it stays off.

I'm not using it yet because I'm using a 1080p high-refresh-rate monitor and a GTX 1080 (on both my laptop and desktop). I feel the need to upgrade to 4K HDR once prices come down a bit, so DLSS is my only hope for high refresh rates at such a high resolution. Upscaling from 1080p to 4K has been looking pretty good to me, and since I primarily play FPS games, I shouldn't notice any decreased image quality.



CGI-Quality said:
kirby007 said:
Didn't DLSS do wonders with Control?

On performance, yes, but not without a price. It also hurts the image quality, making things look overly sharp and degraded (obviously what it has to do in order to aid performance). I'm sure it's great for plenty of gamers, but for me, visual quality has to come with a nice balance. I'll play Control at 50-60fps if I can maintain top quality (I know that sounds great, but not when you're sporting top-of-the-line hardware and a monitor north of 120Hz), and thus, that's what I do.

To be fair, we're probably the kind of people who would simply upgrade, so we don't actually need to use DLSS anyway.

For those with lower-end rigs/hardware, it can be a good technology to bolster framerates/resolutions.

To me it's just another tool we can use to achieve something if we need it.

setsunatenshi said:
CGI-Quality said:

DLSS would have to improve greatly before I commit to it 100%. In every game that offers it, it stays off.

I'm not using it yet because I'm using a 1080p high-refresh-rate monitor and a GTX 1080 (on both my laptop and desktop). I feel the need to upgrade to 4K HDR once prices come down a bit, so DLSS is my only hope for high refresh rates at such a high resolution. Upscaling from 1080p to 4K has been looking pretty good to me, and since I primarily play FPS games, I shouldn't notice any decreased image quality.

You can use DLSS to "upscale" your image to 4K, then supersample it back down to 1080p. It's worth a try in your case; it might bring some benefits that appeal to you.
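The idea above (render low, reconstruct high, then average back down to native resolution) can be sketched numerically. This is a toy NumPy illustration only: the nearest-neighbour upscale is a crude stand-in for DLSS's learned reconstruction, and the block-average downsample stands in for driver-side supersampling.

```python
import numpy as np

def upscale_nearest(img, factor=2):
    # Stand-in for DLSS: naive nearest-neighbour upscale.
    # (Real DLSS is a neural reconstruction done by the driver/GPU.)
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def supersample_down(img, factor=2):
    # Downsample by averaging factor x factor pixel blocks, which is
    # the spirit of supersampling a 4K frame back to a 1080p display.
    h, w = img.shape[:2]
    return img.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))

# 1080p frame -> "4K" -> back down to 1080p
frame_1080p = np.random.rand(1080, 1920, 3)
frame_4k = upscale_nearest(frame_1080p)   # 2160 x 3840
result = supersample_down(frame_4k)       # 1080 x 1920 again
print(frame_4k.shape, result.shape)
```

In a real game the reconstruction step adds detail the naive version can't, which is why the round trip can look better than native 1080p.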



--::{PC Gaming Master Race}::--

Pemalite said:
CGI-Quality said:

On performance, yes, but not without a price. It also hurts the image quality, making things look overly sharp and degraded (obviously what they have to do in order to aide performance). I'm sure it's great for plenty of gamers, but for me, visual quality has to come with a nice balance. I'll play Control at 50-60fps if I can maintain top quality (I know that sounds great, but not when you're sporting top of the line hardware and a monitor north of 120Hz), and thus, that's what I do.

To be fair, we are probably the kind of person who would simply upgrade so we don't actually need to use DLSS anyway.

For those with lower-end rigs/hardware, it can be a good technology to bolster framerates/resolutions.

To me it's just another tool we can use to achieve something if we need it.

Of course, which is why I said "I'm sure it's great for plenty of gamers".




Pemalite said:
setsunatenshi said:

I'm not using it yet because I'm using a 1080p high-refresh-rate monitor and a GTX 1080 (on both my laptop and desktop). I feel the need to upgrade to 4K HDR once prices come down a bit, so DLSS is my only hope for high refresh rates at such a high resolution. Upscaling from 1080p to 4K has been looking pretty good to me, and since I primarily play FPS games, I shouldn't notice any decreased image quality.

You can use DLSS to "upscale" your image to 4K, then supersample it back down to 1080p. It's worth a try in your case; it might bring some benefits that appeal to you.

Actually, I might try it with DLSS 2.0. I just need to wait until a game I'm actually playing supports it :)



CGI-Quality said:
Pemalite said:

To be fair, we're probably the kind of people who would simply upgrade, so we don't actually need to use DLSS anyway.

For those with lower-end rigs/hardware, it can be a good technology to bolster framerates/resolutions.

To me it's just another tool we can use to achieve something if we need it.

Of course, which is why I said "I'm sure it's great for plenty of gamers".

Gaming laptops :D 4K gaming in slim form factors. I'm OK with that compromise :)



Pemalite said:
setsunatenshi said:

Again, my understanding is that, yes, it's a pretty fast SSD, but what makes it shine is how the system interfaces with it. A 12-channel controller with 6 priority queues (vs. 2 priority levels on a regular PC NVMe drive) makes it very low latency compared to a typical storage setup.

Do you not see the potential issue if you drop in a commodity PC SSD?

There are fundamental design choices in Sony's I/O layout that may make it incompatible with commodity PC drives.
Microsoft also made a similar decision, hence its proprietary "memory card" approach.

I doubt it's as low latency as Optane or an SLC SSD.

That is the reason Cerny said a regular SSD would probably need a 7 GB/s transfer rate to compensate for the lack of 6 priority levels, and that they will need to test different SSDs on the market to verify whether they are acceptable. I would expect that around 2021-2022 there will be some SSDs that can be installed, and probably some partnerships and official SSD upgrades.
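The priority-level point above can be illustrated with a toy sketch: when requests are serviced strictly by priority rather than arrival order, a frame-critical read never waits behind bulk streaming. The request names and levels here are made up for illustration; the real arbitration happens in the flash controller hardware.

```python
import heapq

# Hypothetical I/O request stream: (priority, name). Lower = more urgent.
# With several priority levels, assets the renderer needs *this frame*
# jump ahead of background streaming; with only two levels, they would
# all share one "high" bucket and queue behind each other.
requests = [
    (5, "background audio bank"),
    (0, "texture needed this frame"),
    (3, "geometry for next room"),
    (0, "shader needed this frame"),
    (4, "ambient detail meshes"),
]

queue = []
for prio, name in requests:
    heapq.heappush(queue, (prio, name))

# Requests are serviced strictly by priority, not arrival order.
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
print(order)
```

The priority-0 requests come out first even though they were pushed after the bulk-streaming request, which is the latency benefit the post describes.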



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

I don't buy that a graphics engineer would mistake a video file for an actual demo (running a video file at 40 fps is also bizarre; video files are typically 24 fps, 30 fps, or occasionally 60 fps). The translated version has the engineers clearly saying the SSD throughput only needs to be in the MB/s range, not GB/s. Why in the world would you even bother talking about that over a video file? Sweeney likely has a marketing deal with Sony and doesn't want to rock the boat on that. New AMD GPUs getting outperformed by older Nvidia GPUs in actual practice is nothing new; it's been happening for ages, and Epic already admitted the demo will run fine on a 2070 Super. The PS5 is nothing special GPU-wise.

Last edited by Soundwave - on 19 May 2020

Soundwave said:

I don't buy that a graphics engineer would mistake a video file for an actual demo (running a video file at 40 fps is also bizarre; video files are typically 24 fps, 30 fps, or occasionally 60 fps). The translated version has the engineers clearly saying the SSD throughput only needs to be in the MB/s range, not GB/s. Why in the world would you even bother talking about that over a video file? Sweeney likely has a marketing deal with Sony and doesn't want to rock the boat on that. New AMD GPUs getting outperformed by older Nvidia GPUs in actual practice is nothing new; it's been happening for ages, and Epic already admitted the demo will run fine on a 2070 Super. The PS5 is nothing special GPU-wise.

There are so many factors that could be at play.

1.) The engineer might have misspoken, saying 40 instead of 30.

2.) Sweeney might be respecting some deal.

3.) The PS5 demo might have had some artificial limits set on it, so that it would run at a set standard.  

There is so much room here: a misunderstanding by Sweeney, a technical limit, the engineer misspeaking, or something totally different. It's hard to say.