goopy20 said:
Pemalite said:

Lumen is an "approximation". - Lumen is a "collection" of lighting techniques all working to augment each other to get the best result with low overhead. It's not the best approach to lighting, shit... it's not even the most efficient, but it's damn well impressive for the hardware it's being showcased on.

For larger objects the Global Illumination uses voxels, which are cubes of light.
For medium-sized objects they use signed distance fields, which is basically grid sampling; another use for them is cloth-simulation collision detection.
And for smaller objects they use screen-space data, which is similar to the approach Gears of War 5 used for its lighting.
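To make the signed-distance-field idea concrete, here is a minimal sphere-tracing sketch in plain Python (a single hard-coded sphere SDF; all names are hypothetical, not Unreal's API). The SDF at any point tells you the largest step you can safely take along a ray without overshooting the nearest surface:

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    # Distance from point p to the sphere's surface (negative inside).
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    # March along the ray; the SDF value is the largest safe step size.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < hit_eps:
            return t          # hit: distance along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

# A ray straight down +z hits the sphere at z = 4 (center z=5, radius 1).
print(sphere_trace((0, 0, 0), (0, 0, 1), sphere_sdf))   # → 4.0
```

Real engines evaluate the SDF from a baked 3D grid rather than an analytic function, which is why "grid sampling" is a fair description.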


Yeah it did look impressive, but how do you think it compares to the Nvidia RTX implementation? And which version of RT do you think will make more sense for developers to use (assuming that AMD's Ray Tracing cores perform about the same as on the RTX cards)?

Personally I'm still hoping to see Path Tracing! Not at 4K, of course, but at 1080P with something like DLSS enabled, who knows :)

Depends on the developer's goals; developers aren't all making the same games... For a game that has a lot of metal/shiny surfaces, it makes absolute sense to use Ray Tracing for reflections. (Think: Cyberpunk.)

For a game that has a lot of outdoor vistas with dense foliage and so forth, using Ray Tracing for shadowing would probably be the best bang for buck.

We aren't at a point where we can Ray Trace everything, so we still need to pick and choose where it offers the best bang-for-buck. There were some games on the Playstation 4 that used Ray Tracing for their lighting... because it made artistic sense for the developers to go down that path.

Take Battlefield 5 for example: its Ray Tracing is pretty understated. In my opinion they didn't get the best bang-for-buck; they might have been better served using it for Global Illumination rather than reflections.

SvennoJ said:

Whut? SSDs are a huge benefit in low powered laptops with integrated graphics.

In what way? If you are suggesting it somehow increases framerates...


SvennoJ said:

RAM use is linked to storage I/O speed. PC compensates for slow storage speeds by using lots of RAM. My laptop has 22 GB of RAM even though it has an SSD. 16 GB of system RAM, 53% in use just for browsing. Cache, cache and more cache. When RAM was still an issue on PC, swap files were used. But oh boy, when system RAM ran low and the swap file (on HDD) got used, the system crawled to a halt.

More power means you need more RAM for bigger screen buffers and bigger textures. By that measure, MS didn't give it enough RAM or I/O speed to lessen the burden on RAM and make full use of the power.

Swap files are still used in 2020 on the latest and greatest PCs. They never stopped being used, even when you upgrade your storage. - Currently my PC is using about 3GB of swap.

More power doesn't necessarily mean you need more memory; there are operations that are pretty memory-insensitive all things considered. Things like texturing need more space, sure, but using the compute overhead to bolster an effect somewhere else may not.
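The "cache, cache and more cache" point above boils down to a read-through cache: keep recently used blocks in RAM and only touch slow storage on a miss. A toy sketch (plain Python, hypothetical names, nothing OS-specific):

```python
from collections import OrderedDict

class ReadThroughCache:
    """Keep the most recently used blocks in RAM; evict LRU on overflow."""
    def __init__(self, backing_read, capacity=4):
        self.backing_read = backing_read   # slow path, e.g. a disk read
        self.capacity = capacity
        self.blocks = OrderedDict()        # block_id -> data
        self.misses = 0

    def read(self, block_id):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)   # mark as recently used
            return self.blocks[block_id]
        self.misses += 1
        data = self.backing_read(block_id)      # hit slow storage
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)     # evict least recently used
        return data

cache = ReadThroughCache(lambda b: f"data-{b}", capacity=2)
cache.read(1); cache.read(2); cache.read(1); cache.read(1)
print(cache.misses)   # → 2: only blocks 1 and 2 ever hit the slow path
```

The faster the backing storage, the cheaper a miss is, which is why an SSD lets a system get away with less RAM devoted to caching.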

SvennoJ said:


A useless comparison

7th Gen: avg 100 MB/s HDD speed, 22.4 GB/s RAM speed (360) - RAM is ~230 times faster
8th Gen: avg 100 MB/s HDD speed, 176 GB/s RAM speed (PS4) - RAM is ~1800 times faster
9th Gen: PS5 up to 9 GB/s SSD speed, 448 GB/s RAM speed - RAM is ~50 times faster
9th Gen: SX up to 5 GB/s SSD speed, up to 560 GB/s RAM speed - RAM is ~112 times faster

Looks like PS5 is in a great position to augment RAM (the biggest bottleneck in any game design) with SSD, but the Series X still has a better ratio than 7th gen.

The 7th and 8th gen drives were 60MB/s tops on a good day, often 30-40MB/s with random access.
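The ratios in that comparison are just bandwidth divisions; a quick check with the quoted figures (decimal units, peak numbers, so the "~230x" and "~1800x" above are rounded up slightly):

```python
def ram_to_storage_ratio(ram_gbps, storage_mbps):
    # Convert RAM bandwidth to MB/s, then divide by storage bandwidth.
    return ram_gbps * 1000 / storage_mbps

# Figures quoted in the thread (approximate peak numbers).
print(round(ram_to_storage_ratio(22.4, 100)))    # 360:      ~224x
print(round(ram_to_storage_ratio(176, 100)))     # PS4:      ~1760x
print(round(ram_to_storage_ratio(448, 9000)))    # PS5:      ~50x
print(round(ram_to_storage_ratio(560, 5000)))    # Series X: ~112x
```

With the more realistic 60 MB/s HDD figure, the 7th/8th gen gaps get even wider, which only strengthens the point being made.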

SvennoJ said:

Skyrim for example wouldn't have run into trouble, using the SSD as cache for the changes made to the world.

Skyrim didn't run into any trouble on the Xbox 360 despite only using a DVD drive and a mechanical HDD.

Skyrim's issue was more or less a game-engine issue... Despite Bethesda's claims that the Creation Engine is "ground-up" brand-new, it's still based on Gamebryo, which in turn is based on NetImmerse... And one characteristic of that engine is that it is extremely powerful with scripting and cell loading, but it comes with occasional hitching irrespective of the underlying hardware.

Would an SSD have helped? Sure. No doubt. But so would more RAM. - The 7th gen was held back by that 512MB memory pool.

I did do some nitty-gritty work on Oblivion's engine at some point, working to get it to run on less-than-optimal hardware configurations... i.e. even 128MB of RAM.

SvennoJ said:

Don't you do any fact checking???

https://www.kotaku.co.uk/2017/03/14/breath-of-the-wild-was-never-meant-to-be-the-wii-us-swansong

The Legend of Zelda: Breath of the Wild is a superb launch title for Nintendo Switch, but it was never intended as such. For almost all of its development, Breath of the Wild was a Wii U game through and through.

Development of BotW started 5 years before the Switch came out! Besides that, WiiU to Switch isn't much of a jump at all.

https://www.theverge.com/2018/3/3/17070664/nintendo-switch-technology-bayonetta-2-anniversary

The Switch is roughly as powerful as the Wii U — a little speedier, sure, but they’re not in different galaxies.

There is a big jump in hardware feature sets and efficiency between the Wii U and Switch. - The fact that the Switch can take every WiiU game and run it at higher resolutions and framerates with little effort is a testament to that.

Keep in mind that Breath of the Wild was the best-looking WiiU game... Whereas the Switch still hasn't had its best-looking games yet.

In saying that... it's not the same kind of jump as Playstation 3 > Playstation 4 or Xbox 360 > Xbox One; it's a much smaller jump in overall capability, but it's there. (And I do own both the Switch and WiiU, so I can compare in real time.)

only777 said:

Phil is lying here and his statement is provably false. He's used a classic bait-and-switch tactic too.

First, we know that lower-powered PCs hold back game development. Ex-EA game-engine dev "The Cherno" said this on his YouTube channel. ( https://www.youtube.com/channel/UCQ-W1KE9EYfdxhL6S4twUNw ) I think the video he said it in was this one: https://www.youtube.com/watch?v=erxUR9SI4F0 - although I've not watched it back to check.

Also, Phil says that lower-end PC rigs are not holding back higher-end rigs. Let's pretend that this statement is true. But he is clearly saying this in response to Sony, who showed with games like Ratchet and Clank (the warp mechanic in particular) that the PS5 allows them to create games that were impossible on previous hardware.

Well, the problem with Phil saying that PC isn't held back by lower-end rigs is that he is comparing apples to oranges. A PC game HAS to be made with lower specs in mind, but the hardware of a lower-spec machine and a high-spec machine is pretty much the same really; it's just faster/larger numbers of the same thing.

As we saw in the "Road to PS5" video, the way data is moved around in the console is much different to a PC. With the custom decompression and pipeline differences, it's not the same as just a PS4 with bigger numbers.

I thought Phil would be better than this.

Considering that the consoles are low-end PC rigs... They tend to be the lowest common denominator.

Developers don't *need* to build their games to target low-end PCs; heck, Crysis didn't.

zero129 said:

It will have no problems playing all next-gen games at a lower res, and I already showed you how Nvidia has tech that can take a 540P image and upscale it to 1080P while also looking better than a native 1080P image, and a 720P image up to 1440P. MS and AMD have also been working on such tech, and it makes perfect sense why: with the Series S targeting 1080-1440P and the Series X targeting 4K, such tech will allow both consoles to hit that target at a fraction of the cost, easy.

Next gen, target resolutions are going to be unimportant; heck, they almost are now.

Microsoft has DirectML and AMD has Radeon Image Sharpening, which should provide some impressive image reconstruction; it's going to make pixel counting redundant for a lot of games.

A game like Call of Duty Modern Warfare 2019 has a very filmic presentation due to the image reconstruction and anti-aliasing and other post-process techniques being employed.

If a game is 1080P, I am okay with that, but it better be an impressive 1080P.
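None of this is DLSS or DirectML, but the general shape of "upscale, then recover apparent detail" can be sketched with a nearest-neighbour 2x upscale plus a crude unsharp mask (NumPy assumed; a toy illustration, not any vendor's reconstruction):

```python
import numpy as np

def upscale_2x(img):
    # Nearest-neighbour 2x upscale: repeat each pixel along both axes.
    return img.repeat(2, axis=0).repeat(2, axis=1)

def sharpen(img, amount=0.5):
    # Unsharp mask: boost the difference between the image and a blur of it.
    blurred = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

low_res = np.random.default_rng(0).random((540, 960))   # "540P" luminance
high_res = sharpen(upscale_2x(low_res))                 # "1080P" output
print(high_res.shape)   # → (1080, 1920)
```

ML-based reconstruction replaces the hand-written sharpen step with a learned model, which is why it can add plausible detail rather than just boosting edges.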

zero129 said:

But like I said, the Series S is a next-gen console, but for 1080P TV owners.

The Series X is also for 1080P owners; the supersampling of the Xbox One X on a 1080P display brings with it a ton of advantages... Plus the higher-end console will have better texturing, shadowing, and lighting, which will pop far better on a 1080P display.
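Supersampling on a 1080P display is conceptually simple: render at 4K, then average each 2x2 block of pixels down to one 1080P pixel, which acts as high-quality anti-aliasing. A toy NumPy sketch (single luminance channel, hypothetical names):

```python
import numpy as np

def supersample_to_1080p(frame_4k):
    # Average each 2x2 block of the 4K frame into one 1080P pixel.
    h, w = frame_4k.shape
    return frame_4k.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

frame = np.random.default_rng(1).random((2160, 3840))   # 4K luminance
out = supersample_to_1080p(frame)
print(out.shape)   # → (1080, 1920)
```

Every output pixel is built from four rendered samples, so edges and fine detail come out much cleaner than a native 1080P render.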

The Series S is more or less for those on a budget who don't really give a crap about specifications... They just wanna game and have fun.

--::{PC Gaming Master Race}::--