
Forums - Gaming - Prediction - PS3-PS4 and X360-XOne last big performance spec leap

fatslob-:O said:

You only need 2048*2048 textures in the worst case scenario, where a player does a close-up on a 1080p screen, and we even have texture compression and mipmapping to keep the storage problem tractable ... 

If we're doing 1080p then 16GB will be plenty, and I imagine it will be enough for 1440p too, but maybe not so much for 4K. I hope we go in the direction of real-time physically based global illumination instead of higher resolutions ...
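As a rough illustration of the mipmapping point above (my own back-of-envelope sketch, not anything from the post): a full mip chain only adds about a third on top of the base level, which is a big part of why storage stays tractable.

```python
# Back-of-envelope sketch: total texels in a full mip chain
# (2048, 1024, ..., 1) versus the base level alone.

def mip_chain_texels(size):
    total, s = 0, size
    while s >= 1:
        total += s * s
        s //= 2
    return total

base = 2048 * 2048
overhead = mip_chain_texels(2048) / base
print(round(overhead, 3))  # ~1.333, i.e. the whole chain costs ~4/3 of the base
```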

Well. You know my position on resolutions.
1080P is old news. 1440P is affordable and a marked improvement over 1080P.
4k is still reserved for the high-end.

fatslob-:O said:

Well that's probably because most devs don't have a good LOD or streaming system in place. Ideally you wouldn't need a texture any bigger than the total size of the framebuffer itself ... (1:1 sampling ratio is a perfect match)

I disagree.
Battlefield 1, despite how pretty she looks in places... can still have landscape textures that look like a dog's breakfast.

That is the case both on the Xbox One (mostly 720P/medium settings) and on my PC (1440P, Ultra).
At 1440P and 4K, due to the increased clarity, you do tend to notice lower-quality assets more easily.

Cerebralbore101 said:

Realistic dynamic lighting can now be done. Textures can be taken from real life photographs of inanimate objects, and then processed in photoshop. Polycounts are high enough to afford to model small details like the buttons on your shirt, without losing performance.

What the heck else can be done?

I understand the "leap" from PS360 to current gen. Having just megabytes of ram to work with was a joke.

But now? Again, what is there to improve on?

We have had Dynamic Lighting for almost 2 decades.

But even today's lighting is not very accurate... And as FatSlob alludes to... Global Illumination can provide for some big gains.

But really, this generation has been all about the "little details". - Massive increases in geometry allowing for smaller details like tiny rocks on the ground to "pop", lots of particle/smoke effects, material based shading and so on.

And everything is generally more dynamic.
Last console generation, shadowing, lighting and such were "baked" into the texture work, as the amount of power those consoles offered was extremely limited.
This generation, details tend to be more dynamic, which is also more computationally intensive.
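A toy sketch of that baked-vs-dynamic tradeoff (my own illustration; `lambert` here is just a stand-in for whatever the real lighting math is): baking pays the lighting cost once, offline, while dynamic lighting re-pays it every frame.

```python
# Hypothetical sketch: why "baked" lighting is cheap at runtime.
# Baking evaluates the lighting once, offline; dynamic lighting
# re-evaluates the same math every frame for every surface point.

def lambert(normal, light_dir):
    # Simple diffuse term: N . L, clamped to zero.
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

surface_normals = [(0.0, 1.0, 0.0), (0.707, 0.707, 0.0)]
light_dir = (0.0, 1.0, 0.0)

# Offline bake: store the result in a lightmap texture, once.
lightmap = [lambert(n, light_dir) for n in surface_normals]

# Runtime with baked lighting: just a texture fetch (a list lookup here).
baked_sample = lightmap[0]

# Runtime with dynamic lighting: the same math runs every frame,
# and must be repeated whenever the light or geometry moves.
dynamic_sample = lambert(surface_normals[0], light_dir)

assert baked_sample == dynamic_sample  # same image, very different cost
```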

Which is also why during the very end of the last generation consoles... Some multiplatform games actually regressed in graphics.
Developers had started building their games with dynamic details... And didn't bother to bake those details when backporting to last gen consoles, so they looked very bland by comparison.

Cerebralbore101 said:

UE4 already does that though. Are there improvements to be made, or are modern games not taking advantage of it yet? 


It's also in CryEngine, Frostbite, Unity and so on...
But they all differ in their implementations, and thus in quality.

This will give you a good idea of the different types of Global Illumination.
https://colinbarrebrisebois.com/2015/11/06/finding-next-gen-part-i-the-need-for-robust-and-fast-global-illumination-in-games/

malistix1985 said:

PS3/Xbox 360 to Xbox One/PS4 was a pretty low jump in terms of CPU.

It was a good direction in terms of available RAM, RAM speed and GPU, same as what the Xbox One X is doing right now.
It's all about how that power is used; the jump from Xbox 360 to Xbox One, and Xbox One to Xbox One X, isn't a huge difference.

Next generation will have better CPUs, making more instruction cycles possible: better physics, better AI, and higher framerates.

Jaguar was a massive jump over Xenon and Cell, especially with Integer capability.
Not everything is about theoretical flops you know. ;)




www.youtube.com/@Pemalite

fatslob-:O said:
Cerebralbore101 said:

UE4 already does that though. Are there improvements to be made, or are modern games not taking advantage of it yet? 

UE4 is trash and there are definitely very real improvements to be made in graphics since AAA games still aren't comparable to pre-rendered films ...

lol They will never get to the level of pre-rendered films. A pre-rendered film contains 1% as many locations and characters as a game. You couldn't pay the artists to do that much high quality work. You might as well demand a photo-realistic film be made exclusively by painters. You might as well ask a movie studio special effects team to recreate 100 square miles of the Amazon, with fake trees and animals, for a 100 square mile theme park. Graphics of that level just aren't financially feasible unless we are talking about a very, very limited game. 

If UE4 is trash, then what is the best game engine? 



KBG29 said:
Cerebralbore101 said:

I honestly feel like there's not much progress to be made anymore. Realistic dynamic lighting can now be done. Textures can be taken from real life photographs of inanimate objects, and then processed in photoshop. Polycounts are high enough to afford to model small details like the buttons on your shirt, without losing performance.

What the heck else can be done?

I understand the "leap" from PS360 to current gen. Having just megabytes of ram to work with was a joke.

But now? Again, what is there to improve on?


I agree with you on traditional displays. Gran Turismo Sport, Horizon, and Uncharted already look amazing. In all reality there is nothing new happening this gen on traditional displays, that could not have been on PS3/360 at lower resolutions and lower detail.

Where that extra power is massively important is in new experiences in the motion, VR, and AI spaces.

VR and Motion need much more CPU and GPU power for many reasons. Motion needs better tracking equipment and more CPU power to be able to accurately and reliably track both peripherals and full body movement. This will add an incredible amount of freedom for new styles of gameplay, and allow current genres to be better represented. On the VR side, we need both massive amounts of CPU power to increase frame rates to 120fps, and the GPU power to draw two screens at once at resolutions much higher than even 4K. 2K/1080P is fine on a 50" or smaller display, but when you blow the image up to 300+ inches and fill over 100° of your vision, 2K looks like an old 480P flat screen.

AI has the ability to make huge advancement in the way stories are told, and relationships are built with digital characters. For this to happen, we need even more CPU power. This is power on top of what is needed to render real time effects on two screens at higher than 4K resolution, and a rock solid 120fps.

There are many, many advancements that still need much, much more power to achieve. I believe we have hit somewhat of a sweet spot for traditional displays, and I think we are at the point where we can have experiences on the level of GTA5, Horizon, Uncharted 4, etc. on mobile devices. VR is where the future of all things digital is heading, and that space is hungry for more power.
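To put rough numbers on the VR throughput claim above (my own arithmetic, with assumed display specs): two per-eye 4K panels at 120fps need about 16x the raw pixel throughput of a single 1080p/60 display, before counting any of the extra CPU work for tracking or AI.

```python
# Back-of-envelope: pixel throughput for two per-eye displays at 120 fps
# versus a single 1080p/60 display. Display specs are assumptions.

def pixels_per_second(width, height, fps, screens=1):
    return width * height * fps * screens

flat_1080p60 = pixels_per_second(1920, 1080, 60)
vr_4k120_x2 = pixels_per_second(3840, 2160, 120, screens=2)

print(vr_4k120_x2 / flat_1080p60)  # 16.0 -> ~16x the raw pixel throughput
```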

I couldn't agree more. Any space combat game with VR is a huge game-changer. Being able to fly and instantly look around the cockpit is fantastic. Having stereoscopic 3D adds more to graphics realism than any amount of lighting, texturing, or FPS. Honestly I hope the PS5 comes with VR built in. Full blown VR is next gen. Slightly better graphics is not. 



Pemalite said:

I disagree.
Battlefield 1, despite how pretty she looks in places... can still have landscape textures that look like a dog's breakfast.

That is the case both on the Xbox One (mostly 720P/medium settings) and on my PC (1440P, Ultra).
At 1440P and 4K, due to the increased clarity, you do tend to notice lower-quality assets more easily.

It's true however ... 

You wouldn't need any more texels than pixels in an ideal world, but because of the distortions in screen space with respect to how the texels are mapped to the geometry, and how there are very few ways we can prefilter in real-time, our life just isn't as simple ... (There's really no use for higher texture detail once we take into account signal processing theory, since the extra information would just be 'filtered' from the output.) 

For 4K you wouldn't need textures any higher than 4096*4096, unless you point your camera at only a quarter of the surface of an asset ... 
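The rule of thumb being described can be sketched like this (my own toy formula, assuming all you want is ~1:1 texel-to-pixel sampling on the visible part of the asset):

```python
# Toy estimate: texels needed across a full asset so that on-screen
# sampling stays roughly 1:1. 'linear_fraction_visible' is what
# fraction of the asset's width spans the whole frame.

def texture_size_for(screen_px_across, linear_fraction_visible):
    return screen_px_across / linear_fraction_visible

# A surface spanning the full width of a 4K frame:
print(texture_size_for(3840, 1.0))  # 3840.0 -> a 4096*4096 texture suffices

# Zoom in so only a quarter of the surface's area (half its width) fills the frame:
print(texture_size_for(3840, 0.5))  # 7680.0 -> now you'd want 8192*8192
```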

Cerebralbore101 said:

lol They will never get to the level of pre-rendered films. A pre-rendered film contains 1% as many locations and characters as a game. You couldn't pay the artists to do that much high quality work. You might as well demand a photo-realistic film be made exclusively by painters. You might as well ask a movie studio special effects team to recreate 100 square miles of the Amazon, with fake trees and animals, for a 100 square mile theme park. Graphics of that level just aren't financially feasible unless we are talking about a very, very limited game. 

If UE4 is trash, then what is the best game engine? 

I think we'll get into the ballpark of pre-rendered films, just like how the PS4 was able to reach the quality of pre-rendered films from the pre/early new millennium ... 



Nope, you are wrong.....

But this is kinda the same thing that happens every gen, and it really just goes to show how limited most people's imaginations are. There is always that notion that what they see is the best it can ever be, until they see something better; then rinse and repeat.

There are so many things that stand to benefit from a generational leap in hardware that are yet to be implemented in consoles. Right off the top of my head: true 4K-capable processors, HBM memory, NVMe SSDs... each of those things individually could not only change how games are made, but could represent a generational leap in hardware potential... all of those things combined?

Well use your imagination.



Cerebralbore101 said:
fatslob-:O said:

UE4 is trash and there are definitely very real improvements to be made in graphics since AAA games still aren't comparable to pre-rendered films ...

lol They will never get to the level of pre-rendered films. A pre-rendered film contains 1% as many locations and characters as a game. You couldn't pay the artists to do that much high quality work. You might as well demand a photo-realistic film be made exclusively by painters. You might as well ask a movie studio special effects team to recreate 100 square miles of the Amazon, with fake trees and animals, for a 100 square mile theme park. Graphics of that level just aren't financially feasible unless we are talking about a very, very limited game. 

If UE4 is trash, then what is the best game engine? 

no opinion on UE4...

but the best engine has to be whatever Horizon Zero Dawn was made on; it's simply nuts what it can do on a base PS4 slim.



JRPGfan said:

no opinion on UE4...

but the best engine has to be whatever Horizon Zero Dawn was made on; it's simply nuts what it can do on a base PS4 slim.

I haven't actually spent much time looking at what Horizon: Zero Dawn has done on a technical level. But I would assume they achieved what they did with an abundant use of baked details.

fatslob-:O said:

For 4K you wouldn't need textures any higher than 4096*4096 unless you put your camera to only a quarter of the surface of the assets ... 

Definitely not true.

You can notice a big difference between 2k and 4k textures even at 1080P.





Pemalite said:
JRPGfan said:

no opinion on UE4...

but the best engine has to be whatever Horizon Zero Dawn was made on; it's simply nuts what it can do on a base PS4 slim.

I haven't actually spent much time looking at what Horizon: Zero Dawn has done on a technical level. But I would assume they achieved what they did with an abundant use of baked details.

The main tradeoffs they made were very static environments (foliage and water don't react to the player), low levels of AF, and short shadow LODs.



Pemalite said:

Definitely not true.

You can notice a big difference between 2k and 4k textures even at 1080P.

Signal processing theory shows otherwise ... 

2048*2048 -> 1920*1080 (50.5% of texels wasted)
4096*4096 -> 1920*1080 (87.6% of texels wasted)

Both of the above cases are a result of oversampling more than necessary, and are thus wasteful in terms of memory and bandwidth consumption. When texture sampling, very rarely does the texel density match the pixel density, so we either get undersampling or oversampling ... 
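Those percentages can be reproduced directly (assuming "wasted" means texels beyond one-per-screen-pixel for a surface filling the whole frame):

```python
# Fraction of texels beyond one-per-screen-pixel for a full-screen surface.

def wasted_fraction(tex_w, tex_h, screen_w, screen_h):
    return 1.0 - (screen_w * screen_h) / (tex_w * tex_h)

print(f"{wasted_fraction(2048, 2048, 1920, 1080):.1%}")  # 50.6% (~ the 50.5% above)
print(f"{wasted_fraction(4096, 4096, 1920, 1080):.1%}")  # 87.6%
```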

The information lost is only important when you map from a continuous domain to a discrete one in order to represent a continuous signal; take rasterization, for example. The way we represent triangles on digital displays is flawed, since it produces aliasing due to our reconstruction methods. We use edge equations to represent vector graphics on digital displays by rasterizing (or scan converting/sampling) the primitives ... 

Undersampling can lead to information loss, known as aliasing, which produces undesirable stair-stepping artifacts that don't match the real triangle edge ... 
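Here's a minimal sketch of that edge-equation rasterization (my own toy code, one sample per pixel center): the diagonal edge of the triangle comes out stair-stepped, which is exactly the aliasing being described.

```python
# Toy scan conversion with edge equations, one sample per pixel center.

def edge(ax, ay, bx, by, px, py):
    # Signed area test: positive when p lies on the inside of edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, w, h):
    (x0, y0), (x1, y1), (x2, y2) = tri
    rows = []
    for y in range(h):
        row = ""
        for x in range(w):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            inside = (edge(x0, y0, x1, y1, px, py) >= 0 and
                      edge(x1, y1, x2, y2, px, py) >= 0 and
                      edge(x2, y2, x0, y0, px, py) >= 0)
            row += "#" if inside else "."
        rows.append(row)
    return rows

for line in rasterize([(0, 0), (8, 0), (0, 8)], 8, 8):
    print(line)  # the diagonal hypotenuse shows up as stair steps
```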



curl-6 said:

The main tradeoffs they made were very static environments (foliage and water don't react to the player), low levels of AF, and short shadow LODs.

I didn't get the impression that low levels of AF were an issue with H:ZD from reading the DF article, but texture sampling is improved on the PS4 Pro ...