
Forums - Gaming Discussion - When did you first notice that grafix had diminishing returns?

linkink said:
curl-6 said:

I never said a "generation behind". But UC3's improvement over this is vastly bigger than any equivalent comparison point on PS4.

PS4's CPU and GPU were both simple, mature parts with no exotic design, easy to max out from the start.

Most developers had no idea how to max out the CPUs like they do now. It's something we don't really know, as neither of us is a developer.

That UC3's improvement over this is vastly bigger than any equivalent comparison point on PS4 is extremely arguable. Shadow Fall to Battlefront is a bigger leap when you consider it runs at 60fps.

You don't need to be a developer to know that PS3's CPU used a complex and unorthodox design that took devs years to master while PS4 basically used a well documented off-the-shelf part.

Shadowfall to Battlefront, while noticeable, is not nearly as stark as the above. Also remember that Killzone is 1080p while Battlefront drops that to 900p.



curl-6 said:
linkink said:

Most developers had no idea how to max out the CPUs like they do now. It's something we don't really know, as neither of us is a developer.

That UC3's improvement over this is vastly bigger than any equivalent comparison point on PS4 is extremely arguable. Shadow Fall to Battlefront is a bigger leap when you consider it runs at 60fps.

You don't need to be a developer to know that PS3's CPU used a complex and unorthodox design that took devs years to master while PS4 basically used a well documented off-the-shelf part.

Shadowfall to Battlefront, while noticeable, is not nearly as stark as the above. Also remember that Killzone is 1080p while Battlefront drops that to 900p.

I know the PS3 was more difficult to work with and took years to master, but PS4 games kept improving and looking better in big ways; developers really learned to take advantage of the weak CPU cores, something that has never really been done on gaming PCs.

Like I said, that's very arguable. I think Battlefront looks vastly better, sometimes comes close to real life, and is at 60fps. Yeah, it's 900p, but it has an amazing AA solution and looks cleaner than Shadow Fall.



linkink said:
curl-6 said:

You don't need to be a developer to know that PS3's CPU used a complex and unorthodox design that took devs years to master while PS4 basically used a well documented off-the-shelf part.

Shadowfall to Battlefront, while noticeable, is not nearly as stark as the above. Also remember that Killzone is 1080p while Battlefront drops that to 900p.

I know the PS3 was more difficult to work with and took years to master, but PS4 games kept improving and looking better in big ways; developers really learned to take advantage of the weak CPU cores, something that has never really been done on gaming PCs.

Like I said, that's very arguable. I think Battlefront looks vastly better, sometimes comes close to real life, and is at 60fps. Yeah, it's 900p, but it has an amazing AA solution and looks cleaner than Shadow Fall.

When it comes to the PS4, graphical quality is dependent almost entirely on the GPU, which again was a mature, well documented part that was easy to max out from day 1.

PS3 was a different story as its design meant that getting optimum results required using the CPU's satellite processors to offload GPU tasks, which was a difficult feat for developers to master.

It is only logical that the system that was very difficult for developers to fully exploit will show far more graphical progress over its lifespan than a system using simple, straightforward hardware, and the games reflect this.
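To illustrate why that SPU offloading mattered, here's a toy model (all numbers are invented purely for illustration, not real PS3 profiling data): if CPU-side and GPU-side work on a frame run in parallel, the frame time is set by whichever side takes longer, so moving work off an overloaded GPU onto otherwise idle satellite processors shortens the frame even though the CPU-side cost goes up.

```python
# Toy model of a frame: CPU-side and GPU-side work run in parallel,
# so the frame time is set by whichever side takes longer (the bottleneck).
def frame_time_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

# Hypothetical scenario: the GPU is the bottleneck.
before = frame_time_ms(cpu_ms=20.0, gpu_ms=40.0)  # 40 ms per frame (25fps)

# Offloading some GPU work (e.g. post-processing) onto idle SPUs
# costs CPU time but removes more work from the GPU side.
after = frame_time_ms(cpu_ms=28.0, gpu_ms=30.0)   # 30 ms per frame (~33fps)

print(before, after)  # prints: 40.0 30.0
```

The specific milliseconds are made up; the point is only that shifting load away from the bottleneck is what buys frame time.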



curl-6 said:
linkink said:

I know the PS3 was more difficult to work with and took years to master, but PS4 games kept improving and looking better in big ways; developers really learned to take advantage of the weak CPU cores, something that has never really been done on gaming PCs.

Like I said, that's very arguable. I think Battlefront looks vastly better, sometimes comes close to real life, and is at 60fps. Yeah, it's 900p, but it has an amazing AA solution and looks cleaner than Shadow Fall.

When it comes to the PS4, graphical quality is dependent almost entirely on the GPU, which again was a mature, well documented part that was easy to max out from day 1.

PS3 was a different story as its design meant that getting optimum results required using the CPU's satellite processors to offload GPU tasks, which was a difficult feat for developers to master.

It is only logical that the system that was very difficult for developers to fully exploit will show far more graphical progress over its lifespan than a system using simple, straightforward hardware, and the games reflect this.

The CPU plays a major role in maintaining frame rate, which is a huge part of making the graphics look better, whether at 60fps, at 30fps, or in a game struggling to maintain 30fps. We don't know what tricks they are doing, or have learned over the years, to offload CPU work to the GPU. Better CPU = less work for the GPU in most cases. The stuff Sony exclusives have pulled off with PS4 hardware is just as amazing as what they did with the PS3, and that's a very popular opinion online. I'm always seeing "how are they getting the PS4 to do this" posts on message boards, and if The Last of Us 2 matches the E3 trailer, well, that's another mind-blowing feat.

That the games reflect this is your opinion, and a very arguable one.

Last edited by linkink - on 07 May 2019

linkink said:

Better CPU = less work for the GPU in most cases.

That is absolutely not how it works at all.



--::{PC Gaming Master Race}::--

Pemalite said:
linkink said:

Better CPU = less work for the GPU in most cases.

That is absolutely not how it works at all.

Well, I really don't know how it works, but from what I see on PC, better CPU = higher frame rate. I would think it would make it easier for developers to get much better graphical results. If a game can run at 60fps with a much better CPU, doesn't that leave more room for graphics to improve at 30fps?

Since you seem like a tech expert, what's your opinion on PS3 vs PS4 graphical progress?

Last edited by linkink - on 07 May 2019

linkink said:
curl-6 said:

When it comes to the PS4, graphical quality is dependent almost entirely on the GPU, which again was a mature, well documented part that was easy to max out from day 1.

PS3 was a different story as its design meant that getting optimum results required using the CPU's satellite processors to offload GPU tasks, which was a difficult feat for developers to master.

It is only logical that the system that was very difficult for developers to fully exploit will show far more graphical progress over its lifespan than a system using simple, straightforward hardware, and the games reflect this.

The CPU plays a major role in maintaining frame rate, which is a huge part of making the graphics look better, whether at 60fps, at 30fps, or in a game struggling to maintain 30fps. We don't know what tricks they are doing, or have learned over the years, to offload CPU work to the GPU. Better CPU = less work for the GPU in most cases. The stuff Sony exclusives have pulled off with PS4 hardware is just as amazing as what they did with the PS3, and that's a very popular opinion online. I'm always seeing "how are they getting the PS4 to do this" posts on message boards, and if The Last of Us 2 matches the E3 trailer, well, that's another mind-blowing feat.

That the games reflect this is your opinion, and a very arguable one.

As Pemalite says, that is not how it works, at all.

The fact is, PS4 is a less complex system than the PS3 was. This is a good thing as it has made it easier for developers to create games for it. But it also means that the kind of progression seen from launch to end of life with the PS3 just cannot happen on PS4 because the system's power was fully accessible from day 1.

That's not to say there has been no progress; rendering techniques and development tools have improved over the years, and this is reflected in games looking better over time. But because there are no large gains to be made from coming to grips with exotic hardware, it's just not possible for PS3's graphical progression to ever be matched on PS4.



linkink said:
Pemalite said:

That is absolutely not how it works at all.

Well, I really don't know how it works, but from what I see on PC, better CPU = higher frame rate. I would think it would make it easier for developers to get much better graphical results. If a game can run at 60fps with a much better CPU, doesn't that leave more room for graphics to improve at 30fps?

Except...
Better GPU = higher frame rate.
Faster RAM = higher frame rate.
Tighter RAM timings = higher frame rate.
More memory channels = higher frame rate.
Higher quality motherboard = higher frame rate. (Thanks to better routing resulting in shorter trace lengths, chipset quality, etc.)
More RAM = sometimes higher frame rate.
Faster storage = sometimes higher frame rate. (I.e. better streaming of assets.)

It's a little disingenuous to state that "better CPU = higher framerate" and come to the conclusion you did.

The framerate a game operates at is influenced by each and every single component in a console or PC and the load placed upon said components.

However... The CPU does a lot to assist in rendering a game, no doubt about it... such as issuing draw calls.
But whether you are GPU or CPU limited depends entirely on the game, the game engine, how many characters/objects are on screen, and so many other factors... even how many audio cues are occurring... And that bottleneck can shift instantly as well, depending on the game's scene.

It's a very complex topic either way... And not one easily explained comprehensively in a singular post on a forum either.

But if a game is running at 60fps, then without question, halving the framerate (And thus doubling the available render time) will open up the capability of improving visual fidelity, but that can be said regardless of the CPU's influence.

And considering how many games employ a dynamic resolution these days and often sit below 4K anyway... We are generally GPU limited first and foremost, to a point.
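The render-time arithmetic in that 60fps-to-30fps point can be sketched directly: the per-frame budget is the reciprocal of the target framerate, so halving the framerate exactly doubles the time available to render each frame, whatever the bottleneck happens to be. A minimal sketch:

```python
def frame_budget_ms(target_fps):
    """Milliseconds available to render one frame at a given target framerate."""
    return 1000.0 / target_fps

budget_60 = frame_budget_ms(60)  # ~16.67 ms per frame
budget_30 = frame_budget_ms(30)  # ~33.33 ms per frame

# Halving the framerate doubles the available render time per frame.
print(budget_30 / budget_60)  # prints 2.0
```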



--::{PC Gaming Master Race}::--

curl-6 said:
linkink said:

The CPU plays a major role in maintaining frame rate, which is a huge part of making the graphics look better, whether at 60fps, at 30fps, or in a game struggling to maintain 30fps. We don't know what tricks they are doing, or have learned over the years, to offload CPU work to the GPU. Better CPU = less work for the GPU in most cases. The stuff Sony exclusives have pulled off with PS4 hardware is just as amazing as what they did with the PS3, and that's a very popular opinion online. I'm always seeing "how are they getting the PS4 to do this" posts on message boards, and if The Last of Us 2 matches the E3 trailer, well, that's another mind-blowing feat.

That the games reflect this is your opinion, and a very arguable one.

As Pemalite says, that is not how it works, at all.

The fact is, PS4 is a less complex system than the PS3 was. This is a good thing as it has made it easier for developers to create games for it. But it also means that the kind of progression seen from launch to end of life with the PS3 just cannot happen on PS4 because the system's power was fully accessible from day 1.

That's not to say there has been no progress; rendering techniques and development tools have improved over the years, and this is reflected in games looking better over time. But because there are no large gains to be made from coming to grips with exotic hardware, it's just not possible for PS3's graphical progression to ever be matched on PS4.

The PS3 of course is the more complex system. That doesn't mean the PS4 can't see similar gains, which it has, and it's not some crazy over-the-top opinion. After all, the 360 saw similar gains if not more, and it wasn't nearly as complex as the PS3. Bigger gains will come from rendering techniques and development tools; I think the 360, XB1, and PS4 have proven that. So to say PS4 graphical progression can't match the PS3's because the PS3 was more complex is 100% false.

Pemalite said:
linkink said:

Well i really don't know how it works, but from i see on PC. better CPU = higher frame rate, I would think it would make it easier for developers to get much better results for graphics. if a game can run at 60fps with a much better CPU doesn't that leave more room for graphics to improve at 30fps?

Except...
Better GPU = higher frame rate.
Faster RAM = higher frame rate.
Tighter RAM timings = higher frame rate.
More memory channels = higher frame rate.
Higher quality motherboard = higher frame rate. (Thanks to better routing resulting in shorter trace lengths, chipset quality, etc.)
More RAM = sometimes higher frame rate.
Faster storage = sometimes higher frame rate. (I.e. better streaming of assets.)

It's a little disingenuous to state that "better CPU = higher framerate" and come to the conclusion you did.

The framerate a game operates at is influenced by each and every single component in a console or PC and the load placed upon said components.

However... The CPU does a lot to assist in rendering a game, no doubt about it... such as issuing draw calls.
But whether you are GPU or CPU limited depends entirely on the game, the game engine, how many characters/objects are on screen, and so many other factors... even how many audio cues are occurring... And that bottleneck can shift instantly as well, depending on the game's scene.

It's a very complex topic either way... And not one easily explained comprehensively in a singular post on a forum either.

But if a game is running at 60fps, then without question, halving the framerate (And thus doubling the available render time) will open up the capability of improving visual fidelity, but that can be said regardless of the CPU's influence.

And considering how many games employ a dynamic resolution these days and often sit below 4K anyway... We are generally GPU limited first and foremost, to a point.

Thanks for the explanation. 



linkink said:
curl-6 said:

As Pemalite says, that it not how it works, at all.

The fact is, PS4 is a less complex system than the PS3 was. This is a good thing as it has made it easier for developers to create games for it. But it also means that the kind of progression seen from launch to end of life with the PS3 just cannot happen on PS4 because the system's power was fully accessible from day 1.

That's not to say there has been no progress; rendering techniques and development tools have improved over the years and this is reflected in the games look better over time. But because there are no large gains to be made from coming to grips with exotic hardware, it's just not possible for PS3's graphical progression to ever be matched on PS4.

The PS3 of course is the more complex system. That doesn't mean the PS4 can't see similar gains, which it has, and it's not some crazy over-the-top opinion. After all, the 360 saw similar gains if not more, and it wasn't nearly as complex as the PS3. Bigger gains will come from rendering techniques and development tools; I think the 360, XB1, and PS4 have proven that. So to say PS4 graphical progression can't match the PS3's because the PS3 was more complex is 100% false.

The 360's hardware was not mature technology when it came out; stuff like unified shaders and a multi-core, multi-threaded CPU were new to the console space. Not so with PS4 and Xbox One; they were based on off-the-shelf parts using nothing cutting edge or exotic, and as a result they simply have less room for graphics to grow over time. Last gen you got big jumps on both the software side and the hardware utilization side. This gen you only get the software advances, since the hardware is easy to fully harness from the start.