
Forums - Sony Discussion - Why Sony should stop pushing developers for 1080p games.

 

What do you think about this thread...

I agree with most (or all) of it. 49 25.00%
 
BOO!!! XBOT FANBOY ALERT!!! 110 56.12%
 
See results. 37 18.88%
 
Total: 196
fatslob-:O said:
SubiyaCryolite said:
As someone who owned a 7870 I really don't understand why some titles perform the way they do on PS4. It's no beast, but I remember it performing really well (50fps or more on high or ultra) in most games. I don't expect all games to run at 1080p60fps, but I'm still surprised that so few do. Right now Battlefield 4 and Hardline come to mind; looking at GPU benchmarks I don't understand why DICE settled on 900p, especially with all that low level access. Maybe the CPUs really are to blame.

They could be using a high level API on the PS4 since that's a decision ... 

That could be true. The same thing happens with Watch Dogs. Both the 7870 and 7850 can run it at 1080p on ultra (45 and 35fps respectively), and yet the PS4 renders it at 900p30. At high settings the 7870 hits 54fps. Something seems off.



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

SubiyaCryolite said:
fatslob-:O said:

They could be using a high level API on the PS4 since that's a decision ... 

That could be true. The same thing happens with Watch Dogs. Both the 7870 and 7850 can run it at 1080p on ultra (45 and 35fps respectively), and yet the PS4 renders it at 900p30. At high settings the 7870 hits 54fps. Something seems off.

It's also because you're not taking the CPU into consideration.

I highly doubt the CPU you were using with that graphics card was a 1.6GHz AMD.



Tachikoma said:
walsufnir said:


I don't know what they're expecting from DX12; it seems like too much. But it would be interesting to know why the gap between "now" and console launch exists - SDK improvements, or engine work to match each console?

For both, SDK updates; for the XBO, additional performance gained from removing the Kinect reserve. P2RLA is expected to give a similar performance boost to the Kinect removal, but with more of a focus on bandwidth allocation rather than sub-cycle management.

It's a graph made from performance logger results from the base engine at the company I now work for, running across multiple platforms - the only difference between the results is the console updates and SDK advancements; the testing engine remains the same across all.

 

Well, for the sake of a valid comparison, it should.

But thanks for the inside look. Of course I already googled P2RLA but didn't find anything. But I guess we will find out soon enough :)

If you are allowed to answer, is the engine using deferred rendering or some forward(+) engine? Or neither, mixing both? I'm only asking out of technical interest. (And are you sure you're allowed to publish the picture? ;))



SubiyaCryolite said:

That could be true. The same thing happens with Watch Dogs. Both the 7870 and 7850 can run it at 1080p on ultra (45 and 35fps respectively), and yet the PS4 renders it at 900p30. At high settings the 7870 hits 54fps. Something seems off.

I can't seem to think of anything else other than a high level API or buggy SDKs ...

It could just be too early for some developers to adapt their engines to specific platforms at this point in time ...



Resolution is way more important than textures and particles imo.

But I'd rather have 60fps, though.



Mikmster said:
No. They should push them to make the best they can, at 1080p. The PS4 is a far superior machine, which is why they should push them to utilize it and not gimp PS4 games due to the other guy being weaker.

I agree. Devs just have to put more effort into coding and stop with the "parity". They should utilize all the resources available on each console. Too bad most devs don't want to spend more money pushing any single piece of hardware.

XBone, on the other hand, has a very weak GPU, and pushing for 1080p would be a bad move. It's simply a less capable machine.



Tachikoma said:
SubiyaCryolite said:

That could be true. The same thing happens with Watch Dogs. Both the 7870 and 7850 can run it at 1080p on ultra (45 and 35fps respectively), and yet the PS4 renders it at 900p30. At high settings the 7870 hits 54fps. Something seems off.

It's also because you're not taking the CPU into consideration.

I highly doubt the CPU you were using with that graphics card was a 1.6GHz AMD.

That's probably the case. My "4GHz 8 core" AMD CPU can't get Ground Zeroes to run past 60fps on Ultra. If that CPU can hold back my GTX 970, then a low-powered 1.6GHz CPU must be really shitty.



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

walsufnir said:

Well, for the sake of a valid comparison, it should.

But thanks for the inside look. Of course I already googled P2RLA but didn't find anything. But I guess we will find out soon enough :)

If you are allowed to answer, is the engine using deferred rendering or some forward(+) engine? Or neither, mixing both? I'm only asking out of technical interest. (And are you sure you're allowed to publish the picture? ;))

I'm allowed to publish it because I made it, and the log data is from a testing engine I wrote myself for profiling performance changes per update in a controlled environment. It makes tracing problems introduced through new updates back to specific functions and features much easier: if I run the benchmark on a new update and a particular part of the test shows a significant difference, it's much easier to fine-tune the game code to iron out such situations effectively, without wasting time trying to figure out why, after an SDK update, a balanced process has suddenly decided it's no longer happy with the CPU time or bandwidth it was running just fine with on previous builds.

The engine uses multiple render modes and solutions to extensively test various functions of each system. It uses forward+ in its deferred lighting simulations while retaining the AA and filtered transparency benefits of forward rendering, plus lots of generic gamey stuff like physically based shading, BRDFs and G-buffer filtered specular aliasing.

Just a good all-round benchmarking tool that has performed well enough for the company to benefit greatly from it.
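The per-update regression tracking described in that post can be sketched in a few lines. Everything below is hypothetical (function names, stand-in workloads, the 10% threshold); the actual engine is proprietary, so this only illustrates the idea of timing the same named tests on each build and flagging any that slow down against a stored baseline:

```python
import time

def profile(tests, baseline=None, threshold=0.10):
    """Time each named test; flag any that regress past `threshold`
    relative to a baseline captured on a previous SDK/engine build."""
    results = {}
    for name, fn in tests.items():
        start = time.perf_counter()
        fn()
        results[name] = time.perf_counter() - start
        if baseline and name in baseline and baseline[name] > 0:
            change = (results[name] - baseline[name]) / baseline[name]
            if change > threshold:
                print(f"REGRESSION {name}: {change:+.1%} vs previous build")
    return results

# Hypothetical CPU workloads standing in for render-mode tests.
tests = {
    "forward_plus": lambda: sum(i * i for i in range(100_000)),
    "deferred":     lambda: sorted(range(100_000), key=lambda i: -i),
}

old = profile(tests)                 # baseline run (previous build)
new = profile(tests, baseline=old)   # re-run after an SDK update
```

Because the tests stay identical across builds, any timing shift can be attributed to the SDK or console update rather than to the benchmark itself, which is the point the post makes.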



fatslob-:O said:
Player2 said:

y > x.

It's like saying that a jump of 1.44 miles is shorter than a jump of 1.56 kilometres.

I was speaking in terms of percentages, not pixels ...

Percentages of what?

EDIT - What I get from your post is that an increase of 516k pixels is bigger than an increase of 636k pixels because god passed by and gave you a bigger ruler.
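The pixels-vs-percentages disagreement above is just arithmetic: the 900p-to-1080p jump adds more pixels in absolute terms, but it is a smaller jump in percentage terms than 720p-to-900p. A quick sketch, assuming the standard 16:9 resolutions:

```python
# Standard 16:9 render resolutions and their pixel counts.
RES = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

def pixels(name):
    w, h = RES[name]
    return w * h

# Absolute and relative jumps between successive resolutions.
for lo, hi in [("720p", "900p"), ("900p", "1080p")]:
    diff = pixels(hi) - pixels(lo)
    pct = 100 * diff / pixels(lo)
    print(f"{lo} -> {hi}: +{diff:,} pixels (+{pct:.2f}%)")
```

This prints a +518,400-pixel (+56.25%) jump from 720p to 900p, but a +633,600-pixel (+44.00%) jump from 900p to 1080p - so both posters are right about their own measure.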



Tachikoma said:
walsufnir said:

Well, for the sake of a valid comparison, it should.

But thanks for the inside look. Of course I already googled P2RLA but didn't find anything. But I guess we will find out soon enough :)

If you are allowed to answer, is the engine using deferred rendering or some forward(+) engine? Or neither, mixing both? I'm only asking out of technical interest. (And are you sure you're allowed to publish the picture? ;))

I'm allowed to publish it because I made it, and the log data is from a testing engine I wrote myself for profiling performance changes per update in a controlled environment.

The engine uses multiple render modes and solutions to extensively test various functions of each system. It uses forward+ in its deferred lighting simulations while retaining the AA and filtered transparency benefits of forward rendering, plus lots of generic gamey stuff like physically based shading, BRDFs and G-buffer filtered specular aliasing.

Just a good all-round benchmarking tool that has performed well enough for the company to benefit greatly from it.


Such insight is seldom heard, so many thanks for that! Sounds like a very solid, state-of-the-art engine.

So we can expect games at 1080p30 (locked) and 900p30 (and a little above ;)) locked from this engine, which is what everyone should expect given the consoles' power differences. There might be exceptions, of course, depending on how you use the engine. But somehow I still doubt the DX12 increase ;) (And I still hope there will be more leaks of the XDK; the CHM was very interesting to read.)