KBG29 said:
Tachikoma said:
This is flat-out wrong. The PS4 can handle VIDEO OUTPUT at 4K 30fps, not "run games at" it.
|
The PS4 can do both, and the only reason it does neither is because Sony has not allowed it to.
http://www.eurogamer.net/articles/digitalfoundry-vs-trine-2-on-ps4
This article goes in depth on the limitations Sony has enforced via firmware. The system itself can handle 4K/30fps in many situations. Obviously games like God of War or Uncharted will never render at that level, but there is no reason something like Flower, the PixelJunk games, arcade shooters, and many other less demanding titles could not take advantage of the available power.
Honestly, I think the only reason Sony is F******* us over on 4K is because they want to push Morpheus next year, and Bravia is not a money maker.
|
There is a staggering difference between games like Trine and 95% of games in general, and that difference is actual heavy resource usage. Trine uses a minuscule portion of the console's resources, so using it as proof that the console can run games at 4K is smoke and mirrors. More accurately, it proves the console "can run a small handful of software at 4K". The moment you put any sort of heavy load on the CPU or GPU, that possibility flies out the window, so using it as reasoning for why the PS5 will target 4K for games is flawed.
As I mentioned in an earlier post, the console will be CAPABLE of 4K, but only select titles and indies will actually utilize it. Developers are still going to target 1080p because it means they can do more for less, or, for dodgy development, it means they can afford to optimize less and still achieve passable results.
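To put some rough numbers on "do more for less": a 4K frame has exactly four times the pixels of a 1080p frame, so at the same frame rate the GPU gets roughly a quarter of the time budget per pixel. A quick back-of-the-envelope sketch (plain Python, nothing console-specific):

```python
# Pixel counts for the two render resolutions being argued about.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

ratio = res_4k / res_1080p
print(ratio)  # 4.0 -- every 4K frame costs ~4x the pixel work of a 1080p frame

# At 30 fps the per-frame budget is ~33.3 ms either way, so the
# per-pixel time budget at 4K is a quarter of the 1080p budget.
budget_ms = 1000 / 30
per_pixel_ns_1080p = budget_ms * 1e6 / res_1080p
per_pixel_ns_4k    = budget_ms * 1e6 / res_4k
print(round(per_pixel_ns_1080p, 2), round(per_pixel_ns_4k, 2))
```

That 4x factor is why a lightweight title like Trine can absorb the jump while anything GPU-bound cannot.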
Take the shift from PS3/360 to PS4/XBO: the new consoles are notably more powerful than their predecessors, yet the jump in resolution and framerate is a sporadic one. Not all games are 1080p on PS4, even fewer are 1080p on Xbox One, and 1080p/60fps is rarer still. The likelihood of skipping consistent 1080p/60fps for a higher resolution is virtually zero; at best you'll see developers targeting 1080p and occasionally supersampling from a slightly higher resolution to aid with AA.
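For anyone unfamiliar with the supersampling trick mentioned above: the engine renders internally at a higher resolution, then averages down to 1080p for output, which smooths edges. A minimal sketch of the idea using a simple 2x box downsample (plain Python lists, no engine specifics; real games often use a smaller, non-integer scale factor):

```python
def downsample_2x(image):
    """Box-filter a 2H x 2W grayscale image down to H x W by
    averaging each 2x2 block -- the core idea behind supersampled AA."""
    h, w = len(image) // 2, len(image[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            block = (image[2*y][2*x] + image[2*y][2*x+1] +
                     image[2*y+1][2*x] + image[2*y+1][2*x+1])
            out[y][x] = block / 4.0
    return out

# A hard 4x4 edge (0 vs 1) comes out with a softened intermediate value:
src = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 1, 1, 1],
       [0, 1, 1, 1]]
print(downsample_2x(src))  # [[0.0, 1.0], [0.5, 1.0]]
```

The averaged 0.5 in the output is the anti-aliasing: the jagged edge becomes a gradient instead of a hard step.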
Devs *CAN* choose to output higher than 1080p on PS4, because game-level access takes control of the graphics pipeline, allowing the engine to set the resolution itself. Through this they can output higher resolutions if they desire; the only difference is that the firmware has no 4K option in the console setup, YET. So forcing >1080p through the graphics API could work, but when 95%+ of players would end up with a black screen or "signal out of range" errors, what would be the point?
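The "black screen" risk is basically a mode-negotiation problem: you should only force an output mode the attached display actually advertises. A hedged sketch of that logic (plain Python with made-up mode lists and function names, not any real console SDK; real engines would query the display's EDID data through the platform API):

```python
# Hypothetical illustration only: the mode lists and function below are
# invented for this sketch, not a real PS4 interface.

# (width, height, refresh_hz) modes the engine is willing to output, best first.
ENGINE_MODES = [(3840, 2160, 30), (1920, 1080, 60), (1280, 720, 60)]

def pick_output_mode(display_modes):
    """Return the best engine mode the display also advertises,
    falling back to 720p60 rather than risking a black screen."""
    supported = set(display_modes)
    for mode in ENGINE_MODES:
        if mode in supported:
            return mode
    return (1280, 720, 60)  # safe fallback

# A typical 1080p TV (the 95%+ case the post describes):
tv_1080p = [(1920, 1080, 60), (1280, 720, 60)]
print(pick_output_mode(tv_1080p))   # (1920, 1080, 60)

# A 4K TV that accepts 4K at 30 Hz:
tv_4k = [(3840, 2160, 30), (1920, 1080, 60), (1280, 720, 60)]
print(pick_output_mode(tv_4k))      # (3840, 2160, 30)
```

The point of the post stands: forcing 4K blindly through the API skips this negotiation, which is exactly why the firmware gates it.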