Of course you can do things at 1080p on XB1...
But only with a double buffer (screen tearing).
In Sebbi's response he seems to have forgotten about the 16 shadow-casting lights that generate 64MB of data, times 2 (128MB)... Since the ESRAM is already full with the frame buffer data sets, that becomes quite limiting.
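For a rough sense of where numbers like that come from, here is a back-of-envelope sketch; the 1024×1024 map size, 32-bit depth format, and double-buffering are assumptions for illustration, not anything confirmed about a specific engine.

```python
# Back-of-envelope shadow-map memory estimate. The resolution, texel format,
# and double-buffering below are illustrative assumptions only.

BYTES_PER_TEXEL = 4      # assume a 32-bit depth format
MAP_RES = 1024           # assume a 1024x1024 map per shadow-casting light
NUM_LIGHTS = 16
BUFFER_COPIES = 2        # the "times 2" if the maps are double-buffered

per_map_mb = MAP_RES * MAP_RES * BYTES_PER_TEXEL / (1024 * 1024)
total_mb = per_map_mb * NUM_LIGHTS * BUFFER_COPIES

print(f"{per_map_mb:.0f} MB per map, {total_mb:.0f} MB total")
# -> 4 MB per map, 128 MB total, against 32 MB of ESRAM on the XB1
```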
Let's say for a minute that this equalized the bandwidth; then you hit the real wall, the GPU. It has only so many compute/shader units, limited ROPs and ACEs, all of which are significantly lacking on the XB1...
TL;DR: MS was right when they said their architecture was balanced; the GPU they picked does not need more memory bandwidth, and the extra would probably go unused most of the time.
Justagamer said:
|
Yes, Ryse with optimizations and newer techniques could run at 1080p/60fps if it released in a year or two with the same assets. I just used Ryse as an example since it looks great, has a sub-1080p resolution, and has an unsteady framerate. Nothing against the game, but it's just a good example that ESRAM isn't really the problem.
Also, PS4 is different from XBONE in that it has a single pool of RAM. This thread is about how it's hard to fit the buffer in the fast RAM. XBONE has 32 MB of fast RAM, while PS4 has 8 GB of fast RAM. Devs could theoretically make PS4 games with a 3 GB buffer if they wanted. It really just shows that RAM isn't the problem, though. Thief has a bad framerate because of the GPU, just like XBONE games have bad resolution and framerate because of the GPU. Making GPU code better is what'll really enhance graphics on both systems.
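To make the "fitting the buffer in the fast RAM" part concrete, here is a hypothetical deferred-rendering layout; the four-target G-buffer and 32-bit formats are assumptions for illustration, not any particular game's configuration.

```python
# Rough G-buffer footprint at two resolutions versus 32 MB of ESRAM.
# The 4-target-plus-depth layout and 32-bit formats are illustrative assumptions.

ESRAM_MB = 32

def gbuffer_mb(width, height, targets=4, bytes_per_pixel=4, depth_bytes=4):
    """Approximate size of a deferred G-buffer plus depth buffer, in MB."""
    color = width * height * bytes_per_pixel * targets
    depth = width * height * depth_bytes
    return (color + depth) / (1024 * 1024)

for name, (w, h) in {"900p": (1600, 900), "1080p": (1920, 1080)}.items():
    size = gbuffer_mb(w, h)
    verdict = "fits" if size <= ESRAM_MB else "does NOT fit"
    print(f"{name}: ~{size:.1f} MB -> {verdict} in {ESRAM_MB} MB of ESRAM")
# 900p:  ~27.5 MB -> fits
# 1080p: ~39.6 MB -> does NOT fit (something has to spill to DDR3)
```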
| Justagamer said:
|
Has there been any analysis of Thief on both platforms? Honestly, other than that one site, nobody seems to have noticed it ran better or worse on either console, just that one guy who honestly sounded more like a delusional fanboy than anything (I rarely use that word, but this case is pretty low). I mean, even sites that normally lean heavily on the MS side, like Polygon, did not bring up any difference, while the same site tried to say that Tomb Raider's near-60fps, which can take a dip from time to time, was worse than the never-fluid 30fps version.
I'm not saying there are no dips, I just doubt that they are only on the PS4... and it would be that one stray example, going against every other multi-platform title. Even assuming the PS4 was somehow the only one with the drops, I strongly doubt it would be a trend for future titles.
LemonSlice said:
|
That was pretty funny xD.
| JoeTheBro said: Yes, Ryse with optimizations and newer techniques could run at 1080p/60fps in a year or two. I just used Ryse as an example since it looks great, has a sub-1080p resolution, and has an unsteady framerate. Nothing against the game, but it's just a good example that ESRAM isn't really the problem.
|
Man, you are getting way ahead of yourself here. Ryse would never run at 1080p/60fps with a simple software patch; if it were that simple, believe me, they would have at least released a 1080p patch by now!
Going from 900p to 1080p takes 1.44 times the resources (about the theoretical difference between the XB1 and PS4).
Going from around 24fps (the game is advertised as 30fps, but it dips waaayyy below that quite heavily a lot of the time) to 60 means 2.5 times the resources... The two together mean it would take a machine almost 4 times as powerful (1.44 × 2.5 ≈ 3.6) to do the exact same thing at both the higher resolution and frame rate...
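Spelling that arithmetic out, assuming cost scales linearly with pixel count and frame rate (a simplification, since vertex and CPU work don't scale with resolution):

```python
# Combined cost multiplier for 900p/~24fps -> 1080p/60fps, assuming GPU cost
# scales linearly with pixel count and frame rate (a deliberate simplification).

pixels_900p = 1600 * 900
pixels_1080p = 1920 * 1080

res_factor = pixels_1080p / pixels_900p   # 1.44x more pixels per frame
fps_factor = 60 / 24                      # 2.5x more frames per second
combined = res_factor * fps_factor        # ~3.6x total

print(f"resolution: {res_factor:.2f}x, framerate: {fps_factor:.1f}x, "
      f"combined: {combined:.1f}x")
```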
Now I understand that optimization can do wonders, but I doubt the Crytek guys were lazy: they cut down the length of the game a lot and made the combat and AI extremely simple... Everything was simplified so they could put all their effort into the visual aspect of the game. You should not spread false hope like that to people; there is a hard limit on what the PS4 and Xbox One can do, the ceiling for the XB1 is much lower, and it will not be blown past... The example given by the programmer should really be taken as a warning more than as hope...
The frame buffers in Killzone: SF were reported to use 144MB at times; unless you leave some significant parts of them in the DDR3, you will have to make some visual sacrifices like Forza 5 did, or cut the frame rate by a significant amount like Tomb Raider did (along with texture quality and depth-of-field effects).
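That split between ESRAM and DDR3 amounts to a triage problem; a minimal sketch below, with made-up target names, sizes, and priorities, assuming you keep the most bandwidth-hungry targets in the fast memory and spill the rest.

```python
# Greedy split of render targets between 32 MB of ESRAM and DDR3.
# The target names, sizes, and bandwidth priorities are hypothetical placeholders.

ESRAM_MB = 32

# (name, size in MB, rough bandwidth priority: higher = more important to keep fast)
targets = [
    ("depth/stencil",   8, 10),
    ("color/albedo",    8,  9),
    ("lighting accum",  8,  8),
    ("normals",         8,  7),
    ("shadow maps",    64,  5),
    ("motion vectors",  8,  4),
]

esram, ddr3, used = [], [], 0
for name, size, _priority in sorted(targets, key=lambda t: -t[2]):
    if used + size <= ESRAM_MB:
        esram.append(name)
        used += size
    else:
        ddr3.append(name)   # spills to the slower pool

print(f"ESRAM ({used} MB used):", esram)
print("DDR3:", ddr3)
```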
alabtrosMyster said:
Man, you are getting way ahead of yourself here. Ryse would never run at 1080p/60fps with a simple software patch; if it were that simple, believe me, they would have at least released a 1080p patch by now! Going from 900p to 1080p takes 1.44 times the resources (about the theoretical difference between the XB1 and PS4). Going from around 24fps (the game is advertised as 30fps, but it dips waaayyy below that quite heavily a lot of the time) to 60 means 2.5 times the resources... The two together mean it would take a machine almost 4 times as powerful to do the exact same thing at both the higher resolution and frame rate... Now I understand that optimization can do wonders, but I doubt the Crytek guys were lazy: they cut down the length of the game a lot and made the combat and AI extremely simple... Everything was simplified so they could put all their effort into the visual aspect of the game. You should not spread false hope like that to people; there is a hard limit on what the PS4 and Xbox One can do, the ceiling for the XB1 is much lower, and it will not be blown past... The example given by the programmer should really be taken as a warning more than as hope... The frame buffers in Killzone: SF were reported to use 144MB at times; unless you leave some significant parts of them in the DDR3, you will have to make some visual sacrifices like Forza 5 did, or cut the frame rate by a significant amount like Tomb Raider did (along with texture quality and depth-of-field effects). |
I reworded my post. I wasn't talking about what will happen to the game, but what could have happened in an alternate universe.
Also, as far as I know, I've only heard of Killzone using a 32MB buffer. That was a long time ago, so things could have changed. Do you have a link for the 144MB figure?
alabtrosMyster said:
I'm not saying there are no dips, I just doubt that they are only on the PS4... and it would be that one stray example, going against every other multi-platform title. Even assuming the PS4 was somehow the only one with the drops, I strongly doubt it would be a trend for future titles. |
Well, I was just saying that sloppy code makes a sloppy game. That's all; I'm sure the X1 version is the same or even worse... Too bad, too, I really wanted Thief to be good... I was ready to jump on it.
Native 1080p / more stable framerates / general IQ is going to be a case of time and developer experience; just look at how games improved from initial PS3/X360 launch titles such as COD2, Forza 2, and Resistance: Fall of Man, to later games such as Gears of War 2, God of War 3, and Dirt 2, to late-generation games such as The Last of Us, Halo 4, and Assassin's Creed 4: Black Flag.
It's also going to be a case of developer size and technical resources; behemoth teams such as Ubisoft/Epic Games/343 Industries/Turn 10/Sony Santa Monica/Naughty Dog have far more resources and technical expertise/personnel to spare per project than smaller developers.
Let's remember, the gen has just started for PS4 & Xbox One.

| VitroBahllee said: This must be what that 1080p SDK helps with. I will be interested in seeing if that isn't just talk. |
It's interesting that this is now the third developer since the new SDK rollout at the start of the month to say this. In this case it's a third party correcting a Sony first party, which is rare.