selnor1983 said:
Infinity Ward -
Witcher 3 developer -
"On the PS4 it's very good to have the fast memory. Everyone is really happy about that," Torok said. "We are not at the stage right now to go in and optimise on each platform specifically. We want to make the game and the whole engine run on everything, with all the features and bells and whistles, and then just optimise, optimise, optimise," Torok continued. "I don't see a major power difference. The memory is very different but I already said that before. Pure computation power, if you just measure that, there's no major difference."
Xbox One – ESRAM & 720P – Why It’s Causing A Resolution Bottleneck – Analysis

"32MB of ESRAM was unfortunately the most they could squeeze onto the APU’s die without starting to compromise the number of GCN cores or CPU performance on the Xbox One to the point where having the extra memory would have been a waste of time. Microsoft’s previous console, the Xbox 360, used GDDR3 memory; this ran at 1400MHz on a 128-bit bus. In addition to this 512MB of GDDR3, the Xbox 360 also had 10MB of eDRAM, providing a total of 32GB/s between the GPU and the eDRAM. This memory gave the Xbox 360 basically “free” anti-aliasing along with other graphical effects. With the Xbox One, this isn’t the case – and with many gamers expecting and demanding 1080P for their next-generation titles, it appears that the 32MB of ESRAM is simply insufficient to meet the needs."

Is there anything game developers can do to help get around these issues? Unfortunately, it’s going to require the developers to get used to working around the limitations. One problem developers have spoken about is having to manually flush the ESRAM. Microsoft supplies the Xbox One with its own memory-compressed render targets, very similar to what was already working on the Xbox 360. They argue that by using these render targets, in conjunction with ESRAM and DDR3, developers can work around the ESRAM. The problem is that with today’s multi-platform environments, especially with the rather commanding lead that PCs and the PlayStation 4 already have in terms of ease of development, how easy will this be to implement? It’s far easier to just say “well, let’s aim for 720P for the X1”.
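To make the article's point concrete, here is a quick back-of-the-envelope sketch in Python. The five-target deferred G-buffer layout below is just an assumed example (real engines differ), but the arithmetic shows why uncompressed render targets get tight in 32MB of ESRAM at 1080p while 720p still fits:

```python
# Rough render target budget vs. a 32 MB ESRAM pool.
# The G-buffer layout is a hypothetical example, not any specific engine's setup.

def rt_size_mb(width, height, bytes_per_pixel):
    """Uncompressed size of one render target in megabytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

def gbuffer_total_mb(width, height, targets):
    """Total footprint of a list of (name, bytes_per_pixel) render targets."""
    return sum(rt_size_mb(width, height, bpp) for _, bpp in targets)

# Assumed layout: four 32-bit colour targets plus a 32-bit depth/stencil buffer.
GBUFFER = [
    ("albedo", 4), ("normals", 4), ("material", 4), ("lighting", 4), ("depth", 4),
]

ESRAM_MB = 32

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    total = gbuffer_total_mb(w, h, GBUFFER)
    verdict = "fits in" if total <= ESRAM_MB else "exceeds"
    print(f"{name}: {total:.1f} MB -> {verdict} the {ESRAM_MB} MB ESRAM")

# Output:
# 720p: 17.6 MB -> fits in the 32 MB ESRAM
# 1080p: 39.6 MB -> exceeds the 32 MB ESRAM
```

So under these assumptions a 1080p G-buffer overflows ESRAM and has to be split or swapped against DDR3, while the same layout at 720p fits comfortably. That is exactly the trade the article is describing.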
As it turns out, MS have updated the SDK and taken care of it for the developers. ESRAM, not the GPU, was always the issue, and it's why we have not seen the same res as PS4 so far. Like I said, PS4 will likely be a little better. But even before MS changed the SDK, developers like The Witcher 3 guys (awesome graphics blokes) were saying there's not much of a power difference.
But I'm sure you'll twist this into something. http://www.redgamingtech.com/xbox-one-esram-720p-why-its-causing-a-resolution-bottleneck-analysis/
Not a single bit of that says anything about the GPU gap. Devs are not going to come out and say the PS4 is a lot more powerful; they are trying to sell their game on both systems.
Again, I am still waiting: where does any of that debunk the 40% GPU power advantage? It's PR. Hideo Kojima also said there was little difference... and now his game is 720p on X1, 1080p on PS4. You need to learn PR.
The irony is, The Witcher will be running at a higher res on PS4 than on X1 as well. By then the new SDK will be in full force, but it cannot make up for a weaker GPU. Same with Watch Dogs.