walsufnir said:
|
I don't think you read my statement properly...
ethomaz said:
I thought it won because they tested GPGPU (GPU compute) and the PS4 has 8 ACEs with 64 compute queues while the Xbone has only 2 ACEs with 16 queues... and of course the obvious difference in raw power helps too (18 vs 12 CUs). |
Damn, I meant to say CPU lol. My bad.
ethomaz said:
I thought it won because they tested GPGPU (GPU compute) and the PS4 has 8 ACEs with 64 compute queues while the Xbone has only 2 ACEs with 16 queues... and of course the obvious difference in raw power helps too (18 vs 12 CUs). |
I don't think the 64 queues and ACEs make the big difference here. Those queues will make a difference when there are a lot of different GPU compute tasks contending for the GPU. That's unlikely to be the case in a benchmark like this, where they have said they combined their work into larger tasks.
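To put rough numbers on that (a back-of-envelope sketch; the overhead and per-item costs are completely made up, not measured on either console):

```python
# Hypothetical numbers purely for illustration; the fixed per-dispatch
# overhead and per-item cost are assumptions, not console measurements.
LAUNCH_OVERHEAD_US = 10.0   # fixed cost to submit/sync one compute dispatch
ITEM_COST_US = 0.001        # cost to process one work item

def total_time_us(items, dispatches):
    """Total time when `items` work items are split across `dispatches` jobs."""
    return dispatches * LAUNCH_OVERHEAD_US + items * ITEM_COST_US

items = 1_000_000
for dispatches in (1024, 64, 4):
    t = total_time_us(items, dispatches)
    print(f"{dispatches:5d} dispatches -> {t / 1000:.2f} ms")
# Many tiny dispatches are dominated by launch overhead (and need many
# queues to keep the GPU fed); a few fused dispatches make queue count moot.
```

With work fused into a handful of big dispatches, the extra queues simply have nothing to arbitrate.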
Bandwidth could be the biggest bottleneck. The presentation talks about mapping data between CPU & GPU, so the RAM speed and the coherency features likely make a big difference.
"didn't use the X1's ESRAM" - ESRAM is a limited resource so devs have to choose if they use it for backbuffer or textures or GPU compute data. Can't have it all, since the XB1 has 8GB of main RAM and only 32MB of ESRAM, majority of a game's data is resident in RAM and needs to use RAM bandwidth to copy to/from ESRAM.
Captain_Tom said:
SoM, for example, was running at 1080p with high settings while the X1 version was running at 900p with medium settings. That is pretty close to the difference (minus the GPGPU advantage no one has completely tackled yet). |
Yeah, I mostly agree with you, but they are also still using middleware like DirectX and other stuff too. That said, when a developer like Naughty Dog is able to code close to the metal, I hope every developer who truly wants to improve learns from Naughty Dog in the future.
HollyGamer said:
Yeah, I mostly agree with you, but they are also still using middleware like DirectX and other stuff too. That said, when a developer like Naughty Dog is able to code close to the metal, I hope every developer who truly wants to improve learns from Naughty Dog in the future. |
Hey, don't get me wrong - I completely agree that the vast majority of devs have been lazy with the extra performance. Heck, I will use ACU as an EXCELLENT example:
-Ubisoft says that the CPUs are not fast enough to handle 60 FPS with so many people on screen. This is true if you run all of that work on the CPU alone. However, large numbers of similar calculations are great for GPGPU. Hmmmmm, oh yeah, Ubisoft's hordes of stupid civilians would be perfect for GPGPU (see the sketch after this list). But they know that the X1's paltry GPGPU capabilities wouldn't make much of a difference, and so they were too lazy to take full advantage of the PS4 hardware.
-Yes, Naughty Dog and Guerrilla Games will come out and make ACU look practically last gen in comparison. Don't kid yourself: third parties never want their games to look bad on even one platform, and they will change their ways to catch up. Just look at how ND and GG forced DICE to make BF3 take full advantage of the PS3. It had higher-res textures, better lighting, more particle effects, and better AA than the 360 version once they used the Cell's SPEs effectively... And the PS3 was only 50% stronger than the 360 at best, so imagine what will happen now...
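To make the civilian-horde point concrete, here is a minimal sketch of why that kind of work fits GPU compute: thousands of agents all running the same arithmetic. Everything here is hypothetical (the agent count, the steering rule, the variable names), and NumPy on the CPU just stands in for what a real compute kernel would do:

```python
import numpy as np

# Hypothetical crowd state: thousands of independent agents, one update rule.
N = 10_000
rng = np.random.default_rng(0)
pos = rng.uniform(-100, 100, size=(N, 2)).astype(np.float32)
vel = rng.uniform(-1, 1, size=(N, 2)).astype(np.float32)
goal = np.zeros(2, dtype=np.float32)  # everyone drifts toward one point

def step(pos, vel, dt=1 / 30):
    """One simulation tick for all agents at once - the same arithmetic
    per agent, which is exactly the shape of work a GPGPU kernel wants."""
    to_goal = goal - pos
    dist = np.linalg.norm(to_goal, axis=1, keepdims=True) + 1e-6
    vel = 0.9 * vel + 0.1 * (to_goal / dist)  # simple steering blend
    return pos + vel * dt, vel

pos, vel = step(pos, vel)
print(pos[:3])  # on a console this whole batch would be one compute dispatch
```

The per-agent loop a CPU would run serially becomes one wide batch, which is why a GPGPU-heavy console benefits more from this kind of workload.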
Captain_Tom said:
|
Why wouldn't they? And where do they say they didn't? Any detail on the Xbone test is missing. They go from DX11 to PS4 optimization, and the talk is all about PS4 details.
ICStats said:
This benchmark gives the performance for "5 milliseconds", in other words it is peak CPU performance without system overhead, and it is in line with the XB1's faster CPU clock speed. The reason the XB1 fell short overall is system overhead (Kinect voice recognition, multiple OSes, Snap, etc.). |
What? So you're saying the system overhead fluctuates, let's say, every ten ms? That's most likely not true, and I would be glad to be shown otherwise with data.
walsufnir said:
|
I wouldn't, because everyone knows that the ESRAM is ALWAYS going to be used by the GPU and not the CPU.
walsufnir said: Why wouldn't they? And where do they say they didn't? Any detail on the Xbone test is missing. They go from DX11 to PS4 optimization, and the talk is all about PS4 details. |
I guess that is why you need the eSRAM for the framebuffer... most devs will use the eSRAM for that, and it is already not big enough for a full 1080p framebuffer with post-processing stuff.
They don't have space left in eSRAM for GPU compute data.
No dev will sacrifice the framebuffer to use the eSRAM for anything else.
Gotta love the Cell.
Not surprising really. As usual, exclusives will be the ones to show the gap.