
Forums - Sony - PS4 GPU is HD7870 confirmed

gmcmen said:
wow this means the ps4 is even more powerful than previously thought.

To be serious it is... after release the sites discovered:

  • TRUE AUDIO Support
  • 256MB DDR3 for the ARM CPU
  • 32MB Flash for unknown reasons
  • CPU clock over 1.6GHz (possibly 1.8GHz)


freedquaker said:

Another good point is that the HD7870 is the ABSOLUTE MINIMUM performance yardstick you can expect from a game. In other words, with no optimization whatsoever, developers can expect at least 20-30% higher performance than on a PC. It also means that 90% of XB1 exclusives would run equally well or better on PS4 without any additional optimization.

I am loving this generation as it is much more transparent to us, damn it!


Pretty much. 

Of course, MS can pay for exclusives, and if their teams are good enough they can write brilliant code that beats crappy code on PS4.

Not that MS has any sort of monopoly on brilliant code.

But there's nothing unique about the XBone that enables a better result from the same code; nothing that PS4 couldn't execute as well or better.

XBone's ESRAM architecture itself imposes a limitation that reduces developers' software design choices: everything has to fit into that 32MB window.
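To put rough numbers on that 32MB window, here's a back-of-envelope sketch. The four-target G-buffer layout is just a hypothetical example for illustration, not any specific engine's actual setup:

```python
# Rough check of why 32MB of ESRAM is tight for deferred rendering at 1080p.
# The buffer layout is a hypothetical example, not any real engine's G-buffer.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # RGBA8 color, or 32-bit depth/stencil

def buffer_mb(count, bpp=BYTES_PER_PIXEL):
    """Size in MB of `count` full-screen buffers at `bpp` bytes per pixel."""
    return count * WIDTH * HEIGHT * bpp / (1024 * 1024)

gbuffer = buffer_mb(4)   # four RGBA8 render targets, ~31.6 MB
depth = buffer_mb(1)     # one depth/stencil buffer, ~7.9 MB
total = gbuffer + depth  # ~39.6 MB

print(f"G-buffer {gbuffer:.1f} MB + depth {depth:.1f} MB = {total:.1f} MB")
print("Fits in 32 MB ESRAM?", total <= 32)
```

Even this modest layout overshoots 32MB before you count shadow maps or intermediate buffers, which is exactly the kind of design constraint being described.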

And unlike the PS3's SPUs, the ACE/GPGPU approach is entirely translatable from current GPU programming approaches (even if it is ahead of the pack). In fact XBone's own, vastly fewer, ACEs will be utilized similarly to PS4's, although the number of them certainly matters here.

(So it's foreseeable that we may see cross-platform games where PS4 does 2-10x the amount of the same GPGPU task as XBone, e.g. the number of physics objects.)
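For reference, the commonly reported compute front-end figures are 8 ACEs with 8 queues each for PS4 versus 2 ACEs with 8 queues each for XBone. A quick sketch of that ratio (queue count alone doesn't determine task throughput, since the CUs do the actual work, so treat this as the front-end ratio only):

```python
# Commonly reported async-compute queue counts (treat as approximate).
QUEUES_PER_ACE = 8
ps4_aces, xb1_aces = 8, 2

ps4_queues = ps4_aces * QUEUES_PER_ACE  # 64 compute queues
xb1_queues = xb1_aces * QUEUES_PER_ACE  # 16 compute queues

print(ps4_queues, xb1_queues, ps4_queues / xb1_queues)  # 64 16 4.0
```

So the front end alone is a 4x difference; the 2-10x range quoted above is speculation about how that might play out with CU counts factored in.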



Why isn't anyone able to tell the CPU/GPU clock yet?



RIP ps3, xbox360.

welcome home ps4 and X1

ethomaz said:
gmcmen said:
wow this means the ps4 is even more powerful than previously thought.

To be serious it is... after release the sites discovered:

  • TRUE AUDIO Support
  • 256MB DDR3 for the ARM CPU
  • 32MB Flash for unknown reasons
  • CPU clock over 1.6GHz (possibly 1.8GHz)

what about the news here about the gpu being hd7870



I do think that the GPGPU angle is also coming off badly for XBone when you consider the number of CUs.
If any GPGPU task would interfere with dedicating those fully to conventional graphics, that reduces the graphics experience.
PS4 just has that 50% headroom it can fully dedicate to other tasks before it reaches the same level as XBone's CUs.
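The "50% headroom" claim is just arithmetic on the commonly cited CU counts (18 for PS4, 12 for XB1):

```python
# Commonly cited compute-unit counts for each console's GPU.
PS4_CUS, XB1_CUS = 18, 12

print(PS4_CUS / XB1_CUS)  # 1.5 -> PS4 has 50% more CUs

spare = PS4_CUS - XB1_CUS  # 6 CUs of headroom
print(spare)               # PS4 could devote 6 CUs to GPGPU work
                           # and still match XB1's 12 for graphics
```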



mutantsushi said:
I do think that the GPGPU angle is also coming off badly for XBone when you consider the number of CUs.
If any GPGPU task would interfere with dedicating those fully to conventional graphics, that reduces the graphics experience.
PS4 just has that 50% headroom it can fully dedicate to other tasks before it reaches the same level as XBone's CUs.


MS disabled 2 CUs in favor of an increased clock speed. Something tells me the GPU in the PS4 is overpowered compared to its CPU. Not a bad thing, but it means the X1 GPU isn't necessarily "50%" less capable.
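The clock point can be quantified with the standard GCN peak-FP32 formula (CUs x 64 lanes x 2 ops per FMA x clock), using the commonly cited 800MHz (PS4) and 853MHz (XB1) GPU clocks:

```python
def peak_tflops(cus, clock_hz):
    """Theoretical FP32 peak for a GCN GPU: CUs x 64 lanes x 2 ops (FMA)."""
    return cus * 64 * 2 * clock_hz / 1e12

ps4 = peak_tflops(18, 800e6)  # ~1.84 TFLOPS
xb1 = peak_tflops(12, 853e6)  # ~1.31 TFLOPS

print(f"{ps4:.2f} vs {xb1:.2f} TFLOPS, ratio {ps4 / xb1:.2f}")
```

So on raw shader throughput the gap comes out closer to ~41% than a flat 50%, which is the point being made here.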

gmcmen said:

what about the news here about the gpu being hd7870

Spec wise = HD 7870
GCN wise = GCN 1.1 (the same as the R9 290X, R7 260)



mutantsushi said:
Trunkin said:
Really? But its performance in multiplats (namely BF4) is perilously close to that of an HD7870 rig. With optimizations I would've thought it'd easily beat a Windows HD7870 PC, even at launch, but that doesn't seem to be the case. Why is that?

Yes, without optimizations it is on par with an HD7870. Especially in a cross-platform title like BF4, they just aren't making those optimizations at this point. To make them, they would need roughly 10x the GPGPU command usage of the XBone, and clearly they are not going to do that amount of platform-specific optimization with a unique codepath (one that even most PCs couldn't handle) at this stage in the game.

With Sony's approach (the ACEs), they have chosen something that IS in line with the general state of the art in graphics programming, though. Even if the number of ACEs is beyond anything seen outside top-of-the-line PC GPUs, the same sorts of programs can be, and are, written for ACEs. So there is a larger pool of programmers out there ready and able to make use of that, and what they learn for PS4 will be applicable to future PC gaming.

PS4's unified memory should further enable synergy between ACE/GPGPU work and threads running on the CPU, along with the general benefits of fixed hardware, which XBone also shares, albeit with reduced CUs, etc. And informed opinion seems to be that Sony's API is more powerful than MS'.

So you're saying that, thanks to this GPGPU, both the CPU and GPU will be able to execute tasks considerably more efficiently and effectively than either would be able to separately in a similarly specced PC, and that, with more optimizations, that gap will only increase with time? Also, that the PS4 is superior to the XBO in every conceivable way, even more so than we originally thought, but the gap won't be as noticeable for multiplats because devs won't be willing to spend the extra time to reach the PS4's full potential?

(I'm a bit of an ignoramus on this subject...)



ethomaz said:

gmcmen said:

what about the news here about the gpu being hd7870

Spec wise = HD 7870
GCN wise = GCN 1.1 (the same as the R9 290X, R7 260)


Sony deserves to sell 150 million for giving us such amazing hardware compared to the competition.



ethomaz said:

gmcmen said:

what about the news here about the gpu being hd7870

Spec wise = HD 7870
GCN wise = GCN 1.1 (the same as the R9 290X, R7 260)


And the Radeon 7790, just with more CUs and ACE units, among other things.
It still won't compete with a Radeon 290 or 290X, or even the 280 or 280X, when Mantle is released, but it's certainly better than the Render-Output-Pipeline-starved Xbox One.

This is just confirmation of what we knew earlier, basically, but due to the clocks its actual performance is around the level of a Radeon 7850. I would personally peg it between the 7850 and 7870, which is on the high end of the mid-range PC discrete GPU market.




www.youtube.com/@Pemalite