
HD console graphics performance comparison charts

JazzB1987 said:

Memory bandwidth not including eDRAM makes exactly 0 sense.

It's like saying 360 vs X1 CPU performance not including more than 3 cores.

The eDRAM is not included because you can't just add its bandwidth to that of the main memory. Sure, developers can use it to partly make up for slower main memory, but there is no way to represent that in the graphs. Even with eDRAM, faster main memory still benefits graphics performance. In comparable PC graphics cards, for example, the bandwidth difference between the PS4 and XOne would give the PS4 20-30% higher framerates. eDRAM will make up for some of that, but not all of it.

And that's why it's really important to note that although the Wii-U has 2GB of RAM, it runs at really low speeds. 32MB of embedded RAM cannot make up for that (and the XOne's embedded memory is also about 3 times faster than that used in the Wii-U).
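To put rough numbers on the bandwidth gap being discussed, here is a quick Python sketch using the commonly cited peak figures. These are theoretical peaks, not measured throughput, and the embedded-memory numbers in particular are estimates; the helper name is just illustrative.

# Theoretical peak bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# The figures below are the commonly cited specs/estimates; treat them as rough.
def peak_gb_per_s(mts, bus_bits):
    return mts * (bus_bits / 8) / 1000  # GB/s

print(peak_gb_per_s(5500, 256))  # PS4 GDDR5: ~176 GB/s
print(peak_gb_per_s(2133, 256))  # XOne DDR3: ~68 GB/s main memory
print(peak_gb_per_s(1600, 64))   # Wii U DDR3: ~12.8 GB/s main memory
# The XOne's 32 MB of eSRAM adds very high bandwidth on top (figures of roughly
# 100-200 GB/s are quoted), but only for data that fits in those 32 MB, which is
# why it can't simply be added to the main-memory number in a chart.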




(Delete, misread)



Porcupeth said:
Squeezol said:

And still no 1080p. Maybe it was just unreasonable for me to think that 1080p/30FPS would be a standard this gen (except for the Wii U, of course), but whatever. I'd really rather have 1080p than extra effects and whatnot.


All Sony exclusives are 1080p. Don't blame the PS4 for third parties' incompetence :)

Also, not sure why you mention the Wii U... several of its games are 720p/30fps, including exclusives.

Lol no, Wii U exclusives (the Nintendo ones) don't run at 30 FPS:

New Super Mario U - 720p@60fps
Nintendo Land - 720p@60fps
Pikmin 3 - 720p@60fps
The Legend of Zelda Wind Waker HD - 1080p@60fps
Mario Kart 8 - 1080p@60fps

Anyway, going forward I expect third-party PS4/X1 games to adopt something like 900p on PS4 and 792p on X1, to take more advantage of next-gen stuff and to keep visual parity effects/animation-wise.
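As a rough back-of-the-envelope comparison of what those resolutions actually cost, here is a small Python sketch. It assumes 792p means the commonly reported 1408x792 framebuffer; the numbers are illustrative pixel counts only, not a performance measurement.

# Pixel counts for the resolutions discussed above, relative to full 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "792p": (1408, 792),   # assumed 1408x792, as commonly reported
    "720p": (1280, 720),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} Mpixels ({pixels / base:.0%} of 1080p)")

So 900p pushes roughly 70% of the pixels of 1080p, and 792p roughly 54%, which is the kind of gap that lets both versions keep the same effects and animation.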



ryuendo89 said:
Porcupeth said:
Squeezol said:

And still no 1080p. Maybe it was just unreasonable for me to think that 1080p/30FPS would be a standard this gen (except for the Wii U, of course), but whatever. I'd really rather have 1080p than extra effects and whatnot.


All Sony exclusives are 1080p. Don't blame the PS4 for third parties' incompetence :)

Also, not sure why you mention the Wii U... several of its games are 720p/30fps, including exclusives.

Lol no, Wii U exclusives (the Nintendo ones) don't run at 30 FPS:

New Super Mario U - 720p@60fps
Nintendo Land - 720p@60fps
Pikmin 3 - 720p@60fps
The Legend of Zelda Wind Waker HD - 1080p@60fps
Mario Kart 8 - 1080p@60fps

Anyway, going forward I expect third-party PS4/X1 games to adopt something like 900p on PS4 and 792p on X1, to take more advantage of next-gen stuff and to keep visual parity effects/animation-wise.


Pikmin 3 and ZombiU are 720p/30fps

 

Oh, and Wind Waker HD is 1080p/30fps, not 60fps.



curl-6 said:
AnthonyW86 said:
JazzB1987 said:

Memory bandwidth not including eDRAM makes exactly 0 sense.

It's like saying 360 vs X1 CPU performance not including more than 3 cores.

The eDRAM is not included because you can't just add its bandwidth to that of the main memory. Sure, developers can use it to partly make up for slower main memory, but there is no way to represent that in the graphs. Even with eDRAM, faster main memory still benefits graphics performance. In comparable PC graphics cards, for example, the bandwidth difference between the PS4 and XOne would give the PS4 20-30% higher framerates. eDRAM will make up for some of that, but not all of it.

And that's why it's really important to note that although the Wii-U has 2GB of RAM, it runs at really low speeds. 32MB of embedded RAM cannot make up for that (and the XOne's embedded memory is also about 3 times faster than that used in the Wii-U).

I doubt you have a legitimate source for that.

It's more or less true that the CPU is responsible, not the memory speed. Even so, the X1 CPU's communication with RAM is 30 GB/sec vs the PS4's 20 GB/sec. I call it bad optimisation, since the PS4 is way easier to develop for. Look at Wii U third-party games: almost double the GFLOPS of the PS3 GPU-wise, and it still struggles with framerates because of its CPU. The X1 CPU is well known to be an 8-core @ 1.75 GHz and the PS4 an 8-core @ 1.6 GHz, confirmed by both Sony and M$. Time will not give graphics, but it will give framerates to the X1, I suspect.

In fact, 32 MB of eDRAM can make up for it, since, as in the X1, the Wii U's eDRAM is built into the same chip. It would be one thing if the eDRAM/eSRAM were separate chips; then yes, they would need to connect to each other from chip A to B. But since the Wii U's eDRAM is built into the same chip as the GPU, it can be used without delay every time it's needed.




That info is wrong; the PS3 GPU has 176 GFLOPS, not 192.
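For what it's worth, figures like 176 or 192 GFLOPS usually come from the same simple formula: shader ALUs x operations per clock x clock speed. Here is a hedged Python sketch; the unit counts below are commonly cited estimates, not official specs, and the RSX number in particular depends on what you choose to count.

# GFLOPS = shader ALUs x ops per clock x clock speed (GHz).
def gflops(alus, ops_per_clock, clock_ghz):
    return alus * ops_per_clock * clock_ghz

# Wii U "Latte" GPU: often estimated at 160 ALUs @ 550 MHz, 2 ops/clock (MADD).
print(gflops(160, 2, 0.55))  # -> 176.0
# PS3 RSX: quoted figures vary (176, 192, or higher) depending on whether vertex
# shaders and non-MADD operations are counted, hence the disagreement above.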



ryuendo89 said:
curl-6 said:

I doubt you have a legitimate source for that.

It's more or less true that the CPU is responsible, not the memory speed. Even so, the X1 CPU's communication with RAM is 30 GB/sec vs the PS4's 20 GB/sec. I call it bad optimisation, since the PS4 is way easier to develop for. Look at Wii U third-party games: almost double the GFLOPS of the PS3 GPU-wise, and it still struggles with framerates because of its CPU. The X1 CPU is well known to be an 8-core @ 1.75 GHz and the PS4 an 8-core @ 1.6 GHz, confirmed by both Sony and M$. Time will not give graphics, but it will give framerates to the X1, I suspect.

In fact, 32 MB of eDRAM can make up for it, since, as in the X1, the Wii U's eDRAM is built into the same chip. It would be one thing if the eDRAM/eSRAM were separate chips; then yes, they would need to connect to each other from chip A to B. But since the Wii U's eDRAM is built into the same chip as the GPU, it can be used without delay every time it's needed.


We have no confirmation of the PS4's CPU clock speed, much less from Sony... it is rumoured to be between 1.6 and 2 GHz, and possibly equal to the X1's.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

curl-6 said:
AnthonyW86 said:
JazzB1987 said:

Memory bandwidth not including eDRAM makes exactly 0 sense.

It's like saying 360 vs X1 CPU performance not including more than 3 cores.

The eDRAM is not included because you can't just add its bandwidth to that of the main memory. Sure, developers can use it to partly make up for slower main memory, but there is no way to represent that in the graphs. Even with eDRAM, faster main memory still benefits graphics performance. In comparable PC graphics cards, for example, the bandwidth difference between the PS4 and XOne would give the PS4 20-30% higher framerates. eDRAM will make up for some of that, but not all of it.

And that's why it's really important to note that although the Wii-U has 2GB of RAM, it runs at really low speeds. 32MB of embedded RAM cannot make up for that (and the XOne's embedded memory is also about 3 times faster than that used in the Wii-U).

I doubt you have a legitimate source for that.

Actually wait, I misread that as Xbox 360, my bad.



The PS4's CPU is 1.6 GHz at its BASE clock speed. Similar to most modern PC CPUs (and unlike the XB1 CPU), it increases its clock speed automatically when required (gaming). Different companies give it different names, like Turbo mode, etc. My Intel CPU is 3.4 GHz but goes to 3.8 GHz when gaming. The thing is, we don't know what the PS4's CPU jumps to when it does that. Could be 1.8 or 2.0 GHz.



Squeezol said:

And still no 1080p. Maybe it was just unreasonable for me to think that 1080p/30FPS would be a standard this gen (except for the Wii U, of course), but whatever. I'd really rather have 1080p than extra effects and whatnot.

720p wasn't even the standard at the beginning of the 7th gen, but it got there. I think 1080p/30fps will be the standard eventually.


