
HD console graphics performance comparison charts

Porcupeth said:
Squeezol said:
Porcupeth said:
Squeezol said:

And still no 1080p. Maybe it was just unreasonable for me to think that 1080p 30FPS would be a standard this gen (except for Wii U, of course), but whatever. I'd really rather have 1080p than extra effects and whatnot.


All Sony exclusives are 1080p. Don't blame PS4 for third party's incompetence :)

also not sure why you mention Wii U...several of its games are 720p/30fps, including exclusives.

Yeah, and that's better than 1080p 30FPS in my opinion, but the point is that it isn't 1080p.


I'm confused. You said you'd rather have 1080p but that 720p/30fps is better than 1080p/30fps? Are you sure there isn't a typo in your posts?

Wait, that indeed doesn't seem right at all. -.- I'm confused as well now, oops. Sorry. I expected 1080p 30FPS on PS4/XboxOne but I didn't expect it on the Wii U. That's what I meant. I'm not disappointed by 720p 60FPS or even 720p 30FPS on the Wii U because of how far behind the Wii was. I am disappointed about the resolution and framerate of a lot of games on PS4/XboxOne though.




Yes... don't use unknown numbers... ignore that the Wii U has games with graphics close to PS360, and just use the info that MK8 is 1080p/60fps, so then it must be better than X1/PS4.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

curl-6 said:
AnthonyW86 said:
curl-6 said:

You don't even know what the Wii U GPU is, nobody does except devs/designers under NDA. Any public specs on it are speculative and therefore illegitimate.

We know more than enough about Wii-U's Latte chip; it's pretty much an AMD HD 5550 with modifications to make the chip smaller, plus the eDRAM. The chip has been taken apart on numerous occasions, and the simple fact is that the power draw does not allow for anything significantly more powerful. The memory of the PS4 alone uses more power than the entire Wii-U GPU.

You don't have confirmed numbers or specs, so the graph is meaningless.

Word of advice; don't use wikipedia as a source; the only reference for its Wii U stats is a Eurogamer article which itself was mere speculation.

 

You sound like a college professor. Why do simple graphs based on numbers seem to bother you so much? Just curious.



Thanks jlmurph!

curl-6 said:
AnthonyW86 said:
curl-6 said:

You don't even know what the Wii U GPU is, nobody does except devs/designers under NDA. Any public specs on it are speculative and therefore illegitimate.

We know more than enough about Wii-U's Latte chip; it's pretty much an AMD HD 5550 with modifications to make the chip smaller, plus the eDRAM. The chip has been taken apart on numerous occasions, and the simple fact is that the power draw does not allow for anything significantly more powerful. The memory of the PS4 alone uses more power than the entire Wii-U GPU.

You don't have confirmed numbers or specs, so the graph is meaningless.

Word of advice; don't use wikipedia as a source; the only reference for its Wii U stats is a Eurogamer article which itself was mere speculation.

The die photos have been looked at countless times, and the specs given are actually the maximum possible; if anything, they exceed what is possible with the given power draw. Even AMD's newest R5 line has lower performance-per-watt figures, and those chips are considerably smaller (and that's not even including the power draw of the eDRAM). Specifications like clock speed are confirmed.
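As a rough illustration of that power-draw argument, here is a back-of-the-envelope sketch in Python; the wattage and efficiency figures below are invented purely for illustration, not confirmed Wii U (or PS4) specifications.

# Back-of-the-envelope: what throughput could a GPU reach within a given
# power budget at a given efficiency? All numbers are illustrative
# assumptions, not confirmed console specifications.

def gflops_ceiling(power_budget_watts: float, gflops_per_watt: float) -> float:
    """Theoretical upper bound on throughput for a given power budget."""
    return power_budget_watts * gflops_per_watt

assumed_gpu_power_watts = 15.0   # hypothetical share of the console's total draw
assumed_gflops_per_watt = 12.0   # hypothetical efficiency for a chip of that era

print(gflops_ceiling(assumed_gpu_power_watts, assumed_gflops_per_watt))  # -> 180.0 GFLOPS with these inputs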



game_on said:
AnthonyW86 said:
game_on said:
AnthonyW86 said:
curl-6 said:
Legitimate source or no credibility.

http://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

http://en.wikipedia.org/wiki/Nvidia_gpu

http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer%27

All specifications of any GPU are listed there (PS4 and Xbox One have been updated not too long ago). You can check any tech site; they all list the same numbers.

Ok, you made the graphs yourself I assume.

Absolutely, we are at VGChartz remember.

But the PS3 has a CISC processor and the Xbox 360 a RISC one. So comparing those two on operations per second and GFLOPS seems a bit strange to me. Besides, it was always common knowledge that the PS3 is more powerful than the Xbox 360, so why show quite the opposite 8 years later?

I'm not a technical expert (far from it), but I think that if it were this easy to compare consoles on a scale like this, graphs like this would show up more often.


The PS3 as a total package is stronger than the X360, but taking just the graphics processor, the X360 is stronger...



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

BreedinBull said:

curl-6 said:

 You don't have confirmed numbers or specs, so the graph is meaningless.

Word of advice; don't use wikipedia as a source; the only reference for its Wii U stats is a Eurogamer article which itself was mere speculation.

 

You sound like a college professor. Why do simple graphs based on numbers seem to bother you so much? Just curious

Treating speculation as fact perpetuates misinformation and ignorance.



AnthonyW86 said:

The die photos have been looked at countless times, and the specs given are actually the maximum possible; if anything, they exceed what is possible with the given power draw. Even AMD's newest R5 line has lower performance-per-watt figures, and those chips are considerably smaller (and that's not even including the power draw of the eDRAM). Specifications like clock speed are confirmed.

The people discussing the die photos can't even agree on the specs. It's not an off-the-shelf part, and only Nintendo, AMD, and developers under NDA know what customizations may have been made. Given that the entire system is designed with a fetish for minimizing power draw, it's very possible the GPU was redesigned to be as power-efficient as possible.



game_on said:
AnthonyW86 said:
game_on said:
AnthonyW86 said:
curl-6 said:
Legitimate source or no credibility.

http://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units

http://en.wikipedia.org/wiki/Nvidia_gpu

http://en.wikipedia.org/wiki/RSX_%27Reality_Synthesizer%27

All specifications of any GPU are listed there (PS4 and Xbox One have been updated not too long ago). You can check any tech site; they all list the same numbers.

Ok, you made the graphs yourself I assume.

Absolutely, we are at VGChartz remember.

But the PS3 has a CISC processor and the Xbox 360 a RISC one. So comparing those two on operations per second and GFLOPS seems a bit strange to me. Besides, it was always common knowledge that the PS3 is more powerful than the Xbox 360, so why show quite the opposite 8 years later?

I'm not a technical expert (far from it), but I think that if it were this easy to compare consoles on a scale like this, graphs like this would show up more often.

It's just for general comparison, and I added the PS3 and X360 to show which one the Wii-U is closer to. The PS3 does indeed have a lower GFLOPS number than the X360, but so do all Nvidia GPUs compared to AMD's to this day, and indeed that does not mean it's weaker on all fronts. The texel fillrate number confirms this, since it's higher on PS3 (that's why I added those). Besides the PS3, the rest is all based on AMD hardware though.
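For anyone wondering where GFLOPS and texel numbers in charts like this come from, here is a minimal sketch of the standard formulas (shader ALUs x clock x FLOPs per ALU per cycle, and TMUs x clock); the example inputs are placeholders, not any console's confirmed specs.

# Standard formulas behind theoretical GPU comparison numbers.
# The example inputs are placeholders, not any particular console's specs.

def theoretical_gflops(shader_alus: int, clock_mhz: float, flops_per_alu_per_cycle: int = 2) -> float:
    """Peak throughput: ALUs * clock * FLOPs per ALU per cycle (2 assumes one fused multiply-add)."""
    return shader_alus * clock_mhz * flops_per_alu_per_cycle / 1000.0

def texel_fillrate_gtexels(tmus: int, clock_mhz: float) -> float:
    """Peak texel fillrate: texture mapping units * clock."""
    return tmus * clock_mhz / 1000.0

print(theoretical_gflops(160, 550))    # 176.0 GFLOPS for a hypothetical 160-ALU part at 550 MHz
print(texel_fillrate_gtexels(8, 550))  # 4.4 GTexels/s for a hypothetical 8 TMUs at 550 MHz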



curl-6 said:
AnthonyW86 said:

The die photos have been looked at countless times, and the specs given are actually the maximum possible; if anything, they exceed what is possible with the given power draw. Even AMD's newest R5 line has lower performance-per-watt figures, and those chips are considerably smaller (and that's not even including the power draw of the eDRAM). Specifications like clock speed are confirmed.

The people discussing the die photos can't even agree on the specs. It's not an off-the-shelf part, and only Nintendo, AMD, and developers under NDA know what customizations may have been made. Given that the entire system is designed with a fetish for minimizing power draw, it's very possible the GPU was redesigned to be as power-efficient as possible.

I agree, and that's what I said in the earlier post: the modifications made were mostly to make it more energy efficient. They are not going to boost performance.



AnthonyW86 said:
curl-6 said:
AnthonyW86 said:

The die photos have been looked at countless times, and the specs given are actually the maximum possible; if anything, they exceed what is possible with the given power draw. Even AMD's newest R5 line has lower performance-per-watt figures, and those chips are considerably smaller (and that's not even including the power draw of the eDRAM). Specifications like clock speed are confirmed.

The people discussing the die photos can't even agree on the specs. It's not an off-the-shelf part, and only Nintendo, AMD, and developers under NDA know what customizations may have been made. Given that the entire system is designed with a fetish for minimizing power draw, it's very possible the GPU was redesigned to be as power-efficient as possible.

I agree, and that's what I said in the earlier post: the modifications made were mostly to make it more energy efficient. They are not going to boost performance.

If it is more power efficient, though, its performance would no longer correlate with power draw in the same way it does for a standard part.
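To make that last point concrete, here is a tiny sketch; both efficiency figures are invented purely for illustration. The same power budget supports very different throughput depending on how efficient the part is.

# Same power budget, different efficiency -> different achievable throughput.
# Both efficiency figures below are invented for illustration only.
power_budget_watts = 15.0
for label, gflops_per_watt in [("stock part", 10.0), ("power-optimised redesign", 16.0)]:
    print(f"{label}: ~{power_budget_watts * gflops_per_watt:.0f} GFLOPS at {power_budget_watts} W")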