
HD console graphics performance comparison charts

Ucell said:
DonFerrari said:
Ucell said:
walsufnir said:
Ucell said:
walsufnir said:
I still believe the Wii U is way more capable than the PS3/360, yes. But aside from the rumors about its specs, just look at its power consumption. Assuming the Wii U is a powerful machine by today's standards is laughable, and even the Xbone and (to a lesser extent) the PS4 are laughable by those standards. Yes, the Wii U is also designed with power-saving parts, but that won't work wonders.
But why even bother? The Wii U has way, way more power than the Wii, so it's clearly an upgrade for Nintendo gamers. Just accept that big N lives in their own generation and everything is fine :)

Did you just imply the PS4 is less powerful than the Xbone? Well then you're wrong, my friend; the opposite has been proven countless times by both the specs and the games.

Oh, the defense force is on the way... scouring the internet for anyone who might dare suggest the Xbone is more powerful than the PS4, god forbid! I can't believe I am arguing this, really. EVERYBODY knows the specs; we've read them a THOUSAND times at least.

Read what I wrote: by today's standards [...] laughable, [...] (to a lesser extent) PS4... Do you need even more explanation? Is your heart beating at a normal rate now? *sigh*

Maybe you could have just stated what you meant more clearly instead of making meaningless attacks.

I could be a lot more rude, but then I'd be banned.

Man, I'm a big Sony fanboy and I know Puggsly is a defender of MS, but you are wrong here. You were so eager to attack him that you didn't notice he said that compared to a high-end PC all consoles are laughable, the Wii U the most laughable and the PS4 less laughable than the X1. You are the one making meaningless attacks, not him; he stated facts.



I wasn't attacking him; I was just asking what he meant, in a kind, friendly manner.

Btw, English is not my native language, and it's not even the main language used in my country, so I do occasionally have problems understanding it.

My bad for misinterpreting you then, sorry. But your wording looked like it. No problem. Still, some people here could certainly write in an easier-to-understand way.





duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

the-pi-guy said:
amak11 said:
Zekkyou said:

I wasn't planning on replying to anything else in this thread but that's just too hilarious.

Anyway, the NDA simply means no official source can comment on the Wii U's specifications. It doesn't stop outside sources from buying a wattage meter and measuring the energy flow of the Wii U. It's not exactly rocket science.

As I said to bigtakilla, anyone who believes AMD has created the most energy-efficient GPU on the planet and then not monetized, utilized, or progressed upon it in the slightest is being ridiculous. AMD is a company. They exist to make money, and that technology would be worth an awful lot of it. The fact they haven't attempted to use the concepts behind it in ANYTHING for almost 2 years now is proof enough that it doesn't exist. Frankly, it's pretty embarrassing how desperate some people are to believe the Wii U's GPU is some kind of revolutionary new piece of technology. It's a $250 - $300 console, for goodness' sake.

Anyway, I really am leaving this thread now. That last comment was just too hilarious to pass up on x'D


NDA means non-disclosure agreement; no one can be this stupid. It means no official specifications have been released because nobody is allowed to release said specs. There are assumed specifications and there are the real ones, and almost every spec chart is inaccurate. The only thing we really don't know is the Wii U GPU, which people speculate is based on either the Radeon 4000 family or the Radeon 5000/6000 family. This is why it's completely logical to assume 400-500 GFLOPS for the console, which is slightly higher than the assumed maximum GFLOPS for the console.

What?  That's what he said.  

Also, you're completely missing what they've been saying.  

I misread, honest. But he's saying we should take an off-the-shelf wattage meter and gauge the maximum output of a console by measuring how much electricity it uses? That's like me saying I'm going to take this radar gun and measure your gas usage as you drive by. He's basing technical ability 100% on wattage: not on the custom hardware, not on the architecture, not on how the bus is laid out, purely on what is coming in from your wall.

I can build a $200 PC and not have it come anywhere near the Wii U's technical ability. AMD could have designed an extremely efficient and decent GPU to complement the RISC-based Espresso CPU. AMD is already weighing in on combining RISC and CISC instruction sets in the new K12 CPUs they recently announced.

http://techreport.com/review/26418/amd-reveals-k12-new-arm-and-x86-cores-are-coming
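For anyone wondering where the GFLOPS figures thrown around in this thread come from: for AMD GPUs of this era, theoretical single-precision throughput is just shader count × clock × 2 (one multiply-add per ALU per cycle), and a wattage meter can only bound that indirectly by hinting at how much silicon and clock speed the power budget can feed. Here's a rough sketch of the arithmetic; the PS4 and Xbox One configurations are the widely reported ones, while the Wii U line uses one of the configurations speculated about in this thread, not a confirmed spec.

```python
# Rough sketch of the theoretical GFLOPS arithmetic behind the numbers in this
# thread. The PS4/Xbox One configs are the widely reported ones; the Wii U line
# is one *speculated* config (320 shaders @ 550 MHz), not a confirmed spec.
def theoretical_gflops(stream_processors, clock_mhz):
    # 2 FLOPs per ALU per cycle (multiply-add); MHz * 2 / 1000 -> GFLOPS
    return stream_processors * clock_mhz * 2 / 1000.0

print(f"PS4:       {theoretical_gflops(1152, 800):.0f} GFLOPS")  # ~1843
print(f"Xbox One:  {theoretical_gflops(768, 853):.0f} GFLOPS")   # ~1310
print(f"Wii U (?): {theoretical_gflops(320, 550):.0f} GFLOPS")   # ~352, speculative
```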



HoloDust said:

I know, what would we do without ignorance around here, right? ;)

As for PS4 and XOne specs - at least some of the top-tier devs were expecting 2.5 TFLOPS consoles, so you can see how they might be a bit disappointed with what they got instead.


You're a little late to the ball, Cinderella; he's gone.



Probably the interesting thing now is that apparently a Wii U is not profitable at $300 while a PS4 at $400 is. Sure, the tablet controller raises the cost, but looking at the specs of the Wii U, a PS4 costing less than $100 more to make is just ridiculous.



amak11 said:
the-pi-guy said:
amak11 said:
Zekkyou said:

I wasn't planning on replying to anything else in this thread but that's just too hilarious.

Anyway, the NDA simply means no official source can comment on the Wii U's specifications. It doesn't stop outside sources from buying a wattage meter and measuring the energy flow of the Wii U. It's not exactly rocket science.

As I said to bigtakilla, anyone who believes AMD has created the most energy-efficient GPU on the planet and then not monetized, utilized, or progressed upon it in the slightest is being ridiculous. AMD is a company. They exist to make money, and that technology would be worth an awful lot of it. The fact they haven't attempted to use the concepts behind it in ANYTHING for almost 2 years now is proof enough that it doesn't exist. Frankly, it's pretty embarrassing how desperate some people are to believe the Wii U's GPU is some kind of revolutionary new piece of technology. It's a $250 - $300 console, for goodness' sake.

Anyway, I really am leaving this thread now. That last comment was just too hilarious to pass up on x'D


NDA means non-disclosure agreement; no one can be this stupid. It means no official specifications have been released because nobody is allowed to release said specs. There are assumed specifications and there are the real ones, and almost every spec chart is inaccurate. The only thing we really don't know is the Wii U GPU, which people speculate is based on either the Radeon 4000 family or the Radeon 5000/6000 family. This is why it's completely logical to assume 400-500 GFLOPS for the console, which is slightly higher than the assumed maximum GFLOPS for the console.

What?  That's what he said.  

Also, you're completely missing what they've been saying.  

I misread, honest. But he's saying we should take an off-the-shelf wattage meter and gauge the maximum output of a console by measuring how much electricity it uses? That's like me saying I'm going to take this radar gun and measure your gas usage as you drive by. He's basing technical ability 100% on wattage: not on the custom hardware, not on the architecture, not on how the bus is laid out, purely on what is coming in from your wall.

I can build a $200 PC and not have it come anywhere near the Wii U's technical ability. AMD could have designed an extremely efficient and decent GPU to complement the RISC-based Espresso CPU. AMD is already weighing in on combining RISC and CISC instruction sets in the new K12 CPUs they recently announced.

http://techreport.com/review/26418/amd-reveals-k12-new-arm-and-x86-cores-are-coming


I bet that if someone using a speed radar sees someone going 100 mph, no one would disagree that he isn't doing 50 MPG at that time... so what are you discussing?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

AnthonyW86 said:

Probably the interesting thing now is that apparently a Wii U is not profitable at $300 while a PS4 at $400 is. Sure, the tablet controller raises the cost, but looking at the specs of the Wii U, a PS4 costing less than $100 more to make is just ridiculous.

That's possibly in part to do with the sales of each console. As the Wii U hasn't been selling well, Nintendo can't afford to bulk-buy large stocks of components, as they'd just end up as wasted inventory. This in turn means they can't negotiate great deals, because they're not ordering in bulk.

PS4, on the other hand, has been selling quickly, so Sony knows it needs a large number of components and can probably negotiate better deals on the hardware simply by virtue of large bulk orders.



bigtakilla said:
curl-6 said:
petalpusher said:

Super Mario 3D World has some edge-detect AA; everybody saw it in the pre-release screens, just as I can see Mario Kart has none... Edge detect is the cheapest way to add AA and almost free, even compared to FXAA; there's really nothing to brag about in having edge-detect AA. It's kind of a prehistoric solution.

If they could add MSAA/SMAA or even MLAA they would put some in their game for sure ("being conservative" by not putting in MSAA, is that a joke?).

FXAA has been widespread on last-gen consoles because it's damn cheap. On PC you hopefully have MSAA and some form of morphological AA (combined, it can be great).

Actually, quite a lot of people ran around saying 3D World had no AA based on pre-release screens, and they turned out to be wrong.

What are you suggesting, that the Wii U can't do MSAA? That's already been proven false, as Black Ops II and Ghosts on Wii U both use it.

Yeah, people going off of screens is ridiculously funny. I laugh literally every time a screen is posted as proof of anything. It is a cute effort though.

As I expected, MK8 is 720p with no AA.

http://images.eurogamer.net/2013/articles//a/1/6/7/9/4/0/9/3.bmp.jpg/EG11/resize/1920x-1

http://images.eurogamer.net/2013/articles//a/1/6/7/9/4/0/9/8.bmp.jpg/EG11/resize/1920x-1

"Starting from the top, then, there has been a surprising amount of confusion surrounding the resolution of the game with some sources even suggesting a native 1080p presentation. We can finally put that rumour to rest right here and confirm that Mario Kart 8 instead operates at what is effectively the console's standard 1280x720. Of course, considering the quality of the visuals, this can hardly be considered a disappointment especially when other developers are struggling to hit 1080p consistently on more powerful hardware. What is surprising, however, is the complete omission of anti-aliasing in any form. At the very least, Nintendo has previously utilised a basic edge-smoothing algorithm across its Wii U titles and such a feature could have demonstrably improved image quality without a serious performance hit. As it stands, however, we're left with a heavily aliased presentation filled with obvious stair-stepping and pixel-crawling artefacts throughout most scenes. Busier areas can even result in a loss of detail to the point of reducing visibility."

http://www.eurogamer.net/articles/2014-05-22-mario-kart-8-tech-gallery

The framerate is not a perfect 60 and does have some stuttering, but it's not too bad.
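Since the thread keeps contrasting cheap "edge detect" smoothing with MSAA/FXAA: below is a minimal, purely illustrative sketch of what a luminance-based edge-detect pass amounts to, i.e. flag only high-contrast pixels and blend those with their neighbours. It is not Nintendo's actual implementation, just a way to see why this kind of AA is nearly free compared to techniques that touch every pixel or every sample.

```python
import numpy as np

def edge_detect_aa(img, threshold=0.1):
    """Toy edge-detect AA. img: float array, shape (H, W, 3), values in [0, 1]."""
    # Per-pixel luminance (Rec. 601 weights)
    luma = img @ np.array([0.299, 0.587, 0.114])

    # Contrast against the left/upper neighbour; large jumps mark hard edges
    dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = np.maximum(dx, dy) > threshold

    # Cheap 3x3 box blur of the whole frame
    h, w = img.shape[:2]
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = np.zeros_like(img)
    for oy in range(3):
        for ox in range(3):
            blurred += padded[oy:oy + h, ox:ox + w]
    blurred /= 9.0

    # Only pixels flagged as edges get softened; everything else is untouched,
    # which is why this costs so little next to MSAA or full-screen FXAA.
    out = img.copy()
    out[edges] = 0.5 * img[edges] + 0.5 * blurred[edges]
    return out
```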



petalpusher said:
bigtakilla said:
curl-6 said:
petalpusher said:

Super Mario 3D World has some edge-detect AA; everybody saw it in the pre-release screens, just as I can see Mario Kart has none... Edge detect is the cheapest way to add AA and almost free, even compared to FXAA; there's really nothing to brag about in having edge-detect AA. It's kind of a prehistoric solution.

If they could add MSAA/SMAA or even MLAA they would put some in their game for sure ("being conservative" by not putting in MSAA, is that a joke?).

FXAA has been widespread on last-gen consoles because it's damn cheap. On PC you hopefully have MSAA and some form of morphological AA (combined, it can be great).

Actually, quite a lot of people ran around saying 3D World had no AA based on pre-release screens, and they turned out to be wrong.

What are you suggesting, that the Wii U can't do MSAA? That's already been proven false, as Black Ops II and Ghosts on Wii U both use it.

Yeah, people going off of screens is ridiculously funny. I laugh literally every time a screen is posted as proof of anything. It is a cute effort though.

As I expected, MK8 is 720p with no AA.

http://images.eurogamer.net/2013/articles//a/1/6/7/9/4/0/9/3.bmp.jpg/EG11/resize/1920x-1

http://images.eurogamer.net/2013/articles//a/1/6/7/9/4/0/9/8.bmp.jpg/EG11/resize/1920x-1

"Starting from the top, then, there has been a surprising amount of confusion surrounding the resolution of the game with some sources even suggesting a native 1080p presentation. We can finally put that rumour to rest right here and confirm that Mario Kart 8 instead operates at what is effectively the console's standard 1280x720. Of course, considering the quality of the visuals, this can hardly be considered a disappointment especially when other developers are struggling to hit 1080p consistently on more powerful hardware. What is surprising, however, is the complete omission of anti-aliasing in any form. At the very least, Nintendo has previously utilised a basic edge-smoothing algorithm across its Wii U titles and such a feature could have demonstrably improved image quality without a serious performance hit. As it stands, however, we're left with a heavily aliased presentation filled with obvious stair-stepping and pixel-crawling artefacts throughout most scenes. Busier areas can even result in a loss of detail to the point of reducing visibility."

http://www.eurogamer.net/articles/2014-05-22-mario-kart-8-tech-gallery

The framerate is not a perfect 60 and does have some stuttering, but it's not too bad.


Obvious, but good enough, pleasant design.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

CrazyGPU said:

Performance graphs are great, and they show that theoretically the PS4 GPU is in line with a Radeon HD 7850 or close to an HD 7870. It's weird that even with that amount of power, it's not able to run games like Battlefield 4 at 1080p 30 fps. Same with Watch Dogs. Those games run at 900p and scale up, and scaling makes textures blurrier.

Now, could it be that the AMD Jaguar multicore CPU is holding back the performance of the GPU?

"Sucker Punch said in a note in their own GDC 2014 Post-Mortem (regarding the CPU) “While the CPU has ended up working pretty well, it’s still one of our main bottlenecks."

http://www.redgamingtech.com/infamous-second-son-post-mortem-part-2-ps4-performance-compute-particle-system/
Battlefield 4 uses 95% of CPU power of PS4
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I was thinking that with optimization, the "PS4 HD 7850 GPU equivalent" would get better graphics than its PC counterpart, but if the CPU is holding back performance, that might be why the PS4 can't achieve 1080p in most games the way the PC card does, and why the Xbone mostly sits near 720p because of its weaker GPU.

Cerny says that GPGPU on the PS4 can make up for CPU weakness, but if the graphics card is doing compute, would it be able to maintain the same graphics quality? Nvidia cards actually take a hit when they run PhysX.

Also, this link posted on this forum says that memory can become a bottleneck too.

http://gamingbolt.com/crytek-8gb-ram-can-be-easily-filled-up-will-surely-be-limiting-factor-on-ps4xbox-one

Any PS4 dev here or someone with deep knowledge to comment on this?
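On the 900p question: sub-native resolutions are mostly about cutting per-pixel GPU work, and the savings are easy to put numbers on. A quick back-of-the-envelope calculation, nothing game-specific:

```python
# Pure arithmetic: per-frame pixel (shading/fill) load of common render
# resolutions relative to native 1080p.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "900p":  (1600, 900),
    "720p":  (1280, 720),
}

full_hd = 1920 * 1080  # 2,073,600 pixels per frame

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name}: {px:,} pixels per frame, {px / full_hd:.0%} of 1080p")

# 900p is ~69% and 720p is ~44% of the 1080p pixel load, which is why a
# GPU-bound game drops resolution before it cuts effects or framerate.
```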


I have no PS4 dev skills, but what I can say for sure is that any talk about what percentage of the CPU is being used is complete and utter bullsh*t.
Any dev, any team, and any game can easily use 100% of the CPU at any time. The work and the talent lie in using the CPU efficiently to achieve good results, which can't easily be expressed as a percentage (a percentage of what?).

Also, the discussion about raw GPU power doesn't really work. If we have similar hardware, for example from the same generation, that's fine, but if not it just doesn't work, because power is largely about what the hardware can or can't do: whether it has shaders, what generation it is, how many pipelines it has. Even Sony talks BS about it, like the PS4 being 54 times more powerful than the PS2; you can't count like that.
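To illustrate why a CPU percentage is meaningless on its own: any process can pin a core at ~100% while accomplishing nothing, so "95% CPU" only tells you the engine isn't idling, not how well those cycles are spent. A trivial, purely illustrative sketch:

```python
import time

def spin_for(seconds):
    """Busy-wait: the OS will report this as ~100% of one core for the whole
    duration, even though nothing useful is being computed."""
    end = time.perf_counter() + seconds
    counter = 0
    while time.perf_counter() < end:
        counter += 1  # meaningless work
    return counter

if __name__ == "__main__":
    spin_for(2.0)  # check a task manager: one core pegged, nothing achieved
```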



I'm still surprised that 1080p and EVEN 30fps are not a standard... the Wii U gets tons of games that are 1080p and 60fps, and some that are 720p and 60fps...

It's all about how the hardware is used.