
HD console graphics performance comparison charts

curl-6 said:

I honestly don't know if there's enough techheads on VGChartz to have a full-on discussion about the probabilities of Latte's performance, haha. 

And we both know what would happen, extremists from both sides would turn it into a circus. XD

But of course ... this is totally the wrong place to have tech discussions when you have guys like megafeenix or ninjablade going around spreading crap. LOL

What you should do is go ask the guys at AnandTech, where the community is full of electrical/computer engineers and computer scientists (which I aspire to become myself, though leaning more towards electrical/computer engineering). It's probably one of the best places to see a technical argument ...




I'm just going to say two different things:

1. Empirical evidence > theoretical assumption. Anybody who has actually played Wii U exclusives would realize that their graphics fidelity far exceeds what the PS3 or Xbox 360 have managed, with only the very last games, benefiting from over seven years of optimization, being comparable.

2. You are attempting to compare things that aren't directly comparable, or at least not without a mathematical formula that would make a high school student cry. If you look back at quotes from chip designers that work with Nintendo, they all say that Nintendo insists on the chips being designed to do all sorts of weird/unconventional stuff that was simply not the intended use of the components. Plus, poorly funded port teams can get Xbox 360 games running on the Wii U with minimal difficulty, meaning that a poorly optimized Wii U game runs as well as a highly optimized Xbox 360 game (hmm...)



grahamf said:

2. You are attempting to compare things that aren't directly comparable, or at least not without a mathematical formula that would make a high school student cry. If you look back at quotes from chip designers that work with Nintendo, they all say that Nintendo insists on the chips being designed to do all sorts of weird/unconventional stuff that was simply not the intended use of the components. Plus, poorly funded port teams can get Xbox 360 games running on the Wii U with minimal difficulty, meaning that a poorly optimized Wii U game runs as well as a highly optimized Xbox 360 game (hmm...)

1. There are no chip designers at Nintendo. 

2. Where's your source that Nintendo requires its chips to have esoteric functions?

3. It's easier to program for the Wii U than it is for the Xbox 360, so don't go around assuming that more optimization will make a big difference with the Wii U.



fatslob-:O said:

3. It's easier to program for the Wii U than it is for the Xbox 360, so don't go around assuming that more optimization will make a big difference with the Wii U.

No, it isn't. Programming for Xbox 360 has been ridiculously easy for years because pretty much every major engine out there is intimately tailored to suit the hardware using nearly a decade of experience and optimization. Devs know the 360 inside out by now.

By comparison, little has been done to adapt these engines to Wii U, and devs are far, far less familiar with the hardware.

It may be easier to develop on Wii U now than it was to develop on 360 in 2005, but in the time the Wii U's been out, there's never been a point where it was as easy to dev for as the 360, and there likely never will be.



The CPU only limits your framerate, not your resolution.

What limits the resolution is your GFLOPS capability first, because it's directly related to the number of shader operations you can do: more pixels = more shader ops (at the same graphical level). Then comes rasterisation, which is basically transforming every part of your rendering into 2D data (depth, colors, etc.) embedded in a pixel, though it's a bit harder to end up ROP-bound these days. Then you would have bandwidth as a limiting factor, almost equally; in certain cases it kicks in before the ROPs, especially if you have a large number of texture layers to process.

The CPU factor would only come last (and way behind bandwidth, for instance), but it would be huge if you're targeting 60fps, especially if you have a dense world with a lot of routines, physics, terrain navigation, and so on.
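
To put the "more pixels = more shader ops" part above in rough numbers, here's a quick back-of-the-envelope sketch (just an illustration, not from the post itself; the 500 shader ops per pixel is a made-up figure), assuming per-pixel shader work dominates:

# Rough estimate of the shader throughput needed for a given resolution and frame rate.
# The per-pixel cost (500 ops) is purely hypothetical, chosen only to make the numbers concrete.
def shader_ops_per_second(width, height, ops_per_pixel, fps):
    return width * height * ops_per_pixel * fps

ops_720p = shader_ops_per_second(1280, 720, 500, 30)    # 720p at 30fps
ops_1080p = shader_ops_per_second(1920, 1080, 500, 30)  # 1080p at 30fps

print(f"720p/30fps : {ops_720p / 1e9:.2f} billion shader ops per second")
print(f"1080p/30fps: {ops_1080p / 1e9:.2f} billion shader ops per second")
print(f"1080p needs {ops_1080p / ops_720p:.2f}x the shader throughput of 720p")

By this measure, 1080p needs about 2.25x the shader throughput of 720p at the same frame rate and shading complexity, which is why the GPU's shader capability tends to dictate resolution while the CPU mostly dictates how much per-frame simulation work you can afford.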



curl-6 said:

No, it isn't. Programming for Xbox 360 has been ridiculously easy for years because pretty much every major engine out there is intimately tailored to suit the hardware using nearly a decade of experience and optimization. Devs know the 360 inside out by now.

By comparison, little has been done to adapt these engines to Wii U, and devs are far, far less familiar with the hardware.

It may be easier to develop on Wii U now than it was to develop on 360 in 2005, but in the time the Wii U's been out, there's never been a point where it was as easy to dev for as the 360, and there likely never will be.

Having experience sure helps, but developers are supposed to be adaptable when it comes to different platforms. Most developers do not concentrate on just one platform.

How exactly could devs be far less familiar with a more than 10-year-old CPU microarchitecture and a modern GPU made by AMD?



fatslob-:O said:
curl-6 said:

No, it isn't. Programming for Xbox 360 has been ridiculously easy for years because pretty much every major engine out there is intimately tailored to suit the hardware using nearly a decade of experience and optimization. Devs know the 360 inside out by now.

By comparison, little has been done to adapt these engines to Wii U, and devs are far, far less familiar with the hardware.

It may be easier to develop on Wii U now than it was to develop on 360 in 2005, but in the time the Wii U's been out, there's never been a point where it was as easy to dev for as the 360, and there likely never will be.

Having experience sure helps, but developers are supposed to be adaptable when it comes to different platforms. Most developers do not concentrate on just one platform.

How exactly could devs be far less familiar with a more than 10-year-old CPU microarchitecture and a modern GPU made by AMD?

360 was the lead platform for multiplats for years, and even in cases where it wasn't, devs had to port to it effectively. Virtually every third-party dev working today is an expert in the 360 hardware.

And that's easy to answer: less experience making games with them, and engines not tailored around them.



grahamf said:
I'm just going to say two different things:

1. Empirical evidence > theoretical assumption. Anybody who has actually played Wii U exclusives would realize that their graphics fidelity far exceeds what the PS3 or Xbox 360 have managed, with only the very last games, benefiting from over seven years of optimization, being comparable.


Not really.

Not at all actually, not even close.



We know this. We know already. PS4 is the most powerful console by a significant margin.

1080p/30fps will be the standard for the best-looking PS4 games this gen. Uncharted 4 will be the first example of insanely marvellous graphics at 1080p/30fps.



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

LOL. As Mario Kart creeps closer, each day the Nintendo haters come out and play. Why does my Deluxe Set box say 32 gig if it is so weak? Maybe the 8 gig model was, but look up the 32 gig one. And Wikipedia is not a good source, dude. I would go to each site and get their specs.