
Will the Wii have Call of Duty 4 graphics by the end of its lifetime, or better?

HappySqurriel, I had a great laugh reading your post. I really did...

It's a shame that it contains a lot of errors and general misconceptions. On second thought, no it isn't, because that's what made me laugh.

Look, I don't want to come off sounding like a jerk, but not a lot of what you said in your post makes any sense.

The difference between the Xbox and the GC was very small, as was apparent from comparing system specifications or simply comparing both systems' best-looking games.

If you honestly believe that either Microsoft or Sony was still raising the specifications of their systems by the time they had to be released, you have a very limited understanding of how corporate decisions are made, especially when it comes to hardware design. There are too many hardware components in a console to explain the whole process without making this post far too long. Let's just say that the only thing that could have been changed up to the very last moment, which is still well before any launch, is the clock speed.

Now, it's also obvious that you know little to nothing about the process of developing a game, or about programming in general. The reason you're seeing framerate problems is not that the console in question is being pushed to its limits. If I wanted to, I could make a game with visuals comparable to a basic PS2 game and still make it stutter on the PS3.
The reason you're not seeing framerate problems on the Wii is that developers are familiar with the system's hardware. The fact that the majority of games released on the system have visuals that wouldn't even make the PS2 break a sweat is certainly another reason.
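
To make that concrete, here's a rough sketch (purely illustrative, not real game code) of how a scene with trivial visuals can still blow a 60 FPS frame budget when the per-frame work is naive:

// Illustrative only: frame rate depends on the work done per frame,
// not on how complex the scene looks.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Object { float x, y, z; };

int main() {
    std::vector<Object> objects(20000); // a modest, last-gen-scale scene

    for (int frame = 0; frame < 5; ++frame) {
        auto start = std::chrono::steady_clock::now();

        // Naive O(n^2) pairwise check: the visuals stay simple,
        // but the per-frame cost explodes with object count.
        long overlaps = 0;
        for (std::size_t i = 0; i < objects.size(); ++i)
            for (std::size_t j = i + 1; j < objects.size(); ++j)
                if (objects[i].x == objects[j].x) ++overlaps;

        double ms = std::chrono::duration<double, std::milli>(
            std::chrono::steady_clock::now() - start).count();
        // A 60 FPS game has a 16.7 ms budget; this loop alone can blow it.
        std::printf("frame %d: %ld overlaps, %.2f ms (budget 16.7 ms)\n",
                    frame, overlaps, ms);
    }
}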

Also, I don't get what you mean by:

they are pushing the limits of what is (currently) possible on the other platforms


They're closed systems. What's possible in 10 years is possible now, seeing as the hardware doesn't change at all. The only limit developers face is that of their own capabilities.

Visuals will improve over time simply because developers get more familiar with a given piece of hardware. That familiarity means they come to know its possibilities and boundaries better, and it enables them to work around certain bottlenecks and exploit the system's strengths.



Teragen said:
*snip*

Being that I'm a fairly good professional software developer with a degree in Pure Mathematics (with a focus on projective geometry and linear algebra) and another degree in Computer Science (with a focus on Computer Graphics) I think I have a VERY GOOD IDEA of what I am talking about ...

The fact of the matter is that most people have unrealistic expectations of the improvements that are possible, because of how far games came on the Playstation and PS2. Back in the early days of 3D graphics, most developers had a very poor understanding of the data structures and algorithms behind a 3D game engine; developers like John Carmack introduced BSP trees, kd-trees, octrees, portals and countless other more efficient data structures and algorithms at events like GDC, which drastically improved games. With the PS2, developers finally started to take advantage of licensed game engines and middleware, which (essentially) meant that they began to leverage the skills of far better developers to tap the processing power of the console.
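
For anyone curious what one of those structures actually looks like, here's a minimal sketch of the octree idea (illustrative only; a real engine version would store triangles rather than points and cap the recursion depth):

// Sketch of an octree: recursively split space into eight cells so that
// queries only touch nearby geometry instead of the whole scene.
#include <array>
#include <cstddef>
#include <memory>
#include <vector>

struct Point { float x, y, z; };

struct Octree {
    Point min, max;                              // bounds of this cell
    std::vector<Point> points;                   // points stored here
    std::array<std::unique_ptr<Octree>, 8> children;
    static constexpr std::size_t kMaxPoints = 8; // split threshold

    Octree(Point lo, Point hi) : min(lo), max(hi) {}

    void insert(const Point& p) {
        if (!children[0] && points.size() < kMaxPoints) {
            points.push_back(p);                 // still a small leaf
            return;
        }
        if (!children[0]) subdivide();           // leaf is full: split it
        children[childIndex(p)]->insert(p);
    }

    int childIndex(const Point& p) const {       // which octant holds p?
        Point c = center();
        return (p.x > c.x) | ((p.y > c.y) << 1) | ((p.z > c.z) << 2);
    }

    Point center() const {
        return {(min.x + max.x) / 2, (min.y + max.y) / 2, (min.z + max.z) / 2};
    }

    void subdivide() {
        Point c = center();
        for (int i = 0; i < 8; ++i) {
            Point lo{(i & 1) ? c.x : min.x, (i & 2) ? c.y : min.y, (i & 4) ? c.z : min.z};
            Point hi{(i & 1) ? max.x : c.x, (i & 2) ? max.y : c.y, (i & 4) ? max.z : c.z};
            children[i] = std::make_unique<Octree>(lo, hi);
        }
        for (const Point& p : points) children[childIndex(p)]->insert(p);
        points.clear();                          // points now live in children
    }
};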

Most developers today are already leveraging licensed middleware produced by a collection of the most educated and experienced 3D programmers in the world, which means they're already getting really good performance out of these systems. As time goes on and their code becomes more optimized you will see improvements, but it is more likely that the performance gains will be used to produce graphics on the level of Lair at a decent framerate ...

 

The comment on "Running out of time" was simple on purpose ... If you offered Sony/Microsoft the ability to double the number of cores in their processors, take advantage of a smaller process and increase their clock speed, or add a physics co-processor (all at no added cost to the system), they would have taken advantage of that technology; this is what I mean when I say they didn't choose the processing power of their systems.



People are forgetting about the large increase in memory over the GC. That, and we still don't know what additions, if any, were made to the architecture. Sadly, the GC was pretty much ignored power-wise by most developers late in its life.

Whether we will see CoD4-level graphics I don't know, but there will be large improvements. I'm pretty sure many games did not use the 64MB of GDDR3 memory very effectively, as many looked worse than GC quality, which is pathetic. I don't need great graphics, but I would like 480p widescreen support and better-than-GC visuals, which I think is a fair thing to ask from devs, along with 60 FPS, or 30 FPS rock steady, at a minimum.
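
For what it's worth, "rock steady" is largely a pacing problem rather than a raw power problem. Here's a tiny illustrative sketch (not from any real engine) of a loop that locks itself to 30 FPS instead of letting the rate wander:

// Sketch of frame pacing: a consistent 33.3 ms frame often looks better
// than a rate that bounces between 25 and 45 FPS.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(33333); // 30 FPS

    auto next = clock::now();
    for (int frame = 0; frame < 300; ++frame) {
        // update(); render();  // hypothetical game work would go here

        next += frameBudget;
        std::this_thread::sleep_until(next); // absorb the leftover time
    }
}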

I also display the Wii on a 720p DLP projector at a 10-foot-wide image, and quality Wii games look fine on it.



I almost forgot about middleware: it wasn't really optimized for the GC, so who knows what will happen in the future. Factor 5 has been quoted as being disgusted with the quality of third-party games on the Wii.

The ATi GPU isn't really an ATi GPU but an ArtX design, so it's a different beast altogether.

http://www.news.com/2100-1040-237000.html

Jimbo signing off end of line...........................................



HappySqurriel said:
 

*snip*


Congratulations, this post is a lot more intelligent than your previous one. You almost had me fooled there... Moving on, I'm afraid I'm missing the point of the majority of your post. I'm assuming you're saying that the brightest minds are developing the middleware, so developers can spend the majority of their resources on the actual development of the game - resulting in only minor performance improvements.

 

Middleware helps no one if it isn't itself optimized for a given system. The recent issues developers had with UE3 on the Playstation 3 are an example of this. Not to mention that even with the same engine, a game can look either like a last-gen title or like an impressive one by today's standards. Again, I'm not sure what you were getting at. That could just be down to my limited comprehension skills, though...
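
As a rough illustration of that point (the platform macro below is made up, not a real SDK define): the same middleware entry point can hide either a generic fallback or a path hand-tuned for one machine, and an engine that hasn't been ported properly runs the fallback everywhere:

// Sketch: one interface, two implementations. Middleware that ships only
// the generic path leaves a console's particular strengths untouched.
#include <cstddef>

void TransformVerticesGeneric(float* verts, std::size_t count, float scale) {
    for (std::size_t i = 0; i < count; ++i)
        verts[i] *= scale;                  // plain scalar loop: runs anywhere
}

void TransformVerticesTuned(float* verts, std::size_t count, float scale) {
    // Stand-in for platform-specific vector code: process four floats per
    // step, the way real tuned paths are sized to a machine's registers.
    std::size_t i = 0;
    for (; i + 4 <= count; i += 4) {
        verts[i]     *= scale;
        verts[i + 1] *= scale;
        verts[i + 2] *= scale;
        verts[i + 3] *= scale;
    }
    for (; i < count; ++i) verts[i] *= scale;
}

void TransformVertices(float* verts, std::size_t count, float scale) {
#if defined(HYPOTHETICAL_TUNED_PLATFORM)   // made-up macro, for illustration
    TransformVerticesTuned(verts, count, scale);
#else
    TransformVerticesGeneric(verts, count, scale);
#endif
}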

 

There's nothing indicating that the future visual improvements of both the Xbox 360 and Ps3 will be minor. Especially in the Playstation 3's case.

 

Your last paragraph doesn't make much sense though...

If you offered Sony/Microsoft the ability to double the number of cores in their processors, take advantage of a smaller process and increase their clock speed, or add a physics co-processor (all at no added cost to the system), they would have taken advantage of that technology

Wouldn't any company, including Nintendo? The thing is, both companies were developing the components for their respective systems well in advance, whether it was the Cell processor, the FlexIO bus, Xenos, Xenon, the RSX, or anything else. Such undertakings are very difficult, as everything has to work together in a closed architecture and meet many criteria that relate directly to price and heat. I'm sorry, but no matter how you spin this, the statement you made in your original post just doesn't make any sense - at the very least not to me.



Teragen said:

*snip*


The obvious fact is that Nintendo had the choice to produce a more powerful console without dramatically changing the cost of the hardware and instead chose to release the Wii as it is; the Wii could (easily) be overclocked to double (or possibly triple) its current clockspeed, and all that would be required is a somewhat larger case and an improved cooling system, and yet they chose this speed.

The likely reason Nintendo chose the performance level of the Wii is that at peak performance the Gamecube was approaching what could (realistically) be achieved at standard definition; with the Wii there should be (almost) no polygonalization artifacts, (almost) no noticeable texture artifacts, and (noticeably) improved effects, which should be very close to the limit of what is noticeable on standard-definition displays. Certainly, there could be some improvement to the effects which are displayed, but those effects are far more noticeable at higher resolutions; basically, most material effects relate to how a material's lighting changes over a small area, and a higher resolution allows you to see smaller areas of a surface.
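
To put that last point in concrete terms (a simplified stand-in, not how any particular console actually shades pixels): lighting like a specular highlight is computed per pixel, so a highlight that spans only a few pixels at 480p resolves into far more visible detail at 720p:

// Sketch of a per-pixel specular term: the finer the pixel grid, the more
// of this small-area lighting variation the viewer can actually see.
#include <cmath>

struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Simplified Blinn-Phong-style specular term evaluated for one pixel.
float SpecularAtPixel(Vec3 normal, Vec3 halfVector, float shininess) {
    float nDotH = Dot(normal, halfVector);
    if (nDotH < 0.0f) nDotH = 0.0f;
    return std::pow(nDotH, shininess); // falls off over a small surface area
}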

My reasoning for why you could see a far greater improvement from the Wii than from the XBox 360 or PS3 is how focused developers currently are on getting good performance out of each system; basically, every developer is working hard to get amazing graphics out of the PS3 and XBox 360, while few developers have even tried on the Wii. All systems will see some improvement from developers understanding the hardware better and producing more optimized code, and all will see improvement from developers producing graphical assets more efficiently for the specific hardware, but the Wii will also benefit from developers actually trying to see what they can produce on it.



HappySqurriel said:
Teragen said:
*snip*
Being that I'm a fairly good professional software developer with a degree in Pure Mathematics (with a focus on projective geometry and linear algebra) and another degree in Computer Science (with a focus on Computer Graphics) I think I have a VERY GOOD IDEA of what I am talking about ...



Sorry, after you wrote this I kind of lost interest in what you had to say. I'm an Electrical Engineering grad, so I guess I have a much better understanding of hardware than you do (not that that has anything to do with what you know about this topic). Oh, and I develop software as well. Big deal.

So far the games from Nintendo and every third party look like last-gen games, so until something comes along that looks better than several last-gen games, I'm going to assume the many developers are right that the Wii isn't much more powerful than the Xbox. You can see the specs on Wikipedia if you'd like; every fact there is cited.

Developers need to be honing the controls and making the Wiimote work better, not attempting CoD4 graphics. Make better games and leave the graphics to the systems capable of expanding on them.



windbane said:

*snip*
windbane has a point here, though he doesn't know why, or even what his point is.

Wii developers should be focusing on honing the controls and making better games. The reason is very simple: a quirky control scheme is one of the biggest selling points of the Wii as a console, and one of its best features. Graphics, not so much. That doesn't mean the Wii can't produce great-looking games (it most certainly can); however, as nice as the graphics will be... they won't be what sells the games in most cases. People aren't buying Wii games for great graphics; they're buying them because Wii games are fun to play.

Developers should focus on the Wii's control scheme and on tapping it to create games made of fun, awesome, and win... not so much great HD graphics.



This must be a joke topic. Surely?



Words Of Wisdom said:
*snip*


Thanks for agreeing with me, though you pretty much said the same thing I did. I'm sorry it's not as clear to everyone else that the Wii isn't about graphics and that people are buying it for the control scheme. I'm pretty sure I know why I had a good point, thanks.