
WiiU - Lots of new Info - Same poor result. Why Nintendo Direct just confirms my fears.

Hynad said:
Tarumon said:
Hynad said:
Tarumon said:
Hynad said:
Raze said:
True, but it'd be bad business to neglect an extra few million dollars in revenue simply because a company has a grudge against another. CEOs shouldn't dabble in high school drama.

The graphical improvements are going to be very minor from Wii U to PS4/720; it's all still 1080p. At best, there will be more RAM and processor speed. I've made a point elsewhere that I wouldn't hold my breath to see a leap in graphic quality from PS3 to PS4 or 360 to 720. The bottleneck is now at the end user: we're still in a world where the highest resolution output in homes is 1080p. We won't see 2K or 4K technology affordable and commonplace for at least another generation AFTER the PS4/720/Wii U. We're fairly maxed out on graphical improvements for another decade.

As for the gamepad, this is a good point, but not every game has to use it. Wii controls work on the Wii U, from what I understand, so it can be bypassed.

Hum, how about no? Resolution isn't everything. Draw distance, polygons on screen, lighting effects (and the number of active light sources), among plenty of other things, can still be pushed way beyond what we currently see on the current gen consoles and the Wii U. We see movies like Avatar on our 1080p screens, and the graphics of games are nowhere near that level. So yes, things can still get much better than they are now, even if the resolution stays at 1080p for the next decade.



How about any Discovery Channel HD programming in 720p compared with the highest-resolution game available on the planet? Avatar was rendered through production processes and actual artists at over a hundred million dollars in cost; the cheapest DVD and Blu-ray players can display it on your TV because the processing took place "outside the box". Don't act like the PS4 and 720 can render godlike graphics; they are CHEAP equipment even compared with moderately priced PCs. There is nothing under the hood that makes the graphics that awesome, because next-gen consoles are last-gen computers.

^^^ How to miss the point with a mouth

How about, no, Avatar is a terrible example?


Oh, one of those who can't be wrong. Noted.

1080p is enough, more than enough, to display graphics the eyes can appreciate as high def, just like 720p looks lifelike in my example of Discovery Channel programs. You start going into higher resolution not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, antialiasing, and distortions they use are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.




Sorry, I had to abort quoting because my iPad is not handling it well. I think the best example I can give is that over the last ten years, the two best-of-breed examples in video gaming, in my mind, would be Blizzard and Nintendo. Both of these firms sold more of their games than their closest rivals, made much more money, and neither of them pushed the edge of the graphics available to them at the time.

The simple reason is that they want more of their customers to have access to their games, and they just didn't care to put a huge dent in their fans' pockets by demanding super-high-end equipment. I agree with Raze on the display end that, for a good number of years, 1080p is about as good as it gets. Go any higher than that and you run into the same ugliness you see with regular channels on HD televisions. Just watch a standard channel on an HD TV these days; it's butt ugly, because there are more pixels than there is data to fill them, so you either have to upscale (ugly) or shrink the picture (ugly in a different way).

But I do agree with you, Hynad, that the 1080p canvas offers much more than what the current gen or the Wii U has been able to exploit. My not-so-gentle (rude) interruption over using Avatar as "my" example is that the costs and scale of production needed just to render such breathtaking shots, from one angle, for split seconds at a time, is NOT something even next-gen consoles can come close to. With their added horsepower, the granularity of things will improve, and the frame rate will too. But that ain't no Avatar, where, if need be, human artists can spend weeks perfecting a shot. On a console, the cutscenes could literally be the same Avatar scenes, even today (we do watch Blu-ray movies on the PS3 already, right?), but the programming and real-time hardware rendering it would take are beyond the scope and budget of the home console market. For maybe the next 20 years.



That Malstrom is a hoot! But IMO he did make some good points: the VC could be handled much better, and Miiverse sounds cumbersome and a bit useless. I'm with him on The Wonderful 101; I'm getting NMH and MadWorld vibes from that one. People seem to only care that it is exclusive, not whether the game is actually good, and I can see it not doing great. Other than that, he sounds really cantankerous.

I personally liked the ND; it made me more interested in the Wii U than I was before. Not sure what the OP's issue is. Does he only care about Ninty's pockets and "winning", or does he actually like playing those Wii-whatever minigame compilations? And if so, how many of those can one play?



Tarumon said:
1080p is enough, more than enough, to display graphics the eyes can appreciate as high def, just like 720p looks lifelike in my example of Discovery Channel programs. You start going into higher resolution not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, antialiasing, and distortions they use are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.


If you want to stick with Avatar, go back in time a little and look at the CGI movies of yesteryear next to the game graphics of that era. Then compare today's game graphics with that old CGI. Avatar is quite the extreme example, but the meaning behind it is that game graphics manage to catch up to CGI quality over time, even though CGI keeps on progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post deserves to be argued with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run Witcher 2 at its highest graphics settings in 1360x768. My machine is roughly 3 years old and was built for gaming back then, for a moderate price. Now, if I run it with the same settings but with the resolution at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to a lower resolution, because all the effects are consistent with the resolution I use. Whereas if I use 1080p, the assets and effects take a bigger hit, because I have to scale back their quality to make sure the frame rate is remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict being that the game looks better overall at a lower resolution, even if some of the edges may not appear as clean.

The way you're talking seems to imply that if the resolution of games doesn't go beyond 1080p, then graphics will stall and won't improve in quality even if the hardware gets better. I may be wrong, but that's what I got from the way you wrote this. Like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about Next Gen and how powerful it's gonna be. My point revolves only around the 1080p resolution and what can still be pushed even if we stick with it for the foreseeable future.
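To put a rough number on the frame-rate hit described in the Witcher 2 example above: 1080p pushes roughly twice as many pixels as 1360x768. A tiny Python check, purely illustrative; the resolutions are the ones named in the post, and the idea that per-pixel work scales with pixel count is a simplifying assumption of mine:

```python
# Pixel counts for the two resolutions mentioned in the post above.
low = 1360 * 768       # 1,044,480 pixels
full_hd = 1920 * 1080  # 2,073,600 pixels

# If per-pixel work (shading, post-processing) dominates the frame, cost scales
# roughly with this ratio: about twice as many pixels to fill at 1080p.
print(f"1080p / 1360x768 = {full_hd / low:.2f}x the pixels")  # ~1.99x
```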



Is it opposite day? Most people thought 2010 was one of Ninty's best E3s ever, and most people were pleased about the latest Nintendo Direct.

Also, I don't see how Nintendo has abandoned the blue ocean strategy. They released a system very similar to the Wii, and it's offering the same old casual crap, just with more Nintendo games with actual effort put into them. I see no reason for complaint.




Malstrom has become a very bitter middle-aged man. I've listened to a lot of his stuff and agreed 95% of the time in the past, but more and more I'm seeing him going back on things he previously said, and just ranting like an angry chimpanzee.

 

The 3DS has done far better than he thought it would. NSMB2 has done far better than he thought it would. Even the 360 and PS3 have done far better than he thought they would. He hates the Wii U right now, and he hates Zelda Wii U right now (baselessly, in spite of all of its news being good - and I often agree with him about Zelda!). I wouldn't be surprised if he ends up wrong about them as well.

He has turned his back on his previous love affairs with Diablo 3 and WoW, when before he went out of his way to attack anyone who dared to badmouth them. Now he is one of them. He isn't as accurate as he used to be. He's still worth reading because he does still have some good points on certain subjects, but I wonder for how much longer.

 

 

 

He's declining as much as Nintendo has been.



Hynad said:

Hum, how about no? Resolution isn't everything. Draw distance, polygons on screen, lighting effects (and the number of active light sources), among plenty of other things, can still be pushed way beyond what we currently see on the current gen consoles and the Wii U. We see movies like Avatar on our 1080p screens, and the graphics of games are nowhere near that level. So yes, things can still get much better than they are now, even if the resolution stays at 1080p for the next decade.



So, maybe instead of a snarky 20-something response, you could try conversing like a human?

The 720/PS4 have been in development for more than 2 years now, and they will still be weaker than my Win 7 PC. The graphics will be a bit better than their HD console equivalents, but there's no WOW factor like the jump from PS2 to PS3. It's such a minor improvement that many people might not be convinced to make the jump based on graphics alone. As someone else pointed out, Avatar and all HD movies are pre-rendered frames, exported to AVI or QuickTime format (I do editing for independent films, so I know this process well). Their extremely expensive computers grind through that rendering over hours, and in some cases days. Your 1080p real-time rendering will be a little faster, but not enough to see a drastic shift.

Sorry to break it to you, but the future of console gaming is MEH at best until 2K/4K TVs are the norm. More importantly, the difference between the Wii U and its Sony and MS competitors will be quite minor, and it's going to come down to separation by fanboy sects.



The Carnival of Shadows - Folk Punk from Asbury Park, New Jersey

http://www.thecarnivalofshadows.com 


Hynad said:
Tarumon said:
1080p is enough, more than enough, to display graphics the eyes can appreciate as high def, just like 720p looks lifelike in my example of Discovery Channel programs. You start going into higher resolution not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, antialiasing, and distortions they use are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.


If you want to stick with Avatar, go back in time a little and look at the CGI movies of yesteryear next to the game graphics of that era. Then compare today's game graphics with that old CGI. Avatar is quite the extreme example, but the meaning behind it is that game graphics manage to catch up to CGI quality over time, even though CGI keeps on progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post deserves to be argued with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run Witcher 2 at its highest graphics settings in 1360x768. My machine is roughly 3 years old and was built for gaming back then, for a moderate price. Now, if I run it with the same settings but with the resolution at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to a lower resolution, because all the effects are consistent with the resolution I use. Whereas if I use 1080p, the assets and effects take a bigger hit, because I have to scale back their quality to make sure the frame rate is remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict being that the game looks better overall at a lower resolution, even if some of the edges may not appear as clean.

The way you're talking seems to imply that if the resolution of games doesn't go beyond 1080p, then graphics will stall and won't improve in quality even if the hardware gets better. I may be wrong, but that's what I got from the way you wrote this. Like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about Next Gen and how powerful it's gonna be. My point revolves only around the 1080p resolution and what can still be pushed even if we stick with it for the foreseeable future.


I think you have a fundamental lack of understanding of computer graphics. What is taxing to the system is not the display resolution; 1920x1080 is all it takes to fill the pixels. It's the physics required to render believable images at that resolution that is taxing. It is much easier for a computer to draw 100 objects on the same-sized screen using the smaller pixels of a higher-resolution display than to force a programmer to draw those same 100 objects using bigger pixels. TV resolution refers to the number of physical pixels. Your computer has a hard time rendering at 1080p because your GPU and CPU cannot keep up with the number of calculations needed to render the image in a timely manner.

Your verdict is based on a system that rendered the pictures, but not as intended. Systems that can easily produce 1080p with all the physics enabled produce BETTER quality pictures with cleaner edges. There are artifacts you simply don't see, such as shimmering.

Video game = interactive computer-generated images. What is taxing is emulating reality with computer-generated images. A monitor with higher resolution gives you finer dots to draw with. 1080p is far fewer pixels than many decent-sized computer monitors. Graphics have improved as GPUs and CPUs have improved. Current gen consoles have problems populating 720p screens at much beyond 30 frames per second. The limiting factor was what was under the hood.

We now have 1080p, and the Wii U is 1080p capable. That doesn't mean that if everyone rendered at 1080p, the graphics would be nearly the same quality. The physics could be infinitely taxing; layers and layers of effects can still be piled on. When I said Avatar was a bad example, I didn't argue with much else of what you said, but Avatar is CGI (with human artistry on top of the computer-generated work in spots). Just because the new Super Bug can do 0-62 in 1.8 seconds doesn't mean you can use it as proof that every four-wheeled car on a Scion budget has that much room for improvement. No! Consoles are budget computers, just like the one you've got. There is no way in hell these consoles can go anywhere near Avatar, which is just a series of images displayed one after another with ZERO interaction that causes any recalculation.

Again, I am OK if you insult me with "as if I don't understand what I'm saying". But I do hope you are able to separate TV resolution from computer-generated image quality. Understand that it wasn't the resolution that caused your computer to huff and puff, but the PHYSICS. Things wouldn't look as jagged if they were painted with a finer brush. The granularity is EASY. But making all those grains dance, shimmer, and reflect light is super hard and super taxing. When you lowered the resolution, your PC finally caught up, but the physics won't show their full effect if your display resolution is less granular than what the engine is trying to refine.

That's why I really agree that 1080p display resolution is going to be it for a while, at the expense of 4K TVs. So I agree with you that resolution is not the only way to improve graphics (even though it's the easiest way), but I respectfully disagree about how much room there is within console budgets.
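Both of the last two posts are arguably describing the same trade-off from opposite ends: per-pixel effects set the cost of each pixel, and resolution sets how many pixels pay that cost. Here is a toy sketch of that idea in Python; the split into "fixed per-frame work plus per-pixel shading work" and every number in it are assumptions of mine for illustration, not measurements of any real engine:

```python
# Toy frame-cost model (illustrative only, not a real renderer):
# frame time = fixed per-frame work (simulation, physics, draw setup)
#              + per-pixel shading work scaled by how many pixels are drawn.

def frame_time_ms(width, height, per_frame_ms, ns_per_pixel):
    pixels = width * height
    return per_frame_ms + pixels * ns_per_pixel / 1e6  # ns -> ms

# Hypothetical numbers, chosen only to show the shape of the trade-off.
for effects, ns_per_pixel in [("modest effects", 8), ("heavy effects", 16)]:
    for width, height in [(1360, 768), (1920, 1080)]:
        t = frame_time_ms(width, height, per_frame_ms=5.0, ns_per_pixel=ns_per_pixel)
        print(f"{effects:>14} @ {width}x{height}: ~{t:.1f} ms/frame (~{1000 / t:.0f} fps)")
```

With these made-up numbers, dropping from 1080p to 1360x768 buys back roughly as much frame time as halving the per-pixel effect cost, which is essentially the trade-off both posts are circling.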



Raze said:
Hynad said:
 

Hum, how about no? Resolution isn't everything. Draw distance, polygons on screen, lighting effects (and the number of active light sources), among plenty of other things, can still be pushed way beyond what we currently see on the current gen consoles and the Wii U. We see movies like Avatar on our 1080p screens, and the graphics of games are nowhere near that level. So yes, things can still get much better than they are now, even if the resolution stays at 1080p for the next decade.



So, maybe instead of a snarky 20-something response, you could try conversing like a human?

The 720/PS4 have been in development for more than 2 years now, and they will still be weaker than my Win 7 PC. The graphics will be a bit better than their HD console equivalents, but there's no WOW factor like the jump from PS2 to PS3. It's such a minor improvement that many people might not be convinced to make the jump based on graphics alone. As someone else pointed out, Avatar and all HD movies are pre-rendered frames, exported to AVI or QuickTime format (I do editing for independent films, so I know this process well). Their extremely expensive computers grind through that rendering over hours, and in some cases days. Your 1080p real-time rendering will be a little faster, but not enough to see a drastic shift.

Sorry to break it to you, but the future of console gaming is MEH at best until 2K/4K TVs are the norm. More importantly, the difference between the Wii U and its Sony and MS competitors will be quite minor, and it's going to come down to separation by fanboy sects.


Snarky 20-something? Oh, that's cute.

One thing is sure: you don't know the other next-gen consoles' specs. All we have right now are rumors.

That being said, I wasn't talking about next-gen consoles per se, just the nonsense you mentioned about resolution, which I already addressed in one of my earlier posts. You brought 2K resolution into the mix. 2K is so close to 1080p (2048x1152 compared to 1920x1080) that mentioning it as a likely future standard is silly at best. 4K is definitely going to be the next evolution. But until it becomes the standard, 1080p still hasn't been maxed out by consoles, and even PCs barely manage to take full advantage of it, let alone any higher resolution. GPUs are still going to evolve, get faster, and become capable of rendering more per frame, even if the resolution remains at 1080p.

Hey, go play Call of Duty 3 in 1080p. Then run Skyrim, Witcher 2, or Crysis 2 at that same resolution. If that doesn't give you the big picture, I don't know what will.
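For reference, the pixel counts behind the 2K-versus-4K point above. The figures are the ones given in the post (2048x1152 for "2K", 1920x1080 for 1080p) plus 3840x2160 for 4K UHD, which is my addition; the snippet is purely illustrative:

```python
# Pixel counts for the display standards discussed above.
standards = {
    "1080p": (1920, 1080),
    "2K":    (2048, 1152),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in standards.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# 1080p: 2,073,600 pixels (1.00x 1080p)
# 2K:    2,359,296 pixels (1.14x 1080p)
# 4K:    8,294,400 pixels (4.00x 1080p)
```

So "2K" is only about 14% more pixels than 1080p, while 4K is a full 4x jump, which is why 4K is the more meaningful next step.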



Tarumon said:
Hynad said:
Tarumon said:
1080p is enough, more than enough, to display graphics the eyes can appreciate as high def, just like 720p looks lifelike in my example of Discovery Channel programs. You start going into higher resolution not because the eyes need it; it's the programmers who need it to fool the eyes, and the polygons, shading, light sources, antialiasing, and distortions they use are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.


If you want to stick with Avatar, go back in time a little and look at the CGI movies of yesteryear next to the game graphics of that era. Then compare today's game graphics with that old CGI. Avatar is quite the extreme example, but the meaning behind it is that game graphics manage to catch up to CGI quality over time, even though CGI keeps on progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post deserves to be argued with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run Witcher 2 at its highest graphics settings in 1360x768. My machine is roughly 3 years old and was built for gaming back then, for a moderate price. Now, if I run it with the same settings but with the resolution at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to a lower resolution, because all the effects are consistent with the resolution I use. Whereas if I use 1080p, the assets and effects take a bigger hit, because I have to scale back their quality to make sure the frame rate is remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict being that the game looks better overall at a lower resolution, even if some of the edges may not appear as clean.

The way you're talking seems to imply that if the resolution of games doesn't go beyond 1080p, then graphics will stall and won't improve in quality even if the hardware gets better. I may be wrong, but that's what I got from the way you wrote this. Like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about Next Gen and how powerful it's gonna be. My point revolves only around the 1080p resolution and what can still be pushed even if we stick with it for the foreseeable future.


I think you have a fundamental lack of understanding of computer graphics. What is taxing to the system is not the display resolution; 1920x1080 is all it takes to fill the pixels. It's the physics required to render believable images at that resolution that is taxing. It is much easier for a computer to draw 100 objects on the same-sized screen using the smaller pixels of a higher-resolution display than to force a programmer to draw those same 100 objects using bigger pixels. TV resolution refers to the number of physical pixels. Your computer has a hard time rendering at 1080p because your GPU and CPU cannot keep up with the number of calculations needed to render the image in a timely manner.

Your verdict is based on a system that rendered the pictures, but not as intended. Systems that can easily produce 1080p with all the physics enabled produce BETTER quality pictures with cleaner edges. There are artifacts you simply don't see, such as shimmering.

Video game = interactive computer-generated images. What is taxing is emulating reality with computer-generated images. A monitor with higher resolution gives you finer dots to draw with. 1080p is far fewer pixels than many decent-sized computer monitors. Graphics have improved as GPUs and CPUs have improved. Current gen consoles have problems populating 720p screens at much beyond 30 frames per second. The limiting factor was what was under the hood.

We now have 1080p, and the Wii U is 1080p capable. That doesn't mean that if everyone rendered at 1080p, the graphics would be nearly the same quality. The physics could be infinitely taxing; layers and layers of effects can still be piled on. When I said Avatar was a bad example, I didn't argue with much else of what you said, but Avatar is CGI (with human artistry on top of the computer-generated work in spots). Just because the new Super Bug can do 0-62 in 1.8 seconds doesn't mean you can use it as proof that every four-wheeled car on a Scion budget has that much room for improvement. No! Consoles are budget computers, just like the one you've got. There is no way in hell these consoles can go anywhere near Avatar, which is just a series of images displayed one after another with ZERO interaction that causes any recalculation.

Again, I am OK if you insult me with "as if I don't understand what I'm saying". But I do hope you are able to separate TV resolution from computer-generated image quality. Understand that it wasn't the resolution that caused your computer to huff and puff, but the PHYSICS. Things wouldn't look as jagged if they were painted with a finer brush. The granularity is EASY. But making all those grains dance, shimmer, and reflect light is super hard and super taxing. When you lowered the resolution, your PC finally caught up, but the physics won't show their full effect if your display resolution is less granular than what the engine is trying to refine.

That's why I really agree that 1080p display resolution is going to be it for a while, at the expense of 4K TVs. So I agree with you that resolution is not the only way to improve graphics (even though it's the easiest way), but I respectfully disagree about how much room there is within console budgets.

You really are a mouthful. -_-

You basically say the same thing I do, but from a different point of view. 

You say the physics are what's taxing, and then say it is easier for those physics to be rendered at a lower resolution. I say the resolution is taxing, because rendering those same physics at a higher resolution makes my PC struggle.

You play semantics. I understand you like to hear yourself talk and read your own words. But if you want to argue over something, at least do it when the situation remotely requires it.

The point I was making, you finally agreed with. But the jury is still out on how much the visuals (and everything that implies) can be improved if our displays remain at the 1080p standard.