
WiiU - Lots of new Info - Same poor result. Why Nintendo Direct just confirms my fears.

We are talking consoles here, no? With PC gaming, the evolution is easy to see: every time a new GPU comes out, we see improved graphics performance. A high-end GPU alone costs more than an entire console.

The next-gen systems will be bottlenecked by the GPUs that go into them, and those won't be the highest end. We will see an improvement in console graphics, but it will be the kind of improvement the PC side has already been showing for a while now. To push graphics beyond a reasonable point, higher resolution is the only way to go.

Frame rate maxes out around 60 fps? Some superhumans claim to perceive more, but traditional films are 24 fps, The Hobbit was considered a breakthrough at 48 fps, and for the Avatar sequel Cameron is eyeing 48, maybe even 60 fps. There is only so much that faster rendering can do.

The effects gained from physics are not independent of resolution and fps. The only room for improvement under the 1080p cap is physics, and fully maxing out 1080p is a theoretical stage that there is no sensible way of actually reaching. The only reason resolution is being held back is that the TV industry needs years, a decade at a time, to profitably upgrade from one resolution to the next.
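To put rough numbers on the frame-rate side of this, here is a minimal back-of-the-envelope sketch (plain Python, using assumed target resolutions and frame rates rather than figures from any particular console) of how the time budget per frame shrinks as the frame rate goes up, and how the pixel throughput the hardware has to sustain grows with resolution times frame rate:

```python
# Back-of-the-envelope frame-budget arithmetic (illustrative assumptions only).
targets = [
    ("720p @ 30 fps", 1280, 720, 30),
    ("1080p @ 30 fps", 1920, 1080, 30),
    ("1080p @ 60 fps", 1920, 1080, 60),
]

for label, width, height, fps in targets:
    frame_time_ms = 1000.0 / fps        # time available to render one frame
    pixels_per_frame = width * height   # pixels that must be filled each frame
    pixels_per_second = pixels_per_frame * fps
    print(f"{label}: {frame_time_ms:.1f} ms/frame, "
          f"{pixels_per_frame:,} pixels/frame, "
          f"{pixels_per_second:,} pixels/s")
```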



Tarumon said:
The effects gained from physics are not independent of resolution and fps. The only room for improvement under the 1080p cap is physics, and fully maxing out 1080p is a theoretical stage that there is no sensible way of actually reaching. The only reason resolution is being held back is that the TV industry needs years, a decade at a time, to profitably upgrade from one resolution to the next.

GPUs and CPUs get better and better. Things like grass, trees, the geometry of figures and landscapes, and draw distance, among other things, have improved a lot over the years, yet they're nowhere near the point they could be. The same goes for physics, indeed.

Take the tech demos released over the last two or so years. They give a good example of what better hardware can render even if the resolution doesn't necessarily go any higher.




Raze said:
Michael-5 said:

Maybe, but is that true in all cases, and will developers make Wii U editions that integrate Wii U GamePad features?

If it were easy to just tack on Wii U Gamepad features, then why isn't the Wii U seeing versions of Mass Effect Trilogy, Grand Theft Auto 5, Dragon Age III, and Dark Souls 2?

If devs have a grudge against Nintendo, they won't port their games to Nintendo platforms; it's as simple as that. They might even build engines that are outside the Wii U's capabilities. The graphical difference between the Wii U and the NextBox/PS4 is still greater than the PS3/360 to Wii U difference, and we're still not seeing most PS3/360 games get ported.

True, but it'd be bad business to neglect an extra few million dollars in revenue simply because one company has a grudge against another. CEOs shouldn't dabble in high school drama.

The graphical improvements are going to be very minor from Wii U to PS4/720; it's all still 1080p. At best, there will be more RAM and processor speed. I've made the point elsewhere that I wouldn't hold my breath waiting for a leap in graphic quality from PS3 to PS4 or 360 to 720. The bottleneck is now at the end user: we're still in a world where the highest resolution output in homes is 1080p. We won't see 2K or 4K technology affordable and commonplace for at least another generation AFTER the PS4/720/Wii U. We're fairly maxed out on graphical improvements for another decade.

As for the GamePad, this is a good point, but not every game has to use it. Wii controls work on the Wii U, from what I understand of it, so it can be bypassed.

But CEOs are like children. Do you remember why Nintendo never localized the Project Rainfall games? Reggie didn't care for them. You know why he didn't localize Disaster: Day of Crisis? He didn't like the game. The same goes for other devs: except for Ubisoft, most devs hold back when it comes to Nintendo. I don't know if we'll see a mainstream Rockstar product on Wii U (like GTA V).

As for resolution, I don't know if most Wii U games will be native 1080p. Aren't a lot of current-gen games sub-720p and then upscaled? Isn't Halo 4 the first Halo rendered at native 720p? I hear that just running current-gen games in 1080p requires a good amount more processing power. Who cares if TVs aren't 2K or 4K; let's get 1080p down pat first.




Can someone tell me the difference between red ocean and blue ocean?




Hynad said:
Tarumon said:
Hynad said:
Tarumon said:
1080p is enough, more than enough, to display graphics the eyes can appreciate as high-def, just like 720p looks lifelike in my example of Discovery Channel programs. You start going into higher resolutions not because the eyes need it; it's the programmers who need it to fool the eyes, because the polygons, shading, light sources, anti-aliasing, and distortions are extremely resource-intensive computing processes. Avatar is not a video game where you can navigate around without lag and actually perform actions that cause rendering and re-rendering of all the environmental variables. The box we buy that is the console just doesn't have what it takes to do that. If my tone in my initial pass upset you, I apologize; I edited it, but you replied before my edit came out.


If you want to stick with Avatar, go back in time a little and look at the CGI movies of yesteryear and the game graphics of that era, then compare them with how game graphics now stack up against that old CGI. Avatar is quite an extreme example, but the point behind it is that game graphics manage to catch up to CGI quality over time, even though CGI keeps on progressing. The point still stands.

Now, when it comes to resolution, I'm not quite sure what you're trying to say, or why you think my post deserves to be argued with. Resolution is more taxing on the hardware than almost anything else.

Case in point: I can run The Witcher 2 at its highest graphics settings in 1360x768. My machine is roughly three years old and was built for gaming back then, for a moderate price. Now, if I run it with the same settings but with the resolution at 1080p, I have to start lowering some of the graphics options because the frame rate just can't keep up. I prefer to stick to the lower resolution, because all the effects are consistent with the resolution I use. Whereas if I use 1080p, the assets and effects take a bigger hit, because I have to scale back on their quality to make sure the frame rate stays remotely smooth. By doing so, I end up with a less consistent look throughout. The verdict being that the game looks better overall at the lower resolution, even if some of the edges may not appear as clean.
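For what it's worth, the raw pixel math behind that anecdote is easy to check. A quick sketch (assuming the two resolutions mentioned above) shows that 1920x1080 has roughly twice as many pixels as 1360x768, so at identical settings the per-frame shading work roughly doubles:

```python
# Pixel-count comparison for the two resolutions mentioned above (illustrative).
low = 1360 * 768       # 1,044,480 pixels
full_hd = 1920 * 1080  # 2,073,600 pixels

print(f"1360x768  -> {low:,} pixels per frame")
print(f"1920x1080 -> {full_hd:,} pixels per frame")
print(f"1080p shades about {full_hd / low:.2f}x as many pixels per frame")  # ~1.99x
```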

The way you're talking seems to imply that if the resolution of games doesn't go beyond 1080p, then graphics will stall and won't improve in quality even if the hardware gets better. I may be wrong, but that's what I got from the way you wrote this. Like you're talking about something you don't fully grasp. In any case, I'll just reiterate that a higher resolution isn't the only way to improve graphics, and it is in fact more demanding on the hardware than most other graphical processes.

I already explained what could still be improved greatly even if the resolution remains the same. I don't see why you'd try to refute what I said, to be honest.

My case isn't about next gen and how powerful it's gonna be. My point revolves only around the 1080p resolution and what is possible to push even if we stick with it for the foreseeable future.


I think you have a fundamental lack of understanding of computer graphics. What is taxing to the system is not the display resolution; 1920x1080 is all it takes to fill the pixels. It's the physics required to render believable images at that resolution that is taxing. It is much easier for a computer to render 100 images on a same-sized screen using the smaller pixels of a higher-resolution display than to force a programmer to render the same 100 images on a same-sized screen using bigger pixels. TV resolution refers to the number of physical pixels. Your computer has a hard time rendering at 1080p because your GPU and CPU cannot support the number of calculations needed to render the image in a timely manner.

Your verdict is based on a system that rendered the pictures, but not as intended. Systems that can easily produce 1080p with all the physics produce BETTER quality pictures with cleaner edges. There are effects you simply don't see, such as shimmering.

Video game = interactive computer-generated images. What is taxing is emulating reality with computer-generated images. A monitor with higher resolution gives you finer dots to draw with. 1080p is far fewer pixels than many decent-sized computer monitors. Graphics have improved as GPUs and CPUs have improved. Current-gen consoles have problems populating 720p screens at much beyond 30 frames per second. The limiting factor was what was under the hood.

We now have 1080p. The Wii U is 1080p capable. That doesn't mean that if everyone rendered at 1080p, the graphics would all be nearly the same quality. The physics could be infinitely taxing; layers and layers of effects can still be piled on. When I said Avatar was a bad example, I didn't argue with much else of what you said, but Avatar is CGI (with human art versus computer-generated in spots). Just because the new Super Bug can do 0-62 in 1.8 seconds doesn't mean you can use that as an example to prove that every car with four wheels and a Scion budget has much room for improvement. No! Consoles are budget computers, just like the one you've got. There is no way in hell these consoles can go anywhere near Avatar, which is just a series of images displayed one after another with ZERO interaction that causes any recalculation.

Again, I am OK if you insult me with "as if I don't understand what I'm saying". But I do hope you are able to separate TV resolution from computer-generated image quality. Understand that it wasn't the resolution that caused your computer to huff and puff, but the PHYSICS. Things wouldn't look as jagged if they were painted with a finer brush. The granularity is EASY. But making all those grains dance, shimmer, and reflect light is super hard and super taxing. If you lower the resolution, your PC finally catches up, but the physics won't work if by default your display resolution is less granular than what the engine is trying to refine.

That's why I really agree that 1080p display resolution is going to be it for a while, at the expense of 4K TVs. So I agree with you that resolution is not the only way to improve graphics (even though it's the easiest way), but I respectfully disagree about how much room there can be on console budgets.

You really are a mouthful. -_-

You basically say the same thing I do, but from a different point of view. 

You say that the physics is what's taxing, and then you say it is easier for that physics to be rendered at a lower resolution. I say the resolution is taxing, because rendering that same physics at a higher resolution makes my PC struggle.

You play semantics. I understand you like to hear yourself talk and read your own words. But if you want to argue over something, at least do it when the situation remotely requires it.

The point I was making, you finally agreed to it. But the jury is still out as to how much the visuals (and all that that implies) can be improved if our displays remain at a 1080p standard.


You have a hard time not rubbing in an insult here and there. Your PC at a lower resolution REQUIRES anti-aliasing, because the game feeds more data than can be displayed, resulting in artifacts and distortions. It is an extra step taken so you can view purposely blurred images that appear better to your eyes on that gimped GPU. The image was downsized for you.

The same game was designed to deliver more pixels, and it will thus look better and require less work to display IF your GPU was designed to handle it. A lower resolution is not necessarily easier. A GPU with the most advanced physics is going to be capable of much higher resolutions than 1080p by default. Higher resolution means more detail, and crisper images mean better graphics. The GPUs that go into consoles, due to budget, need a lot of engineering on the game developer's side to optimize. If developers maxed out the polygons, with the same pixels but much more strain on system resources, the console would grind to a halt. The graphics bottleneck is much, much harder to overcome than you think.
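The two positions in this exchange can be put on the same footing with a very simplified cost model: the cost of a frame is roughly the per-pixel work (shading, lighting, anti-aliasing) multiplied by the number of pixels, plus work that doesn't scale with resolution (simulation, animation, physics on the CPU). The sketch below uses made-up illustrative numbers, not measurements from any real engine:

```python
# Toy per-frame cost model (made-up units, not measurements from any engine):
#   total cost = per-pixel work * pixel count + resolution-independent work
def frame_cost(width, height, per_pixel_cost, fixed_cost):
    """Relative cost of one frame under a deliberately simplified model."""
    return per_pixel_cost * width * height + fixed_cost

per_pixel = 1.0     # arbitrary units of shading/lighting/AA work per pixel
fixed = 500_000.0   # arbitrary units of physics/animation work per frame

for w, h in [(1360, 768), (1920, 1080)]:
    print(f"{w}x{h}: relative frame cost = {frame_cost(w, h, per_pixel, fixed):,.0f}")

# Raising the resolution multiplies the per-pixel term; piling on heavier
# effects raises per_pixel instead. Either way, the frame gets more expensive.
```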

Just borrow a high-end PC with a high-end GPU, connect the HDMI to your TV, play the game at 1080p, then put the same game on a monitor. That is how much a 4,000-dollar system can do with 1080p. Impressed? At least I'm not. The Wii U is a distant relative, an ancestor in terms of technology, of the beast that is doing that 1080p... how much can you expect?

I won't reply to you anymore because insults are not why I came to these forums.  Have fun arguing.

 

 



Tarumon said:
I won't reply to you anymore because insults are not why I came to these forums. Have fun arguing.

Are you kidding me? Better hardware allows better graphics? Really? Who would have thought?

Thanks for the input, Jeff.

That's not what I was arguing, but ok. I guess it was nice witnessing this conversation you had with yourself. -_-



RolStoppable said:
CCFanboy said:
Can someone tell me the difference between red ocean and blue ocean?

Red ocean means to compete head to head in an established market. Companies cannibalize each other and profit margins get slimmer. Red ocean, because that's the color the sea turns into when the sharks go at each other. In business terms, companies begin to bleed money and only the strongest survive.

Blue ocean means to differentiate oneself and create new values. Sidestepping the competition in order to sell to consumers who are underserved by other companies. By being the only company to address a certain demand, the profit margins are high and the growth can be huge.

In a red ocean, the company that has the most resources usually comes out on top. That's why it would be wise for a company like Nintendo to not directly compete with Microsoft who have an enormous amount of money.


That's why I feel so bad for Sony. Any market MSFT wades into turns into an instant red ocean. They've got jaws and they're after blood.



RolStoppable said:

In a red ocean, the company that has the most resources usually comes out on top. That's why it would be wise for a company like Nintendo to not directly compete with Microsoft who have an enormous amount of money.


Thanks for clearing that up. Nintendo has really done its own thing since Iwata took over. So if Microsoft decides to go the casual route, then that is their choice.




After reading the whole debacle, I see this thread as more of a clash of ideals between Wii fans (the OP) and Nintendo fans (the majority of us), with Nintendo trying to establish a middle ground. But we already heard Nintendo say they're going for core gamers even before the Wii U was out, so I really don't see the point in this, unless Nintendo overshoots too far toward either side. At this point it's too early to tell, and it just leads to speculation...



Hynad said:

That being said, I wasn't talking about next-gen consoles per se, just the nonsense you mentioned about resolution, which I already addressed in one of my earlier posts. You brought 2K resolution into the mix. 2K is so close to 1080p (2048x1152 compared to 1920x1080) that mentioning it as a likely future standard is silly at best. 4K is definitely going to be the next evolution. But even until it becomes the standard, 1080p still hasn't been maxed out by consoles, and even PCs barely manage to take full advantage of it, let alone any higher resolution. GPUs are still going to evolve, get faster, become capable of rendering more per frame, etc., even if the resolution remains at 1080p.
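The pixel counts back that comparison up; a quick sketch (using the 2K figure quoted above and assuming 3840x2160 for 4K UHD) shows 2K is only about 14% more pixels than 1080p, while 4K is four times as many:

```python
# Pixel counts for the resolutions discussed above (illustrative check).
full_hd = 1920 * 1080  # 2,073,600
two_k   = 2048 * 1152  # 2,359,296 (the 2K figure quoted above)
four_k  = 3840 * 2160  # 8,294,400 (4K UHD)

print(f"2K vs 1080p: {two_k / full_hd:.2f}x the pixels")   # ~1.14x
print(f"4K vs 1080p: {four_k / full_hd:.2f}x the pixels")  # 4.00x
```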

Hey, go play Call of Duty 3 in 1080p. Then put Skyrim, Witcher 2 or Crysis 2 at that same resolution. If this doesn't give you the big picture, I don't know what will.

True, there will be some improvements, but I'm currently playing recently released games on my gaming PC rig, which was built in mid-2011, and while the graphics are good, with excellent hardware, the improvements aren't dramatic over the best-looking 360/PS3 games. Neither Sony nor MS will go toward the high end of GPUs, due to costs, so at best they'll be on par with my PC's graphics card, and I'm rolling with 16 GB of RAM, six cores at 3.2 GHz per core, and 1 GB of RAM on my video card. So even if the specs are comparable to my current PC, there's not going to be an "Oh my God, this looks sooo much better" response between current gen and the Sony/MS next-gen machines, nowhere close to the jump from SD to HD.

The point is, it's going to require both Sony and MS to move in a different direction to make the sale this time around.


