ctalkeb said:
Sri Lumpa said:
Barozi said:
Viper1 said:
Barozi said:

proof ?

At the moment Nintendo has only confirmed that games can be upscaled to 1080p, which is the same as EVERY single Xbox 360 game and about half of all PS3 games. 1080p is supported, that's it. I haven't read anything saying they will run natively at 1080p or that they will all run at 60 frames per second.
Which is, by the way, VERY unlikely. It's not like the technology will suddenly stop. That Unreal Engine 4 promo tech demo won't run at 60FPS and 1080p on modern PCs.

1080p at 30FPS will be the absolute maximum that can be achieved with modern graphics. That doesn't mean technically less complex games like NSMB Wii U couldn't run at 60FPS, but that's not much different from XBLA or PSN games that also support native 1080p.
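A quick way to see why "supports 1080p" and "renders natively at 1080p" are such different claims: native 1080p means drawing 2.25 times as many pixels per frame as a 720p image that is merely upscaled. A minimal sketch in Python, using only the standard resolution figures:

# Rough pixel-count comparison (illustrative only; nothing here is confirmed Wii U behaviour).
rendered_720p = 1280 * 720      # pixels actually rendered, then upscaled to 1080p
rendered_1080p = 1920 * 1080    # pixels rendered for true "full HD"
print(rendered_720p)                     # 921600
print(rendered_1080p)                    # 2073600
print(rendered_1080p / rendered_720p)    # 2.25 -> native 1080p is 2.25x the rendering work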

Nintendo never once said anything about upscaling outside of stating Wii U would not upscale Wii titles.

And please don't comment about modern graphics if you're going to make silly statements like that.  Everybody that games on a PC just kinda snickered at you.

But when you just say that it can do 1080p, you don't need to say that it's only upscaled, because it would be true either way. It would be stupid for a company to point out that it's not true full HD anyway. It's just remarkable that Nintendo talks "all the time" about 1080p with the Wii U but fails to mention whether the games actually run at that resolution or not. Very vague. I bet they're not sure themselves.


Not a silly statement after all, huh?
Obviously I was not talking about games like Duke Nukem Forever or World of Warcraft or anything like that.
Good luck finding a PC that can do BF3 at 60 frames per second with MAX settings at 1080p (or 1200p) when it comes out in a few months.

That statement is even more silly than your original one, where you said that 1080p and 30fps were the absolute maximum, given that you tried to prove it with a benchmark showing a modern graphics card averaging almost 50% more than the 30fps you tout as the absolute maximum. Pretty dumb, right?

Another example taken from my earlier post: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/14

All I see here is that PC gamers have problems with reading comprehension. If you look further down in the posted comparison you'll find how the HD 4870 does, which is obviously what he was referring to: how a modern game performs on hardware similar to what is suspected to be in the Wii U.

That argument doesn't work because there is a huge difference in terms of optimization between closed systems like consoles and PC games, which need to work on a wide range of hardware with an operating system consuming resources behind them. If there were a PC with hardware comparable to what we find in the PS3 and 360, disregarding the first obstacle of too little RAM, we shouldn't expect it to run anything near as advanced as what those platforms run now.



ctalkeb said:

All I see here is that PC gamers have problems with reading comprehension. If you look further down in the posted comparison you'll find how the HD 4870 does, which is obviously what he was referring to: how a modern game performs on hardware similar to what is suspected to be in the Wii U.

It's not our reading comprehension that is the problem, it's your hardware comprehension.  Just as sc94597 stated, the difference between an open hardware platform and a closed hardware platform is quite huge.  Take the PS3's RSX and the X360's Xenos, for instance.  If you put them into a PC in their original console incarnations, you'd never see any of the games displayed as you do on the consoles themselves.  The RAM limitations alone would bottleneck the crap out of them.  But in a closed hardware system like a console, with fewer system resources competing for RAM and CPU cycles, they can render much better graphics, frame rates and resolutions.

PC GPUs also include many features that are not needed on a console: the Unified Video Decoder, an upscaler, video transcoding hardware, the PCI Express bridge, the sideport interconnect, etc.

An HD 4870 in a closed hardware system would sing like Sarah Brightman.



The rEVOLution is not being televised

sc94597 said:

That argument doesn't work because there is a huge difference in terms of optimization between closed systems like consoles and PC games, which need to work on a wide range of hardware with an operating system consuming resources behind them. If there were a PC with hardware comparable to what we find in the PS3 and 360, disregarding the first obstacle of too little RAM, we shouldn't expect it to run anything near as advanced as what those platforms run now.


Well, I know that, of course - you can just look at how badly Linux ran on the PS3 as an example - and I don't necessarily agree completely with the thread opener, but snickering and referring to GTX 580 cards was uncalled for and just shows that the people replying have problems with basic comprehension within a context.

When someone is making fun of someone because of their own lack of understanding, I think they deserve to be called out on it.



Viper1 said:

It's not our reading comprehension that is the problem [...]


I was specifically referring to Sri Lumpa; read his post again and you can clearly see that he thinks the GTX 580 is the card being discussed.

I also assumed that your comment on his line "1080p and 30FPS will be the absolute maximum that can be achieved with modern graphics" indicated that you thought he was speaking about graphics in general, rather than "modern graphics on the Wii U". I apologize to you if I was somehow mistaken.

That the HD 4870 (which won't be the exact design of the card, of course) will do far better in a closed, custom system than it does in a PC is beyond questioning.

I have many doubts about the Wii U concept, but that it'll be a noticeable improvement on the PS3/360 graphically is not one of them.

Edit: Spelling.

2nd Edit: Clarification (it's been a long day)



Viper1 said:
ctalkeb said:

All I see here is that PC gamers have problems with reading comprehension. If you look further down in the posted comparison you'll find how the HD 4870 does, which is obviously what he was referring to: how a modern game performs on hardware similar to what is suspected to be in the Wii U.

It's not our reading comprehension that is the problem, it's your hardware comprehension.  Just as sc94597 stated, the difference between an open hardware platform and a closed hardware platform is quite huge.  Take the PS3's RSX and the X360's Xenos, for instance.  If you put them into a PC in their original console incarnations, you'd never see any of the games displayed as you do on the consoles themselves.  The RAM limitations alone would bottleneck the crap out of them.  But in a closed hardware system like a console, with fewer system resources competing for RAM and CPU cycles, they can render much better graphics, frame rates and resolutions.

PC GPUs also include many features that are not needed on a console: the Unified Video Decoder, an upscaler, video transcoding hardware, the PCI Express bridge, the sideport interconnect, etc.

An HD 4870 in a closed hardware system would sing like Sarah Brightman.


Don't forget that the GPUs on many PC graphics cards tend to have features that could (theoretically) be used to improve performance and visual quality, but these go unused in most games until they become part of a standard supported by all manufacturers and the cards become common. On top of this, many console GPUs have had significant modifications designed to improve performance or to incorporate features that their PC counterparts didn't have.
 



ctalkeb said:
Viper1 said:

It's not our reading comprehension that is the problem [...]


I was specifically referring to Sri Lumpa; read his post again and you can clearly see that he thinks the GTX 580 is the card being discussed.

I also assumed that your comment on his line "1080p and 30FPS will be the absolute maximum that can be achieved with modern graphics" indicated that you thought he was speaking about graphics in general, rather than "modern graphics on the Wii U". I apologize to you if I was somehow mistaken.

That the HD 4870 (which won't be the exact design of the card, of course) will do far better in a closed, custom system than it does in a PC is beyond questioning.

I have many doubts about the Wii U concept, but that it'll be a noticeable improvement on the PS3/360 graphically is not one of them.

Edit: Spelling.

2nd Edit: Clarification (it's been a long day)

I believe we have reached a much better common understanding now.



The rEVOLution is not being televised

ctalkeb said:
Sri Lumpa said:
Barozi said:
Viper1 said:
Barozi said:

proof ?

At the moment Nintendo has only confirmed that games can be upscaled to 1080p, which is the same as EVERY single Xbox 360 game and about half of all PS3 games. 1080p is supported, that's it. I haven't read anything saying they will run natively at 1080p or that they will all run at 60 frames per second.
Which is, by the way, VERY unlikely. It's not like the technology will suddenly stop. That Unreal Engine 4 promo tech demo won't run at 60FPS and 1080p on modern PCs.

1080p at 30FPS will be the absolute maximum that can be achieved with modern graphics. That doesn't mean technically less complex games like NSMB Wii U couldn't run at 60FPS, but that's not much different from XBLA or PSN games that also support native 1080p.

Nintendo never once said anything about upscaling outside of stating Wii U would not upscale Wii titles.

And please don't comment about modern graphics if you're going to make silly statements like that.  Everybody that games on a PC just kinda snickered at you.

But when you just say that it can do 1080p, you don't need to say that it's only upscaled, because it would be true either way. It would be stupid for a company to point out that it's not true full HD anyway. It's just remarkable that Nintendo talks "all the time" about 1080p with the Wii U but fails to mention whether the games actually run at that resolution or not. Very vague. I bet they're not sure themselves.


Not a silly statement after all, huh?
Obviously I was not talking about games like Duke Nukem Forever or World of Warcraft or anything like that.
Good luck finding a PC that can do BF3 at 60 frames per second with MAX settings at 1080p (or 1200p) when it comes out in a few months.

That statement is even more silly than your original one, where you said that 1080p and 30fps were the absolute maximum, given that you tried to prove it with a benchmark showing a modern graphics card averaging almost 50% more than the 30fps you tout as the absolute maximum. Pretty dumb, right?

Another example taken from my earlier post: http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/14

All I see here is that PC gamers have problems with reading comprehension. If you look further down in the posted comparison you'll find how the HD 4870 does, which is obviously what he was referring to: how a modern game performs on hardware similar to what is suspected to be in the Wii U.

I am not a PC gamer actually, I am a gamer. My preference gravitates between Nintendo and PC games, true, but I also have a PS3 and a 360 because what matters are the games, and though I tend to prefer Nintendo-style games there are also plenty of non-Nintendo games I like that never see a release on PCs, thus making those purchases worthwhile. FYI, I also have an NES, SNES, N64, GC, PS1, PS2, Mega Drive (a.k.a. Genesis), Dreamcast and Xbox, so don't go thinking I am a PC whore, but if we are talking about state-of-the-art graphics in games then you can't avoid talking about PCs.

As for reading comprehension, I can only read what he wrote, not what he meant if what he meant differs from what he wrote, and he clearly wrote about the absolute maximum of modern graphics, not about the absolute maximum of likely Wii U hardware.

But maybe you are right that he meant that. So let's see if there is any game with modern graphics (a full release, not a smaller downloadable title) that a Radeon HD 4870 (which is suspected to be in the Wii U hardware) can run either at 30fps but at a higher resolution than 1080p, or at 1080p but at a higher framerate than 30fps. Remember that only one game is necessary to prove his contention false (well, yours really), as such a game would be above said "absolute maximum".
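To make that test concrete, here is a minimal sketch in Python of the check being applied; the function name and structure are my own, but the thresholds are exactly the claimed 1080p/30fps ceiling:

CLAIMED_MAX_PIXELS = 1920 * 1080
CLAIMED_MAX_FPS = 30

def exceeds_claimed_maximum(width, height, fps):
    # One benchmark result beating either side of the claimed ceiling disproves it.
    pixels = width * height
    higher_res_at_30fps = pixels > CLAIMED_MAX_PIXELS and fps >= CLAIMED_MAX_FPS
    higher_fps_at_1080p = pixels >= CLAIMED_MAX_PIXELS and fps > CLAIMED_MAX_FPS
    return higher_res_at_30fps or higher_fps_at_1080p

print(exceeds_claimed_maximum(1920, 1200, 42))  # True -- see the ME2 numbers below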

The problem is that GPU reviews tend to use current-generation and previous-generation cards, and the HD 4870, being two generations behind, is not generally reviewed alongside newer cards, so it is harder to find reviews that pair the HD 4870 with recent games (say, less than two years old). It took me a little while, but I did manage to find a review with the HD 4870 and Mass Effect 2, an 18-month-old game (almost, except on PS3 of course):

http://www.pcgameshardware.com/aid,703669/Mass-Effect-2-Galactic-battle-Geforce-versus-Radeon/Practice/

As you can see, at 1920x1200 (slightly higher than 1080p) and with 4xMSAA, which has to be forced in the driver as ME2 doesn't expose AA (because Unreal Engine 3 doesn't support it), the HD 4870 runs it at 30fps minimum (not an average with dips below that, a minimum); on average it runs at 42fps, 40% faster than the purported absolute maximum speed and at a slightly (about 11%) higher resolution. If we run it the way it runs on the consoles, with little or no AA that is, the HD 4870 manages 53fps minimum and 70fps average. That's a huge 77% and 133% faster than the absolute maximum of 30fps.
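For anyone who wants to verify the arithmetic, a quick sketch using the fps figures quoted from the review above (the percentages are simply relative to the claimed 30fps ceiling):

claimed_max_fps = 30
avg_with_aa = 42                # 1920x1200, 4xMSAA forced in the driver
min_no_aa, avg_no_aa = 53, 70   # 1920x1200, little or no AA

print((avg_with_aa / claimed_max_fps - 1) * 100)   # 40.0  -> 40% above the "maximum"
print((min_no_aa / claimed_max_fps - 1) * 100)     # ~76.7 -> roughly 77% above
print((avg_no_aa / claimed_max_fps - 1) * 100)     # ~133.3 -> roughly 133% above
print((1920 * 1200) / (1920 * 1080) - 1)           # ~0.111 -> about 11% more pixels than 1080p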

So is 1920x1080 at 30fps the absolute maximum for an HD 4870-class GPU? Mass Effect 2 says it isn't.

And it gets better, because these benchmarks were done on PCs, and as John Carmack said:

"But it's frustrating in that a lot of the PC systems that are many times more powerful still have trouble holding the same 60 frames-per-second rate because of API overhead, API clocking issues, and things like that. We're working with Intel and Nvidia on all these issues, but it is kind of frustrating when I know that the hardware is vastly more powerful but because we don't have quite as tight control over it, a lot of power goes to waste."

If Nintendo does use a chip with power equivalent to the HD 4870, you can expect even better results, as it will not have the same overhead a PC operating system does (which is one of the reasons consoles do not become immediately obsolete when more powerful PC GPUs come out; it takes a few generations for the raw-power gap to overcome the consoles' greater efficiency*).

So is it possible for Nintendo to make a console that can run a reasonable number of reasonably recent games at 1080p and more than 30fps by using a two-generations-old GPU (PC GPU generations of course, not console generations)? Yes, definitely! Would it run ALL games that way? Unlikely, as there are always games that push the hardware further or are less optimised... Will they do so? Time will tell, but I certainly hope so, as it would not only allow them to run current games very well but would also go a long way toward letting them run games designed for the PS4/Xbox Next, just at reduced IQ and a lower framerate/resolution.
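A rough way to gauge the size of that ask, assuming rendering load simply scales with pixels times frames (a deliberate oversimplification that ignores everything else a GPU does per frame): 1080p at 30fps is 2.25 times the pixel throughput of the 720p/30fps that many current console games target, while 1080p at 60fps is 4.5 times.

def pixels_per_second(width, height, fps):
    # Very crude proxy for rendering load: pixels drawn per second.
    return width * height * fps

baseline = pixels_per_second(1280, 720, 30)          # a common current-gen console target
print(pixels_per_second(1920, 1080, 30) / baseline)  # 2.25
print(pixels_per_second(1920, 1080, 60) / baseline)  # 4.5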

Anyway, like I said earlier, I buy consoles (and PC GPUs) for the games they allow me to play, so my main reason to buy the Wii U will be to see how Nintendo uses it for Zelda, Pikmin and Mario, and I hope third parties will support it better now that the "it's not HD" barrier is gone (and that they will also make good use of the new controller, of course).

* Another reason is the fixed platform, which allows optimising for it, whereas you cannot optimise for the myriad of PC configurations.



"I do not suffer from insanity, I enjoy every minute of it"

 

Sri Lumpa said:

As for reading comprehension, I can only read what he wrote [...].

But maybe you are right that he meant that. [...] Remember that only one game is necessary to prove his contention false (well, yours really), as such a game would be above said "absolute maximum".

I think that in a post that specifically talks about the Wii U, its games and its hardware capabilities, it's not necessary to constantly write the words "Wii U". I could be mistaken about his intent, of course, but that's how I read it, as there was no other way to interpret it that still made sense.

For your second point, you should probably read the rest of my posts - even though I guess I should have made myself clearer originally.

(Not quite sure if ME represents "modern graphics" very well - I've always thought it looks like console graphics slightly improved for the PC. Then again, I'm not fantastically impressed by The Witcher 2 either)



ctalkeb said:
Sri Lumpa said:

As for reading comprehension, I can only read what he wrote [...].

But maybe you are right that he meant that. [...] Remember that only one game is necessary to prove his contention false (well, yours really), as such a game would be above said "absolute maximum".

I think that in a post that specifically talks about the Wii U, its games and its hardware capabilities, it's not necessary to constantly write the words "Wii U". I could be mistaken about his intent, of course, but that's how I read it, as there was no other way to interpret it that still made sense.

For your second point, you should probably read the rest of my posts - even though I guess I should have made myself clearer originally.

(Not quite sure if ME represents "modern graphics" very well - I've always thought it looks like console graphics slightly improved for the PC. Then again, I'm not fantastically impressed by The Witcher 2 either)

While the discussion is about the Wii U, his statement was general, without any qualifier, and I took it as such. Given that he talked about modern PCs in the preceding sentence and then about modern graphics, it is quite easy to equate the two.

But even if I erred in that I still feel I showed that he was wrong. 

I have now read the rest of your posts (the problem with writing long posts that require research) and yes, we agree on that point. One nitpick though: Linux running badly on the PS3 is a poor example, as Sony did not allow the RSX to be used under Linux (understandable, as they did not want to risk people releasing full games that way without paying them licensing fees), so it is not so much a lack of optimisation as a lack of access to the hardware.

As for ME2 not necessarily being representative of modern graphics, it was hard to find a game that was reasonably recent, had been tested on the 4870 and had preferably been released on consoles as well (so that we know at what resolution they are able to run it, 720p in this case). I thought about Portal 2, but it seems to push graphics less, IMO (from a technical perspective, not from an art-style point of view).

Also, I could throw the reading-comprehension ball back at you since, as you so perspicaciously pointed out, we are talking about consoles and their graphics, and the Wii U will have to compete with the PS3 and 360 and their games, so using games that are available on them seemed important to me. If you want "modern" to mean PC games pushing PC GPUs (so the first Crysis and Metro 2033 would be good candidates) for the game part and the HD 4870 for the GPU part, then yes, I would agree that an HD 4870 cannot run modern PC games above 1080p at 30fps (well, not most of them anyway, if any), but this seems to unnecessarily stack things one way. Current PC games with a current PC GPU, or current console games with an HD 4870, seems a fairer comparison to me.

For the record, I don't think the Wii U GPU will have the raw performance of the 4870, because:

1. The Wii U, while bigger than the Wii, is still quite small.

2. Nintendo tends to eschew newer processes, as exemplified by the Wii U CPU being on a 45nm process, so I find it more likely that its GPU will be at 40nm rather than 32nm (if manufactured by GlobalFoundries) or 28nm (if manufactured by TSMC). 28nm at TSMC is very unlikely, as the GPU must tape out months before launch and TSMC's 28nm process probably won't be considered mature by then (and given the problems they had at 40nm, it would be a gamble to bet there won't be problems). As for GloFo's 32nm, they would have had to plan it well in advance, as you can't just switch manufacturing processes on a whim, and so far Radeons are designed on TSMC processes.

3. The HD 4870 consumes quite a lot of power. Combined with the small size of the Wii U and the unlikeliness of a smaller-than-40nm process, even after taking out parts a console doesn't need, I am not sure it would fit into the Wii U's GPU power budget (a rough sketch of the numbers follows the next paragraph).

However, as we discussed earlier (well, talked past each other), the greater efficiency and optimisation possible with custom hardware might make a GPU that is effectively as powerful as the HD 4870, whilst having less raw power, a possibility. I certainly would like that.
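On the power question from point 3, a back-of-envelope sketch. Both numbers here are my own assumptions rather than anything confirmed: roughly 150W board power for the desktop HD 4870 on its 55nm process, and a naive guess that power scales with die area on a shrink to 40nm. Even on those generous assumptions, the result is still a lot of power for a box the size of the Wii U:

# Back-of-envelope only; both figures below are assumptions, not confirmed specs.
hd4870_board_power_w = 150        # commonly cited figure for the desktop card (55nm RV770)
old_node_nm, new_node_nm = 55, 40

# Naive assumption: power tracks die area, i.e. scales with (new/old)^2.
scaled_power_w = hd4870_board_power_w * (new_node_nm / old_node_nm) ** 2
print(round(scaled_power_w))      # ~79 W, before stripping out parts a console doesn't need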

(damnit, another long post!)



"I do not suffer from insanity, I enjoy every minute of it"

 

Sri Lumpa said:

.

(damnit, another long post!)


I think we have reached perfect understanding and agreement, which is pretty amazing, given that this is the internet.