
Wii U GPU new info and more speculation

Cheebee said:
betacon said:
Cheebee said:

What is it with all the WiiU specs threads and PS3 fans coming in and flooding the threads with posts downplaying WiiU? Can't they go play their PS3s or post in some God of War or Last of Us thread or something..? :s
Everybody and their grandma knows by now that you guys don't like WiiU and will probably never get one. We don't care. Get over it. -Nobody- is saying WiiU is an insane powerhouse, or that PS4/X720 won't be more powerful. Obviously. Get over yourselves. PS4/X720 will be -severely- underpowered compared to PCs as well.

Gosh, I don't see any PC fans going around flooding PS360 threads with countless posts of how sad those consoles' specs are, or having sigs touting the inferiority of said consoles compared to their obviously superior PCs.

Bit sensitive. This is an internet forum for gaming discussion; I fail to see what's wrong with speculating about the specs of a system a company has refused to show.

 

Nothing wrong with that. But there's a difference between what you're suggesting and what's happening on here. It's overkill.

NeoGAF and Beyond3D are thinking it's 160 SP now, and DF will run an article about whether or not the Wii U is 160 SP.




Logically I don't see how it could run Black Ops 2 not only on one screen but on the second screen at the same time if it was less powerful than the 360. It doesn't make any sense.



Soundwave said:
Logically I don't see how it could run Black Ops 2 not only on one screen but on the second screen at the same time if it was less powerful than the 360. It doesn't make any sense.

Some quotes about the gamepad affecting performance.

 

I said streaming an image to the gamepad is free; streaming a copy of what is on the TV is free (as confirmed by devs).

Of course rendering a completely separate image is going to take additional resources. Having a HUD in a game technically takes up additional resources, but those things would have already been rendered in one form or another. For example, rendering a map on the gamepad should not take up more resources than rendering it on screen. However, a completely interactive 3D camera would obviously constitute a performance hit because it's rendering another 3D image; it should be similar to the hit that 360 or PS3 games experience in split-screen multiplayer.

I'm sorry if my previous comment was worded in a way that could be misconstrued, but I was referring to the act of streaming an image being free (not rendering an interactive 3D camera or an animated overlay). There is apparently a dedicated memory pool on the chip for the gamepad's framebuffer, so it is literally free.

 

The people on NeoGAF who did the GPU teardown have been postulating that there is hardware on the chip whose sole purpose is to render and push out images to the gamepad. If this is true, then rendering the gamepad image does not take up any more resources than it would take to render the same image on the TV screen. The entire gamepad framebuffer is free even though it's pushing out a 480p image. No resources would be freed up if you turned the gamepad off, because the hardware that pushes out the image would simply go unused.


Essentially there is dedicated hardware that handles the framebuffer separately, meaning that putting an image on the gamepad does not take up additional resources, unless you are pushing something that would not normally be rendered to the screen (like a second camera).
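To illustrate the cost model being described, here is a minimal sketch using made-up per-pixel numbers (this is not Wii U SDK code; render_cost_ms, frame_cost and the 8 ms-per-megapixel figure are purely hypothetical):

    # Toy model: render time scales with pixel count (numbers are invented).
    def render_cost_ms(pixels, ms_per_megapixel=8.0):
        return pixels / 1_000_000 * ms_per_megapixel

    TV_PIXELS = 1280 * 720        # main view on the TV
    GAMEPAD_PIXELS = 854 * 480    # gamepad view (480p)

    def frame_cost(second_camera=False):
        cost = render_cost_ms(TV_PIXELS)
        if second_camera:
            # A genuinely separate 3D view needs its own render pass,
            # much like the extra view in split-screen multiplayer.
            cost += render_cost_ms(GAMEPAD_PIXELS)
        # Pure mirroring adds nothing here: the dedicated scan-out hardware
        # copies the finished TV framebuffer into the gamepad's own buffer.
        return cost

    print(frame_cost(False))  # mirroring only
    print(frame_cost(True))   # independent gamepad camera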



ninjablade said:
Cheebee said:
betacon said:
Cheebee said:

What is it with all the WiiU specs threads and PS3 fans coming in and flooding the threads with posts downplaying WiiU? Can't they go play their PS3s or post in some God of War or Last of Us thread or something..? :s
Everybody and their grandma knows by now that you guys don't like WiiU and will probably never get one. We don't care. Get over it. -Nobody- is saying WiiU is an insane powerhouse, or that PS4/X720 won't be more powerful. Obviously. Get over yourselves. PS4/X720 will be -severely- underpowered compared to PCs as well.

Gosh, I don't see any PC fans going around flooding PS360 threads with countless posts of how sad those consoles' specs are, or having sigs touting the inferiority of said consoles compared to their obviously superior PCs.

Bit sensitive. This is an internet forum for gaming discussion; I fail to see what's wrong with speculating about the specs of a system a company has refused to show.

 

Nothing wrong with that. But there's a difference between what you're suggesting and what's happening on here. It's overkill.

NeoGAF and Beyond3D are thinking it's 160 SP now, and DF will run an article about whether or not the Wii U is 160 SP.


On the face of what we do know, that's impossible. Anyone speculating such is an idiot, a fanboy, or both. The Wii U's CPU is supposedly around 15 GFLOPS, which is basically 1/5 of the GFLOPS of the 360's CPU. It would not be possible for the ports to be as close as they are with that CPU mated to a 176 GFLOPS GPU.
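For reference, this is the back-of-the-envelope arithmetic behind those figures, assuming the commonly reported (but unconfirmed) clock speeds of roughly 1.24GHz for the Wii U CPU and 550MHz for the GPU, and the usual theoretical FLOPs-per-cycle counts:

    # Rough, unofficial estimates; clocks and FLOPs-per-cycle are assumptions, not confirmed specs.
    wiiu_cpu = 3 * 1.24 * 4      # 3 cores x 1.24 GHz x ~4 FLOPs/cycle ~= 14.9 GFLOPS
    x360_cpu = 3 * 3.2 * 8       # 3 cores x 3.2 GHz x 8 FLOPs/cycle  ~= 76.8 GFLOPS
    ratio = wiiu_cpu / x360_cpu  # ~0.19, i.e. roughly 1/5

    wiiu_gpu_160sp = 160 * 2 * 0.550   # 160 SPs x 2 ops x 550 MHz = 176 GFLOPS

    print(wiiu_cpu, x360_cpu, ratio, wiiu_gpu_160sp)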



ninjablade said:
Cheebee said:
betacon said:
Cheebee said:

What is it with all the WiiU specs threads and PS3 fans coming in and flooding the threads with posts downplaying WiiU? Can't they go play their PS3s or post in some God of War or Last of Us thread or something..? :s
Everybody and their grandma knows by now that you guys don't like WiiU and will probably never get one. We don't care. Get over it. -Nobody- is saying WiiU is an insane powerhouse, or that PS4/X720 won't be more powerful. Obviously. Get over yourselves. PS4/X720 will be -severely- underpowered compared to PCs as well.

Gosh, I don't see any PC fans going around flooding PS360 threads with countless posts of how sad those consoles' specs are, or having sigs touting the inferiority of said consoles compared to their obviously superior PCs.

Bit sensitive. This is an internet forum for gaming discussion; I fail to see what's wrong with speculating about the specs of a system a company has refused to show.

 

Nothing wrong with that. But there's a difference between what you're suggesting and what's happening on here. It's overkill.

NeoGAF and Beyond3D are thinking it's 160 SP now, and DF will run an article about whether or not the Wii U is 160 SP.

Actually, during this time I did a little research and found out a few things about the 160SP theory.

 

First, let's assume that the roughly 30% of the GPU die that is still unidentified contains nothing relevant. From my calculations, 30% of 156.21mm^2 (the Wii U's total GPU die size) gives us 46.86mm^2, which is the size of the unknown space in the Wii U's GPU based on the information we have. Subtracting that along with the eDRAM module, which measures 38.68mm^2, we get roughly 70.7mm^2 of space left for the actual GPU logic on the die.

 

Now, let's take a 160SP Radeon GPU for die size comparison. The Radeon HD 6450, which has 160 SPs, a core clock of 750MHz and, ironically, is manufactured on TSMC's 40nm process, has a die size of 67mm^2, which fits inside the ~70.7mm^2 on the Wii U. So, if we completely rule out the unknown space on the GPU, then, yes, we probably have a downclocked 160SP Radeon HD 6450 in the Wii U. Actually, this GPU is quite a match for the Wii U, as its TDP is only 18W at 750MHz, which could easily be reduced by downclocking the GPU. And, based on the research I've done, only Radeon HD 69xx cards have the double-precision extensions, so if the Wii U's GPU is the HD 6450 we could rule out any possibility of increasing SP density in the SIMD blocks by removing the DP extensions. However, there are still ways to increase SP density in SIMD blocks besides removing the DP units, so I can't confirm the GPU being a 160SP part, nor an HD 6450, because there are still multiple factors that could change the SP count at similar die sizes.
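Spelling out that die-size arithmetic (all figures come from the die-shot analysis discussed in this thread, not from official numbers):

    # Die-size arithmetic for the 160SP theory; figures are from the thread, not Nintendo.
    total_die  = 156.21                       # Wii U GPU ("Latte") die size, mm^2
    unknown    = 0.30 * total_die             # ~46.9 mm^2 of unidentified logic
    edram      = 38.68                        # eDRAM block, mm^2
    gpu_logic  = total_die - unknown - edram  # ~70.7 mm^2 left for the GPU core

    hd6450_die = 67.0                         # Radeon HD 6450 (Caicos, 160 SPs, TSMC 40nm), mm^2
    print(gpu_logic, hd6450_die <= gpu_logic) # the HD 6450 core would fit in the remaining area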



RazorDragon said:
ninjablade said:
Cheebee said:
betacon said:
Cheebee said:

What is it with all the WiiU specs threads and PS3 fans coming in and flooding the threads with posts downplaying WiiU? Can't they go play their PS3s or post in some God of War or Last of Us thread or something..? :s
Everybody and their grandma knows by now that you guys don't like WiiU and will probably never get one. We don't care. Get over it. -Nobody- is saying WiiU is an insane powerhouse, or that PS4/X720 won't be more powerful. Obviously. Get over yourselves. PS4/X720 will be -severely- underpowered compared to PCs as well.

Gosh, I don't see any PC fans going around flooding PS360 threads with countless posts of how sad those consoles' specs are, or having sigs touting the inferiority of said consoles compared to their obviously superior PCs.

Bit sensitive. This is an internet forum for gaming discussion; I fail to see what's wrong with speculating about the specs of a system a company has refused to show.

 

Nothing wrong with that. But there's a difference between what you're suggesting and what's happening on here. It's overkill.

NeoGAF and Beyond3D are thinking it's 160 SP now, and DF will run an article about whether or not the Wii U is 160 SP.

Actually, during this time I did a little research and found out a few things about the 160SP theory.

 

First, let's assume that the roughly 30% of the GPU die that is still unidentified contains nothing relevant. From my calculations, 30% of 156.21mm^2 (the Wii U's total GPU die size) gives us 46.86mm^2, which is the size of the unknown space in the Wii U's GPU based on the information we have. Subtracting that along with the eDRAM module, which measures 38.68mm^2, we get roughly 70.7mm^2 of space left for the actual GPU logic on the die.

 

Now, let's take a 160SP Radeon GPU for die size comparison. The Radeon HD 6450, which has 160 SPs, a core clock of 750MHz and, ironically, is manufactured on TSMC's 40nm process, has a die size of 67mm^2, which fits inside the ~70.7mm^2 on the Wii U. So, if we completely rule out the unknown space on the GPU, then, yes, we probably have a downclocked 160SP Radeon HD 6450 in the Wii U. Actually, this GPU is quite a match for the Wii U, as its TDP is only 18W at 750MHz, which could easily be reduced by downclocking the GPU. And, based on the research I've done, only Radeon HD 69xx cards have the double-precision extensions, so if the Wii U's GPU is the HD 6450 we could rule out any possibility of increasing SP density in the SIMD blocks by removing the DP extensions. However, there are still ways to increase SP density in SIMD blocks besides removing the DP units, so I can't confirm the GPU being a 160SP part, nor an HD 6450, because there are still multiple factors that could change the SP count at similar die sizes.

Interesting post, so you're leaning towards it being 160 SP as well? According to the post above yours, only an idiot and a fanboy would even be considering it. I think someone should run tests on Trine 2 with an HD 5550 with the same specifications as the Wii U; if it runs much better we can rule out 320 SP for sure.



ninjablade said:
Soundwave said:
Logically I don't see how it could run Black Ops 2 not only on one screen but on the second screen at the same time if it was less powerful than the 360. It doesn't make any sense.

Some quotes about the gamepad affecting performance.

 

I said streaming an image to the gamepad is free; streaming a copy of what is on the TV is free (as confirmed by devs).

Of course rendering a completely separate image is going to take additional resources. Having a HUD in a game technically takes up additional resources, but those things would have already been rendered in one form or another. For example, rendering a map on the gamepad should not take up more resources than rendering it on screen. However, a completely interactive 3D camera would obviously constitute a performance hit because it's rendering another 3D image; it should be similar to the hit that 360 or PS3 games experience in split-screen multiplayer.

I'm sorry if my previous comment was worded in a way that could be misconstrued, but I was referring to the act of streaming an image being free (not rendering an interactive 3D camera or an animated overlay). There is apparently a dedicated memory pool on the chip for the gamepad's framebuffer, so it is literally free.

 

The people on NeoGAF who did the GPU teardown have been postulating that there is hardware on the chip whose sole purpose is to render and push out images to the gamepad. If this is true, then rendering the gamepad image does not take up any more resources than it would take to render the same image on the TV screen. The entire gamepad framebuffer is free even though it's pushing out a 480p image. No resources would be freed up if you turned the gamepad off, because the hardware that pushes out the image would simply go unused.


Essentially there is dedicated hardware that handles the framebuffer separately, meaning that putting an image on the gamepad does not take up additional resources, unless you are pushing something that would not normally be rendered to the screen (like a second camera).


Except Black Ops 2 is not mirroring the TV image; it's rendering the game on two different displays so two people can play at once, pretty seamlessly at that. I've played 2-player this way and the performance is fine.



Soundwave said:
ninjablade said:
Soundwave said:
Logically I don't see how it could run Black Ops 2 not only on one screen but on the second screen at the same time if it was less powerful than the 360. It doesn't make any sense.

Some quotes about the gamepad affecting performance.

 

I said streaming an image to the gamepad is free; streaming a copy of what is on the TV is free (as confirmed by devs).

Of course rendering a completely separate image is going to take additional resources. Having a HUD in a game technically takes up additional resources, but those things would have already been rendered in one form or another. For example, rendering a map on the gamepad should not take up more resources than rendering it on screen. However, a completely interactive 3D camera would obviously constitute a performance hit because it's rendering another 3D image; it should be similar to the hit that 360 or PS3 games experience in split-screen multiplayer.

I'm sorry if my previous comment was worded in a way that could be misconstrued, but I was referring to the act of streaming an image being free (not rendering an interactive 3D camera or an animated overlay). There is apparently a dedicated memory pool on the chip for the gamepad's framebuffer, so it is literally free.

 

The people on NeoGAF who did the GPU teardown have been postulating that there is hardware on the chip whose sole purpose is to render and push out images to the gamepad. If this is true, then rendering the gamepad image does not take up any more resources than it would take to render the same image on the TV screen. The entire gamepad framebuffer is free even though it's pushing out a 480p image. No resources would be freed up if you turned the gamepad off, because the hardware that pushes out the image would simply go unused.


Essentially there is dedicated hardware that handles the framebuffer separately, meaning that putting an image on the gamepad does not take up additional resources, unless you are pushing something that would not normally be rendered to the screen (like a second camera).


Except Black Ops 2 is not mirroring the TV image; it's rendering the game on two different displays so two people can play at once, pretty seamlessly at that. I've played 2-player this way and the performance is fine.


Two people in the campaign or in multiplayer? In multiplayer it should be possible without taking a hit, since the CPU is basically freed up.



ninjablade said:

I think someone should run tests on Trine 2 with an HD 5550 with the same specifications as the Wii U; if it runs much better we can rule out 320 SP for sure.

Good luck finding "an HD 5550 with the same specifications as the Wii U." How do you plan on finding an off-the-shelf GPU with the same specifications as a highly customized and, to date, not fully understood GPU?



ninjablade said:
RazorDragon said:
ninjablade said:
Cheebee said:
betacon said:
Cheebee said:

What is it with all the WiiU specs threads and PS3 fans coming in and flooding the threads with posts downplaying WiiU? Can't they go play their PS3s or post in some God of War or Last of Us thread or something..? :s
Everybody and their grandma knows by now that you guys don't like WiiU and will probably never get one. We don't care. Get over it. -Nobody- is saying WiiU is an insane powerhouse, or that PS4/X720 won't be more powerful. Obviously. Get over yourselves. PS4/X720 will be -severely- underpowered compared to PCs as well.

Gosh, I don't see any PC fans going around flooding PS360 threads with countless posts of how sad those consoles' specs are, or having sigs touting the inferiority of said consoles compared to their obviously superior PCs.

Bit sensitive. This is an internet forum for gaming discussion; I fail to see what's wrong with speculating about the specs of a system a company has refused to show.

 

Nothing wrong with that. But there's a difference between what you're suggesting and what's happening on here. It's overkill.

NeoGAF and Beyond3D are thinking it's 160 SP now, and DF will run an article about whether or not the Wii U is 160 SP.

Actually, during this time I did a little research and found out a few things about the 160SP theory.

 

First, let's assume that the roughly 30% of the GPU die that is still unidentified contains nothing relevant. From my calculations, 30% of 156.21mm^2 (the Wii U's total GPU die size) gives us 46.86mm^2, which is the size of the unknown space in the Wii U's GPU based on the information we have. Subtracting that along with the eDRAM module, which measures 38.68mm^2, we get roughly 70.7mm^2 of space left for the actual GPU logic on the die.

 

Now, let's take a 160SP Radeon GPU for die size comparison. The Radeon HD 6450, which has 160 SPs, a core clock of 750MHz and, ironically, is manufactured on TSMC's 40nm process, has a die size of 67mm^2, which fits inside the ~70.7mm^2 on the Wii U. So, if we completely rule out the unknown space on the GPU, then, yes, we probably have a downclocked 160SP Radeon HD 6450 in the Wii U. Actually, this GPU is quite a match for the Wii U, as its TDP is only 18W at 750MHz, which could easily be reduced by downclocking the GPU. And, based on the research I've done, only Radeon HD 69xx cards have the double-precision extensions, so if the Wii U's GPU is the HD 6450 we could rule out any possibility of increasing SP density in the SIMD blocks by removing the DP extensions. However, there are still ways to increase SP density in SIMD blocks besides removing the DP units, so I can't confirm the GPU being a 160SP part, nor an HD 6450, because there are still multiple factors that could change the SP count at similar die sizes.

Interesting post, so you're leaning towards it being 160 SP as well? According to the post above yours, only an idiot and a fanboy would even be considering it. I think someone should run tests on Trine 2 with an AMD HD 5550 with the same specifications as the Wii U; if it runs much better we can rule out 320 SP for sure.

Maybe. I mean, like I said, ignoring the 30% that's still unknown inside the GPU, then, yes, it would probably be a 160SP GPU and probably based on a Caicos core, the same as in the HD 6450, as it pretty much matches everything that has been said about the Wii U's GPU. However, these guys were wrong before in believing the die could hold 480SPs, so while 160SPs may seem possible for now based on the information available from them, it's not certain, as not every part of the GPU has been analyzed yet. If there were die shots of the HD 6450 to compare against, it would be easier to see whether anything in it matches Latte, as this card is, I believe, the only one available from AMD with 160SPs.