NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs


They kinda did; they're managing multiple resources with the 3DS/Wii U right down to design.

That's why when you see people actually pushing the device to take advantage of EVERYTHING it does, like VD Dev is with the 3DS, it tends to look leagues better than everything else on the machine.



Hynad said:
fatslob-:O said:
Hynad said:
fatslob-:O said:

It may be just your PC then. PS2 emulation has ALWAYS been more CPU limited, and that's due to the fact that the PS2 was pretty reliant on its Emotion Engine and vector units. Increases in CPU performance these past several years have been incremental at best, whereas GPUs have just been exploding in performance gains. What makes Wii emulation easier is that its graphics chip is pretty similar to a PC's, compared to the PS2's special in-house Graphics Synthesizer chip. 

What's your GPU anyway? 

Come on. I run PS2 games in 1080p or 720p using a Q9300. Games run smoothly, and my GPU is a 5770. 

How does your computer do with Wii games? 

Frame rate is bad in most games. Super Mario Galaxy runs very slowly, so does Last Story, and Xenoblade is a mixed bag.

https://www.youtube.com/watch?v=cmeWmtH96js

https://www.youtube.com/watch?v=zx4rHGENSSM

https://www.youtube.com/watch?v=DsU3nm7t2dk

These videos, from PCs with similar specs to yours, say very differently. Are you sure you aren't being dishonest about your results? For Shadow of the Colossus you would need an i5-2500K overclocked to 4.3 GHz and a GTX 660 with SPEED HACKS to get it running at 1080p. 



globalisateur said:
Scoobes said:
What customizations though? The APUs are based almost entirely on parts normally found in PC architecture. Based on what's been revealed, I only know of 2 customizations on the APUs themselves: on PS4, the increase in ACEs and the number of compute queues each ACE can handle, whilst the X1 has the eSRAM. On both you have a few minor additions (mainly for audio, which would normally be handled on the motherboard anyway).

They're as PC-like as you can get.

I completely disagree.

How can you arrive at this conclusion? Why do you ignore the most important stuff? There are more than those 2 customizations. The PS4 APU (and the whole console) is highly customized by Sony:

- Volatile bit in the caches, for better simultaneous compute/graphics use.

- Onion bus, which can completely bypass the GPU caches.

- Unified GDDR5 memory, which is a first for a CPU (though obviously not for a GPU).

- Low-power mode which can power just the RAM and nothing else, not even the CPU (nor the GPU, obviously). (Not sure here; can recent PCs do that?)

 

All those elements, to my knowledge, had never been done on an APU before. Even the PS4 motherboard's gloriously sleek design is very far from a PC motherboard, as seen in the PS4 teardown. Not even talking about the small size of the machine, nor its ingenious cooling: an oblique shape to ensure air release, circular "slits" which ensure air intake even if the console is blocked by stuff, passive cooling of the GDDR5 chips on the EM shield (the ones on the back of the motherboard!! So usual on PC!!), and advanced active cooling by a state-of-the-art Sony centrifugal fan. Finally, the PS4 APU is the biggest ever designed.

It is as customized as you can get (nowadays), actually.

In my opinion.

All of those things are used for hUMA. Make no mistake, these things will appear in AMD's Kaveri, except for the GDDR5 of course, but we still have that on our dedicated graphics cards. 



curl-6 said:
fatslob-:O said:

Well, we all know that the Wii U can probably support pixel shaders too, so what other excuse is left? 

Wii U supports pixel shaders, that is a given, going by the effects we're seeing it pull off. But that doesn't mean it is an off-the-shelf GPU, as Nintendo tends not to use these. If it was an off-the-shelf part it would have been identified immediately.

It'd be foolish to think that Nintendo would have customized the GPU by themselves when they have even less experience in designing hardware than Sony or AMD. That's why Nintendo looks to IBM and AMD to do the job for them. BTW, the Latte likely comes from an existing AMD GPU architecture. Think outside the box, curl. (Nintendo couldn't possibly have had the experience to change the architecture by itself.) 



fatslob-:O said:
Hynad said:

Frame rate is bad in most games. Super Mario Galaxy runs very slowly, so does Last Story, and Xenoblade is a mixed bag.

https://www.youtube.com/watch?v=cmeWmtH96js

https://www.youtube.com/watch?v=zx4rHGENSSM

https://www.youtube.com/watch?v=DsU3nm7t2dk

These videos, from PCs with similar specs to yours, say very differently. Are you sure you aren't being dishonest about your results? For Shadow of the Colossus you would need an i5-2500K overclocked to 4.3 GHz and a GTX 660 with SPEED HACKS to get it running at 1080p. 

I assure you I play Final Fantasy XII, both Kingdom Hearts games, and Dragon Quest VIII on my PC, and they run surprisingly well in 1080p, except for Dragon Quest VIII, the most demanding of them, which runs in 720p. I don't have SotC, so I can't say how it runs on my PC.

I don't know what else to say. xD Maybe I could try to make a video of them running, but I've never tried to do that, so I may need some time. =P

One thing you need to know about Dolphin: it runs better in DirectX 9 mode.



  • PSN: Hynad
  • NN: 3519-6016-4122
  • XBL: Hynad
  • Steam: Hynad81
Hynad said:
fatslob-:O said:
Hynad said:

Frame rate is bad in most games. Super Mario Galaxy runs very slowly, so does Last Story, and Xenoblade is a mixed bag.

https://www.youtube.com/watch?v=cmeWmtH96js

https://www.youtube.com/watch?v=zx4rHGENSSM

https://www.youtube.com/watch?v=DsU3nm7t2dk

These videos, from PCs with similar specs to yours, say very differently. Are you sure you aren't being dishonest about your results? For Shadow of the Colossus you would need an i5-2500K overclocked to 4.3 GHz and a GTX 660 with SPEED HACKS to get it running at 1080p. 

I assure you I play Final Fantasy XII, both Kingdom Hearts games, and Dragon Quest VIII on my PC, and they run surprisingly well in 1080p, except for Dragon Quest VIII, the most demanding of them, which runs in 720p. I don't have SotC, so I can't say how it runs on my PC.

I don't know what else to say. xD Maybe I could try to make a video of them running, but I've never tried to do that, so I may need some time. =P

One thing you need to know about Dolphin: it runs better in DirectX 9 mode.

@Bold That part is true; a few games will miss some graphical effects when rendering in DX9 mode, but that's done for the speed gains.

Were you running the Wii games in DX11 mode? BTW, those games don't look very demanding to me, and PCSX2 can run them pretty well, but games like MGS3 and Zone of the Enders will teach you a lesson not to have anything lower than an overclocked i5-2500K.  

When I say that PS2 emulation is CPU limited, I mean that it's really CPU LIMITED. CPU power is so important for PS2 emulation because the CPU essentially does most of the work: it has to emulate a lot of components, such as the EE and its VUs, which are very complex.

Despite the fact that the PS2 has a multi-core architecture, the components needed very tight communication, so a lower number of threads was preferable, because each component had to pass information to the others very quickly. That's why PCSX2 initially used 1 core, but it very quickly transitioned to 2 cores once the developers figured out that the GS could be isolated and in turn made very threadable for more performance gains. The reason PCSX2 can now use 3 cores is that they have figured out a way to isolate VU1 as well.

Hence why it is easier to emulate the Wii rather than the PS2: gains in CPU performance haven't skyrocketed like GPU gains have, and the Wii, like the GC, depends on its GPU to do most of its graphics tasks. 
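The GS split described above boils down to a classic producer/consumer setup: one thread queues up draw commands, another drains them independently. A toy sketch of the idea (purely illustrative, not actual PCSX2 code; every name here is made up):

```python
import queue
import threading

class ToyGS:
    """Stands in for the Graphics Synthesizer: just counts processed commands."""
    def __init__(self):
        self.processed = 0

    def run(self, fifo):
        while True:
            cmd = fifo.get()    # block until the "EE" thread sends work
            if cmd is None:     # sentinel: producer is done, shut down
                break
            self.processed += 1 # pretend to draw the command

def emulate_workload(num_commands=1000):
    """Run the toy GS on its own thread while the main thread produces work."""
    fifo = queue.Queue()
    gs = ToyGS()
    gs_thread = threading.Thread(target=gs.run, args=(fifo,))
    gs_thread.start()

    # The "EE" side: queue draw commands without waiting for the GS.
    for i in range(num_commands):
        fifo.put(("draw", i))
    fifo.put(None)              # signal end of workload

    gs_thread.join()
    return gs.processed

print(emulate_workload())       # prints 1000
```

This only works because the GS mostly consumes a one-way command stream; a component like VU0, which talks back and forth with the EE every few cycles, would stall constantly behind such a queue, which is the tight-communication problem described above.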



fatslob-:O said:
curl-6 said:
fatslob-:O said:

Well, we all know that the Wii U can probably support pixel shaders too, so what other excuse is left? 

Wii U supports pixel shaders, that is a given, going by the effects we're seeing it pull off. But that doesn't mean it is an off-the-shelf GPU, as Nintendo tends not to use these. If it was an off-the-shelf part it would have been identified immediately.

It'd be foolish to think that Nintendo would have customized the GPU by themselves when they have even less experience in designing hardware than Sony or AMD. That's why Nintendo looks to IBM and AMD to do the job for them. BTW, the Latte likely comes from an existing AMD GPU architecture. Think outside the box, curl. (Nintendo couldn't possibly have had the experience to change the architecture by itself.) 

Of course Nintendo doesn't customize Latte themselves; they probably just tell AMD "we want X figures for power consumption, heat, processing power, etc., make us a chip that does this and we'll buy it from you".




fatslob-:O said:
curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Well, we all know that the Wii U can probably support pixel shaders too, so what other excuse is left? 

Wii U supports pixel shaders, that is a given, going by the effects we're seeing it pull off. But that doesn't mean it is an off-the-shelf GPU, as Nintendo tends not to use these. If it was an off-the-shelf part it would have been identified immediately.

It'd be foolish to think that Nintendo would have customized the GPU by themselves when they have even less experience in designing hardware than Sony or AMD. That's why Nintendo looks to IBM and AMD to do the job for them. BTW, the Latte likely comes from an existing AMD GPU architecture. Think outside the box, curl. (Nintendo couldn't possibly have had the experience to change the architecture by itself.) 

Of course Nintendo doesn't customize Latte themselves; they simply tell AMD "we want X figures for power consumption, heat, processing power, etc., make us a chip that does this and we'll buy it from you."

I also don't think AMD would waste hundreds of millions of dollars building a new GPU architecture specifically for the Wii U, because after all, AMD is in the red LMAO. The only addition AMD made to their VLIW5 architecture was the eDRAM, and that's not that complicated. Just like how Intel was able to get 128 MB into their Iris Pro chips, and we all know Intel has a history of designing extremely craptastic graphics chips. 

Making a modified chip would not cost hundreds of millions of dollars. And they were happy to produce unique chips for the Wii/GameCube.