It does not even matter whether the GPU is DirectX capable. Why? Here is your answer:
"So what exactly is wrong with D3D for GPU computing? It's the appalling quality of the shader compiler. It crashes, or it takes tens of minutes to return, or it generates bogus code.
Worse still, Microsoft has adopted a you can't handle the truth attitude towards developers concerning hardware features. Direct3D does not report warp size to the developer, and HLSL does not support volatile modifiers for groupshared memory. This is not an accidental omission; it is a design choice to insulate programmers from platform differences. While this may make sense for graphics (and I think that in 2011, that position needs revisiting), it is a pointless attitude for GPU computing. This is not an example of failing to coax a tool do something it wasn't designed to do: DirectCompute's entire purpose is to enable high-performance GPU computing inside Direct3D, and at that it is a failure. Microsoft has not bothered fixing many disastrous compiler bugs; the most recent DX SDK (June 2010) is more than a year old, and I recall the compiler not even being updated since the release of the preceding February".
It is a little old, but it tackles everything you need to know about GPGPU coding.
http://www.moderngpu.com/intro/intro.html
There are other important points, like:
"To reduce control hardware to a minimum while hitting near 100% execution efficiency, GPUs employ two design features: SIMD execution and latency hiding. These are a disruptive technology in the history of computing. Textbook algorithms that would compile on any platform from the past three decades will not run, or will run very inefficiently, on GPU hardware. Hardware vendors promote the performance of GPUs, which is in fact astonishing, but soft peddle the architectural differences, so as not to scare off IT managers. The truth is that GPUs are a different beast. You can toss out your algorithms books; they are of no use here. By keeping SIMD execution and latency hiding in mind, however, you will discover GPGPU idioms that allow the development of traditional CS functions (such as sorts, covered here in depth), using very different algorithms and workflow."
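To make the SIMD point above concrete: all lanes in a warp execute in lockstep, so a data-dependent branch forces the whole warp to step through both paths, masking off inactive lanes. This is why branchy textbook algorithms run inefficiently. Here is a minimal Python sketch of that cost model (the 32-lane warp width and the cycle counts are illustrative assumptions, not real hardware figures):

```python
# Toy model of SIMD branch divergence (illustrative, not a real GPU API):
# a "warp" of lanes runs in lockstep, so a data-dependent if/else makes
# every lane step through BOTH paths, with inactive lanes masked off.

WARP_SIZE = 32  # common warp width; real hardware varies (part of the warp-size complaint above)

def simd_branch_cost(values, then_cost, else_cost):
    """Cycles one warp spends on an if/else whose condition is value > 0."""
    mask = [v > 0 for v in values]
    cost = 0
    if any(mask):             # some lane takes the 'then' path:
        cost += then_cost     # the whole warp steps through it
    if not all(mask):         # some lane takes the 'else' path:
        cost += else_cost     # the whole warp steps through that too
    return cost

# Uniform data: every lane agrees, so only one path executes.
uniform = simd_branch_cost([1] * WARP_SIZE, then_cost=10, else_cost=10)    # 10

# Divergent data: lanes disagree, so the warp pays for both paths.
divergent = simd_branch_cost([i % 2 for i in range(WARP_SIZE)], 10, 10)    # 20
```

On a CPU each element would pay for only the path it takes; under SIMD lockstep the divergent warp pays for both, which is exactly why algorithms have to be restructured rather than ported as-is.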
You read that, right? "You can toss out your algorithms books; they are of no use here." That implies that last-gen PC and console architectures are not useful for the Wii U, nor for the next Xbox and PlayStation. My point is that there is no need to say the Wii U is not in the same generation as the next Xbox or PlayStation. That is my whole point: there is no need to hate any system, just enjoy the system of your preference.