oni-link said:
|
I'm not sure of the technicalities, but I do think you're right. However, I also think making sure they have support on all major engines is money well spent.
''In other news, developers will be limited by the Wii U's very slow main memory and underpowered CPU''
You see, anyone can make a statement by only talking about specific details.
CGI-Quality said: So, I wonder, what PC GPU can it stand toe-to-toe with (for some perspective)? |
I read that it is a heavily modified version of the Evergreen series. Given how close the specs are (from the die shots) to that line of GPUs, it's the closest one (without the special sauce) to compare the GPU to. The Wii U is even made in the same plant as Evergreen!
for reference:
oni-link said:
for reference: |
Note that the Wii U's GPU only has half the bandwidth, though.
AnthonyW86 said: ''In other news, developers will be limited by the Wii U's very slow main memory and underpowered CPU'' You see, anyone can make a statement by only talking about specific details. |
Please read the OP before responding with useless banter. Shin'en, like other developers, have shown that the CPU, despite its disadvantages, has its own distinct advantages, i.e. a large cache etc. Even if it's three Wii CPUs boosted 50% working together, that's quite a leap compared to its predecessor. Not to mention it has 3.5x more embedded RAM than the X360 (35MB for the Wii U vs. 10MB for the X360) for GPGPU purposes. We don't know enough about the DDR3 to assume how slow the memory is... do we? It could be dual-channel etc. for all we know.
superchunk said:
I'm not sure of the technicalities, but I do think you're right. However, I also think making sure they have support on all major engines is money well spent. |
Thanks to the Wii U's GPU, Unity has the system running DX10.1-equivalent support while the X360 is stuck at DX9. So even with limited support from UE4 (we do have full support for UE3), Unity and CryEngine 3 should be quite enough for 3rd parties, don't you think?
AnthonyW86 said:
Note that the Wii U's GPU only has half the bandwidth, though. |
I didn't know Nintendo had officially released the GPU's bandwidth?
Soleron said: If we're defining generation by time, it's current gen |
You do realize he is talking about the GPU, and those are a yearly thing, unlike console generations.
AnthonyW86 said:
Note that the Wii U's GPU only has half the bandwidth, though. |
The 5550 actually comes in three flavors when it comes to memory: GDDR5 (57.6GB/s), DDR3 (28.8GB/s) and DDR2 (12.8GB/s). Aggregate VP ratings for these cards are 27, 21.8 and 16.5, so they are quite sensitive to memory bandwidth.
The Wii U, although it has DDR3, has the bandwidth of the DDR2 version, but I'm thinking that the eDRAM might put it somewhere between the DDR2 and DDR3 versions (theoretically, I guess, maybe even on par with the DDR3 version).
For comparison, the XOne's GPU equivalent (7770) has a VP rating of 96, and the PS4's (7850) stands at 141.
This is all without the customizations each platform holder incorporated into their solution, so the differences are most likely even bigger.
For reference, the 360 is rated at around 12.
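Those peak figures fall out of a simple formula: bus width in bytes times effective transfer rate. A quick sketch below shows the arithmetic; the 128-bit bus for the 5550 variants and the Wii U's 64-bit DDR3-1600 interface are assumptions based on commonly reported specs, not official Nintendo numbers.

```python
def peak_bandwidth_gbs(bus_width_bits, mtps):
    """Theoretical peak bandwidth in GB/s:
    (bus width in bytes) * (million transfers per second) / 1000."""
    return bus_width_bits / 8 * mtps / 1000

# Radeon HD 5550 variants, all on an (assumed) 128-bit bus:
print(peak_bandwidth_gbs(128, 3600))  # GDDR5 @ 3.6 GT/s -> 57.6
print(peak_bandwidth_gbs(128, 1800))  # DDR3  @ 1.8 GT/s -> 28.8
print(peak_bandwidth_gbs(128, 800))   # DDR2  @ 800 MT/s -> 12.8

# Wii U main memory (assumed 64-bit DDR3-1600):
print(peak_bandwidth_gbs(64, 1600))   # -> 12.8, same as the DDR2 card
```

So halving the bus width cancels out the faster memory type, which is why a DDR3-equipped Wii U would still land at the DDR2 card's figure.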