snowdog said: Games such as Trine 2, Most Wanted, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU say you're wrong. There have been plenty of games at 720p native, 60fps with v-sync enabled, the latter of which is the most important. It lies somewhere between the PS3 and Xbox One in terms of power, and if Nintendo have evolved the TEV unit the way I think they have, it explains several things: 1) Latte's ALUs being twice the size they should be 2) The low power draw 3) The performance of Latte with games at 720p native, 60fps with v-sync enabled 4) Negative comments from devs who haven't actually used the thing and have just judged it by looking at the spec sheet. |
It would be a big stretch to assume that the Wii U uses TEV.
That is only speculation at this point.
Conina said:
Trine 2 on Wii U is 30 fps, not 60fps: http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off
The PS4 version runs in 1080p 60 fps (2D) or 720p 60 fps (stereoscopic 3D): http://www.eurogamer.net/articles/digitalfoundry-vs-trine-2-on-ps4
That's already quite a steep performance gap:
- Wii U: 1280 x 720 x 30 = 27.6 MPixel per second
- PS4 2D: 1920 x 1080 x 60 = 124 MPixel per second (4.5x of the Wii U)
- PS4 3D: 1280 x 720 x 60 x 2 = 110 MPixel per second (4x of the Wii U)
But it gets even better:
"We have to go by the 3D limitations at the moment - so the regular resolution is 1080p and 60fps, but in 3D the resolution gets dropped to 720p60. The game actually still runs at 1080p internally. So in the future the game may even automatically be able to run and output at 1080p60."
- PS4 3D (native rendering): 1920 x 1080 x 60 x 2 = 249 MPixel per second (9x of the Wii U)
"One benefit of the current 720p situation is that you get improved anti-aliasing in the form of downsampling the 1080p image to 720p which, combined with FXAA, produces a very clean image that is upscaled very well by our display."
If Sony were to support 4K gaming with a firmware update, Trine 2 could be played at 4K 30 fps... the resource cost is the same as for the current 3D rendering:
- PS4 2D 4K 30 fps: 3840 x 2160 x 30 = 249 MPixel per second (9x of the Wii U)
"I can't think why we technically couldn't support 3840x2160 mode at 30fps (with the stereo rendering quality). Increasing the resolution while rendering less often would end up to the same amount of pixels being rendered"
|
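The throughput comparison in the quote above is just raw pixel arithmetic: width x height x frames per second, doubled for stereoscopic 3D. A minimal sketch that reproduces the quoted figures (note this metric ignores everything else that costs GPU time, such as shading, post-processing, and anti-aliasing):

```python
def mpixels_per_sec(width, height, fps, eyes=1):
    """Raw pixels rendered per second, in millions (eyes=2 for stereo 3D)."""
    return width * height * fps * eyes / 1e6

wii_u         = mpixels_per_sec(1280, 720, 30)            # Trine 2 on Wii U
ps4_2d        = mpixels_per_sec(1920, 1080, 60)           # PS4 at 1080p60
ps4_3d        = mpixels_per_sec(1280, 720, 60, eyes=2)    # PS4 stereo output
ps4_3d_native = mpixels_per_sec(1920, 1080, 60, eyes=2)   # internal 1080p render
ps4_4k        = mpixels_per_sec(3840, 2160, 30)           # hypothetical 4K30 mode

print(f"Wii U:      {wii_u:6.1f} MP/s")
print(f"PS4 2D:     {ps4_2d:6.1f} MP/s ({ps4_2d / wii_u:.1f}x)")
print(f"PS4 3D:     {ps4_3d:6.1f} MP/s ({ps4_3d / wii_u:.1f}x)")
print(f"PS4 3D int: {ps4_3d_native:6.1f} MP/s ({ps4_3d_native / wii_u:.1f}x)")
print(f"PS4 4K30:   {ps4_4k:6.1f} MP/s ({ps4_4k / wii_u:.1f}x)")
```

Note how 4K at 30 fps and stereo 1080p at 60 fps come out to exactly the same pixel count, which is the basis of the developer's "same amount of pixels" remark.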
There is more to a game than resolution and framerate: plenty of other work can lower performance while improving the graphics.
However, outside of super high-end PC gaming, 4K isn't that important or even commonplace.
With that in mind, if the Xbox 360 and PlayStation 3 had also had an interconnect that could handle 4K output, then they too could *potentially* have run games at 4K, provided the games were simple enough.
Running at 4K resolutions is stupidly hardware-intensive. The sad fact is you won't be getting a game like Crysis running at 4K on any of these consoles; they don't have the RAM, bandwidth, or compute resources to achieve it. Only the PC does.
hated_individual said: Naughty Dog discussed the PlayStation 4's architecture and talked about latency... The PlayStation 4 has GDDR5 as its main and only RAM. They said it would take 200 cycles to go from the CPU's L2 cache to RAM and vice versa, and the same for GPU to RAM, so a full lap from CPU to GPU and back takes 800 cycles with HSA/hUMA, while the Xbox One, which has no HSA/hUMA, would take 1200 cycles if it used the same type of RAM. The Wii U, on the other hand, would do it in 600 cycles or less if it used the same main RAM as the PS4, since the CPU and GPU have direct access to each other thanks to the eDRAM in the Wii U's GPU, which can send processed data directly to the CPU if needed. That is one of the significant edges the Wii U's architecture has over the competition, and that edge can be considered a superior form of HSA, something AMD might use in the future if viable. |
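Taking the quoted post's numbers purely at face value (these are the poster's claims, not verified hardware figures), the round-trip arithmetic works out like this:

```python
# All cycle counts below are the forum poster's claims, not measured specs.
L2_TO_RAM = 200  # claimed one-way latency, CPU L2 cache <-> GDDR5, in cycles

# PS4 (claimed): CPU -> RAM -> GPU -> RAM -> CPU = 4 hops with HSA/hUMA
ps4_round_trip = 4 * L2_TO_RAM

# Xbox One (claimed): two extra hops without HSA/hUMA coherency
xbox_round_trip = 6 * L2_TO_RAM

# Wii U (claimed): eDRAM gives CPU and GPU direct access, cutting the lap
wii_u_round_trip = 3 * L2_TO_RAM

print(ps4_round_trip, xbox_round_trip, wii_u_round_trip)  # 800 1200 600
```

The hop counts here are back-derived from the poster's totals (800, 1200, 600); the post itself never explains exactly where the extra Xbox One cycles come from.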
Latency isn't really a problem for GPUs; they're not impacted by it all that much, as it's bandwidth they care about. CPUs are the reverse, depending on the architecture of course.
Intel and AMD spend tens of millions of transistors (maybe even hundreds of millions) on their CPUs to hide latency, hence it's not such a big deal for the PlayStation 4 and Xbox One.
The Wii U doesn't have that luxury, as it uses PowerPC, and I'm actually not entirely sure what IBM has done to hide latencies.
As for hUMA, to my knowledge the Xbox One doesn't actually support it the way the PlayStation 4 does; however, it does have a comparable implementation.
Keep in mind that hUMA essentially allows the CPU and GPU to view, alter, and share the same data sets.
hUMA/HSA is essentially a step towards something greater, which is going to require that technology at its foundation to even work.
The Wii U also has a comparable implementation to hUMA via its eDRAM, where both the GPU and CPU have coherency. I'm not entirely sure whether they share the same privilege in system memory, however, so I won't bother speculating about that.
In the end, latency and bandwidth are only two problems among many when it comes to processor and graphics performance; they shouldn't be construed as the single most important factors, which seems to be a stupidly common mistake.
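What a "consistent view of data" buys you can be shown with a toy Python analogy (this has nothing to do with real CPU/GPU caches; it just contrasts reading through shared memory with reading a stale private copy):

```python
# Toy analogy for cache coherency: two "processors" sharing one data set.
shared_memory = {"frame_flag": 0}

# Non-coherent path: the "GPU" snapshots the value into a private cache.
gpu_cache = dict(shared_memory)

# The "CPU" writes new data into shared memory.
shared_memory["frame_flag"] = 1

stale = gpu_cache["frame_flag"]         # still 0: the old cached value
coherent = shared_memory["frame_flag"]  # 1: reads through shared memory see the write

print(stale, coherent)  # 0 1
```

A coherent system (hUMA, or the Wii U's eDRAM arrangement as described above) guarantees the second behaviour without software having to invalidate or re-copy anything.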
"CPU and GPU will always see a consistent view of data in memory. If one processor makes a change then the other processor will see that changed data, even if the old value was being cached" – the Xbox One will feature something similar to hUMA.
Read more at http://gamingbolt.com/xbox-one-to-have-a-similar-solution-to-huma-xbox-one-dev#yR3o4WwxAzyYOAQ9.99