Intrinsic said:

bonzobanana said:

 Regarding the locked CPU speed: it's pretty clear they will leave some headroom in the portable-mode CPU requirements to allow for an increase in CPU load when docked, whether that means portable is 720p/30fps and docked is 1080p/30fps, 720p/60fps or even 1080p/60fps. I'm sure you know that a game engine producing 60fps rather than 30fps does not need 2x the power. The engine is generating the data constantly, and how the frame rate is delivered from that data is not a huge difference. The PS2, as weak as it was, generated 60 frames per second for most games, but it only output half the interlaced lines of one frame and then the alternate scanlines of the next internally generated frame. 720p to 1080p needs a big increase in GPU performance, but 30 to 60fps does not in the same way; yes, it still needs a greater performance level, but not of the same magnitude. The GPU may even have functionality to create interpolated frames.

 

No, this is wrong. I will try to explain this in the most general, basic way possible.

A game engine is split into CPU and GPU tasks. Let's call them X and Y respectively. Now let's take Zelda. To run Zelda at 720p/30fps in tablet mode, X and Y must each complete all relevant tasks within 33ms per frame.

You need to increase CPU power by at least 30-50% if you plan on doing the core CPU tasks that are tied to a game's framerate in half the time. You need to literally double GPU power to do the same rendering tasks in half the time at the same resolution.

So basically, if they were to try taking Zelda from a 720p/30fps game to a 720p/60fps game, they would need at least 40% more CPU and twice the GPU.
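
As a rough illustration of where figures like "40% more CPU, 2x the GPU" can come from, here is a back-of-the-envelope sketch in Python. The 40% "frame-tied CPU work" split is purely an assumption for illustration, not a figure from this thread or from any real engine.

```python
# Back-of-the-envelope sketch of the 30fps -> 60fps frame-budget argument.
# All numbers below are illustrative assumptions, not measured figures.

def frame_budget_ms(fps: float) -> float:
    """Time available to finish one frame's work, in milliseconds."""
    return 1000.0 / fps

budget_30 = frame_budget_ms(30)   # ~33.3 ms per frame
budget_60 = frame_budget_ms(60)   # ~16.7 ms per frame

# GPU work (actually drawing the frame) is inherently per-frame: twice the
# frames per second means twice the rendering work per second at the same
# resolution, so the GPU budget simply halves.
gpu_speedup_needed = budget_30 / budget_60   # 2.0x

# CPU work is mixed: some of it is tied to the framerate (render submission,
# animation, the per-frame game loop), while some can keep ticking at a fixed
# rate (AI, background simulation). Assume 40% is framerate-tied.
frame_tied_fraction = 0.4   # assumption for illustration only
cpu_speedup_needed = (1 - frame_tied_fraction) + 2 * frame_tied_fraction  # 1.4x

print(f"frame budget: {budget_30:.1f} ms at 30fps vs {budget_60:.1f} ms at 60fps")
print(f"GPU throughput needed: {gpu_speedup_needed:.1f}x (same resolution, twice the frames)")
print(f"CPU throughput needed: {cpu_speedup_needed:.1f}x "
      f"if {frame_tied_fraction:.0%} of CPU work is frame-tied")
```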

Going from 720p/30fps to 1080p/30fps does not require anything extra on the CPU side. It's purely a GPU bump and, as such, is significantly less taxing than upping the framerate.

Look at it this way: take the Switch spec undocked, then take it docked with the clock bump to the GPU... that's exactly the kind of boost you need to render the roughly 2.25x as many pixels it takes to get from 720p up to 1080p.
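
For reference, the pixel arithmetic behind the 720p-to-1080p jump is straightforward; a quick sketch with no assumptions beyond the standard resolutions:

```python
# Pixel counts behind the 720p -> 1080p comparison; this load falls on the GPU.
pixels_720p = 1280 * 720      # 921,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

print(f"720p:  {pixels_720p:,} pixels")
print(f"1080p: {pixels_1080p:,} pixels")
print(f"ratio: {pixels_1080p / pixels_720p:.2f}x")   # 2.25x as many pixels to shade
```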

You have massively overstated the required performance boost, and you are totally at odds with what games actually achieve in general, including a huge range of 60fps games from both Nintendo and other publishers. With exceptionally low CPU resources, the Wii U has delivered many rock-solid 60fps games, yet it has struggled to reach 1080p resolutions. Many games, including shooters like Black Ops 2, have maintained 60fps. I think part of the issue may be that you are taking a PC perspective, where games simply push more frames based on the hardware's performance level, rather than considering a game developed from the ground up to hit a certain frame rate on given hardware, where the engine is fully optimised to work at that frame rate.

However, even if your figures are right and it is as high as 40% (which I don't believe for a minute for a game designed from the ground up for 60fps), there is still headroom. The Wii U already runs games like Mario Kart at 60fps with a rubbish 32-bit CPU of roughly 9,000 MIPS, and the Switch has a far superior 64-bit quad-core processor of roughly 19,000 MIPS, which can probably deliver around 2.5x the performance. That is a huge amount of headroom. As for the PS3 and 360: the 360 was also about 19,000 MIPS in total, but its older-architecture GPU needed much more assistance from the CPU, and again it was an older 32-bit architecture. So again there is headroom to achieve higher frame rates.
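
Taking the MIPS figures quoted in the paragraph above at face value (they are approximate and unverified), the ratio works out like this:

```python
# Ratio arithmetic using the (approximate, unverified) MIPS figures above.
wii_u_mips = 9_000     # figure quoted for the Wii U CPU
switch_mips = 19_000   # figure quoted for the Switch's quad-core CPU

print(f"raw MIPS ratio: {switch_mips / wii_u_mips:.2f}x")   # ~2.11x on these numbers

# The "~2.5x" estimate presumably also credits the newer 64-bit cores'
# per-clock efficiency; raw MIPS says nothing about memory bandwidth,
# caches, or how much of the CPU a game can actually use.
```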

I honestly don't get what your problem is: the Switch has extra CPU performance, a huge amount of extra memory, and a much later GPU architecture with plenty of features that will assist both CPU and GPU performance. So you're saying that when Treyarch did Black Ops II on the 360, they lost getting on for half the CPU performance by going to 60fps? How can you even give a figure of 40% when the CPU performance of different systems varies enormously and is not a constant, with some systems having a higher or lower ratio of CPU power to GPU power? The game engine itself would dictate how much extra processing is required per frame, and that would never be a constant relative to all the other logic processes going on, depending on the game.

And that's not forgetting that the Maxwell architecture is built around optimised VR performance and reducing the load on the system.