
You fail to realize that the PS4 has problems running Assassin's Creed 4 at 1080p60fps; Digital Foundry's analysis and a video were even posted here.

You fail to realize that a weaker system can only perform as well as a more powerful one when the game is built for it from the ground up, not ported.

You fail to realize that a more modern architecture doesn't help at all if the port's code tells the system to do things the way the other hardware does them.

 

Seriously, it's impossible; 176 GFLOPS can't do magic for ports, and PS4 games prove it. The PS4 is roughly 7.5x more powerful than the 360, more modern, and has a tremendous amount of GDDR5 memory, and it still can't do 1080p60fps with games like Assassin's Creed, when about half that power (memory and all) should be enough to do it.
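A quick back-of-the-envelope check of those ratios in Python, using the commonly cited paper specs (these are the usual forum numbers, not official measurements):

# rough paper-spec comparison; all figures are commonly cited estimates
XENOS_GFLOPS = 240.0   # Xbox 360 GPU (Xenos)
PS4_GFLOPS = 1843.0    # PS4 GPU, the 1.8 TFLOPS Sony quotes
WIIU_GFLOPS = 176.0    # the low-end Wii U estimate being argued against

print(f"PS4 vs 360: {PS4_GFLOPS / XENOS_GFLOPS:.1f}x")                   # ~7.7x
print(f"Wii U vs 360 at 176 GFLOPS: {WIIU_GFLOPS / XENOS_GFLOPS:.2f}x")  # ~0.73x, i.e. weaker than a 360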

 

The answer is simple: these are quick ports, and if they affect the PS4, they affect the Wii U too. The Wii U is not a magic box. That's why Wii U games like Assassin's Creed 3 can run on the system despite being ports; yet even with roughly 2x the power, and a CPU that is weaker but more modern, performance still takes a hit. That's why the framerate is worse on the Wii U in most of the ports, except the ones that got a little optimization, like Need for Speed.

 

Read, dude, read. It's not me saying it; it's Digital Foundry and others.

Here, enjoy:

http://www.eurogamer.net/articles/digitalfoundry-ps4-ac4-patch-analysed-in-depth

 

"

Ubisoft recently revealed that the PS4 version of Assassin's Creed 4 would gain an update shortly after the game's release, bumping up the rendering resolution from 900p to 1080p, along with some improved visual effects and a new anti-aliasing solution, with all of this possible via further optimisation work carried out after core development was completed. With the patch now available in the US, are we looking at a noticeable upgrade in visual quality, or a more modest refinement to the original, unpatched presentation?


We kick off with a 1080p head-to-head video showcasing a number of clips from the first hour of the game. On first impressions, the differences between running the game upscaled from 900p and natively in 1080p are actually quite subtle: there's a slight but noticeable boost in sharpness, and images appear a little more crisp, but nothing that immediately grabs you as amounting to anything approaching a sensational upgrade. The original 900p framebuffer is actually upscaled well to 1080p without introducing much in the way of unwanted artefacts, and the anti-aliasing solution in both versions helps to give the game a smooth appearance.

The fact that we aren't seeing any hiccups in smoothness is also rather interesting: it makes you wonder, if Ubisoft are hitting 30fps on such a consistent basis, just how fast would the game actually run on the PS4 hardware if it were not for the frame-rate cap? Sadly, this is something we're unlikely ever to find out, with the upcoming PC version looking like the only one capable of delivering a 60fps update while running at high definition resolutions.

"

Want more?

Here:

https://www.youtube.com/watch?v=l2Uza8TUbQU

 

So, is Sony lying about the 1.8 teraflops?

Or is the port to blame?

 

If that's the case, how is the Wii U supposed to do magic with just 176 GFLOPS when the 360 and PS3 are more powerful?

 

If that's the case, why can I fit enough shaders on the Wii U GPU's die (with the eDRAM and other embedded blocks already accounted for) for 500 to 600 GFLOPS?

The Wii U GPU is about 96 mm² and Redwood XT is about 104 mm², yet the Wii U GPU was measured from an exact Chipworks die photo and Redwood wasn't, which suggests Redwood could be more like 95 mm².
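Here's a rough sanity check of that die-area argument, taking the figures above at face value (the 96 mm² logic-only area and Redwood's 104 mm² are estimates, not confirmed measurements):

# die-area sanity check; figures quoted above, treat them as estimates
REDWOOD_MM2 = 104.0    # Redwood XT (HD 5670) die, commonly listed figure
REDWOOD_SPS = 400      # stream processors on Redwood

WIIU_LOGIC_MM2 = 96.0  # Latte area with eDRAM etc. already excluded (per above)

density = REDWOOD_SPS / REDWOOD_MM2  # SPs per mm2, including Redwood's own non-shader blocks
print(f"Redwood packs {density:.1f} SPs/mm2")
print(f"At that density, {WIIU_LOGIC_MM2} mm2 could hold ~{density * WIIU_LOGIC_MM2:.0f} SPs")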

 

If a Redwood (HD 5000 series) can hold 400 stream cores, 20 TMUs and 8 render output units, giving about 620 GFLOPS at 775 MHz, why would the Wii U, which could have 400 stream cores, 16 TMUs and 8 ROPs, not have 400 to 500 GFLOPS?

Not to mention that since it has fewer TMUs, that space could be used for more stream cores.
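The arithmetic behind those numbers, assuming the standard formula for AMD's VLIW-era GPUs (stream cores × 2 FLOPs per cycle for a multiply-add × clock) and the Wii U's known 550 MHz GPU clock:

# GFLOPS for AMD's VLIW-era GPUs: cores * 2 FLOPs/cycle (multiply-add) * clock in GHz
def gflops(stream_cores, clock_ghz):
    return stream_cores * 2 * clock_ghz

print(gflops(400, 0.775))  # Redwood XT (HD 5670): 620 GFLOPS
print(gflops(160, 0.550))  # the common 176 GFLOPS figure assumes only 160 cores at 550MHz
print(gflops(400, 0.550))  # 400 cores at the Wii U's 550MHz clock: 440 GFLOPS

So the whole dispute comes down to the stream-core count: 160 cores gives you 176 GFLOPS, 400 gives you 440.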

 

Seriously, both the ports and the GPU's die size, among other things, suggest the contrary.