Kami said:
Pulling them out of my ass? OK lol, I'll play along. Bayonetta 2 does not run at 60 fps, by the way; hitting 60 fps every now and then when nothing is happening does not make it a 60 fps game. If that were the case, Tomb Raider runs at 60 fps on the PS4, which it does not.
Let's compare a Wii U to my phone (G3):
The Wii U GPU runs at 550 MHz, my phone's runs at 578 MHz. (G3 wins)
The Wii U has 320 ALUs, the G3 has 128. (Wii U wins)
The Wii U GPU can handle 4.4 GPix/s, the G3 4.64 GPix/s. (G3 wins)
The Wii U most likely handles 350 GFLOPS, the G3 GPU 166.5. (Wii U wins)
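The arithmetic behind those estimates can be sketched like this. Assumptions (not official figures from either vendor): each ALU does one multiply-add (2 FLOPs) per clock, and both GPUs have 8 ROPs for the pixel fillrate.

```python
# Back-of-the-envelope GPU throughput math. Assumptions (not official
# figures): 2 FLOPs per ALU per clock (one multiply-add), and 8 ROPs
# on each GPU for the pixel fillrate.

def gflops(alus, clock_mhz, flops_per_clock=2):
    """Theoretical single-precision throughput in GFLOPS."""
    return alus * clock_mhz * flops_per_clock / 1000

def fillrate_gpix(rops, clock_mhz):
    """Theoretical pixel fillrate in GPix/s."""
    return rops * clock_mhz / 1000

print(gflops(320, 550))        # 352.0 -> the "most likely 350" Wii U estimate
print(fillrate_gpix(8, 550))   # 4.4   -> the Wii U figure above
print(fillrate_gpix(8, 578))   # 4.624 -> close to the 4.64 quoted for the G3
```

Note that the G3's 166.5 GFLOP figure does not fall out of the same 2-FLOPs-per-ALU assumption (128 x 578 x 2 gives about 148), which is itself a hint that raw ALU counts aren't directly comparable across architectures.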
Before I even continue: do you know how GPU and CPU specs actually add up, or do you just know the internet definition of all of these? If you were to open up the Wii U, would you know what you're looking at and how to analyze it? I'm asking because if you do, then we can actually get somewhere with all of this spec talk. If not, well, I'm more than happy to explain :)
|
I was playing by your rules. If LoL plays at 60 fps 1080p on an HD 4670 (even though it drops to as low as 45 fps according to benchmarks) and averages around 52 fps, then Bayonetta 2 also "runs at 60 fps." I did mention there were drops, though.
Again, you're using non-official Wii U specs. Where did the 4.4 GPix/s come from? The same people who gave the 180-350 GFLOP estimate. But that is beside the point. Your phone is designed to run at a resolution of 1440 x 2560 pixels, so obviously it benefits from a higher pixel fillrate.
I know perfectly well what a GPU and a CPU are. What bothers me is that you are drawing conclusions without any concrete information, and then making claims nobody can make without knowing very specific details about the hardware at hand, such as "550 GFLOPS is enough to run Wii U games at 1080p 60 fps." It certainly isn't enough if the fillrate is only 4.4 GPix/s and the Wii U is limited in RAM and memory bandwidth. Which raises the question: if what you say is true, why don't cards with that theoretical performance, or even more, run comparable PC games at 1080p 60 fps?

You even made amateurish claims like "It is a 4xxx card, certainly it will be similar to 350 GFLOPS," as if there were no 4xxx cards with floating-point performance similar to the PS4's (see: HD 4890 @ 1600 GFLOPS).

So, to answer "Before I even continue do you know how GPU and CPU specs actually add up": I know at the very least that one can't speak of specific performance by looking at a floating-point estimate alone. As for "If you were to open up the Wii U would you know what you're looking at and how to analyze it?": I can identify which processor is which, and that's about it. I am not a computer scientist, nor an electrical engineer who designs processors, so I couldn't gather facts from a picture of a processor the way a qualified individual can, but I do understand the arguments they make, to an extent. Can you do these things you require of me?
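The spread within the 4xxx family can be sketched with the same theoretical-throughput formula (stream processors x clock x 2 FLOPs per clock, using public spec-sheet numbers, not anything measured here):

```python
# Sketch of why "it's a 4xxx card" pins down almost nothing about FLOPS:
# theoretical GFLOPS = stream processors * clock (MHz) * 2 FLOPs / 1000.
# Shader counts and clocks below are public spec-sheet figures.

def gflops(stream_processors, clock_mhz):
    return stream_processors * clock_mhz * 2 / 1000

print(gflops(320, 750))   # 480.0  -> HD 4670 (320 SPs @ 750 MHz)
print(gflops(800, 850))   # 1360.0 -> HD 4890 (800 SPs @ 850 MHz; 1 GHz editions reach 1600)
```

Two cards from the same generation differ by roughly 3x on this metric alone, before fillrate, RAM, and bandwidth even enter the picture.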