torok said:
AI doesn't have any correlation with resolution. You just run an algorithm that knows where the main character is, what he is doing, and where the CPU-controlled character is and what he is doing. Then you decide what he will do next. Just that. There isn't any graphics info involved, just coordinates and other state. It has nothing to do with resolution. Even when you update the characters at the beginning of the rendering pipeline, you are just changing coordinates for 3D models; there are no pixels. You apply lighting to vertices and there are still no pixels. Only when everything is done do you rasterize it, run some fancy pixel shaders, apply AA, and output it. But that's all done on the GPU; the CPU has nothing to do with it. Stop posting about rendering when you clearly don't know how it works.
And we don't even know for sure which console has the better CPU. We've got a benchmark saying it's the PS4.
|
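To illustrate what torok is describing, here's a minimal sketch of a per-frame AI update. The struct and function names are made up for illustration, not from any real engine, but notice that nothing in it touches pixels, framebuffers, or the output resolution, only world-space coordinates and state:

```cpp
// Hypothetical per-frame AI update -- names are illustrative, not from any real engine.
// Nothing here references pixels or render resolution: the AI works purely on
// world-space positions and per-character state.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct NpcState {
    Vec3  position;   // world-space coordinates, not screen pixels
    Vec3  target;     // where this NPC is trying to go
    float speed;      // metres per second
    bool  alerted;    // has it noticed the player?
};

void UpdateAi(std::vector<NpcState>& npcs, const Vec3& playerPos, float dt) {
    for (NpcState& npc : npcs) {
        // Perception: decide state from world-space distances, nothing else.
        float dx = playerPos.x - npc.position.x;
        float dz = playerPos.z - npc.position.z;
        float distToPlayer = std::sqrt(dx * dx + dz * dz);
        npc.alerted = distToPlayer < 15.0f;
        if (npc.alerted) npc.target = playerPos;   // chase the player

        // Movement: step toward the current target.
        float tx = npc.target.x - npc.position.x;
        float tz = npc.target.z - npc.position.z;
        float len = std::sqrt(tx * tx + tz * tz);
        if (len > 0.001f) {
            npc.position.x += (tx / len) * npc.speed * dt;
            npc.position.z += (tz / len) * npc.speed * dt;
        }
    }
}
```

Multiply that by hundreds of NPCs per frame and you get a CPU cost that is completely independent of whether the GPU ends up drawing the result at 900p or 1080p.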
What torok says is precisely true. If the game is having trouble hitting 30fps because the CPU is bottlenecked by the AI, it wouldn't matter if you lowered the resolution to 200p or boosted it to 4K (assuming unlimited GPU power); the framerate would be identical.
The CPU has no bearing on screen resolution. None. Nada. Zilch. Zero. People need to stop making themselves look ignorant by claiming it does. The two have no connection.
The CPU will run the game simulation as fast as the programming allows, and then the world it generates has to be rendered by the GPU. Once you've hit your performance target on the CPU side, you work out how much detail you can draw that world in and which bells and whistles you can enable settings-wise.
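Here's a rough sketch of that division of labour, with every number made up purely for illustration, assuming the CPU and GPU work on consecutive frames in parallel so the frame time is roughly whichever stage is slower:

```cpp
// Rough sketch of a CPU-bound frame budget -- all numbers are illustrative.
// If the CPU-side update takes ~35 ms (AI, physics, game logic), the frame can
// never be shorter than ~35 ms, no matter how cheap the GPU work becomes.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

int main() {
    const double cpuUpdateMs = 35.0;   // resolution-independent CPU work

    // Hypothetical GPU cost scaling roughly with pixel count.
    const double gpuMsAt1080p = 25.0;
    const double gpuMsAt900p  = gpuMsAt1080p * (1600.0 * 900.0) / (1920.0 * 1080.0); // ~17.4 ms
    const double gpuMsAt200p  = 1.0;

    for (double gpuMs : {gpuMsAt1080p, gpuMsAt900p, gpuMsAt200p}) {
        // Frame time is roughly the slower of the two pipelined stages.
        double frameMs = std::max(cpuUpdateMs, gpuMs);
        std::printf("GPU %.1f ms -> frame %.1f ms (%.1f fps)\n",
                    gpuMs, frameMs, 1000.0 / frameMs);
    }
    // Every line prints ~35 ms (~28.6 fps): the CPU is the wall, not the resolution.
}
```

Dropping the resolution only shrinks the number that was already smaller, which is why it buys you nothing when the CPU is the limit.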
You can try for resolution first, then play with texture detail, lighting, shadows, AA, AF, AO, LOD, DOF, etc. With modern displays being digital, it's always preferable to go for 1:1 pixel matching, as running a non-native resolution is ugly. With upscaling, native pixel matching is taken out of the equation (the image stream from the console is 1080p even when rendering at 792p, 900p, etc.), but the problem is what happens before it gets there: the upscaling produces blur and artifacts. These are NOT hard to see on a good modern display, and they made Watch Dogs and BF4 look terrible on the PS4. It looks okay on a small screen, a badly calibrated screen, or a decent screen from a great distance, but otherwise you'd have to be blind not to see how ugly upscaling is.
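As a back-of-the-envelope illustration (my own arithmetic, not measurements): 900p to 1080p is a 1.2x non-integer stretch per axis, so almost every output pixel has to be interpolated from several source pixels, which is exactly where the softness comes from.

```cpp
// Back-of-the-envelope look at why non-native output gets soft.
// 1600x900 -> 1920x1080 is a 1.2x stretch per axis: output pixel centres rarely
// land on source pixel centres, so the scaler has to blend neighbouring pixels.
#include <cmath>
#include <cstdio>

int main() {
    const double srcW = 1600.0, srcH = 900.0;
    const double dstW = 1920.0, dstH = 1080.0;

    std::printf("scale factor: %.2fx horizontal, %.2fx vertical\n",
                dstW / srcW, dstH / srcH);                                  // 1.20x, 1.20x
    std::printf("pixels rendered: %.0f vs native %.0f (%.0f%%)\n",
                srcW * srcH, dstW * dstH,
                100.0 * (srcW * srcH) / (dstW * dstH));                     // ~69%

    // Example: where does output column 7 sample from in the source image?
    // (Simple bilinear-style mapping -- real scalers are fancier, same idea.)
    double srcX = (7 + 0.5) * srcW / dstW - 0.5;                            // = 5.75
    std::printf("output x=7 samples source x=%.2f -> blend of columns %d and %d\n",
                srcX, (int)std::floor(srcX), (int)std::floor(srcX) + 1);
}
```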
Back to ACU, it's this simple:
(A) IF the PS4 and XB1 can both run the game @ ~30fps (or an acceptable framerate) with the desired content/AI/design
(B) AND the XB1 can run that game @ 900p
(C) THEN the PS4 can run that same game @ 1080p with perhaps a shade of excess GPU power to spare
That's all there is to it. With zero optimization work, no extra recoding, nothing. Same settings, same resources, just at 1080p vs. 900p. We're not talking about porting to a different architecture or dealing with a different type of GPU. It's just giving the same CPU-limited game significantly more GPU power to render it @ 30fps (or a variable 30, depending on how much of a wall they ran into CPU-wise).
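For anyone who wants the numbers behind (C): the pixel counts below are exact, and the compute figures are just the commonly quoted launch specs, included as a rough sanity check rather than a benchmark.

```cpp
// Quick arithmetic behind point (C): how much extra GPU work 1080p actually is.
// Pixel counts are exact; the TFLOPS values are the commonly quoted specs,
// used only as a rough sanity check, not a measured benchmark.
#include <cstdio>

int main() {
    const double px1080p = 1920.0 * 1080.0;   // 2,073,600 pixels
    const double px900p  = 1600.0 *  900.0;   // 1,440,000 pixels

    std::printf("1080p/900p pixel ratio: %.2fx\n", px1080p / px900p);   // 1.44x

    // Commonly quoted raw GPU compute: PS4 ~1.84 TFLOPS vs XB1 ~1.31 TFLOPS.
    std::printf("PS4/XB1 raw compute ratio: %.2fx\n", 1.84 / 1.31);     // ~1.40x

    // Same CPU-limited game, same settings: the extra pixels land entirely on
    // the GPU, which is exactly where the PS4 has its surplus.
}
```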
Repeat after me:
CPU bottleneck does not equal GPU bottleneck
It's honestly one of the dumbest PR statements of all time.