
Forums - Gaming - Don't want parity in AC Unity? Join #PS4noParity!

cpg716 said:
Wow.. How about #NeedBetterCPUforBothConsoles.. since.. the reason for the "parity" was the CPU issues with BOTH consoles.. nothing to do with GPU.. Read the article.

This is a mobile CPU in these consoles.. AI is the reason here.. not Pixels or GPUs..

The number of pixels on screen is purely GPU work. Improving the CPU won't give you more pixels in this case. It can improve framerate, because the time needed to put a frame out doesn't depend only on rendering work (GPU) but also on AI and physics routines (the first runs on the CPU; the second can run on the CPU, the GPU, or both). A weak CPU can hurt the framerate because the AI routines take longer to run, but that doesn't affect the pixel count, because resolution and AI are independent, and resolution isn't CPU-bound. Before you jump to conclusions, make sure you know a little bit about real-time 3D rendering before saying things that are completely incorrect.
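The argument above can be sketched with a toy frame-time model. In a pipelined engine, the frame rate is limited by whichever processor takes longer per frame, so cutting resolution (which only reduces GPU work) can't speed up a CPU-bound game. All the millisecond numbers here are made up for illustration:

```python
# Toy frame-time model (illustrative numbers, not real profiling data).
# When CPU and GPU work overlap across frames, the slower side sets the fps.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the CPU and GPU pipeline their work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 40.0             # AI + physics + game logic: fixed per frame
gpu_1080p = 25.0          # hypothetical render time at 1080p
gpu_900p = 25.0 * (1600 * 900) / (1920 * 1080)  # scales with pixel count

print(fps(cpu_ms, gpu_1080p))  # 25.0 fps: CPU-bound
print(fps(cpu_ms, gpu_900p))   # 25.0 fps: fewer pixels didn't help at all
```

Dropping from 1080p to 900p cuts the hypothetical GPU time by about 30%, but the framerate doesn't move, because the 40 ms of CPU work is the bottleneck either way.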

And even the CPUs are probably enough. We already know that Mantle and DirectX 12 will let PCs improve framerates simply by removing unnecessary high-level abstraction that demands too much CPU time. Basically, you will be able to keep the fps even with weaker CPUs. PCs need strong CPUs for gaming simply because the excessive abstraction layers make them necessary, even for games that aren't doing simulations or other CPU-heavy operations.

Their reasoning about CPUs was about the AI, and that is CPU work. The article clearly says that. Even then, it looks more like laziness on their part, considering how dumb their previous AI was.



cpg716 said:
 

There is no "parity clause"..  You aren't even making sense..  And again.. the developer spoke of CPU issues (with both consoles, as their CPUs are identical, with the X1's clocked a little higher)..  this doesn't just relate to FPS.. it also relates to resolution when it comes to AI on the screen..

So.. yeah.. anyway..

AI doesn't have any correlation with resolution. You just run an algorithm that has info about where the main character is, what he is doing, and where the CPU-controlled guy is and what he is doing. Then you decide what he will do next. Just that. There isn't any graphics info here, just coordinates and other state. It has nothing to do with resolution. Even when you update the characters at the beginning of the rendering pipeline, you are just changing coordinates of 3D models; there are no pixels yet. You apply lighting to vertices and still there are no pixels. Only when all that is done can you rasterize, run some fancy pixel shaders, apply AA, and output the frame. But that's all done on the GPU; the CPU has no part in it. Stop posting about rendering when you clearly don't know how it works.
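The point about AI operating purely on coordinates can be made concrete with a minimal chase-AI sketch (a made-up example, not anything from Unity's actual code). Notice that nothing about the display resolution appears in the inputs or the output:

```python
# Minimal chase AI: the decision uses only world coordinates, never pixels.

def next_move(npc_pos, player_pos):
    """Step the NPC one world unit toward the player on each axis."""
    def step(a, b):
        return 1 if b > a else -1 if b < a else 0
    return (npc_pos[0] + step(npc_pos[0], player_pos[0]),
            npc_pos[1] + step(npc_pos[1], player_pos[1]))

print(next_move((0, 0), (5, -3)))  # (1, -1): same answer at 200p or 4K
```

Run this with ten NPCs or ten thousand and the cost lands on the CPU, but rendering the result at 900p or 1080p changes nothing about what this function computes.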

And we don't even know for sure which console has a better CPU. We got a benchmark saying it's PS4.



kowenicki said:
Panama said:
Imagine if Platinum made Bayonetta run at 30fps on the 360 due to the piss poor PS3 port. This shit makes no sense other than a certain company's deep pockets influencing the decision.


Are you suggesting MS are paying for parity? That cash is changing hands?

If so, I suggest you report it to the authorities. That is both anti competitive and illegal.


Why else would a company purposely gimp their game? Maybe MS isn't paying them, but gave them a vague "We would look favorably on it if you maintain parity in your multi-plat releases" type statement.




torok said:
AI doesn't have any correlation with resolution. You just run an algorithm that has info about where the main character is, what he is doing, and where the CPU-controlled guy is and what he is doing. Then you decide what he will do next. Just that. There isn't any graphics info here, just coordinates and other state. It has nothing to do with resolution. Even when you update the characters at the beginning of the rendering pipeline, you are just changing coordinates of 3D models; there are no pixels yet. You apply lighting to vertices and still there are no pixels. Only when all that is done can you rasterize, run some fancy pixel shaders, apply AA, and output the frame. But that's all done on the GPU; the CPU has no part in it. Stop posting about rendering when you clearly don't know how it works.

And we don't even know for sure which console has a better CPU. We got a benchmark saying it's PS4.

This is precisely true. If the game is having trouble hitting 30fps due to CPU bottlenecking with AIs, it wouldn't matter if you lowered the resolution to 200p, or boosted it to 4K (assuming unlimited GPU power), the framerate would be identical.

CPU has no bearing on screen resolution. None. Nada. Zilch. Zero. People need to stop looking ignorant by claiming such. They have no connection.

The CPU will run the game as fast as it can according to the programming, and then the world that is generated has to be rendered graphically by the GPU. If you hit your performance target on the CPU side, then you measure how much detail you can draw that world in, and how many bells and whistles you can enable settings-wise.

You can try for resolution first, then play with texture detail, lighting, shadows, AA, AF, AO, LOD, DOF, etc. With modern displays being digital, it's always preferable to go for 1:1 pixel matching, as running at a non-native resolution is ugly. Upscaling takes native pixel matching out of the equation (the image stream from the console is 1080p even at 792p, 900p, etc.), but the problem is what happens before it gets there: the upscaling produces blur and artifacts. These are NOT hard to see on a good modern display, and they made Watch Dogs and BF4 look terrible on PS4. It looks okay on a small screen, a badly calibrated screen, or a decent screen from a great distance, but otherwise you'd have to be blind not to see how ugly upscaling is.

Back to ACU, it's this simple:

(A) IF the PS4 and XB1 can both do the game @ ~30fps (or acceptable framerate) with the desired content/AI/design 

(B) AND the XB1 can run that game @ 900P

(C) Then that means that the PS4 can run that @ 1080P with perhaps a shade of excess GPU power to spare

That's all there is to it. With zero optimization work, extra recoding, nothing. Same settings, same resources, just at 1080p vs. 900p. We're not talking porting to a different architecture, or dealing with a different type of GPU. It's just giving the same thing significantly more GPU power to render the same CPU-limited game @ 30fps (or variable 30 depending on how much of a wall they ran into CPU wise).
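The (A)-(C) reasoning checks out arithmetically. Going from 900p to 1080p is a 1.44x increase in pixel count, which roughly matches the PS4's publicly quoted raw GPU advantage (around 1.84 vs. 1.31 TFLOPS, about 1.4x):

```python
# Pixel-count ratio between 1080p and 900p, versus the approximate
# GPU compute ratio from the consoles' publicly quoted specs.

pixels_1080p = 1920 * 1080
pixels_900p = 1600 * 900

pixel_ratio = pixels_1080p / pixels_900p
gpu_ratio = 1.84 / 1.31  # approximate TFLOPS figures, PS4 vs. XB1

print(pixel_ratio)  # 1.44
print(gpu_ratio)    # roughly 1.40
```

So if the XB1's GPU can push the CPU-limited game at 900p, the PS4's GPU has almost exactly the extra throughput needed for the same frame at 1080p, which is the whole point of (C).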

Repeat after me:

CPU bottleneck does not equal GPU bottleneck

It's honestly one of the dumbest PR statements of all time.



LOL, am I the only one who thinks this is HILARIOUS? I mean, they made this "parity" decision with the intention of stopping "all the debates and stuff", and they ended up generating the biggest shitstorm Ubisoft has ever seen. I just went to Twitter, and not only do you see people saying they already cancelled their preorders, but there are also all these images: