ethomaz said:
Kasz216 said:

Well first off... using the GPU for AI and physics is going to greatly lower your processing budget for... well, graphics effects, which is going to greatly cut into the PS4's graphics capabilities.  The CPU will also just be sort of wasted at that point.  Doing so more or less throws away the bandwidth advantage, making using GDDR5 in the first place... useless.

Aside from which, according to your own link, it doesn't.

"First, not all the algorithms fit for the GPU’s programming model, because GPUs are designed to compute high-intensive parallel algorithms"

As Civ 5's AI wouldn't work well for parallel processing, because its AI can't be processed effectively in parallel: later problems rely on earlier ones.  In other words, there just aren't that many independent variables to process separately, since most things are dependent.

For example: a Mongolian horse archer attacks a Greek phalanx, roll 1d4 for damage.  If HP ends up at 10 or more, pull back the cavalrymen and push the spearmen forward.  If lower than 10, attack with the cavalryman (1d4 = 2), attack with the spearman, and move the spearman over the valuable resource.


Now imagine that times about 20 more units needing to be positioned.  You can only parallel-process so much, since the MAJORITY of moves require input from other moves, meaning each move has to be calculated separately.  You can get part of each problem done, but they have to be finished in order anyway, making it fairly moot.
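To make the dependency point concrete, here's a minimal sketch of the kind of turn described above — every decision reads the result of the previous roll, so the steps cannot run concurrently.  All names and numbers here are invented for illustration, not taken from any real game's code:

```python
import random

def resolve_turn(attacker_hp, defender_hp):
    """Toy combat step: each decision depends on the previous roll,
    so the steps must run in order -- no two can go in parallel."""
    log = []
    defender_hp -= random.randint(1, 4)    # horse archer attacks: 1d4
    if defender_hp >= 10:                  # this branch depends on the roll above
        log.append("pull back cavalry, push spearmen forward")
    else:
        defender_hp -= random.randint(1, 4)  # cavalry follow-up: another 1d4,
        log.append("cavalry attacks, spearman takes the resource tile")
    return defender_hp, log

hp, moves = resolve_turn(attacker_hp=12, defender_hp=11)
```

The branch can't even be chosen until the first roll resolves, which is the whole problem with splitting this across many slow workers.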

I mean, instead of a computer... think of it as either having one really smart guy who's quick at solving problems (DDR3), or a group of 8 or so guys (GDDR5).  (Numbers pulled out of my ass, but largely irrelevant.)

 

Sure, the 8 guys can probably solve 30 problems faster than the 1 guy.

However, give them 8 parts of the same algebra problem and it doesn't really help; they'll fall behind.  Guy 2 needs to find out what X is from guy 1, guy 3 needs to find Y from guy 2, etc.

Sure, they can shave off a little time by simplifying down to just needing the variables... but they still need the variables, and then they just sit there... waiting for the guys at the front of the line to hand them the new info.  And since these guys are split up, they have to walk toward each other to tell each other what each number is, and each of them is individually slower than the first guy at figuring things out in the first place.

Once you build up enough dependent variables for a complicated AI in a strategy game... the smart guy has a clear advantage.
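The "smart guy vs. eight guys" intuition is roughly Amdahl's law: if some fraction of the work is inherently serial, adding more workers stops helping.  A minimal sketch of that formula (the example fractions are made up, just to show the shape of the curve):

```python
def speedup(parallel_fraction, workers):
    """Amdahl's law: overall speedup when only part of a job parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# A mostly independent workload scales well across 8 workers...
independent = speedup(parallel_fraction=0.95, workers=8)   # ~5.9x
# ...but a dependency-heavy AI turn barely benefits at all.
dependent = speedup(parallel_fraction=0.20, workers=8)     # ~1.2x
```

Once the serial fraction dominates, the single fast worker wins, which is the point being made above.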

 

GPUs are workers... not thinkers.  Now if you're offloading ALL the work onto your workers... and your thinkers aren't doing anything... not even the thinking...

Well, that's bad programming right there.  Your workers will get overworked, or you'll have to cut back on their normal duties.

AI is parallel processing... you have all the variables to work on in parallel, to generate as many possible actions as you can and choose one... GPUs do that... if any variable changes, a new AI pass is made.

You're just not understanding what AI parallel processing means... it's already done in parallel on CPUs, because it needs to be processed that way to be fast and responsive.

The AI processes one decision in parallel with the variables it has at that moment... if a new variable (a later problem) comes up, a new parallel AI pass is made to reach another decision... and that's done for every object on the screen... so, in parallel.

There is no linear or waiting decision in AI processing.  Everything is done in milliseconds (or less).

The more parallel processing power you have, the better the AI, because you generate more options for the final decision, making the AI even more unpredictable.

Just google "Artificial Intelligence Parallel Processing"... the basis of AI is to process everything in parallel (a GPU task).
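For what it's worth, the "evaluate all options in parallel, then pick one" model being described here can be sketched like this — the scoring function and move list are invented toys, not any real engine's AI:

```python
from concurrent.futures import ThreadPoolExecutor

def score_move(move):
    """Toy evaluation: score one candidate move independently of the others."""
    x, y = move
    return 10 * x - abs(y - 3)   # made-up heuristic, for illustration only

candidate_moves = [(1, 0), (2, 5), (3, 3), (2, 2)]

# The embarrassingly parallel part: every candidate is scored independently...
with ThreadPoolExecutor() as pool:
    scores = list(pool.map(score_move, candidate_moves))

# ...but the final choice is still one sequential reduction over the scores.
best = candidate_moves[scores.index(max(scores))]
```

Note that this only parallelizes cleanly when each candidate's score doesn't depend on the others — which is exactly the assumption the reply below disputes for games like Civ.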

Actually, I'm thinking you just don't understand...

Or just have never played a game like Civilization.

Because that's not how the AI works.  Again, Civilization's AI works mostly on fast sequential calculations, as do most games like it (well... and most games).  You are not talking about making the best or quickest AI, you're talking about making the most random AI.  I think you're confusing artificial intelligence (making a computer seem human) with artificial intelligence (choices made by a game to be a challenge).

Parallel threading is good if you want something random, and it's good if you're trying to search through a giant database, solve a bunch of independent problems, or build a great Jeopardy computer...

Despite that, and the fact that everyone has multi-core CPUs, video game AIs are still generally written in Lua or Python, despite there being parallel processing alternatives (like Parallel Python).

Which, really, come to think of it... I'm not even sure why you brought the argument down this avenue...

Even if you use multiple cores... when it comes to complex dependent-variable AIs (or really any AI meant for a game), DDR3 is just going to work better in the first place for AI processes, be it on the GPU or CPU... just because latency is measured in nanoseconds doesn't mean it doesn't make a difference... again, play a CPU-intensive game like Civ 4 and you'll see.

There is a reason why even the parallel-processing Watson used 16 TB of DDR3 (and CPUs, for that matter) when they were just trying to build the smartest computer they could.

Either way, GPUs just aren't as good at complicated algebra-style choices.

Which is why the one supercomputer that uses GPUs also happens to use CPUs to make all the choices (Titan).

The GPUs do the basic math, while the CPUs swoop in with the dependent-variable stuff and direct traffic.

In the previous example, working together, the CPU solves the tough algebra while the GPUs handle the order-of-operations 2+2 parts.
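That division of labor can be sketched roughly like this — a worker pool stands in for the GPU doing bulk independent per-unit math, and a single sequential loop stands in for the CPU walking the dependent decision chain.  Every name and number here is made up for illustration:

```python
from multiprocessing.dummy import Pool  # thread pool, standing in for the "GPU"

def unit_threat(unit):
    """Independent per-unit arithmetic: parallelizes trivially."""
    hp, attack = unit
    return hp * attack

units = [(10, 3), (4, 7), (8, 2), (6, 6)]

# "GPU" part: one independent calculation per unit, all at once.
with Pool(4) as pool:
    threats = pool.map(unit_threat, units)

# "CPU" part: a dependent chain -- each choice reads the previous results.
targets, remaining_strength = [], 40
for unit, threat in sorted(zip(units, threats), key=lambda p: -p[1]):
    if threat <= remaining_strength:        # depends on all earlier iterations
        targets.append(unit)
        remaining_strength -= threat
```

The per-unit scores come back in any order you like; the target-picking loop, by contrast, has to run start to finish because each iteration changes the budget the next one sees.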

 

Outside of which, you keep changing the argument as you keep losing it.  It's a shame, because there are interesting conversations to be had about game design and what kinds of games would excel on the system... but instead it's just... pointless, except for lurkers who get to learn a thing or two.

As such... I think I'm done.