Kwaad said:
Quite a few of the newer AIs actually DO obey line of sight. They also know there are objects they can hide behind. I just wanna say, I had my brother write me a simple 'script'.
You run the script 30 times... it takes about 0 seconds. Do it about 40 times... it takes about 3 seconds. Do it about 50 times, it takes about 4 minutes. I admit, scripts are simple. Very, very simple. My brother wrote that script in about 10 minutes for me. My computer just can't seem to run that script 100 times in a day (24 hours). EDIT: I bumped this thread because I had my brother make this program earlier today, I just got around to posting it, and it is in relation to this thread.
arghh... 1. you don't mean running the code "30", "40", or "50" times, you mean using "30", "40", or "50" as an input.
of course your computer can't run it with "100" in a day--with "100" as the input it's going to do on the order of 2^100 calculations (that's the complexity of your algorithm), on top of the stack calls. assuming that's your limit, and the PS3 is 1000 times faster than your computer with no memory limits, it could still only handle an input of about "110" in a day, because 1000 is roughly 2^10, so a 1000x speedup only buys you about 10 more on the input. exponential growth shows up in a lot of algorithms, including AI ones, and that's the gist of the argument for why more powerful processing doesn't mean much--the difference is almost certainly imperceptible.
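to spell out the arithmetic behind that "110" figure--a rough sketch, assuming the running time really does grow like 2^n in the input n:

```python
import math

# If running time grows like 2**n (exponential in the input n),
# a machine that is 1000x faster only pushes the feasible input
# up by about log2(1000) ~= 10, since 2**(n + 10) ~= 1000 * 2**n.
speedup = 1000
extra_input = math.log2(speedup)
print(extra_input)  # ~9.97, i.e. an input of 100 becomes roughly 110
```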
2. it's using a recursive algorithm--and that's why you get the exponential time scaling. recursive algorithms are easy to code but notoriously inefficient, and indeed, for this particular application a different algorithm (simply store the data as you go) makes the problem scale essentially linearly. meaning--it certainly takes less than a second to run with "100" as the input.
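the original script isn't posted, so this is only a hypothetical sketch in Python of the kind of thing being described--a naive recursive function (Fibonacci, the usual textbook example, used here as a stand-in) next to a version that stores the data as it goes:

```python
def fib_recursive(n):
    # Naive recursion: each call spawns two more calls,
    # so the number of calls grows roughly like 2**n.
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    # "Store the data as you go": keep only the last two values
    # and build up, so the work grows linearly with n.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_iterative(100))   # returns instantly
# print(fib_recursive(100)) # would take astronomically long
```

same answer either way; the only difference is that the second version never recomputes anything it has already worked out.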
the Wii is an epidemic.