shavenferret said: Has anybody worked with AI in some capacity?
Not in the last decade. Before that I worked on GPS navigation, which had what was then called AI for route finding, dead reckoning, voice recognition and address matching, basically a lot of fuzzy matching. As well as tracking cell phone tower data to predict/determine the locations of traffic jams and feed the actual traffic speed on major roads back into the system.
Of course, of all those the rule-based 'AI' was the most reliable and the neural net parts (voice recognition) the least, or rather the hardest to fix and improve. With limited processing power it all had to be either rule based or simple neural nets. Server-based is still not great though; Siri and my TV often still don't understand my Dutch accent. :/
Anyway, none of those were in any way threatening, but we did have some moral concerns about tracking user data and the wholesale tracking of all cell phone location data.
Never used AI for code generation, but if AI could help find bugs, that could be useful. Not just make the program crash, but actually find why it crashes and under what circumstances. That's the hardest part of the job: those infernal 'can't easily replicate' crashes and cases of unintended behavior, which still happen when everything is rule based. Murphy's law in software: "Everything that can go wrong eventually will go wrong." We proved that rule all the time lol.
One of the worst was tracking down an unexpected slowdown and increase in memory use in the routing engine. Eventually it turned out to be caused by a unique situation in the road network data: a logging area in Germany somehow had a perfect grid pattern of exactly equal-length roads. A condition in the code kept both options open when search paths reached the next crossing with exactly the same cost. Since the area was a large grid, like a checkerboard, it basically replicated the checkerboard problem, doubling the open options (paths to investigate) at every intersection. It only showed up when that area fell within search range: the search would start exploring there and starve the useful search paths for as long as the search had to go on, either completing slowly or eventually running out of memory.
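To give a rough idea of what that kind of tie condition does (purely a toy Python sketch, not the actual engine code, and all the names are made up): on a uniform grid, keeping every equal-cost arrival open makes the number of expansions explode, while only re-opening on a strictly better cost keeps it proportional to the number of intersections.

```python
import heapq

def count_expansions(n, keep_ties):
    """Best-first search over an n x n grid of unit-length roads from (0,0)."""
    best = {}                      # best known cost per intersection
    frontier = [(0, (0, 0))]       # (cost, intersection) priority queue
    expansions = 0
    while frontier:
        cost, (x, y) = heapq.heappop(frontier)
        expansions += 1
        if (x, y) == (n - 1, n - 1):
            continue               # reached the far corner, don't expand further
        for nx, ny in ((x + 1, y), (x, y + 1)):   # move right or up on the grid
            if nx >= n or ny >= n:
                continue
            new_cost = cost + 1
            seen = best.get((nx, ny))
            # Buggy variant: an equal-cost arrival is kept as a second open path,
            # so every intersection gets re-explored once per distinct path to it.
            if seen is None or new_cost < seen or (keep_ties and new_cost == seen):
                best[(nx, ny)] = new_cost
                heapq.heappush(frontier, (new_cost, (nx, ny)))
    return expansions

if __name__ == "__main__":
    for n in (4, 6, 8, 10):
        print(n, count_expansions(n, keep_ties=False),   # grows like n^2
                 count_expansions(n, keep_ties=True))     # grows combinatorially
```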
Another one was just as hard, if not harder, to find: a condition where left-turn prohibitions on all sides of a crossing could create another looping situation. Both bugs, and others, were eventually found by visualizing the search tree and then spotting suspicious behavior by eye: activity that goes on too long in an area, areas that aren't reached, unexpected jumps, etc. AI could be useful to spot things like that.
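The visualization idea can be approximated with something as crude as this (again a hypothetical sketch, not our tooling): bucket every node expansion into a coarse map tile and print a heat map. A tile that keeps lighting up long after the search should have moved past it, like that logging-area grid, jumps out immediately.

```python
from collections import Counter

def heatmap(expansions, tile_size=1000):
    """expansions: iterable of (x, y) map coordinates visited by the search."""
    tiles = Counter((x // tile_size, y // tile_size) for x, y in expansions)
    if not tiles:
        return
    peak = max(tiles.values())
    shades = " .:-=+*#%@"                     # low -> high activity
    xs = [t[0] for t in tiles]
    ys = [t[1] for t in tiles]
    for ty in range(min(ys), max(ys) + 1):
        row = ""
        for tx in range(min(xs), max(xs) + 1):
            level = tiles.get((tx, ty), 0) * (len(shades) - 1) // peak
            row += shades[level]
        print(row)
```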
Visualizing the code as it runs has been very useful for optimizing disk access as well: finding patterns in data access to organize data more efficiently, reducing the number of reads, choosing block sizes, and deciding what should stay in memory and for how long. Using the human mind for pattern recognition. AI used to improve compression and data organization would be useful. Of course, at one point Huffman coding was considered AI.
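The disk-access side boils down to something like this (a made-up sketch, assuming you have a log of block reads in fetch order): count which blocks get re-read most, since those are the obvious candidates to keep resident in memory.

```python
from collections import Counter

def analyze_reads(block_reads, cache_slots=64):
    """block_reads: sequence of block ids in the order they were fetched from disk."""
    counts = Counter(block_reads)
    hot = counts.most_common(cache_slots)       # blocks worth pinning in memory
    saved = sum(n - 1 for _, n in hot)          # repeat reads those blocks cost us
    print(f"total reads: {len(block_reads)}, distinct blocks: {len(counts)}")
    print(f"pinning the top {cache_slots} blocks would avoid ~{saved} reads")
    return hot
```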
Anyway, we were all about reducing costs and optimizing, so this just doesn't compute for me:
https://www.eurogamer.net/google-reaches-grim-ai-milestone-replicates-playable-version-of-doom-entirely-by-algorithm
How to make things less efficient...