Machiavellian said:
Is it that the AI doesn't understand, or that the person inputting the data didn't fully understand what they wanted? Garbage in, garbage out. I'm not saying the AI model used here would have done better, but you really don't know what was used to make this video, so you can't assume the AI didn't understand the assignment. This could have been the desired result.
This looks like what seemed typical of earlier AI models in particular, so I'm just going to assume it's AI doing its best at portraying diving without actually understanding how a human body works, especially mid-dive. My guess is that the training material doesn't include certain camera angles, or people seen from certain angles, so the AI just doesn't know what to do, and because it doesn't properly understand people or diving, it guesses incorrectly.
Admittedly I could be wrong, but this is exactly what I'd expect AI to do when the training data isn't comprehensive. I'm sure the training data is impressive, but it's hard to cover absolutely everything.
