Yeah, most content should be fine unless you are specifically going for that audience. Still, this seems kind of ridiculous.
The problem is that YouTube uses AI software to determine which videos are for children, and a large number of channels have already lost their monetization because of it. Too many creators who make videos for all ages can be struck by this, and even using words like "cool" or "fun" can make a bot think a video is for kids.
I feel really bad for the people who depend on YouTube to pay their bills and are suddenly hit by this.
While I also don't particularly trust YouTube's algorithms, it doesn't seem that YouTube is relying on them here (or at least isn't using algorithmic means as its main method of designation). I may be wrong, but it looks like YouTube has provided tools for creators to self-identify whether their content is targeted at children, and the creator is directly liable for mischaracterizing their videos:
"As a creator, you know your videos and your audience best, and it is your legal responsibility to comply with COPPA and/or other applicable laws and designate your content accurately."