In a way, it sort of is smoother than in real life, because our brains can't cheat for us.
I think you've got that backwards. Any motion blur our eyes add, they can add just as well to moving objects on a screen. It's how far an object moves through your visual field while light is being collected on your retina that causes blur; you don't get motion blur on a non-rotating object that you follow with your eyes.

Which is probably why 120Hz/240Hz smooth motion looks weird. The motion blur is in the original footage and gets smeared out over multiple frames: the video now runs at 120 or even 240 fps, yet when you follow the ball with your eyes it is still blurred as captured by the original 60 fps camera. Maybe that even interferes with your motion perception, which relies on clear lines and shapes to detect movement. Instead of your brain seeing a blurry object 'step' through your visual field and inferring motion from that, it can't see the steps any more and the object becomes a blur smoothly moving along.
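Some back-of-the-envelope arithmetic makes the point concrete (all the numbers here are made up for illustration): the smear baked into a frame is just speed times exposure time, so interpolating 60 fps footage to a higher frame rate can't shrink it, while actually shooting at a higher frame rate can.

```python
# Illustrative sketch with assumed numbers: how far an object smears
# across the frame during one exposure.

def blur_px(speed_px_per_s: float, exposure_s: float) -> float:
    """Distance the object travels while light is being collected."""
    return speed_px_per_s * exposure_s

speed = 1200.0  # hypothetical ball speed across the frame, in px/s

# Captured at 60 fps with a 180-degree shutter (exposure = 1/120 s):
captured = blur_px(speed, 1.0 / 120.0)      # 10.0 px of blur baked in

# Interpolating that footage to 240 fps doesn't re-expose anything,
# so every interpolated frame still carries the same 10 px smear:
interpolated = captured

# Only actually shooting at 240 fps (exposure = 1/480 s) reduces it:
reshot = blur_px(speed, 1.0 / 480.0)        # 2.5 px

print(captured, interpolated, reshot)       # 10.0 10.0 2.5
```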
I don't think there's a difference between waving your hand in front of you and watching a high-fps recording of waving your hand in front of you. Depending on the light source, of course: waving your hand in front of a CRT monitor was an easy test for flicker :)
It probably mostly has to do with adjusting to 120 fps recordings. From birth your brain has learned how to deal with 24, 30, and 60 fps; 120 fps is new, and apparently still not high enough to remove the distinction between looking out a window and looking at a 120 fps screen. Back before LCD I used to game at up to 100Hz (a CRT doesn't really become flicker-free below 100), and I didn't notice anything weird about it. That was before motion blur was added to games.