Once again: it's not the deinterlacing chip. No chip does it right.
Then they should at least provide the option to deinterlace with the fields in the other order. Video interlacing is just the process of sending the image one field at a time. Each field is every other line of the total image, and you need two fields to create one complete "frame". As long as the two fields are similar enough, that is, there wasn't much motion between the times the two fields were captured, things should look OK. If there is a lot of motion, you may need some form of motion compensation in the hardware to average out the differences between the two fields and create a decent-looking frame. FWIW, this doesn't matter quite as much on a low-resolution CRT unless the fields are out of phase, since the phosphors fade slowly and the lines are close enough together to blur the differences.
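To make the field/frame relationship concrete, here's a minimal sketch of "weave" deinterlacing in Python, assuming each field is just a list of rows (function and variable names are mine, for illustration):

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one full frame.

    top_field supplies lines 0, 2, 4, ...; bottom_field supplies
    lines 1, 3, 5, ... This looks fine for static scenes; if there
    was motion between the two field captures you get the familiar
    "combing" artifact, which is where motion compensation comes in.
    """
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

# Example: a 4-line "image" split into two 2-line fields.
top = [[0, 0], [2, 2]]      # lines 0 and 2
bottom = [[1, 1], [3, 3]]   # lines 1 and 3
print(weave(top, bottom))   # lines come back in order 0, 1, 2, 3
```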
If the interlacing of the fields is backwards, that is, the first field is going where the second field goes, then that should be easy enough to fix if you know it's happening. Just provide an alternate field-combining mode which places the fields in the correct spots in the frame. If it's a motion compensation issue, well... they have to make a better chip. If field data is coming in for other frames while you're still decoding the current field, there isn't much you can do. I'd be surprised if the cameras' owners didn't sue the manufacturer. Regardless, the technology isn't where it needs to be. I'll see where it is in 2009, when I replace my current 32" set.
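The "alternate field-combining mode" I'm suggesting is trivial in principle: just swap which field supplies the even lines before weaving. A hypothetical sketch (the `top_field_first` flag stands in for whatever field-order metadata a real decoder would read from the stream):

```python
def weave(first_field, second_field):
    """Interleave two fields: first supplies even lines, second odd lines."""
    frame = []
    for a, b in zip(first_field, second_field):
        frame.extend([a, b])
    return frame

def weave_with_order(field_a, field_b, top_field_first=True):
    """Combine fields honoring the stream's declared field order.

    When the source is flagged wrong (the problem described above),
    the viewer could simply toggle this flag instead of waiting for
    a better chip.
    """
    if top_field_first:
        return weave(field_a, field_b)
    return weave(field_b, field_a)
```

That's the whole fix for a reversed field order; the motion-compensation case is genuinely harder and really does need better silicon.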