JEMC said:

There are a lot of problems with Apple scanning for child abuse pictures, but one of the first that comes to mind is: what are they going to do if they find them? I doubt they'll warn the authorities because, given that those pics/videos were obtained through an illegal search of private property, no judge will accept them as valid evidence in a court case. So, again, what will they do with it? Will they ban those users from their services and lose customers?

And let's not forget the cases where the algorithm may mistake a picture/vid for something it isn't.

It's not just the violation of your privacy that is a problem, but also what Apple will do with this information.

I'd also be concerned that this capability could be used to scan for other things, especially in a country like China, where, let's say, a Winnie the Pooh film could get you thrown in jail. No system is foolproof, and this one seems way too easy for someone to figure out how to exploit for goals less noble than stopping child predators.

That being said, I think you're wrong that Apple couldn't or wouldn't give these reports to outside sources. The picture they released (see Captain_Yuri's post) shows they send the report to NCMEC (National Center for Missing & Exploited Children), where I assume it would be investigated by that organization. Apple will just add it to the iOS 15 user agreement, and then it will be perfectly legal for them to search the phone and hand it over to the authorities.

As for the algorithm, they've already said each flagged picture would be manually reviewed by a human before a report is filed to NCMEC. Worst job ever, but it does resolve the concern about the algorithm being wrong.
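For what it's worth, the flow Apple has described is roughly: compute a perceptual hash (NeuralHash) of each photo on-device, compare it against a hash list supplied by NCMEC, and only after a threshold of matches is crossed does a human review the flagged images and file a report. Here's a minimal sketch of that flow in Python; all the names are mine, and the SHA-256 stand-in is just a placeholder, since a real perceptual hash is deliberately fuzzy (visually similar images get similar hashes), which is exactly where false positives come from:

```python
import hashlib
from dataclasses import dataclass
from typing import Callable

# Placeholder for a perceptual hash like NeuralHash. A real perceptual
# hash tolerates resizing/recompression, unlike this cryptographic stand-in.
def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

@dataclass
class ScanResult:
    matched: list  # images whose hashes appeared in the supplied list

# Apple has cited a threshold of roughly 30 matches before any review.
REPORT_THRESHOLD = 30

def scan_library(images: list[bytes], known_hashes: set[str]) -> ScanResult:
    """Flag images whose perceptual hash appears in known_hashes.

    Note: the client can't verify what known_hashes actually contains.
    Whoever supplies that list decides what gets flagged, which is the
    repurposing concern raised above.
    """
    matched = [img for img in images if perceptual_hash(img) in known_hashes]
    return ScanResult(matched=matched)

def maybe_report(result: ScanResult,
                 human_review: Callable[[bytes], bool]) -> bool:
    # Nothing is reported below the threshold, and even above it every
    # match must pass human review before a report goes to NCMEC.
    if len(result.matched) < REPORT_THRESHOLD:
        return False
    return all(human_review(img) for img in result.matched)
```

The point being: nothing in the matching step knows or cares whether the hash list contains CSAM or Pooh memes. All the safeguards live in the threshold and the human review, which is why who controls the list matters so much.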