There’s a small crack in the iPhone’s foundation, but it could get a lot worse

Apple's intentions with CSAM photo scanning may be good, but that's how you pave the road to hell.

Apple's out-of-the-blue announcement last week that it was adding a bunch of features to iOS involving child sexual abuse materials (CSAM) generated an entirely predictable reaction. Or, more accurately, reactions. Those on the law-enforcement side of the spectrum praised Apple for its work, and those on the civil-liberties side accused Apple of turning iPhones into surveillance devices.