Apple's child abuse photo scanning: What it is and why people are worried

Apple has announced a trio of new technologies to combat child abuse images. Here's what you need to know.

Apple announced last week that it will begin scanning photos uploaded to iCloud for potential child sexual abuse material (CSAM). The plan has drawn intense scrutiny and generated some outrage, so here's what you need to know about the new technology before it rolls out later this year.