Last Friday, Apple announced that it was implementing measures to combat the distribution of child sexual abuse material, or CSAM, on its services. Apple, the company that famously defied the FBI by refusing to provide technical assistance in unlocking an iPhone after the terrorist attack in San Bernardino, California, surprised commentators in both the tech and human rights communities with this announcement, and the move drew a predictable torrent of criticism from both ends of the policy spectrum.

The electronic distribution of child abuse images has been a persistent, unsolved problem for more than 20 years. The growing popularity of end-to-end encrypted apps such as Apple’s iMessage and Facebook’s WhatsApp has made it harder for both law enforcement and the platform providers themselves to access evidence of criminal activity or to detect abuse.

In the fractious and divisive policy debates over what to do…
