Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology that it chaotically announced last month, citing feedback from customers and policy groups.
That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers opposing the move. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.
In a statement on Friday morning, Apple told TechCrunch:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”