Apple’s CSAM detection tech is under fire — again

Apple has encountered monumental backlash to a new child sexual abuse material (CSAM) detection technology it announced earlier this month. The system, built around a hashing algorithm Apple calls NeuralHash, has yet to be activated for its billion-plus users, but the technology is already facing heat from security researchers, who say the algorithm can produce flawed results, including hash collisions in which two different images generate the same hash.

NeuralHash is designed to identify known CSAM on a user’s device without Apple having to possess the image or know its contents. Rather than scanning photos after they are uploaded to iCloud, NeuralHash matches a photo against a database of known CSAM hashes on the user’s device before upload, which Apple claims is more privacy-friendly.

→ Continue reading at TechCrunch
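
Apple has not published NeuralHash’s internals, but the general idea behind perceptual-hash matching (deriving a compact fingerprint that survives resizing or recompression, then comparing it against a database of known hashes within some distance threshold) can be illustrated with a toy sketch. Everything below is hypothetical: the average-hash function, the known_hashes values, and the distance threshold are illustrative stand-ins, not Apple’s algorithm or data.

```python
# Toy perceptual-hash matcher. This is NOT NeuralHash, which uses a
# neural network and cryptographic private set intersection; it only
# illustrates the hash-and-compare idea described above.
from PIL import Image

HASH_SIZE = 8  # 8x8 grayscale grid -> 64-bit hash


def average_hash(path: str) -> int:
    """Downscale to an 8x8 grayscale grid, then set one bit per pixel:
    1 if the pixel is brighter than the grid's mean, else 0."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > mean)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of known-image hashes (a stand-in for the
# hash database Apple says is derived from child-safety organizations).
known_hashes = {0x8F3B52A19C0D77E4}


def matches_known(path: str, threshold: int = 5) -> bool:
    """Flag a photo if its hash lands within `threshold` bits of any
    known hash, so minor edits (resizing, recompression) still match."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

The collisions researchers reported are cases where two visually unrelated images land within the match threshold of the same hash. With a 64-bit toy hash like this one, such collisions are easy to manufacture; how resistant Apple’s much larger neural-network-derived hash is to the same attack is exactly what is in dispute.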
