Apple’s child abuse detection system is flawed — Report

A method proposed by Apple Inc. and the European Union to scan people’s digital devices for evidence of child pornography and other illegal content is a “dangerous technology” that cannot be implemented in a way that both preserves users’ privacy and helps government agencies conduct investigations, a group of prominent cryptographers and other security experts wrote in a report published Friday.

Among the 46-page report’s 14 authors are pioneers of encryption software.

It outlines, in detail, what the authors deem the numerous risks of a technique called “client-side scanning,” which was at the heart of a controversy that erupted when Cupertino, California-based Apple announced a plan in August to scan users’ iCloud Photos accounts for sexually explicit images of children and then report instances to relevant authorities.

Apple later postponed those plans amid the backlash. The New York Times previously reported on the experts’ concerns.
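At its simplest, client-side scanning means the user’s own device checks each file against a blocklist of fingerprints of known illegal content before (or as) it is uploaded. The sketch below is a deliberately simplified, hypothetical illustration of that idea using an exact SHA-256 match; Apple’s actual proposal used a perceptual hash (NeuralHash) and additional cryptographic protocols, and the blocklist contents here are placeholders.

```python
import hashlib

# Hypothetical blocklist of digests of known flagged content (placeholder data).
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-content").hexdigest(),
}

def scan_client_side(file_bytes: bytes) -> bool:
    """Return True if this file's digest matches the on-device blocklist.

    Real deployments use perceptual hashes, which are designed to match
    visually similar images rather than identical bytes -- a design choice
    the report's authors link to both false positives and evasion.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# With an exact hash, changing a single byte evades the match entirely:
flagged = scan_client_side(b"example-flagged-content")      # True
evaded = scan_client_side(b"example-flagged-content!")      # False
```

The one-byte evasion shown in the usage lines is one concrete instance of the failure modes the report describes: exact hashes are trivially evaded, while fuzzier perceptual matching opens the door to false accusations and scope creep.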

The authors of the new report wrote that the method “by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic,” citing the “multiple ways in which client-side scanning can fail, can be evaded and can be abused.”

“Plainly put, it is a dangerous technology,” the report stated.

“Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”
