Experts Say Apple’s and the EU’s CSAM Plans Are ‘Dangerous Technology’

Photo: Victoria Song / Gizmodo

Over a dozen cybersecurity experts are slamming Apple’s and the European Union’s plans to scan photos on people’s phones for known child sexual abuse material (CSAM), the New York Times reports. In a 46-page study, the experts say that photo-scanning technology is not only ineffective but also “dangerous technology.”

The experts told the NYT that they began their study before Apple announced its CSAM plans in August. That’s because the EU published documents last year indicating that it wanted to implement a similar program, one that would scan not only for CSAM but also for evidence of organized crime and terrorism on encrypted devices. The researchers also said they believe a proposal to approve this technology in the EU could come later this year.

The technology works by scanning photos on your phone before they are sent to the cloud and encrypted. These photos are then compared against a database of known CSAM images. While Apple tried several times to clarify how the feature worked and released extensive FAQs, security and privacy experts firmly believed that Apple had built a “back door” that could be used by governments and law enforcement agencies to monitor law-abiding citizens. Apple tried to allay those fears by promising it wouldn’t let governments use its tools that way. Those promises did not appease experts at the time, and some researchers claimed they could reverse-engineer the algorithm and trick it into registering false positives.
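To make the matching idea concrete, here is a minimal, hypothetical sketch of how a perceptual-hash comparison like the one described above can work. This is not Apple’s NeuralHash; it uses a simple “average hash” over an 8×8 grayscale grid, and the function names and the distance threshold are illustrative assumptions. The key property it demonstrates is that small changes to an image leave the hash nearly unchanged, which makes matching robust to edits but is also what researchers exploited to craft false positives.

```python
# Hypothetical perceptual-hash matcher (NOT Apple's NeuralHash).
# An image is reduced to an 8x8 grayscale grid of brightness values;
# each bit of the hash says whether a pixel is above the image's mean.

def average_hash(pixels):
    """Hash an 8x8 grid of brightness values: bit = 1 if pixel >= mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p >= mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_blocklist(pixels, blocklist, threshold=4):
    """Flag an image whose hash is within `threshold` bits of a known hash.

    A nonzero threshold tolerates minor edits (resizing, recompression),
    but it is also the opening for adversarial false positives: an
    attacker only needs to land within `threshold` bits of a listed hash.
    """
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocklist)
```

For example, an image with one pixel slightly brightened still hashes within the threshold and is flagged, while an unrelated image is not; this tolerance-by-design is precisely what both the robustness claims and the false-positive attacks hinge on.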

Amid the backlash, Apple pressed pause on the program at the beginning of September. However, pressing pause is not the same as pulling the plug. Apple said it would take more time to refine the feature, but did not provide details on what that revision process would look like or when the new release might come.

The worrying thing here is that even if Apple eventually scraps its CSAM plans, the EU is already building a case for its own version, one with a larger scope. The experts told the NYT that they are now releasing their findings to warn the EU of the dangers of opening this particular Pandora’s box.

“It allows a personal private device to be scanned with no likely cause for wrongdoing,” Susan Landau, professor of cybersecurity and policy at Tufts University, told the New York Times. “It’s extremely dangerous. It’s dangerous to business, national security, public safety and privacy.”

When it comes to the CSAM debate, it’s easy to get lost in the weeds. Apple, for example, launched an unusually sloppy PR campaign to explain all the nuts and bolts of its privacy fail-safes. (Spoiler: everyone was still massively confused.) The question, however, is not whether such a tool can be made secure and private, but whether it should exist in this form at all. And if you ask the security experts, the resounding answer appears to be “no.”