Apple’s iPhone Security Suddenly Under Attack—All Users Now At Risk

Apple’s bad week has suddenly gotten worse. Just a few days after the FBI warned iPhone users to stop texting Android users, given the lack of encryption in RCS, the Bureau has now confirmed that U.S. law enforcement wants access to encrypted iPhone content. And now, with perfect timing, Apple is being sued for not scanning encrypted user content for dangerous material, playing right into the FBI’s hands.

The net result is that the security all iPhone, iPad and Mac users rely on to keep their content safe and secure is under attack. The risk is the forced addition of backdoors into encrypted content. And once that line is crossed, there’s no going back.

This new lawsuit comes at the worst possible time. According to the lawyers behind the filing, the class action is brought “on behalf of thousands of survivors of child sexual abuse for [Apple] knowingly allowing the storage of images and videos documenting their abuse on iCloud and the company’s defectively designed products. The lawsuit alleges that Apple has known about this content for years, but has refused to act to detect or remove it, despite developing advanced technology to do so.”

The claims relate to Apple’s proposal to scan on-device imagery for known child sexual abuse material (CSAM) before its upload to iCloud, using hashes of known images to flag matches on phones for manual review. An unsurprising backlash followed, and Apple withdrew its proposal before it was ever released.
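Apple’s withdrawn design relied on a perceptual “NeuralHash” combined with private set intersection, so matching could happen without Apple learning anything about non-matching photos. As a rough illustration of the general hash-matching idea only — not Apple’s actual implementation — here is a minimal Python sketch, with all names and the hash database hypothetical, using a plain cryptographic hash in place of a perceptual one:

```python
import hashlib
from pathlib import Path

# Hypothetical database of hex digests of known flagged images, as might be
# distributed by a child-safety organization. (Apple's real proposal used a
# perceptual "NeuralHash", which tolerates re-encoding and resizing; a plain
# SHA-256 only matches byte-identical files.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_for_review(paths, known_hashes=KNOWN_HASHES, threshold=1):
    """Return files whose hashes appear in the known set, but only once the
    match count reaches a threshold (Apple's design required multiple
    matches before any human review could occur)."""
    matches = [p for p in paths if file_hash(p) in known_hashes]
    return matches if len(matches) >= threshold else []
```

The threshold mirrors one of the safeguards Apple described: no single match would trigger review, only an account crossing a set number of matches.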

Just a few hours before details of the lawsuit were first published in the New York Times, the FBI told me that “law enforcement supports strong, responsibly managed encryption. This encryption should be designed to protect people’s privacy and also managed so U.S. tech companies can provide readable content in response to a lawful court order.” The stories are different but the point is the same. U.S. law enforcement wants to force U.S. big tech to police the content on its platforms.

The lawsuit claims that “the images and videos of the plaintiffs’ childhood sexual abuse, which have been stored thousands of times, would have been identified and removed had Apple implemented its 2021 ‘CSAM Detection’ technology.”

As I commented at the time, the issue is not scanning for CSAM; the issue is introducing screening of any content on one side of Apple’s end-to-end encryption. Right now, Apple can tell China, Russia and others that it does not have the technology to monitor for political dissent or religious or sexual behaviors, but bring in a backdoor for CSAM and there’s no impediment to its expansion. Apple and others defend decisions such as the removal of certain apps as compliance with local laws. It’s easy to see where this might lead once Pandora’s box is opened.

Realistically, the new lawsuit is just a sideshow to the real debate that will take place under the new Trump administration. During the last Trump presidency, U.S. Deputy Attorney General Rod Rosenstein introduced the concept of “responsible encryption,” aimed at tackling so-called ‘warrant-proof’ encryption, in which tech platforms hold no decryption keys, a situation law enforcement describes as “going dark.”

As The New York Times explains, “the lawsuit is the second of its kind against Apple, but its scope and potential financial impact could force the company into a yearslong litigation process over an issue it has sought to put behind it. And it points to increasing concern that the privacy of Apple’s iCloud allows illegal material to be circulated without being as easily spotted as it would be on social media services like Facebook. For years, Apple has reported less abusive material than its peers, capturing and reporting a small fraction of what is caught by Google and Facebook. It has defended its practice by saying it is protecting user privacy, but child safety groups have criticized it for not doing more to stop the spread of that material.”

In response to the lawsuit and its coverage, an Apple spokesperson told me that “child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”

Now the FBI has reopened the debate into “responsibly managed encryption,” in the wake of Salt Typhoon’s hacking of U.S. telco networks and the consequent warnings for American citizens to use encrypted messaging and calls where they can. The lawsuit makes the same point in a different way, but at the same time.

And there’s a third leg to this stool—Europe. EU regulators and lawmakers are still fighting amongst themselves over the proposal to resolve this problem differently. Again, taking CSAM as its starting point, the EU proposal is to introduce “chat control,” essentially making tech platforms responsible for the illegality of the content they transmit, forcing them to monitor content without actually participating in the monitoring itself. Users would need to agree to such content screening to install and use end-to-end encrypted platforms. This does not yet have the votes and sponsorship it needs amongst EU member states to proceed, but that could change.

Apple points to its advances in communication safety technologies as a safeguard for minors using its platforms, but that won’t satisfy law enforcement. A perfect storm could now be brewing for Apple and the 2 billion users who rely on its market-leading end-to-end encryption across much of its ecosystem to secure their data, data that, Apple says, even the company itself cannot access under any circumstances.

But if the new Trump administration presses the FBI’s point that “U.S. tech companies can provide readable content in response to a lawful court order,” if Europe does the same, and if a sensitive lawsuit exposing the risks in such encryption runs in the background, then 2025 could prove difficult.

For all of Apple’s users, this is a huge risk. Any break in the end-to-end encrypted enclave changes it completely. If you’re an Apple user, you need to take this seriously.
