TechScape: Is Apple Taking a Dangerous Step Into the Unknown?

Apple made waves on Friday, announcing that the company would begin scanning photo libraries stored on iPhones in the United States to find and report known child sexual abuse material (CSAM).

From our story:

Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse images. If a strong enough match is flagged, Apple staff will be able to manually review the reported images and, if child abuse is confirmed, the user’s account will be deactivated and the National Center for Missing and Exploited Children (NCMEC) notified.

It’s a huge deal.

But it’s also worth spending some time talking about what’s not new here, as context is key to understanding where Apple is innovating – and where it’s actually catching up.

The first thing to note is that the basic idea of scanning is not new at all. Facebook, Google and Microsoft, to name just three, do almost exactly this to every image uploaded to their servers. The technology is slightly different (a Microsoft tool called PhotoDNA is typically used), but the idea is the same: compare uploaded images against a large database of previously identified child abuse images and, if there is a match, block the upload, report the account and call the police.

The scale is astronomical and deeply depressing. In 2018, Facebook alone was detecting around 17 million uploads each month against a database of approximately 700,000 images.

These scanning tools are by no means “smart”. They are designed to recognize only images that have already been found and cataloged, with a little leeway to match simple transformations such as cropping or color changes. They won’t flag pictures of your kids in the bath, any more than typing “brucewayne” will get you into an account whose password is “batman”.
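To make that concrete, here is a minimal, hypothetical sketch of the idea in Python. It uses a simple “average hash” as a stand-in for the proprietary perceptual hashes that the real systems use (Microsoft’s PhotoDNA, Apple’s neuralMatch), and an invented review threshold that Apple has not published; Apple’s actual pipeline also wraps the comparison in cryptography so that individual matches are not visible to anyone until a threshold is crossed. The point is only to show why such a tool tolerates a color shift or recompression but cannot “recognize” a photo it has never been given.

```python
# A toy illustration of hash-based image matching, in the spirit of
# PhotoDNA and neuralMatch. The real systems use proprietary perceptual
# hashes and, in Apple's case, on-device blinded matching; this "average
# hash" is only a stand-in to show the shape of the idea: known images
# are reduced to compact fingerprints, and new images are compared
# against that list with a little tolerance for minor alterations.
from PIL import Image  # pip install pillow


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: shrink it to 8x8
    greyscale, then record which pixels are brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def count_matches(photo_paths, known_hashes, max_distance=5) -> int:
    """Count photos whose fingerprint is close to one in the database of
    known images. Recompression or small color changes survive; an
    unrelated photo of your kids in the bath lands nowhere near."""
    return sum(
        1
        for path in photo_paths
        if any(hamming(average_hash(path), h) <= max_distance
               for h in known_hashes)
    )


# Mirroring the pipeline described above: only when the number of matches
# crosses a threshold is anything surfaced for human review and,
# ultimately, a report to NCMEC.
REVIEW_THRESHOLD = 30  # purely illustrative; not a published Apple figure


def should_flag_for_review(photo_paths, known_hashes) -> bool:
    return count_matches(photo_paths, known_hashes) >= REVIEW_THRESHOLD
```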

However, Apple is taking a big step into the unknown, because its version of this approach will scan photos on users’ own hardware, a first for any major platform, rather than waiting for them to be uploaded to the company’s servers.

This is what sparked the outrage, for a number of reasons. Almost all of them focus on the fact that the program crosses a rubicon, rather than objecting to the specifics of the system itself.

By normalizing on-device scanning for CSAM, critics worry, Apple has taken a dangerous step. From there, they argue, it is simply a matter of degree until our digital lives are surveilled, online and offline. It is a small step in one direction to extend scanning beyond CSAM; a small step in another to extend it beyond simple photo libraries; a small step in yet another to expand it beyond exact matches of known images.

Apple is adamant that it will not take those steps. “Apple will refuse any such demands” to extend the service beyond CSAM, the company said. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands.”

It had better get used to fighting, because such requests are very likely to come. In the UK, for example, a website blocklist maintained by the Internet Watch Foundation, the UK counterpart of the US’s NCMEC, blocks access to known CSAM. But in 2014 a high court injunction forced internet service providers to add a new set of URLs to the list: sites that infringed the trademarks of the luxury watchmaker Cartier.

Elsewhere, there are security concerns about the practice. Any system that takes actions the owner of a device has not consented to could, critics fear, ultimately be used to harm them. Whether through a conventional security vulnerability, a repurposing of the system to hack phones, or a subtler misuse of the scanning mechanism to cause harm directly, they fear the system opens up a new “attack surface”, with little advantage over simply scanning on Apple’s own servers.

This is the strangest thing about the news: Apple will only scan material that is about to be uploaded to its iCloud Photo Library service. If the company simply waited until the files had been uploaded, it could scan them without crossing any of these dangerous lines. Instead, it took this unprecedented step.

The reason, says Apple, is privacy. The company, it seems, simply values the rhetorical victory: the ability to say “we never scan the files you’ve uploaded”, unlike, say, Google, which relentlessly exploits user data for any possible benefit.

Some wonder whether this is a prelude to a more aggressive step Apple might take: encrypting iCloud libraries so that it cannot scan them at all. The company reportedly abandoned a plan to do just that in 2018, after the FBI intervened.

Parental controls

The decision to scan photo libraries for CSAM was just one of two changes Apple announced on Friday. The other is, in some respects, more worrying, although its initial effects will be limited.

This fall, the company will begin scanning messages sent through the Messages app to and from users under the age of 17. Unlike the CSAM scanning, it won’t look for matches against a database of known images: instead, it will apply machine learning to try to spot explicit images. If one is sent or received, the user will receive a notification.

For teens, the warning will be a simple “are you sure?” banner, with the option to click through and ignore it; but for kids under 13 it will be rather louder, warning them that if they view the image their parents will be notified and a copy of it will be saved to their phone so their parents can check.

Both features must be turned on by parents and are off by default, and nothing sent through the feature ever reaches Apple.
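To pin down how that flow differs from the CSAM matching, here is a hypothetical sketch of the logic in Python. None of these names are Apple APIs, the threshold is invented, and a stub score stands in for the on-device classifier; the sketch only shows where the decisions sit: on the device, gated by a parental setting, with Apple never in the loop.

```python
# A hypothetical sketch of the decision flow described above, written to
# make the logic concrete; none of these names are Apple APIs. The real
# feature runs an on-device classifier inside Messages; here a stub score
# stands in for the model, and notifications are simply printed.
from dataclasses import dataclass


@dataclass
class ChildAccount:
    age: int
    feature_enabled_by_parent: bool  # off by default; a parent must opt in


def classify_explicitness(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model that scores how likely an
    image is to be sexually explicit (0.0 to 1.0)."""
    return 0.95  # dummy value for illustration only


def handle_incoming_image(user: ChildAccount, image_bytes: bytes,
                          user_views_it: bool) -> None:
    if not user.feature_enabled_by_parent:
        return  # the scanning simply does not run

    EXPLICIT_THRESHOLD = 0.9  # illustrative value, not published by Apple
    if classify_explicitness(image_bytes) < EXPLICIT_THRESHOLD:
        return  # nothing happens for ordinary images

    if user.age >= 13:
        # Teens: a dismissible "are you sure?" banner, nothing more.
        print("Warning shown: this may be sensitive. View anyway?")
    else:
        # Under-13s: a louder warning, and if they view the image their
        # parents are notified and a copy is kept on the device for the
        # parents to check. Apple itself is never contacted.
        print("Warning shown: if you view this, your parents will be told.")
        if user_views_it:
            print("Parents notified; copy of the image saved on this phone.")


# Example: a 10-year-old whose parent has switched the feature on.
handle_incoming_image(ChildAccount(age=10, feature_enabled_by_parent=True),
                      image_bytes=b"...", user_views_it=True)
```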

But, again, some are worried. Normalizing this type of on-device surveillance, they fear, effectively negates the protections that end-to-end encryption offers users: if your phone is spying on your messages, then the encryption is irrelevant.

Do better

It is not just activists who are making these points. Will Cathcart, the head of WhatsApp, opposed the measures, writing: “I think this is the wrong approach and a setback for the privacy of people all over the world. People have asked if we will adopt this system for WhatsApp. The answer is no.”

But at the same time, there’s growing support for Apple – and not just from child welfare groups who have been clamoring for features like this for years. Even the folks on the technical side of the discussion accept that there are real tradeoffs here, and no straightforward answers. “I find myself constantly torn between wanting everyone to have access to cryptographic privacy and the reality of the scale and depth of damage that has been made possible by modern communications technologies,” wrote Alex Stamos, formerly Facebook’s chief security officer.

Regardless of the correct answer, however, one thing seems clear: Apple could have entered this debate more carefully. The company’s plans leaked on Thursday morning, followed by a spartan announcement on Friday and a five-page FAQ on Monday. In the meantime, everyone involved in the debate had already hardened into the most extreme versions of their positions, with the Electronic Frontier Foundation calling the plan an attack on end-to-end encryption and NCMEC dismissing the “screeching voices of the minority” who opposed it.

“One of the fundamental problems with Apple’s approach is that they seem desperate to avoid building a real trust and safety function for their communications products,” Stamos added. “There is no mechanism to report spam, death threats, hate speech […] or any other kinds of abuse on iMessage.

“Either way, shipping non-consensual scanning of local photos, and creating client-side ML that won’t provide much real harm prevention, means Apple may have poisoned the well against any use of client-side classifiers to protect users.”

If you would like to read the full version of this newsletter, please sign up to receive TechScape in your inbox every Wednesday.

