Edit Policy: Why Activism Against Apple Should Be Nonprofit

Published by: MRT



Apple has never rowed back so quickly: on August 5, the company announced that, from iOS version 15 onwards, it would equip iPhones in the USA with surveillance technology that would spy on their users. That is quite something for a company that advertises with posters such as "What happens on your iPhone stays on your iPhone". A month later, a few days before the launch of the new iPhone model, the company now says the plans will be postponed indefinitely in order to collect more feedback.

In the Edit Policy column, former MEP Julia Reda comments on developments in European and global digital policy. Her aim is to show that these developments can be changed, and to encourage political engagement.

The events at which Apple presents new products have always been meticulously choreographed. The company apparently does not want this important marketing date to be overshadowed by the outcry with which civil society has greeted the surveillance plans. The case shows like no other that critical scrutiny of corporate behavior is not only effective, but absolutely essential for modern democracies. This insight would also benefit German authorities, which are currently trying to make the democratic debate between civil society and companies more difficult.

Apple’s plans have been postponed, but they are not yet finally off the table: all photos uploaded to iCloud are to be scanned against a database of known images of child abuse. In addition, end-to-end encryption for iMessage on children’s accounts is to be lifted in order to identify possible nude images in these messages using machine learning and to automatically inform the presumed legal guardians.
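To make the mechanism concrete, here is a minimal, hypothetical sketch of what matching photos against a database of known hashes looks like in principle. It is emphatically not Apple's actual NeuralHash or CSAM-detection pipeline; the hash function, thresholds, and data are toy assumptions chosen only to illustrate the general idea described above.

```python
# Toy illustration of client-side hash matching, NOT Apple's actual system.
# Idea: compute a perceptual fingerprint of an image and compare it against
# a database of known fingerprints before upload.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is brighter
    than the image's mean brightness. `pixels` is a flat list of grayscale
    values (0-255), e.g. an 8x8 thumbnail, yielding a 64-bit fingerprint."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def should_flag(image_pixels, known_hashes, max_hamming_distance=4):
    """Flag an image if its hash is 'close enough' to any known hash.
    Perceptual hashes tolerate small changes, so matching uses a Hamming
    distance threshold rather than exact equality."""
    h = average_hash(image_pixels)
    for known in known_hashes:
        if bin(h ^ known).count("1") <= max_hamming_distance:
            return True
    return False

# Hypothetical usage: in practice the database of known hashes would come
# from a third party; here it is just an illustrative set of 64-bit integers.
known_hashes = {0xF0F0F0F0F0F0F0F0}
thumbnail = [200] * 32 + [30] * 32           # a fake 8x8 grayscale thumbnail
print(should_flag(thumbnail, known_hashes))  # False for this toy input
```

The key design point is the distance threshold: unlike a cryptographic hash, a perceptual hash is deliberately tolerant of small edits, which is also what opens the door to the manipulation attacks discussed below.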


Immediately after the announcement, over 90 organizations warned in an open letter of the dangers these plans pose to the safety of the children concerned and to the privacy of all iPhone users. Researchers followed with practical examples of how harmless-looking images can be manipulated so that they generate the same hash value as known abuse photos, which could then be sent to unsuspecting targets to bring them into the sights of law enforcement authorities. Whistleblower Edward Snowden warned that with the update, users relinquish any sovereignty over their devices, which would no longer work for them, but against them. This protest from academia and civil society has begun to take effect. After Apple gave in, civil society is now calling for the surveillance plans to be dropped completely.
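The collision problem the researchers demonstrated can be shown in miniature with the toy `average_hash()` function from the sketch above. This is not the published NeuralHash attack, only an assumed, simplified illustration of the underlying weakness: visually very different images can share the same fingerprint, so an attacker can craft a harmless-looking picture whose hash matches a database entry.

```python
# Reuses average_hash() from the earlier sketch. Two clearly different
# "images" (flat lists of grayscale values) collapse to the same 64-bit hash,
# because only the bright/dark pattern relative to the mean is recorded.

innocuous = [250] * 32 + [10] * 32   # high-contrast: bright top, dark bottom
different = [130] * 32 + [90] * 32   # much flatter image, same rough pattern

assert innocuous != different
assert average_hash(innocuous) == average_hash(different)  # identical hashes
print(hex(average_hash(innocuous)))  # both collapse to 0xffffffff00000000
```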

It is not just because of the widespread use of iPhones that the public has reacted so indignantly to Apple’s surveillance plans. Corporate decisions like these have political effects far beyond a company’s own customers. Experience from security policy teaches us that law enforcement authorities never voluntarily forego investigative methods that companies have once made technically possible. This can be seen, for example, in how harshly the British government criticizes Facebook for wanting to extend end-to-end encryption to its messenger services. If Facebook’s Messenger had been encrypted from the start, the company would almost certainly face less criticism from politicians today.

Company decisions influence what is considered politically feasible and are often the first step towards changing the law. After a political debate broke out in Brussels about whether Facebook’s voluntary measures to combat child abuse and grooming are admissible under data protection law, the EU Commission not only launched a legislative proposal to legalize these practices, it is also planning another draft law that could make such monitoring measures mandatory for numerous platforms.

Even the EU copyright reform would never have relied on upload filters if YouTube had not previously developed such a technology voluntarily, thereby whetting the appetite of the entertainment industry. Likewise, Apple has to put up with the accusation that if it voluntarily scans its customers’ devices for abuse images, the company could at any time be forced by a change in the law to extend this monitoring to any other content. Digital civil society knows: if it wants to combat restrictions on fundamental rights effectively, it is not enough to focus on the state. The behavior of large companies must also be watched and, if necessary, criticized loudly.

