CSAM scanning: British Home Secretary calls on Apple to implement it

Published by: MRT

Apple’s plans to introduce a widely criticized photo-scanning feature on iPhones have the backing of the UK Home Office. IT companies and social network operators must not allow “harmful content” to spread on their platforms, British Home Secretary Priti Patel said.

End-to-end encryption in particular poses “a major challenge for public safety”, she argued: it leaves service providers as well as law enforcement “blind”, and thereby enables crimes such as the sexual abuse of children.

For this “technical problem”, a “technical solution” is being sought, such as the one Apple recently presented with the child protection functions originally announced for iOS 15, Patel writes in an opinion piece for the newspaper The Telegraph. Apple’s filter technology for abuse images has a very low false-positive rate and thus protects “the privacy of law-abiding users” while still catching offenders who hold large collections of abuse material (Child Sexual Abuse Material, CSAM). Apple should see the project through, Patel argues.

End-to-end encryption must not open the door to child abuse any further, the Home Secretary argues, also with reference to Facebook’s plans to secure communication via Facebook Messenger more strongly. “Exaggerated allegations from certain corners” that governments are only interested in spying on innocent citizens are “simply not true”.

A “Safety Tech Challenge Fund” worth the equivalent of almost 100,000 euros is intended to promote the development of further technology that protects children on platforms with end-to-end encryption, the Home Secretary said. The British Home Office has long been calling for back doors in encrypted messengers, and Patel’s predecessor vehemently demanded access to encrypted messages for law enforcement agencies.

Apple originally announced that it would integrate a nude-image filter into iMessage (with a parental notification function) and scan iCloud photos for known abuse material, with the detection taking place locally on the device. After considerable criticism from various sides, Apple postponed the implementation of the project; the company first wants to gather input and improve the system. Civil rights organizations such as the EFF continue to demand that the company scrap the planned functions entirely.
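The mechanism Apple described pairs an on-device fingerprint of each photo with a list of hashes of known abuse material, and only flags an account once a certain number of matches has accumulated. The following Swift sketch illustrates that basic idea only, under stated assumptions: the hash function, database structure, threshold value and all identifiers are invented for illustration and are not Apple’s actual NeuralHash or private-set-intersection protocol.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's published design uses a perceptual
// "NeuralHash", an encrypted hash database and a private-set-intersection
// protocol; none of that is reproduced here. All names and values below
// are assumptions for demonstration.

struct KnownHashDatabase {
    // Hypothetical fingerprints of known abuse material, e.g. provided
    // by child-protection organizations.
    let knownHashes: Set<Data>

    func contains(_ fingerprint: Data) -> Bool {
        knownHashes.contains(fingerprint)
    }
}

struct OnDeviceMatcher {
    let database: KnownHashDatabase
    // Nothing is reported until this many photos match (Apple spoke of
    // a threshold on the order of 30 matches).
    let reportingThreshold: Int

    // Stand-in fingerprint: a SHA-256 of the raw image bytes. A real
    // perceptual hash would also match resized or re-encoded copies.
    func fingerprint(of imageData: Data) -> Data {
        Data(SHA256.hash(data: imageData))
    }

    // Counts matches against the known-hash list and only signals once
    // the threshold is reached, so isolated false positives do not
    // trigger a report on their own.
    func shouldFlag(photos: [Data]) -> Bool {
        let matchCount = photos
            .map { fingerprint(of: $0) }
            .filter { database.contains($0) }
            .count
        return matchCount >= reportingThreshold
    }
}

// Hypothetical usage:
// let db = KnownHashDatabase(knownHashes: loadedHashSet)
// let matcher = OnDeviceMatcher(database: db, reportingThreshold: 30)
// let flag = matcher.shouldFlag(photos: userPhotoData)
```

In Apple’s published design, by contrast, individual match results are cryptographically blinded so that neither the device nor Apple learns about matches below the threshold; the sketch above omits that part entirely.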


(lbe)
