CSAM detection on the iPhone: Apple considers local child pornography scanners to be safe

Published by: MRT


Apple considers its controversial local CSAM scanner, which will become part of the iPhone and iPad with iOS 15 and iPadOS 15, to be fundamentally secure. Over the weekend, the company published a "Security Threat Model Review" that discusses possible attack scenarios against the system.

In the 14-page document, the company assures, among other things, that the system, which is highly controversial among security experts, is designed so that a user does not have to trust "Apple or any other single entity" for it to work as described. Not even "any number of possibly conspiring entities" from the same sovereign jurisdiction ("under the control of the same government") could subvert it, the company claims.

This is ensured by "various interlocking mechanisms". Among them is the fact that Apple ships a single, uniform software update worldwide to all devices, which makes its behavior "intrinsically verifiable". iOS and iPadOS remain largely proprietary, however, and Apple currently does not plan a public code review of the CSAM scanner ("Child Sexual Abuse Material"). One possible avenue is the examination of specially prepared, rooted iPhones that the company issues to selected security researchers as part of its "Apple Security Research Device Program".

Furthermore, Apple emphasizes that it never uses hash lists of abuse material from a single organization alone, but only "overlapping hashes from two or more child protection organizations". This is intended to prevent individual governments from smuggling non-CSAM content into the system. In addition, external auditors are to review the system. Finally, Apple plans to publish a knowledge base article on its website containing the root hash of the current encrypted CSAM hash database, which will ship with every future operating system update. Users could compare this root hash to verify that the database on their device really comes from Apple.
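The two safeguards described above can be illustrated with a short sketch. Apple has not published the exact construction of its database or root hash (the real system uses NeuralHash image fingerprints and a blinded, encrypted database), so the function names, the intersection step, and the SHA-256-based root computation below are purely illustrative stand-ins, not Apple's actual protocol:

```python
import hashlib

def intersect_hashes(org_a, org_b):
    """Only entries reported by BOTH organizations enter the database --
    this models the 'overlapping hashes from two or more child
    protection organizations' requirement."""
    return sorted(set(org_a) & set(org_b))

def root_hash(entries):
    """Illustrative root hash: SHA-256 over the sorted, concatenated
    entries. (A stand-in for whatever construction Apple actually uses.)"""
    h = hashlib.sha256()
    for entry in entries:
        h.update(bytes.fromhex(entry))
    return h.hexdigest()

# Hypothetical example hash values (hex strings, not real fingerprints)
org_a = ["aa" * 32, "bb" * 32, "cc" * 32]
org_b = ["bb" * 32, "cc" * 32, "dd" * 32]

# Entries submitted by only one organization are dropped
database = intersect_hashes(org_a, org_b)

# Apple would publish this value in its knowledge base article ...
published_root = root_hash(database)

# ... and a user recomputes the root over the database shipped on the
# device; a match shows the database is the one Apple published.
assert root_hash(database) == published_root
```

The point of the intersection step is that no single organization (or a government pressuring it) can unilaterally insert an entry; the point of the published root is that any tampering with the shipped database changes the recomputed value and becomes detectable.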

The problem with all of these safeguards remains that Apple depends on market access granted by individual governments. An analysis by the digital civil rights organization EFF emphasizes this as well: the company has opened a door for surveillance and censorship, and once such a door exists, it will be used ("If You Build It, They Will Come"). Well-known security researcher Katie Moussouris meanwhile argued on Twitter that Apple cannot simply refuse to comply with legal requirements in a given country, because doing so would cost it access to that entire market: "They will not operate illegally in a country and behave as a corporation like a sovereign nation." So far, Apple steadfastly maintains that it will oppose government orders to use the local CSAM scanner for censorship and surveillance.


(bsc)
