Device-side scanning of iCloud photos raises privacy concerns

SappChat (https://sappchat.com/)
2 min read · Oct 20, 2021

Apple’s new Child Sexual Abuse Material (CSAM) detection system will ship in an iOS 15 update. The system, built around a hashing technology called “NeuralHash,” was revealed in a series of tweets by Matthew Green, a cryptography professor at Johns Hopkins University. It will roll out in the United States first, with no target date yet announced for other countries; automated child-abuse detection systems face legal restrictions in some parts of the world, such as the European Union.
It appears that the system will scan images on the device before they are uploaded to iCloud Photos.
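To make that idea concrete, here is a minimal, purely illustrative sketch in Python of device-side matching against a database of known fingerprints. It substitutes a toy average hash for Apple’s undisclosed NeuralHash, and the function names, Hamming-distance cutoff, and database format are assumptions for illustration only.

```python
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash (aHash): downscale, grayscale, threshold at the mean.
    This stands in for NeuralHash, whose details Apple has not published."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def matches_known_database(image_path: str, known_hashes: set[int],
                           max_hamming: int = 5) -> bool:
    """Flag an image if its fingerprint is close to any entry in the known
    database -- a stand-in for the device-side comparison described above."""
    h = average_hash(image_path)
    return any(bin(h ^ known).count("1") <= max_hamming for known in known_hashes)
```

In Apple’s actual design the hash database is blinded so the device cannot read it, and the match result is sealed into an encrypted “safety voucher” rather than evaluated in the clear as this sketch does.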
Though Apple claims that its new iCloud Photos system is more private and secure for the end user than any comparable alternative, the announcement nonetheless set off alarms about mass surveillance. Because the scanning takes place on the device, rumors quickly spread that Apple intended to passively scan every file on the device at all times.
Another concern is that the CSAM system allows Apple to decrypt content from a flagged iCloud account. The company says an account will not be flagged unless it crosses a certain (undisclosed) threshold of matching files among its iCloud photos; only once that threshold is reached does the flagged content become decryptable for review.
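The gist of that threshold behavior can be sketched as follows. This is a simplified model, not Apple’s implementation: the real system enforces the threshold cryptographically with threshold secret sharing, whereas this sketch uses a plain counter, and the threshold value shown is a made-up placeholder.

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # placeholder; Apple has not disclosed the real number

@dataclass
class FlaggedAccountModel:
    """Each matching photo contributes a 'safety voucher'. Nothing is
    readable below the threshold; at or above it, the vouchers for the
    matching photos (only) can be opened for review."""
    vouchers: list[bytes] = field(default_factory=list)

    def record_match(self, voucher: bytes) -> None:
        self.vouchers.append(voucher)

    def flagged(self) -> bool:
        return len(self.vouchers) >= MATCH_THRESHOLD

    def reviewable_vouchers(self) -> list[bytes]:
        # Below the threshold the vouchers stay sealed (cryptographically,
        # in the real design); here we simply withhold them.
        return list(self.vouchers) if self.flagged() else []
```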
However, SappChat can promise that your data will stay encrypted and no third party will be involved!
Choose privacy and choose SappChat!


SappChat (https://sappchat.com/)

A decentralized messaging app meets decentralized banking, powered by blockchain and AI. It connects everyone personally, financially, privately, and securely.