Apple abandons plan to scan iCloud Photos for child abuse material amid privacy concerns


Apple is abandoning its controversial plan to scan users’ iCloud Photos for child sexual abuse material, or CSAM, amid ongoing privacy concerns.

Announced in August 2021, the detection tools were designed to flag illegal content while preserving user privacy. But the plan drew widespread criticism from digital rights groups, who argued that the surveillance capability was ripe for abuse.

Apple paused the rollout about a month after the announcement. Now, more than a year later, the company says it has no plans to move forward with the CSAM detection tool.

Instead, the company said it is developing features that better balance user privacy with protecting children, including settings that let parents limit their children’s contacts, content, and screen time, and a curated App Store experience for kids.

Apple says the best way to prevent child exploitation online is to stop it before it happens, pointing to the child safety features it introduced in December 2021 as part of that approach.


Those features include Communication Safety in Messages, which warns children when they send or receive photos containing nudity, as well as expanded guidance in Siri, Spotlight, and Safari Search.

The company is working to expand Communication Safety in Messages to detect nudity in videos, and it is collaborating with child safety experts to make reporting incidents to law enforcement more seamless.

Apple also announced Wednesday that it will offer end-to-end encryption for nearly all the data users store in iCloud, its global cloud-based storage system, making it harder for hackers, spies, and law enforcement agencies to access sensitive user data.

