Child safety advocates disrupt Apple developers conference

CUPERTINO — Around 35 protesters gathered at Apple headquarters Monday morning during the company’s annual global developers conference demanding the tech giant add a system to remove child sexual abuse content on iCloud — a venture Apple previously abandoned due to concerns over user privacy.

iCloud is a storage service that allows users to store and sync data across their devices, keeping information including photos, files, backups and passwords secure. The protesters — comprised mostly of child safety experts, advocates and survivors of childhood sexual abuse — say the service allows perpetrators and abusers to confidently store and share child exploitation materials without getting caught by authorities.

Apple had spent years attempting to design a system that would identify and remove such content on iCloud. The company ultimately scrapped the idea in late 2023 in response to concerns from digital rights groups that a scanning system would compromise the privacy and security of all iCloud users.

Shortly after, the child advocacy group Heat Initiative began organizing a campaign to demand that the company continue to move forward with detecting and reporting such materials. According to the Intercept, Heat is backed by “dark-money donors.”

Sarah Gardner, CEO of Heat, said in a statement after the protest that the initiative is “transparent about its funders, which are made up of some of the most reputable safeguarding foundations in the world, including Oak Foundation and Children’s Investment Fund Foundation.” The group is also fiscally sponsored by Hopewell Fund.

The initiative, along with child safety groups Wired Human and the Brave Movement, organized Monday’s protest.

Monday’s protest coincided with the first day of Apple’s annual Worldwide Developers Conference, an event at which the company announces new features for its software. Gardner said Apple is leaving children’s safety behind in its conversations about new technologies, and needs to focus on protecting them.

“We don’t want to be here, but we feel like we have to,” Gardner said. “This is what it’s going to take to get people’s attention and get Apple to focus more on protecting children on their platform.”

As company officials and stakeholders passed through the Apple Park Visitor Center, child safety experts and advocates called out: “Build a future where children are protected.” Some spoke about their personal experiences with sexual abuse and voiced their desire to see more child safety measures put in place.

“We’re not asking for much,” activist Sochil Martin said as the protester’s chants echoed in the background. “Apple has everything in their hands to do it.”

Their concerns also come as national leaders urge the passage of child safety bills including the Kids Online Safety Act, which would establish guidelines to protect minors on social media platforms, including TikTok and Facebook.

Apple declined to comment on the protest, and instead provided this news organization with a 2023 letter exchange between Gardner and Erik Neuenschwander, Apple’s director of user privacy and child safety, that addressed the company’s reasoning for scrapping the system.

Neuenschwander said implementing a system would compromise the security and privacy of users, and would open the door “for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories.”

Apple introduced new features in December 2021 designed to help keep children safe, including a setting that warns children when they receive or attempt to send content containing nudity in Messages, AirDrop, FaceTime video messages and other apps.

But protester Christine Almadjian said those features are not enough to protect children or hold predators accountable for possessing sexual abuse material. Almadjian, who is part of the national End Online Sexual Exploitation and Abuse of Children Coalition, said Apple needs to continue finding ways to identify and flag such content.

“We’re trying to engage in a dialogue with Apple to implement these changes,” she said Monday. “They don’t feel like these are necessary actions.”