Resolution Details

Company: Apple Computer, Inc.
Year: 2024
Issue Area: Human Rights & Worker Rights
Focus Area: Children, Human Rights Due Diligence, Human Trafficking/Exploitation, Risk Management, Sex Trafficking, Transparency
Status: Withdrawn for Agreement

Resolution Text

Whereas: Online sexual exploitation of children poses material business risks to Information, Communication and Technology (ICT) companies and investors. In addition to reputational and legal risks, emerging legislation, including the United States’ STOP CSAM Act and Kids Online Safety Act, the European Union’s Digital Services Act, the United Kingdom’s Online Safety Bill, and Australia’s Online Safety Act, aims to hold tech companies responsible for keeping children safe online and imposes penalties that create financial risk for companies that fail to adequately address the problem.

Each year, millions of images and videos of child sexual abuse material (CSAM) circulate online, with reports having increased 15,000 percent over the last 15 years.[i] In 2022, the National Center for Missing and Exploited Children (NCMEC) received 31 million reports of alleged CSAM.[ii] NCMEC noted that prepubescent children are at the greatest risk of being depicted in CSAM.[iii] Artificial intelligence is now being used to produce CSAM, magnify existing sextortion schemes, and target potential victims at previously unseen rates.[iv]

Apple is the world’s most valuable company and a major influence in the ICT sector, with over 1.65 billion devices in active use. Its consumer electronics, software, operating systems, and platforms for music, film, and internet portals are accessed by hundreds of millions of young people every day.

Apple does not proactively attempt to detect CSAM stored in its iCloud services, despite widely available PhotoDNA detection technology used by other major tech firms, including Facebook,[v] Google,[vi] Adobe,[vii] Reddit,[viii] Discord,[ix] and Verizon.[x] Nor does Apple attempt to detect when its products and services are used to live-stream child sexual abuse.[xi] Former Apple executive Eric Friedman stated that, due to the company’s privacy protections, Apple is the “greatest platform for distributing child porn.”[xii]
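PhotoDNA-style detection reduces each stored image to a hash and checks it against a database of hashes of already-identified abuse material, so only hashes, never the images themselves, are compared. The following is a minimal sketch of that matching step in Python; PhotoDNA’s actual hash is proprietary, so an ordinary cryptographic hash stands in for it here, and the function and data names are illustrative:

    import hashlib
    from typing import Iterable

    def image_hash(image_bytes: bytes) -> str:
        # Illustrative stand-in: a real deployment uses a robust perceptual
        # hash (PhotoDNA itself is proprietary), because an exact hash like
        # SHA-256 misses copies that were resized, cropped, or re-encoded.
        return hashlib.sha256(image_bytes).hexdigest()

    def scan_library(images: Iterable[bytes], known_hashes: set[str]) -> list[int]:
        # Flag images whose hashes appear in the known-material database.
        return [i for i, img in enumerate(images) if image_hash(img) in known_hashes]

    # Hypothetical usage: in practice the hash list comes from a clearinghouse
    # such as NCMEC, and matches are reported rather than printed.
    known = {image_hash(b"known-image-bytes")}
    print(scan_library([b"benign-photo", b"known-image-bytes"], known))  # [1]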

Apple has developed “communication safety” tools to warn users about the dangers of sexual exploitation. Apple does not disclose data regarding the effectiveness of these tools in preventing the exploitation of children, claiming that doing so could raise privacy concerns. However, this information is financially material and would shed light on risks to investors.

The Tech Coalition, on whose board Apple sits, emphasizes the importance of transparency in addressing CSAM. ICT peers, including Meta,[xiii] Amazon/Twitch,[xiv] AT&T,[xv] and Verizon,[xvi] have reported results from human rights and child rights impact assessments to understand and address risks to children across their business units. Apple, however, discloses little information on how it assesses the risk of its products facilitating child sexual exploitation, leaving investors in the dark.

RESOLVED: Shareholders request that Apple publish a report by March 2025, prepared at reasonable expense and excluding proprietary information, assessing the risks of its products and services being used to facilitate online sexual exploitation of children, including metrics on the effectiveness of Apple’s efforts, such as the amount of CSAM transmission prevented annually.

[i] https://www.thorn.org/

[ii] https://www.missingkids.org/ourwork/impact

[iii] https://www.missingkids.org/theissues/csam

[iv] https://cyber.fsi.stanford.edu/io/news/ml-csam-report#:~:text=June%2024%2C%202023-,New%20report%20finds%20generative%20machine%20learning%20exacerbates%20online%20sexual%20exploitation,is%20facilitating%20child%20sexual%20exploitation.

[v] https://about.fb.com/news/2020/06/fighting-child-exploitation-online/#:~:text=We%20have%20also%20taken%20steps,housed%20elsewhere%20on%20the%20internet.

[vi] https://blog.google/technology/safety-security/how-we-detect-remove-and-report-child-sexual-abuse-material/

[vii] https://www.adobe.com/legal/lawenforcementrequests/childsafety.html#:~:text=We%20utilize%20scanning%20technologies%20such,databases%20of%20known%20CSAM%20hashes.

[viii] https://www.reddit.com/r/RedditEng/comments/13bvo5b/reddits_p0_media_safety_detection/?rdt=56222#:~:text=Since%202016%2C%20Reddit%20has%20used,image%20uploaded%20to%20our%20platform.

[ix] https://discord.com/safety/360043700632-discords-commitment-to-a-safe-and-trusted-experience#:~:text=For%20example%2C%20we%20use%20PhotoDNA,our%20policies%20in%20their%20communities.

[x] https://www.verizon.com/about/our-company/company-policies/verizons-efforts-combat-online-child-exploitation-faqs

[xi] https://www.esafety.gov.au/newsroom/media-releases/world-first-report-shows-leading-tech-companies-are-not-doing-enough-tackle-online-child-abuse

[xii] https://www.forbes.com/sites/johnkoetsier/2021/08/19/apple-exec-we-are-the-greatest-platform-for-distributing-child-porn/?sh=32a7217b3c20; https://s3.documentcloud.org/documents/21044004/2020-february-fear-friedman-admits-in-feb-2020-that-app-store-greatest-platform-for-child-porn-predator-grooming.pdf

[xiii] https://www.bsr.org/reports/bsr-meta-human-rights-impact-assessment-e2ee-report.pdf

[xiv] https://www.bsr.org/reports/BSR-Twitch-Human-Rights-Impact-Assessment-Report_2.pdf

[xv] https://sustainability.att.com/priority-topics/human-rights

[xvi] https://www.verizon.com/about/sites/default/files/CRIA-Executive-Summary-June-2022.pdf