
West Virginia Sues Apple Over iCloud CSAM Allegations


Attorney General Claims Apple’s iCloud Design & Encryption Choices Enabled Widespread Storage & Distribution Of Child Sexual Abuse Material

Thursday, February 19, 2026, 4:35 P.M. ET. 5 Minute Read. By Jennifer Hodges, Political Editor, & Art Fletcher, Executive Editor, Englebrook Independent News.

CHARLESTON, W.Va. - West Virginia Attorney General John B. McCuskey on Thursday filed a sweeping consumer-protection lawsuit against Apple Inc., alleging that the company’s iCloud platform has become one of the most significant facilitators of the possession, storage, and distribution of child sexual abuse material (CSAM) in the United States.

     The lawsuit, filed in Mason County Circuit Court, asserts that Apple knowingly designed and maintained an ecosystem that makes it easier for offenders to collect, store, and share illegal material while declining to implement widely available detection and reporting safeguards used by other major technology companies.

     The civil action, State of West Virginia ex rel. McCuskey v. Apple, Inc., Case No. CC-26-2026-C-16, was electronically filed February 19, 2026, at 8:52 a.m., and seeks injunctive relief, civil penalties, restitution, and punitive damages under West Virginia law.

Allegations Center On iCloud’s Design And Functionality;

     According to the complaint, Apple’s iCloud service is not a passive data-storage system but an integrated digital ecosystem intentionally engineered to simplify the organization, synchronization, and sharing of images and videos across devices.

     West Virginia argues that these design choices reduce “friction” for individuals who collect CSAM, enabling long-term retention, ease of access, and redistribution, all while minimizing detection.

     The filing states that iCloud’s automatic backup features, device synchronization, and content-sharing tools allow illegal material to persist indefinitely across multiple Apple devices, even when users change phones or accounts.

Internal Knowledge And Reporting Disparities;

     The Attorney General’s office alleges Apple has long been aware of the risks posed by its platform. The complaint references internal Apple communications cited in prior public reporting, including a text message attributed to Apple’s anti-fraud chief describing Apple as the “greatest platform for distributing child porn.”

     West Virginia also points to stark disparities in CSAM reporting volumes to the National Center for Missing & Exploited Children (NCMEC).

     According to figures cited in the lawsuit for 2023:

  • Apple submitted 267 reports
  • Google submitted approximately 1.47 million reports
  • Meta submitted approximately 30.6 million reports

     The state contends the discrepancy reflects Apple’s refusal to deploy detection tools, not an absence of CSAM on its platform.

Encryption And Advanced Data Protection;

     A central component of the lawsuit focuses on Apple’s use of encryption, particularly its Advanced Data Protection feature, which extends end-to-end encryption to iCloud backups and other cloud data.

     Apple began offering Advanced Data Protection broadly in December 2022 with iOS 16.2, iPadOS 16.2, and macOS 13.1.

     West Virginia argues that while encryption protects user privacy, Apple’s implementation places even known illegal material beyond the reach of law enforcement, including when valid warrants are issued.

     The complaint cites public statements attributed to federal law-enforcement officials, including the FBI, warning that expanded end-to-end encryption significantly hampers investigations into child exploitation and other serious crimes.
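     For context on the technical mechanism at issue: under end-to-end encryption of the kind Advanced Data Protection provides, content is encrypted on the user’s device with keys that never leave the user’s control, so the cloud provider stores only ciphertext it cannot read. The brief Python sketch below illustrates that general principle using the widely available cryptography library; the function names, key handling, and data are illustrative assumptions, not Apple’s actual design.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Illustrative sketch of client-side ("end-to-end") encryption.
# Hypothetical example only; it does not describe Apple's implementation.

device_key = AESGCM.generate_key(bit_length=256)  # generated and kept on the user's device

def encrypt_on_device(plaintext: bytes) -> bytes:
    # Encrypt a file before upload; only this ciphertext is synced to the cloud.
    nonce = os.urandom(12)
    return nonce + AESGCM(device_key).encrypt(nonce, plaintext, None)

def decrypt_on_device(blob: bytes) -> bytes:
    # Only a device holding device_key can recover the original content.
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(device_key).decrypt(nonce, ciphertext, None)

uploaded_blob = encrypt_on_device(b"photo bytes")
# Without device_key, the provider (or anyone serving a warrant on it) holds
# only uploaded_blob, which is computationally infeasible to decrypt.

     In that model, producing readable content in response to a warrant would require keys the provider does not possess, which is the crux of the dispute the complaint describes.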

Abandoned CSAM Detection Technology;

     The filing devotes extensive attention to Apple’s previously announced CSAM detection initiative, unveiled in August 2021.

     That system relied on NeuralHash, an on-device technology that would compare images being uploaded to iCloud against hashes of known CSAM, triggering a report to NCMEC once a predefined threshold of matches was met.
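     In rough terms, a hash-matching system of the kind the complaint describes computes a fingerprint of each image before upload, compares it against a database of fingerprints of previously identified CSAM, and escalates an account for review and reporting once the number of matches crosses a threshold. The short Python sketch below illustrates that general pattern only; it is not Apple’s NeuralHash, and the hash function, database, and threshold value are placeholder assumptions.

import hashlib

KNOWN_HASHES = {"<fingerprint of a previously identified image>"}  # hypothetical database
REPORTING_THRESHOLD = 30  # hypothetical threshold, for illustration only

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real system is designed to tolerate
    # minor edits such as resizing or recompression, which plain SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def process_upload(match_count: int, image_bytes: bytes) -> tuple[int, bool]:
    # Track matches per account; flag the account once the threshold is reached.
    if image_fingerprint(image_bytes) in KNOWN_HASHES:
        match_count += 1
    return match_count, match_count >= REPORTING_THRESHOLD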

     According to the lawsuit, Apple delayed and then ultimately abandoned the program amid public backlash and internal concerns. The state alleges Apple quietly removed references to NeuralHash from its website by December 16, 2022, shelving a detection system that could have dramatically increased reporting.

     West Virginia characterizes that decision as driven by business and brand considerations rather than child safety.

     The complaint asserts four causes of action:

  • Strict liability (design defect)
  • Negligence
  • Public nuisance
  • Violations of the West Virginia Consumer Credit and Protection Act (WVCCPA)

     Under the WVCCPA claim, the state alleges Apple engaged in unfair or deceptive practices by omitting material facts about how its products facilitate CSAM possession and by marketing privacy protections without adequately disclosing child-safety risks.

     The state seeks court-ordered changes to Apple’s platform, civil penalties, restitution, disgorgement of profits, and other equitable relief.

Statements From The Attorney General;

     In announcing the lawsuit, McCuskey described the action as unprecedented and necessary.

     “Preserving the privacy of child predators is absolutely inexcusable,” McCuskey said in a statement. “Apple’s refusal to detect and report these images is not a passive oversight; it is a choice.”

     The Attorney General’s office emphasized that the lawsuit is intended to force Apple to adopt meaningful safeguards to prevent repeated victimization by the ongoing storage and distribution of CSAM.

Apple Responds;

     Apple rejected the allegations, stating that it has implemented numerous child-safety features and continues to invest in protections across its platforms.

     In a statement reported Thursday, Apple said it is “innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”

     Apple specifically pointed to Communication Safety, a feature that intervenes on children’s devices when nudity is detected in Messages and other Apple services.

What Comes Next;

     The case will now proceed in West Virginia state court, where Apple is expected to file a response or preliminary motions challenging the claims.

    Legal observers anticipate early litigation over whether federal protections, including Section 230 of the Communications Decency Act, limit the state’s ability to pursue liability based on user-generated content.

     The case also places Apple at the center of a broader national debate over privacy, encryption, and corporate responsibility in combating child exploitation online.

Editor’s Note:

This article was written by Jennifer Hodges, Political Editor, and edited by Art Fletcher, Executive Editor, and is based on the civil complaint filed by the West Virginia Attorney General’s Office in Mason County Circuit Court, official press statements, and contemporaneous reporting that included Apple’s response. All allegations described herein are claims made in a civil lawsuit and have not been adjudicated or proven in court. Englebrook Independent News will continue to follow this case and provide updates as legal proceedings advance.

Jennifer Hodges
Jennifer Hodges is a Chief Investigative Reporter & Editor for Englebrook Media Group
