* West Virginia says Apple prioritizing privacy over child safety
* Apple shelved scanning plan on privacy concern, Reuters reported
* Apple filed far fewer abuse reports than Google, Meta in 2023
(Adds statement from Apple in paragraphs 3 and 4)
By Nate Raymond and Stephen Nellis
Feb 19 (Reuters) - West Virginia's attorney general filed a lawsuit on Thursday accusing Apple (AAPL) of allowing its iCloud service to become what the company's internal communications described as the "greatest platform for distributing child porn."
Attorney General JB McCuskey, a Republican, accused Apple of prioritizing user privacy over child safety. His office called the case the first of its kind by a government agency over the distribution of child sexual abuse material on Apple's data storage platform.
"These images are a permanent record of a child's
trauma, and that child is revictimized every time the material
is shared or viewed," McCuskey said in a statement. "This conduct is despicable, and Apple's inaction is inexcusable."
Apple in a statement said it has implemented features
that prevent children from uploading or receiving nude images
and was "innovating every day to combat ever-evolving threats
and maintain the safest, most trusted platform for kids."
"All of our industry-leading parental controls and
features, like Communication Safety - which automatically
intervenes on kids' devices when nudity is detected in Messages,
shared Photos, AirDrop and even live FaceTime calls - are
designed with the safety, security, and privacy of our users at
their core," Apple ( AAPL ) said.
The company had considered scanning images but abandoned the approach over concerns about user privacy and safety, including worries that it could be exploited by governments looking for other material for censorship or arrest, Reuters has reported.
McCuskey's office cited a text message Apple's then-anti-fraud chief sent in 2020 stating that because of Apple's priorities, it was "the greatest platform for distributing child porn."
His office filed the lawsuit in Mason County Circuit Court. The lawsuit seeks statutory and punitive damages and asks a judge to force Apple to implement more effective measures to detect abusive material and to adopt safer product designs.
Alphabet's Google, Microsoft (MSFT) and other platform providers check uploaded photos or emailed attachments against a database of identifiers of known child sex abuse material provided by the National Center for Missing and Exploited Children and other clearinghouses.
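For illustration, the check described above amounts to comparing an identifier (a hash) of each uploaded file against a list of identifiers of known material. The sketch below is a minimal, hypothetical Python version of that idea; the function names and hash values are invented, and production systems rely on perceptual hashing tools such as Microsoft's PhotoDNA supplied by clearinghouses, not a plain cryptographic digest.

```python
import hashlib

# Hypothetical set of hex digests distributed by a clearinghouse.
# Real systems use perceptual hashes (e.g., PhotoDNA), which match
# visually similar images, not just byte-identical files.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_upload(data: bytes) -> str:
    """Return a hex digest identifying the uploaded file."""
    return hashlib.sha256(data).hexdigest()

def should_flag(data: bytes) -> bool:
    """Flag an upload whose identifier matches a known-abuse entry."""
    return hash_upload(data) in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    sample = b"example upload bytes"
    print("flagged" if should_flag(sample) else "clean")
```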
Until 2022, Apple took a different approach. It did not scan
all files uploaded to its iCloud storage offerings, and the data
was not end-to-end encrypted, meaning law enforcement officials
could access it with a warrant.
Reuters in 2020 reported that Apple planned end-to-end
encryption for iCloud, which would have put data into a form
unusable by law enforcement officials. It abandoned the plan
after the FBI complained it would harm investigations.
In August 2021, Apple announced NeuralHash, which it
designed to balance the detection of child abuse material with
privacy by scanning images on users' devices before upload.
The system was criticized by security researchers who
worried it could yield false reports of abuse material, and it
sparked a backlash from privacy advocates who claimed it could
be expanded to permit government surveillance.
A month later, Apple delayed the introduction of NeuralHash before canceling it in December 2022, the state said in its statement. That same month, Apple launched an option for
end-to-end encryption for iCloud data.
The state said NeuralHash was inferior to other tools and
could be easily evaded. It said Apple stores and synchronizes
data through iCloud without proactive abuse material detection,
allowing such images to circulate.
While Apple did not go through with the effort to scan
images being uploaded to iCloud, it did implement a feature
called Communication Safety that blurs nudity and other
sensitive content being sent to or from a child's device.
Federal law requires U.S.-based technology companies to
report abuse material to the National Center for Missing and
Exploited Children. Apple in 2023 made 267 reports, compared with 1.47 million by Google and 30.6 million by Meta Platforms (META), the state said.
The state's claims mirror allegations in a proposed class
action lawsuit filed against Apple in late 2024 in federal court
in California by individuals depicted in such images.
Apple has moved to dismiss that lawsuit, saying the firm is
shielded from liability under Section 230 of the Communications
Decency Act, a law that provides broad protections to internet
companies from lawsuits over content generated by users.