West Virginia Attorney General JB McCuskey has sued Apple (AAPL), alleging that the company has failed to prevent the storage and distribution of child sexual abuse material (CSAM) on its iCloud platform. The lawsuit claims Apple has allowed iCloud to become a "secure avenue" for storing and distributing CSAM, and accuses the company of prioritizing its privacy branding and its own business interests over child safety.

"We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," an Apple spokesperson said.
The suit revives a long-running controversy. In 2021, Apple announced a set of new child safety features, including a system to detect known CSAM images stored in iCloud Photos. The plan was widely criticized over privacy concerns, and Apple ultimately abandoned the iCloud Photos CSAM detection feature.