Apple Inc. is facing a class action lawsuit accusing the company of failing to prevent the upload and distribution of child sexual abuse material (CSAM) on its iCloud service. The lawsuit, filed by a minor identified as Jane Doe, claims that Apple has engaged in “privacy washing”—promoting a public image of strong privacy protection while allegedly neglecting to implement effective safeguards against the transmission of harmful content.
According to the lawsuit, Apple is fully aware that iCloud has become a significant platform for the distribution of child pornography. The plaintiff alleges that despite this knowledge, Apple has chosen not to adopt industry-standard practices for detecting and removing CSAM from its services. Instead, the company is accused of shifting the responsibility and costs of ensuring a safe online environment onto children and their families.
The lawsuit criticizes Apple for failing to live up to its stated privacy policies. “Apple’s privacy policy claims that it collects users’ private information to prevent the spread of CSAM, but it fails to do so in practice—a failure that Apple was already aware of,” the lawsuit states. On this basis, the complaint accuses Apple of misrepresentation, unjust enrichment, and violations of federal sex trafficking laws, California’s Business and Professions Code, and North Carolina’s consumer protection laws.
Jane Doe, through her guardian, seeks to represent a nationwide class of individuals who have been victims of child sexual abuse due to the transmission of CSAM on iCloud over the past three years. The lawsuit demands a jury trial and calls for declaratory and injunctive relief, as well as compensatory and punitive damages for all class members.
This legal action against Apple adds to a growing list of lawsuits the company is facing. Earlier this year, two antitrust class actions were filed against Apple, accusing it of monopolizing the smartphone market with its iPhone and engaging in anticompetitive practices.
As the case progresses, it raises significant questions about the responsibility of tech giants like Apple to protect their users, particularly vulnerable groups such as children, from harmful online content. The outcome of this lawsuit could have far-reaching implications for how companies handle privacy and security in the digital age.
The plaintiff in this case is represented by Juyoun Han and Eric Baum of Eisenberg & Baum LLP, and John K. Buche and Byron Ma of The Buche Law Firm PC. The case, Doe, et al. v. Apple Inc., is currently being heard in the U.S. District Court for the Northern District of California under Case No. 5:24-cv-05107.
If you have been a victim of child sexual abuse as a result of CSAM being uploaded to iCloud, you are encouraged to share your experience in the comments.
Italy (AGCOM): Mandatory age checks on adult sites start Nov 12
Italy’s communications regulator, AGCOM, will enforce mandatory age verification for pornography websites starting November 12, 2025. The system is designed to block access by minors and relies on certified third parties (such as banks or mobile operators) to confirm whether a visitor is 18+. After verification, the third party issues an access code that lets the user proceed to the site.
AGCOM describes a “double anonymity” model: adult sites receive only an “of-age” confirmation and never the user’s identity, while verifiers do not see which website the person is trying to access. According to the rules, the check is required on every visit, not just once.
An initial enforcement list covers around 50 services, including major platforms that host or distribute pornographic content in Italy. Sites found non-compliant can face penalties of up to €250,000.
What changes in practice
Start date: November 12, 2025.
Who verifies: Certified third parties that already hold user identity data.
What sites see: Only that a user is of age, not who they are.
Frequency: Verification is required each time a covered site is accessed.
Enforcement: Fines up to €250,000 for failures to comply.
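The “double anonymity” flow described above can be sketched in code. The sketch below is an illustration, not AGCOM’s actual specification: it assumes the certified verifier issues a short-lived, identity-free access code that the adult site can validate. All function names and the keyed-hash scheme are hypothetical; a real deployment would likely use asymmetric signatures so sites cannot mint their own codes.

```python
import hmac
import hashlib
import secrets
import time

# Hypothetical shared key between the certified verifier and sites,
# distributed via a trust framework (assumption for this sketch).
VERIFIER_KEY = secrets.token_bytes(32)

def issue_access_code(user_is_adult: bool):
    """Verifier side: the verifier knows the user's identity and age,
    but the code it issues carries no identity and the verifier never
    learns which site the code will be presented to."""
    if not user_is_adult:
        return None
    nonce = secrets.token_hex(8)
    expiry = str(int(time.time()) + 300)   # code valid for 5 minutes
    payload = f"{nonce}.{expiry}"
    tag = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{tag}"              # only proves "of age", nothing else

def site_accepts(code) -> bool:
    """Site side: the site checks the code is genuine and unexpired,
    but never sees who the user is."""
    if code is None:
        return False
    nonce, expiry, tag = code.split(".")
    expected = hmac.new(VERIFIER_KEY, f"{nonce}.{expiry}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and int(expiry) > time.time()
```

The short expiry mirrors the rule that verification is required on each visit rather than granted once.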
Italy’s move aligns with broader European efforts to implement age-assurance on adult content. Platforms operating in the country are expected to finalize integrations with certified providers and update user flows to meet the deadline, while users should anticipate an extra verification step before entering affected sites.
Discord: ID photos of 70,000 users may have been exposed via third-party breach
Discord says official ID photos and other data tied to about 70,000 users may have been exposed after a cyber-attack on an external provider used for age verification and customer support. The company, which reports more than 200 million users globally, said on 9 October 2025 that its own platform was not breached and that access for the affected vendor has been revoked.
According to Discord, the leaked information could include personal details, ID images submitted for age checks, partial credit-card data, and messages exchanged with customer support agents. The company added that no full card numbers, account passwords, or messages beyond support conversations were involved. Impacted users have been notified, and the firm says it is cooperating with law-enforcement authorities.
Discord did not name the third-party provider. A representative from Zendesk, which provides customer-service software to Discord, told the BBC its systems were not compromised and that the incident was not caused by a Zendesk vulnerability. Discord also rejected online claims that the breach was larger than stated, calling them inaccurate and “part of an attempt to extort payment,” and clarified that the incident was not a ransomware attack: “We will not reward those responsible for their illegal actions,” a spokesperson said.
The incident underscores why attackers target high-value personal data—such as full names and government-issued identifiers—that tend to remain constant over time and are useful in scams. Discord has tightened age-verification practices in recent years amid concerns about the distribution of prohibited content on some servers and says it continues to invest in safety and verification controls.
Valve Deckard: What It Could Mean for VR Adult Content
The Deckard is an upcoming VR headset from Valve, expected to launch in the next few months. If current leaks hold, it could be a major upgrade for immersive adult viewing.
Launch timeline. Chinese analyst group XR Research Institute suggests Deckard is targeting the holiday season, with projected annual production of 400k–600k units, comparable to early Vision Pro volumes.
Pricing. Expectations point to a premium ($1,000+) price tier paired with high-end performance.
Why it matters for VR erotica (platform-agnostic):
Display tech. High-resolution OLED/LCD panels with strong contrast and color should elevate skin tones, low-light scenes, and fine detail.
Input & tracking. Newly referenced “Roy” touch-style controllers in SteamVR code hint at better ergonomics and precision—useful for interactive experiences.
Deckard features (per code dives/leaks):
Standalone + PCVR hybrid. Emphasis on wireless PC streaming for 6K/8K playback without tether drag, alongside native PCVR.
Comfort & design. Ergonomic improvements aim at longer, more comfortable sessions.
App compatibility. Popular VR video apps (e.g., PCVR players and standalone viewers) are expected to work seamlessly with Deckard, based on typical SteamVR support patterns and developer indications.
Bottom line: if Valve delivers on display quality, wireless PCVR, and ergonomics, Deckard could become a flagship device for high-bitrate adult VR—without locking users to any single platform.