
Meta: Investigating Network of Pedophiles on Instagram, Linked by Algorithms

Stanford University report uncovers networks of minors promoting self-generated child sexual abuse images. Meta has started a task force to investigate how its photo-sharing app Instagram facilitates the spread and sale of child sexual abuse material.

The new effort by the Facebook parent company follows a report from the Stanford Internet Observatory, which found large networks of accounts, apparently operated by minors, openly advertising self-generated child sexual abuse material for sale.


Buyers and sellers of self-generated child sexual abuse material connected through Instagram’s direct messaging feature, and Instagram’s recommendation algorithms made advertisements for the illicit material more effective, the researchers found.

“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers,” the researchers wrote.

The findings offer more insight into how internet companies have struggled for years to find and prevent sexually explicit images that violate their rules from spreading on their networks. Experts have highlighted how intimate image abuse, or so-called revenge porn, rose sharply during the pandemic, prompting tech companies, porn sites, and civil society groups to bolster their moderation tools.

The Stanford researchers said the seller network comprises between 500 and 1,000 accounts at any given time. They said they began their investigation following a tip from the Wall Street Journal, which first reported on the findings.

Meta said it has strict policies and technology to prevent predators from finding and interacting with teens. In addition to the task force, the company said it had dismantled 27 abusive networks between 2020 and 2022, and in January, disabled more than 490,000 accounts for violating its child safety policies.

“Child exploitation is a horrific crime,” Meta spokesman Andy Stone said in a statement. “We work aggressively to fight it on and off our platforms and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”

While Instagram is a central player in facilitating the spread and sale of child-sexualized imagery, other tech platforms also play a role, the report found. Accounts promoting self-generated child sexual abuse material were also prevalent on Twitter, for instance, although that platform appears to be taking them down more aggressively.

Some of the Instagram accounts also advertised links to groups on Telegram and Discord, some of which appeared to be managed by individual sellers, the report found.


Apple rolls out UK age verification with iOS 26.4 after Meta and Google child safety fines

Apple has introduced age verification for iPhone and iPad users in the UK with iOS 26.4 and iPadOS 26.4, adding a new layer of checks for accounts that require confirmation that the user is 18 or older.


According to reports, UK users may now be asked to verify their age by adding a credit card or scanning an ID, unless Apple has already confirmed that information. Apple says the process is required by law in some countries and regions for actions tied to an Apple Account, including downloading apps, changing certain settings, or accessing specific features. When verification is needed, a prompt appears in the Settings menu.

The rollout comes at a time when child safety rules are tightening across the UK. While current UK law does not specifically require device-level age verification, adult websites, including pornography platforms, are already expected to carry out age checks. That has led to wider discussion about whether verification should also happen at the device level, rather than only on individual sites.

The timing is especially notable because it follows a major child safety case involving Meta and Google. The companies were reportedly ordered to pay $6 million after a lawsuit in Los Angeles claimed that platforms including Facebook, WhatsApp, and YouTube had a serious impact on a young woman’s mental health.

Apple’s move may also reflect broader regulatory pressure. The UK government is reportedly considering stronger restrictions for under-16s on social media, similar to measures seen in Australia. Reports also indicate Apple has been working with Ofcom as these safety tools develop.

For users who cannot verify an adult identity, Apple suggests that some features may be limited or that the account may need to be placed under Family Sharing with a parent or guardian. The exact restrictions could vary depending on the situation.


Australia Age Checks Now Required for Porn Access

Australia has begun enforcing stricter age-verification rules for online adult content, requiring platforms to take meaningful steps to stop under-18s from accessing pornography and other age-restricted material. The Age-Restricted Material Codes for services including social media, relevant electronic services, equipment providers, and designated internet services came into effect on March 9, 2026.

Under the new framework, some services may now require proof of age before allowing access to legal adult content. Australia’s eSafety Commissioner says the accepted methods can vary by platform, but any age-assurance process must be accurate, reliable, and compliant with Australian privacy law. eSafety has said the changes are intended to reduce children’s exposure to pornography, high-impact violence, and other harmful age-inappropriate material online.

The rollout has already affected access to some major adult platforms in Australia, while debate continues over privacy risks and how effective the rules will be in practice. Recent reporting has also linked the changes to rising interest in VPN services as some users look for ways around the restrictions.


Apple: Age-Verification Tools Expand Worldwide With New 18+ Download Blocks

Apple is expanding its age-verification system in more countries to match stricter child-protection laws. The changes mainly affect how people download 18+ (adult-rated) apps and how developers confirm whether a user is a minor or an adult—without collecting sensitive personal details.


What’s changing for users

  • New 18+ download blocks: In Brazil, Australia, and Singapore, users must confirm they are 18 or older before downloading apps rated 18+.
  • Reduced access to adult content for minors: The blocks are meant to stop children from downloading adult-only apps through the App Store.

What’s changing for developers

  • Declared Age Range API (updated): Apple is updating an API that lets apps know only an age category (for example, minor vs. adult), not the person’s exact age; a code sketch follows this list.
    • Developers do not receive private data, such as date of birth.
    • The app receives a simple “category signal” to follow local rules.
  • Parental control options: For child accounts, parents/guardians can choose whether to share age information and whether permission is required in certain situations.
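
To illustrate the privacy model described above, here is a minimal, self-contained Swift sketch. Apple has announced a Declared Age Range API, but its exact surface is not given in this article, so every name below (AgeCategory, MockAgeRangeService, requestCategory) is a hypothetical placeholder; the sketch only demonstrates the idea that the app receives a coarse category signal, never a date of birth.

```swift
// Hypothetical sketch of the "category signal" model. All names below are
// placeholders, not Apple's real API; consult the official documentation
// for actual signatures.

/// The only information an app ever receives: a coarse category.
enum AgeCategory {
    case minor     // below the requested age gate
    case adult     // at or above the requested age gate
    case declined  // the user (or a guardian) chose not to share
}

/// Stand-in for the system service that answers age-category queries.
struct MockAgeRangeService {
    /// Ask whether the current account clears a single age gate (e.g. 18).
    func requestCategory(ageGate: Int) async -> AgeCategory {
        // The real answer would come from the OS; this stub always declines
        // so the example exercises its fail-closed path.
        .declined
    }
}

func configureExperience() async {
    let service = MockAgeRangeService()
    switch await service.requestCategory(ageGate: 18) {
    case .adult:
        print("Unlock 18+ features")
    case .minor, .declined:
        // Fail closed: treat "unknown" the same as "minor".
        print("Apply restricted, minor-safe defaults")
    }
}
```

The design choice worth noting is the fail-closed default: because the app never sees a birth date, a declined or missing signal is treated exactly like a minor account.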

Loot boxes and “gambling-like” features

Apple is also targeting apps with features regulators often consider risky for minors, such as loot boxes.

  • In Brazil, if an app includes loot boxes, Apple may automatically rate it 18+.
  • That means minors can’t download it, because the App Store will treat it as adult-only.

U.S. states: Utah and Louisiana

Apple is adding tools to help apps comply with state-level child safety laws:

  • In Utah and Louisiana, Apple can share a new user’s age category with developers.
  • The system can also flag when parental permission is required, including for major app updates; the sketch below shows how an app might handle that signal.
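
Continuing the hypothetical sketch from the previous section (same caveat: these are placeholder names, not Apple’s real API), an app honoring the state-level signals might receive the coarse age category plus a flag saying a guardian’s approval is pending, and gate functionality accordingly.

```swift
// Hypothetical continuation of the earlier sketch. `AgeCategory` is the
// placeholder enum defined above; `AgeSignal` bundles it with a
// parental-approval flag like the one described for Utah and Louisiana.
struct AgeSignal {
    let category: AgeCategory           // minor / adult / declined
    let parentalApprovalRequired: Bool  // e.g. set for new installs or major updates
}

func handle(_ signal: AgeSignal) {
    guard !signal.parentalApprovalRequired else {
        // Pause gated functionality until approval arrives (for example,
        // through Family Sharing).
        print("Waiting for parent/guardian approval")
        return
    }
    switch signal.category {
    case .adult:    print("Full experience")
    case .minor:    print("Age-appropriate experience")
    case .declined: print("Restricted defaults (fail closed)")
    }
}
```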

Why Apple says it’s doing this

Apple’s message is: protect kids while respecting privacy.

  • The App Store handles most of the verification.
  • Apps get only a yes/no type age signal (minor/adult), not personal identity details.
  • The goal is to comply with various laws without forcing developers to collect sensitive data.

