An increasing number of reports have raised alarms about the safety of children on Roblox, a popular online gaming platform. Law enforcement in Scottsdale, Arizona, recently uncovered a disturbing case where an adult predator posed as a teenager to target boys aged 10 to 13. The predator, identified as Jacob Lozano, 23, from Florida, used online games to lure his victims into private messaging apps, where he coerced them into performing explicit acts on camera. These acts were recorded without the boys’ knowledge, with Lozano offering rewards such as PlayStation gift cards and even delivering pizza to their homes.
The situation came to light when the mother of an 11-year-old boy with special needs discovered sexually explicit messages on her son’s phone. She immediately contacted the police and the National Center for Missing and Exploited Children (NCMEC). The investigation soon revealed that Lozano had exploited numerous other boys in Arizona and possibly beyond, though charges outside Arizona have yet to be filed.
This case is far from isolated. A 2022 report from the U.S. Government Accountability Office (GAO) indicates a significant increase in the online sexual exploitation of children. The report highlights how greater internet access, technological advancements, and the rise of encryption have made it easier for predators to exploit children. Data from the NCMEC shows that reports of online sexual enticement, including financial sextortion of minors, surged by over 300% between 2021 and 2023.
Lozano had been linked to a previous NCMEC CyberTip for online crimes against children. Florida authorities had found images of a prepubescent boy on a social media account controlled by Lozano. However, attempts to contact him during that investigation were unsuccessful, and the case was marked inactive. It wasn’t until his recent arrest in Florida and extradition to Arizona that he faced serious charges, including 14 counts of sexual exploitation of a minor.
The use of gaming platforms to facilitate sexual exploitation is a growing concern, particularly since the 2018 federal crackdown on websites like Backpage.com, which was notorious for sex advertising, including ads that exploited children. After Backpage was shut down, traffickers moved to new venues, including social media, dating apps, and online gaming platforms like Roblox.
Roblox, which allows users to interact in virtual worlds they can create, has become one of the leading gaming sites for children globally, with nearly 60% of its users in 2023 aged 16 or younger. However, its popularity has also made it a target for predators. The National Center on Sexual Exploitation (NCOSE) placed Roblox on its “Dirty Dozen List,” citing it as a platform that facilitates and profits from sexual abuse and exploitation.
Despite Roblox’s claims of implementing safety measures such as content moderation, chat filters, and parental controls, incidents of child exploitation continue to surface. A June 2023 NCOSE report revealed that children are often lured into virtual “condo experiences” within Roblox, where they are encouraged to engage in virtual sex acts.
Legal cases involving Roblox have further exposed the dangers of the platform. In one case, a 14-year-old Ohio girl was sexually assaulted by an adult she met on Roblox, who posed as a fellow teen. Another case in Florida involved a girl under 12 who was kidnapped and assaulted by a man she met on the platform.
Roblox has faced multiple lawsuits over the safety of children on its platform. A class-action lawsuit filed in San Mateo County Superior Court in December accused the company of allowing rampant sexual content and the presence of child predators. Although the lawsuit was dismissed for technical reasons, it highlighted the ongoing concerns about the platform’s safety.
In response to these growing concerns, Roblox partnered with the U.S. Department of Homeland Security’s Know2Protect Initiative, a campaign aimed at combating online child exploitation. Roblox has pledged to enhance its safety measures, including displaying safety tips within its games and developing immersive experiences to educate users.
Despite these efforts, law enforcement officials like Sgt. Lorence Jove Jr. of the Tucson Police Department emphasize the immense challenge of keeping pace with rapidly evolving online threats. The battle against online child exploitation is ongoing, and platforms like Roblox remain at the forefront of this critical issue.
Ofcom Sets July 2025 Age Check Deadline for Adult Sites
The Office of Communications, better known as Ofcom, the UK’s powerful media and communications regulator, has officially laid down a hard deadline for adult websites to introduce robust age verification systems for UK users.
Established by the Office of Communications Act 2002 and empowered through the Communications Act 2003, Ofcom serves as the government-approved authority overseeing the UK’s broadcasting, telecoms, internet, and postal sectors. With a statutory duty to protect consumers and uphold content standards, Ofcom regulates a wide range of services including TV, radio, broadband, video-sharing platforms, and wireless communications. One of its core responsibilities is ensuring that the public—particularly minors—is shielded from harmful or inappropriate material online.
In its latest move under the UK’s Online Safety Act, Ofcom announced that all pornography providers accessible from the UK must implement “highly effective” age verification processes by July 25, 2025. On April 24, the regulator issued letters to hundreds of adult sites warning them of non-compliance consequences and clarifying that the law applies even to platforms based outside the UK.
“If people are visiting your site from the UK, you’ll likely be in scope, wherever in the world you’re based,” the agency stated.
The action builds on earlier requirements directed at porn content producers who self-host, some of whom were already expected to comply earlier this year. The July deadline now puts the entire online adult sector under one enforcement umbrella.
In addition to enforcing universal age checks, Ofcom is requiring any platform that verifies age for only part of its content to complete a children’s risk assessment covering the sections that remain accessible. This assessment must be submitted by July 24, just one day before the compliance deadline.
Sites found to be in breach of the new requirements face significant penalties—fines of up to 10% of global annual revenue or £18 million, whichever is greater. Ofcom also signaled the possibility of escalating enforcement by seeking court orders to compel third parties like banks and internet service providers to block access to non-compliant platforms.
As part of its broader safety initiative, Ofcom is exploring the use of AI-driven facial age estimation tools to support verification processes, a move reflecting the increasing intersection between artificial intelligence and adult content regulation.
Earlier this year, the UK government also announced plans to make the country the first in the world to criminalize the creation, possession, or distribution of AI tools intended to generate child sexual abuse material (CSAM), signaling an even more aggressive stance toward digital harms involving minors.
Ofcom’s July deadline now stands as a critical compliance milestone for the global adult industry. For any site with UK traffic, there is no longer room for delay—age verification must be implemented, or the consequences will be severe.
Alibaba’s latest AI video generation model, Wan 2.1, was meant to be a breakthrough in open-source technology. However, within a day of its release, it was adopted by AI porn creators, sparking concerns over its potential for misuse. While open AI models democratize access to powerful tools, they also raise ethical issues, particularly around the creation of nonconsensual content. The rapid adoption of Wan 2.1 highlights this ongoing challenge.
Alibaba, the Chinese tech giant, recently released its new AI video generation model, Wan 2.1, making it freely accessible to those with the necessary hardware and expertise. While this open-source approach empowers developers and researchers, it also comes with a dark side. Within just 24 hours, the AI porn community seized the opportunity to produce and share dozens of explicit videos using the new software.
Even more concerning is the reaction from a niche online community dedicated to creating nonconsensual AI-generated intimate media of real people. Users on Telegram and similar platforms quickly celebrated Wan 2.1’s capabilities, praising its ability to handle complex movements and enhance the quality of AI-generated adult content. One user, referring to Tencent’s Hunyuan AI model (another tool popular in these circles), noted, “Hunyuan was released just in December, and now we have an even better text-to-video model.”
This is the ongoing dilemma of open AI models. On one hand, they offer groundbreaking possibilities, allowing developers to experiment, innovate, and improve AI technology. On the other, they can be easily exploited to create unethical and harmful content, including deepfake pornography.
Rapid Adoption in AI Porn Communities
The speed at which Wan 2.1 was adapted for explicit content was staggering. The first modifications of the model appeared almost immediately on Civitai, a site known for hosting AI-generated models. By the time initial reports surfaced, multiple variations of Wan 2.1 had already been downloaded hundreds of times. Users on Civitai enthusiastically shared AI-generated pornographic videos, many of which were created using these modified models.
Civitai’s policies prohibit the sharing of nonconsensual AI-generated pornography, but loopholes remain. While the site does not host nonconsensual content directly, it allows users to download models that can be used elsewhere for illicit purposes. Previous investigations have shown that once these models are accessible, there is little stopping users from misusing them in private or unregulated online spaces.
The Bigger Issue: Ethics of Open AI Models
The release of open-source AI models like Wan 2.1 is a double-edged sword. Open models promote innovation, allowing developers to refine AI technology for legitimate purposes such as filmmaking, animation, and content creation. However, as seen with Wan 2.1, early adopters often push the boundaries of ethical use, leading to misuse in inappropriate or even illegal ways.
Despite mounting concerns, Alibaba has remained silent on the issue. The company has yet to respond to inquiries regarding the misuse of its AI model. This raises questions about the responsibilities of tech giants when it comes to the unintended consequences of their AI releases. Should companies impose stricter regulations on how their AI models are used? Or is it the responsibility of platforms and communities to enforce ethical guidelines?
What Comes Next?
As AI-generated content becomes increasingly sophisticated, the challenge of regulating its use grows more complex. Open-source AI models are powerful tools, but they must be released with safeguards in place to prevent misuse. Without proper oversight, the line between innovation and exploitation will continue to blur, leaving room for ethical dilemmas and legal concerns.
For now, Wan 2.1 stands as yet another example of how quickly AI technology can be both a breakthrough and a battleground. The question remains—how will companies like Alibaba address these issues moving forward?
SexLikeReal (SLR) has launched SLR For Women, its first dedicated VR porn vertical offering a female-first perspective. This initiative utilizes the platform’s chroma suit passthrough technology to create immersive experiences tailored for female viewers.
A New Approach to VR Adult Content
SLR For Women debuted with a VR porn scene featuring Danny Steele and Alicia Williams, filmed using chroma passthrough technology. The female performer wears a chroma suit so that only her genitals remain visible, preserving the first-person perspective for the viewer.
While female-perspective VR porn exists across various platforms, SLR’s entry is notable due to its technological advancements and strong user engagement. The company is inviting female users to submit scripts, with the best ideas set to be produced as POV VR scenes by its top production team.
Future Expansion & User Involvement
Currently, the SLR For Women section features just one scene, posted over three weeks ago. While an immediate surge of female subscribers isn’t expected, SLR has indicated plans for more female-focused content and is encouraging user feedback to shape future releases.
SLR has previously introduced AI-powered passthrough technology, allowing non-chroma-shot videos to be converted into passthrough VR, as well as the world’s first AR cam rooms for live streaming. Whether this new venture will receive continued investment remains to be seen, but the launch signals an industry shift towards more inclusive VR experiences.