Tech & IT
62 Women Sue PornHub’s Parent Company Over GirlsDoPorn Videos
The lawsuit, filed against PornHub’s parent company, which also owns other free adult video platforms, alleges a partnership with GirlsDoPorn and accuses the company of turning a blind eye to evident signs of sex trafficking. Over 60 women claim they were coerced into appearing in explicit videos and argue that these videos were distributed and monetized without their consent. The women are demanding justice and accountability for the damages they have endured.

Over 60 women have filed a lawsuit against the owner of PornHub and other adult video sites, accusing it of the “illegal publication of sex trafficking videos.” The suit was lodged in the U.S. District Court in San Diego against Aylo Media S.A.R.L., the parent company of PornHub, and alleges human trafficking and racketeering, among other claims.
This is the second such lawsuit. In December 2020, around 60 women filed a similar case against MindGeek, the previous name of PornHub’s parent company. The lawsuit was settled a year later, with terms kept confidential. The recent lawsuit involves 62 different women, represented by the same lawyers.
Brian Holm, one of the attorneys, emphasized that the number of plaintiffs in the case is a mere fraction of the women exploited by PornHub over time. Holm remarked on the courage of the plaintiffs, given the daunting nature of the case, and their intent to expose PornHub’s alleged unethical practices.
Federal prosecutors recently unveiled another sex-trafficking indictment against a former employee of GirlsDoPorn, the production company at the center of the allegations. Michael James Pratt, the founder of GirlsDoPorn, is currently awaiting extradition from Spain after being arrested in Madrid. Pratt and his associates were previously sued by 22 women in 2019, resulting in a $12.7 million judgment against them.
These women claimed Pratt and his team lured them with modeling opportunities, but upon arrival in San Diego, they were coerced into filming explicit content. They were told the films were intended only for private collectors overseas.
The current lawsuit asserts that Aylo Media, previously MindGeek, collaborated with GirlsDoPorn in 2011 to exploit these videos on its platforms. One such video allegedly became the second most viewed on PornHub in 2014, a site that reported 42 billion visits in 2019.
The complaint suggests that while GirlsDoPorn shielded the videos behind a paywall, Aylo distributed them on public platforms, earning millions from the views. Although Aylo supposedly received takedown requests from the women in the videos, the company reportedly ignored these appeals. Only after the FBI intervened in 2019 were the videos removed.
The lawsuit is seeking a minimum of $10 million in damages for each plaintiff and aims to prevent Aylo from ever displaying the victims’ videos again.
Tech & IT
Metaverse Sex: Revolutionizing Intimacy in a Decade
VR to Transform Adult Entertainment with Metaverse Sex in 10 Years
Highlights:
- VR sex replaces porn apps in a decade
- Multi-sensory VR includes touch, smell
- Ethical concerns in virtual consent
- Metaverse allows global virtual encounters
- Shift could aid lonely individuals

According to Sam Hall, managing director of Mixed Reality Rooms, the world of adult entertainment is poised for a transformative shift in the next decade. Hall envisions a future where traditional porn apps and websites are replaced by immersive experiences in the metaverse, enabled by advancements in virtual reality (VR) technology.
This evolution, he predicts, will be driven by the increasing accessibility of VR headsets, paving the way for a new normal in adult entertainment. The introduction of multi-sensory VR, including technologies that simulate touch, smell, and taste, is expected to create experiences that closely mimic real-life interactions. Hall foresees a significant role for connected sex toys, utilizing haptic technology to enhance the virtual experience.
However, this technological leap is not without its challenges. Hall raises critical ethical concerns, particularly around consent in the virtual realm. The use of personal images and virtual avatars without consent poses a significant risk, echoing apprehensions noted in a 2017 report about VR porn’s impact on sexual expectations and potential for abuse.
Despite these concerns, the potential of the metaverse extends beyond just entertainment. Hall suggests that it could democratize sexual experiences, providing new opportunities for those who may find it challenging to find partners in the real world. This virtual world, limitless in its scope, allows individuals to express their desires in diverse settings, real or fictional, private or public.
The integration of VR technology with sexual wellness hardware is still in its nascent stages, but Hall notes that it’s only a matter of time before these technologies fully converge. He points out that people are already exploring romantic connections in virtual spaces, hinting at the future of relationships and intimacy.
As we stand on the brink of this new era, the questions of how quickly these changes will materialize and how they will reshape our understanding of human intimacy remain open. What is clear, however, is that the metaverse is set to redefine the landscape of adult entertainment, offering unprecedented experiences while challenging our conventional notions of consent and connection.
Latest News
Omegle Shuts Down After Child Safety and Legal Challenges
Omegle, a platform for video chatting with strangers, has shut down following numerous child abuse allegations and lawsuits. For more than a decade, it linked children with predators, prompting legal scrutiny. Founder Leif K-Brooks, under pressure, cited the challenge of moderating content as a key reason for the shutdown.
Highlights:
- Omegle shuts down following child safety issues.
- Platform linked minors with predators for years.
- CSAM reports exceeded those of other major platforms.
- Founder Leif K-Brooks announces app closure.
- Legal challenges question Section 230’s scope.
- Calls for systemic online child protection.

Omegle, once a popular platform for connecting strangers through video chat, has officially shut down. Known for its tagline “Talk To Strangers,” Omegle became a concerning destination for minors, leading to its closure last Thursday. This decision comes after more than a decade of the platform inadvertently facilitating connections between children and predators, which led to multiple lawsuits and criminal investigations.
The platform has been embroiled in several child grooming cases. One notable incident involved a Norwegian teenager who met a predator on Omegle at the age of 14, leading to her abuse. In 2022, an FBI investigation uncovered a user sharing child sexual abuse material (CSAM) acquired through Omegle. The perpetrator was sentenced to 42 months in prison. That year, Omegle reported over half a million CSAM cases to the National Center for Missing and Exploited Children, a figure higher than those reported by other major platforms like TikTok, Snapchat, and Discord.
Founder Leif K-Brooks announced the shutdown, highlighting the intensive content moderation efforts Omegle had undertaken. Despite these efforts, Brooks admitted the platform was misused for heinous crimes. The stress and financial burden of managing the site’s content were significant factors in his decision.
A pivotal lawsuit contributing to Omegle’s downfall involved a 13-year-old identified as C.H. She alleged that at the age of 11, she was coerced into sexual acts by predators she met on Omegle. Her case, which bypassed the protections typically afforded to tech companies under Section 230, highlighted the platform’s inability to safeguard young users effectively.
Despite Omegle’s attempts to combat child exploitation, including the use of AI and human moderators, critics argued that these measures were insufficient. The Canadian Center for Child Protection pointed out the inadequacy of Omegle’s age verification process, which merely required users to confirm they were 18 years old. Disturbingly, conversations and videos discovered on dark web forums indicated that predators had used Omegle to target and exploit children.
The closure of Omegle reflects a growing awareness and intolerance of platforms that fail to protect children from online sexual exploitation. While Omegle’s shutdown is a significant step, it highlights the broader issue of child safety on the internet, underscoring the need for more stringent regulations and proactive measures across all online platforms.
Source: Forbes
Tech & IT
Meta Shifts Age Verification Duty to Google and Apple App Stores
In a recent statement, Meta proposed that Google and Apple’s App Stores should take on the responsibility of online age verification for their apps. This move comes amidst increasing pressure for Meta to implement stringent age controls and parental consent mechanisms to safeguard younger users. The company’s head of global safety argued that the varied verification methods across U.S. states make it impractical for social media apps to manage this process uniformly.
Highlights:
- Meta rejects app age verification role.
- Suggests App Stores handle age checks.
- Focus on parental consent, safety.
- U.S. state laws vary in verification.

Meta, a leading technology company, recently reiterated that it does not intend to handle online age verification for its applications itself. Instead, it suggested that this responsibility should be managed by the Google and Apple App Stores. This position was laid out in a post by the company’s head of global safety.
This suggestion arises amidst growing discussions about implementing effective age controls and requiring parental consent for young users on Meta platforms. The company’s safety head stated that Meta is not willing to take on this responsibility, citing the inconsistency of verification methods across different U.S. states. Such disparities make it challenging for social media applications to uniformly implement age verification.
Consequently, Meta proposes that parents should be the ones giving permission for their children to use apps through Google Play and the Apple App Store. In this system, when a teenager attempts to install an app, a notification would be sent to their parents for approval. This approach is similar to how parents are alerted about in-app purchases. If a parent approves, the child can install and use the app; otherwise, the installation is blocked.
Furthermore, Meta suggests that age verification could be conducted during the initial setup of a child’s device, allowing parents to set age restrictions once, rather than repeatedly responding to alerts or approval requests.
Meta strongly supports legislation that mandates parental approval for users under 16. The company commits to developing features and settings to facilitate parental assistance in app usage by children.
Currently, there are no widespread regulations specifically mandating online age verification in app stores. However, some states, such as Louisiana, are enacting their own laws. These laws require users to verify their age through government-issued IDs to access certain websites. Utah, for instance, recently passed legislation requiring parental approval for children signing up on various online platforms, including Facebook.
As more states introduce age verification laws, discussions about extending these methods to other online applications are intensifying. Lawmakers are exploring new legislation aimed at expanding internet access regulation and enhancing safety measures for broader audiences, especially younger users.