Once you become an established cam model and more people are viewing your content, you’ll inevitably come across your content on sites where you don’t want it distributed, such as tube sites, clip sites, and forums. Contacting the webmaster yourself can sometimes get your content removed, but not always, and it can be time consuming. There are sites that will do all the work for you in exchange for a fee.
One of the best-known and most popular sites for removing copyrighted content in the adult industry is Cam Model Protection, which focuses specifically on cam models. They offer different packages to choose from, based on the services you’d like to get from them.
Their service can help not only while you’re currently in the industry, but also after you retire, preventing family members, coworkers, and friends from finding out what you used to do. If you are unsure about Cam Model Protection, they offer a free trial membership with full access to all of their features, so you can test them out and see if it’s right for you. You can even have your fans donate toward your monthly fee.
Stripe Shuns Sex Work, Yet Gains from AI Non-Consensual Images
Stripe, a $50 billion payment processing giant known for avoiding the adult industry, is currently reaping profits from AI-produced non-consensual pornographic images.
Two platforms, CivitAI and Mage.Space, have been highlighted for using text-to-image AI tools to create explicit images, frequently of celebrities and other public figures. Mage.Space, for instance, charges subscription fees for this service.
Both CivitAI and Mage.Space process their transactions via Stripe, meaning with every transaction, Stripe receives a share. Notably, Stripe’s typical rate is 2.9% plus a flat $0.30 per transaction.
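For a rough sense of scale, the pricing above can be applied to any given subscription charge. This is a minimal sketch: the fee structure (2.9% plus $0.30) comes from the article, but the $10 subscription amount is hypothetical.

```python
# Illustration of Stripe's standard pricing, as stated above:
# 2.9% of the transaction amount plus a flat $0.30 fee.
def stripe_fee(amount, pct=0.029, flat=0.30):
    """Return Stripe's cut of a single transaction, in dollars."""
    return round(amount * pct + flat, 2)

# Hypothetical $10 monthly subscription (not a figure from the article).
print(stripe_fee(10.00))  # 0.59
```

On a hypothetical $10 charge, Stripe would collect about $0.59 — nearly 6% of the transaction.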
GoAskAlex, an adult performer, expressed concern about the irony in Stripe’s dealings. She emphasized the injustice of financial institutions capitalizing on non-consensual content, especially when they actively distance themselves from legitimate adult industry professionals. Hany Farid, a UC Berkeley professor, similarly questioned the ethics of online financial platforms supporting such services.
Mage’s co-founder, Gregory Hunkins, stated that the platform prohibits non-consensual imagery. However, such images were still readily available on the platform. Multiple requests to Stripe for comment went unanswered.
The inconsistency in Stripe’s policy becomes glaring when considering their clear stance against collaborating with “adult content and services.” They have explicitly listed the types of businesses they refuse, which includes explicit content and adult services.
Mike Stabile of the Free Speech Coalition expressed astonishment at Stripe’s seemingly contradictory actions, pointing out how several in the adult industry have been banned or denied services by Stripe. The current situation feels like an affront, especially when one considers that legitimate adult professionals are sidelined while AI platforms exploiting their likenesses profit.
X Corp. Faces 2,000 Arbitration Demands, Agrees to Negotiate
Following Elon Musk’s acquisition of Twitter in October 2022, X Corp. has entered discussions to address arbitration claims from nearly 2,000 laid-off employees.
Attorney Shannon Liss-Riordan, in a memo cited by Bloomberg, stated, “We have successfully brought Twitter to the negotiation table. Twitter seeks global mediation to settle all our filed claims.” Private mediation sessions are slated for December 1 and 2.
A source indicated to Bloomberg that X Corp. is acting in compliance with a court order to mediate. An August filing disclosed over 2,200 individual arbitration demands against Twitter, with potential filing fees nearing $3.5 million as reported by CNBC. X Corp. has pushed for an even distribution of the arbitration costs.
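The filing-fee figure above implies a substantial per-demand cost. A back-of-the-envelope check, using only the two numbers reported in the article (the per-demand figure is derived, not reported):

```python
# Rough arithmetic check of the reported arbitration filing fees:
# ~2,200 individual demands against a total nearing $3.5 million.
total_fees = 3_500_000
demands = 2_200
per_demand = total_fees / demands
print(f"${per_demand:,.0f} per demand")  # $1,591 per demand
```

That works out to roughly $1,600 in filing fees per individual arbitration demand, which helps explain why X Corp. has pushed for an even split of the costs.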
Previously, Twitter (now “X”) allegedly compelled ex-employees to opt for arbitration over lawsuits while declining to cover the arbitration costs. Liss-Riordan pursued multiple class actions, including one alleging Twitter breached the federal and California Worker Adjustment and Retraining Notification (WARN) Acts by failing to provide 60 days’ notice before a mass layoff.
In January 2023, a federal judge mandated ex-employees into arbitration due to existing agreements. However, Twitter was accused in July of both insisting on arbitration and avoiding the associated costs.
The company also faces allegations of unpaid severance, discrimination, and WARN Act and FMLA violations. Liss-Riordan, representing the ex-employees, commented, “We are dedicated to ensuring they receive what’s due.”
Updates will follow upon receiving comments from Liss-Riordan and X Corp.
Texas Age Verification Law for Adult Websites Blocked by Federal Judge
A day before it was set to be enforced, a Texas law mandating age verification and health warnings for pornographic websites was halted by U.S. District Judge David Ezra. He ruled in favor of the Free Speech Coalition, an association representing the adult entertainment industry, arguing that the law infringed on First Amendment rights and lacked clarity.
Judge Ezra criticized House Bill 1181, which was endorsed by Gov. Greg Abbott, stating that the age verification component unnecessarily restricted adults from accessing lawful adult content under the guise of shielding minors.
Furthermore, the judge voiced concerns about privacy. The law’s provision allowing age verification via government-issued ID could let the state government monitor and log users’ access to such websites. This poses risks, especially for those accessing LGBTQ content, given Texas’ ongoing, controversial laws targeting same-sex activity.
The ruling also highlighted concerns about potential leaks or breaches exposing personal data.
Another aspect of the law mandates warnings on porn sites about supposed psychological risks of viewing adult content. However, Judge Ezra pointed out a lack of evidence supporting the effectiveness of such warnings in restricting minors’ access. He further noted that these warnings were labeled “Texas Health and Human Services” without clear backing from the named institution.
It’s worth noting that Texas followed in the footsteps of Louisiana and other states in passing such legislation. However, the Texas law has gaps, such as excluding social media sites, which likely do not meet the one-third sexually explicit content threshold. This loophole means minors could still access adult content on platforms like Reddit, Tumblr, and Instagram.
Summing up, Judge Ezra emphasized that while the law’s intent was to protect minors, it wasn’t adequately designed for that purpose and instead contained broad exemptions.