Latest News
TikTok and Bumble Join Meta and StopNCII.org
TikTok and Bumble are joining forces with StopNCII.org (Stop Non-Consensual Intimate Image Abuse) in an effort to combat the spread of revenge porn.
The website enables people to generate unique digital fingerprints (hashes) of intimate images and videos; the actual files are never uploaded, protecting user privacy.
These hashes are shared with the initiative’s partners, which detect and block any images that match. Over the past year, 12,000 people have used the tool to create more than 40,000 hashes, which have been sent to the partner platforms; the platforms remove and block an image if it violates their policies.
This initiative builds upon a pilot program that Meta (formerly known as Facebook) launched in 2017 that asked users to upload revenge porn images to a Messenger chat with themselves. The UK has recently announced plans to impose stricter regulations on the removal of non-consensual intimate images, making this initiative all the more timely.
Judge Blocks Utah Social Media Age Verification Law
A federal judge in Utah has temporarily blocked a state law aimed at protecting children’s privacy and limiting their social media use, finding it likely unconstitutional.
U.S. District Court Judge Robert Shelby issued a preliminary injunction against the law, which would have required social media companies to verify users’ ages, enforce privacy settings, and limit certain features on minors’ accounts.
The law was scheduled to take effect on October 1, but its enforcement is now paused pending the outcome of a case filed by NetChoice, a nonprofit trade group representing companies like Google, Meta (Facebook and Instagram’s parent company), Snap, and X. The Utah legislature passed the Utah Minor Protection in Social Media Act in 2024, after earlier legislation from 2023 faced legal challenges. State officials believed the new law would withstand legal scrutiny, but Judge Shelby disagreed.
“The court understands the State’s desire to protect young people from the unique risks of social media,” Shelby wrote. However, he added that the state had failed to show a compelling justification for burdening the First Amendment rights of social media companies.
Republican Governor Spencer Cox expressed disappointment with the court’s ruling but emphasized that the fight was necessary due to the harm social media causes to children. “Let’s be clear: social media companies could, right now, voluntarily adopt all of the protections this law imposes to safeguard our children. But they refuse, choosing profits over our kids’ well-being. This has to stop, and Utah will continue to lead this battle.”
NetChoice contends that the law would force Utah residents to hand over more personal information for age verification, increasing the risk of data breaches. In 2023, Utah became the first state to regulate children’s social media use, and the state has also sued TikTok and Meta, accusing them of using addictive features to lure children.
Under the 2024 law, minor accounts would have default settings that limit direct messages and sharing features and disable autoplay and push notifications, which lawmakers say contribute to excessive use. The law would also restrict how much information social media companies could collect from minors.
Additionally, another law taking effect on October 1 allows parents to sue social media companies if their child’s mental health worsens due to excessive use of algorithm-driven apps. Social media companies must comply with various requirements, including limiting use to three hours daily and imposing a nightly blackout from 10:30 p.m. to 6:30 a.m. Violations could result in damages starting at $10,000.
NetChoice has successfully obtained injunctions blocking similar laws in California, Arkansas, Ohio, Mississippi, and Texas. “With this being the sixth injunction against these overreaching laws, we hope policymakers will pursue meaningful and constitutional solutions for the digital age,” said Chris Marchese, NetChoice’s director of litigation.
White House Announces AI Firms’ Pledge Against Image Abuse
The White House announced this week that several leading AI companies have voluntarily committed to tackling the rise of image-based sexual abuse, including the spread of non-consensual intimate images (NCII) and child sexual abuse material (CSAM). This move is a proactive effort to curb the growing misuse of AI technologies in creating harmful deepfake content.
Companies such as Adobe, Anthropic, Cohere, Microsoft, and OpenAI have agreed to implement specific measures to ensure their platforms are not used to generate NCII or CSAM. These commitments include responsibly sourcing and managing the datasets used to train AI models, safeguarding them from any content that could lead to image-based sexual abuse.
In addition to securing datasets, the companies have promised to build feedback loops and stress-testing strategies into their development processes. This will help prevent AI models from inadvertently creating or distributing abusive material. Another crucial step is removing nude images from AI training datasets when deemed appropriate, further limiting the potential for misuse.
These commitments, while voluntary, represent a significant step toward combating a growing issue. The announcement, however, was notably missing major tech players such as Apple, Amazon, Google, and Meta.
Despite these omissions, many AI and tech companies have already been working independently to prevent the spread of deepfake images and videos. StopNCII, an organization dedicated to stopping the non-consensual sharing of intimate images, has teamed up with several companies to create a comprehensive approach to scrubbing such content. Additionally, some businesses are introducing their own tools to allow victims to report AI-generated sexual abuse on their platforms.
While today’s announcement from the White House doesn’t establish new legal consequences for companies that fail to meet their commitments, it is still an encouraging step. By fostering a cooperative effort, these AI companies are taking a stand against the misuse of their technologies.
For individuals who have been victims of non-consensual image sharing, support is available. Victims can file a case with StopNCII, and for those under 18, the National Center for Missing & Exploited Children (NCMEC) offers reporting options.
In this new digital landscape, addressing the ethical concerns surrounding AI’s role in image-based sexual abuse is critical. Although the voluntary nature of these commitments means there is no immediate accountability, the proactive approach by these companies offers hope for stronger protections in the future.
Source: engadget.com
Texas AG Defends Restrictions on Targeted Ads to Teens
Texas Attorney General Ken Paxton is urging a federal judge to uphold restrictions on social media platforms’ ability to collect minors’ data and serve them targeted ads. In papers filed with U.S. District Judge Robert Pitman, Paxton argues that the coalition challenging the law lacks standing to proceed because the law regulates the platforms themselves, not their users.
The Securing Children Online through Parental Empowerment Act (HB 18) requires social platforms to verify users’ ages and use filtering technology to block harmful content, including material that promotes eating disorders, self-harm, and sexual exploitation. The bill also limits data collection from minors and prohibits targeted ads without parental consent.
The law faces challenges from two lawsuits: one from tech industry groups and another from a coalition that includes advocacy group Students Engaged in Advancing Texas and the Ampersand Group, which handles ads for nonprofits and government agencies. The coalition claims the law will prevent them from delivering public service ads, such as fentanyl warnings or sex trafficking alerts, to teens.
In a previous ruling, Judge Pitman blocked parts of the law requiring content filtering but left in place restrictions on data collection and targeted advertising. The judge stated those provisions might be challenged later.
Paxton contends that the coalition’s arguments are too vague, questioning the specifics of their ad plans and whether the law targets only commercial advertising. Judge Pitman has not yet issued a final ruling on the coalition’s request.