Tech & IT
Low Users, Sex Scandals Sink Korean Metaverses; 3AC Sues Terra
South Korea’s tech-savvy population, high-speed internet, and deep gaming culture once made it the perfect setting for metaverse platforms. Despite these strengths, several leading South Korean metaverses are now shutting down due to low user engagement and controversies.
The most recent closure is 2nd Block, a metaverse created by Dunamu, the operator of South Korea’s largest cryptocurrency exchange, Upbit. Dunamu announced that the platform will shut down on September 9, citing declining user activity as the pandemic eased.
This follows the closure of Seoul’s $4.5 million metaverse project, which will shut down on October 16 after less than two years. The platform was designed to offer a digital space for residents and tourists, but it failed to maintain interest.
Even metaverses still in operation are struggling. Zepeto, developed by South Korean internet giant Naver, has shifted focus to the global market, boasting 20 million monthly overseas users, compared to just 1.2 million in South Korea. However, Zepeto has faced domestic criticism after reports emerged of children’s avatars being sexually harassed, with predators coercing minors into sharing explicit images in exchange for in-game items.
In response, South Korean lawmakers proposed stricter penalties for sex offenses committed in virtual spaces, but the bill failed to pass, leaving the issue unresolved. The difficulty of tracking predators across international borders further complicates enforcement.
In a related legal battle, liquidators of the defunct Singapore crypto hedge fund Three Arrows Capital (3AC) have filed a $1.3 billion lawsuit against Terraform Labs. The lawsuit alleges that Terraform misled 3AC about the stability of its stablecoin TerraUSD (UST) and its sister token LUNA. Terra’s collapse in May 2022 led to a $40 billion loss in the Terra ecosystem and forced 3AC into bankruptcy. Terraform Labs settled a $4.47 billion SEC lawsuit in June. Meanwhile, Terraform’s co-founder, Do Kwon, remains in Montenegro awaiting extradition.
Source: cointelegraph.com
The Truth About Free Speech, Big Tech, and Protecting Our Children
In today’s world, there is a lot of confusion about what “free speech” truly means, especially regarding the influence of Big Tech. As Americans continue to idolize tech billionaires, it’s essential to understand the legal boundaries of free speech and how these platforms operate, especially when children’s safety is at stake.
What Is Free Speech?
Free speech, as protected under the First Amendment of the U.S. Constitution, is often misunderstood. The First Amendment restricts the government’s ability to limit speech, but it doesn’t grant individuals the right to say whatever they want on private platforms. Whether it’s a social media site, a restaurant, or a business, private companies have the right to moderate or restrict speech on their terms. The idea that users are entitled to free speech on platforms like Facebook, Twitter, or Telegram is a misconception. These platforms are private businesses, not public forums.
However, these tech companies promote themselves as champions of free speech while still exercising significant control over the content they allow. This creates an illusion of free speech where, in reality, users must follow the rules set by these billionaires.
The Cost of Unchecked Platforms on Children’s Safety
One of the most alarming issues today is the way tech platforms are being exploited for child abuse and sex trafficking. While some Big Tech companies claim they are creating safe spaces, many have been slow or reluctant to address the growing epidemic of child sexual exploitation on their platforms. Reports have shown that Facebook, Instagram, Twitter, and even Telegram have become hotbeds for child trafficking and the distribution of abusive material.
For example, Telegram’s founder, Pavel Durov, has been praised for allowing free speech on his platform. However, recent investigations reveal that Telegram has been slow to cooperate with law enforcement, particularly in cases involving child abuse. French authorities recently arrested Durov for allegedly failing to provide information in child exploitation cases. This arrest raises serious concerns about the safety of children online and how tech platforms, even those claiming to defend free speech, might be complicit in illegal activities.
These companies prioritize profit over safety. They know tightening security would cost them time and money, so they continue allowing unsafe environments to thrive. Children are the ones paying the price as these platforms enable predators to find and exploit them.
The Greed Behind Big Tech
At the heart of the problem is greed. Tech billionaires like Durov, Mark Zuckerberg (Facebook), and Elon Musk (Twitter) have made fortunes by creating platforms that allow anyone to voice their opinions. However, these platforms have also created opportunities for criminals, including child traffickers. Instead of focusing on safety, these companies prioritize user engagement, which increases ad revenue, data collection, and, ultimately, their bottom line.
Despite the ongoing abuse, companies like Twitter have cut teams responsible for monitoring child exploitation. Under Elon Musk’s leadership, Twitter reduced its child safety monitoring staff, even though Musk publicly stated that protecting children would be a top priority. The result? An increase in dangerous and illegal content that harms vulnerable young users.
Similarly, Facebook and Instagram have failed to take meaningful steps to combat child trafficking on their platforms. Lawsuits have even been filed against these tech giants, accusing them of promoting child trafficking. Instead of acting decisively to protect children, these billionaires protect their business models and profits.
Protecting Free Speech While Safeguarding Children
There is a clear need to balance free speech with the responsibility to protect children. While people have the right to express their opinions, this does not mean tech platforms should turn a blind eye to illegal and harmful activities on their sites. Big Tech’s refusal to adopt stronger protections is not about defending free speech—it’s about greed and profit.
It is crucial to demand more accountability from these platforms. The public must understand that free speech doesn’t give anyone the right to endanger others, particularly children. If platforms are not ensuring safety, they should be held accountable for their negligence.
The Solution
To protect free speech and ensure the safety of our children, tech companies need to take a stand against illegal activities. This means investing in moderation, cooperating with law enforcement, and putting ethics before profit. While Big Tech platforms offer valuable services, they cannot continue to put children at risk to grow their empires.
Parents, governments, and communities must stay vigilant and pressure these platforms to enforce stronger safety measures while protecting free speech. Free speech should never come at the cost of our children’s safety.
In conclusion, the battle for free speech must not ignore the importance of protecting society’s most vulnerable. As long as greed drives tech companies’ decision-making processes, our children will remain in danger. It’s time to demand better.
Source: healthimpactnews.com
Apple Faces Class Action Over Child Abuse Content
Apple Inc. is facing a class action lawsuit accusing the company of failing to prevent the upload and distribution of child sexual abuse material (CSAM) on its iCloud service. The lawsuit, filed by a minor identified as Jane Doe, claims that Apple has engaged in “privacy washing”—promoting a public image of strong privacy protection while allegedly neglecting to implement effective safeguards against the transmission of harmful content.
According to the lawsuit, Apple is fully aware that iCloud has become a significant platform for the distribution of child pornography. The plaintiff alleges that despite this knowledge, Apple has chosen not to adopt industry-standard practices for detecting and removing CSAM from its services. Instead, the company is accused of shifting the responsibility and costs of ensuring a safe online environment onto children and their families.
The lawsuit criticizes Apple for failing to live up to its stated privacy policies. “Apple’s privacy policy claims that it collects users’ private information to prevent the spread of CSAM, but it fails to do so in practice—a failure that Apple was already aware of,” the lawsuit states. This alleged negligence has led to Apple being accused of misrepresentation, unjust enrichment, and violations of federal sex trafficking laws, California business and professional codes, and North Carolina consumer protection laws.
Jane Doe, through her guardian, seeks to represent a nationwide class of individuals who have been victims of child sexual abuse due to the transmission of CSAM on iCloud over the past three years. The lawsuit demands a jury trial and calls for declaratory and injunctive relief, as well as compensatory and punitive damages for all class members.
This legal action against Apple adds to a growing list of lawsuits the company is facing. Earlier this year, two antitrust class actions were filed against Apple, accusing it of monopolizing the smartphone market with its iPhone and engaging in anticompetitive practices.
As the case progresses, it raises significant questions about the responsibility of tech giants like Apple to protect their users, particularly vulnerable groups such as children, from harmful online content. The outcome of this lawsuit could have far-reaching implications for how companies handle privacy and security in the digital age.
The plaintiff in this case is represented by Juyoun Han and Eric Baum of Eisenberg & Baum LLP, and John K. Buche and Byron Ma of The Buche Law Firm PC. The case, Doe, et al. v. Apple Inc., is currently being heard in the U.S. District Court for the Northern District of California under Case No. 5:24-cv-05107.
If you have been a victim of child sexual abuse as a result of CSAM being uploaded to iCloud, you are encouraged to share your experience in the comments.
Source: topclassactions.com
Roblox Under Fire: Growing Risks in Online Child Exploitation
An increasing number of reports have raised alarms about the safety of children on Roblox, a popular online gaming platform. Law enforcement in Scottsdale, Arizona, recently uncovered a disturbing case where an adult predator posed as a teenager to target boys aged 10 to 13. The predator, identified as Jacob Lozano, 23, from Florida, used online games to lure his victims into private messaging apps, where he coerced them into performing explicit acts on camera. These acts were recorded without the boys’ knowledge, with Lozano offering rewards such as PlayStation gift cards and even delivering pizza to their homes.
The situation came to light when the mother of an 11-year-old boy with special needs discovered sexually explicit messages on her son’s phone. She immediately contacted the police and the National Center for Missing and Exploited Children (NCMEC). The investigation soon revealed that Lozano had exploited numerous other boys in Arizona and possibly beyond, though charges outside Arizona have yet to be filed.
This case is far from isolated. A 2022 report from the U.S. Government Accountability Office (GAO) indicates a significant increase in the online sexual exploitation of children. The report highlights how greater internet access, technological advancements, and the rise of encryption have made it easier for predators to exploit children. Data from the NCMEC shows that reports of online sexual enticement, including financial sextortion of minors, surged by over 300% between 2021 and 2023.
Lozano had been linked to a previous NCMEC CyberTip for online crimes against children. Florida authorities had found images of a prepubescent boy on a social media account controlled by Lozano. However, attempts to contact him during that investigation were unsuccessful, and the case was marked inactive. It wasn’t until his recent arrest in Florida and extradition to Arizona that he faced serious charges, including 14 counts of sexual exploitation of a minor.
The rise in using gaming platforms to facilitate sexual exploitation is a growing concern, especially after the 2018 federal crackdown on websites like Backpage.com, which was notorious for advertising sex, including the exploitation of children. After Backpage was shut down, traffickers began exploiting new venues, including social media, dating apps, and online gaming platforms like Roblox.
Roblox, which allows users to interact in virtual worlds they can create, has become one of the leading gaming sites for children globally, with nearly 60% of its users in 2023 being 16 years old or younger. However, its popularity has also made it a target for predators. The National Center on Sexual Exploitation (NCOSE) placed Roblox on its “Dirty Dozen List,” citing it as a platform that facilitates and profits from sexual abuse and exploitation.
Despite Roblox’s claims of implementing safety measures such as content moderation, chat filters, and parental controls, incidents of child exploitation continue to surface. A June 2023 NCOSE report revealed that children are often lured into virtual “condo experiences” within Roblox, where they are encouraged to engage in virtual sex acts.
Legal cases involving Roblox have further exposed the dangers of the platform. In one case, a 14-year-old Ohio girl was sexually assaulted by an adult she met on Roblox, who posed as a fellow teen. Another case in Florida involved a girl under 12 who was kidnapped and assaulted by a man she met on the platform.
Roblox has faced multiple lawsuits over the safety of children on its platform. A class-action lawsuit filed in San Mateo County Superior Court in December accused the company of allowing rampant sexual content and the presence of child predators. Although the lawsuit was dismissed for technical reasons, it highlighted the ongoing concerns about the platform’s safety.
In response to these growing concerns, Roblox partnered with the U.S. Department of Homeland Security’s Know2Protect Initiative, a campaign aimed at combating online child exploitation. Roblox has pledged to enhance its safety measures, including displaying safety tips within its games and developing immersive experiences to educate users.
Despite these efforts, law enforcement officials like Sgt. Lorence Jove Jr. of the Tucson Police Department emphasize the immense challenge in keeping up with the rapidly evolving online threats. The battle against online child exploitation is ongoing, and platforms like Roblox remain at the forefront of this critical issue.
Source: arizonadailyindependent.com