

From AI-generated soft porn to voice cloning, here’s how technology is being used as a tool for scams

In the last 10 years, technology has advanced so rapidly that it is hard to imagine life without it. Look around and there are plenty of examples of how it has made life easier: artificial intelligence and machine learning are being used to simplify processes and let us do more at once. However, as the use of technology has grown, so has the amount of fraud.

Have you heard of, or even experienced, a scam? Common ones include password-phishing emails and spoof websites, but AI makes such scams far more dangerous. Although these cases may sound like something out of a movie, they are real, and people have suffered from them. Let's take a look at some examples.

Voice Cloning

Imagine getting a call from someone who sounds exactly like your boss, asking you to transfer funds. If you weren't aware it was a scammer, you might do so without question. That is why it is important to stay alert to potential scams: callers may not be who they say they are.

In 2019, a British energy company fell victim to a scammer posing as a German executive, resulting in the transfer of around $240,000 to an unknown bank account.

It is believed that the scammers used AI software to replicate the German executive's voice, including his distinctive prosody and accent, from just a few audio recordings. Although this is not yet a widespread tactic, it can be expected to become more common as voice-cloning software grows more accessible and requires little time or resources to use.

Deep Fakes

During the Russian invasion of Ukraine, hackers planted a deepfake video of Ukrainian President Volodymyr Zelensky, apparently asking his army to lay down its weapons, on a Ukrainian news website, and it circulated on social media while the entire country was under attack by Russian troops. It was quickly debunked and removed.

Thanks to deep fakes, a combination of voice cloning and AI-generated video, criminals can create compromising footage of victims to blackmail them, as well as videos asking families to transfer money. The technique is widely used by criminals, who work from readily available selfies and personal images found on the internet.

It is becoming increasingly difficult to detect the deep fakes criminals create; some can even fool facial recognition software.

Recruitment fraud

As more of the world has gone digital, it is now commonplace to find job postings online. Unfortunately, this has also opened the door for scammers to take advantage of unsuspecting people and steal their money.

Scammers post fake job openings online and trick applicants into handing over their data and personal information. Job seekers are then asked to transfer money to secure the position, with a promise that it will be returned once they start working. This applies to part-time positions as well as full-time jobs.

Fake Images

Much like deep fakes, AI has made fake or morphed images far more convincing than they used to be. Lensa AI, for example, can produce non-consensual soft-porn images when prompted, and can be trained to render photos as sketches, cartoons, anime, or watercolors. Fed an image that combines a victim's face with a different body, it will automatically generate a new, realistic picture.

This is just one example; scammers use many other applications to produce images for blackmailing their victims. Whatever the benefits of AI, it is always advisable to double-check the source of any site, email, or message before taking action.



Protecting EU Citizens from Nonconsensual Pornographic Deepfakes in 2023

The European Union’s current and proposed laws fail to adequately protect citizens from the harms of nonconsensual pornographic deepfakes—AI-generated images, audio, or videos that use an individual’s likeness to create pornographic material without their consent. To protect victims of this abuse, the EU must take steps to amend existing legislative proposals and encourage soft law approaches.


Although deepfakes have legitimate commercial uses, 96 percent of deepfake videos found online are nonconsensual pornography. Perpetrators can use them to harass, extort, offend, defame, or embarrass individuals by superimposing their likeness onto sexual material without permission. The ease of creating and distributing deepfakes due to the increasing availability of AI tools has made this form of abuse easier than ever.

The Digital Services Act (DSA) obliges platforms to demonstrate the procedures by which illegal content can be reported and taken down. However, this will have little impact on the spread of nonconsensual pornographic deepfakes since the bill does not classify them as illegal. The DSA also does not cover 94 percent of deepfake pornography, which is hosted on dedicated pornographic websites instead of mainstream platforms. Moreover, the EU dropped a proposal in the DSA that would have required porn sites hosting user-generated content to swiftly remove material flagged by victims as depicting them without permission.

The Artificial Intelligence (AI) Act, likely to pass into law in 2023, requires creators to disclose deepfake content. But this does little to protect victims, as the demand for deepfakes does not depend on their authenticity. The Directive on Gender-Based Violence proposed in 2022 criminalizes sharing intimate images without consent and could include deepfakes in its scope. However, the bill fails to cover nudity that is not explicitly sexual and sexual imagery that is not wholly nude. Moreover, it only applies to material made accessible to many end-users when even sharing deepfakes with a single person can cause great harm.

These legislative proposals must be amended to protect victims better and deter perpetrators. Additionally, the EU should encourage soft law approaches such as public awareness campaigns, self-regulatory codes of practice, and the development of deepfake detection tools by law enforcement. With a combination of hard and soft law approaches, the EU can protect its citizens from the harms of nonconsensual pornographic deepfakes.



Uncovering the Fetish Videos Lurking on TikTok


It's no secret that TikTok is full of videos that seem innocent but are actually disguised fetish content. From bizarre life hacks to footage of someone being tied up, these videos reflect what some people find erotic. What's more, they are being watched by children as young as 13, the app's minimum age requirement.

Foot fetishists, food fetishists, and those fascinated by messiness have all found a home on TikTok. While the platform does not allow videos that depict sexual fetishes, the definition of what constitutes a sexual fetish can be blurry.

Videos that appear to show a spray-on tattoo being applied, or a bride cutting a bridesmaid's dress, can seem harmless but are, for someone, porn. To spot hidden fetish videos, we can borrow Supreme Court Justice Potter Stewart's famous phrase: "I know it when I see it."

Videos that feature cheesy soundtracks, leering camera angles, and an excessive buildup to an anticlimactic reveal are all signs of potential fetish content. Foot fetish videos are particularly popular, with videos showing people giving foot massages and stepping on gross items. Food fetishism is also prevalent, with videos featuring feeding fetishes and weird recipes.

Sploshing is another popular fetish involving someone being covered in a messy substance, such as food, mud, slime, or paint. Although it’s unclear what kind of effect this content may have on children, it’s worth noting that it’s out there and being watched. Letting people who will do anything for views potentially influence the psyches of our youth is a dangerous experiment that we’re conducting on humanity.


Gender Inclusion: A Step Forward in Meta’s Adult Content Nudity Policy

The Oversight Board, a body of lawyers, academics, and human rights experts, has advised Meta to update its adult content nudity policy to make it more gender-inclusive, declaring that Instagram's current rules are discriminatory and unworkable because of how they treat gender.

Specifically, the board recommended that the photo-sharing app allow non-binary people’s nipples to be shown on the platform, just like male nipples are already permitted.

Making the declaration on January 17, 2023, the board said that the nipple-related rules needed updating for the sake of clarity as well as inclusiveness.

“The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests to scenes of childbirth and medical and health contexts, including top surgery and breast cancer awareness,” the board said.

The board concluded that the current exceptions to the policy are too ill-defined, creating confusion for both users and reviewers. Such an approach, they claimed, was not practical when moderating content at scale. Furthermore, it was a violation of international human rights standards.

“Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”

Meta now has 60 days to respond to the board’s recommendations and change its adult content nudity policy for the sake of gender inclusivity. Whether or not Instagram will act on this advice remains to be seen, but it is likely that the platform will come under pressure to do so or risk undermining the point of the independent board.

Therefore, one can anticipate that the platform will soon feature a greater variety of nipples, including non-binary nipples, and that the current ban on female nipples will be relaxed. This does not mean that Instagram is set to become a hub of sexually explicit content, but rather that the rules regarding gender inclusivity will be updated and more clearly defined.
