What is Undress AI? Guidance for parents and carers


Artificial intelligence (AI) continues to advance rapidly, bringing numerous benefits but also posing new risks. One such concerning development is the rise of “Undress AI” apps, which could expose young people to potential harm. It’s crucial for parents and carers to understand what these tools are and how to protect their children online.

Summary

Undress AI apps are a genre of applications that use AI technology to alter images or videos by virtually removing clothing. The risks associated with these tools include exposure to inappropriate content, bullying, abuse, and negative impacts on mental health. Research from Graphika has shown a 2000% increase in spam referral links to “deepnude” websites. It is now illegal to generate and distribute intimate deepfakes. Preventative conversations about Undress AI can help keep children from getting involved. This article provides guidance and resources to support children’s online safety.

What is Undress AI?

Undress AI refers to a type of tool that uses artificial intelligence to remove clothing from individuals in images. These tools vary in how they operate but share a common purpose: creating manipulated images that imply nudity. Although these images do not depict the victim’s actual nude body, they can still be used to cause real harm.

Perpetrators using Undress AI tools might keep these images for personal use or share them widely. They could use these images for sexual coercion (sextortion), bullying, abuse, or as a form of revenge porn. Children and young people are particularly vulnerable if their images are manipulated using this technology. A report from the Internet Watch Foundation (IWF) found over 11,000 potentially criminal AI-generated images of children on one dark web forum dedicated to child sexual abuse material (CSAM), with around 3,000 assessed as criminal.

The IWF also noted many examples of AI-generated images featuring known victims and famous children. Generative AI can only create convincing images if it learns from accurate source material. Therefore, AI tools that generate CSAM likely learn from real images featuring child abuse.

Risks to Look Out For

Undress AI tools use suggestive language to attract users, making children more likely to follow their curiosity. Children and young people might not fully understand the law and could struggle to distinguish harmful tools from those promoting harmless fun.

Inappropriate Content and Behavior: The curiosity and novelty of Undress AI tools could expose children to inappropriate content. They might believe it’s acceptable to use these tools because they do not show “real” nude images. If they share these manipulated images with friends “for a laugh,” they are breaking the law, often without realizing it. Without parental intervention, they might continue this behavior, even if it harms others.

Privacy and Security Risks: Many legitimate generative AI tools require payment or subscriptions to create images. Free deepnude websites might produce low-quality images or have lax security. If a child uploads a clothed image of themselves or a friend, the site or app could misuse it, including the “deepnude” it generates. Children using these tools are unlikely to read the Terms of Service or Privacy Policy, exposing them to risks they might not understand.

Creation of Child Sexual Abuse Material (CSAM): The IWF reported a 417% increase in cases of “self-generated” CSAM circulating online from 2019 to 2022. In most cases, abusers coerce children into creating these images. With Undress AI, children might unknowingly create AI-generated CSAM. If they upload a clothed picture of themselves or another child, someone could “nudify” that image and share it more widely.

Cyberbullying, Abuse, and Harassment: Like other deepfakes, Undress AI tools or “deepnudes” can be used to bully others. This could include claiming a peer sent a nude image of themselves when they didn’t, or using AI to create a fake nude with features that bullies then mock. Sharing nude images of peers is both illegal and abusive.

How Widespread is Deepnude Technology?

Research shows that the usage of these AI tools is increasing, especially to remove clothing from female victims. One Undress AI site states that their technology is “not intended for use with male subjects” because they trained the tool using female imagery. Most AI tools of this type follow a similar pattern. The IWF found that 99.6% of the AI-generated CSAM it investigated featured female children.

Graphika’s research highlighted a 2000% increase in referral link spam for Undress AI services in 2023. They found that 34 of these providers received over 24 million unique visitors to their websites in one month. The report predicts “further instances of online harm,” including sextortion and CSAM. Perpetrators will likely continue to target girls and women over boys and men, especially if these tools primarily learn from female images.

What Does UK Law Say?

Until recently, creating sexually explicit deepfake images was not illegal unless the images were of children. However, in April 2024 the Ministry of Justice announced a new law making it illegal to create sexually explicit deepfake images of adults without their consent. Those convicted will face an “unlimited fine.”

This change contradicts an earlier position, which held that creating a deepfake intimate image was “not sufficiently harmful or culpable that it should constitute a criminal offence.” The Online Safety Act 2023, whose intimate image offences came into force in January 2024, had already made it illegal to share AI-generated intimate images without consent.

Generally, this law covers any image that is sexual in nature, including those featuring nude or partially nude subjects. However, the offence requires proof of an intention to cause harm, which can make it difficult to prosecute those who create sexually explicit deepfakes.

How to Keep Children Safe from Undress AI

Whether you’re concerned about your child using Undress AI tools or becoming a victim, here are some actions to take to protect them:

  1. Educate: Talk to your children about the risks of Undress AI and the importance of respecting others’ privacy.
  2. Monitor Online Activity: Keep an eye on the apps and websites your children use.
  3. Encourage Open Communication: Create a safe environment for your children to talk about their online experiences.
  4. Set Boundaries: Establish clear rules about what is and isn’t acceptable online behavior.
  5. Use Parental Controls: Utilize parental control tools to block access to harmful websites and apps.

By taking these steps, you can help protect your children from the dangers of Undress AI and promote a safer online environment.
