Technophile News
News

Deepfake ‘Nudify’ Technology Is Getting Darker—and More Dangerous

By News Room · 26 January 2026 · 4 Mins Read

Open the website of one explicit deepfake generator and you’ll be presented with a menu of horrors. With just a couple of clicks, it offers you the ability to convert a single photo into an eight-second explicit videoclip, inserting women into realistic-looking graphic sexual situations. “Transform any photo into a nude version with our advanced AI technology,” text on the website says.

The options for potential abuse are extensive. Among the 65 video “templates” on the website are a range of “undressing” videos where the women being depicted will remove clothing—but there are also explicit video scenes named “fuck machine deepthroat” and various “semen” videos. Each video costs a small fee to be generated; adding AI-generated audio costs more.

The website, which WIRED is not naming to limit further exposure, includes warnings saying people should only upload photos they have consent to transform with AI. It’s unclear if there are any checks to enforce this.

Grok, the chatbot created by Elon Musk’s companies, has been used to create thousands of nonconsensual “undressing” or “nudify” bikini images—further industrializing and normalizing the process of digital sexual harassment. But it’s only the most visible—and far from the most explicit. For years, a deepfake ecosystem, comprising dozens of websites, bots, and apps, has been growing, making it easier than ever before to automate image-based sexual abuse, including the creation of child sexual abuse material (CSAM). This “nudify” ecosystem, and the harm it causes to women and girls, is likely more sophisticated than many people understand.

“It’s no longer a very crude synthetic strip,” says Henry Ajder, a deepfake expert who has tracked the technology for more than half a decade. “We’re talking about a much higher degree of realism of what’s actually generated, but also a much broader range of functionality.” Combined, the services are likely making millions of dollars per year. “It’s a societal scourge, and it’s one of the worst, darkest parts of this AI revolution and synthetic media revolution that we’re seeing,” he says.

Over the past year, WIRED has tracked how multiple explicit deepfake services have introduced new functionality and rapidly expanded to offer harmful video creation. Image-to-video models now typically need only one photo to generate a short clip. A WIRED review of more than 50 “deepfake” websites, which likely receive millions of views per month, shows that nearly all of them now offer explicit, high-quality video generation and often list dozens of sexual scenarios in which women can be depicted.

Meanwhile, on Telegram, dozens of sexual deepfake channels and bots have regularly released new features and software updates, such as different sexual poses and positions. For instance, in June last year, one deepfake service promoted a “sex-mode,” advertising it alongside the message: “Try different clothes, your favorite poses, age, and other settings.” Another posted that “more styles” of images and videos would be coming soon and users could “create exactly what you envision with your own descriptions” using custom prompts to AI systems.

“It’s not just, ‘You want to undress someone.’ It’s like, ‘Here are all these different fantasy versions of it.’ It’s the different poses. It’s the different sexual positions,” says independent analyst Santiago Lakatos, who along with media outlet Indicator has researched how “nudify” services often use big technology company infrastructure and likely made big money in the process. “There’s versions where you can make someone [appear] pregnant,” Lakatos says.

A WIRED review found more than 1.4 million accounts were signed up to 39 deepfake creation bots and channels on Telegram. After WIRED asked Telegram about the services, the company removed at least 32 of the deepfake tools. “Nonconsensual pornography—including deepfakes and the tools used to create them—is strictly prohibited under Telegram’s terms of service,” a Telegram spokesperson says, adding that it removes content when it is detected and has removed 44 million pieces of content that violated its policies last year.

© 2026 Technophile News. All Rights Reserved.