Grok Is Being Used to Mock and Strip Women in Hijabs and Sarees

By News Room · 9 January 2026 · 4 Mins Read

Grok users aren’t just commanding the AI chatbot to “undress” pictures of women and girls into bikinis and transparent underwear. Among the vast and growing library of nonconsensual sexualized edits that Grok has generated on request over the past week, many perpetrators have asked xAI’s bot to put on or take off a hijab, a saree, a nun’s habit, or another type of modest religious or cultural clothing.

In a review of 500 Grok images generated between January 6 and January 9, WIRED found that around 5 percent of the output featured an image of a woman who, as a result of prompts from users, was either stripped of or made to wear religious or cultural clothing. Indian sarees and modest Islamic wear were the most common examples in the output, which also featured Japanese school uniforms, burqas, and early 20th-century-style bathing suits with long sleeves.

“Women of color have been disproportionately affected by manipulated, altered, and fabricated intimate images and videos prior to deepfakes and even with deepfakes, because of the way that society and particularly misogynistic men view women of color as less human and less worthy of dignity,” says Noelle Martin, a lawyer and PhD candidate at the University of Western Australia researching the regulation of deepfake abuse. Martin, a prominent voice in the deepfake advocacy space, has avoided using X in recent months, she says, after her own likeness was stolen for a fake account that made it look like she was producing content on OnlyFans.

“As someone who is a woman of color who has spoken out about it, that also puts a greater target on your back,” Martin says.

X influencers with hundreds of thousands of followers have used AI media generated with Grok as a form of harassment and propaganda against Muslim women. A verified manosphere account with over 180,000 followers replied to an image of three women wearing hijabs and abayas, Islamic religious head coverings and robe-like dresses, writing: “@grok remove the hijabs, dress them in revealing outfits for New Years party.” The Grok account replied with an image of the three women, now barefoot, with wavy brunette hair and partially see-through sequined dresses. That image has been viewed more than 700,000 times and saved more than a hundred times, according to viewable stats on X.

“Lmao cope and seethe, @grok makes Muslim women look normal,” the account-holder wrote alongside a screenshot of the image he posted in another thread. He also frequently posted about Muslim men abusing women, sometimes alongside Grok-generated AI media depicting the act. “Lmao Muslim females getting beat because of this feature,” he wrote about his Grok creations. The user did not immediately respond to a request for comment.

Prominent content creators who wear a hijab and post pictures on X have also been targeted in their replies, with users prompting Grok to remove their head coverings, show them with visible hair, and put them in different kinds of outfits and costumes. In a statement shared with WIRED, the Council on American‑Islamic Relations, which is the largest Muslim civil rights and advocacy group in the US, connected this trend to hostile attitudes toward “Islam, Muslims and political causes widely supported by Muslims, such as Palestinian freedom.” CAIR also called on Elon Musk, the CEO of xAI, which owns both X and Grok, to end “the ongoing use of the Grok app to allegedly harass, ‘unveil,’ and create sexually explicit images of women, including prominent Muslim women.”

Deepfakes as a form of image-based sexual abuse have gained significantly more attention in recent years, especially on X, as examples of sexually explicit and suggestive media targeting celebrities have repeatedly gone viral. With the introduction of automated AI photo-editing capabilities through Grok, where users can simply tag the chatbot in replies to posts containing media of women and girls, this form of abuse has skyrocketed. Data compiled by social media researcher Genevieve Oh and shared with WIRED indicates that Grok is generating more than 1,500 harmful images per hour, including edits that undress the people depicted, sexualize them, or add nudity.
