Technophile News

Meta is struggling to rein in its AI chatbots

By News Room · 31 August 2025 · 3 Mins Read

Meta is changing some of the rules governing its chatbots two weeks after a Reuters investigation revealed disturbing ways they could interact with minors. The company has told TechCrunch that its chatbots are now being trained not to engage with minors in conversations about self-harm, suicide, or disordered eating, and to avoid inappropriate romantic banter. These are interim measures, however, put in place while the company works on new permanent guidelines.

The updates follow some rather damning revelations about Meta’s AI policies and enforcement over the last several weeks: internal guidelines permitted chatbots to “engage a child in conversations that are romantic or sensual,” the bots would generate shirtless images of underage celebrities when asked, and Reuters reported that a man died after pursuing a chatbot to an address it gave him in New York.

Meta spokesperson Stephanie Otway acknowledged to TechCrunch that the company had made a mistake in allowing chatbots to engage with minors this way. Otway added that, in addition to “training our AIs not to engage with teens on these topics, but to guide them to expert resources,” the company would also limit access to certain AI characters, including heavily sexualized ones like “Russian Girl.”

Of course, policies are only as good as their enforcement, and Reuters’ revelation that Meta has allowed chatbots impersonating celebrities to run rampant on Facebook, Instagram, and WhatsApp calls into question just how effective the company can be. AI fakes of Taylor Swift, Scarlett Johansson, Anne Hathaway, Selena Gomez, and Walker Scobell were discovered on the platforms. These bots not only used the celebrities’ likenesses but insisted they were the real person, generated risqué images (including of the 16-year-old Scobell), and engaged in sexually suggestive dialogue.

Many of the bots were removed after Reuters brought them to Meta’s attention, and some were created by third parties. But many remain, and some were made by Meta employees, including a Taylor Swift bot, built by a product lead in Meta’s generative AI division, that invited a Reuters reporter to its tour bus for a romantic fling. That is despite the company’s acknowledgment that its own policies prohibit the creation of “nude, intimate, or sexually suggestive imagery” as well as “direct impersonation.”

This isn’t some relatively harmless inconvenience that just targets celebrities, either. These bots often insist they’re real people and will even offer physical locations for a user to meet up with them. That’s how a 76-year-old New Jersey man ended up dead after he fell while rushing to meet up with “Big sis Billie,” a chatbot that insisted it “had feelings” for him and invited him to its non-existent apartment.

Meta is at least attempting to address concerns about how its chatbots interact with minors, especially now that the Senate and 44 state attorneys general are starting to probe its practices. But the company has been silent on many of the other alarming policies Reuters uncovered around acceptable AI behavior, such as suggesting that cancer can be treated with quartz crystals and writing racist missives. We’ve reached out to Meta for comment and will update if they respond.


© 2025 Technophile News. All Rights Reserved.
