Technophile News

A Wikipedia Group Made a Guide to Detect AI Writing. Now a Plug-In Uses It to ‘Humanize’ Chatbots

By News Room · 22 January 2026 · 4 Mins Read

On Saturday, tech entrepreneur Siqi Chen released an open source plug-in for Anthropic’s Claude Code AI assistant that instructs the AI model to stop writing like an AI model.

Called Humanizer, the simple prompt plug-in feeds Claude a list of 24 language and formatting patterns that Wikipedia editors have listed as chatbot giveaways. Chen published the plug-in on GitHub, where it has picked up more than 1,600 stars as of Monday.

“It’s really handy that Wikipedia went and collated a detailed list of ‘signs of AI writing,’” Chen wrote on X. “So much so that you can just tell your LLM to … not do that.”

The source material is a guide from WikiProject AI Cleanup, a group of Wikipedia editors who have been hunting AI-generated articles since late 2023. French Wikipedia editor Ilyas Lebleu founded the project. The volunteers have tagged over 500 articles for review and, in August 2025, published a formal list of the patterns they kept seeing.

Chen’s tool is a “skill file” for Claude Code, Anthropic’s terminal-based coding assistant. A skill is a Markdown-formatted file whose written instructions get appended to the prompt fed into the large language model that powers the assistant. Unlike a plain system prompt, skill information is formatted in a standardized way that Claude models are fine-tuned to interpret with greater precision. (Custom skills require a paid Claude subscription with code execution turned on.)
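To illustrate the general shape of a skill file, here is a hypothetical sketch, not Chen’s actual file: a `SKILL.md` with a short metadata header (the `name` and `description` fields are assumptions about the format) followed by plain-language instructions.

```markdown
---
name: humanizer
description: Rewrite prose to avoid common AI-writing giveaways.
---

# Humanizer

When editing or generating prose:

- Replace inflated phrases ("marking a pivotal moment",
  "stands as a testament to") with plain statements of fact.
- Avoid tacked-on "-ing" clauses that fake analysis.
- Cut tourism-brochure adjectives like "breathtaking".
```

The model reads these instructions as added context, so the effect is steering, not a hard guarantee.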

But as with all AI prompts, language models don’t always follow skill files perfectly. So does Humanizer actually work? In our limited testing, Chen’s skill file made the AI agent’s output sound less precise and more casual, but it comes with drawbacks: it won’t improve factuality, and it might harm coding ability.

In particular, some of Humanizer’s instructions might lead you astray, depending on the task. For example, the Humanizer skill includes this line: “Have opinions. Don’t just report facts—react to them. ‘I genuinely don’t know how to feel about this’ is more human than neutrally listing pros and cons.” While being imperfect seems human, this kind of advice would probably not do you any favors if you were using Claude to write technical documentation.

Even with its drawbacks, it’s ironic that one of the web’s most referenced rule sets for detecting AI-assisted writing may help some people subvert it.

Spotting the Patterns

So what does AI writing look like? The Wikipedia guide is specific with many examples, but we’ll give you just one here for brevity’s sake.

Some chatbots love to pump up their subjects with phrases like “marking a pivotal moment” or “stands as a testament to,” according to the guide. They write like tourism brochures, calling views “breathtaking” and describing towns as “nestled within” scenic regions. They tack “-ing” phrases onto the end of sentences to sound analytical: “symbolizing the region’s commitment to innovation.”
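These giveaways are mechanical enough that you can scan for some of them directly. The sketch below checks text against a few of the stock phrases quoted above; the phrase list is a small illustrative sample, not the guide’s full set of 24 patterns.

```python
# Flag stock phrases that the WikiProject AI Cleanup guide lists as
# common chatbot giveaways. The list below is a small sample for
# illustration, not the guide's full pattern set.
GIVEAWAY_PHRASES = [
    "marking a pivotal moment",
    "stands as a testament to",
    "breathtaking",
    "nestled within",
]

def flag_giveaways(text: str) -> list[str]:
    """Return each giveaway phrase found in the text (case-insensitive)."""
    lowered = text.lower()
    return [phrase for phrase in GIVEAWAY_PHRASES if phrase in lowered]

sentence = ("The Statistical Institute of Catalonia was officially "
            "established in 1989, marking a pivotal moment in the "
            "evolution of regional statistics in Spain.")
print(flag_giveaways(sentence))  # → ['marking a pivotal moment']
```

Of course, simple phrase matching only catches the most formulaic tells, which is part of why, as discussed below, AI writing detection is unreliable in general.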

To work around those rules, the Humanizer skill tells Claude to replace inflated language with plain facts and offers this example transformation:

Before: “The Statistical Institute of Catalonia was officially established in 1989, marking a pivotal moment in the evolution of regional statistics in Spain.”

After: “The Statistical Institute of Catalonia was established in 1989 to collect and publish regional statistics.”

Claude will read that and do its best as a pattern-matching machine to create an output that matches the context of the conversation or task at hand.

Why AI Writing Detection Fails

Even with such a confident set of rules from Wikipedia editors, AI writing detectors don’t work reliably, as we’ve previously written: there is nothing inherently unique about human writing that reliably differentiates it from LLM writing.

One reason is that even though most AI language models tend toward certain types of language, they can also be prompted to avoid them, as with the Humanizer skill. (Although sometimes it’s very difficult, as OpenAI found in its yearslong struggle against the em dash.)

© 2026 Technophile News. All Rights Reserved.