
Distillation Can Make AI Models Smaller and Cheaper

By News Room · 20 September 2025 · 5 Mins Read

The original version of this story appeared in Quanta Magazine.

The Chinese AI company DeepSeek released a chatbot earlier this year called R1, which drew a huge amount of attention. Most of it focused on the fact that a relatively small and unknown company said it had built a chatbot that rivaled the performance of those from the world’s most famous AI companies, but using a fraction of the computer power and cost. As a result, the stocks of many Western tech companies plummeted; Nvidia, which sells the chips that run leading AI models, lost more stock value in a single day than any company in history.

Some of that attention involved an element of accusation. Sources alleged that DeepSeek had obtained, without permission, knowledge from OpenAI’s proprietary o1 model by using a technique known as distillation. Much of the news coverage framed this possibility as a shock to the AI industry, implying that DeepSeek had discovered a new, more efficient way to build AI.

But distillation, also called knowledge distillation, is a widely used tool in AI, a subject of computer science research going back a decade and a tool that big tech companies use on their own models. “Distillation is one of the most important tools that companies have today to make models more efficient,” said Enric Boix-Adsera, a researcher who studies distillation at the University of Pennsylvania’s Wharton School.

Dark Knowledge

The idea for distillation began with a 2015 paper by three researchers at Google, including Geoffrey Hinton, the so-called godfather of AI and a 2024 Nobel laureate. At the time, researchers often ran ensembles of models—“many models glued together,” said Oriol Vinyals, a principal scientist at Google DeepMind and one of the paper’s authors—to improve their performance. “But it was incredibly cumbersome and expensive to run all the models in parallel,” Vinyals said. “We were intrigued with the idea of distilling that onto a single model.”


The researchers thought they might make progress by addressing a notable weak point in machine-learning algorithms: Wrong answers were all considered equally bad, regardless of how wrong they might be. In an image-classification model, for instance, “confusing a dog with a fox was penalized the same way as confusing a dog with a pizza,” Vinyals said. The researchers suspected that the ensemble models did contain information about which wrong answers were less bad than others. Perhaps a smaller “student” model could use the information from the large “teacher” model to more quickly grasp the categories it was supposed to sort pictures into. Hinton called this “dark knowledge,” invoking an analogy with cosmological dark matter.

After discussing this possibility with Hinton, Vinyals developed a way to get the large teacher model to pass more information about the image categories to a smaller student model. The key was homing in on “soft targets” in the teacher model—where it assigns probabilities to each possibility, rather than firm this-or-that answers. One model, for example, calculated that there was a 30 percent chance that an image showed a dog, 20 percent that it showed a cat, 5 percent that it showed a cow, and 0.5 percent that it showed a car. By using these probabilities, the teacher model effectively revealed to the student that dogs are quite similar to cats, not so different from cows, and quite distinct from cars. The researchers found that this information would help the student learn how to identify images of dogs, cats, cows, and cars more efficiently. A big, complicated model could be reduced to a leaner one with barely any loss of accuracy.
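
To make the soft-target recipe concrete, here is a minimal sketch of a distillation loss in PyTorch. The temperature and weighting values are illustrative assumptions, not settings from the original paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend the usual hard-label loss with a soft-target loss.

    A temperature above 1 flattens the teacher's distribution, exposing
    the "dark knowledge" about near-miss categories (dog vs. cat vs. car)
    that a one-hot label hides.
    """
    # Soft targets: KL divergence between the softened distributions.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(log_soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2

    # Hard targets: standard cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In training, the teacher logits would come from a frozen forward pass of the large model; only the student's weights are updated. The factor of temperature squared keeps the soft-target gradients on a comparable scale to the hard-label ones.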

Explosive Growth

The idea was not an immediate hit. The paper was rejected from a conference, and Vinyals, discouraged, turned to other topics. But distillation arrived at an important moment. Around this time, engineers were discovering that the more training data they fed into neural networks, the more effective those networks became. The size of models soon exploded, as did their capabilities, but the costs of running them climbed in step with their size.

Many researchers turned to distillation as a way to make smaller models. In 2018, for instance, Google researchers unveiled a powerful language model called BERT, which the company soon began using to help parse billions of web searches. But BERT was big and costly to run, so the next year, other developers distilled a smaller version sensibly named DistilBERT, which became widely used in business and research. Distillation gradually became ubiquitous, and it’s now offered as a service by companies such as Google, OpenAI, and Amazon. The original distillation paper, still published only on the arxiv.org preprint server, has now been cited more than 25,000 times.
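
As a quick illustration of how accessible such distilled models have become, DistilBERT can be loaded in a few lines with the Hugging Face transformers library, one common distribution channel today (not something mentioned in the original paper):

```python
# Load the distilled model and its tokenizer from the Hugging Face hub.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

# Encode a sentence and run it through the smaller student model.
inputs = tokenizer("Distillation makes models smaller.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, 768)
```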

Because distillation requires access to the innards of the teacher model, it’s not possible for a third party to sneakily distill data from a closed-source model like OpenAI’s o1, as DeepSeek was thought to have done. That said, a student model could still learn quite a bit from a teacher model just by prompting the teacher with certain questions and using the answers to train its own model—an almost Socratic approach to distillation.
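
A toy sketch of that prompting-based approach is below; the ask_teacher function is a hypothetical stand-in for a real model API, and nothing here comes from DeepSeek or OpenAI.

```python
def ask_teacher(prompt: str) -> str:
    """Hypothetical stand-in for querying a closed teacher model's API.

    Only the teacher's *answers* are visible; its logits and weights
    never are, which is what distinguishes this from classic distillation.
    """
    canned = {
        "What is 2 + 2?": "4",
        "Name a mammal that flies.": "A bat.",
    }
    return canned.get(prompt, "I don't know.")

# Build a supervised fine-tuning set from (prompt, answer) pairs.
prompts = ["What is 2 + 2?", "Name a mammal that flies."]
training_pairs = [(p, ask_teacher(p)) for p in prompts]

# In practice these pairs would feed an ordinary fine-tuning loop for
# the student model.
print(training_pairs)
```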

Meanwhile, other researchers continue to find new applications. In January, the NovaSky lab at UC Berkeley showed that distillation works well for training chain-of-thought reasoning models, which use multistep “thinking” to better answer complicated questions. The lab says its fully open source Sky-T1 model cost less than $450 to train, and it achieved similar results to a much larger open source model. “We were genuinely surprised by how well distillation worked in this setting,” said Dacheng Li, a Berkeley doctoral student and co-student lead of the NovaSky team. “Distillation is a fundamental technique in AI.”


Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.
