Technophile News
News

A New Kind of AI Model Lets Data Owners Take Control

By News Room · 9 July 2025 · 3 Mins Read

A new kind of large language model, developed by researchers at the Allen Institute for AI (Ai2), makes it possible to control how training data is used even after a model has been built.

The new model, called FlexOlmo, could challenge the current industry paradigm of big artificial intelligence companies slurping up data from the web, books, and other sources—often with little regard for ownership—and then owning the resulting models entirely. Once data is baked into an AI model today, extracting it from that model is a bit like trying to recover the eggs from a finished cake.

“Conventionally, your data is either in or out,” says Ali Farhadi, CEO of Ai2, based in Seattle, Washington. “Once I train on that data, you lose control. And you have no way out, unless you force me to go through another multi-million-dollar round of training.”

Ai2’s new approach divides up training so that data owners can exert control. Those who want to contribute data to a FlexOlmo model can do so by first copying a publicly shared model known as the “anchor.” They then train a second model using their own data, combine the result with the anchor model, and contribute the result back to whoever is building the third and final model.
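Schematically, and purely as an illustration (the article does not describe FlexOlmo's actual training code, and the names, shapes, and "training" step below are invented), the contribution flow looks something like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Publicly shared "anchor" model, reduced here to a single weight matrix.
anchor = rng.normal(size=(4, 4))

def train_local_expert(anchor_weights, local_data):
    """Copy the anchor, then adapt the copy on data that never leaves the owner.
    The update below is a toy stand-in for real gradient-based training."""
    expert = anchor_weights.copy()
    for x in local_data:
        expert += 0.01 * np.outer(x, x)
    return expert

owner_data = [rng.normal(size=4) for _ in range(10)]
expert = train_local_expert(anchor, owner_data)

# Only the trained expert's weights are contributed -- never owner_data itself.
contribution = {"owner": "magazine-archive", "weights": expert}
```

The point of the sketch is the data flow: the owner's raw data stays local, and only the derived sub-model weights are handed to whoever assembles the final model.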

Contributing in this way means that the data itself never has to be handed over. And because of how the data owner’s model is merged with the final one, it is possible to extract the data later on. A magazine publisher might, for instance, contribute text from its archive of articles to a model but later remove the sub-model trained on that data if there is a legal dispute or if the company objects to how a model is being used.

“The training is completely asynchronous,” says Sewon Min, a research scientist at Ai2 who led the technical work. “Data owners do not have to coordinate, and the training can be done completely independently.”

The FlexOlmo model architecture is what’s known as a “mixture of experts,” a popular design normally used to combine several sub-models into a bigger, more capable one. A key innovation from Ai2 is a way of merging sub-models that were trained independently. This is achieved using a new scheme for representing the values in a model so that its abilities can be merged with others when the final combined model is run.
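As a toy illustration of the mixture-of-experts idea (not Ai2's actual merging scheme), independently trained experts can be combined at inference time by a gating function, and a contributor's expert can later be dropped without retraining the others:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4

# Independently trained experts, each reduced here to one linear map.
experts = {"anchor": rng.normal(size=(d, d)),
           "news":   rng.normal(size=(d, d)),
           "books":  rng.normal(size=(d, d))}
# One gate vector per expert (hypothetical router parameters).
router = {name: rng.normal(size=d) for name in experts}

def moe_forward(x, experts, router):
    """Softmax-gated combination of the active experts' outputs."""
    names = list(experts)
    logits = np.array([router[n] @ x for n in names])
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()
    return sum(g * (experts[n] @ x) for g, n in zip(gates, names))

x = rng.normal(size=d)
y_full = moe_forward(x, experts, router)

# Opt-out: the "news" contributor withdraws its sub-model; the remaining
# experts still produce an output with no retraining required.
experts.pop("news")
y_reduced = moe_forward(x, experts, router)
```

Because each expert is a self-contained module, removal is just deleting it from the set of active experts, which is what makes the opt-out described above cheap compared with retraining from scratch.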

To test the approach, the FlexOlmo researchers created a dataset they call Flexmix from proprietary sources including books and websites. They used the FlexOlmo design to build a model with 37 billion parameters, about a tenth of the size of the largest open source model from Meta. They then compared their model to several others. They found that it outperformed any individual model on all tasks and also scored 10 percent better at common benchmarks than two other approaches for merging independently trained models.

The result is a way to have your cake—and get your eggs back, too. “You could just opt out of the system without any major damage at inference time,” Farhadi says. “It’s a whole new way of thinking about how to train these models.”
