Is this the future of AI? A new decentralized model is unveiled
If this works, you could soon recruit people to help train your own AI model
The AI company PrimeIntellect recently started training a new 10 billion parameter model, a task it hopes to complete using the help of users around the world.
On its blog, PrimeIntellect said its new model, INTELLECT-1, will be the product of the first decentralized training run at this scale. That still falls well short of even four-year-old models like OpenAI’s GPT-3, which featured 175 billion parameters.
The project grew out of research on an open-source implementation of globally distributed AI model training. The method worked for a 1-billion-parameter model, and the next step is to scale it up by a factor of ten.
Size isn't everything, though. Newer models like Microsoft's Phi and Meta's Llama are proving you can achieve GPT-3- and even GPT-4-level performance with a fraction of the parameters through efficiency improvements.
This brings us one step closer towards open source AGI
PrimeIntellect
The company’s goal is to find a way to make decentralized training a reality to ensure that the next generation of AI, artificial general intelligence (AGI), is open-source, transparent, and accessible. This reduces the risk of only a few large companies having access to this advanced technology.
For now, users can only contribute to the project through the company’s own platform. You can do this by renting GPUs that PrimeIntellect has selected, specifically NVIDIA H100 Tensor Core GPUs, which cost around $20 per hour to run. In the future, though, you should be able to contribute to the model’s training with your own hardware.
Training is split across separate clusters of devices, each processing data to train the AI model. The framework lets these clusters synchronize their progress while communicating with each other far less frequently, reducing bandwidth requirements. It can also handle nodes joining or leaving mid-run without crashing the system.
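The idea of training locally and synchronizing only occasionally can be sketched in a few lines of Python. Everything below — the worker count, step counts, and toy objective — is an invented illustration of the general technique (often called local SGD), not PrimeIntellect's actual code:

```python
# Illustrative sketch of infrequent synchronization between clusters.
# All numbers and the toy objectives are made up for illustration.

NUM_WORKERS = 4       # independent GPU clusters
LOCAL_STEPS = 100     # inner steps between synchronizations
LEARNING_RATE = 0.1
TARGETS = [2.0, 3.0, 4.0, 5.0]   # each cluster sees slightly different data

def local_grad(w, target):
    # Toy gradient of f(w) = (w - target)^2, standing in for a model's gradient
    return 2.0 * (w - target)

def train(sync_rounds):
    weights = [0.0] * NUM_WORKERS   # every cluster starts from the same state
    for _ in range(sync_rounds):
        # Each cluster trains independently for LOCAL_STEPS steps...
        for i in range(NUM_WORKERS):
            for _ in range(LOCAL_STEPS):
                weights[i] -= LEARNING_RATE * local_grad(weights[i], TARGETS[i])
        # ...then all clusters synchronize once by averaging their weights,
        # so communication happens once per LOCAL_STEPS steps, not every step.
        avg = sum(weights) / NUM_WORKERS
        weights = [avg] * NUM_WORKERS
    return weights[0]

print(train(sync_rounds=5))  # converges toward the average optimum, 3.5
```

Cutting communication from every step to every hundred steps is what makes training over ordinary internet connections plausible, since the clusters no longer need datacenter-grade interconnects between them.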
Nodes that join a training run already in progress need to be brought up to speed with the latest state of the model before they can contribute. Delays in this catch-up process have been addressed by having new nodes request checkpoints from their peers.
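A rough sketch of that peer-to-peer catch-up might look like the following; the class and method names here are hypothetical, invented purely for illustration:

```python
# Hypothetical sketch of a node joining mid-run and catching up by
# requesting the latest checkpoint from a peer instead of starting over.
# Names and structure are illustrative assumptions, not PrimeIntellect's API.

class Node:
    def __init__(self, node_id):
        self.node_id = node_id
        self.step = 0
        self.weights = None   # model state; a real system would hold tensors

    def serve_checkpoint(self):
        # A peer already training shares its latest state on request
        return {"step": self.step, "weights": self.weights}

    def join(self, peers):
        # A new node asks the most up-to-date peer for a checkpoint so it
        # can start contributing immediately instead of from step 0
        latest = max(peers, key=lambda p: p.step)
        ckpt = latest.serve_checkpoint()
        self.step = ckpt["step"]
        self.weights = ckpt["weights"]

# Usage: two peers are mid-run; a newcomer joins and syncs to the freshest state
a, b = Node("a"), Node("b")
a.step, a.weights = 480, [0.1, 0.2]
b.step, b.weights = 500, [0.3, 0.4]
newcomer = Node("c")
newcomer.join([a, b])
print(newcomer.step)  # 500
```

Fetching state from peers rather than a single server means no one machine becomes a bottleneck as volunteers come and go.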
What happens next for INTELLECT-1
INTELLECT-1 is based on the Llama-3 architecture and is being trained on four different datasets. It’s mainly being trained on a Hugging Face dataset called FineWeb-Edu, which contains content from educational web pages.
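Training "mainly" on one dataset out of several usually means sampling batches from the corpora in weighted proportions. The sketch below shows that general pattern; the mixing weights and the placeholder dataset names are invented, since the article only identifies FineWeb-Edu as the main source:

```python
# Illustrative sketch of mixing several training datasets with a bias
# toward one of them. Weights and the "dataset-b/c/d" placeholders are
# assumptions for illustration; the real proportions aren't stated.
import random

DATASETS = {
    "fineweb-edu": 0.55,   # assumed main share
    "dataset-b": 0.20,     # placeholders for the other three corpora
    "dataset-c": 0.15,
    "dataset-d": 0.10,
}

def sample_source(rng):
    # Pick which dataset the next training batch comes from,
    # proportionally to the mixing weights
    names = list(DATASETS)
    return rng.choices(names, weights=[DATASETS[n] for n in names], k=1)[0]

rng = random.Random(0)   # fixed seed for a reproducible demo
counts = {name: 0 for name in DATASETS}
for _ in range(10_000):
    counts[sample_source(rng)] += 1
print(counts)  # fineweb-edu dominates, roughly matching its weight
```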
In the future, PrimeIntellect wants to train even larger models and build tools that let anyone launch a similar AI training project to which other users can contribute their processing power.
More from Tom's Guide
- Apple is bringing iPhone Mirroring to macOS Sequoia — here’s what we know
- iOS 18 supported devices: Here are all the compatible iPhones
- Apple Intelligence unveiled — all the new AI features coming to iOS 18, iPadOS 18 and macOS Sequoia

Christoph Schwaiger is a journalist, mainly covering technology, health, and current affairs. His stories have been published by Tom's Guide, Live Science, New Scientist, and the Global Investigative Journalism Network, among other outlets. Christoph has appeared on LBC and Times Radio. Additionally, he previously served as a National President for Junior Chamber International (JCI), a global leadership organization, and graduated cum laude from the University of Groningen in the Netherlands with an MA in journalism. You can follow him on X (Twitter) @cschwaigermt.