Microsoft’s heavy investment in ChatGPT has been no secret. The mega-corporation has been invested in OpenAI for some time, infusing the AI company with billions of dollars. And it’s already paid off: the new Bing with ChatGPT has been a rousing success despite some stumbles.
Now, Microsoft is reportedly making another AI bet, and if it pays off it could give the company a massive advantage in the fast-changing AI landscape.
According to The Information, Microsoft has been secretly working on Project Athena — a first-party chip designed to power the supercomputers that train AI. The hope is the chip will be ready for mass production in 2024.
Currently, Microsoft is a bit behind on this front. While fellow mega-corps like Amazon, Alphabet (Google) and Meta (Facebook) already have their own first-party chips, Microsoft has been reliant on Nvidia for its AI chip supply.
* Would be Microsoft's first in-house AI chip
* Expected to be used in addition to Nvidia AI chips, not fully replace them
* Could reduce ChatGPT operating costs by 33%
* Expected to use 5-nm process TSMC semiconductors
* Could be available for mass production in 2024
This isn’t inherently a bad thing — Nvidia has been making AI chips for over 15 years and is considered the industry leader in AI chip manufacturing. Plus, manufacturing chips in-house has a high barrier to entry in terms of cost.
Project Athena is reported to cost around $100 million a year, which is something only a company the size of Microsoft and its largest competitors can afford.
But long-term, the savings could be huge. The research firm SemiAnalysis says “Athena, if competitive, could reduce the cost per chip by a third when compared with Nvidia’s offerings.” Given that the current estimated cost of operating ChatGPT is around $700,000 per day (SemiAnalysis puts it at roughly $0.36 per prompt), those savings could add up fast. Based on that math alone, Athena could save Microsoft over $84 million per year in operating costs, on top of any performance upgrades the chip could provide.
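The back-of-envelope math behind that figure is simple enough to sketch out, using the SemiAnalysis estimates cited above (illustrative numbers only, and assuming the full one-third chip-cost reduction flows through to operating costs):

```python
# Rough estimate of Athena's potential annual savings, based on the
# SemiAnalysis figures cited above. Purely illustrative.
daily_cost = 700_000          # estimated ChatGPT operating cost per day, USD
cost_reduction = 1 / 3        # Athena's projected cost reduction vs. Nvidia chips

annual_cost = daily_cost * 365
annual_savings = annual_cost * cost_reduction

print(f"Annual operating cost: ${annual_cost:,.0f}")
print(f"Potential annual savings: ${annual_savings:,.0f}")  # ~$85 million
```

That is how the article's "over $84 million" figure falls out; the real number would depend on how much of ChatGPT's daily bill is actually chip cost rather than power, staffing and other overhead.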
It’s unclear when those performance upgrades will come, though; this appears to be first and foremost a cost-saving effort rather than an attempt to overtake Nvidia on the first try. In fact, according to The Information, Microsoft doesn’t even view Athena as a broad replacement for Nvidia’s AI chips, which makes sense given that the two companies recently announced a multiyear collaboration to build a next-generation supercomputer.
Athena is expected to be based on a 5-nanometer process, which is slightly outdated technology compared to the top-of-the-line Nvidia chips, so while it's not impossible that it could outperform an Nvidia AI chip, it feels unlikely.
Ultimately though, this could be a win for consumers. If Microsoft can successfully develop its own AI chip to augment the ones it gets from Nvidia, it could seriously upgrade its supercomputer capacity, which is currently rationed due to shortages of Nvidia’s AI chips. Combine increased capacity with decreased operating costs, and AI could become more accessible for all. And as someone who has frequently received an error message that ChatGPT is at capacity, I’d certainly have no complaints about increased capacity.