One of the biggest drawbacks of using third-party components, in any device, is that you depend on what other manufacturers do, or don't do. This poses a risk to innovation and, of course, the possibility that costs will change from one day to the next. Your room for maneuver, then, is minimal. Microsoft is aware of this situation and has finally taken the step of developing the Maia 100 and Cobalt 100, its own processors for its data centers.
The Microsoft Maia 100 and Cobalt 100, although they have different purposes, share several features. The most important: both are built on the ARM architecture. This technology, whose popularity skyrocketed with the rise of mobile devices, has gradually made its way into computers and is now targeting data centers. Compared with the x86 architecture, ARM places a stronger emphasis on power efficiency alongside performance.
The main objective of the Microsoft Maia 100, which packs 105 billion transistors, is to become the brain of Microsoft's artificial intelligence solutions. Remember that the Redmond company, through Azure, offers services for tasks associated with AI. Implementing an in-house chip not only allows it to innovate at its own pace, but also saves costs.
An important point is that, when designing the Maia 100, Microsoft had scalability in mind. An artificial intelligence infrastructure must be ready to grow when required, whether to serve a greater number of customers or to offer new services. In either case, the chip can cope with this growth without problems.
OpenAI will benefit from the Microsoft Maia 100
The first company to benefit from the Microsoft Maia 100 is OpenAI, the company behind ChatGPT. Its services run on Azure, so the new artificial intelligence processor fits it like a glove. In fact, OpenAI engineers were involved in the development of the Maia 100.
“We were excited when Microsoft first shared its designs for the Maia chip, and we worked together to refine it and test it with our models. Azure’s AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making them cheaper for our customers,” said Sam Altman, CEO of OpenAI.
For its part, the Cobalt 100 is intended to handle other Azure services. It is a real beast, integrating 128 cores. Energy efficiency is also key: according to Microsoft, the Cobalt 100 consumes 40% less energy than the chips it used previously. The difference is dramatic.
“With these additions to the Azure infrastructure hardware portfolio, our platform allows us to deliver the best performance and efficiency across all workloads,” Microsoft added.
Note that the presentation of the Maia 100 and Cobalt 100 does not mean that Microsoft will say goodbye to its partnerships with NVIDIA and AMD, its main suppliers of chips for data centers. They will continue to work together, using those companies' processors for other tasks. The new chips will come into operation next year.