How Microsoft Can Win the Real Battle of the AI Race in Both the Hard and Soft Areas
Since last Friday (17 Nov), we have watched the OpenAI drama play out, with the CEO ousted from the company he co-founded. As for the reason given, to this day I still do not get it.
I would rather focus on what happened after that: on Monday (20 Nov), Microsoft announced that it had hired Sam Altman, co-founder of ChatGPT maker OpenAI.
So in this article, I would like to share why $Microsoft(MSFT)$ might have a better competitive advantage in the real battle of the AI race.
As we know, the most common hardware that data centers run today for AI and machine learning workloads comes from $NVIDIA Corp(NVDA)$. But that covers only the hard area of things.
With the trend moving towards Artificial General Intelligence (AGI), I think it is important for a company to have an advantage in both the hard and soft areas.
Hard Area
At the recent Microsoft Ignite, the company unveiled two custom-designed chips:
The Microsoft Azure Maia AI Accelerator
The Microsoft Azure Cobalt CPU
Microsoft has also built integrated systems around these two chips. The Microsoft Azure Maia AI Accelerator is designed and optimized for artificial intelligence (AI) tasks and generative AI.
The Microsoft Azure Cobalt CPU is an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud.
Microsoft’s Azure Maia AI chip and Arm-powered Azure Cobalt CPU are arriving in 2024, on the back of a surge in demand this year for Nvidia’s H100 GPUs that are widely used to train and operate generative image tools and large language models.
This is not Microsoft's first venture into the hard area (i.e. silicon). More than 20 years ago, Microsoft collaborated on silicon for the Xbox, and it has also co-engineered chips for its Surface devices.
So I believe the architecture of the new chips, both designed in-house at Microsoft, builds on that Xbox experience. The journey started back in 2017, when Microsoft began architecting its cloud hardware stack, putting it on track to build these new custom chips.
Soft Area
With the latest OpenAI saga coming to a close, $Microsoft(MSFT)$ now has its existing Maia team plus an incoming team led by the ex-OpenAI CEO. Will they be able to push further on the soft area of the AI race?
The reason we are all focused on ChatGPT, which is built on GPT technology, is that it can be seen as a first step towards Artificial General Intelligence (AGI).
There is a lot more to be done. While the version of GPT-4 currently available to the public is impressive, it is not the end of the road. There are groups working on additions to GPT-4 that are more goal-driven, meaning you can give the system an instruction such as "Design and build a website on (topic)."
The system will then figure out exactly what subtasks need to be completed, and in what order, to achieve that goal.
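To make that idea more concrete, here is a minimal sketch of such a goal-driven loop written in Python on top of a chat model. It assumes the openai Python package (v1 chat interface) and access to a GPT-4 class model; the prompts and structure are purely illustrative, not how OpenAI or Microsoft actually build their agents.

```python
# Minimal sketch of a goal-driven "plan, then execute" loop on top of a chat model.
# Assumptions: openai Python package v1.x, an OPENAI_API_KEY in the environment,
# and access to a GPT-4 class model. Prompts and structure are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    """Send a single prompt to the model and return its text reply."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


def run_goal(goal: str) -> None:
    # 1. Ask the model to break the goal into ordered subtasks.
    plan = ask(f"Break this goal into a short numbered list of subtasks:\n{goal}")
    subtasks = [line for line in plan.splitlines() if line.strip()]

    # 2. Work through the subtasks in order, carrying results forward as context.
    context = ""
    for subtask in subtasks:
        result = ask(f"Goal: {goal}\nPrevious work:\n{context}\nNow do: {subtask}")
        context += f"\n{subtask}\n{result}\n"
        print(f"--- {subtask} ---\n{result}\n")


if __name__ == "__main__":
    run_goal("Design and build a website on home coffee brewing.")
```

Loops like this are exactly where reliability breaks down today: one poorly planned or poorly executed subtask derails everything that depends on it.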
Sad to say, today these systems are not particularly reliable and frequently fail to reach the stated goal. But they will almost certainly get better in the future.
Microsoft could close the gap, as it would have the team from OpenAI (the same group of people who started and built GPT-4), which could bring much more focus to future builds.
Companies And Their Chip Technologies
While we are on the topic of the hard and soft areas, there are already companies building chips and tools to help them on their AI journey (the list I have put together may not be comprehensive).
We can see that GPU, FPGA, and ASIC technologies have all been used in chips intended for AI. At the end of the day, I think it comes down to how much it costs to prepare the computing resources.
The Microsoft Azure Maia 100 is an ASIC built on TSMC's 5nm node and uses an x86 host. It will be mounted in custom liquid-cooled racks holding up to four chips, will support standard INT8 and INT4 data formats, and will use embedded Ethernet interfaces.
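As a rough illustration of why those low-precision formats matter for cost, here is a back-of-the-envelope calculation of how much memory a model's weights need at different precisions. The parameter counts used are assumptions for illustration, not published figures for any specific model.

```python
# Back-of-the-envelope weight memory for an LLM at different precisions.
# Parameter counts below are illustrative assumptions, not published figures.
BYTES_PER_PARAM = {"FP16": 2.0, "INT8": 1.0, "INT4": 0.5}


def weight_memory_gb(num_params: float, fmt: str) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return num_params * BYTES_PER_PARAM[fmt] / 1e9


for name, params in [("175B-class model", 175e9), ("1T-class model", 1e12)]:
    row = ", ".join(f"{fmt}: {weight_memory_gb(params, fmt):,.0f} GB"
                    for fmt in BYTES_PER_PARAM)
    print(f"{name} -> {row}")

# Example output:
# 175B-class model -> FP16: 350 GB, INT8: 175 GB, INT4: 88 GB
# 1T-class model -> FP16: 2,000 GB, INT8: 1,000 GB, INT4: 500 GB
```

Halving or quartering the weight footprint directly reduces how many accelerators are needed to hold one copy of a model, which feeds straight into the cost of preparing computing resources.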
ASIC Advantage In Performance, Energy Efficiency and User Cost
Below is a simple table comparing GPU, FPGA, and ASIC across these categories. I would say performance and energy efficiency are especially important for training the large language models that GPT is built on.
As we can see, LLMs differ by parameter count: the more parameters, generally the more capable the model, but that requires far more computing power (which draws energy like crazy).
Even being able to afford that much compute does not guarantee performance; efficiency matters too. That is why I think going with ASIC chips is a good decision for Microsoft.
OpenAI has not disclosed GPT-4's parameter count, but it is widely believed to be far larger than GPT-3's 175 billion, and with news of GPT-5 coming, we can reasonably expect parameter counts to keep climbing.
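To get a feel for how quickly the compute bill grows with model size, here is a rough estimate using the common approximation that training takes about 6 FLOPs per parameter per training token. The parameter and token counts, hardware throughput, and utilization figure are all assumptions for illustration only.

```python
# Rough training-compute estimate using the common approximation
# C ~ 6 * N * D (FLOPs ~ 6 x parameters x training tokens).
# All parameter/token counts below are assumptions for illustration only.
def training_flops(params: float, tokens: float) -> float:
    return 6.0 * params * tokens


scenarios = [
    ("175B params, 2T tokens", 175e9, 2e12),
    ("1T params, 10T tokens", 1e12, 10e12),
]

for label, n, d in scenarios:
    flops = training_flops(n, d)
    # Assume an H100-class accelerator with roughly 1e15 FLOP/s peak at FP16
    # and ~40% utilization, purely as a back-of-the-envelope figure.
    gpu_seconds = flops / (1e15 * 0.4)
    print(f"{label}: ~{flops:.1e} FLOPs, roughly {gpu_seconds / 86400:,.0f} GPU-days")
```

Going from the first scenario to the second multiplies the training bill by roughly thirty times, which is why per-chip efficiency (the ASIC argument above) matters at least as much as a big budget.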
Summary
Based on the information I have gathered, I think a company with an advantage in both the hard and soft areas of the AI race would be in a better position.
I will be following closely what the ex-OpenAI CEO brings over to Microsoft, and whether OpenAI can still continue building GPT-5.
I would appreciate it if you could share your thoughts in the comments: do you think Microsoft, with advantages in both the hard and soft areas of the AI race, will lead the pack in 2024?
@TigerStars @Daily_Discussion @TigerWire I would appreciate it if you could feature this article so that fellow Tigers can benefit from my investing and trading thoughts.
Disclaimer: The analysis and results presented do not recommend or suggest investing in the said stock. This is purely for analysis.
Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation on acquiring or disposing of any financial products, any associated discussions, comments, or posts by author or other users should not be considered as such either. It is solely for general information purpose only, which does not consider your own investment objectives, financial situations or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information, investors should do their own research and may seek professional advice before investing.

It doesn't matter what I think; it's all about what NVDA has to say tonight after the close. So if you really knew what your favourite stocks were going to do, you would buy them before NVDA reports, or short them.
In a Bloomberg interview, the MSFT CEO said he had heard no reason for Altman's firing.
Why would Microsoft hire a guy who was recently fired without knowing the circumstances of his firing!?
Microsoft is embracing AI, which is the future of all things computing and robotics.
Looks optimistic, but the market is just pulling up too fast.
It looks like this will happen, congrats to MSFT