Google's Cloud TPU v4 Outperforms Nvidia A100 in AI Chip Market
Google recently published a paper detailing the performance of its Cloud TPU v4 supercomputing platform, claiming that TPU v4 is 1.2-1.7 times faster than the Nvidia A100 in comparably sized systems while consuming 1.3-1.9 times less power. This raises the question of whether Google could take AI chip market share from Nvidia. So, what is Google's strategy in the AI chip market?
📌 What's the difference between TPU and GPU?
GPUs were originally designed for rendering 3D graphics, but over time their parallel computing capabilities made them a popular choice for AI workloads. The TPU (Tensor Processing Unit), by contrast, was developed by Google. It is an ASIC (application-specific integrated circuit) built specifically for the computational needs of machine learning, accelerating AI calculations and algorithms, and it is tailored to Google's open-source machine learning framework, TensorFlow. Google designed the TPU as a matrix processor rather than a general-purpose processor (machine learning is essentially large-scale matrix math), avoiding the frequent memory accesses that slow down GPUs and CPUs. The TPU's biggest disadvantages are that it is more expensive than GPUs and CPUs and far less broadly applicable. A rough sketch of what "tailored to TensorFlow" looks like in practice follows below.
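The sketch below (assuming a Cloud TPU VM or Colab TPU runtime is attached; otherwise the resolver call fails) connects TensorFlow to a TPU and runs a single matrix multiplication on it, the kind of operation the TPU's matrix units are built for. The shapes and function names are illustrative only.

```python
# Minimal sketch: running a matrix multiplication on a Cloud TPU with TensorFlow.
# Assumes a TPU runtime (e.g., a Cloud TPU VM or Colab TPU) is available;
# without one, TPUClusterResolver() will raise and you would fall back to CPU/GPU.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver()  # locate the TPU
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

@tf.function
def matmul_step(a, b):
    # Dense matrix multiply: exactly the workload TPU matrix units target.
    return tf.matmul(a, b)

a = tf.random.normal([1024, 1024])
b = tf.random.normal([1024, 1024])

# strategy.run executes the traced function on the TPU cores.
result = strategy.run(matmul_step, args=(a, b))
print(result)
```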
📌 Main Application Areas of TPU
TPU v4 has been running inside Google since 2020, where the AI teams use it for research and production workloads such as cross-lingual language models, recommendation systems, and generative AI. For recommendation systems, Google states that its TPU supercomputer is also the first hardware to support embeddings, a key component of the deep learning recommendation models (DLRM) behind advertising, search ranking, YouTube, and Google Play (a minimal sketch of that embedding stage follows below). In addition, Google Cloud offers TPUs as a rental service.
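To make the embedding point concrete, here is a minimal, hypothetical sketch of the embedding-lookup stage of a DLRM-style recommender in TensorFlow/Keras. The vocabulary sizes, embedding width, and feature names are assumptions for illustration, not Google's production configuration.

```python
# Minimal sketch of the embedding stage of a DLRM-style recommendation model.
# Vocabulary sizes, dimensions, and layer layout are illustrative assumptions only.
import tensorflow as tf

NUM_USERS = 100_000   # hypothetical vocabulary of user IDs
NUM_ITEMS = 50_000    # hypothetical vocabulary of item IDs
EMBED_DIM = 64        # width of each embedding vector

user_ids = tf.keras.Input(shape=(), dtype=tf.int32, name="user_id")
item_ids = tf.keras.Input(shape=(), dtype=tf.int32, name="item_id")

# Embedding tables map sparse categorical IDs to dense vectors -- the lookups
# that Google says TPU v4 supports in hardware.
user_vec = tf.keras.layers.Embedding(NUM_USERS, EMBED_DIM)(user_ids)
item_vec = tf.keras.layers.Embedding(NUM_ITEMS, EMBED_DIM)(item_ids)

# A small MLP on top of the concatenated embeddings predicts click probability.
x = tf.keras.layers.Concatenate()([user_vec, item_vec])
x = tf.keras.layers.Dense(128, activation="relu")(x)
output = tf.keras.layers.Dense(1, activation="sigmoid", name="click_prob")(x)

model = tf.keras.Model(inputs=[user_ids, item_ids], outputs=output)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```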
📌 What Are Google's Ambitions?
Few companies outside of Google use this technology, primarily because TPUs are not sold publicly and because Google Cloud has a relatively small share of the cloud market. According to the Synergy Research Group, Google holds 11% of the market, trailing its hyperscale competitors Amazon AWS and Microsoft Azure, which hold 34% and 21%, respectively. Google also has an agreement with Nvidia to offer the H100 to Google Cloud customers, reflecting Nvidia's position as a market leader that is likely to remain stable for some time. Google, however, intends to make sure it does not fall behind in AI hardware and software. For now, TPU servers are available only through its own cloud service, there is no plan for large-scale external sales, and hardware prices have not been announced.
📌 Nvidia's New Generation H100
The Nvidia A100 was first released in 2020; the roughly $10,000 chip has become one of the most critical tools in the artificial intelligence industry. According to New Street Research, Nvidia holds 95% of the market for graphics processing units usable for machine learning. Nvidia began mass-producing the H100 in 2022 and started shipping it that fall, both as individual units and in boards from global manufacturers. In the quarter ending in January, Nvidia's H100 revenue exceeded that of the A100, even though the H100 carries a higher unit price. Compared with the previous-generation A100, the H100 delivers almost 9 times faster AI training and nearly 30 times faster AI inference; based on these figures, it is reasonable to speculate that the H100 outperforms TPU v4. Large H100 customers include:
✦Meta: Grand Teton AI supercomputer
✦OpenAI: used on the Azure supercomputer
✦Amazon: AWS was an early customer of H100
✦Stability AI: plans to use H100 to accelerate its upcoming video, 3D, and multimodal models.
📌 How Does Nvidia Respond to TPU?
Nvidia believes the best answer may be a hybrid approach: CPUs for operations where performance is not critical but programmability is essential; GPUs where massive parallelism is needed but a degree of flexibility and programmability is still desired; and ASICs where the algorithms have stabilized and volumes are large, as in deep learning inference. Nvidia has also launched its latest autonomous-driving platform, DRIVE Thor, stating that it achieves higher performance at lower power consumption while preserving the customization flexibility that automotive OEMs require. It also lets other companies and researchers use its accelerator designs through open-source releases, extending the company's influence. In effect, Nvidia positions ASICs as a tool to be used where they make sense, while maintaining its dominance in GPUs and the CUDA software stack.
$Alphabet(GOOG)$ $NVIDIA Corp(NVDA)$
@TigerStars @MaverickTiger @VideoLounge @CaptainTiger @MillionaireTiger @Daily_Discussion
Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not take into account your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may wish to seek professional advice before investing.