ToughCoyote
2023-12-11

$Advanced Micro Devices(AMD)$ has released the AMD Instinct MI300X accelerator, the ROCm 6 open-source software suite with extensive optimizations and support for new LLM features, and Ryzen 8040 series processors equipped with Ryzen AI capabilities. These products put it in competition with $NVIDIA Corp(NVDA)$ and$Semiconductor Bull 3X Shares(SOXL)$ 

AMD's new products

The AMD Instinct MI300X accelerator is built for generative AI, delivering strong large language model (LLM) training and inference performance. In addition, the AMD Instinct MI300A accelerated processing unit (APU) has also been released, combining the latest AMD CDNA 3 architecture with "Zen 4" CPU cores to deliver breakthrough performance for high-performance computing and artificial intelligence workloads.

According to AMD, Microsoft $Microsoft(MSFT)$, Lawrence Livermore National Laboratory's "El Capitan" supercomputer, and Oracle Cloud Infrastructure have all become customers.

The AMD Instinct MI300X accelerator uses the new AMD CDNA 3 architecture. Compared with the previous-generation AMD Instinct MI250X, the MI300X has nearly 40% more compute units, 1.5 times the memory capacity, and 1.7 times the peak theoretical memory bandwidth; it also supports new math formats such as FP8 and sparsity, making it well suited to artificial intelligence and high-performance computing workloads.

With 192GB HBM3 (High Bandwidth Memory) memory capacity and 5.3 TB/s peak memory bandwidth, the AMD Instinct MI300X accelerator delivers the performance needed for surging AI workloads.

The AMD Instinct Platform is a generative AI platform based on an industry-standard OCP design, featuring eight MI300X accelerators with an industry-leading 1.5TB of total HBM3 (High Bandwidth Memory) capacity. Its industry-standard design lets OEM partners build MI300X accelerators into existing AI products, simplifying deployment and accelerating the adoption of servers based on AMD Instinct accelerators.

It is worth noting that AMD claimed in the announcement that, compared with NVIDIA's H100 HGX, the AMD Instinct Platform can run inference on large language models such as BLOOM 176B up to 1.6 times faster, and that the MI300X is the only accelerator on the market that can run inference on a 70B-parameter model (such as Llama 2) on a single chip, which can simplify enterprise deployment of large language models.

The AMD Instinct MI300A is the world's first data center APU designed for high-performance computing and AI. It combines high-performance AMD CDNA 3 GPU cores, the latest AMD "Zen 4" x86 CPU cores, and 128GB of next-generation HBM3 (High Bandwidth Memory), delivering a 1.9x performance-per-watt improvement over the previous-generation AMD Instinct MI250X on FP32 high-performance computing and AI workloads. Compared with Nvidia's Grace Hopper superchip (which pairs a Hopper GPU with a Grace CPU), AMD says its performance per watt may be up to 2 times higher.

More importantly, AMD announced the launch of the latest AMD ROCm 6 open software platform and pledged to contribute its most advanced software libraries to the open-source community, advancing its vision of open-source AI software development. ROCm 6 greatly improves AI acceleration performance and adds support for several key new generative AI features, including FlashAttention, HIPGraph and vLLM.

As for the Ryzen 8040 series processors equipped with Ryzen AI, they are expected to appear in devices from manufacturers such as Acer, Asus, Dell, HP, Lenovo and Razer in the first quarter of 2024.

In addition, AMD said it is investing in software capability through its acquisitions of Nod.ai and Mipsology and through broader strategic ecosystem partnerships.

Can AMD replace Nvidia?

According to CNBC, Meta $Meta Platforms, Inc.(META)$, OpenAI and Microsoft have all said they will use AMD's latest AI chip, the Instinct MI300X, which may mean these technology giants deploying AI are looking for alternatives to Nvidia's AI chips, which are in short supply and expensive.

Lisa Su, CEO of AMD, predicts that the AI chip market will be worth more than US$400 billion by 2027, and believes AMD can capture a large share of it. AMD did not disclose pricing for the MI300X, but Nvidia's comparable chips currently cost about US$40,000 each, and Su indicated that AMD's chips will be priced lower than their Nvidia counterparts.

More importantly, AMD said it has improved ROCm 6, the software suite used to optimize its AI software stack, to compete with Nvidia's industry-standard CUDA software; CUDA's ecosystem is arguably the main reason AI developers currently prefer Nvidia.

Nvidia’s moat

To understand the competition between AMD's and Nvidia's AI chips, it helps to first explain why GPUs play a core role in AI development. That story starts with parallel computing.

Parallel computing is an approach to computation that executes many instructions at once: a computing task is decomposed into many subtasks that run simultaneously on multiple processors to speed up the calculation. The goal is both to compute faster and to solve larger, more complex problems by scaling out.
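The decompose-and-combine idea can be sketched in a few lines of Python (a toy illustration only: CPython threads don't truly run Python code in parallel, and real hardware parallelism needs multiple processes or a GPU, but the structure of the decomposition is the same):

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Decompose one big task (summing `data`) into subtasks,
    run them concurrently, then combine the partial results."""
    chunk = (len(data) + workers - 1) // workers
    subtasks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, subtasks)  # each worker sums one chunk
    return sum(partials)                    # combine the subtask results

print(parallel_sum(list(range(1_000_000))))  # 499999500000, same as sum(range(1_000_000))
```

Each subtask is independent of the others, which is exactly the property that lets many processors work on the problem at the same time.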

In the 1980s and 1990s, the first generation of parallel computers appeared, such as supercomputers and multiprocessor systems. These systems usually required multiple CPUs (central processing units), or a CPU plus other specialized chips, to achieve high-performance scientific computing. However, they were expensive, inefficient to operate, and complex to program.

With the development of computer graphics and the growing demand for graphics processing power in games, the GPU (graphics processing unit) emerged and evolved. ATI, founded in 1985, was an early developer of graphics chips and graphics cards.

At first, the GPU acted as a co-processor to the CPU: the CPU handled logic tasks while the GPU handled graphics rendering. The GPU (or graphics card) of that era contained only simple memory and a frame buffer, could only store and transfer graphics data, and every operation had to be controlled by the CPU.

As electronics advanced, graphics cards grew steadily more sophisticated and powerful. Nvidia first proposed the concept of the GPU when it released the GeForce 256 graphics chip in 1999. The GPU arrived at a pivotal moment, allowing graphics cards to reduce their dependence on the CPU and take over some of the work that previously belonged to it.

As GPU architectures improved and programming models evolved, GPUs expanded from graphics rendering into other areas involving large amounts of data-parallel computation, such as data mining and AI. With their powerful parallel computing capability, GPUs were no longer limited to graphics acceleration and came to be used for general-purpose computing. Unlike the CPU, whose strength is serial processing (it suits tasks where each step depends strictly on the previous one), a GPU can run hundreds of threads at the same time and complete a large volume of computation in a short time.

Artificial intelligence (AI), as the name suggests, is technology that imitates human intelligence and thought processes: it extracts insights from massive amounts of data, through techniques such as deep learning, to produce machines that respond in ways resembling human intelligence. AI development therefore involves a great deal of data processing and model training. Deep learning in particular requires matrix operations over large amounts of data, and these are uniform, independent operations that can be performed in parallel, a demand GPUs meet perfectly. This is why the GPU is known as the computing engine at the core of AI.
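To make "matrix operations that can be performed in parallel" concrete, here is a minimal NumPy sketch of a dense neural-network layer (the shapes are arbitrary examples, not from any specific model): every element of the output is an independent dot product, so in principle all of them can be computed at the same time, which is exactly the data-parallel pattern GPUs are built for.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 768))   # a batch of 32 input vectors
W = rng.standard_normal((768, 256))  # layer weights
b = rng.standard_normal(256)         # layer bias

# Forward pass of a dense layer: y = x @ W + b.
# This is 32 * 256 = 8192 independent dot products, one per output
# element -- ideal work for thousands of GPU threads.
y = x @ W + b
print(y.shape)  # (32, 256)
```

On a CPU, NumPy computes these dot products largely one after another (with some vectorization); a GPU assigns them to thousands of threads at once, which is where the training and inference speedups come from.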

Nvidia and AMD, which acquired graphics card pioneer ATI for US$5.4 billion in 2006, are currently the major GPU manufacturers. In 2006, NVIDIA launched CUDA, a general-purpose parallel computing architecture: in short, an ecosystem tightly bound to NVIDIA's own GPUs. This is a key reason NVIDIA's AI chips are so popular, since many engineers have used CUDA for years. To break through the moat around NVIDIA's AI chips, one must therefore first break through CUDA's ecosystem barrier, an obstacle AMD acknowledged facing during its launch event.

For this reason, AMD launched ROCm, its own ecosystem, to compete with NVIDIA's CUDA. Note, however, that hundreds of millions of CUDA-capable GPUs have been sold and thousands of developers are accustomed to solving problems with CUDA, so it may take some time for AMD to cultivate its own ecosystem.

Summary

AMD previously said that mass production of the Instinct MI300A and MI300X is progressing smoothly in the fourth quarter, and noted on its third-quarter earnings call that its AI progress is better than expected: data center GPU revenue is expected to be approximately US$400 million in the fourth quarter and to exceed US$2 billion in 2024, which would make the MI300 the fastest product in AMD's history to ramp to US$1 billion in sales.

Evidently the market already has high expectations for AMD's AI chips, but judging from AMD's guidance, AI's strong momentum may not yet show up in the fourth fiscal quarter and will not be fully reflected in results until fiscal year 2024.

Not so for Nvidia, whose revenue and non-GAAP net profit have climbed at an unprecedented pace; the strong growth driven by the shortage of its AI chips has already shown up in this year's results. In the third quarter of NVIDIA's fiscal year 2024, which ended in late October 2023, revenue rose 205.51% year-on-year and 34.15% quarter-on-quarter to US$18.12 billion; non-GAAP net profit rose 588.19% year-on-year to US$10.02 billion; and the company expects fourth-quarter revenue of US$20 billion, with continued strong demand for compute and networking driving growth in its data center business.

NVIDIA's current supply shortage and high prices may push users toward AMD. In the short term, however, replacing NVIDIA will take time, mainly because NVIDIA took the lead in AI chips, has accumulated a large order backlog, and is protected by the competitive advantages of its platform and ecosystem. It is not impossible for AMD to break down these barriers, but it may be difficult to achieve in the near term.

@TigerStars @Daily_Discussion @MillionaireTiger @Tiger_chat 

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation on acquiring or disposing of any financial products, any associated discussions, comments, or posts by author or other users should not be considered as such either. It is solely for general information purpose only, which does not consider your own investment objectives, financial situations or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information, investors should do their own research and may seek professional advice before investing.

Comments

  • OYoung
    2023-12-12
    The competition between AMD and NVDA is getting fiercer

  • ZoePaul
    2023-12-12
    The emergence of this new product gave me a big surprise

  • Thatway
    2023-12-12
    Will AMD stock price rise due to Instinct MI300X?

  • coolguy001
    2023-12-12
    It seems that NVDA will be under considerable threat

  • ZonaMatthew
    2023-12-12
    The product power of Instinct MI300X is really outstanding

  • Brando741319
    2023-12-20
    Good