NVDA, AMD... Beware! Chip Maker Shake-Up Coming?

New Kid On The Block

A seismic shift is occurring in the artificial intelligence (AI) hardware market, driven by a new contender: Cerebras Systems.

Recently, the California-based startup announced the launch of Cerebras Inference — a groundbreaking solution claimed to be 20x faster than $NVIDIA Corp(NVDA)$ GPU-based setups at AI inference.
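(For curious readers: Cerebras Inference is offered as a hosted service. Below is a minimal, purely illustrative Python sketch of what calling such a service could look like, using the standard OpenAI-compatible client. The base URL, environment variable and model name are my assumptions for demonstration, not details confirmed in the announcement.)

```python
# Illustrative only: querying a hosted inference endpoint with the
# OpenAI-compatible Python client. The base_url, env var and model
# name below are assumptions for demonstration, not confirmed details.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",   # assumed endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],  # hypothetical env var
)

response = client.chat.completions.create(
    model="llama3.1-8b",  # assumed model identifier
    messages=[{"role": "user", "content": "Summarise today's chip news."}],
)
print(response.choices[0].message.content)
```

The point is simply that, from a developer's seat, a faster inference backend slots in behind the same kind of API call.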

Cerebras has developed what it calls the “Wafer Scale Engine”, now in its 3rd generation (WSE-3), which powers the new Cerebras Inference service.

This massive chip integrates 44GB of SRAM and removes the need for external memory (a significant bottleneck in traditional GPU setups).

By resolving the memory bandwidth issue, Cerebras Inference has set new industry standards for speed, delivering:

  • A whopping 1,800 tokens per second for Llama 3.1 8B.

  • And 450 tokens per second for Llama 3.1 70B (a quick sketch below puts these rates in perspective).
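(To put those rates in perspective, here is a quick back-of-envelope sketch. The 1,800 and 450 tokens-per-second figures are Cerebras' own claims quoted above; the GPU baseline is a hypothetical round number derived only from the "20x" claim, and the 500-token response length is arbitrary.)

```python
# Back-of-envelope: how long a 500-token answer takes at various
# claimed throughputs. Baseline figures are illustrative assumptions.
RESPONSE_TOKENS = 500  # arbitrary example response length

claimed_tokens_per_sec = {
    "Cerebras Inference, Llama 3.1 8B": 1800,   # Cerebras' claim
    "Cerebras Inference, Llama 3.1 70B": 450,   # Cerebras' claim
    "Hypothetical GPU baseline (20x slower)": 1800 / 20,  # assumed
}

for system, tps in claimed_tokens_per_sec.items():
    seconds = RESPONSE_TOKENS / tps
    print(f"{system}: ~{seconds:.1f} s to generate {RESPONSE_TOKENS} tokens")
```

Under those assumptions, a 500-token answer comes back in well under a second at the Cerebras figures, versus several seconds at the hypothetical baseline.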

A comparison between Cerebras and current chip leaders like Nvidia, $Advanced Micro Devices(AMD)$ and $Intel(INTC)$ is becoming increasingly relevant.

Although Nvidia has long dominated the AI and deep learning sectors with its robust GPU solutions, Cerebras' entry with a distinct & potentially superior technology could disrupt market dynamics.

AMD and Intel, significant players in the chip industry, may also feel the pressure as Cerebras chips begin to carve out a niche in high-performance AI tasks.

Nvidia & Cerebras - Comparison.

Comparing Cerebras chips to Nvidia's GPUs involves looking at several key dimensions of hardware performance, namely:

  • Architectural design.

  • Performance.

  • Application suitability.

(1) Architectural Design

Cerebras:

  • Its breakthrough chip, the Wafer Scale Engine, is built on a single, massive wafer.

  • The latest version has approximately 4 trillion transistors.

  • It also integrates 44GB of SRAM directly on-chip.

What is SRAM? It is a type of memory which is (a) faster than dynamic memory (DRAM), (b) retains data as long as power is supplied, and (c) requires less power.

  • This design removes the need for external memory, doing away with the memory bandwidth bottleneck that hampers traditional chip architectures (see the back-of-envelope sketch after this list).

  • Cerebras focuses on creating the largest and most powerful chip possible, one that can store & process enormous AI models directly on the wafer, enabling faster AI processing.
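(To see why removing external memory matters so much, here is a rough, hedged back-of-envelope sketch. For small-batch inference, generating each token means streaming roughly all of the model's weights through memory, so throughput is capped near bandwidth divided by model size. Every number in the sketch is an illustrative assumption, not a vendor specification.)

```python
# Rough intuition for the memory-bandwidth bottleneck: at small batch
# sizes, each generated token requires reading roughly all model weights,
# so tokens/sec is capped near bandwidth / model size.
# All figures below are illustrative assumptions, not vendor specs.

MODEL_BYTES = 70e9 * 2  # ~70B parameters at 2 bytes each (fp16), assumed

bandwidth_bytes_per_sec = {
    "GPU with external memory (assumed ~8 TB/s)": 8e12,
    "On-wafer SRAM (assumed ~1000x higher)": 8e15,
}

for design, bw in bandwidth_bytes_per_sec.items():
    ceiling = bw / MODEL_BYTES
    print(f"{design}: memory-bound ceiling ~ {ceiling:,.0f} tokens/sec")
```

The exact figures do not matter; the point is that when the weights live in on-chip SRAM, the bandwidth ceiling moves up by orders of magnitude.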

Nvidia:

  • Its architecture is based on a multi-die approach, where several GPU dies are connected via high-speed interconnects (eg. NVLink).

  • This setup is found in Nvidia’s latest product - the DGX B200 server - allowing customers to buy an entry-level system and gradually increase its processing capacity by adding more processors.

  • This flexible approach involves complex orchestration between multiple chips and memory pools (illustrated in the sketch after this list).

  • Nvidia's chips (eg. the B200) are packed with billions of transistors and optimized for both (a) AI training and (b) inference tasks, leveraging a proprietary GPU architecture that has been refined over the years.
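(To make the “complex orchestration” point concrete, here is a minimal, hypothetical PyTorch sketch of splitting a toy model across two GPUs. Every stage boundary forces an explicit transfer of activations between devices, which is exactly the coordination, and latency, that a single wafer-scale chip avoids. It assumes a machine with at least two CUDA GPUs and is not Nvidia's actual software stack.)

```python
# Minimal sketch of why multi-GPU setups need orchestration: when a model
# is split across devices, activations must be shipped between GPUs at
# every stage boundary (NVLink/PCIe). Requires >= 2 CUDA devices to run.
import torch
import torch.nn as nn

dev0, dev1 = torch.device("cuda:0"), torch.device("cuda:1")

# Toy two-stage "model": first half on GPU 0, second half on GPU 1.
stage0 = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()).to(dev0)
stage1 = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()).to(dev1)

x = torch.randn(8, 1024, device=dev0)
hidden = stage0(x)        # compute on GPU 0
hidden = hidden.to(dev1)  # explicit inter-GPU transfer (the "orchestration")
out = stage1(hidden)      # compute on GPU 1
print(out.shape)
```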

(2) Performance

Cerebras:

  • Its chip's performance is groundbreaking in specific scenarios, particularly “AI inference” processing.

  • The chip can reportedly run inference at speeds 20x faster than Nvidia's solutions.

  • This is a result of the chip’s direct integration of memory and processing power, which allows for faster data retrieval and processing without inter-chip data transfer delays.

Nvidia:

  • Nvidia’s chips trail Cerebras when it comes to raw AI inference processing speed.

  • However, its GPUs are (a) extremely versatile and (b) considered industry-standard in various applications ranging from gaming to complex AI training tasks.

  • Nvidia's chips handle many different AI tasks well because they are backed by a mature software ecosystem (eg. CUDA) and integrate easily with existing tools and infrastructure.

(3) Application Suitability

Cerebras:

  • Cerebras chips are best suited to businesses that need to run very large AI models with minimal latency.

  • They are ideal for organizations whose priority is raw inference speed and high-volume data processing.

Nvidia:

  • Nvidia's GPUs are more versatile and can be deployed for many different tasks, eg. rendering graphics in video games, training complex AI models and running simulations.

  • This flexibility makes their chips the go-to choice in many industries, not just those focused on AI.

Conclusion

  • Cerebras’s chip offers superior performance in specific high-end AI tasks.

  • Nvidia’s chip provides versatility and a strong ecosystem.

  • The choice between Cerebras and Nvidia would depend on (1) specific use cases and (2) requirements.

  • For organizations dealing with extremely large AI models where “inference speed” is critical, Cerebras could be the better choice.

  • Nvidia, meanwhile, remains a strong contender overall across a wide range of applications, providing flexibility and reliability with a comprehensive software support ecosystem.

My viewpoints (mine only):

  • The thing about technology stocks is that they can never rest on their laurels; especially those in pole position.

  • There will always be a new company (startup or veteran) fighting to ascend the coveted pole position.

  • The price of the Cerebras chip was not mentioned. I think that is because it is not yet commercially available.

  • It is only a question of when Cerebras chips will be available commercially.

  • When that day arrives, will all hell break loose for existing chip makers; especially if the Cerebras chip is priced “cheaper” than Nvidia's, AMD's and Intel's?

  • More options and more competition mean less of a monopoly, and should lead to cheaper AI chips.

  • The AI race seems to be between Nvidia and new kid Cerebras, leaving both AMD & Intel trailing behind with little mention in the post.

  • Again, don’t get me wrong. Nvidia’s influence and standing as the lead AI chip maker will not evaporate overnight.

  • Rather, if Cerebras proves to be a “worthy competitor”, it will begin to siphon off sales opportunities from Nvidia over time.

  • Time, price and gung-ho customers willing to take a chance on Cerebras’s AI chip will be key to its success, traction and adoption.

  • Also, who is to say that an existing AI-chip maker will not come up with a breakthrough product of its own to rival Nvidia and chip away at its sales; which circles back to my first point above.

Fabless chip makers will need to worry when Cerebras decides to go public (IPO), for it signals a (likely) shake-up of the chip industry. Agree?

Must Read: Click on the titles below to access. Give a like & help to repost, ok? Thanks.

  • Do you think newcomer Cerebras will be able to survive through the early years?

  • Do you think the concept of “out with the old, in with the new” applies to the semiconductor industry too?

If you find this post interesting, give it wings! Repost and share the insights!

Do consider “Follow me” to get a firsthand read of my daily new posts. Thank you.


