$Advanced Micro Devices (AMD)$ The market is processing $AMD's earnings wrong:

1. The market is looking at the AI story as if it were only about selling GPUs. But $AMD's business segments are effectively distribution channels through which it can repackage and sell its core AI tech: $AMD is going to add AI functionality to all of its products.

Over the long run, this is a much better strategy than only going head to head with $NVDA on GPUs (which $AMD is doing as well), simply because it makes distribution easier and cheaper.

Datacenter growth is currently being offset by cyclical weakness in the gaming and embedded markets, but these two segments will give $AMD an advantage over time.

"[...] we see clear opportunities to drive our next wave of growth as we deliver leadership AI solutions across our portfolio."

"I think the key is not just about the MI300 conversation. But it is really about sort of our long-term multi-generational roadmap."

- $AMD CEO Lisa Su, during the Q4 2023 earnings call.

2. The MI300 is gaining considerable traction, with $AMD raising FY2024 sales guidance for the compute engine from $2B to $3.5B. Firstly, Lisa always sandbags guidance, and secondly, this is only the start of the hardware and software iteration curve that will see $AMD gradually take GPU/AI share from $NVDA.
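(For context, that raise from $2B to $3.5B works out to a 75% increase in guided FY2024 MI300 revenue: (3.5 - 2) / 2 = 0.75.)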

Customers need time to assess the product's performance, and it takes time to ramp supply capacity.

"In Cloud, we are working closely with $MSFT, $ORCL and other large cloud customers on Instinct GPU deployments, powering both their internal AI workloads and external offerings."

- $AMD CEO Lisa Su, during the Q4 2023 earnings call.

3. Adding to point #1, $AMD's diversified business not only yields well-moated distribution channels but also enables a highly differentiated product roadmap. Computing is not about selling CPUs and GPUs, but about moving electrons around cost-effectively.

It so happens that, going forward, the best way of doing that is by mixing and matching different compute engines. $AMD's chiplet expertise will enable it to combine GPUs, CPUs, DPUs and FPGAs into highly differentiated products with standout performance per dollar.

"[...] even in the case of process parity [with $INTC], we feel very good about our architectural roadmap and all the other things that we add, as we look at our entire portfolio of CPUs, GPUs, DPUs, adaptive SoCs and kind of put them together to solve problems."

- $AMD CEO Lisa Su, during the Q4 2023 earnings call.

