Amazon, Cerebras Partnership Shows AI Shift Toward Inference Over Training -- Market Talk

Dow Jones 03-13 23:38
 

11:38 ET -- Amazon's plan to deploy Cerebras chips reflects AI computing's shift toward inference, the work of responding to user queries, and away from model training. The GPUs widely used for training AI models are less suited to inference workloads, which demand greater speed. Cerebras says its chips can run the phase of inference in which a model generates a response to a query up to 25 times faster than Nvidia's GPUs. Next week, Nvidia plans to unveil a processing system tailored for inference that uses technology from chip startup Groq, with which Nvidia signed a licensing deal in December. (nicholas.miller@wsj.com)

 

(END) Dow Jones Newswires

March 13, 2026 11:38 ET (15:38 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.


