Amazon Web Services and Cerebras Systems said they are collaborating to offer a disaggregated AI inference solution through Amazon Bedrock. The system will combine AWS Trainium-powered servers with Cerebras CS-3 systems, connected over Elastic Fabric Adapter networking in AWS data centers. The companies said the approach separates inference into two phases, running prefill on Trainium and decode on CS-3. AWS also said it will offer open-source LLMs and Amazon Nova on Cerebras hardware later this year.
Disclaimer: This news brief was created by Public Technologies (PUBT) using generative artificial intelligence. While PUBT strives to provide accurate and timely information, this AI-generated content is for informational purposes only and should not be interpreted as financial, investment, or legal advice. Amazon.com Inc. published the original content used to generate this news brief via Business Wire (Ref. ID: 202603131106BIZWIRE_USPR_____20260313_BW406341) on March 13, 2026, and is solely responsible for the information contained therein.