Robotaxi Gains Major Momentum! Mercedes-Benz Partners with NVIDIA and Uber to Deploy Global Robotaxi Fleet

Stock News 01-30 10:30

On Thursday Eastern Time, the lineup of global Robotaxi (fully autonomous, driverless taxi) players expanded significantly with the announcement of a three-way collaboration between European luxury-car leader Mercedes-Benz, NVIDIA (NVDA.US), and Uber Technologies (UBER.US). The three companies plan to build a global Robotaxi platform that combines Mercedes-Benz's new S-Class vehicles, NVIDIA's AI-powered autonomous driving hardware and software stack, and Uber's vast ride-hailing network to offer driverless mobility services in major markets worldwide. The Robotaxi service will run on MB.OS, the operating system developed by Mercedes-Benz, which lets NVIDIA and other autonomous driving technology partners layer high-level autonomous driving applications on top, while the German automaker retains control over the core vehicle software and integration systems.

For its part, NVIDIA will provide the DRIVE Hyperion architecture for high-performance in-vehicle autonomous driving compute, the Alpamayo AI autonomous driving model, and dedicated simulation and safety tools to enable a Level 4-capable autonomous driving system. This builds on earlier announcements that NVIDIA's autonomous-vehicle stack would reach the road with Mercedes and be folded into Uber's plans to deploy a large autonomous vehicle fleet in the latter part of this decade. The role of US ride-hailing giant Uber is to integrate the three-way Robotaxi system directly into its existing ride-hailing platform, allowing human-driven vehicles and autonomous S-Class Robotaxis to operate in parallel. Although the companies did not provide a specific launch timeline, the collaboration is described as a significant step toward large-scale commercial deployment of Robotaxis in multiple global cities.

"Mercedes-Benz, NVIDIA, and Uber will jointly create a global platform that makes autonomous driving accessible to all users," said NVIDIA CEO Jensen Huang in a video played at a Mercedes event. NVIDIA's "AI + Autonomous Driving" Ambition According to insights from NVIDIA CEO Jensen Huang, "Physical AI" emphasizes enabling robots/autonomous systems to perceive, reason, and execute a complete set of actions in the real world. In Huang's view, an era where "Physical AI" aids the evolution of human civilization is imminent. Applications of "Physical AI," including fully autonomous driving technology, focus on allowing robots/autonomous systems to perceive, reason, and act in the real world, and these three capabilities are the key toolchain for advancing models from "just conversing" to "being able to work in the physical world."

During CES 2026, NVIDIA announced Alpamayo, an open-source family of large AI models for autonomous driving. Huang called Alpamayo the "ChatGPT moment for Physical AI." The flagship model, Alpamayo 1, is a Vision-Language-Action (VLA) model with 10 billion parameters. Unlike traditional autonomous driving systems that merely detect objects and plan paths, Alpamayo takes a different approach built around a "chain-of-thought AI reasoning system": it processes video input and generates driving trajectories, but, more crucially, it also outputs the logic behind its decisions. As Huang put it: "This is not just a driving model, but a model that can explain its own thought process."

At the exhibition, Huang played a demonstration video in which Alpamayo not only drove the car but also explained its decision-making logic in natural language, for example: "The brake lights of the vehicle ahead are on, it might be slowing down, so I should maintain distance." Beyond the Alpamayo family of open-source reasoning VLA models, NVIDIA also recently launched the AlpaSim simulation tool and Physical AI Open Datasets; in the official framing, these combine with Alpamayo to form a complete "model + simulation + data" development loop that improves the safety and verifiability of fully autonomous driving technology. For NVIDIA, the open-source strategy is about ecosystem entry, while monetization is concentrated in computing power and enterprise-level delivery.
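To make the "trajectory plus explanation" idea more concrete, below is a minimal, purely illustrative Python sketch of what a VLA-style driving interface could look like: camera frames go in, and a planned trajectory plus a natural-language rationale come out. All names here (Waypoint, DrivingDecision, ToyVLADriver) are hypothetical and do not reflect NVIDIA's actual Alpamayo API or code.

```python
import numpy as np
from dataclasses import dataclass
from typing import List


@dataclass
class Waypoint:
    """One point on a planned driving path (hypothetical structure)."""
    x: float      # meters ahead of the ego vehicle
    y: float      # meters to the left (+) or right (-) of the ego vehicle
    speed: float  # target speed in m/s when passing this point


@dataclass
class DrivingDecision:
    """What a 'trajectory + explanation' model would return."""
    trajectory: List[Waypoint]  # planned path for the next few seconds
    rationale: str              # natural-language reasoning behind the plan


class ToyVLADriver:
    """Illustrative stand-in for a Vision-Language-Action driving model.

    This is NOT NVIDIA's Alpamayo; it only mimics the contract described
    in the article: video frames in, a trajectory plus a rationale out.
    """

    def decide(self, video_frames: List[np.ndarray]) -> DrivingDecision:
        # A real VLA model would encode the frames, reason over the scene,
        # and decode both a trajectory and its justification. Here we return
        # a fixed, hand-written example purely to show the output shape.
        trajectory = [Waypoint(x=5.0 * i, y=0.0, speed=8.0) for i in range(1, 6)]
        rationale = ("The brake lights of the vehicle ahead are on, it may be "
                     "slowing down, so I maintain a safe following distance.")
        return DrivingDecision(trajectory=trajectory, rationale=rationale)


if __name__ == "__main__":
    # Feed in dummy "camera frames" and print the decision's explanation.
    frames = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(8)]
    decision = ToyVLADriver().decide(frames)
    print(decision.rationale)
```

In a real system the rationale would be produced by the model's own chain-of-thought decoding rather than hard-coded; that explainability is precisely the property the article highlights.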

Opening up its models and toolchains significantly lowers the development barrier, encouraging more automakers and robotics companies to align their R&D and mass-production roadmaps with NVIDIA's computing platforms, which in turn drives demand for its data-center AI computing platforms (full racks), in-vehicle edge computing platforms (DRIVE), and robotics edge computing platforms (Jetson). At CES, NVIDIA also heavily promoted its "Rubin Six-in-One" rack-level AI computing infrastructure, emphasizing a substantial reduction in token costs; in essence, it is using platform-scale compute supply to absorb the massive new AI training and inference workloads created by the Physical AI wave.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is intended for general information purposes only and does not take into account your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
