At The Android Show event today, Google revealed key details about Project Aura and the Android XR system, positioning Aura as its most comprehensive hardware prototype yet for realizing the ideal form of Android XR. As Google's officially endorsed system-level reference hardware, Project Aura marks the first time Gemini AI gains the ability to truly "see the world."
Google emphasized that the core objective of Android XR is to create an open, unified extended reality platform, enabling AI to move beyond flat screens and into real-world light, space, and interactions.
XREAL's expertise in optics, chips, and spatial algorithms has made Aura the "AI + XR" hardware path Google most strongly endorses, earning it the moniker "Gemini AI's first native spatial eyes." Aura pairs these XREAL technologies with Google's Android XR and Gemini AI, allowing AI to complete the full loop of "seeing → understanding → interacting."
Project Aura's three core capabilities form the infrastructure for this generation of AI devices: a 70-degree optical see-through field of view, the X1S spatial computing chip, and deep integration with Gemini AI. In addition, Android XR builds on the mobile ecosystem to provide native support for spatial computing, addressing the XR industry's long-standing fragmentation. This unification lets developers and content ecosystems operate on a common platform while giving Gemini AI a standardized spatial entry point.
According to official disclosures, Project Aura is set to launch commercially in 2026.