Two tech giants that entered the AI arms race as challengers are simultaneously encountering internal difficulties. On Thursday, March 13, Elon Musk posted on X acknowledging that "xAI was initially built incorrectly and is being completely rebuilt from the ground up." According to the Financial Times, the company is also undergoing a new round of large-scale personnel restructuring, with few members of the original founding team remaining.
Meanwhile, the New York Times reported that Meta has quietly postponed the launch of its next-generation core AI model until at least May. The delay is attributed to the model underperforming Google's Gemini 3.0 on key benchmarks such as reasoning, programming, and writing. Both companies are lagging in the crucial commercial arena of AI programming tools, a sector widely viewed as the core revenue source for AI labs.
Meta's capital expenditure guidance for this year is a massive $115 billion to $135 billion, nearly double that of the previous year, raising questions about the viability of its core strategic bets. xAI, meanwhile, faces pressure from its merger into SpaceX and the prospect of a public listing. A struggling AI division is not the narrative Musk wants to present to investors.
**Meta's Core Model Delayed, Internal Testing Falls Short of Gemini 3.0**
Meta's newly formed "TBD Lab" is responsible for developing a series of AI products codenamed after fruits: "Avocado" is the core foundational model, "Mango" focuses on image and video generation, and a larger-scale "Watermelon" project is in the planning stages. According to the New York Times, the "Avocado" model, originally scheduled for release this month, has been delayed until at least May due to unsatisfactory internal test results.
Testing revealed that while the model outperformed Meta's previous offerings and Google's Gemini 2.5 released in March, it fell short when compared to Google's Gemini 3.0 from last November. The report, citing informed sources, indicated that Meta is even internally discussing the temporary licensing of models from competitors like Google to power its AI products and maintain competitiveness, though no final decision has been made. If Meta opts to integrate Google's Gemini, it would create a contrast with CEO Mark Zuckerberg's previous public emphasis on self-reliance and competing for AI supremacy.
Meta's substantial capital expenditure guidance is predominantly allocated towards AI data centers, computing clusters, and infrastructure. The company has also reportedly made long-term investment commitments approaching $600 billion for the US market, invested $14.3 billion in Scale AI, and appointed its CEO, Alexandr Wang, as Meta's Chief AI Officer. Zuckerberg has publicly stated that these investments will propel Meta to "push the frontiers" and advance toward superintelligence.
**xAI Undergoes "Complete Rebuild" as Co-founders Depart**
Of xAI's original 11 co-founders, only Manuel Kroiss and Ross Nordeen remain. This week, co-founders Zihang Dai and Guodong Zhang departed in quick succession. Both are Chinese nationals; Dai had publicly acknowledged xAI's lag in programming capabilities, while Zhang, who led pre-training for the Grok model, was seen as bearing primary responsibility for that shortcoming. Earlier departures included co-founders Greg Yang, Tony Wu, and Jimmy Ba, and behind the personnel turbulence lies a systemic overhaul.
Reports indicate that executives from SpaceX and Tesla have been deployed to xAI as "turnaround specialists," auditing employee work with a focus on data quality issues in model training and dismissing those deemed underperforming. Concurrently, Musk is proactively broadening recruitment channels, publicly apologizing on X to previously rejected candidates and stating intentions to re-evaluate past rejections and contact promising applicants. xAI currently employs over 5,000 staff, fewer than OpenAI's over 7,500 but slightly more than Anthropic's approximately 4,700.
**Programming Tools as Key Battleground, Commercial Pressures Drive xAI Restructure**
Within the AI industry, programming tools are widely considered the most reliable path to commercialization at present. The programming capabilities of xAI's Grok trail significantly behind Anthropic's Claude Code and OpenAI's Codex. Musk acknowledged this gap at an all-hands meeting this week and set "catching up to competitors" as a mid-year goal. To bolster its "Grok Code Fast" product, xAI recently hired Andrew Milich and Jason Ginsberg from the popular AI programming application Cursor.
Musk is also betting on a longer-term vision. xAI's "Macrohard" project aims to develop AI agents capable of fully replacing white-collar work. However, according to Business Insider, the project is currently paused, and its initial lead, Toby Pohlen, departed after just 16 days. Musk revealed this week that Macrohard will be jointly advanced with Tesla's "Digital Optimus" project, using xAI's language models to power Tesla's AI agents. He has assigned Tesla's Ashok Elluswamy to lead the project's reconstruction.
On the infrastructure front, xAI's supercomputing cluster in Memphis has deployed over 200,000 GPUs, with plans to expand to 1 million, and data from the X platform gives its model training unique advantages in scale and real-time freshness. External pressures are nonetheless significant: with xAI's $12.5 billion merger into SpaceX and a potential SpaceX IPO window opening as early as June, the cash-burning AI division urgently needs to demonstrate real user growth and commercial progress for Grok to outside investors.