Apple Strikes Again

Travis Hoium
06-15

A lot of artificial intelligence startups died this week.

$Apple(AAPL)$ introduced Apple Intelligence, which includes image and text generation along with the ability to search and understand a lot of content housed on your personal device.

Image generation and text prompts aren’t new, but Apple is doing most of the AI computing on-device, negating the need for the cloud computing that powers most AI tools today.

When combined with the generalizability of AI that I talked about a few weeks ago, I have to wonder whether Apple will suck the oxygen out of the room for the rest of the AI ecosystem.

Artificial Intelligence & On-Device Compute

The common narrative in AI has been that $NVIDIA Corp(NVDA)$ is the clear winner and will grow as quickly as it can make chips. Cloud infrastructure from $Alphabet(GOOG)$ $Alphabet(GOOGL)$ $Meta Platforms, Inc.(META)$ $Amazon.com(AMZN)$ $Microsoft(MSFT)$ $Snowflake(SNOW)$, and others is the next “obvious” play in AI.

I think Apple proved that wrong this week by moving the most impressive new AI features on-device.

This is from an interview Ben Thompson did with Daniel Gross and Nat Friedman:

I don’t fully understand and I never fully have understood why local models can’t get really, really good, and I think that the reason often people don’t like hearing that is there’s not enough epistemic humility around how simple most of what we do is, from a caloric energy perspective, and why you couldn’t have a local model that does a lot of that.

A human, I think, at rest is consuming like 100 watts maybe and an iPhone is using, I don’t know, 10 watts, but your MacBook is probably using 80 watts. Anyway, it’s within achievable confines to create something that has whatever the human level ability is, it’s synthesizing information on a local model.

What I don’t really know how to think about is what that means for the broader AI market, because at least as of now we obviously don’t fully believe that. We’re building all of this complicated data center capacity and we’re doing a lot of things in the cloud which is in cognitive dissonance with this idea that local models can get really good.

The economy is built around the intelligence of the mean, not the median. Most of the labor that is being done consists of fairly simple tasks, and I’ve yet to see any kind of mathematical refutation that local models can’t get really good.

You still may want cloud models for a bunch of other reasons, and there’s still a lot of very high-end, high-complexity work that you’re going to want a cloud model for, chemistry, physics, biology, maybe even doing your tax return, but for basic stuff like knowing how to use your iPhone and summarizing web results, I basically don’t understand why local models can’t get really good.
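The wattage figures in the quote above can be turned into a rough back-of-envelope comparison. This is just a sketch using the quoted estimates (100 W for a resting human, 10 W for an iPhone, 80 W for a MacBook); the numbers are the interviewee's guesses, not measured values:

```python
# Back-of-envelope power comparison using the estimates from the quote.
human_watts = 100   # human at rest (quoted estimate)
iphone_watts = 10   # iPhone under load (quoted estimate)
macbook_watts = 80  # MacBook under load (quoted estimate)

for name, watts in [("iPhone", iphone_watts), ("MacBook", macbook_watts)]:
    ratio = watts / human_watts
    print(f"{name} runs at roughly {ratio:.0%} of a resting human's power budget")
```

The point of the argument is simply that personal devices already operate within the same order-of-magnitude energy envelope as a human brain-and-body, which is why the "local models can't get really good" position needs a stronger refutation than it has received.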

If 90% of AI goes local, what are all the data center GPUs for?

I’m not suggesting data center GPUs are unnecessary, but I do wonder whether the growth rate of data center investment will slow as compute moves on-device.

Apple Generalizes…Everything?

When Apple introduces a new product or capability, it doesn’t typically start with the most advanced features. It starts with relatively basic use cases people will actually adopt and builds up to eat more and more of the ecosystem.

The original iPhone didn’t have an app store.

Then there was an app for that.

Then there were Apple Apps for that.

I dove into why I think the generalizability of AI will lead to a lot of value destruction here. As far as Apple goes, I think we are seeing the early phases of Apple eating more of the AI ecosystem and Apple may have even more power in AI than apps because of how generalizable a model can be.

Who needs Grammarly when spelling and grammar checking is on-device for free?

The same can be said for $Duolingo, Inc.(DUOL)$ .

Even $Adobe(ADBE)$ Photoshop and Canva should be nervous.

If an on-device, generalizable model can do 90% of AI tasks, what are we building all of this cloud computing and specialized capability for?

More Questions Than Answers

AI is evolving quickly and we know very little about what the market will look like long-term.

What I think we know is:

  1. AI models themselves are becoming a commodity/open-source

  2. AI computing will move on-device

  3. Applications and models are becoming more generalizable

As an investor, I think that leads to more value destruction than value creation…at least over the next few years.

Apple has a major point of leverage as the device hundreds of millions of people interact with every day.

Google has points of leverage with multiple products that have billions of users and the other mobile ecosystem (Android).

Those two companies seem to be winners in AI, whether there’s incremental revenue from AI or not.

Everyone else may be on shaky ground as AI is commoditized, moves on-device, and becomes generalizable.

https://asymmetric-investing.beehiiv.com/p/apple-strikes-in-ai

