Zuckerberg vs Altman: Not a Fair Fight

Business models matter.



Artificial intelligence has become a battle of titans in tech. Satya Nadella, Sundar Pichai, and Jensen Huang have become three of the most important voices in the industry and can move hundreds of millions of dollars of capital with a single comment.

But it’s Mark Zuckerberg versus Sam Altman that I find the most interesting at the moment because one may be solidifying his position at the top of tech while the other struggles to keep his company relevant.

The AI Stack

Below is a very simple way to look at artificial intelligence companies today. There’s nuance in each layer, but broadly the market can be thought of as models, inference, and apps.

  • Models: GPT from OpenAI made AI models a household concept, but Llama 3 (which I’ll get to in a moment) has changed the game for AI and may show there will be little economic profit to be made in the model-making business. I think we’re seeing this become a commodity business and open-source is winning.

    • I acknowledge that NVIDIA dominates the chip side of model creation and that shows no sign of stopping. The question is, have we moved beyond model creation as a place of investment?

  • Inference: Inference is what happens when you interact with an AI application, like doing a query or asking for an image. Today, inference is primarily done in the cloud, but every device maker is trying to move more inference on-device. Apple, Google, Samsung, Qualcomm, and many more are designing chips that can handle smaller models on-device while larger calls will be pushed to the cloud.

  • Apps: The application is what you interact with as a user. ChatGPT is an app while GPT-4 is the model. Meta AI is an app while Llama 3 is the model. And so on. It’s not clear what AI apps will win, but I think it’s clear there will be some existing companies helped by AI and some new entrants will emerge now that the model and inference infrastructure is maturing.

As an investing research service, I’m trying to build frameworks for how to think about AI so I can look for opportunities in the right places and avoid the pitfalls others have fallen into. These frameworks will evolve, but I think this is a good outline of where we sit today.



AI Models and the Scorched Earth Strategy

There are different kinds of value created in different business models.

In the case of artificial intelligence models, a company like OpenAI needs to generate economic value by selling access to its models (APIs and ChatGPT) to survive. It’s competing against Meta Platforms, which is creating strategic value for its core business by open-sourcing its models.

This is the core of the Zuck vs Altman battle, and Altman may have been on the losing side from the start.

Sunny Madra from Groq Cloud said this about Meta open-sourcing the Llama 3 model on the BG2 Podcast:

Sunny Madra: Zuck came out and threw down for everyone that’s building models. You have a model that’s much smaller, so much easier to run on all different types of hardware, and much faster. Within the first 48 hours it became the most popular model that we run on Groq.

And people are doing a direct replacement with OpenAI across the board. And they don’t really see any performance impact or reasoning impact.

Bill Gurley: And why replace? What is being optimized in the switch?

Sunny Madra: Price performance. Coming from GPT-4, you’re more than 10x cheaper.

Their discussion outlines how Meta’s open-source Llama models are at least on par with OpenAI’s GPT-4 at a fraction of the inference cost.
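To make that concrete, here’s what a “direct replacement” looks like in code. This is a minimal sketch, assuming the OpenAI Python SDK and Groq’s OpenAI-compatible endpoint; the base URL and model strings are my own fill-ins, not details from the podcast, and may change:

```python
# Sketch: swapping GPT-4 for Llama 3 without touching application logic.
# Assumes the OpenAI Python SDK and an OpenAI-compatible endpoint at Groq;
# the base_url and model names below are illustrative assumptions.
from openai import OpenAI

# Before: the app talks to OpenAI.
openai_client = OpenAI(api_key="OPENAI_API_KEY")

# After: point the same client at Groq and name a Llama 3 model.
groq_client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # assumed endpoint
    api_key="GROQ_API_KEY",
)

def ask(client: OpenAI, model: str, prompt: str) -> str:
    """The application layer: identical regardless of which model answers."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Same app code, different model behind it:
# ask(openai_client, "gpt-4", "Summarize the AI stack in one line.")
# ask(groq_client, "llama3-70b-8192", "Summarize the AI stack in one line.")
```

The only things that change are the endpoint and the model string. When switching costs are that low and the price-performance gap is 10x, the model really does behave like a commodity.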

Mark Zuckerberg can afford to give away these models for two reasons:

  1. If the open-source community helps make these models even 10% better for Meta’s internal use cases, it could save the company billions of dollars.

    1. Better yet, if it helps a competitor build a chip with a lower total cost of ownership than NVIDIA’s, it could save Meta tens of billions.

  2. Zuckerberg wants to make sure the model itself isn’t the point of value, the application is. Meta has arguably the best AI applications with advertising on its family of apps and potential use cases for businesses and consumers with Meta AI chatbots.

Zuckerberg is scorching the earth and Altman is the one getting burned.

Inference Today & The Future of Inference

For now, AI model training and inference are almost all performed in the cloud. This is what’s driven the growth of NVIDIA, Google Cloud, Microsoft Azure, and many others.

As I highlighted above, I don’t think model training is going to be a growth business long term because there’s no economic profit in building the model itself. So, inference may provide more ongoing value.

There are three places inference can be done when it comes to AI:

  • Cloud: Large cloud providers running GPUs (NVIDIA) in large clusters. This is where models are trained and where inferences are performed for nearly all AI applications today.

  • Edge: Compute that sits between centralized cloud data centers and the end device, like telecom or regional data centers. Some inference may move to the edge in time, but this is not common today.

  • On-Device: Every hardware maker is working to incorporate at least some inference directly on the device, with smaller models running locally and larger calls pushed to the cloud (see the sketch after this list).
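Here’s that sketch: a minimal illustration of the hybrid routing pattern device makers are describing, where small requests stay on a local model and bigger calls fall back to the cloud. The threshold and both handler functions are placeholders of my own, not any vendor’s actual implementation.

```python
# Sketch: hybrid inference routing between a small on-device model and the cloud.
# Both handlers and the routing heuristic are illustrative placeholders.

def run_on_device(prompt: str) -> str:
    """Stand-in for a small local model (e.g., a quantized Llama 3 8B)."""
    return f"[on-device answer to: {prompt}]"

def run_in_cloud(prompt: str) -> str:
    """Stand-in for a large hosted model behind an API."""
    return f"[cloud answer to: {prompt}]"

def route(prompt: str, needs_long_context: bool = False) -> str:
    # Simple heuristic: short, self-contained prompts stay on-device;
    # long or complex requests are pushed to the cloud.
    if needs_long_context or len(prompt) > 500:
        return run_in_cloud(prompt)
    return run_on_device(prompt)

print(route("What's on my calendar today?"))
print(route("Draft a long report comparing every cloud provider...", needs_long_context=True))
```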

Cloud providers sit in an interesting position because they can be the trainers of models and provide inference. Meta, Microsoft, and Alphabet are all upping their capex budgets to capitalize on this opportunity.

What I think is clear is that some inferences will move on-device. Don’t be surprised if Apple announces exactly that next week when it holds an event. Apple has been rumored to be building AI into devices as early as this year.

Of course, Google is likely to move inference on-device as well and has the chip expertise to pull it off on its own devices. Qualcomm is building on-device AI capability into its Snapdragon chips, too. If you’re making mobile chips, you’re going to be making AI mobile chips soon.

Early reports are that the smallest version of Llama 3 runs on devices today without any specialized hardware. But the hardware (chips) will ultimately be optimized for AI, and that’s when demand will really pull AI on-device.
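For a sense of what “works on devices today” means, the 8B version of Llama 3 can already run on an ordinary laptop CPU with open-source tooling. Here’s a minimal sketch using the llama-cpp-python bindings, assuming you’ve downloaded a quantized GGUF build of the model; the file name is a placeholder:

```python
# Sketch: running the smallest Llama 3 locally on CPU with llama-cpp-python.
# Assumes `pip install llama-cpp-python` and a quantized GGUF file on disk;
# the model path below is a placeholder for whatever build you download.
from llama_cpp import Llama

llm = Llama(
    model_path="./Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,  # modest context window to keep memory use down
)

output = llm(
    "Q: What is inference in AI? A:",
    max_tokens=64,
    stop=["Q:"],  # stop before the model invents the next question
)
print(output["choices"][0]["text"])
```

No GPU, no specialized NPU, just a few gigabytes of RAM. That’s the point: once chips are actually optimized for this workload, the on-device experience only gets better.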

There’s likely to be a mix of on-device, edge, and cloud inference in the future. That doesn’t give anyone a real edge unless they’re running completely different applications than everyone else.

“Good enough” AI on-device may be table stakes for hardware companies long-term.

This is a bifurcated market at its finest, but the likelihood is again the strong getting stronger. Google Cloud, Microsoft Azure, Meta, and Amazon AWS will likely win again in the inference cloud. I don’t see how Apple and Android don’t dominate the on-device inference market.

This is why I own Alphabet, but why I’m skeptical that hot stocks like NVIDIA and AMD have durable advantages as more players enter the market and huge cloud customers look to lower their costs, pressuring chip suppliers.

AI Applications and AI Business Models

This is where AI will be fun.

If models are a commodity and inference is becoming bifurcated and commodity-like, whatever value there is to be had in AI will accrue to AI applications.

When Meta open-sourced Llama 3, it wasn’t out of the goodness of its heart. It was to protect its applications and business model from disruption.

Meta wants to commoditize its complements. It just so happens that OpenAI is its biggest complement.

But what applications will win? We can make some educated guesses:

  • Advertising: AI will make ads more effective through better targeting and custom AI content.

  • Information retrieval: Search, chatbots, etc.

  • Data processing: More efficiently retrieving and processing data for companies.

  • Personal assistant: An AI assistant who knows our personal preferences and can do menial tasks for us.

  • The unknown: More likely, the way we will use AI probably doesn’t exist today. Just like Google, Airbnb, and Facebook didn’t exist in the early days of the internet.

Searching for great use cases and business models is likely where I will spend a lot of time over the next few years.

Round 1 Goes to Zuck

To the extent that we know anything about AI today, we know that AI models are becoming commodities, and the hottest company of 2023, OpenAI, and its co-founder Sam Altman may meet the same fate Netscape met nearly three decades ago.

The next step will be figuring out who can do on-device inference and where there’s value in applications. If AI is where the internet was in the mid-1990s, it may be years before we have answers.

But I’ll be looking.

Disclaimer: Asymmetric Investing provides analysis and research but DOES NOT provide individual financial advice. Travis Hoium has a long position in all stocks mentioned. All content is for informational purposes only. Asymmetric Investing is not a registered investment, legal, or tax advisor or a broker/dealer. Trading any asset involves risk and could result in significant capital losses. Please do your own research before buying stocks.
