The Trillion Dollar Opportunity: Autonomous Driving Spotlight

Laying the groundwork for a decade of innovation and disruption.

Today’s deep dive is on autonomous driving as an industry. This won’t be a specific stock spotlight; instead, it will cover the autonomous driving landscape, which could be worth trillions of dollars a decade or two from now.

You read that right. Autonomous driving could be a multi-trillion-dollar market. And it’s already a huge focus of Asymmetric Investing.

Why Autonomous Driving?

Why does autonomous driving have asymmetric upside? How could it be a $1 trillion+ market a decade from now?

In the U.S. alone, 3.19 trillion miles are driven by vehicles each year.

Estimates vary, but Uber charges about $2 per mile on average.

If autonomous vehicles can get just 1/3 of those miles at $1 per mile, the opportunity is over $1 trillion in revenue.

And the market for transportation as a service (TaaS) is already proven. Over the past 12 months, $150 billion of Uber rides were booked.

Assume autonomous vehicles have an operating margin of 20% on $1 trillion in revenue and there’s $200 billion+ in potential operating profit.

Slap a 20x price to operating profit multiple on the industry and you get $4 trillion in value.
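The back-of-envelope math above can be checked in a few lines of Python. Every input here is one of the article’s stated assumptions, not a forecast:

```python
# Back-of-envelope autonomous driving TAM, using the article's assumptions.
US_MILES_PER_YEAR = 3.19e12   # total U.S. vehicle miles driven annually
CAPTURE_SHARE = 1 / 3         # assumed share of miles served autonomously
PRICE_PER_MILE = 1.00         # assumed $/mile (vs. ~$2/mile for Uber today)
OPERATING_MARGIN = 0.20       # assumed industry operating margin
MULTIPLE = 20                 # assumed price-to-operating-profit multiple

revenue = US_MILES_PER_YEAR * CAPTURE_SHARE * PRICE_PER_MILE
operating_profit = revenue * OPERATING_MARGIN
industry_value = operating_profit * MULTIPLE

print(f"Revenue: ${revenue / 1e12:.2f}T")                  # ~$1.06T
print(f"Operating profit: ${operating_profit / 1e9:.0f}B") # ~$213B
print(f"Implied value: ${industry_value / 1e12:.2f}T")     # ~$4.25T
```

Tweak any one of the assumptions and the implied value moves linearly, which is exactly why the next paragraph’s "even if I’m off by 95%" framing matters.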

Sure, these numbers are made up, but even if I’m off by 95%, there are stocks with significant upside.

And this isn’t a pipe dream. Autonomous driving is here. NOW!

Key Concepts

Before we get into technology and business models, I want to lay the groundwork for how to define autonomous driving. The most common conventions come from SAE International, which you can see below.

The key distinction is between Level 2 and Level 3. Both have similar technological features, but the safety responsibility falls on the DRIVER with Level 2 and the VEHICLE with Level 3.

An easy way to think about it is Level 2 autonomy being advanced cruise control and Level 3 being a fully autonomous driver.

Level 4 is the point at which the car may not even have a steering wheel. And the big difference between Level 4 and Level 5 is the geofence (geographic limitation) on Level 4.

Technology

Note: I am not an autonomous driving technology expert but I’m going to try to lay out the different approaches companies take as best I can. Forgive me for any mistakes.

There are two main approaches to autonomous driving technology today.

  1. Multi-sensor: multiple types of sensors and software with layers of software redundancy. There are wide variations in the level of safety and capability within this broad category.

  2. Vision-only, AI-powered.

To put it simply, most companies use a multi-sensor, multi-redundancy approach to autonomous driving. Tesla is the only major company building a vision-only system powered by AI.

We know the multi-sensor approach can achieve Level 4 autonomy because Level 4 vehicles have already driven millions of miles without a human driver. As an industry participant said recently: we have the solution. Now it’s just a matter of scaling and making the economics work.

We don’t know if vision-only, AI-powered autonomous driving systems can get beyond Level 2. I put that bluntly because it’s true!

Now, what are the different sensors that are collecting data for the autonomous driving system?

LiDAR

LiDAR is a key sensor in most autonomous driving systems. IBM defines LiDAR like this:

LiDAR, an acronym for “light detection and ranging,” is a remote-sensing technology that uses laser beams to measure precise distances and movement in an environment, in real time.

LiDAR data can be used to generate everything from detailed topographic maps to the precise, dynamic 3D models that are required to safely guide an autonomous vehicle through a rapidly and constantly changing environment.

In a vehicle, LiDAR sensors spin and detect physical objects up to 200 meters away. The system then creates a 3D map of the environment like the one you see below. It’s a little like building a video game world on the fly.

[Image: LiDAR-generated 3D map of a street scene (Waymo)]
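Under the hood, each LiDAR point is a simple time-of-flight calculation: the sensor times how long a laser pulse takes to bounce back and converts that to distance. A minimal sketch of the idea (illustrative only, not any vendor’s actual firmware):

```python
C = 299_792_458  # speed of light in m/s

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to an object from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path: d = c * t / 2.
    """
    return C * round_trip_seconds / 2

# A pulse returning after ~1.33 microseconds corresponds to an object
# roughly 200 meters away -- about the range quoted above.
print(lidar_distance(1.334e-6))  # ~199.96 meters
```

Doing this millions of times per second across a spinning array of lasers is what produces the dense 3D point cloud pictured above.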

The downside of LiDAR sensors has long been their cost. As recently as 2020, an automotive LiDAR unit could cost $3,500, and Elon Musk has pointed to that cost hurdle as a reason LiDAR isn’t feasible. The cost of a LiDAR sensor is now expected to be closer to $500, with prices continuing to come down.

I think LiDAR is following the path of many emerging technologies. It was very expensive a decade ago, so some companies (Tesla) eschewed it. But that mental model needs updating because costs are falling rapidly. Your phone may already have a LiDAR sensor, and an auto-grade LiDAR could cost on the order of $100 a decade from now, a rounding error in the cost of a vehicle.

Cameras

Every autonomous driving system uses cameras. Every system does something like what you see below, trying to identify objects in real-time to navigate the vehicle.

[Image: real-time camera object detection (Waymo)]

While every system may use cameras a little differently, the process of sensing, identifying, and reacting is well established. Most objects are identifiable using camera sensors alone.

The challenge with relying on cameras alone is the extreme edge cases: an object the system has never seen before, or an object obscured by something like the sun. In one example, a Tesla on Autopilot didn’t see a white semi-truck crossing the highway because the truck blended into the sun behind it.

This is why I mentioned redundancy above. A camera may miss that truck. LiDAR wouldn’t.

In my opinion, when we’re talking about something as safety-critical as a vehicle, cameras aren’t enough.

Radar

Your vehicle today likely has radar, and autonomous vehicles use a similar technology. Cadence explains radar like this:

The principle of operation for LiDAR and RADAR are the same, but instead of the light waves used in LIDAR, RADAR relies on radio waves. The time taken by the radio waves to return from the obstacles to the device is used for calculating the distance, angle, and velocity of the obstacle in the surroundings of the autonomous vehicle.

Compared to other sensor technologies in autonomous vehicles, RADAR works reliably under low visibility conditions such as cloudy weather, snow, rain, and fog.

If it sounds like radar and LiDAR are similar, it’s because they are. But radar is more of a secondary/backup system: it provides velocity data and works more effectively than LiDAR and cameras in inclement weather. As you can see below, the data isn’t as clean-looking as LiDAR’s.

[Image: radar returns (Waymo)]

The combination of these three inputs is key to most autonomous driving platforms and lays the foundation for how we should think about operations and safety in the industry.
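One way to picture the redundancy argument: treat each sensor as an independent detector and flag an obstacle if any modality reports it. This is a toy illustration of the idea, not how any production stack actually fuses sensor data (real systems weigh probabilities, track objects over time, and cross-check modalities):

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """Whether each sensing modality currently reports an object ahead."""
    camera_sees_object: bool
    lidar_sees_object: bool
    radar_sees_object: bool

def obstacle_detected(r: SensorReadings) -> bool:
    """Conservative 'OR' fusion: treat the path as blocked if ANY sensor fires.

    A camera blinded by sun glare is backstopped by LiDAR and radar,
    which don't depend on visible light.
    """
    return r.camera_sees_object or r.lidar_sees_object or r.radar_sees_object

# The white-truck-against-the-sun scenario: the camera misses the truck,
# but LiDAR and radar still detect it.
glare_case = SensorReadings(camera_sees_object=False,
                            lidar_sees_object=True,
                            radar_sees_object=True)
print(obstacle_detected(glare_case))  # True
```

Even this crude sketch shows why a multi-sensor system can catch failures that would be fatal for any single modality on its own.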

Artificial Intelligence
