The Two Types of Self-Driving Tech Every Investor Should Know About

    Editor’s Note: Tesla’s “Robotaxi” event on October 10 will mark the beginning of a new era for autonomous transportation. It’s one that will have far-reaching implications across the economy.

    By making autonomous vehicles a reality, Tesla is opening the door to tectonic disruption – and ushering in vast new opportunities. My InvestorPlace colleague Luke Lango believes he’s spotted one of them, so I encourage you to click here to sign up for his 10 a.m. broadcast taking place tomorrow, October 7 – less than 24 hours from now.

    Tomorrow’s AV technologies are on the verge of smashing into today, and the shift will be like nothing we’ve seen before. So, I’ve invited Luke here today to tell you more about those technologies… and to preview his October 7 broadcast.

    Take it away, Luke…


    My friends with kids in college now tell me their children pretty much grew up thinking they’d never have to drive due to self-driving cars.

    My buddies with kids in high school now tell me the same thing.

    The folks I know with middle school students, however… those kids have always known they would have to sign up for driver’s ed.

    Silicon Valley can only tell us that self-driving cars are “five years away from being five years away” for so long before we doubt their story.

    Still, I’ve been bullish on autonomous vehicles for a while now. The industry’s developments, after all, have been promising.

    And here’s the thing… Maybe those kids in middle school now won’t have to learn to drive.

    Those self-driving tech developments are rapidly moving from promise to reality.

    You probably know about the rapid expansion of autonomous ride-hailing services in Phoenix and San Francisco. But did you know that Waymo started offering driverless rides to some folks in Austin today?

    You may have heard about the rollout of autonomous trucking in Texas and Arizona. But did you know that, just this past Monday, California Governor Gavin Newsom vetoed a bill that would have banned driverless trucks from operating on his state’s roads?

    And with the upcoming launch of Elon Musk’s robotaxi in just a week – on Thursday, October 10 – I believe the stage is set for self-driving cars to begin transforming the $11 trillion transportation services industry. (Sign up for my October 7 morning broadcast about that event – and how to put your portfolio in a position to profit from it – by going here.)

    Now, that’s all great information to know. But it doesn’t mean much if we don’t grasp how these vehicles actually work.

    After all, understanding the tech behind a burgeoning megatrend is key… before we can start profiting from it.

    Therefore, to potentially turn the “Age of AVs” into a massive payday, we must first understand how self-driving cars work…

    A Sensor Trifecta

    At its core, a self-driving car is operated by a combination of sensors – the “hardware stack” – and AI-powered software – the “software stack.”

    In short, the car’s sensors gather information about its surroundings. Then, the AI software processes that data to determine whether the vehicle should accelerate, brake, change lanes, turn, etc.

    And this all needs to happen virtually instantaneously.

    Usually, the hardware stack is composed of three types of sensors: cameras, radar, and lidar. A typical self-driving car uses all three because each has strengths and weaknesses that complement the others nicely.

    Let’s go through them one by one…

    • Cameras collect visual data. They capture high-resolution images of the vehicle’s environment, similar to how our own eyes work. These cameras recognize various signs, lane markings, and traffic lights – and can distinguish between different objects, like pedestrians, cyclists, and vehicles. They are very good at providing detailed visual information, which helps the car understand the context of its surroundings. But they tend to perform poorly in bad visual environments, like when there’s low light or inclement weather.
    • An AV’s radar sensors emit radio waves that bounce off objects and return to the sensor, providing information about the distance, speed, and movement of obstacles in the car’s vicinity. These sensors work well in all weather conditions (complementing cameras nicely), but they provide limited resolution and detail (where cameras excel).
    • Lidar – which stands for light detection and ranging – is essentially radar powered by lasers. These sensors emit laser pulses that bounce off surrounding objects and return to the sensor. By measuring the time it takes for the light to return, lidar can create a high-resolution 3D map of the vehicle’s environment. This provides accurate depth perception, enabling the car to understand the exact shape, size, and distance of surrounding objects. However, lidar doesn’t capture color or texture information (like cameras do).

    In other words, AVs use cameras to see things. Radar senses how fast those things are moving. And lidar helps calculate the exact position of those things.

    In this sense, it is easy to see how these three sensors work together within a self-driving car.
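
    If you’re curious what that looks like under the hood, here’s a rough, purely illustrative sketch in Python of how each sensor type’s raw output might be represented inside an AV’s computer. The field names and numbers are hypothetical stand-ins for illustration, not any particular carmaker’s format.

        from dataclasses import dataclass
        from typing import List, Tuple

        # Hypothetical, simplified representations of each sensor type's raw output.

        @dataclass
        class CameraFrame:
            # High-resolution color image: rich visual detail (signs, lane markings),
            # but degraded by low light, glare, rain, or fog.
            width: int
            height: int
            pixels: bytes  # raw RGB image data

        @dataclass
        class RadarDetection:
            # Radio-wave echo: works in any weather, but coarse in resolution.
            range_m: float            # distance to the object, in meters
            closing_speed_mps: float  # how fast it is approaching, in meters/second

        @dataclass
        class LidarPointCloud:
            # Laser time-of-flight returns: precise 3D geometry, but no color or texture.
            points: List[Tuple[float, float, float]]  # (x, y, z) coordinates in meters

        # One snapshot of the environment, as seen by all three sensor types.
        camera = CameraFrame(width=1920, height=1080, pixels=b"...")
        radar = [RadarDetection(range_m=22.5, closing_speed_mps=1.4)]
        lidar = LidarPointCloud(points=[(22.4, 0.8, 0.1), (22.5, 0.9, 1.6)])

    Notice that each data type captures something the other two can’t, which is exactly why AVs combine all three.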

    Driven by Next-Gen Software

    Self-driving cars use what is called “sensor fusion” to combine camera, radar, and lidar data, creating a complete, accurate, and reliable model of their environment.

    For example, if a person crosses the road in front of an AV:

    • The camera identifies it as a person.
    • The radar tracks the pedestrian’s speed to predict potential collisions.
    • The lidar measures the pedestrian’s exact distance, shape, and movement.

    Together, these sensors allow the car to make informed decisions, such as slowing down, stopping, or rerouting, ensuring safe and efficient navigation.
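
    To make that pedestrian example a bit more concrete, here’s a toy sketch in Python of how those three readings might be fused and turned into a braking decision. The names, numbers, and decision rule are simplified assumptions for illustration only, not how any real AV stack is written.

        from dataclasses import dataclass

        @dataclass
        class FusedObject:
            label: str                # from the camera ("pedestrian", "cyclist", etc.)
            distance_m: float         # from lidar's precise 3D measurement
            closing_speed_mps: float  # from radar's speed measurement

        def fuse(camera_label: str, lidar_distance_m: float, radar_speed_mps: float) -> FusedObject:
            """Combine each sensor's strength into one picture of the object."""
            return FusedObject(camera_label, lidar_distance_m, radar_speed_mps)

        def decide(obj: FusedObject, car_speed_mps: float, max_braking_mps2: float = 6.0) -> str:
            """Toy rule: brake if the car cannot comfortably stop before the pedestrian."""
            stopping_distance_m = (car_speed_mps ** 2) / (2 * max_braking_mps2)
            if obj.label == "pedestrian" and obj.distance_m <= stopping_distance_m + 5.0:
                return "brake"
            return "maintain"

        pedestrian = fuse("pedestrian", lidar_distance_m=18.0, radar_speed_mps=1.2)
        print(decide(pedestrian, car_speed_mps=13.4))  # ~30 mph -> prints "brake"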

    But it can only make those decisions with the help of its software stack.

    An AV utilizes a variety of software and methods to provide real-time intelligence about its surroundings. There are essentially five components to this software stack: perception, localization, prediction, planning, and control.

    In short:

    • The perception software uses sensor fusion, object classification, and semantic segmentation to create a comprehensive picture of a car’s environment.
    • The localization software uses highly detailed maps and location data to place the car precisely in its environment.
    • The prediction software leverages machine learning models to predict how things in the environment may act in different scenarios.
    • The planning software takes the outcomes of the perception, localization, and prediction software to decide an optimal path for the car.
    • The control software executes the planned action, controlling the car’s steering, acceleration, braking, etc.

    Together, these hardware and software stacks form the technological backbone of self-driving cars.
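
    For readers who like to see the moving parts, here’s a highly simplified sketch in Python of that five-stage loop. Every function body below is an assumed stand-in rather than real AV code, but the order of operations mirrors the stack described above.

        def perceive(camera, radar, lidar):
            # Perception: sensor fusion and object classification -- "what is around me?"
            return {"objects": [{"label": "pedestrian", "distance_m": 18.0}]}

        def localize(scene, hd_map, gps):
            # Localization: place the car precisely on a detailed map -- "where exactly am I?"
            return {"lane": "center", "position_m": (120.4, 3.1)}

        def predict(scene):
            # Prediction: machine-learning models estimate what each object will do next.
            return {"pedestrian_will_cross": True}

        def plan(scene, pose, forecast):
            # Planning: choose an optimal, safe path given everything above.
            return "slow_and_yield" if forecast["pedestrian_will_cross"] else "continue"

        def control(action):
            # Control: translate the plan into steering, throttle, and brake commands.
            return {"brake": 0.4} if action == "slow_and_yield" else {"throttle": 0.2}

        # One pass through the loop; a real vehicle repeats this many times per second.
        scene = perceive(camera=None, radar=None, lidar=None)
        pose = localize(scene, hd_map=None, gps=None)
        forecast = predict(scene)
        action = plan(scene, pose, forecast)
        print(control(action))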

    The Final Word

    Of course, every company attacks the self-driving problem differently. But this is the general framework most follow.

    As such, when looking to invest in the autonomous vehicle supply chain, it makes sense to look for the stocks of companies providing AVs’ critical components.

    Find the strongest camera, radar, and lidar providers. Focus on the most promising software plays.

    They’ll likely be the biggest winners in the Age of AVs.

    In fact, if you’re hoping to get positioned for an era of AV-powered market gains, click here to save your seat for my special video briefing tomorrow, October 7, at 10 a.m. Eastern. With the broadcast just hours away, I urge you to secure your spot now. There, we’ll talk all about the quickly unfolding Age of AVs.

    This upcoming broadcast is all about getting you prepared for Tesla’s Robotaxi launch later in the week (which we expect will be huge). But it’s about much more than that upcoming debut.

    Indeed, in this broadcast, I’ll detail all the recent groundbreaking developments in the autonomous vehicle industry, including how robotaxis are set to completely transform transportation, save millions of lives, and potentially put up to $30,000 a year in passive income in your pocket.

    Plus, I’ll show you how to get my playbook on the best AV stocks to buy right now.

    Click here to reserve your seat now.

    And I’ll see you there.

    Sincerely,

    Luke Lango

    Editor, Hypergrowth Investing
