Self-Driving Vehicle Technology: Progress and Promises

view from inside a self-driving vehicle, illustration - Credit: Petovarga

Communications of the ACM, October 2020, Vol. 63 No. 10, Pages 20-22
Technology Strategy and Management
By Michael A. Cusumano

“If and when this technology will make its way into your average passenger vehicle is uncertain, but there is no doubt that companies have been moving closer toward their goal.”


Automakers have already spent at least $16 billion developing self-driving technology, with the promise of someday creating fully autonomous vehicles. What has been the result? Although it seems that we have more promises than actual progress, some encouraging experiments are now under way, and there have been intermediate benefits in the form of driver-assist safety features.


Engineers started on this quest to automate driving several decades ago, when passenger vehicles first began deploying cameras, radar, and limited software controls. In the 1990s, automakers introduced radar-based adaptive cruise control and dynamic traction control for braking. In the 2000s, they introduced lane-departure warning and driver-assist parking technology. Since 2017, Waymo, Uber, Daimler, the U.S. Postal Service, and several automakers have launched experiments with robo-taxis or robo-trucks, targeting Level 4 autonomy (see the sidebar at the end of this column). If and when this technology will make its way into your average passenger vehicle is uncertain, but there is no doubt that companies have been moving closer toward their goal.


The basic technologies and engineering skills needed to make self-driving vehicles more widely available already exist. The most popular camera packages, from Mobileye (purchased by Intel in 2017) and OmniVision, are relatively inexpensive. However, some self-driving systems also deploy much more expensive lasers (usually referred to as “lidar,” for Light Detection and Ranging) as well as radar and ultrasound sensors, supplied by firms such as Ibeo, Velodyne, and Autoliv. Major auto parts and technology suppliers, led by Bosch, Denso, Aptiv (formerly Delphi Automotive, which also purchased the AI and robotics software company nuTonomy in 2017), TRW, and Continental, assemble these components into various driver-assist systems and use microprocessors from Intel-Mobileye, Nvidia, and ARM.

BlackBerry, formerly a pioneer in secure email and smartphones, has become a player in the automotive IoT software market with its QNX operating system, which runs on some 150 million vehicles. Green Hills Software competes in this business as well, along with Google. Automakers, auto parts vendors, and robotics and AI startups all have been learning how to design the AI and machine-learning applications needed to process sensor data and either warn drivers or provide instructions to the vehicle subsystems.
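
To make that last step concrete, here is a minimal sketch, for illustration only, of the kind of calculation a driver-assist warning feature performs: estimating time-to-collision from the range and closing speed reported by a radar or camera track, and alerting the driver when it drops below a threshold. The class name, field names, and the 2.5-second threshold are assumptions for this example, not any supplier's actual interface.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_m: float         # range to the vehicle ahead, e.g. from radar
    closing_speed_mps: float  # positive when we are approaching it

def time_to_collision(obj: TrackedObject) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if obj.closing_speed_mps <= 0:
        return float("inf")   # not closing in; no collision course
    return obj.distance_m / obj.closing_speed_mps

def should_warn(obj: TrackedObject, threshold_s: float = 2.5) -> bool:
    """Trigger a forward-collision warning when time-to-collision is short."""
    return time_to_collision(obj) < threshold_s

if __name__ == "__main__":
    lead_car = TrackedObject(distance_m=30.0, closing_speed_mps=15.0)
    print(time_to_collision(lead_car))  # 2.0 seconds
    print(should_warn(lead_car))        # True -> warn the driver

Production systems fuse many such tracks from multiple sensors and weigh far more variables, but the basic pattern of continuously computed safety margins and threshold-based alerts is the same.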


Electric vehicles rely heavily on computers to control their functions, and this characteristic makes them especially suitable for self-driving technology. Not surprisingly, Tesla vehicles deploying driver-assist technology have logged nearly two billion miles and the company probably leads the industry in data collection. Tesla was able to commercialize its Autopilot system because it used cameras, radar, and ultrasound, rather than the more expensive lidar. However, Tesla vehicles are still somewhere between Levels 2 and 3—far from the goal of autonomous driving. They also have been involved in several high-profile accidents when drivers stopped paying attention, so the company now insists that drivers keep their hands on the steering wheel and eyes on the road.

Sidebar: Levels of Autonomy in Self-Driving Vehicles

Level 1—Driver Assistance: The system and the human driver share control; for example, radar-based adaptive cruise control manages acceleration and braking, and lane-keeping technology assists while the driver steers.


Level 2—Partial Automation: The system controls vehicle operations such as acceleration, braking, and steering, but drivers must constantly monitor operations and usually need to hold the steering wheel for the autonomous system to operate. Assists steering, lane changing, traffic-jam driving (a low-speed version of cruise control), and overtaking.


Level 3—Conditional Automation: The system controls most vehicle operations, but driver monitoring and intervention are still essential. Drivers may be “hands off” on highways for short periods. Assists lane changes, parking, and traffic-jam driving.


Level 4—High Automation: The system supports self-driving with no or minimal driver intervention, but primarily in mapped locations. Automated lane changing and other features available.


Level 5—Full Automation: The system requires no human intervention, as in a robo-taxi or robo-truck.


This sidebar is based on various sources, including: Shuttleworth, J. SAE Standards news: J3016 automated-driving graphic update. SAE.org News, January 7, 2019; and Madhavan, R. How self-driving cars work: A simple overview. Emerj.com, June 3, 2019.


About the Author:

Michael A. Cusumano is Deputy Dean and SMR Distinguished Professor at the MIT Sloan School of Management, and co-author of The Business of Platforms (2019).