Recent developments in autonomous vehicles (AVs) have been encouraging, with numerous tests demonstrating safety and reliability improvements. Tesla has made bold claims, saying its cars would be capable of Level 4/5 autonomy in 2020. How realistic is this, and by when can we expect a significant rollout of autonomous cars?

Counterpoint Research predicts that, with the resolution of both the technological and regulatory issues, around 15% of new cars sold in 2030 will be fully autonomous (Level 4 to 5).

To make the vision of AVs a reality, OEMs, suppliers, and start-ups are applying cutting-edge technology to solve some of the biggest problems in computing, engineering, software development, and algorithm design today. However, it is regulatory and legal hurdles, rather than technological issues, that will prove to be the biggest barriers for self-driving technology.

Exhibit 1: ADAS Evolution in Automobile

Advanced Driving Assistance Systems (ADAS) are progressively demonstrating the reality of vehicles taking over control from drivers, and are playing the crucial role of preparing regulators, consumers, and corporations for the possibilities that lie ahead.

The introduction of ADAS has demonstrated that the challenges holding back adoption of AVs are consumer awareness, pricing, and, most significantly, issues of privacy, safety, and security.

Safety remains the critical and overriding component of ADAS technology. ADAS issue both visual and aural warnings when they detect that an accident is imminent. Advanced versions can now actively steer the vehicle away or apply the brakes if drivers ignore the warnings. Driver assistance technologies available today include adaptive cruise control, lane-departure warning systems, autonomous emergency braking systems, and parking-assistance systems. Some vehicles also offer the option of blind-spot monitoring and rear cross-traffic alert to supplement the other assistance systems.

ADAS, enabled by radar scanners, sensors, and cameras integrated into the vehicle's central electronic control modules, are the fundamental building blocks driving progress towards fully autonomous vehicles. These systems enable the vehicle to perceive its surroundings, interpret obstacles and critical situations, and assist or alert the driver in performing maneuvers. The aim, ideally, is to completely prevent accidents, or at least minimize the consequences for those involved.
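To make the perceive-interpret-assist loop concrete, the escalation described above (warn first, then brake if the driver does not react) can be sketched with a toy time-to-collision check. This is purely illustrative, not any manufacturer's actual implementation; the thresholds and the `Track` structure are hypothetical assumptions.

```python
# Illustrative sketch only: toy forward-collision escalation logic,
# not a production ADAS implementation. Thresholds are hypothetical.
from dataclasses import dataclass

WARN_TTC_S = 2.5   # assumed: warn the driver below this time-to-collision
BRAKE_TTC_S = 1.0  # assumed: brake autonomously below this

@dataclass
class Track:
    """A fused radar/camera detection of one obstacle ahead."""
    distance_m: float         # range to the obstacle
    closing_speed_mps: float  # positive when closing on the obstacle

def assist_action(track: Track) -> str:
    """Escalate from no action, to warnings, to emergency braking."""
    if track.closing_speed_mps <= 0:
        return "none"  # not closing on the obstacle, so no collision risk
    ttc = track.distance_m / track.closing_speed_mps  # seconds to impact
    if ttc < BRAKE_TTC_S:
        return "emergency_brake"  # driver ignored warnings; intervene
    if ttc < WARN_TTC_S:
        return "warn"  # issue visual and aural alerts
    return "none"

print(assist_action(Track(distance_m=40.0, closing_speed_mps=10.0)))
print(assist_action(Track(distance_m=20.0, closing_speed_mps=10.0)))
```

Real systems fuse many such tracks per sensor cycle and model driver reaction time, but the warn-then-brake escalation follows this same shape.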

As technology evolves and becomes more cost-effective, design engineers are constantly updating their autonomous systems to include new and different types of sensors. Further adding to the complexity is the role cloud computing will eventually play as 5G radio technology is rolled out and finally offers the bandwidth needed for the massive data streams coming off sensor systems.

Exhibit 2: Increasing Focus on Autonomous Vehicles and Safety

Clearly, automotive companies can no longer be experts in just mechanical and electrical engineering. They now need to be experts in radar, artificial intelligence, software development, motor controls, battery chemistries, big data analysis, machine learning, and the list goes on. The automotive industry, therefore, needs to collaborate across automotive suppliers, OEMs, integrators, technology developers, governments, and test vendors alike, to truly understand and propose solutions.

Regulatory hurdles holding back autonomous cars from becoming an everyday reality

However, the transition from human-driven to autonomous cars will not be seamless, as it remains unclear how AVs fit into existing legal and regulatory frameworks around the world. Debates on how exactly the laws should, and will, handle the introduction of autonomous vehicles have reached differing and often contradictory conclusions. With existing legal frameworks proving to be inadequate, regulatory changes are urgently needed to address a variety of barriers preventing the successful introduction of autonomous vehicles. As commercially available self-driving cars become more imminent, concerns about how the law—specifically tort law—will treat liability for autonomous vehicles have risen considerably.

Autonomous vehicles give rise to new liability and ethical issues

Assigning negligence forms the legal basis for liability in road accidents. Car owners, or drivers, are in the first instance liable for losses arising from accidents caused by their vehicles. Consequently, car owners are required to have, at a minimum, third-party liability insurance. Where an accident is the result of a fault or defect in the car, car owners or drivers will then look to the vehicle manufacturer or any of the component/service providers for recovery of their losses. States impose strict liability on producers of defective products for harm caused by those products. Inevitably, the introduction of autonomous vehicles adds another layer of complexity to attributing liability for car accidents. For example, if you are operating a self-driving car and choose not to override it before an incident, does liability rest with the driver, the AI/software developer, the vehicle manufacturer, or the component supplier?

The problem with autonomy in cars is that drivers tend to over-rely on it. Tesla's Autopilot, for example, does not have Level 3 autonomy, but Level 2, at best. The self-driving capability difference between Level 3—when the car can take full control under certain circumstances—and Level 2 is significant. Teslas do not have particularly sophisticated sensors, and fatal crashes have already demonstrated that fact. At Level 3 autonomy, the driver is required to remain ready to take over control at a moment's notice. Back in 2012, Google tested Level 3 autonomy but found drivers were too trusting, and so decided not to take Level 3 to market at all, preferring instead to leapfrog towards developing fully autonomous Level 5 vehicles, where no steering wheel or any other input is required. There appears to be an emerging consensus that Level 3 autonomy is a bad idea altogether. This means OEMs need to advance from Level 2, something that most leading OEMs have achieved, to at least Level 4. The technological challenges involved in such a jump are akin to progressing directly from powered flight to landing a man on the Moon.

So what does ‘Auto’ stand for again?

Tesla is making bold claims about its autonomous vehicle plans. At an investor event last month, Elon Musk revealed technical details of a new chip and computer for full self-driving capabilities that are already being built into Tesla cars. This is a key part of Tesla’s strategy to make autonomous cars mainstream. The company claims that the new chip will clear the way (subject to receiving regulatory approvals) to improve its software and neural networks to effectively operate its cars as fully autonomous vehicles. In such vehicles, which would be out as early as 2020, drivers would not need to touch the wheel. While it has not always been clear what Musk means when he refers to full self-driving, it is apparent that Tesla does not apply the standard definition of Level 4 or Level 5 autonomy.

Adding further to the debate is the first-ever incident of a self-driving Uber car killing a pedestrian, in Tempe, Arizona, in March 2018. The simple, mundane, everyday situation of a crosswalk, turn, or intersection now presents a much harder and broader ethical predicament: how should a car decide between the lives of its passengers and the lives of pedestrians?

Uncertain timeline to having fully autonomous vehicles

The AV adoption rate largely depends on resolving regulatory issues, as well as changing consumer opinions and overcoming significant technological and economic barriers. It also depends on what we mean by 'autonomous'—are we referring to entirely autonomous systems, or to systems that demonstrate some level of autonomy alongside some degree of human intervention?

The chief differentiator between driver-assisted and fully autonomous systems is the ability to move beyond safety concerns towards freeing up the driver's attention and time.

Innovations in autonomous systems' capabilities will develop quickly. Autonomy in limited, predictable environments, such as freeway driving, will likely be widely available within the next five years. TuSimple, which provides trucks that self-drive long-distance freeway routes, is almost ready for commercial launch, though a human remains in the cab, ready to take control.

Dual control is emerging as the interim halfway mark between ADAS and full autonomy—cars that blend a degree of autonomous control with human driver control. It is this degree of autonomous control that will gradually shift. The notion of a fully autonomous car, where the driver is hands- and attention-free for the entire journey, is, in our view, much further away though.

How far ahead are autonomous vehicles?

Human drivers demonstrate decision-making that is still a long way ahead of an AV's current ability. Human drivers process and react to varying information quickly, making rapid decisions based on experience, judgment, and ethics. Further, humans cautiously negotiate roads occupied by other, similarly unpredictable human drivers, and improvise when confronted with unique situations. Current AVs may perform some of these tasks faster and more consistently, but none can yet compare with a human across all situations.

In early 2018, GM introduced the Cruise AV, an autonomous hatchback based on the Chevrolet Bolt EV, drawing significant attention for its absence of a steering wheel and pedals. While GM has not revealed any plans for a production run, it has been petitioning the US government for permission to test the model on public roads in 2019. Validating the significant role such autonomous cars can play in a future Mobility as a Service (MaaS) automotive market, Japan's SoftBank Vision Fund, a leading global large-tech investor, invested US$2.2 billion in May 2018 for a 19.6% stake in GM's autonomous driving business.

Exhibit 3: Key Firms With Permits to Test Self Driving Cars in California

The other key technology enabler of full autonomy will be the development of comprehensive data networks, comprising edge-based and cloud-based infrastructure. This will require the automotive industry to work closely with other sectors, such as the IT and telecom industries, as well as public and industry policymakers, to reach agreement on a range of important issues, such as data center design and locations, enabling vehicles to communicate reliably and securely with their local environment.

While the future of AVs looks bright, the automotive industry is still a long way from manufacturing vehicles that can self-drive anywhere and everywhere under all conditions.
