Why We Should Get Rid of Driverless... Levels

Thursday, September 29, 2016
Technology, Sustainable Mobility
[Image: SAE levels graphic with a stop sign]
Michael Haines
VANZI

We’ve recently seen at least one fatal accident (and possibly two, if you count the one in China) involving a Tesla where the driver had relinquished control to the ‘autopilot’.

In this case, Tesla is denying liability, correctly claiming that the autopilot is sold as a ‘Level 2’ device. This requires the driver to remain alert, ready to respond if the autopilot fails to detect impending danger.

The trouble is, people are terrible at passive monitoring tasks. We get bored and distracted.

Tesla has just released an upgrade, which it says is even better and will avoid the problems of the earlier version. As a result, there will be far fewer occasions when the driver needs to intervene, making it likely that drivers become even more blasé. The unfortunate irony is that, as the device improves, it potentially becomes more dangerous. When it does fail, the ‘distracted’ driver will have no hope of responding in time.

It is clear that the development of Driverless cars is outpacing the ability of regulators to govern road safety.

In the US, the National Highway Traffic Safety Administration (NHTSA) and SAE International (a global association of more than 128,000 engineers and related technical experts in the aerospace, automotive and commercial-vehicle industries) have each developed a set of guidelines. In addition, each state is writing its own legislation. In Europe, as elsewhere, various organizations and government bodies are all grappling with how to define and regulate increasing levels of automation.

The problem is that the guidelines are far too general.

It means that innovative companies like Tesla can be led down a path that seems to improve safety, but is actually dangerous.

For the sake of clarity, it would be better to recognise two separate challenges:

1. Driver Assist

2. Driverless

In the first case, the car may control acceleration and braking to maintain speed and spacing (e.g. adaptive cruise control). However, the driver should never be permitted, even for a moment, to voluntarily release control of steering, nor should the car take over active steering except to avert an accident. Here, the challenge is to monitor the driver’s behaviour and alertness, as well as the immediate surrounds, to ensure the driver is responding appropriately and, if not, to intervene. That may include steering out of harm’s way, and perhaps even pulling over.
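In concrete terms, the supervision logic might look something like the minimal Python sketch below. Every interface named here (threat_ahead, driver_alertness, pull_over_safely, and so on) is a hypothetical placeholder invented for illustration, not any real vehicle API:

```python
# Illustrative sketch of a Driver Assist supervision loop.
# All sensor/actuator calls are invented names for this example.
import time

ALERTNESS_THRESHOLD = 0.6   # assumed score in [0, 1] from driver monitoring
WARNING_GRACE_S = 3.0       # assumed time the driver gets to respond

def supervise(car):
    while car.driver_assist_engaged():
        if car.threat_ahead() and not car.driver_is_responding():
            # The car never holds the wheel routinely, but it may
            # intervene momentarily to avert an accident.
            car.emergency_brake()
            car.steer_out_of_harms_way()
        elif car.driver_alertness() < ALERTNESS_THRESHOLD:
            car.warn_driver()               # chime, seat vibration, etc.
            time.sleep(WARNING_GRACE_S)
            if car.driver_alertness() < ALERTNESS_THRESHOLD:
                car.pull_over_safely()      # last resort
        time.sleep(0.1)                     # assumed control-loop period
```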

In the second case, the challenge is to expand the areas and conditions where the car can operate in full control, with a managed handover between modes. This needs to work like a handover between pilots: the car alerts the person that it is approaching an area and/or conditions (e.g. impending rain) where it is not rated to operate in driverless mode, and the person is given time to orient themselves and formally acknowledge that they are back in control.

This may need to include monitoring the person’s level of alertness before making the handover and, if there is any concern, having the car pull over. The problem could arise, for example, if the person has been in a deep sleep on a long highway journey.
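One way to picture the handover is as a small state machine. The sketch below is illustrative only; the states, the timeout, and the alertness check are assumptions for the example, not a published standard:

```python
# Illustrative handover state machine for a Driverless-to-driver transfer.
import time
from enum import Enum, auto

class Mode(Enum):
    DRIVERLESS = auto()
    HANDOVER_PENDING = auto()
    DRIVER = auto()
    PULLED_OVER = auto()

ACK_TIMEOUT_S = 60.0   # assumed time allowed for an orderly takeover

def request_handover(car):
    """Hand control back before leaving the rated operating envelope."""
    car.mode = Mode.HANDOVER_PENDING
    car.alert_person("Approaching conditions not rated for driverless mode")
    deadline = time.monotonic() + ACK_TIMEOUT_S
    while time.monotonic() < deadline:
        # The person must both formally acknowledge AND appear alert,
        # not, say, still groggy from a deep sleep on a long journey.
        if car.person_acknowledged() and car.person_is_alert():
            car.mode = Mode.DRIVER
            return
        time.sleep(0.5)
    # No confirmed, alert takeover: pull over rather than guess.
    car.pull_over_safely()
    car.mode = Mode.PULLED_OVER
```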

There should never be any doubt as to who is in control.

This is important for operational reasons, but we also need to know who is legally in control.

If it is the driver, they will be primarily responsible for any accident, unless they can show equipment failure. If it is the car, the manufacturer will be liable.

No doubt, this takes the ‘fun’ out of ‘hands-free’ driving. But Driver Assist technology is not there for fun; it is there to improve safety. As such, the autopilot must sit in the background, ready to respond to avoid a forward collision or any other threat it is programmed to manage.

Having the two classifications does not stop on-road testing of Driverless mode.

It just means that the manufacturer must put in place systems to compare what the driver actually does with how the car would have responded. In the case of the Tesla accident, such a system (with the driver in control) would detect that the driver braked ‘unexpectedly’ on the highway. That would trigger an upload of the data, which would show in simulation that the autopilot had failed to detect the truck in its path, leading to the improvements just announced.
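Such a ‘shadow mode’ comparison could be structured roughly as follows. Again, this is only a sketch; the threshold, the field names, and the upload mechanism are all assumed for the example:

```python
# Illustrative shadow-mode comparison: the autopilot plans but never
# acts, and sharp disagreements with the human are flagged for upload.
HARD_BRAKE_G = 0.4   # assumed deceleration marking an "unexpected" brake

def shadow_compare(car, autopilot):
    actual = car.read_driver_inputs()             # real pedal/steering inputs
    planned = autopilot.plan(car.sensor_frame())  # what it WOULD have done

    # The article's example: the driver brakes hard on a highway
    # while the shadow autopilot saw no reason to slow down.
    if actual.brake_g >= HARD_BRAKE_G and planned.brake_g < 0.1:
        car.upload_event(
            kind="driver_braked_unexpectedly",
            sensor_log=car.recent_sensor_buffer(),
            shadow_plan=planned,
        )
```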

Under this scenario, as more and more ‘learning’ and improvements take place, at some point Tesla would become confident enough to release the autopilot for driverless mode (perhaps only on specific highways, or in slow-moving traffic, to start).

That’s when the fun starts.
