Driverless cars are set to become the norm in our future. With automated parking and cruise control already built into today's cars, governments and experts expect that vehicles will eventually become fully automated. As convenient as this may be, once control of the vehicle is taken out of human hands, a number of legal issues arise. In September 2020, the National Transport Commission (NTC) released the ‘Automated Vehicle Program Approach’, which addresses some of these issues and outlines the recommended approach going forward. Here, we discuss some of the legal implications of driverless cars and what the government is doing to address them.
Who is responsible when things go wrong?
Imagine that you’ve picked out your driverless car from a reputable manufacturer and had it registered. You turn on the ignition, set the car in motion, then sit back to read a book. Suddenly, a lady with a pram steps onto the road. The car beeps, asking you to assume control. But before you have time to do anything, your vehicle crashes into another car, breaking the arm of a passenger inside. Who is responsible for injuring the passenger?
Various entities may be legally responsible for damage caused by autonomous vehicles. In its 2018 policy paper, ‘Changing driving laws to support automated vehicles’, the NTC determined that legal responsibility could lie with the following entities:
- The fallback-ready user of the car. This is the human ‘supervisor’ of the autonomous vehicle. They should be ready to assume control of the vehicle if some emergency or error arises.
- The operator. This is the person who sets the vehicle in motion. They are also in a position to ensure that the autonomous vehicle has the latest software updates and is in good physical condition.
- The registered operator. This is the registered owner of the vehicle. They hold registration and other legal duties to ensure that the vehicle operates correctly.
- The manufacturer. This is the entity that has created the vehicle. They are responsible for ensuring that the vehicle’s software is sound and operates in accordance with traffic laws.
NTC recommendations
The NTC has recommended that whether an entity is legally ‘in control’ of the vehicle should depend on the level of automation. For cars that allow a human supervisor to take control, the ‘fallback-ready user’ of the car should be deemed legally ‘in control’ of the vehicle. This applies, of course, to cruise control and automated reverse parking, where the human driver can assume control at any point. However, it could also extend to vehicles that control everything, including steering and braking: when the vehicle asks the human driver to take control in the event of an emergency or system failure, the human becomes legally in control of the vehicle.
However, the NTC has indicated that where the car is fully automated and there is no option for humans to take control, the ‘automated driving system entity’ (ADSE) should be legally responsible. The ADSE should also be legally responsible for partially automated vehicles, up until the point at which the vehicle asks its human supervisor to take control. The ADSE is the entity responsible for certifying that the autonomous vehicle is safe and will abide by traffic laws; this will ordinarily be the manufacturer.
To ensure that manufacturers and other ADSEs prioritise safety, the NTC has proposed the development of a Safety Assurance System. Under this system, an entity would need to certify its safety measures before the government approves the vehicle for first supply.
Will insurance cover driverless cars?
Currently, insurance schemes do not sufficiently address automated vehicles. As noted by the NTC’s 2019 ‘Motor Accident Injury Insurance Automated Vehicles Discussion Paper’, this is for a number of reasons:
- Current laws on motor vehicle accidents do not account for an autonomous vehicle system being the ‘driver’ of the vehicle; they apply only to human drivers. This may prevent people injured by driverless cars from accessing compensation.
- Most motor vehicle laws are based on some ‘fault’ of the driver. As machines cannot behave negligently or maliciously, this can also stop people injured by driverless cars from accessing compensation.
- Most motor vehicle laws do not sufficiently cover accidents caused by faulty products.
The NTC has pushed for reform so that insurance schemes more adequately address driverless cars. In August 2019, transport ministers agreed to extend existing motor accident injury insurance schemes to cover injuries from automated vehicle crashes. This is awaiting endorsement by the Treasury ministers responsible for the motor vehicle insurance schemes.
Cybersecurity of driverless cars
One intended advantage of automation is increased safety on our roads through the reduction of human error. However, handing over all control of a vehicle to a machine may not be as safe as it seems. Driverless vehicles bring significant cybersecurity risks: if someone were to hack into a driverless vehicle’s software, they could cause serious harm to the public.
To address this, the NTC has proposed including cybersecurity certification and accreditation requirements in the Safety Assurance System. Applicants, such as manufacturers, would need to provide proof of the strength of a vehicle’s cybersecurity system, and would be legally responsible should that system fall short of the required standard.
Conclusion
In the near future, driverless cars will be part of our everyday lives. To prepare for this, the government is considering policies to address the key legal issues outlined above. The NTC’s recommendations appear sound; the hope is that they will soon be made into law, so that someone is always responsible for the actions of driverless cars.