Uber autonomous vehicle death likely to raise questions around safety
20 March 2018
An autonomous vehicle operated by ride-hailing company Uber has been involved in a fatal collision with a pedestrian while being tested on roads in the US.
The company has now halted all of its autonomous vehicle trials worldwide while it and the authorities carry out a full investigation into the circumstances surrounding the accident. The vehicle had a human safety driver behind the wheel at the time. The incident marks the first time an autonomous vehicle has killed a pedestrian, and it underscores the safety questions surrounding driverless cars.
The woman was crossing the road, apparently without using a designated pedestrian crossing, when she was struck by the Uber vehicle, a Volvo, according to the police. She was taken to hospital, where she died from her injuries. 'Uber is assisting and this is still an active investigation,' the police said.
The incident is likely to prompt calls for increased safety during autonomous vehicle trials, with many, including senior former US officials, arguing that testing on public roads has come too soon. Anthony Foxx, who served as US Secretary of Transportation under former President Barack Obama, called the accident a 'wake-up call to the entire [autonomous vehicle] industry and government to put a high priority on safety.'
More than a dozen US states allow autonomous vehicles on their roads to some degree. Officials typically require a person to be on hand, either in the car or remotely, in case something goes wrong, according to the Center for Automotive Research.
European countries are starting to allow driverless vehicle trials on public roads. In Germany, legislation passed last year requires a black box to record every journey, logging whether the human driver or the car's self-piloting system was in charge at each moment of the ride. Should an accident occur during testing, the box will be crucial in apportioning blame, telling authorities whether the vehicle was under human or computer control at the time.
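To make the idea concrete, the sketch below shows one way such a control-mode log could be queried after an incident. It is a minimal illustration only: the German rules mandate the record, not any data format, so the names here (ControlMode, LogEntry, who_was_driving) are hypothetical and not drawn from any real system.

```python
# Minimal sketch, not a real black-box schema: every name here is hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class ControlMode(Enum):
    HUMAN = "human"            # the safety driver was piloting the vehicle
    AUTONOMOUS = "autonomous"  # the self-piloting system was in charge


@dataclass(frozen=True)
class LogEntry:
    timestamp: datetime  # when this control state was recorded
    mode: ControlMode    # who was in charge from this moment on


def who_was_driving(log: list[LogEntry], incident: datetime) -> ControlMode:
    """Return the control mode most recently logged before the incident."""
    earlier = [entry for entry in log if entry.timestamp <= incident]
    if not earlier:
        raise ValueError("no log entries precede the incident time")
    return max(earlier, key=lambda entry: entry.timestamp).mode


# Illustrative timestamps only: control passes to the human driver
# one minute before an incident is recorded.
log = [
    LogEntry(datetime(2018, 1, 1, 12, 0, tzinfo=timezone.utc), ControlMode.AUTONOMOUS),
    LogEntry(datetime(2018, 1, 1, 12, 5, tzinfo=timezone.utc), ControlMode.HUMAN),
]
print(who_was_driving(log, datetime(2018, 1, 1, 12, 6, tzinfo=timezone.utc)))
# ControlMode.HUMAN
```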
The UK Government is to approve plans to test self-driving vehicles without a human at the wheel on public roads by 2019. The new UK rules will allow companies to test vehicles on any public road once the necessary changes are made to the Road Traffic Act, and the transport secretary is convinced that the test cars are safe to operate without backup drivers.
While the tragic accident in the US is unlikely to derail these plans, it will raise questions about whether autonomous technology trials are being rushed. There is also sure to be debate over who is responsible when such tragedies occur. Germany, for example, holds the driver liable if they were piloting the vehicle, and the manufacturer and operator liable if the autonomous systems were in control.
A Tesla driver died in a crash in 2016 while the car was operating in its semi-autonomous Autopilot mode. An investigation found that the driver had not heeded warnings to keep his hands on the wheel.