Self-driving vehicles are coming to the UK, following a new code of practice for autonomous pods and self-driving buses. It will allow those vehicles to drive on specific roads and pathways in London, Bristol, Coventry, and Milton Keynes.
The new code comes a few months after UK Chancellor George Osborne's commitment to self-driving vehicles, in which he announced the UK government would invest £100 million in this market and help explore the advantages and disadvantages of automated driving.
For now, the code will not cover driverless cars. This means software makers like Google and Uber will still need to use specialised roads or facilities to test self-driving cars in the UK, or use California in the US as a testing ground.
The Lutz Pathfinder autonomous pod will be tested on UK pathways. The pod does not need an active driver inside, but it must always have an active remote-control driver.
Other concept vehicles from Ford, Jaguar, Tata and Innovate UK will be tested on the roads. We expect more to join the UK Autodrive Consortium once self-driving cars are approved for road testing, which will reportedly happen in 2017.
Public opinion polls will be held over the next three years to find out whether UK attitudes change as more self-driving vehicles appear on roads and pathways. There will also be tests to see whether things like congestion and pollution fall with self-driving vehicles.
The code sets out a few rules for manufacturers to follow when testing, including:
- Vehicles must record 30 seconds of backup data to reveal whether the driverless car or the human driver was at fault if an accident occurs.
- Pavement vehicles must have a remote-control driver.
- Road vehicles must have a driver inside the car, capable of taking over if the self-driving system fails.
- All faults and accidents must be reported, and the driver of the autonomous car will be held responsible, unless there is proof that the human driver intended to crash.
This code might put off a few manufacturers, especially Google, which has filed crash reports with the US driving association in secret. Google claims all 13 of its crashes were due to human error, not the self-driving platform, but has not published the reports for fact-checking.