The first programme would allow driverless taxis to pick up passengers as long as a human driver is present; the second proposal would then allow autonomous vehicle manufacturers to offer rides without a human backup on board. The programme will be open only to Transportation Charter Permit (TCP) holders whose permitted autonomous vehicles have been in autonomous operation for at least 90 days.
Data
One stipulation for the taxis involved in the programme is that passengers cannot be charged for rides. This is not a financial venture but a data-gathering one. Companies involved in the pilot scheme must log the number of rides completed. In particular, they must declare how many ride requests from disabled passengers are fulfilled. Ease of access is predicted to be one of the main advantages of driverless taxis and ride-hailing in the future, and this programme is an ideal way to amass real-world data to maximise the benefits for passengers.
It will also allow the companies providing the rides to assess how much they would need to charge given the reduced overheads of not needing a human driver. With no driver to pay, they could charge less while maintaining their current margins, offering competitive rates to passengers. It is predicted that in the driverless future, commuting by autonomous taxi will be cheaper than owning a car.
Safety Issues
This proposal comes at an interesting time, with the safety of autonomous vehicles very much in the spotlight. Uber has had to cancel its testing in Arizona after one of its cars hit and killed a pedestrian. It intends to resume testing in other US cities, but it remains barred from testing in California after failing to renew its permit, meaning the ridesharing company could miss out on this benchmark test.
Tesla’s Autopilot feature has also come under scrutiny recently. In the UK, a driver was filmed sitting in the passenger seat after engaging Autopilot. He was banned from driving for 18 months and ordered to complete 100 hours of unpaid work after pleading guilty to dangerous driving. A driver in California also died in hospital after their Tesla Model X, with Autopilot engaged, hit a concrete highway divider. Both companies have been quick to stress the overall safety records of their autonomous vehicles, but these incidents show the fallibility of the systems involved.
Misleading the Public
Attributing blame in the case of a crash involving an autonomous vehicle is one of the major hurdles on the horizon, but could more be done at this earlier stage? One of the problems with the current technologies, particularly Tesla’s Autopilot, is that the names can be misleading. “Autopilot” implies that the car can take over completely, which may be why incidents of misuse keep cropping up. This reckless behaviour appears to come from drivers attempting to test the limits of their cars. Tesla’s Autopilot can accelerate, brake and steer by itself, but it is not designed to fully replace a human driver. It is only Level 2 automation, on a scale where Level 0 is fully human-driven and Level 5 is full automation.
Tesla may display warnings about the technology’s limits and reminders that drivers must keep their hands on the wheel so they can react if it makes a mistake, but it has faced calls from the German government to drop the term. Many safety experts believe that companies producing autonomous vehicles need to exercise great care over the terms and jargon they use so that customers do not misuse the technology. Uber is working on vehicles with Level 4 autonomy, where a human safety driver is on board to take control if necessary – the term “self-driving car” is not yet truly applicable and could be misleading, giving customers the impression that they can take a back seat.
Educating customers on what their new vehicle can do is an obvious first step at the point of purchase, but rebranding the technologies involved would go a long way towards ensuring that customers use them responsibly. Everyone wants to test all the features on their new toy, but a car travelling at 70 miles an hour carries a much higher degree of risk, and that needs to be acknowledged.