Uber wasn't to blame for robo-ride crash – or was it? Witness said car tried to 'beat the lights'

Just what rules is this upstart putting in place?

Analysis A police report appears to support the claim that Uber was not to blame for a recent crash of its self-driving car in Tempe, Arizona.

But the incident raises serious questions about what rules the startup's engineers are putting into the car's software.

The report – made available Wednesday to anyone able to physically turn up at the police department (Bloomberg did so) – states that the Uber Volvo SUV was struck by an oncoming car turning left as the SUV crossed an intersection. The impact flipped the Uber onto its side.

The Uber car was traveling under the speed limit – 38MPH in a 40MPH zone – but visibility was limited by traffic, according to witnesses. The driver of the Honda SUV that hit it, Alexandra Cole, stated that the Uber came "flying through the intersection."

That statement was backed up by a witness, Brayan Torres, who told police it was the Uber car's fault for "trying to beat the light and hitting the gas so hard."

The report states that the light turned yellow as the Uber entered the intersection, although it does not say how long the light had been yellow, nor how long the delay is between yellow and red at that intersection.

What does the law say?

At issue is, of course, how we are supposed to drive versus how we really drive.

Everyone knows that when you see a yellow light, you are supposed to slow down and stop. Everyone also knows that there is a grey zone when a light turns yellow as you arrive at an intersection.

There are two versions of what you should do at that point (sketched in code after the list):

  1. Start braking unless it might cause an accident – ie, your hard braking could cause the car behind you to hit you.
  2. Maintain your speed through the intersection if you are confident you can pass through the entire intersection before the light turns red. Otherwise brake.
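As a rough illustration, those two conventions can be written as simple decision logic. This is a minimal sketch under our own assumptions – the function and parameter names are invented, and none of it reflects Uber's actual software:

```python
# Hypothetical yellow-light decision rule, following the two conventions
# above. All names and inputs are illustrative assumptions, not Uber code.

def yellow_light_decision(distance_to_far_side_m: float,
                          speed_mps: float,
                          seconds_until_red: float,
                          hard_brake_risks_rear_end: bool) -> str:
    """Return 'proceed' or 'brake' when the light has just turned yellow."""
    # Convention 2: maintain speed only if the car can clear the entire
    # intersection before the light turns red. Assumes the car is moving.
    time_to_clear = distance_to_far_side_m / speed_mps
    if time_to_clear < seconds_until_red:
        return "proceed"  # maintain speed; note that this never accelerates

    # Convention 1: otherwise brake, unless hard braking would itself
    # cause a collision with the car behind.
    if hard_brake_risks_rear_end:
        return "proceed"
    return "brake"
```

Note that neither convention ever calls for speeding up – a point that matters when we get to the witness accounts.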

Legally, however, you only commit an offense if you enter an intersection when the light is red (the law does vary a little state-by-state, but mostly this is true).

In reality, of course, a huge percentage of drivers push their luck and get into the – extremely dangerous – habit of accelerating when they see a yellow light in order to try to get through before the light turns red.

That habit is particularly risky when there are oncoming cars waiting to turn across your lane, whose drivers are working off the same traffic light timings as you. This exact scenario is responsible for a majority of intersection crashes, which make up 40 per cent of all traffic accidents in the United States.

What is Uber's position?

You would imagine, then, that if you were creating a self-driving car, you would pay very close attention to the left-turn scenario that causes so many car crashes.

The fact that the Uber SUV was involved in such an accident raises very serious questions over what rules the company is writing into its software.

The really big question is: has Uber decided to develop its self-driving cars with the persona of a safe driver? Or with the bad driving habits of a fast driver?

Uber has said that the car was in autonomous mode, and the person in the car, Uber employee Patrick Murphy, said he saw the traffic signal turn yellow as the car entered the intersection.

He also claims he saw the Honda turning left but that "there was no time to react as there was a blind spot" thanks to the traffic. So the Uber car made all the decisions itself.

We've all been in this situation on the road. Most good drivers gauge the level of risk: if the intersection is clear, we push through a yellow light and maybe even accelerate to get through in time; if there is traffic in the intersection, most of us brake and wait.

What did the Uber algorithm do?

We don't know. We do know that Uber knows, however, because everything its cars do is obsessively logged. The SUV's logs will provide the answers to several critical questions (a code sketch of these checks follows the list):

  1. Did the car see the yellow light at all? A previous incident in San Francisco that was Uber's fault was the result of its car failing to see a red light (see the video below). Last month, the New York Times reported that it had seen internal Uber documents saying its cars had failed to recognize traffic lights on five other occasions.
  2. If it did see the light, did the car accelerate in response? If so, that is a serious safety risk to build into automated software: there is no justification for speeding up in this scenario – a car should either brake or maintain its speed.
  3. Did the car register that there was a critical blind spot? If it couldn't see any cars that might be turning left, then why wasn't that included as a risk factor?
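For illustration only, here is how those three questions might translate into checks against an event log. The log format and field names below are entirely our invention – Uber's real telemetry is not public:

```python
# Hypothetical audit of a time-ordered event log. The schema here
# (event types, field names) is invented for illustration only.

def audit_yellow_light_run(events):
    """events: time-ordered dicts, e.g. {'t': 12.3, 'type': 'throttle', ...}"""
    # Question 1: did the car detect the yellow light at all?
    t_yellow = next((e["t"] for e in events
                     if e["type"] == "light_detected"
                     and e.get("state") == "yellow"), None)
    saw_yellow = t_yellow is not None

    # Question 2: any throttle increase after the yellow was detected?
    accelerated = saw_yellow and any(
        e["type"] == "throttle" and e.get("delta", 0) > 0
        for e in events if e["t"] >= t_yellow)

    # Question 3: was the occluded left-turn lane flagged as a risk?
    flagged_blind_spot = any(e["type"] == "occlusion_risk" for e in events)

    return saw_yellow, accelerated, flagged_blind_spot
```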

Don't forget the time an Uber autonomous vehicle ran a red light:

[YouTube video]

Uber knows the answers to these questions and many believe that as self-driving and autonomous cars are now a reality in many cities, these details should be made public. If we are creating machines to emulate human behavior, we need to make sure that they are copying the safer human drivers and not the risk takers. ®

Updated to add

We've since learned that Uber's official policy is that if a car can cross an intersection with sufficient time, it will continue on – and at the same speed at which it entered the area. If it can stop safely, it will do so. That said, there are still some questions to be answered in this specific case, not least because one witness claimed to have seen the car speed up.
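Taken at face value, that stated policy amounts to something like the following sketch. To be clear, the names and structure here are our illustrative assumptions, not Uber's actual code:

```python
# One reading of Uber's stated yellow-light policy, as described above.
# Hypothetical names and structure; the real implementation is not public.

def on_yellow(can_cross_in_time: bool, can_stop_safely: bool) -> str:
    if can_cross_in_time:
        return "continue at entry speed"  # the policy rules out accelerating
    if can_stop_safely:
        return "stop"
    # The policy as described doesn't say what happens when neither
    # condition holds; a real system would need a defined fallback.
    return "undefined by the stated policy"
```

On that reading, neither branch ever calls for acceleration – which is exactly why the "hitting the gas" claim is so significant.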
