Uber self-driving car kills woman

Sadly, the first reported death of a pedestrian killed by a self-driving car has been announced. The incident occurred in Arizona, when a woman was struck as she crossed the road.

https://www.theguardian.com/technology/2018/mar/19/uber-self-driving-car-kills-woman-arizona-tempe


Strangely, there was a vehicle operator in the car at the time...


 
Very sad. I wonder how it happened.

It's interesting that it's news, though. Put it this way: if the headline said "Car Kills Woman", nobody would read the article.
 
It had to happen, and it will happen again. The truth is, it will happen less often than when we drive.

Will we ever accept that it is safer though, and take these deaths in the way we do today when a person has an accident?
 
I wonder if this was because the pedestrian did something stupid and got herself killed, or if the self-driving technology couldn't cope with realistic scenarios.

With the dangers of driving, you'd hope that these self-driving cars wouldn't be allowed anywhere near public roads until they were proven to be at least as safe as human drivers.
 
Reports show that she did cross the road outside of the "crosswalk", as they call it.

So far, all of the Google car accidents have been caused by other drivers (rear-ending being the most common) and the Tesla one was due to the driver ignoring the warnings the car was giving him.

It's hard to compare stats, considering the total number of hours humans drive, but I would suggest they are already safer. The distracted or tired driver vs an autonomous vehicle? No doubt in my mind.
 
Oh, a pedestrian? Kills my joke then.

Had it been a car-on-car collision, with a driverless car killing a woman in another car, I was obviously going to blame the woman.
 
4,092 pedestrians were killed in accidents with cars in 2009. That's more than 10 per day (in the US).
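
A quick back-of-the-envelope check on that figure:

```python
# Back-of-the-envelope check on the 2009 US figure quoted above.
deaths_2009 = 4092
per_day = deaths_2009 / 365
print(f"{per_day:.1f} pedestrian deaths per day")  # ~11.2, so "more than 10 per day" holds
```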
 
It seems the driver wasn't paying attention. At this stage, there still needs to be someone who can override the system in cases like this. As the systems improve, let's hope they won't be needed.
 
The driver was texting on their phone, I think, and didn't see her walk out, so didn't brake. The tech broke down somewhere, though, and the debate seems to be between the sensor company and Uber over how the software interpreted the sensor information.

My guess, as daft as it sounds (and tragic in this case), is that the code saying "sense obstruction, then swerve or brake" was in the daylight software patch but never made it into the night-time patch.

It'll be something as simple as that sadly.
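
To put the guess in toy form (a sketch only; the names and the day/night split are entirely made up, and this is nothing like Uber's actual software):

```python
# Toy sketch of the guessed failure mode: the avoidance rule for a
# pedestrian exists in the daylight config but was never ported to the
# night-time one. Entirely hypothetical.

DAYLIGHT_ACTIONS = {"pedestrian": "brake", "vehicle": "brake", "debris": "swerve"}
NIGHT_ACTIONS = {"vehicle": "brake", "debris": "swerve"}  # pedestrian rule missing

def respond(obstruction: str, is_daylight: bool) -> str:
    actions = DAYLIGHT_ACTIONS if is_daylight else NIGHT_ACTIONS
    # A missing key means no avoidance action at all - the quiet gap.
    return actions.get(obstruction, "continue")

print(respond("pedestrian", is_daylight=True))   # brake
print(respond("pedestrian", is_daylight=False))  # continue  <- the gap
```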
 
It's still in development, so mistakes will always be made. The human is there to cover for it until it does prove itself more reliable. He should have been paying attention.
 
Completely agree. Missing code, a wrong option left selected when it should have been changed, and so on has to be expected in development and when switching between versions. You can't physically notice everything and get it right; the process is reacting when you spot something wrong.

Sadly, in this case the human element failed as backup.
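
As a toy example of that kind of slip (hypothetical names throughout, nothing to do with the real system):

```python
# Hypothetical illustration of the "wrong option left in place" slip:
# a safety flag that defaults off and is easy to miss in review.
from dataclasses import dataclass

@dataclass
class AvoidanceConfig:
    emergency_brake_enabled: bool = False  # a default like this can quietly slip through

def on_obstacle(cfg: AvoidanceConfig) -> str:
    return "brake" if cfg.emergency_brake_enabled else "alert_operator_only"

print(on_obstacle(AvoidanceConfig()))  # alert_operator_only - the human backup is all that's left
```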