Uber Tragedy Shows Autonomous Cars Still Have Far To Go

After a woman was knocked down and killed by an Uber self-driving prototype car, the inconvenient limitations of the technology have been exposed

The most significant thing to have happened in the automotive world this week is, without question, the death of a woman at the bumper of an Uber prototype autonomous car: the first such fatality ever recorded. With that, 49-year-old Elaine Herzberg takes her tragic place in history alongside Irish scientist Mary Ward.

Who’s Mary Ward? We’d be willing to bet that 99.9 per cent of people, CTzens included, would have no idea until they Googled her. She was the first person to be killed by an automobile, full stop. On 31 August 1869, near Parsonstown, as the town of Birr was known then, she was riding in a steam-powered car built by her cousins when she was thrown from her seat, fell in front of one of the wheels and was killed almost instantly after sustaining severe head trauma and a broken neck.


Today, as I’ve just outlined, to most people Mary Ward is nobody. She’s not even as famous as the Z-listers TV researchers keep digging up for every new series of I’m a Big Brother Dancing on Celebrity Strictly Bake Off, and yet she holds a unique place in car history.

Elaine Herzberg, too, will soon be forgotten. This poor woman, who was reportedly homeless at the time of her death, will be lost in the commercial tides pushing autonomous cars ever closer to reality. Another Mary Ward, collateral damage in the turbulence of progress.


According to a San Francisco Chronicle report, footage taken from the Uber autonomous car showed that Ms Herzberg pushed a bicycle laden with plastic shopping bags out into the road in front of the car. We’ve since discovered that the car simply didn’t see the obvious obstacle.

Arguably the most wonderful thing about the human brain is its capacity to deal with infinite variables. A focused human brain can analyse a driving situation in ways a computer simply can’t, and probably never will be able to. Despite the darkness, the Uber car (and the human backup driver, who was sadly distracted) absolutely should have seen Herzberg crossing the road, if the technology worked. But it didn’t, so we can only assume it doesn’t.


If a self-driving machine fails to see an impending accident, people will die. If we can’t be completely sure the systems are foolproof, how can we ever trust them? Would you trust a mechanical, automated nanny with your baby son or daughter if you knew it might occasionally, if accidentally, try to kill them?

I’m not suggesting humans are safer. Far from it. The science that predicts a massive drop in road traffic deaths when autonomy becomes normal is no doubt spot-on. What I’m saying is that autonomous technology isn’t good enough yet. It’s nowhere near. At the moment, all it takes to upset the whole system is a pothole, rays of sun at the wrong angle, dirt on a sensor or darkness. Prototypes testing on public roads have suffered all manner of inexplicable faults, like slamming on the brakes at a green light. That in itself could cause a huge accident – and the machine would be to blame.


Just as we accept the risk posed to us by human drivers getting it wrong, we have to accept the risk of imperfectly programmed machines getting it wrong, too. That said, I will always prefer the task of anticipating what a fellow human might do, as opposed to what a machine will do – or not do – when its software is momentarily compromised.

Maybe a fully focused human driver could have anticipated Elaine Herzberg’s movements, or maybe not. We may never know for sure. It’s clear that there’s a lot more work needed before all the kinks are ironed out of self-driving cars. Just as it always has been, progress is subjective.

Comments

D13H4RD2L1V3

Honestly, I think I’d prefer it if driverless tech were seen as a driving aid: basically, helping with the weak points of drivers without fully taking control away from them.

Computers are imperfect. There is no such thing as fault-free software. In the rare case that the system fails, it’s better to have some redundancy, whether that’s a secondary backup piece of software like the “Limp Home” mode on modern ECUs or giving control back to the driver.

03/24/2018 - 14:32
ThatCarGuy 2

Has everyone just conveniently ignored the fact that she was crossing in shadow, on a pitch-black part of the road, wearing black? And did anyone actually watch the video? You couldn’t even see her until she was in front of the car. If I or anyone else that is actually human was driving, we would have hit her. Stop de-rationalising the situation.

03/24/2018 - 22:55
Anonymous

This is exactly why people should never buy autonomous cars. Especially if they can be herded into a river by the masses. Maybe that’ll send a message to the government.

03/26/2018 - 12:51
Arun Parkin

Car companies are rushing into the development of autonomous cars too quickly. People say that fully autonomous cars will become mainstream as soon as 2025. I simply cannot believe that such a vast change will happen in less than ten years’ time; people have been driving their cars the same way for decades. Therefore I believe that any fully autonomous cars in the near future will be confined to dense urban areas, and we are still a long way off from the “driverless dream”.

04/05/2018 - 18:45
