Uber Tragedy Shows Autonomous Cars Still Have Far To Go

After a woman was knocked down and killed by an Uber self-driving prototype car, the inconvenient limitations of the technology have been exposed

The most significant thing to have happened in the automotive world this week is, without question, the death of a woman at the bumper of an Uber prototype autonomous car; the first such fatality ever recorded. With that, 49-year-old Elaine Herzberg takes her tragic place in history alongside Irish scientist Mary Ward.

Who’s Mary Ward? We’d be willing to bet that 99.9 per cent of people, CTzens included, would have no idea until they Googled her. She was the first person to be killed by an automobile, full stop. On 31 August 1869, near Parsonstown, as the town of Birr was known then, she was riding in a steam-powered car built by her cousins when she was thrown from her seat, fell in front of one of the wheels and was killed almost instantly after sustaining severe head trauma and a broken neck.


Today, as I’ve just outlined, to most people Mary Ward is nobody. She’s not even as famous as the Z-listers TV researchers keep digging up for every new series of I’m a Big Brother Dancing on Celebrity Strictly Bake Off, and yet she holds a unique place in car history.

Elaine Herzberg, too, will soon be forgotten. This poor woman, who was reportedly homeless at the time of her death, will be lost in the commercial tides pushing autonomous cars ever closer to reality. Another Mary Ward, collateral damage in the turbulence of progress.


According to a San Francisco Chronicle report, footage taken from the Uber autonomous car showed that Ms Herzberg pushed a bicycle laden with plastic shopping bags out into the road in front of the car. We’ve since discovered that the car simply didn’t see the obvious obstacle.

Arguably the most wonderful thing about the human brain is its capacity to deal with infinite variables. A focused human brain can analyse a driving situation in ways a computer simply can’t, and probably never will be able to. Despite the darkness, the Uber car - and the human backup, who was sadly distracted - absolutely should have seen Herzberg crossing the road, if the technology worked. But it didn’t, so we can only assume it doesn’t.


If a self-driving machine fails to see an impending accident, people will die. If we can’t be completely sure the systems are foolproof, how can we ever trust them? Would you trust a mechanical, automated nanny with your baby son or daughter if you knew it might occasionally, if accidentally, try to kill them?

I’m not suggesting humans are safer. Far from it. The science that predicts a massive drop in road traffic deaths once autonomy becomes the norm is no doubt spot-on. What I’m saying is that autonomous technology isn’t good enough yet. It’s nowhere near. At the moment all it takes to upset the whole system is a pothole, rays of sun at the wrong angle, dirt on a sensor or darkness. Prototypes testing on public roads have suffered all kinds of inexplicable faults, like slamming on the brakes at a green light. That in itself could cause a huge accident – and the machine would be to blame.


Just as we accept the risk posed to us by human drivers getting it wrong, we have to accept the risk of imperfectly-programmed machines getting it wrong, too. That said, I will always prefer the task of anticipating what a fellow human might do, as opposed to what a machine will do – or not do – when its software is momentarily compromised.

Maybe a fully-focused human driver could have anticipated Elaine Herzberg’s movements, or maybe not. We may never know for sure. It’s clear that there’s a lot more work needed before all the kinks are ironed out of self-driving cars. Just as it always has been, progress is subjective.

Comments

Anonymous

At this point we might as well slap L plates on the autonomous vehicles and treat them like learner drivers, since there’s no way the car will get it right first time.

03/24/2018 - 09:12 |
38 | 1
Ewan23 (The Scottish guy)

In reply to Anonymous

Need some A plates haha

03/24/2018 - 10:02 |
10 | 0
Ali Mahfooz

Slightly off but related topic - when the news broke that the last male Northern White Rhino had died and people suggested that science could save the species and bring it back to life, scientists pointed out that it’s not so easy or efficient to bring back a dying species, because, according to them, we still haven’t fully understood human biology, let alone that of another animal species.

It’s somewhat similar with autonomous cars. Teaching a car to perceive what the human brain perceives is very hard when we ourselves don’t understand well enough how the human brain works.

Next we have the cameras and sensors. Sure, the LIDAR system is very advanced, but even that failed to pick out a human with a bicycle and some bags hung on it. So did the sensors. Then we have these cameras, which to me look like a college-project approach. A camera of the kind usually found in a mobile phone or DSLR stuck onto a car? Really? Those things have very limited resolution compared to what a human eye can see (which has the equivalent of 576 megapixels).

I’m not suggesting that we abandon the technology, but rather that we take a less cynical, more logical approach to solving the problem. Rather than marketing it as a breakthrough in the name of competition, I’d rather the researchers fully study what and how the human brain sees and understands, and then replicate that in a system. Perhaps that would be a more sensible approach.

03/24/2018 - 09:41 |
29 | 2

Why downvotes?

Also, as a guy who writes code and programs, I’ll be honest with all of you: there is no such thing as working software. Every piece of software has glitches and bugs, some smaller, some bigger. But if those bugs can kill somebody, and there IS NO WAY of writing bug-free code, there is one option left. And that is to use the human brain, not computers, to operate our cars.

03/24/2018 - 09:53 |
16 | 10

The resolution is not necessarily the main issue in this situation; it was the camera’s low-light capability. In low light, having something like 1080p or 4K resolution would make no difference to perceived image quality, because in low light the human eye relies on rods, which have low visual acuity (the ability to define objects clearly). This is why teddy bears in a darkish room were so damn scary when you were a kid; the rods are so bad that in extreme cases it isn’t immediately obvious whether an object is moving or not. So to say the camera is insufficient because of its resolution probably isn’t true. While a higher resolution can’t really hurt, it probably wouldn’t help much either. A better solution would be to implement a sensor that detects something other than visible light, which would be a lot better for mapping out the path ahead than relying on a camera.

03/24/2018 - 11:30 |
1 | 0
TheMindGarage

This is why level 3 and 4 driverless technology has NO place on our roads. We cannot count on a human to react to situations when needed. Until we can perfect full level 5 autonomy (probably at least 20 years away), stick with level 2 please. Use your 86 billion neurons.

03/24/2018 - 09:56 |
5 | 0
Ewan23 (The Scottish guy)

Nothing new there lol it’s a long long long long time away yet.

03/24/2018 - 10:02 |
1 | 0
Aaron 15

So in the future when all cars become autonomous… will there be such a thing as a driving licence any more? Will humans still have to take a driving test?? #MindBlown

03/24/2018 - 11:28 |
1 | 1
Toby Westlake

All car mags seem to have taken this opportunity to shit on autonomous vehicles. The woman was an idiot for crossing a dual carriageway in front of oncoming traffic in pitch black wearing nothing bright or reflective that would’ve identified her to either human or machine.

03/24/2018 - 11:29 |
4 | 1

Thank you. It seems like no one sees this fact. Crossing at night in the middle of the road wearing dark clothing - I doubt a human could have seen her, let alone a robot.

03/24/2018 - 13:44 |
1 | 0

Well, to be frank, what you see in a dashcam video isn’t necessarily 100% what you’d see with your own eyes.

I’ve watched it multiple times. In my personal view, if the driver had been looking at the road a few seconds before the collision, there may have been an opportunity to apply heavy braking and steer away from her. Emphasis on the “may”, because many factors are at play.

To me, a collision was likely unavoidable, so I’d have looked for any way to mitigate the impact as much as possible. Trying to steer away while applying full brakes would probably have helped.

03/25/2018 - 06:35 |
0 | 0

Shhhhhh… be quiet. If we let people stay ignorant, this will just be a hurdle and a setback for self-driving cars; we don’t want them to know that a human could not have stopped in time to avoid the collision either.

03/29/2018 - 18:42 |
0 | 0
Anonymous

Of course, it’s inevitable. Any new technology has its flaws. Imagine the first time people tested cars. The flaws do get solved, and that’s why we have them.

03/24/2018 - 11:50 |
3 | 1
TheRacingGoat

I just wish that autonomous cars never happen. I may write a post about it today.

03/24/2018 - 12:35 |
2 | 1
Anonymous

Comparisons between autonomous car crashes and the general population of human drivers are not fair comparisons.

Expert drivers have been involved in the programming of autonomous car computers, yet the software still relies heavily on human-oriented assists like ABS and ESC, just like an average driver does. People who have been trained to a higher level, like police officers, are almost never involved in accidents, even in situations that require stopping and/or steering quickly.

When people are trained to drive properly, they are superior to computers. The answer to reducing accidents is driver training, not a (job-stealing) computer-driven car.

03/24/2018 - 13:06 |
1 | 0
