A Tesla Driver Has Died In The First Ever Fatal Crash For A Self-Driven Car

The NHTSA is investigating Tesla after a Model S failed to prevent a fatal accident with Autopilot enabled

Tesla has revealed that a Model S driver died in an accident on 7 May while Autopilot was activated, in what’s thought to be the first fatal crash involving an autonomous vehicle. The driver - 40-year-old Joshua D. Brown - was on a divided highway in Williston, Florida, when a tractor-trailer pulled out across his path, and neither Brown nor Autopilot reacted in time.

The Model S passed under the trailer, with the bottom of the trailer hitting the windscreen. The car then continued down the road before leaving the highway and hitting a fence. Brown died at the scene.

In a statement released on Thursday, Tesla said that the National Highway Traffic Safety Administration (NHTSA) has started a “preliminary evaluation” into the performance of Autopilot during the crash. “This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said, adding, “Among all vehicles in the US, there is a fatality every 94 million miles.”

Brown was well known in the Tesla community, and just a month before the fatal crash had posted a video on YouTube (below) of Autopilot successfully averting an accident. The video quickly clocked a million views.

[Embedded video: Brown’s clip of Autopilot averting an accident]

Tesla’s Autopilot is currently intended as a driver assist - more of a ‘semi-autonomous’ mode that requires the driver to keep their hands on the steering wheel at all times. In its statement Tesla notes that “Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” but that hasn’t stopped some well-documented abuses of the system. It has been heavily criticised in some corners for lulling users into a false sense of security. Earlier this year, a senior Volvo engineer slammed Autopilot, labelling it an “unsupervised wannabe” that “gives you the impression that it’s doing more than it is.”

At this early stage of the investigation, it’s not known exactly why Brown didn’t brake himself. Tesla’s statement speculates that he simply did not see “the white side of the tractor trailer against a brightly lit sky”. However, according to a report by the Associated Press, the 62-year-old driver of the truck claimed to have heard one of the Harry Potter films playing from the car at the crash scene. Tesla responded to the claims, stating that it isn’t possible to watch videos on the Model S’s main screen.

Find out more about how Autopilot works by watching our video below:

[Embedded video: how Autopilot works]

Comments

Ali Mahfooz

I feel that people are trusting the autonomous system a bit too much even though it’s in a development phase. May as well take the human out of the equation. Condolences to the family though. :(

07/01/2016 - 08:53 |
530 | 2

Yeah. It’s like preparing for a driving exam on a simulator game in its alpha phase. It isn’t gonna end well. 😔

07/01/2016 - 11:00 |
22 | 0

My thoughts exactly

07/01/2016 - 18:36 |
1 | 0

Well, I guess we need to stop prioritising technology so it doesn’t take over vehicles, but we could still use the ECU for tuning purposes.

07/01/2016 - 22:25 |
2 | 0
Raregliscor1

Is it just me, or does Musk seem very detached from the entire situation? Especially for someone who seems to push this technology…

…I’ve never much liked the guy, but is anyone else getting this feeling?

07/01/2016 - 08:55 |
93 | 11

Yeah… I kinda feel the same about him. His whole concept of making everything electric and going green does seem to have a lot of loopholes.

07/01/2016 - 09:09 |
48 | 5

When was the last time ANY car company truly expressed sorrow when a fault on a car led to a death? I can’t remember ever seeing it. It’s not just Elon Musk, to be fair.

07/01/2016 - 11:03 |
27 | 2

No, it was very common knowledge that this was not to be used as full auto, and that you always had to keep paying attention; it was meant to make things easier, not to take everything over completely. If you ignore this and just do your own thing without paying attention, you are just as much at fault as anyone else who has a fatal crash. I am sorry for the guy, but these things are far too easily blamed on the developer of the system when the driver is still the person responsible for using it in this manner at all, so I wouldn’t get that involved either if I were in his shoes…

07/01/2016 - 11:08 |
18 | 0

He focuses on his rocketships that don’t work…

07/01/2016 - 12:52 |
2 | 1

I feel like he looks at his “reality” through numbers and equations rather than how the world actually is. Don’t get me wrong, he’s an extremely smart guy, but he’s not really thinking about what we need right now. He’s only thinking of what he’s going to be remembered for.

I don’t think we’re ready for self-driving cars. If he really wants to do something about our traffic and congestion issues, whatever happened to the high-speed rail plans he sold off because he “didn’t have time for it”?

07/01/2016 - 13:18 |
2 | 2
Zanzaroni

As much as I love driving, and as much as I don’t want to think of a world full of self-driving cars, I understand what a convenience it might be. I read an article in which Eric Noble from Carlab, a car consulting company, accused Tesla of providing their customers with an untested technology, and I have to agree. No matter how difficult it is, the only way you can fail to spot a semi perpendicular to you while driving is if you are blind or not paying attention. If the sun was getting in his eyes he should have slowed down; that would have given him time to adjust to the brightness. If he was not paying attention then he is at fault, as stated by Tesla, but they are too. Further measures should be taken, just like the measures taken by MB with their semi-autonomous cars, or this will be the first in a long series of tragic incidents.

07/01/2016 - 08:56 |
9 | 2

Humans need to do things themselves. Computers will always be less reliable than us.

07/01/2016 - 20:15 |
0 | 0

All it takes is focus. It’s simply dangerous to let computers act as humans. Humans need to use their brains and problem solve. With these systems, we are becoming lazy and will be less likely to be able to problem solve. This goes deeper than safety. And in my opinion, “convenience” in this case just means laziness.

07/01/2016 - 20:18 |
0 | 0
Anonymous

At the end of the day, what’s the point of having any autopilot technology if it doesn’t brake when there’s an obstacle, whether it’s engaged or not? I know the driver has to be aware at all times as well, but really and fundamentally it’s absolutely and utterly pointless at this stage if the car can’t even stop in the event of an accident. Basically it’s using humans to beta test a very dangerous piece of software, and then when it goes wrong, blaming the human for not using it right. IMO that’s not on. Tesla should have only released that software once it was up to full standards, and then when something does go wrong, they take liability. You can’t keep blaming the humans… oh, they mustn’t have been looking, or concentrating, or had their hands on the wheel… how do you know?! For all we know the guy tried to brake but the autopilot had overridden him, or had a fault that kept the gas on?! Who knows. …Or did the software think: we’re on a highway, if we suddenly brake there’s going to be a massive pile-up behind us, and there’s only one occupant in the car, so it decides to keep the car moving to reduce loss of life elsewhere? What ya reckon, conspiracy theorists? :P :P

07/01/2016 - 09:16 |
5 | 0
Anonymous

In reply to Anonymous (not verified)

Not all cars have autonomous braking systems or safety features. So the fact that this car has them, and that they don’t work every time, doesn’t mean they are at fault, and the driver shouldn’t have relied on them to do the work for him. As sad as what happened is, the driver is the only one to blame. Pay attention; the systems are there to help you, not replace you.

07/01/2016 - 09:47 |
11 | 1
Anonymous

In reply to Anonymous (not verified)

I reckon that humans will ALWAYS be superior to computers. They will never be as reliable as a good old human. That goes for every facet of life too. It doesn’t matter if they don’t make mistakes and are more accurate than humans. We are more reliable.

07/01/2016 - 20:14 |
0 | 0
Anonymous

It is autonomous, but YOU CANNOT rely on it. You should not let your attention lapse when behind the wheel of an automobile, whether it is driving itself or not, because you never know when a hazard may present itself at short notice. Such as a tractor.

07/01/2016 - 09:34 |
17 | 0
Anonymous

In reply to Anonymous (not verified)

True, but what are the benefits then?

07/01/2016 - 11:07 |
0 | 0
Iliekdriftz34

In reply to Anonymous (not verified)

This is impossible to achieve; even using cruise control is not recommended on long, straight roads, let alone Autopilot.
Simply driving automatic cars makes me zone out, compared to stick shifts that keep me busy.
If a car does everything for you, you simply cannot stay concentrated. It’s against how the brain works: our brains love shortcuts and zone-out moments to save energy.

07/01/2016 - 11:15 |
9 | 0
Faizan

Isn’t the whole point of collision warning, autonomous braking and autopilot systems to stop the car if the driver isn’t paying attention? The driver was at fault, but the system failed too. The car kept travelling even after the accident. It really shouldn’t activate unless you have a hand on the steering wheel, etc.

Overall, it should never have been sold as ‘Autopilot’ just yet. I agree with the Volvo engineer.

07/01/2016 - 09:55 |
5 | 0
Anonymous

In reply to Faizan

None of these driver assists should exist, in my opinion. All it takes is a little extra effort by drivers. Not a hackable, unreliable, laziness-encouraging system.

07/01/2016 - 20:11 |
0 | 0
Anonymous

[DELETED]

07/01/2016 - 09:58 |
1 | 9
Khaled Taha

In reply to Anonymous (not verified)

That’s neither a good joke, nor is it timed correctly…
Have some respect…

07/01/2016 - 10:26 |
3 | 0
Anonymous

I guess Jesus is his copilot now…

07/01/2016 - 09:59 |
30 | 4
Anonymous

In reply to Anonymous (not verified)

Jesus take the wheel

07/01/2016 - 10:59 |
19 | 0
Ben Saville

In reply to Anonymous (not verified)

Missed a golden opportunity for a ‘God is my copilot’ reference

07/01/2016 - 13:13 |
0 | 0
Anonymous

It’s a shame that Tesla is choosing to make their customers the test pilots (passengers?) of a system that is potentially dangerous and very poorly regulated internationally. The responsible thing to do would be to sort it out first and then sell it; people will always try to abuse the system, so make sure it can hold up first.

07/01/2016 - 10:19 |
2 | 4
Anonymous

In reply to Anonymous (not verified)

Sort it out where? How? The only place to perfect it is on the road, and any Tesla customer should be well aware that they aren’t being replaced by this system and should keep their eyes on the road at all times. If this had been a regular car, and if he legitimately couldn’t see, it would still have happened. Tesla have said they haven’t perfected the system, so it’s not as if they’re selling it to people who assume they’ll never have to drive again.

07/01/2016 - 10:29 |
5 | 1
Kei Cars Are My Jam

Condolences to the family, but it seems he didn’t brake either. Let’s hope people can step back from this horrific incident and see that this is a fledgling technology that still requires human concentration too.

07/01/2016 - 10:21 |
0 | 0
