Did The First Fatal Tesla Autopilot Crash Actually Happen Back In January?

Tesla is investigating a crash that happened last January in China, which might actually be the first ever fatal Autopilot accident

It had been thought that the first-ever fatal crash involving a Tesla under Autopilot control happened in May, when Model S owner Joshua D. Brown lost his life, but that might not be the case. Reuters reports that Tesla is investigating a fatal accident that happened in Beijing, China, way back in January.

23-year-old Gao Yaning died when his Model S hit the back of a road sweeper; however, Tesla has stated that it’s impossible to know whether or not Autopilot was enabled. “Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers,” the company said.

Warning: The video news report below contains footage of the fatal crash. Viewer discretion is advised.

[Embedded video: news report of the crash]

Tesla noted that it has been trying to work with Gao’s father, the owner of the car, to investigate the crash, but “he has not provided us with any additional information that would allow us to do so.”

Since no evasive action was taken, it does make you wonder whether Gao was relying on Autopilot at the time. If that was the case, we can only speculate as to why the system failed to spot the road sweeper.

Find out more about how Autopilot works in our video below:

[Embedded video: how Tesla Autopilot works]

Comments

Freddie Skeates

Thought Teslas were supposed to be super strong, how the hell did this happen?

09/15/2016 - 14:22 | 2 | 1

The only things protecting you in such a crash are the pillars and the roof. The guy got decapitated because of his stupidity.

09/15/2016 - 14:31 | 2 | 2
Deadpool (Cam's much sexier twin) (Official Demon Fangirl)

In reply to by Freddie Skeates

Well, because I can’t watch a 21-minute-long video that’s entirely in Chinese, I can’t tell you how fast the car was going. But I’m pretty certain highway safety tests are performed at approximately 35-40 miles per hour; above that, the amount of energy involved in a collision is far greater than what road vehicles are built to endure. Even the safest vehicles are no match for physics. It’s a sobering reminder that even at normal highway speed limits, there is always a possibility of serious injury or death.

09/15/2016 - 14:37 | 2 | 0

Because he was at motorway speeds

09/15/2016 - 17:33 | 0 | 0
RoyP

Let’s not forget that more people have died from pilot error than from instrument error.

09/15/2016 - 14:23 | 8 | 2
Anonymous

In reply to by RoyP

Yes, but more people have died on Earth than in space; that doesn’t make space a nice place to live.

09/15/2016 - 16:55 | 13 | 1
Anonymous

Tesla seems to be getting in the news more often than other brands (e.g. Volvo, Mercedes) which have similar systems. Maybe Tesla drivers aren’t as bright? Probably 😛

09/15/2016 - 14:28 | 13 | 0
Anonymous

In reply to by Anonymous (not verified)

Maybe it’s because Tesla is just a start-up company doing radical things such as all-electric cars?

09/16/2016 - 07:06 | 2 | 2
FLixy Madfox

Here’s the thing. People are too darn reliant on their goshdarn Teslas! They think that Autopilot is Lewis Hamilton and can successfully dodge anything in 0.000001 of a second. They think the car will brake and turn for them, so they basically go to sleep while driving. (https://www.carthrottle.com/post/this-tesla-model-s-driver-took-advantage-of-autopilot-to-catch-up-on-sleep/) Most of the time this was true. But then a crash happens while the driver isn’t paying attention: hands off the wheel, playing Pokemon Go on their phone, overall not paying attention. Then whoopsies, something gets in the way of Autopilot, which is still in development, and crash. It’s not the car’s fault you crashed, it’s someone in the car…
(edit) The point I was trying to get across was that people shouldn’t put so much faith in a system that’s still in development…

unless it’s being taken over by computers; then it’s not their fault.

09/15/2016 - 14:30 | 54 | 0

I’m with you on that, bud. Even on Autopilot there are surely some things an AI can’t predict. It is not 100% foolproof.

09/15/2016 - 14:35 | 18 | 0

Like a GT-R

09/15/2016 - 15:19 | 7 | 15

Ik right???

09/15/2016 - 20:39 | 1 | 0
Anonymous
09/15/2016 - 14:31 | 42 | 0
Anonymous

“Crashed during or caused by Autopilot”. I’m sorry, but this sounds like complete bullshit. They either crashed because of other drivers or because they weren’t actually paying attention to the road. I don’t know how fast you can switch between auto/manual, but if you didn’t see the road sweeper in time, you probably weren’t paying attention.

Technology is fragile and can cause errors, but those errors have to be corrected by the driver. So basically, you can’t actually ‘blame’ Tesla for his death.

09/15/2016 - 14:43 | 3 | 2
Raregliscor1

In reply to by Anonymous (not verified)

You can if Tesla marketed it as 100% safe.

09/15/2016 - 14:46 | 2 | 0
Anonymous

Autopilot should only be used as a way of relieving the driver on a freeway so they don’t have to do all the tasks (same as cruise control), letting it do the simple stuff: keeping you within a lane or holding a constant speed. If a hazard arises then of course you should ease up a bit and avoid it. Even radar-guided cruise control isn’t 100% certain to stop when something is lying on the road, so you can’t expect Autopilot to do that either. Sure, relieve your arms a bit with it and lightly hold the wheel, but don’t become oblivious to the world because you pushed a button.

09/15/2016 - 15:54 | 1 | 0
Akashneel

My condolences go to Gao Yaning’s family. RIP

09/15/2016 - 15:55 | 5 | 0
Jefferson Tan(日産)

In reply to by Akashneel

I somehow read Gao Yaning as Yao Ming…

09/21/2016 - 08:09 | 1 | 0
Guss De Blöd

I don’t know why I read “James May” instead of “Last May” and for a moment I was really frightened.

09/15/2016 - 15:57 | 9 | 0
Antiprius

Doesn’t sound like Autopilot if it didn’t even try to avoid the thing. Likely driver error.

09/16/2016 - 01:09 | 1 | 0

Another death was caused by Autopilot not avoiding a giant truck the width of the road. Safe to assume it missed a small street sweeper.

09/16/2016 - 18:14 | 0 | 0
