Here's An Effective Demonstration Of Semi-Autonomous Limitations

Thatcham Research is urging car makers and legislators to provide "greater clarity" about the capabilities of driver assistance systems

Level 2 driver assistance systems are becoming ever more impressive. They can take care of the braking and acceleration, navigate bends and even change lanes all by themselves, to the point that people are placing ever more faith in the technology. But you only have to look at the spate of recent high-profile Tesla Autopilot crashes to know that this can be problematic.

This isn’t lost on Thatcham Research and the ABI (Association of British Insurers) - both organisations issued “an urgent call to carmakers and legislators for greater clarity around the capability of vehicles sold with technology that does more and more driving on behalf of motorists.”

The other Matt had a rant a few weeks ago about the use - and indeed misuse - by manufacturers of the word ‘autonomous’, and it seems these two organisations also aren’t happy about current driver assistance systems being described thusly. “These are not Autonomous systems. Our concern is that many are still in their infancy and are not as robust or as capable as they are declared to be,” Thatcham’s research boss Matthew Avery said.

To go with the press release, there’s a simple but effective video (above) showing what can go wrong when you put too much trust into a system like Autopilot. In it, we see a Tesla Model S following another vehicle. The car in front changes lanes, revealing a stationary car ahead, which the Tesla isn’t able to avoid.


Thatcham also has an issue with the branding used for this kind of technology. “Names like Autopilot [Tesla] or ProPilot [Nissan] are deeply unhelpful, as they infer the car can do a lot more than it can,” Avery said.

This summer, Thatcham will be undertaking an extensive consumer test programme to see how the current crop of systems fare. Criteria will include naming, back-up systems, emergency intervention and the kind of notice that’s given if full control needs to be unexpectedly handed back to the driver.

Tesla and Nissan - the two companies whose tech was name-checked by Thatcham - both responded to the report:

Tesla

“Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly.”

Nissan

“ProPilot Assist is a hands-on, eyes-on, driver-assist system that can be used for motorway and dual carriageway driving. This is clearly communicated to customers at all stages of the purchase process. The system requires the driver to be in control at all times, and with their hands on the steering wheel – the system deactivates if this is not the case.”

Comments

Drifting Dutch

Tesla shouldn’t call it Autopilot

06/13/2018 - 21:03
Anonymous

What if the buyer bought the car on the used market? Or a father bought the car for his son or daughter? Or an owner gave someone else the keys to do something?

Expecting every driver to have a crystal-clear understanding of how the system works and operates is impossible. There are always outliers.

When I read the word Autopilot, the first thing that comes to mind is planes and their autonomous systems, giving the impression that the car has similar capabilities, which is not true.

Manufacturers mostly name their systems after their function, to make things as clear and simple as possible so people can understand what a system does instantly, without going to the trouble of reading the manual or looking it up online.

When misunderstanding a function can lead to fatalities, naming clarity must be a priority.

06/16/2018 - 08:02
Anonymous

Did Thatcham make sure the inflatable fake car reflected the cars’ radar beams? If not, the test is invalid (even though the overall message is valid).

06/21/2018 - 18:41
