Here's An Effective Demonstration Of Semi-Autonomous Limitations

Thatcham Research is urging car makers and legislators to provide "greater clarity" about the capabilities of driver assistance systems

Level 2 driver assistance systems are becoming ever more impressive. They can take care of braking and acceleration, navigate bends and even change lanes all by themselves, to the point that people are placing ever more faith in the technology. But you only have to look at the spate of recent high-profile Tesla Autopilot crashes to know that this can be problematic.

This isn’t lost on Thatcham Research and the ABI (Association of British Insurers) - both organisations issued “an urgent call to carmakers and legislators for greater clarity around the capability of vehicles sold with technology that does more and more driving on behalf of motorists.”

The other Matt had a rant a few weeks ago about the use - and indeed misuse - by manufacturers of the word ‘autonomous’, and it seems these two organisations also aren’t happy about current driver assistance systems being described this way. “These are not Autonomous systems. Our concern is that many are still in their infancy and are not as robust or as capable as they are declared to be,” Thatcham’s research boss Matthew Avery said.

To go with the press release, there’s a simple but effective video (above) showing what can go wrong when you put too much trust into a system like Autopilot. In it, we see a Tesla Model S following another vehicle. The car in front changes lanes, revealing a stationary car ahead, which the Tesla isn’t able to avoid.


Thatcham also has an issue with the branding used for this kind of technology. “Names like Autopilot [Tesla] or ProPilot [Nissan] are deeply unhelpful, as they infer the car can do a lot more than it can,” Avery said.

This summer, Thatcham will be undertaking an extensive consumer test programme to see how the current crop of systems fare. Criteria will include naming, back-up systems, emergency intervention and the kind of notice that’s given if full control needs to be unexpectedly handed back to the driver.

Tesla and Nissan - the two companies whose tech was name-checked by Thatcham - both responded to the report:

Tesla

“Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents and the issues described by Thatcham won’t be a problem for drivers using Autopilot correctly.”

Nissan

“ProPilot Assist is a hands-on, eyes-on, driver-assist system that can be used for motorway and dual carriageway driving. This is clearly communicated to customers at all stages of the purchase process. The system requires the driver to be in control at all times, and with their hands on the steering wheel – the system deactivates if this is not the case.”
