Innovator, new-age entrepreneur and tech evangelist Elon Musk is, his fans say, a man bordering on genius. He is constantly in the news, even though the products of his companies are still few in number and rarely seen outside the US.

His concepts are literally out of this world, yet on the cusp of fruition, like the SpaceX Falcon reusable rocket and the Hyperloop high-speed transportation system. His designs are bold, their execution even bolder, and post-launch his companies have most likely already moved on to the next big thing. So you would be excused for thinking that you should either buy in quickly or risk being left behind, especially where his Tesla cars are concerned.

His company's first car, the Roadster, an all-electric, two-door sports car, was so radical at the time of its launch that it propelled the idolisation of Musk among car and tech fans in Silicon Valley and beyond. Now his every move is watched with interest and every announcement awaited with bated breath. But Musk has had his share of controversies, with fresh ones arising every week. Plot the media's obsession with everything Musk and you get a seismograph-like trace of yo-yoing positive and negative news. Last week, the needle swung below the median, with considerable negative news.

Tesla's Autopilot feature has been facing the heat after a fatal accident in the first week of May involving a Model S in Florida, in what initial traffic investigations have concluded was a failure by both Autopilot and the driver to detect a vehicle directly in the car's path. The investigations, by no means conclusive as yet, suggest that the driver, Joshua Brown, who was killed, had engaged Autopilot, let go of the wheel entirely, and was allegedly watching a movie instead of keeping his eyes on the road.

'Termixology'

Much has been reported about the accident and its potential causes, about how statistics show that it is safer to be inside a semi-autonomous vehicle than a regular one, and about how driverless cars are the future despite such setbacks. The media has taken both offensive and defensive stances on the subject, as if it could somehow influence the inevitable march of progress towards driverless cars, though how much of that future can be assumed to have already arrived is disputable.

In all the whirlwind of reportage and analysis of the incident, nobody has discussed whether Tesla's strategy of grouping its assisted-driving features under one defining term, 'Autopilot', is itself misleading and flawed. Could it be leading Tesla drivers to believe that the car can take over, despite the numerous on-board warnings stressing that the driver should always be alert and ready to take back control? Or is Tesla facing disproportionate scrutiny and criticism over one fatality, in one of its cars, simply because of the term 'Autopilot'?

Auto misunderstanding

Driverless cars have been in the news over the past two years and it is tempting to believe that we are at the threshold of a revolution in autonomous driving. But it is, as yet, an imperfect science in real-world conditions. To be clear, Tesla has done its due diligence and tested its cars in Autopilot mode over millions of kilometres in real-world conditions. But, clearly, that still doesn't cover all possible scenarios on the road.

To understand why Tesla's terminology could be misleading for some drivers, let us start by reiterating that neither the Model S nor the Model X is a self-driving or autonomous car. They are cars with some semi-autonomous functions, such as adaptive cruise control with autonomous braking and automatic lane-centering. If the driver engages the turn signal, the Teslas are also capable of autonomous lane-changing. The cars use radar and combine feedback from cameras and sensors located around the exterior to keep a safe distance from traffic in the same lane, irrespective of the speed of travel.
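
As a rough illustration of what such a feature amounts to, here is a minimal sketch of a time-gap-based following-distance controller of the kind adaptive cruise control systems use. The function name, the thresholds and the simple proportional control law are assumptions made for explanation, not Tesla's actual implementation.

```python
# Illustrative sketch only -- not Tesla's implementation. All names,
# thresholds and the proportional control law are assumptions.

def follow_distance_command(radar_range_m: float, own_speed_mps: float,
                            time_gap_s: float = 2.0) -> float:
    """Return a crude acceleration command (m/s^2) that tries to hold
    a time-gap-based following distance to the vehicle ahead."""
    desired_gap_m = own_speed_mps * time_gap_s      # gap scales with speed
    gap_error_m = radar_range_m - desired_gap_m     # positive: too far, negative: too close
    # Proportional response, clamped to a plausible comfort range.
    return max(-3.0, min(1.5, 0.2 * gap_error_m))

# Example: travelling at 30 m/s with a car detected 45 m ahead,
# the controller brakes to restore the 60 m (2-second) gap.
print(follow_distance_command(radar_range_m=45.0, own_speed_mps=30.0))
```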

The system also allows a Tesla to react to vehicles that may intrude into its lane. But, obviously, there are limits to its capabilities. The point to remember here is that a number of other luxury car brands, such as Mercedes-Benz, BMW, Audi and Infiniti, have offered a similar cocktail of semi-autonomous driving features in their cars. Adaptive cruise control, for example, has been available in some Mercedes cars since the late 1990s. Similarly, lane-departure warning or lane-assist systems, parking assist, and autonomous braking that can bring the car to a full stop and resume travel without any driver intervention have been available in many German luxury brands' cars.

None of them calls its system 'autopilot', or anything close to suggesting that the car can handle itself. They use variations of terminology that clearly identify these as semi-autonomous driving aids or assistance systems, implicitly signalling that the driver must remain alert at all times.

False sense of security

In Tesla's case, the system is tuned to continue in 'Autopilot' mode for long stretches, until a sharp turn or poor road markings force it to issue warnings seeking the driver's intervention. Most other brands' assistance systems, by contrast, require the driver to regularly take the wheel to indicate attentiveness, after which the system resets itself for the next short, preset spell of hands-free driving. There is a reason why many of them have chosen not to enable autonomous lane-changing as part of this bouquet of features: it would be fair to say this is to avoid giving the driver an inflated sense of security while cruising with hands off the wheel.
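
For illustration, the logic of such an attentiveness check can be sketched as below. The 15-second hands-free window, the five-second warning period and the state names are assumptions made for explanation, not any manufacturer's actual values.

```python
# Illustrative sketch only: a hands-on-wheel attentiveness timer of the kind
# described above for other brands' assistance systems. All values are assumed.

class AttentivenessMonitor:
    def __init__(self, hands_free_window_s: float = 15.0):
        self.hands_free_window_s = hands_free_window_s
        self.seconds_since_touch = 0.0

    def driver_touched_wheel(self):
        # Each detected steering input resets the hands-free window.
        self.seconds_since_touch = 0.0

    def tick(self, dt_s: float) -> str:
        # Called periodically by the assistance system.
        self.seconds_since_touch += dt_s
        if self.seconds_since_touch < self.hands_free_window_s:
            return "assist_active"
        if self.seconds_since_touch < self.hands_free_window_s + 5.0:
            return "warn_driver"        # visual/audible prompt to take the wheel
        return "disengage_assist"       # hand control back to the driver

monitor = AttentivenessMonitor()
monitor.tick(10.0)                 # still within the hands-free window
print(monitor.tick(8.0))           # window exceeded -> "warn_driver"
monitor.driver_touched_wheel()     # driver responds, timer resets
print(monitor.tick(1.0))           # "assist_active" again
```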

Beta phase

Tesla says that its Autopilot tech is still in the beta phase. The only difference between Tesla's system and those of other cars with similar tech is that the American brand's cars are tuned to automatically send performance feedback to the company, which then shares the learnings with all cars running the same software. This works a bit like machine learning across the fleet, enabling near real-time fine-tuning based on real-world performance.
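
A minimal sketch of such a fleet feedback loop, assuming hypothetical report fields and a single tunable confidence threshold (neither is drawn from Tesla's actual system), might look like this:

```python
# Illustrative sketch only: cars report driver-takeover events, and one
# hypothetical tuning parameter is nudged and pushed back to the fleet.

from statistics import mean

def collect_fleet_reports(cars):
    """Gather each car's reports of where the driver had to take over."""
    return [report for car in cars for report in car["disengagements"]]

def retune_from_reports(reports, current_threshold: float) -> float:
    """Nudge a hypothetical lane-confidence threshold toward the conditions
    observed at takeovers, then distribute the new value to all cars."""
    if not reports:
        return current_threshold
    observed = mean(r["lane_marking_confidence"] for r in reports)
    # Blend the old threshold with what the fleet actually experienced.
    return 0.9 * current_threshold + 0.1 * observed

fleet = [
    {"disengagements": [{"lane_marking_confidence": 0.42}]},
    {"disengagements": [{"lane_marking_confidence": 0.38}]},
]
print(retune_from_reports(collect_fleet_reports(fleet), current_threshold=0.30))
```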

But the fact remains that Tesla's Autopilot system is far from perfect, and that reality needs to be reflected in the terminology the company uses to refer to the tech.

While encouraging its buyers to help fine-tune the system by trying it out cautiously, Tesla needs to stop actively promoting users who behave as if they are driving self-driving cars. It also needs to remind buyers that staying alert at all times is mandatory, despite the advances in its semi-autonomous tech.

We are still grappling with accidents caused by distracted driving and human errors of judgement; we certainly don’t need machine errors of judgement added to that mix.
