Defect vs. Misuse: When Are Automakers to Blame?

Jeremy Clarkson once said that the best safety device a car could have would be a knife sticking out of the steering wheel, pointed at the driver’s heart. It would be the most effective way to make people wear seat belts and drive carefully. Even though it was meant as a joke, the quote carries a wise lesson: never get too confident at the wheel, and never put too much trust in your equipment. Two recent deaths relate to this very problem. In one case, the automaker issued a recall even though its equipment had no defect; it simply worked in a way people were not used to. In the other, there will be no recall, but a software update has already been released. The question we would like to pose is: should automakers be blamed for misuse?

FCA and Anton Yelchin

[Image: Anton Yelchin’s Jeep Grand Cherokee]

The first case involves the death of the actor Anton Yelchin, crushed against a gate by his own Jeep Grand Cherokee. FCA issued a recall over the electronic gear selector fitted to some of its vehicles. The lever always returns to a central position; moving it lets the driver select P, R, N or D, and the selected position lights up above the lever. The interesting part of this recall is that nothing is wrong with the part. In other words, it works as designed, but FCA has accepted that its operation may confuse drivers.

With that, FCA has put itself in a very dangerous position. In theory, it is accepting blame that is not its own. “There’s no ‘defect,’ but you could argue that gearshift lever is defective in design if people can’t understand it,” said Jack R. Nerad, Kelley Blue Book executive market analyst, in an interview with The Detroit News. Can you? Even FCA states in its recall release: “Gear-selection is conveyed to the driver by multiple sets of indicator lights, not gear-selector position, and unless due care is taken, drivers may draw erroneous conclusions about the status of their vehicles”. All you need to operate the gearbox is “due care”, something you also need in order to drive. Or don’t you? If you are not able to operate a machine, is the manufacturer to blame? Road Show even published an article about the problem, poking fun at drivers who could not cope with the gear selector.

In the same interview, Nerad said this: “It’s probably a new area for NHTSA, where there are new ways and new technologies that are not necessarily easy to understand and mistakes could be made even though the system is operating as designed”. If that is the case, the discussion should go as far as establishing standardized operating procedures for cars, as there are for airplanes; nothing of the sort exists for automobiles. Who decides whether a new technology is easy to understand? What does “easy to understand” even mean? Can NHTSA or any other entity make sure an “easy to understand” car will be properly driven?

There are many more questions to ask, but they all lead to a clear conclusion: government and automakers are increasingly trying to patronize drivers, and most drivers willingly accept it. So much so that it is hard to find anyone who takes responsibility, whether out of shame or in the hope of getting rich through a lawsuit.

If you are not able to operate a car properly, should you even have a driver’s license in the first place?

Tesla and Joshua Brown

The first person killed in a semi-autonomous car was driving a Tesla Model S. Joshua Brown was 40 years old and a massive Tesla supporter. When the first reports on the accident emerged, many rushed to put the blame on the electric car manufacturer. Tesla should not have offered Autopilot so soon, according to some critics, and the accident could set the technology back by a few years, if not many. An article by Todd Lassa, the Detroit Bureau Chief of Automobile, even claims Tesla should be treated as “a conventional automaker”. Conventional is everything Tesla is not, and everything it fights against. If Tesla wanted to be a conventional carmaker, it would produce ICE cars. It would have dealers.

The last sentence in his opinion article is symptomatic: “Whether or not Tesla can act like a conventional automaker could determine whether it has a true future or whether it is nothing more than a modern-day Tucker with a longer shelf life”. Really? So only conventional carmakers have the right to survive? Like Volkswagen, with its US$ 14.7 billion in settlements? Like FCA, desperately seeking a partner to reach an annual production of 6 million units? What about Riversimple and all the other disruptive car companies? Will they have to be “conventional” to succeed? There is so much wrong with this sort of reasoning that we will not even dissect it here. The present article is about something else. Check the image below:

[Image: diagram of the Tesla-truck accident, published by Electrek]

It is an image published by Electrek that explains the accident. The tractor-trailer crossed from an intersection, and the Tesla Model S went right under it. The brakes were never applied, a clear sign that the driver did not see the truck and that Autopilot did not detect the obstacle. Now check the video below:

It contains a few mistakes, such as saying Autopilot kept driving the car (it did not; the car simply did not stop after having its roof ripped off, which is called inertia). But it brings important information, such as the truck driver’s claim that Brown was probably watching “Harry Potter” in the car when the accident happened. A portable DVD player was found inside the vehicle. The new information clearly shows that Brown put too much trust in Autopilot. He apparently turned it on and started watching a movie, ignoring Tesla’s warnings that drivers must keep their hands on the wheel and stay aware of their surroundings. Brown’s confidence in Autopilot had even led him to publish a YouTube video that went viral, showing the system preventing an accident with his car. But was that enough for him to simply hand over control of the car and watch a movie?

Bottom line

Autopilot is a semi-autonomous system. It will not drive you from A to B without assistance. You cannot take the back seat and order your electronic chauffeur to get you somewhere. Is it a defect if the system is not able to detect a truck that suddenly crosses the road? Analysed in cold blood, Brown’s death will improve the system. It brings one more variable for Autopilot to consider, and possibly more sensors, radars and cameras to make the autonomous systems redundant. A new Tesla software release, version 8.0, brings important updates to the system, and Tesla is also about to present Autopilot 2.0.

Getting to the bottom of the matter, should Tesla be blamed if Brown turned on Autopilot and decided to watch a movie? Should Mercedes-Benz be blamed if an E-Class driver did the same? We sincerely do not think so. Recalls should be issued when a defect threatens the safety of a car’s occupants, and automakers should be held responsible if such a defect kills or harms anyone. But they are not to blame when people make mistakes using their products, as apparently happened with both Yelchin and Brown.

Gustavo Henrique Ruffo

I have been an automotive journalist since 1998 and have worked for many important Brazilian newspapers and magazines, such as the local edition of Car and Driver and Quatro Rodas, Brazil’s biggest car magazine. I have also worked for foreign websites, such as World Car Fans, and won a few journalism prizes, among them three SAE Journalism Awards and the 2017 IAM RoadSmart Safety Award. I am the author of “The Traffic Cholesterol”, a book about bad drivers that you can buy at Hotmart, Google Play, Amazon and Kobo.