Karl D. Stephan
mercatornet.com
Karl D. Stephan received the B.S. in Engineering from the California Institute of Technology in 1976. Following a year of graduate study at Cornell, he received the Master of Engineering degree in 1977 and was employed by Motorola, Inc. and Scientific-Atlanta as an RF development engineer. He then entered the University of Texas at Austin’s graduate program and received the Ph.D. in electrical engineering in 1983.
What is certain is that around 11:25 pm, the Tesla went off the road and crashed at considerable speed into a tree. The car’s massive lithium-ion battery caught fire, and photographs of the wreckage after the bodies were removed show only two door uprights standing on either side of the otherwise flattened and blackened wreckage.
First responders found the body of Talbot in the front passenger seat and that of Varner in the back seat. Neither was at the wheel of the vehicle at the time of the crash.
Harris County Constable Mark Herman told reporters that “no one was driving” the 2019 Tesla.
But in a tweet the following Monday, Tesla CEO Elon Musk stated: “Data logs recovered so far show Autopilot was not enabled and this car did not purchase FSD.” FSD stands for Full Self-Driving, a mode which still requires driver supervision. Musk went on to say “Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”
In defense of Tesla and its CEO, Tesla drivers are warned repeatedly to keep their hands on the wheel even when Full Self-Driving mode is engaged. However, this is like telling a five-year-old to keep their hand in the cookie jar but not to take any cookies.
Many Tesla drivers have given in to the temptation to engage Autopilot or otherwise surrender control of the vehicle to the system computer and allow their attention to stray, or even leave the driver’s seat altogether, as Dr Varner apparently did. And the self-driving capabilities of the car are good enough so that — most of the time — absentee drivers can get away with it.
Musk bases his claim that Autopilot was not enabled on the fact that Tesla’s cars send periodic telemetry updates to the company via wireless links.
Leaving aside the question of whether having your car inform Tesla of your every driving move is compatible with privacy, it is not clear how frequent these updates are. If you read Musk’s tweet like a lawyer, the phrase “Data logs recovered so far” could cover the possibility that the most recent data log Tesla has from the vehicle in question was many minutes before the actual crash occurred. In other words, Musk could be saying nothing more significant than, “We know that ten minutes before the crash happened, Autopilot was not engaged.”
But a lot can happen in ten minutes.
The Houston police authorities have both impounded the wreckage of the Tesla and stated that they “eagerly wait” for the data that Tesla has recovered remotely. It is unclear at this writing whether any data logs can be recovered from the incinerated wreck. Unless Tesla has taken steps to harden the housing of the car’s computer memory in a way similar to the kind of waterproofing and fireproofing that aviation black boxes have, I’d say that the remote data is all the data that anyone’s likely to recover.
In the long run, removing the human element entirely from driving may significantly reduce automotive fatalities and injuries. And when so-called “driver assist” systems such as lane-keeping, station-keeping at a fixed distance behind a leading vehicle, and automatic braking in emergencies are employed in the way they are supposed to be used—as assists to a real driver at the wheel, not as a substitute—studies have shown that they do reduce accidents.
But the way Tesla has marketed their vehicles and promoted the driver-assist features as “Full Self-Driving” and “Autopilot” is misleading on the face of it.
Musk’s cowboy reputation, which he appears to relish, may be a big reason why only 14 percent of Americans say they “would trust riding in a vehicle that drives itself.” If Musk really intends to sell a mass-market car rather than one that only doctors and stockbrokers can afford, that 14 percent number will have to increase a lot before a truly self-driving car can succeed.
In the meantime, the kind of deceptive and hypocritical marketing Tesla engages in contributes to the perception that, alone among automakers, Tesla has really arrived at what the Society of Automotive Engineers calls “Level 5” autonomous driving.
In a letter to California’s Department of Motor Vehicles in March of this year, Tesla representatives admitted that the most advanced features of any Tesla vehicle amount only to Level 2 autonomy. According to the SAE, Level 2 automation is simply driver support, not automated driving, and requires that the driver constantly supervise the car’s operation. Clearly, many Tesla drivers are going beyond that. Dr Varner gambled on getting away with it and lost.
There’s nothing new about automakers providing features on cars that some drivers abuse. The muscle cars of the 1960s had power and acceleration that went way beyond anything normal driving required. As a consequence, some people wound up dying in fiery crashes after 140 mph joyrides.
But bigger engines and faster cars were just incremental changes, of a kind that had been happening ever since the invention of the automobile.
Cars that seem to drive themselves are something truly new in automotive history, and we are still in the early stages of seeing how autonomous driving will play out. While Tesla deserves credit for marketing what is probably the most technically advanced combination of driver-assist technologies on the market today, they have created a dangerous situation in which even a few spectacular crashes such as the one in Houston can put a damper on an entire technology and scare people away from it.
If Tesla is smart enough to make a nearly autonomous car, they are also smart enough to figure out how to keep drivers from absenting themselves from the wheel until the future date when it is reasonably safe to do so. By Tesla’s own admission, that date is not here yet.
It is apparently ridiculously simple to fool a Tesla into driving itself while you play your violin in the back seat. Other carmakers are taking more sophisticated precautions, such as eye-motion detection, to ensure that the driver-assist system always has an attentive driver to assist. It’s way past time for Tesla to do the same.
This article has been republished with permission from Engineering Ethics.