New fatal accident for Tesla. Two victims, who were inside the car but not driving.
The news brings two fundamental issues to the fore for the future of self-driving cars: that of safety and that of responsibility.
New accident for a Tesla car
As we reported yesterday, Monday, April 19, two men lost their lives on Saturday evening. At around 9 p.m. local time, a 59-year-old and a 69-year-old died near a small town north of Houston, Texas, when their car collided with a tree. The crash produced a fire that took four hours to extinguish. But the sensational detail is another: neither man was driving the 2019 Tesla Model S. One was in the front passenger seat, the other in the back seat.
Autopilot was presumably engaged, which however requires the alert presence of a driver in the driver's seat, with hands always on the steering wheel.
Conditioned autonomous driving
The Tesla Model S crash on Saturday leaves some doubts.
In the meantime, we remind you that this is not a fully autonomous car. To date, Tesla has only put cars with conditioned automation on the market. This means that a human being must be constantly present at the wheel, ready to intervene in case of danger or a malfunction of the system.
Therefore, behind the accident in Texas lies a small mystery: it is true that neither of the two men was driving. But it is equally true that the Model S's sensors should prevent the car from starting at all unless a driver is detected in the seat.
The Tesla accident: the background
The federal prosecutor is investigating the dynamics of the accident. The officers who intervened, given the position of the bodies after the crash, rule out that anyone was driving. It must now be established whether Autopilot was engaged or not.
Federal authorities in the United States are investigating more than twenty car accidents involving an activated Autopilot.
The first fatal accident involving a self-driving car dates back to 2016, when forty-year-old Joshua Brown lost his life in Florida after failing to intervene to dodge a truck. More precisely, the engaged Autopilot had not recognized the truck's maneuver in time.
Tesla’s reaction
Tesla in 2016 had limited itself to a terse statement, in which it recalled that Brown's had been the first death against more than 130 million miles traveled by its cars on Autopilot.
After the accident on Saturday night in Texas, the top management of the company founded by Elon Musk has so far made no statement.
Self-driving cars and six levels of safety
The classification of self-driving cars has six levels, from 0 to 5: level 0 corresponds to cars completely devoid of autonomy, while level 5, for now only hypothetical, would correspond to constant autonomous driving of the car. At level 5 there is no longer any need for a steering wheel or pedals.
To date, the driverless taxis Waymo is experimenting with in the United States have reached level 4. But the Teslas that individuals can purchase are still at level 3, that of so-called conditioned automation. At this level the car fully handles driving in ordinary situations, but the driver is required to be present in order to intervene in adverse conditions.
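As a rough sketch, the six-level classification and the supervision rule described above can be encoded in a few lines of Python. The level descriptions for levels 1 and 2 are not spelled out in this article and follow the standard SAE J3016 terminology; the helper function simply reflects the article's framing that up to level 3 a human at the wheel is still required.

```python
# Illustrative sketch of the six automation levels (0-5) discussed above.
# Descriptions follow SAE J3016 terminology; levels 1-2 are assumptions
# not detailed in the article itself.
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: the system helps with steering or speed",
    2: "Partial automation: system steers and accelerates under supervision",
    3: "Conditional automation: car drives in ordinary situations, driver on standby",
    4: "High automation: no driver needed in a limited domain (e.g. Waymo taxis)",
    5: "Full automation: no steering wheel or pedals needed (still hypothetical)",
}

def requires_human_supervision(level: int) -> bool:
    """Per the article's framing: up to and including level 3,
    a human must remain at the wheel, ready to intervene."""
    return level <= 3

print(requires_human_supervision(3))  # True: conditioned automation
print(requires_human_supervision(4))  # False: e.g. Waymo's driverless taxis
```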
It would therefore be improper to blame the manufacturers, since level 3 cars (the most advanced on the market) still require the supervision of a human being at the wheel.
Autonomous cars should precisely reduce road accidents, as well as improve traffic conditions (with a consequent reduction in emissions) and allow greater mobility for people with physical impediments.
But one of the fundamental aspects, as the Saturday night accident shows so resoundingly, is that this is, so to speak, a hybrid technology, which cannot yet be entirely entrusted with driving.
The issue of responsibility
The Tesla accident that took place three days ago in Texas also reopens, alongside safety, the second major problem of autonomous cars: that of responsibility.
Without going into intricate legal questions, we can say that up to level 3 the responsibility for accidents lies entirely with the driver. The problem becomes thornier and more subtle from level 4 onward, where responsibility will have to be regulated, deciding how to apportion it among the driver, the carmaker and the company that developed the driving software.
Things will not be easy at all, because in the Italian criminal code, for example, responsibility is always individual, and resorting to any form of strict liability would be a stretch from the criminal point of view.
This being the case, we can foresee a future in which laws will have to adapt to technology, and not vice versa.