Tesla: End of an autopilot experiment?

The American press reports with satisfaction that NHTSA has finally, more than five years after the first disturbing reports of irregularities, decided to investigate a series of repeated accidents. There are also voices demanding severe financial penalties for Elon Musk's Tesla and, perhaps, a recall or limitation of the functionality of the system installed in over 750,000 cars. And the case has more than one layer beneath the surface.

Tesla's Autopilot "overrated"?

Automotive News journalists call Tesla's Autopilot flatly "overrated" and add: "Elon Musk publicly declares that his company is close to creating a Level 5 autonomous vehicle (the highest level, a car that completely replaces the driver - ed.), but when confronted by official bodies he admits that Autopilot and the Full Self-Driving system in Teslas correspond to Level 2 on the SAE scale" - that is, they offer capabilities no greater than the systems of other manufacturers. And yet... only Tesla drivers check e-mails during long highway journeys, doze off, or busy themselves with activities that have nothing to do with driving. Cars of other brands do not give such freedom!

Six levels of autonomous driving - Tesla at level 2

It is worth explaining here that SAE publishes a recognized scale describing the degree of vehicle automation. The scale has six levels, from zero, a car without any autonomous functions, to five, a vehicle that need not even have a steering wheel and drives fully automatically, for example by carrying out the user's voice commands. The latter has not yet been built, and it is not known if or when it will be.

Tesla's Autopilot problem - what went wrong?

The situation is explained by engineer Sam Abuelsamid in the American edition of Forbes. According to him, the first version of Autopilot, presented in 2015, was based on components from an external supplier (Mobileye) intended for use in lane-departure and collision-warning systems and, in the most sophisticated variant, for lane-keeping correction or automatic braking. Many vehicle manufacturers use these components, but most only to the extent intended by Mobileye.

Based on these components (one front-facing camera, one long-range radar and 12 ultrasonic sensors), Tesla created an Autopilot whose functionality far exceeded the assumptions of the parts' manufacturer. The result was a version of Autopilot that soon (probably) contributed to a fatal accident and to the parting of ways between Tesla and Mobileye.

The next version of Autopilot (V2) was built on Tesla's own components: alongside the long-range radar, it consists of 8 cameras and a more powerful processor running software created by Tesla. Competitors such as GM, alongside the long-range radar (which has low resolution and was originally designed to support simpler systems such as adaptive cruise control), use four additional short-range radars monitoring the situation around the vehicle.


The radar in a Tesla cannot tell the difference

The Forbes author explains that the long-range radar used in Tesla's Autopilot (and in other cars as part of adaptive cruise control) does not distinguish ordinary elements of road infrastructure (road signs, gantries, etc.) from, for example, vehicles standing at the side of the road. Systems that can take over from the driver on the highway are therefore designed to ignore such stationary objects beside and around the road at higher speeds. Assuming the driver keeps his hands on the wheel, keeps his eyes open and watches the road, this is fine: under normal highway conditions nothing stands on the emergency lane or in the traffic lane, and if something does, the driver sees it from a distance and can brake in time. The thing is, GM does not claim that its cars equipped with the Super Cruise system, such as the Cadillac CT6, are capable of autonomous driving; Super Cruise is only a driver's assistant (though also classified as ADAS, an Advanced Driver Assist System). Elon Musk, meanwhile, offers a system under a much more suggestive name: Autopilot. And he claims that Autopilot drives better than a human.

It is highly likely that Elon Musk's engineers, aware of the shortcomings of the technology used in Teslas, have tried to stretch a quilt that is slightly too short, tinkering with the software and turning up the system's sensitivity to data from the long-range radar. This is suggested by cases of these cars braking sharply for no apparent reason.

And one more piece of bad news: tests carried out in Korea by KNCAP show that, although Tesla's Autopilot can generally do a lot, it consistently fails to recognize, for example, cyclists positioned at a right angle to the direction in which the Tesla is traveling.

Tesla Model 3 | KNCAP 2021 vehicle safety assessment (overall)

An NHTSA spokesman: "In every US state, the driver is personally responsible for driving his vehicle."

Elon Musk: the driver poses a greater threat than Autopilot

Elon Musk and his followers emphasize that accidents caused by Autopilot are extremely rare and that without this device there would be more victims. The NHTSA investigation covers a dozen or several dozen accidents with injuries and one with a fatality, but the truth is that only Elon Musk knows how many dangerous events Autopilot has been involved in. Not everyone who causes an accident or collision publicly admits to having engaged Autopilot in order to sleep during a night drive. Likewise, only Elon Musk knows how often drivers use the autonomous driving option, and only a combination of these data (the number of kilometers driven and the number of collisions or, for example, unnecessary emergency braking events) would give a picture of the actual benefits and threats associated with Autopilot.

Autopilot drives while the driver sleeps?

Meanwhile, another doubt related to Tesla's Autopilot concerns driver monitoring. While other manufacturers quickly realized that one of the biggest problems is supervising the driver while ADAS is operating, Elon Musk publicly suggests that the driver in a Tesla is unnecessary and poses a greater threat than the machine.

The problem (as anyone with a relatively modern, well-equipped car can see for himself) is that a person remains vigilant only as long as he is genuinely involved in the activity at hand. As soon as he merely watches over the machines, he begins to doze off.

Hence the ideas of manufacturers such as Google or Continental to install infrared cameras that see in the dark and through dark glasses in order to monitor the driver's alertness. Teslas, by contrast, use only ordinary wide-angle cameras, which can be fooled by placing a photo of a human face with open eyes on the driver's seat. In the same way, the outward-facing systems cannot tell whether they are looking at the image of a person on a billboard or at a real pedestrian, and in both cases they tend to react similarly.

They want Tesla and Musk punished

"We hope for a detailed investigation and that any penalties will have sharp teeth. It's time to end Tesla's games with human life on public roads," write the Automotive News journalists, and such voices are clearly heard throughout the industry. Some count on the involvement of the Federal Trade Commission (FTC, the American equivalent of the Polish UOKiK): two senators have called for it to intervene in the matter of Tesla's Autopilot. Their doubts concern dishonest advertising that endangers public safety by declaring that the car can replace the driver, although in practice its capabilities in this respect are very limited.

Will the conclusions of the Tesla Autopilot investigation affect all manufacturers?

The matter is serious from the point of view of the entire automotive industry, because it concerns the rules for bringing beta devices to market, devices which moreover have a decisive impact on safety. It is not without reason that NHTSA is asking all manufacturers introducing elements of autonomous systems to report the number of dangerous events that occurred while these systems were operating. The paradox is that at present only the manufacturers have real knowledge about the effects of assistance systems. Another thing is that concealing the sensitive data NHTSA is asking for, should it come to light, could end really badly for the manufacturer, as Volkswagen learned in the US during Dieselgate.

Once the investigation concludes, NHTSA will have to answer a few questions:

In any case, the anxiety of Tesla investors caused by NHTSA's opening of the investigation seems justified. If the conclusions mentioned above are implemented, it may turn out that other manufacturers, who plan to bring their "autopilots" to market only in the coming months, really do have something to sell, while Tesla... still has a great deal of polishing to do on its system.