Tesla collects a huge amount of data to power 'Autopilot' and 'Full Self-Driving'

Electric vehicle maker Tesla offers features such as 'Autopilot' and 'Full Self-Driving (FSD)', and to realize them the company collects a huge amount of data from its vehicles. IEEE Spectrum, published by the Institute of Electrical and Electronics Engineers (IEEE), summarizes what kind of data Tesla cars collect.

The Radical Scope of Tesla's Data Hoard - IEEE Spectrum

Tesla's Autopilot Depends on a Deluge of Data - IEEE Spectrum

Who Actually Owns Tesla's Data? - IEEE Spectrum

More than 99% of new Tesla cars are equipped with a recorder called an 'Event Data Recorder (EDR)'. The EDR activates when the vehicle is involved in a collision, recording roughly the last 5 seconds of information such as vehicle speed, acceleration, brake use, steering input, automatic braking, and stability control, data that is useful when investigating the details of the accident. EDR data is stored on an SD card inserted into the vehicle's onboard computer. The SD card also holds time-stamped 'gateway logs' covering things like seatbelt use, Autopilot usage, cruise control settings, and whether the driver was behind the wheel. Because this information is stored at relatively low resolution, a single SD card can hold several years' worth of gateway logs, enough to cover the life of the vehicle.
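The 'last 5 seconds' behavior described above is the classic ring-buffer pattern used by event data recorders in general: telemetry is written continuously into a fixed-size buffer, and a trigger freezes the newest window. The sketch below is a minimal illustration of that pattern only; the sampling rate, field names, and class are assumptions, not Tesla's actual firmware.

```python
from collections import deque

SAMPLE_HZ = 10       # assumed sampling rate, for illustration
WINDOW_SECONDS = 5   # EDRs keep roughly the last 5 seconds

class EventDataRecorder:
    """Hypothetical EDR-style recorder: a fixed-size ring buffer."""

    def __init__(self):
        # deque with maxlen silently discards the oldest sample on append.
        self.buffer = deque(maxlen=SAMPLE_HZ * WINDOW_SECONDS)

    def sample(self, speed_kmh, accel_g, brake_pct, steering_deg):
        # Called continuously while driving; overwrites the oldest entry.
        self.buffer.append({
            "speed_kmh": speed_kmh,
            "accel_g": accel_g,
            "brake_pct": brake_pct,
            "steering_deg": steering_deg,
        })

    def on_collision(self):
        # Freeze and return the last ~5 seconds for storage on the SD card.
        return list(self.buffer)

edr = EventDataRecorder()
for t in range(100):  # 10 seconds of driving at 10 Hz
    edr.sample(speed_kmh=60 + t * 0.1, accel_g=0.0, brake_pct=0, steering_deg=0)
snapshot = edr.on_collision()
print(len(snapshot))  # only the newest 50 samples (5 s) survive
```

The point of the pattern is that storage stays constant no matter how long the car drives; only a trigger such as a collision turns the rolling window into a permanent record.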

These gateway logs are periodically uploaded to Tesla's servers when the vehicle connects to a Wi-Fi network. Since Tesla has submitted gateway logs as evidence in past lawsuits, IEEE Spectrum points out that they are clearly stored semi-permanently on Tesla's servers.

In addition, Tesla cars are equipped with many sensors, such as cameras and radar, which Tesla can use to collect data on drivers, pedestrians, and the surrounding environment. This sensor data is not normally uploaded to Tesla's servers, but when an airbag deploys, the vehicle records the situation around and inside the car in photos, which, like the gateway logs, are uploaded to Tesla's servers the next time the vehicle connects to Wi-Fi. These photos are deleted from the onboard computer after being uploaded.

Additionally, the Autopilot computer records a complete trip log each time the Tesla shifts from park into drive. The trip log includes GPS position, speed, road type, and whether and when Autopilot was engaged, and continues until the vehicle parks. This trip log is likewise uploaded to Tesla's servers when the onboard computer connects to Wi-Fi and is then deleted from the onboard computer.
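The trip-log lifecycle described above (open on shift to drive, append while driving, queue on park, upload over Wi-Fi, then delete locally) can be sketched as a small state machine. Everything here, including the field names and the upload callback, is an illustrative assumption rather than Tesla's actual format.

```python
import json
import time

class TripLogger:
    """Hypothetical sketch of a park-to-park trip log lifecycle."""

    def __init__(self):
        self.current = None        # the trip in progress, if any
        self.pending_upload = []   # completed trips awaiting Wi-Fi

    def shift_to_drive(self):
        # A new trip record opens when the car leaves park.
        self.current = {"started": time.time(), "points": []}

    def record_point(self, lat, lon, speed_kmh, autopilot_on):
        self.current["points"].append({
            "lat": lat, "lon": lon,
            "speed_kmh": speed_kmh,
            "autopilot": autopilot_on,
        })

    def shift_to_park(self):
        # The trip is complete; queue it until connectivity is available.
        self.pending_upload.append(self.current)
        self.current = None

    def on_wifi_connected(self, upload):
        # Upload queued trips, then delete them from the onboard computer.
        for trip in self.pending_upload:
            upload(json.dumps(trip))
        self.pending_upload.clear()

logger = TripLogger()
logger.shift_to_drive()
logger.record_point(35.68, 139.69, 42.0, autopilot_on=True)
logger.shift_to_park()

uploaded = []
logger.on_wifi_connected(uploaded.append)
print(len(uploaded), len(logger.pending_upload))  # 1 0
```

The upload-then-delete step matches the article's description: after a successful transfer, the only surviving copy of the trip lives on the server, not in the car.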

What Tesla is doing with these vast amounts of data is developing features like Autopilot and FSD. Since 2016, Tesla cars have included a 'shadow mode', which runs Autopilot in the background in parallel with the human driver even when the function is not engaged, simulating the driving decisions it would make. When Autopilot's predictions differ from what the driver actually did, the vehicle captures the situation with its onboard cameras and uploads it to Tesla's servers together with detailed parameters such as vehicle speed.
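The core of the shadow-mode idea described above is a divergence check: compare the background prediction with the human's actual control input, and flag the moment for upload only when they disagree beyond some threshold. The sketch below shows that comparison in the simplest possible form; the threshold value, function name, and payload fields are assumptions for illustration, not Tesla's implementation.

```python
# Illustrative divergence threshold; the real system's criteria are unknown.
STEERING_THRESHOLD_DEG = 5.0

def shadow_compare(predicted_steering_deg, actual_steering_deg, speed_kmh):
    """Return an upload payload if the background prediction diverges
    from what the human driver actually did, else None."""
    divergence = abs(predicted_steering_deg - actual_steering_deg)
    if divergence > STEERING_THRESHOLD_DEG:
        return {
            "divergence_deg": divergence,
            "speed_kmh": speed_kmh,     # detailed parameters accompany the clip
            "camera_clip": "<frames>",  # placeholder for the camera capture
        }
    return None

# Driver and shadow system roughly agree: nothing is flagged.
assert shadow_compare(1.0, 2.0, speed_kmh=80.0) is None

# They disagree sharply (say, a misread road sign): the moment is flagged.
event = shadow_compare(-10.0, 4.0, speed_kmh=80.0)
print(event["divergence_deg"])  # 14.0
```

Filtering at the point of disagreement is what makes the scheme scale: only the interesting moments, not the whole drive, need to be uploaded and labeled.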

The development team reviews this data and feeds it to the neural network used by Autopilot as training data; this is how they notice problems such as 'Autopilot cannot accurately identify road signs obscured by trees.'

Since shadow mode runs in millions of Tesla vehicles around the world, properly processing and storing the collected data poses a very large cost problem. In fact, Tesla has admitted that maintaining a large team to evaluate and label the images and videos collected in shadow mode is 'extremely costly', and in June 2022 it was reported that about 200 employees involved in this work had been laid off.

Tesla lays off about 200 employees involved in self-driving technology and closes its California office - GIGAZINE

Tesla's Autopilot function is said to improve year by year because it processes this huge amount of data, but serious and fatal accidents involving it continue to occur. Some point to a fatal flaw in Tesla's Autopilot, while others point out that drivers overestimate what Tesla cars can do.

In fact, on July 28, 2022, it was reported that the California Department of Motor Vehicles (DMV) is suing Tesla, claiming that the company advertised its driver-assistance technologies Autopilot and FSD as if they achieved fully autonomous driving, leading users to place excessive trust in them.

California DMV accuses Tesla of false advertising - Los Angeles Times

California DMV says Tesla FSD, Autopilot marketing deceptive

Tesla is in hot water with California DMV over its Autopilot and self-driving claims - Electrek

In two documents submitted for a California administrative hearing, the DMV stated that labels such as 'Autopilot' and 'FSD' are not mere product or brand names but phrases that advertise vehicles equipped with an advanced driver-assistance system (ADAS) as if they operated as self-driving cars, and that Tesla's ADAS-equipped vehicles did not operate as self-driving cars when the advertisements ran and still do not, arguing that Tesla's advertising is misleading.

Anita Gore, the DMV's deputy director of public affairs, told the media outlet CNBC that the agency sued Tesla 'to prevent drivers from misunderstanding and misusing new vehicle technology', and that the DMV may require Tesla to give consumers more detailed explanations, including the limitations of and cautions about Autopilot and FSD, along with other actions appropriate to the violations.

Gore said the DMV's lawsuit 'relates solely to Tesla's marketing and advertising practices for Autopilot and FSD.' The DMV also states that it is conducting a separate safety review to determine whether Tesla's Autopilot and FSD are at a level that can be used on public roads without special permits.

Tesla includes the Autopilot function in all newly manufactured vehicles and offers FSD either as a one-time purchase for $12,000 (about 1.6 million yen) or as a subscription for $199 per month (about 27,000 yen). At the time of writing, FSD is in beta, and drivers are allowed to test it on public roads in the United States. More than 100,000 testers are said to be using the FSD beta, but doubts about its accuracy remain, with reports of vehicles failing to recognize obstacles on the road and colliding with them.

Movie of a car failing to recognize an obstacle on the road and colliding with it during a beta test of Tesla's fully automatic driving system - GIGAZINE

According to figures released by the US federal government in early July 2022, about 70% (more than 270) of the 'self-driving car crashes' reported between June 2021 and July 2022 involved Tesla vehicles. This data alone does not show that Teslas are more likely to crash than other cars, but it is enough to show that Tesla vehicles are not fully self-driving.

In addition, the National Highway Traffic Safety Administration (NHTSA), part of the US Department of Transportation, has pointed to at least 37 Tesla collisions believed to involve the company's driver-assistance systems and reports that special investigations into these incidents are underway. At least 17 people died in the 37 crashes subject to the special investigations.

Separately, NHTSA has begun an evaluation, prompted by a series of collisions between Tesla vehicles and emergency response vehicles, to determine whether defects in the Autopilot technology warrant a recall.

in Ride, Posted by logu_ii