The National Highway Traffic Safety Administration sent Tesla a special order at the end of July requesting extensive information about its Autopilot and driver monitoring systems. The auto safety regulator wants details on the "Elon Mode" configuration for Tesla vehicles, which eliminates a routine prompt reminding drivers to keep their hands on the wheel. Among other things, the agency asked how many Tesla customers have ever had this configuration enabled.
Tesla has received a special order from federal automotive safety regulators requiring it to supply detailed information about its driver assistance and driver monitoring systems, including a previously secret configuration known as "Elon mode." Normally, when a Tesla driver uses the company's driver assistance systems (marketed as Autopilot, Full Self-Driving, or FSD Beta), a visual symbol on the car's touchscreen prompts the driver to engage the steering wheel. If the driver does not take the wheel within a certain interval, the vehicle disables the advanced driver assistance features for the rest of the trip or longer.
As Bloomberg first reported, the National Highway Traffic Safety Administration sent Tesla a letter and special order on July 26th seeking details about the configuration, including how many cars and drivers Tesla has authorized to use it. Tesla responded to the agency's request on August 25th and asked for confidential treatment, but did not respond to CNBC's request for comment.
Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, said that NHTSA clearly takes a dim view of "cheat codes" that disable safety features such as driver monitoring, and that he agrees with that stance. The order comes as the agency is wrapping up investigations into crashes involving Tesla Autopilot.
Over the weekend, Tesla CEO Elon Musk livestreamed a test drive of a Tesla equipped with FSD version 12 on a social platform, without showing the details of his touchscreen or demonstrating that his hands were on the steering wheel. Greg Lindsay, a machine learning expert, noted that the drive likely violated Tesla's terms of use for Autopilot, FSD, and FSD Beta, while Grep VC managing director Bruno Bowden observed that although the system showed some improvements, it still has a long way to go before it can deliver safe self-driving. During the drive, the Tesla nearly ran a red light, and Musk had to intervene quickly by braking to avoid danger.