Tesla’s controversial Autopilot semi-autonomous driving technology has been the subject of recalls and safety investigations, and now an employee has detailed concerns about how the brand handles the data behind it.
Autopilot is a Level 2 driver assistance system that incorporates now-common features such as adaptive cruise control and lane-keep assist.
US publication Business Insider interviewed an anonymous member of Tesla’s Autopilot team who ‘trains’ the driver assistance system – and its more capable Full Self-Driving sister system – to perform more safely in the real world.
The unnamed employee’s daily job is to review five to six hours of footage captured by the ‘Tesla Vision’ cameras on the company’s vehicles and ‘label’ objects such as road signs, traffic lights and line markings so the system knows what they are.
According to the insider, employees’ concerns about Autopilot were often dismissed, even when they related to basic road rules that could see the human behind the wheel fined for breaking them.
“When we had concerns they were often brushed off. There were some times we were told to ignore ‘No Turn on Red’ or ‘No U-Turn’ signs,” the Tesla employee told Business Insider.
“Those were the kind of things that made me and my coworkers uncomfortable. In some cases, they would hear us out, but other times the general response was along the lines of ‘Mind your business and your pay grade’.”
Autopilot has drawn criticism from experts who believe its name gives drivers an inflated sense of what the system is capable of.
While Tesla requires drivers to keep their hands on the steering wheel when using Autopilot – and will lock them out if the system detects they aren’t paying attention – many users have found ways to bypass these safeguards.
This led to a recall of more than two million Teslas in the US late last year, after the National Highway Traffic Safety Administration (NHTSA) found the failsafes in place weren’t enough to prevent drivers misusing Autopilot.
Autopilot has been the subject of more than 40 NHTSA investigations, covering crashes involving at least 23 deaths in the US.
The employee interviewed by Business Insider, who remains with Tesla, said they initially believed working for the company would be “a great opportunity for my career”, though they now view it as dystopian.
They said productivity is monitored using Flide Time, employee-monitoring software that tracks keystrokes and detects inactivity.
While Tesla provides one 15-minute break and an additional 30 minutes for lunch, the employee said staff could be disciplined for taking too long to “go out of the labelling system to review traffic laws or [find] Tesla’s labelling policies”, or if a bathroom break runs over.
Last year, news agency Reuters reported a group of Tesla employees had been caught sharing private images and video from customers’ internal and external cameras, breaching owners’ privacy.
The Tesla employee told Business Insider the company has since introduced better policies to reduce the risk of future leaks.
“Tesla cracked down on image sharing and what we could access after Reuters published a story on it. They essentially told us ‘If you’re caught once, that’s your ticket out the door.’
“After that, you couldn’t access images outside of your allocated team folder anymore, and Tesla put watermarks on some of the images so you could easily tell where it came from, if it was redistributed.
“Sometimes people still pass images around the office, especially if it’s something out of the ordinary, but it doesn’t happen as often.
“There is something very strange about having this very intimate view into someone’s life. It feels odd to see someone’s daily drive, but it’s also an important part of correcting and refining the program.”
MORE: Tesla employees busted remotely sharing images of naked man and others from customer car cameras