Markey Pushes Tesla to Fix ‘Dangerous' Detour Around Autopilot Safety Feature

After an NBC10 Boston Investigation, Markey called on the company to upgrade what he called "flaws" in its cars

What to Know

  • Sen. Edward Markey urged Tesla last week to prevent owners from bypassing an automatic shut-off feature
  • Drivers have discovered ways to trick the car's sensors, allowing them to cruise with driver assistance engaged

U.S. Sen. Edward Markey is calling on Tesla to improve its driver assistance system after a report from NBC10 Boston exposed how drivers are skirting one of its key safety features.

In a letter to CEO Elon Musk, Markey urged the electric car maker last week to prevent Tesla owners from bypassing an automatic shut-off feature designed to make sure they remain engaged behind the wheel.

The system triggers a series of safety alerts if drivers take their hands off the wheel for more than 30 seconds. But as NBC10 Boston Investigator Ryan Kath reported, drivers have discovered ways to trick the car's sensors, allowing them to cruise with driver assistance engaged.

Videos circulating online show a range of methods, from resting a hand at the six o'clock position of the steering wheel to tying a weight to it, strapping on a water bottle — even wedging an orange in place.

"If you can use an orange to get around safety procedures, then that's a lemon," Markey said, sharing his concerns with NBC10.

After the station's story aired, Markey called on the company to upgrade what he called "flaws" in its cars.

"These techniques reveal inherent flaws in Tesla's Autopilot system that may pose a public safety danger and Tesla should quickly take action to address these risks before any tragedy occurs," Markey wrote.

The Massachusetts Democrat, who serves on the Senate Committee on Commerce, Science, and Transportation, also raised concerns at a safety hearing Wednesday in Washington, D.C. Markey peppered the head of the National Highway Traffic Safety Administration with questions about how the federal agency will ensure Tesla drivers and others aren't put at risk.

"That's not safe," Markey said, citing NBC10's coverage. "Someone is going to die."

When Autopilot is engaged, the car helps with steering and matches your speed to surrounding traffic. All Tesla models come equipped with the feature, which doesn't make the vehicle fully autonomous, but assists with what Tesla describes as the "most burdensome parts of driving."

Autopilot has drawn scrutiny, however, after some high-profile crashes involving Teslas, including one that killed a Florida man whose Tesla collided with a semi-trailer while in Autopilot mode.

In September, video of a Tesla owner seemingly asleep at the wheel while cruising down the Massachusetts Turnpike quickly went viral, inviting further questions about the technology.

Tesla did not respond to a request for comment Thursday on the latest criticism from Markey. In a previous statement, the company said videos circulating online of drivers seemingly dozing behind the wheel appear to be "dangerous pranks or hoaxes."

After seeing the Turnpike clip, a Newburyport man contacted NBC10 Boston to share his personal experience using workarounds to keep Autopilot engaged. The man said he fell asleep for 14 miles while traveling home on Route 2, allowing the car to travel unimpeded without his supervision.

"I was ashamed of myself," he told NBC10. "I was furious with myself that I put myself in that position."

After seeing the story, Markey called it "outrageous" that drivers could easily disengage a key safety feature.

"Driver assistance is turning into driver replacement, and we aren't ready for that on the roads," he said.

In his letter to the company, Markey posed a series of questions about its safety protocols, asking whether it has exhaustively tested potential methods for evading Autopilot's safety features, whether it tracks and responds to online videos that show how to disengage safety alerts, and what actions it will take to upgrade Autopilot's "now-known flaws."

He asked the company to respond by Dec. 6.

"We need to ensure that Tesla has the strongest possible safeguards for their Autopilot," he told NBC10. "If they don't, it will be a danger to the public."