This week, a US Department of Transportation report detailed the accidents that advanced driver assistance systems have been involved in over the past year or so. Tesla's advanced features, including Autopilot and Full Self-Driving, accounted for 70 percent of the nearly 400 incidents – many more than previously known. But the report may raise more questions about this safety technology than it answers, researchers say, due to blind spots in the data.
The report examined systems that promise to take some of the tedium and danger out of driving by automatically changing lanes, staying within lane lines, braking before collisions, slowing before large curves in the road and, in some cases, navigating highways without driver intervention. The systems include Tesla's Autopilot, Ford's BlueCruise, General Motors' Super Cruise and Nissan's ProPilot Assist. While the report shows that these systems are not perfect, there is still plenty to learn about how this new breed of safety feature actually performs on the road.
This is largely because automakers have very different ways of submitting their accident data to the federal government. Some, like Tesla, BMW and GM, can pull detailed data from their cars wirelessly after a crash has occurred. That allows them to quickly comply with the government's 24-hour reporting requirement. But others, like Toyota and Honda, lack these capabilities. Chris Martin, a spokesman for American Honda, said in a statement that the automaker's reports to the DOT are based on "unverified customer statements" about whether their advanced driver assistance systems were on when the accident occurred. The automaker can later pull "black box" data from its vehicles, but only with the customer's permission or at the request of the police, and only with specialized wired equipment.
Of the 426 accident reports described in the government report's data, only 60 percent came through cars' telematics systems. The other 40 percent arrived through customer reports and claims – sometimes filtered through diffuse dealership networks – media reports and law enforcement. As a result, the report does not allow anyone to make "apples-to-apples" comparisons between safety features, says Bryan Reimer, who studies automation and vehicle safety at MIT's AgeLab.
Even the data the government did collect lacks full context. For example, the government does not know how often a car using an advanced assistance feature crashes per mile driven. The National Highway Traffic Safety Administration, which published the report, warned that some incidents could appear more than once in the dataset. And automakers with high market share and good reporting systems in place – especially Tesla – are likely to be overrepresented in accident reports simply because they have more cars on the road.
It is important that the NHTSA report not discourage automakers from providing more comprehensive data, says Jennifer Homendy, chair of the National Transportation Safety Board, a federal watchdog. "The last thing we want is to punish manufacturers who collect robust safety data," she said in a statement. "What we want is data that tells us what safety improvements need to be made."
Without that transparency, it can be difficult for drivers to understand, compare and even use the features that come with their cars – and for regulators to keep track of who is doing what. "As we gather more data, NHTSA will be able to better identify any emerging risks or trends and learn more about how these technologies are performing in the real world," Steven Cliff, the agency's administrator, said in a statement.