How Safe Are Systems Like Tesla’s Autopilot? No One Knows.

Every three months, Tesla publishes a safety report that gives the number of miles between crashes when drivers use the company’s driver-assistance system, Autopilot, and the number of miles between crashes when they do not.

These figures always show that crashes are less frequent with Autopilot, which can steer, brake and accelerate Tesla vehicles on its own.

But the numbers are misleading. According to the Department of Transportation, Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets. Fewer crashes may occur with Autopilot simply because it is typically used in safer situations.
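To see why that matters, consider a minimal sketch with invented figures. The road types, mileage split and crash rates below are assumptions for illustration only, not data from Tesla or the Department of Transportation, apart from the rough "twice as safe" ratio mentioned above.

```python
# Toy illustration, with invented numbers, of how "miles between crashes"
# comparisons can be skewed when a driver-assistance system is used mostly
# on highways, which are assumed here to be twice as safe per mile as city streets.

# Assumed crashes per million miles, by road type (hypothetical).
CRASH_RATE = {"highway": 1.0, "city": 2.0}

def miles_per_crash(mix_in_millions):
    """Overall miles per crash for a given mix of driving.

    mix_in_millions maps road type -> millions of miles driven there.
    """
    total_miles = sum(mix_in_millions.values()) * 1_000_000
    expected_crashes = sum(
        miles * CRASH_RATE[road] for road, miles in mix_in_millions.items()
    )
    return total_miles / expected_crashes

# Assist system engaged mostly on highways; manual driving skews toward city.
assist_mix = {"highway": 90, "city": 10}
manual_mix = {"highway": 40, "city": 60}

print(f"System engaged: {miles_per_crash(assist_mix):,.0f} miles per crash")
print(f"System off:     {miles_per_crash(manual_mix):,.0f} miles per crash")
# Prints roughly 909,000 vs. 625,000 miles per crash, even though the
# per-road crash rates here are identical whether or not the system is on.
```

In this toy model the system looks about 45 percent safer purely because of where it is used, which is why comparisons on the same kinds of roads matter.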

Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Neither have the other carmakers that offer similar systems.

Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor introduced Blue Cruise last year. But publicly available data that reliably measures the safety of these technologies is scarce. American drivers, whether using these systems or sharing the road with them, are effectively guinea pigs in an experiment whose results have not yet been made public.

Carmakers and tech companies are adding more vehicle features that they claim improve safety, but these claims are hard to verify. Meanwhile, fatalities on the country’s highways and streets have been climbing in recent years, reaching a 16-year high in 2021. Any additional safety provided by technological advances does not appear to be offsetting poor decisions by drivers behind the wheel.

“There is a lack of data that would convince people that these systems, as they are deployed, live up to their expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the first chief innovation officer for the Department of Transportation.

GM collaborated with the University of Michigan on a study that explored the potential safety benefits of Super Cruise but concluded that there was not enough data to determine whether the system reduces crashes.

A year ago, the government’s auto safety regulator, the National Highway Traffic Safety Administration, ordered companies to report potentially serious crashes involving advanced driver-assistance systems like Autopilot within a day of learning about them. The order said the agency would make the reports public, but it has not yet done so.

The safety agency declined to comment on what information it has collected so far but said in a statement that the data would be released “in the near future.”

Tesla and its chief executive, Elon Musk, did not respond to requests for comment. GM said it had reported two incidents involving Super Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.

The agency’s data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and drivers to take a closer look at these technologies and ultimately change the way they are marketed and regulated.

“To solve the problem, you first have to understand it,” said Bryant Walker Smith, an associate professor in the law and engineering schools at the University of South Carolina who specializes in emerging transportation technologies. “It’s a way to get more ground truth as a basis for investigations, rules and other actions.”

Despite its capabilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and be ready to take control of the car at all times. The same is true of Blue Cruise and Super Cruise.

But many experts worry that these systems, because they enable drivers to relinquish active control of the car, may lull them into thinking that their cars are driving themselves. Then, when the technology malfunctions or cannot handle a situation on its own, drivers may be unprepared to take control as quickly as needed.

Older technologies such as automatic emergency braking and lane departure warning have long provided safety nets for drivers by slowing or stopping the car or warning them when they drift out of their lane. But newer driver-assistance systems flip that arrangement by making the driver the safety net for the technology.

Safety experts are particularly concerned about Autopilot because of the way it is marketed. For years, Mr. Musk has said the company’s cars are on the verge of true autonomy, driving themselves in practically any situation. The system’s name also implies a degree of automation that the technology has not yet achieved.

That can lull drivers into complacency. Autopilot has played a role in many fatal crashes, in some cases because drivers were not prepared to take control of the car.

Mr. Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study by the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.

“We know that cars using Autopilot are crashing less often than those that are not,” said Noah Goodall, a researcher at the council who studies safety and operational issues surrounding autonomous vehicles. “But are they being driven in the same way, on the same roads, at the same time of day, by the same drivers?”

Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, has found that older technologies such as automatic emergency braking and lane departure warning have improved safety. But the institute says studies have not yet shown that driver-assistance systems offer similar benefits.

Part of the problem is that police and insurance data do not always indicate whether these systems were in use at the time of the crash.

The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technology was in use within 30 seconds of impact. That data could provide a broader picture of how these systems are performing.

But even with that data, safety experts say, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.

The Alliance for Automotive Innovation, a trade group for car companies, has warned that data from the federal safety agency could be misinterpreted or misrepresented. Some independent experts have expressed similar concerns.

“My biggest concern is that we will have detailed data on crashes involving these technologies without comparable data on crashes involving conventional cars,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies and was previously general counsel at an autonomous vehicle start-up. “It could potentially make these systems look much less safe than they really are.”

For these and other reasons, carmakers may be reluctant to share some data with the agency. Under its order, companies can ask the agency to withhold certain data by claiming it would reveal business secrets.

The agency is also collecting crash data on automated driving systems, more advanced technologies that aim to remove drivers from cars entirely. These systems are often referred to as “self-driving cars.”

For the most part, this technology is still being tested in a relatively small number of cars, with drivers behind the wheel as a backup. Waymo, a company owned by Google’s parent, Alphabet, operates a driverless service in the suburbs of Phoenix, and similar services are planned for cities such as San Francisco and Miami.

Companies are already required to report crashes involving automated driving systems in some states. The data from the federal safety agency, which covers the entire country, should provide additional insight into this area as well.

But a more immediate concern is the safety of Autopilot and other driver-assistance systems, which are installed on thousands of vehicles.

“An open question is: Is Autopilot increasing or decreasing the crash frequency?” Mr. Wansley said. “We may not get a complete answer, but we will get some useful information.”
