General Motors’ self-driving unit Cruise LLC recently recalled and updated software in its 80 self-driving vehicles. The move follows a June crash in San Francisco that injured two people. According to regulators, the recalled software could “incorrectly predict” an oncoming vehicle’s path.
Cruise said it had determined this unusual scenario would not recur after the update. The National Highway Traffic Safety Administration (NHTSA) has stepped up its scrutiny of advanced driver assistance systems and autonomous vehicle systems in recent months. Last year, it directed all automakers and tech companies to promptly report crashes involving self-driving vehicles.
NHTSA said Thursday that Cruise’s recall filing “to address a safety defect in its automated driving systems software” was required by law. NHTSA added that it “expects all manufacturers, including those developing automated driving systems, to continuously ensure that they are meeting their requirements to initiate a recall for any safety issue that poses an unreasonable risk to safety.”
NHTSA said the recalled Cruise software could “in certain circumstances when making an unprotected left, cause the (autonomous driving system) to incorrectly predict another vehicle’s path or be insufficiently reactive to the sudden path change of a road user.”
Vehicle operation
Cruise disclosed Thursday that after the June 3 crash in San Francisco, it temporarily prevented its vehicles from making unprotected left turns and reduced the area in which its vehicles could operate. After the July 6 software update, Cruise said it gradually reintroduced unprotected left turns, in which a vehicle turns left at an intersection on a solid green light that directs all traffic rather than on a designated green arrow reserved for turning vehicles.
Cruise emphasized in a statement Thursday that all of its vehicles had received the software update and that the recall “does not impact or change our current on-road operations.” The company added, “Cruise AVs are even better equipped to prevent this singular, exceptional event.” NHTSA said, “an improper (Automated Driving Systems) response can increase the risk of a crash.” The agency said last month it had opened a special investigation into the Cruise crash.
Cruise said that, in rare circumstances, the software caused an autonomous vehicle making an unprotected left turn to hard brake, a response the system deemed necessary to avoid a severe front-end collision. The self-driving vehicle “had to decide between two different risk scenarios and chose the one with the least potential for a serious collision at the time, before the oncoming vehicle’s sudden change of direction,” Cruise said.