The US auto safety agency has opened an investigation into Alphabet-owned Waymo after one of its self-driving vehicles struck a child near an elementary school in Santa Monica, California.
The incident reportedly resulted in minor injuries. According to the National Highway Traffic Safety Administration (NHTSA), the child ran across the street from behind a parked vehicle on January 23 and was struck by the autonomous car during normal school drop-off hours.
Other children, a crossing guard, and several double-parked vehicles were in the vicinity when the incident occurred. The investigation comes as Waymo’s robotaxis also face scrutiny over incidents at school zones in Austin, although the company reported that there were no confirmed injuries there.
In a blog post on Thursday, Waymo committed to cooperating with authorities during the course of the investigations, adding that the child “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path.”
As per a Reuters report, the autonomous vehicle detected the child immediately upon emerging from behind the stopped car and promptly applied its brakes. According to the company, the vehicle slowed from 17 mph to below 6 mph before contact was made. Waymo called 911 after the collision, and the child stood up and walked away immediately.
The NHTSA said on Thursday that it is opening an investigation to ascertain whether the Waymo AV exercised appropriate caution given its proximity to the school zone during the drop-off period, as well as the presence of young pedestrians and other vulnerable road users.
As such, the NHTSA plans to examine the AV’s “intended behavior in school zones and neighboring areas, especially during normal school pick up/drop off times, including but not limited to its adherence to posted speed limits” and will “also investigate Waymo’s post-impact response.”
Waymo, however, has defended its AV, arguing that it performed better than a human driver would have. The company said a computer model suggested that a fully attentive human driver facing the same situation would have made contact with the pedestrian at about 14 mph.
“The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo said.
According to Reuters, the National Transportation Safety Board also opened an investigation the day the incident happened, after Waymo’s robotaxis illegally passed stopped school buses in Austin, Texas, at least 19 times since the beginning of the school year.
This was not the only such case, as the NHTSA also launched a separate investigation into Waymo’s robotaxis over safety violations involving a stationary school bus in Atlanta, Georgia, as previously reported by Cryptopolitan.
The company reportedly recalled over 3,000 vehicles to update software that had caused vehicles to drive past school buses loading or unloading students. Despite the software update, the Austin Independent School District said in November that five incidents had occurred in that month alone.
The school system asked the company to stop operating around schools during pick-up and drop-off times until it could ensure its vehicles would comply with regulations.
Waymo, however, said no collisions resulted from the incidents, and the school district said the company had refused to pause operations around schools.