Another Tesla reportedly collided with a parked police car on Interstate 4 in Orlando, narrowly missing a trooper who was helping another driver on the side of the road; no one was seriously injured. A Mercedes parked nearby was also involved in the crash, with the Tesla Model 3 striking both the police cruiser and the Mercedes.
The Florida Highway Patrol reports that the Tesla driver said Autopilot was engaged at the time of the crash. Between January 2018 and July 2021, NHTSA investigated at least 11 crashes across nine states in which Teslas struck emergency vehicles, police cars, or other service vehicles. Most of these incidents occurred at night, even though the law enforcement vehicles were using flashing lights or other visible warning signals.
Florida Highway Patrol will report this incident to both NHTSA and Tesla. Despite the numerous collisions, Tesla has not yet commented on the recent crash or the ongoing NHTSA investigation.
While Tesla claims its vehicles with Autopilot are involved in fewer accidents per mile than human-driven cars, experts warn that drivers must actively monitor both the vehicle and the road, since truly self-driving cars do not yet exist. Automakers, including Tesla, are legally required to ensure their vehicles remain under human control at all times.
Industry analyst Sam Abuelsamid points out that while driver-assistance features such as Tesla's Autopilot and adaptive cruise control can slow the car for traffic ahead, they may ignore stationary objects at highway speeds to avoid braking unnecessarily. Most automatic emergency braking systems do, however, react to stationary objects at lower speeds.
The main issue, according to Abuelsamid, is that many Tesla drivers misunderstand their vehicles' capabilities, confusing driver-assistance features with autonomous driving. Moreover, cues such as flashing lights and warning signals mean far more to a human driver than to an automated system.
"When it works, and it usually does, it's great," says Abuelsamid. "But it can easily be confused by things that a human would have no problem with. Machine vision is not as adaptable as human vision. The problem is that all machine systems sometimes make dumb mistakes."
Additional Information
While no verified figures exist on how often Teslas crash into emergency vehicles while using Autopilot, concerns have been raised both about the system's difficulty detecting stationary emergency vehicles and about driver misuse. The National Highway Traffic Safety Administration (NHTSA) has been actively addressing these concerns.
NHTSA investigates incidents, mandates recalls, tightens reporting requirements, and publishes safety findings. For instance, after hundreds of crashes involving Autopilot, which caused scores of injuries and more than a dozen fatalities, NHTSA ordered a recall to strengthen driver-monitoring controls and alerts. It also requires automakers to report crashes involving vehicles equipped with ADAS features. These ongoing investigations and findings play a crucial role in shaping regulatory action and future improvements.