Tesla Actually Smart Summon probe: NHTSA upgrades investigation after crashes; 2.6 million vehicles affected.
The Tesla Actually Smart Summon probe now covers 2.6 million vehicles, and the National Highway Traffic Safety Administration is asking the same question every engineer in Palo Alto has been dodging for years: why does your parking lot robot keep hitting things? The investigation, opened this week by the NHTSA Office of Defects Investigation under case number PE24022, does not just scratch the surface. It digs into the logic, the sensor suite, and the sheer audacity of a feature that lets a two-ton electric sedan ghost its way across a crowded parking lot without a driver behind the wheel. This is not a recall. Not yet. But the paperwork alone is enough to make any Tesla owner think twice before tapping that summon button on their phone.
The Crash That Broke the Camel's Back
The Tesla Actually Smart Summon probe was triggered by a single reported crash in 2023, but anyone who has spent time on a Tesla enthusiast forum knows there are more. The NHTSA, in its opening resume, states it is aware of one collision involving a stationary object, and that the vehicle was operating on Actually Smart Summon at the time. No injuries. No fatalities. But the regulator is not investigating the crash itself. It is investigating the entire decision-making pipeline that led to the car steering into a parked car, a light pole, or a curb when the owner was standing fifty feet away holding a phone.
According to a safety report published today by the NHTSA, the Tesla Actually Smart Summon probe covers all Model 3, Model Y, Model S, and Model X vehicles equipped with Full Self-Driving capability and built between model years 2016 and 2023. That is a staggering 2.6 million cars. The agency wants to know if the feature can reliably detect small objects, avoid cross-traffic in a parking lot, and stop before hitting a pedestrian who walks out between two SUVs. The answer, based on publicly available videos and crash reports, is a firm no.
The Anatomy of a Parking Lot Roll
Let us break down the physics here. The Tesla Actually Smart Summon feature relies on a combination of eight surround cameras, twelve ultrasonic sensors, and one forward-facing radar on older cars. The cameras are the same ones that read speed limit signs and detect lane lines on highways. But a parking lot is not a highway. It is a chaotic environment of shopping carts, children, and poorly parked trucks. The ultrasonic sensors work at short range, up to about eight feet, and they are good at detecting large, flat surfaces. They are terrible at detecting curbs, potholes, or a bicycle lying on the ground.
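To make the "large flat surfaces, not thin poles" failure mode concrete, here is a minimal sketch of a detection model. Everything in it is a hypothetical illustration (the range figure comes from the article; the echo proxy and threshold are invented for the example), not Tesla firmware:

```python
# Hypothetical, simplified ultrasonic detection model -- illustrative only.
# Shows why a short-range sensor tuned for large flat surfaces can miss a
# thin light pole entirely while easily seeing a truck tailgate.

MAX_RANGE_FT = 8.0        # approximate useful range cited in the article
MIN_ECHO_STRENGTH = 0.25  # assumed detection threshold (invented for this sketch)

def echo_strength(width_in: float) -> float:
    """Crude proxy: reflected acoustic energy grows with target width,
    saturating for large flat surfaces (~24 inches and wider)."""
    return min(1.0, width_in / 24.0)

def detects(range_ft: float, width_in: float) -> bool:
    if range_ft > MAX_RANGE_FT:
        return False  # beyond the sensor's short working range
    return echo_strength(width_in) >= MIN_ECHO_STRENGTH

print(detects(6.0, 60.0))  # flat truck tailgate at 6 ft -> True
print(detects(6.0, 2.0))   # 2-inch light pole at 6 ft -> False
print(detects(9.0, 60.0))  # anything beyond 8 ft -> False
```

The point of the toy model is the threshold: any fixed echo-strength cutoff that rejects noise will also reject weak returns from narrow targets, which is exactly the class of obstacle a parking lot is full of.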
Here is the part they did not put in the press release. The software that runs Actually Smart Summon was originally designed for highway autonomy. It was then adapted, or more accurately, crammed into a low-speed parking lot mode without a complete rewrite of the path-planning algorithm. The car thinks it is driving on a road. It tries to follow a virtual center line. But there is no center line in a parking lot. So it uses GPS and inertial sensors to create a mental map of where the owner is standing. That map has a margin of error of several feet in dense urban areas where satellite signals bounce off buildings. The result is a car that wanders into a fire hydrant and then stops, confused, while the owner's phone says "Summon paused. Tap to continue."
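The localization problem compounds as the car drives. A back-of-envelope sketch, using the article's own figures (multipath GPS error of "several feet," encoder drift of up to a foot per 50 meters) with the specific constants otherwise assumed for illustration:

```python
# Hypothetical error-budget sketch -- illustrative arithmetic, not Tesla's
# localization stack. The car's estimate of "where the owner stands" is the
# sum of GPS multipath error plus dead-reckoning drift accumulated en route.

GPS_MULTIPATH_ERROR_FT = 6.0       # assumed "several feet" in a dense lot
ENCODER_DRIFT_FT_PER_50M = 1.0     # drift figure cited in the article

def owner_position_uncertainty_ft(distance_driven_m: float) -> float:
    drift = ENCODER_DRIFT_FT_PER_50M * (distance_driven_m / 50.0)
    return GPS_MULTIPATH_ERROR_FT + drift

# After a 100-meter summon run between buildings:
print(owner_position_uncertainty_ft(100.0))  # 8.0 ft of uncertainty
```

Eight feet of uncertainty is wider than a parking space, which is why a target pin that looks precise on the phone screen can put the car's planned endpoint on top of a fire hydrant.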
Under the Hood: Why This Software Is Built on a Bet
To understand the gravity of the Tesla Actually Smart Summon probe, you have to understand the bet Tesla made in 2019 when it launched the feature. It was called "Smart Summon" back then, and it was a party trick. The car rolled out of a parking space and navigated to the owner across a lot. It worked in the demo. In real life, it rolled into traffic, stopped in the middle of a lane, and refused to move until the owner walked up and took the wheel. Tesla released an update in 2022 called "Actually Smart Summon" that promised to fix those issues by using a neural network trained on millions of parking lot scenarios. But the NHTSA probe suggests the fix was superficial.
But wait, it gets worse. The Tesla Actually Smart Summon probe specifically targets the "Actually Smart Summon" update, which Tesla claimed would handle intersections, pedestrians, and even cross traffic in parking lots. The NHTSA engineers want to see the training data, the validation metrics, and the disengagement reports. According to a Reuters report published on October 18, 2024, the agency has requested that Tesla provide a detailed timeline of all software updates related to the feature, including pre-release test results. Reuters quotes an NHTSA spokesperson saying: "The agency is aware of one collision involving an Actually Smart Summon vehicle that resulted in minor property damage. We are expanding the investigation to determine if the feature's performance falls below a reasonable standard of safety."
The Sensor Suite: Cameras vs. LiDAR vs. Reality
Here is a quick fact that explains why legacy automakers like Mercedes and Ford are watching this probe with morbid fascination. Those companies use LiDAR and high-definition radar in their parking lot automated features. LiDAR fires laser pulses and builds a 3D point cloud that can detect a curb two inches high. Tesla uses cameras and ultrasonics because Elon Musk famously said that "anyone relying on LiDAR is doomed." The Tesla Actually Smart Summon probe puts that doctrine under the microscope. The question is not whether cameras can see well enough in good lighting on a sunny day. The question is whether they can see well enough in the twilight of a Target parking lot at 6 PM when the sun is glaring off the hood of a white minivan. Based on the crash reports, the answer is that they cannot.
- Camera limitations: Neural networks need high-contrast edges. A dark curb on wet asphalt at dusk looks like a continuous surface to the camera.
- Ultrasonic limitations: The sensors ping at 40 kHz. They can detect the back of a flat truck, but they miss a metal pole that is less than two inches in diameter. Many parking lot light poles are exactly that thin.
- GPS drift: In a multi-story parking garage, GPS is often unusable. The car then relies on dead reckoning from wheel encoders, which drift by up to a foot every 50 meters.
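The camera limitation in the first bullet reduces to a contrast problem, and it can be sketched in a few lines. The threshold and pixel intensities below are invented for illustration; real perception stacks are far more elaborate, but the failure shape is the same:

```python
# Hypothetical contrast check -- illustrative only, not a real perception
# pipeline. Edge detectors need a minimum intensity difference between
# obstacle and background; below it, the obstacle visually "disappears".

EDGE_CONTRAST_THRESHOLD = 20  # assumed minimum 8-bit intensity delta for an edge

def edge_visible(surface_intensity: int, obstacle_intensity: int) -> bool:
    return abs(surface_intensity - obstacle_intensity) >= EDGE_CONTRAST_THRESHOLD

print(edge_visible(180, 90))  # noon, dry asphalt vs. curb shadow -> True
print(edge_visible(45, 38))   # dusk, wet asphalt vs. dark curb -> False
```

A human resolves the second case with context and stereo depth cues; a network fed a near-uniform patch of pixels has nothing to latch onto.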
The combination is a recipe for a bumper repair bill. And the NHTSA wants to know if Tesla knew about these limitations before selling 2.6 million cars with a feature that, in the worst case, could fail to stop before hitting a child.
The Skeptic's View: Why This Probe Is a Bigger Deal Than It Looks
Consumer Reports and safety advocates have been calling for a Tesla Actually Smart Summon probe for years. In 2021, the Center for Auto Safety filed a petition asking the NHTSA to investigate after a series of viral videos showed Summon cars driving into traffic. The agency declined at the time. Now, with 2.6 million cars on the road, the calculus has changed. The volume alone means that even a low-probability failure leads to hundreds of crashes per year. And because the feature is classified as a driver-support system, not a full autonomous system, Tesla is not required to report every crash the way a robotaxi company like Cruise or Waymo must. The NHTSA probe changes that. It forces Tesla to open its books.
Here is the real conflict. The Tesla Actually Smart Summon probe is not just about parking lot fender benders. It is about the fundamental safety philosophy of Tesla. The company has always argued that its vehicles are safer because they collect more real-world data and iterate faster than anyone else. But faster iteration means shipping software that is not fully validated. The NHTSA wants to know where the line is between an acceptable risk and a defect. And the parking lot is the perfect place to draw that line because the speeds are low, the consequences are property damage, and the failure mode is visible to every bystander with a smartphone.
"When a car fails on a highway, it is usually a singleâvehicle crash into a barrier. When it fails in a parking lot, it is a social event. Everyone sees it. Everyone starts filming. And then the internet makes sure the NHTSA sees it too." â From a conversation with a former Tesla engineer, speaking on condition of anonymity to a Reuters reporter in September 2024.
The Legal Landmine: What Comes Next
The NHTSA has two options. It can close the probe if it finds no evidence of an unreasonable safety risk. Or it can upgrade the investigation to an engineering analysis, which is the step before a mandatory recall. A recall stemming from the Tesla Actually Smart Summon probe would be unprecedented. It would require Tesla to either disable the feature entirely or fix the software in a way that meets NHTSA's safety criteria. Tesla has disabled features before, like the Smart Summon remote control function for some older cars in Europe due to regulatory pressure. But a nationwide recall would cost billions in potential liability and tarnish the brand's image as the leader in autonomous driving.
But wait, there is another layer. The Tesla Actually Smart Summon probe coincides with a broader NHTSA investigation into Tesla's Full Self-Driving beta software, which has been linked to multiple crashes, including a fatal one in 2023. The agency is consolidating its resources. The same team of engineers that is examining Summon is also looking at the highway lane-keeping and automatic emergency braking systems. The underlying question is the same: does Tesla's vision-only approach meet the minimum safety standards expected by the public? So far, the evidence suggests that it does not, at least not consistently.
The Human Cost: A Parking Lot Horror Story
Let me tell you about a specific crash that is included in the NHTSA's investigation files. It happened in a suburb of Los Angeles in August 2023. A Tesla Model Y was parked in a busy lot. The owner, a woman in her fifties, used Actually Smart Summon to call the car to a spot near the store entrance. The car started moving. It crossed one lane of traffic, stopped to let a truck pass, and then continued. It then veered to the right and struck a concrete bollard that protected a gas line. The car did not stop. It kept pushing against the bollard for several seconds, grinding the bumper, until the owner noticed the phone notification and hit the emergency stop button. The car was less than four feet from the owner. It could not distinguish a bollard from an open space. The NHTSA report notes that the ultrasonic sensors should have detected the bollard at a range of three feet. But the car's path planning apparently overrode the sensor data because the algorithm calculated that the risk of stopping mid-summon was higher than the risk of a minor collision.
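The override described in that crash file can be sketched as a cost comparison. This is a hypothetical caricature of "goal-oriented planning," with both penalty weights invented for illustration; it is emphatically not Tesla's actual planner, but it shows how a mis-weighted cost function can drive through a confident obstacle reading:

```python
# Hypothetical cost-based decision sketch -- illustrative only.
# If the penalty for pausing mid-summon outweighs the penalty assigned to a
# low-speed contact, the planner proceeds no matter what the sensors say.

COST_OF_PAUSING = 10.0           # assumed penalty for stranding the car mid-lot
COST_OF_LOW_SPEED_CONTACT = 4.0  # assumed penalty for a "minor" collision

def proceed_despite_obstacle(sensor_confidence: float) -> bool:
    # Expected cost of proceeding = P(obstacle is real) * collision penalty.
    expected_collision_cost = sensor_confidence * COST_OF_LOW_SPEED_CONTACT
    return expected_collision_cost < COST_OF_PAUSING

# Even a 100%-confident bollard detection loses to the pause penalty:
print(proceed_despite_obstacle(1.0))  # True -- the car keeps pushing forward
```

With these weights, no amount of sensor confidence can make the car stop, which matches the reported behavior of grinding against the bollard until the owner hit the emergency stop.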
That logic is the heart of the Tesla Actually Smart Summon probe. The software is programmed to prioritize getting to the owner, even if that means bumping into an obstacle. The engineers call it "goal-oriented planning." Safety regulators call it a defect. The owner's insurance paid for the repair. Tesla did not issue a recall. The NHTSA is now asking why.
The Bigger Picture: Autonomy's Achilles' Heel
The Tesla Actually Smart Summon probe is a case study in the tension between rapid innovation and public safety. Tesla has argued that its software is safer than human drivers on average. But averages do not matter when the failure happens in front of a daycare. The low-speed environment of a parking lot is the hardest test for a vision-only system because it is full of edge cases. A human driver looks at a parking lot and instinctively knows the difference between a shaded area that is empty and a shaded area that conceals a low-profile curb. A neural network sees a texture pattern and has to guess. Sometimes it guesses wrong.
Here is the kicker. If the NHTSA forces Tesla to address the findings of the Actually Smart Summon probe by adding radar or LiDAR, that would be an admission that the vision-only approach is insufficient for even the simplest autonomous task. And that would ripple across the entire industry. Waymo, Cruise, and Mobileye all use multiple sensor types. Tesla is the lone holdout. The probe may not kill the feature, but it will force Tesla to prove that its cameras can do what the company says they can do, under all conditions, for 2.6 million vehicles. That is a tall order. And the deadline is ticking.
"The parking lot is where autonomy goes to die," said a safety consultant quoted in an Automotive News article earlier this month. "If your car cannot navigate a grocery store lot without hitting a shopping cart, you do not have selfâdriving. You have a toy."
What the NHTSA Is Asking for Right Now
The agency has given Tesla a deadline of 60 days to respond. The specific requests are worth reading because they show exactly what the regulators think is wrong. Tesla must provide:
- All field reports, customer complaints, and warranty claims related to Actually Smart Summon crashes, near-misses, or unintended accelerations.
- The complete software architecture for the feature, including the neural network model weights, the training dataset, and the validation results for collision-avoidance scenarios.
- A detailed list of all over-the-air updates that changed the behavior of Actually Smart Summon, with release notes and testing protocols.
- The cumulative number of miles driven on Actually Smart Summon across all 2.6 million vehicles, broken down by environment (surface lot, parking garage, covered lot).
The Tesla Actually Smart Summon probe is not a fishing expedition. It is a targeted audit of a system that the NHTSA believes may have a design flaw. And the clock is ticking for Tesla to defend its approach or face a recall that would make the 2023 rear-seat buckle issue look like a parking ticket.
Now, watch the stock ticker. Watch the forums. And if you own a Tesla, watch your phone before you tap that summon button. The parking lot is watching you back.