Piecemeal Autopilot Fails

Why emergency braking systems sometimes hit parked cars and lane dividers. Recent Tesla Autopilot crashes hold a lesson for the whole industry.

The fundamental issue here is the tendency to treat lane-keeping, adaptive cruise control, and emergency braking as independent systems. As we've seen, today's driver-assistance systems have been created in a piecemeal fashion, with each system following a do-no-harm philosophy: it intervenes only if it's confident it can prevent an accident, or at least avoid causing one. If it's not sure, it does nothing and leaves the decision to the driver.
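To make the do-no-harm rule concrete, here is a minimal sketch of that decision logic, not any vendor's actual implementation; the `Detection` type and the 0.9 confidence threshold are hypothetical stand-ins for whatever a real perception stack produces.

```python
# Sketch of the "do-no-harm" intervention rule described above:
# each subsystem acts independently, and only when its own
# confidence clears a threshold; otherwise it defers to the driver.

from dataclasses import dataclass


@dataclass
class Detection:
    obstacle_ahead: bool   # hypothetical perception output
    confidence: float      # 0.0 .. 1.0


def emergency_brake_decision(d: Detection, threshold: float = 0.9) -> str:
    """Intervene only when confident; otherwise leave it to the driver."""
    if d.obstacle_ahead and d.confidence >= threshold:
        return "brake"
    return "do_nothing"    # uncertain cases are silently passed to the human


if __name__ == "__main__":
    print(emergency_brake_decision(Detection(obstacle_ahead=True, confidence=0.95)))  # brake
    print(emergency_brake_decision(Detection(obstacle_ahead=True, confidence=0.60)))  # do_nothing
```

The point of the sketch is the failure mode: every uncertain case falls through to "do_nothing", which is only safe if the driver is actually watching.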

Early driver-assistance systems assumed that the driver could monitor the car and intervene if it made a mistake. But a driver's ability to monitor a car depends crucially on reflexes built up over years of driving, and those reflexes depend on cars behaving in consistent and predictable ways: if you take your eyes off the road for a couple of seconds, for example, the car will keep traveling in the same direction.

Once a driver-assistance system reaches a certain level of complexity, the assumption that it's safest for the system to do nothing no longer makes sense. Complex driver assistance systems can behave in ways that surprise and confuse drivers, leading to deadly accidents if the driver's attention wavers for just a few seconds. At the same time, by handling most situations competently, these systems can lull drivers into a false sense of security and cause them to pay less careful attention to the road.


Thank goodness someone is thinking about how spurious feedback signals can wreck predictable HCI. Airplanes switch the autopilot off entirely rather than co-mingling systems. I really think Silicon Valley needs to grok applied feedback control theory: understanding how signals integrate, via Fourier and Laplace transforms, is what makes real-time feedback control predictable.
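As a toy illustration of the kind of loop that theory analyzes (nothing automotive-specific), here is a proportional controller driving a first-order plant; the gains, time constant, and setpoint are all made-up numbers. In the Laplace domain the plant is G(s) = 1 / (τs + 1), and with a proportional gain Kp the closed loop is stable and behaves predictably, which is exactly the property being asked for above.

```python
# Toy feedback loop: proportional control of a first-order plant,
# simulated with forward-Euler integration. Illustrative only.

def simulate(kp: float = 2.0, tau: float = 1.0, setpoint: float = 1.0,
             dt: float = 0.01, steps: int = 500) -> float:
    y = 0.0  # plant output (e.g., a speed or position error signal)
    for _ in range(steps):
        error = setpoint - y        # feedback: compare output to target
        u = kp * error              # proportional control action
        dydt = (u - y) / tau        # first-order plant dynamics
        y += dydt * dt              # step the simulation forward
    return y


if __name__ == "__main__":
    # With proportional-only control the output settles smoothly at
    # setpoint * kp / (1 + kp) ≈ 0.667: predictable, but with a
    # steady-state offset (which is why real controllers add integral terms).
    print(f"output after 5 s: {simulate():.3f}")
```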