The aftermath of a self-driving vehicle
fatally striking a pedestrian is a tasteless occasion for self-promotion, but it’s
also an important time to discuss the problems that created the situation.
Mobileye CEO and CTO Amnon Shashua seems to do a little of both in this post at
parent company Intel’s blog, running the company’s computer vision software on
Uber’s in-car footage and detecting the person a full second before impact.
It must first be said
that this shouldn’t be taken as a demonstration of the superiority of Mobileye’s
systems or anything like that. This type of grainy footage isn’t what
self-driving — or even “advanced driver assistance” — systems are meant to
operate on. It’s largely an academic demonstration.
But
the application of a competent computer vision system to the footage and its
immediate success at detecting both Elaine Herzberg and her bike show how
completely the Uber system must have failed.
Even if this Mobileye
object detection algorithm had been the only thing running in that situation,
it detected Herzberg a second before impact (on highly degraded data at that).
If the brakes had been applied immediately, the car might have slowed enough that
the impact would not have been fatal; even a 5 MPH difference could matter.
Remember, the Uber car reportedly didn’t even touch the brakes until
afterwards. It’s exactly these types of situations in which we are supposed to
be able to rely on the superior sensing and reaction time of an AI.
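To get a feel for how much that one second of lead time could be worth, here is a back-of-the-envelope sketch. All of the figures in it are illustrative assumptions, not crash-report data: a travel speed of 40 MPH and a full-braking deceleration of about 0.8 g on dry pavement are plausible round numbers, nothing more.

```python
# Rough estimate of how much speed hard braking could shed in the
# ~1 second between detection and impact. All inputs are assumed,
# illustrative values, not figures from the actual incident.

MPH_TO_MS = 0.44704          # miles per hour -> meters per second
G = 9.81                     # gravitational acceleration, m/s^2

initial_speed_mph = 40.0     # assumed travel speed
decel_g = 0.8                # assumed full-braking deceleration (dry asphalt)
lead_time_s = 1.0            # lead time the detection provided

v0 = initial_speed_mph * MPH_TO_MS        # initial speed, m/s
decel = decel_g * G                       # deceleration, m/s^2
v_impact = max(v0 - decel * lead_time_s, 0.0)  # speed after braking 1 s

speed_shed_mph = (v0 - v_impact) / MPH_TO_MS
print(f"Impact speed: {v_impact / MPH_TO_MS:.1f} mph "
      f"({speed_shed_mph:.1f} mph slower than with no braking)")
```

Under these assumptions, one second of hard braking sheds on the order of 17 MPH — far more than the 5 MPH margin mentioned above. The point is not the exact numbers but the shape of the math: at these speeds, every fraction of a second of earlier braking removes a meaningful chunk of impact energy.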
We’re still waiting to
hear exactly how the Uber car, equipped with radar, lidar,
multiple optical cameras and a safety driver, any one of which should have
detected Herzberg, failed to do so entirely — or, if it did detect her, failed
to take action.
This little exhibition by Mobileye, while it should be taken with a
grain of salt, at least gives a hint at what should have been happening inside
that car’s brain.