Many years ago, I was able to snag a win in the regional Science Fair with a project designed to automatically drive a car. In retrospect it was a ludicrously incapable system, but the judges were kind. Computers weren't available to high school students, so my design used hundreds of RTL ICs spread over many vector boards to handle the control requirements. It never really worked well even in the simulated tests I ran, but was a lot of fun to build.
Over forty years on, Google and others want to populate our roads, the air, and the sea with self-driving vehicles.
Be afraid.
But first, I do think that eventually we will come up with incredibly reliable autonomous vehicles of all sorts. My concern is the transition period, that time when smart cars have to deal with zany human drivers. And the time when the systems' bugs are still being worked out.
While it's pretty hard to imagine a computer performing worse than those tailgating, texting psychopaths who now do everything in their power to make the roads as dangerous as possible, humans are wonderfully capable at dealing with the unexpected.
Computer systems, however, are notoriously fragile. In fact, they are fragile in a way never before seen. Most engineered structures bend before they fail, or can be made robust by the addition of margin. Software fails suddenly and unexpectedly. The good news is that there's a lot of work being done to understand and mitigate software's weaknesses. But the evidence suggests that unless there's a very concerted effort made – at great expense – failure will be the norm.
Consider avionics. Avionics failures are managed via triply-redundant systems. How much will that drive up the cost of a car? Then there are the engineering costs. Agonizing development approaches mediated by regulatory requirements help create systems that work as expected. DO-178B (now C) for software and DO-254 for hardware have been brilliantly successful. But to my knowledge Google et al don't make use of these standards. One wonders what assurances they can offer that their designs will be correct and reliable.
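For readers who haven't worked on redundant systems, here's a minimal sketch in C of the 2-out-of-3 voting idea behind triple redundancy. The function name and the fall-back-to-median behavior are my own illustration, not taken from DO-178 or any particular flight computer; real systems also vote over time, monitor channel health, and flag persistent disagreement.

/* Hypothetical 2-out-of-3 voter for three redundant channels. */
#include <stdint.h>

static int32_t vote_2oo3(int32_t a, int32_t b, int32_t c)
{
    /* If any two channels agree, use that value. */
    if (a == b || a == c)
        return a;
    if (b == c)
        return b;

    /* No two agree: fall back to the median so one wild
     * channel can't drag the output off by itself. */
    if ((a >= b && a <= c) || (a <= b && a >= c))
        return a;
    if ((b >= a && b <= c) || (b <= a && b >= c))
        return b;
    return c;
}

The point isn't the twenty lines of code; it's everything around them: three independent sensors, three processors, and the verification effort to prove the whole arrangement behaves when a channel fails.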
Companies are petitioning the FAA to open the airspace to drones. Drones don't have passengers, so generally don't get the engineering attention lavished on commercial airliners. But they do share the air with those hurtling tubes full of warm bodies, and we on the ground are probably not thrilled with the idea of absorbing some of a drone's kinetic energy if the software – and the aircraft – crashes. I think the same standards used in commercial aviation should be applied to unattended aircraft.
The irony of self-driving automobiles is that they will probably be required to obey traffic laws. Will shoppers cough up for a car that can only do 55 in a 55 MPH zone?
Or will the dashboard have a control that varies behavior from "grandma going to church" to "sociopath on PCP?"