Real intelligence, or artificial intelligence? You be the judge
Posted by John Keller

Maybe it's me, but sometimes I just feel more comfortable knowing there's real intelligence, rather than artificial intelligence, standing between me and a potentially fiery death.
Yeah, I know, I'm just a nervous Nellie, but I gotta admit that a story in Network World entitled "DARPA advances artificial intelligence program for air traffic control" gives me pause. Don't get me wrong, I think artificial intelligence is great, but I've just seen too much of it to trust it with my life.
There, I said it. You can condemn me as a Luddite now.
But just be advised that back in the mid-'80s I used to edit a technology newsletter called
Advanced Military Computing, which used to cover artificial intelligence (AI) projects at DARPA pretty closely. I remember the Autonomous Land Vehicle, as well as
Pilot's Associate. Those programs yielded great technology, but -- to put it mildly -- there were a few bumps along the way.
The Autonomous Land Vehicle's machine vision system used to mistake shadows for big potholes and other obstructions. I remember a joke about a rule-based expert system designed to diagnose problems with cars. You'd tell the computer that the car was six years old and had red flecks on its body. Answer: the car has the measles.
Maybe I'm just being silly. That was 20 years ago, and artificial intelligence has had two more decades of research and development. It's gotta be a lot better, right?
According to Network World, DARPA is using the Generalized Integrated Learning Architecture (GILA) system, developed by Lockheed Martin's Advanced Technology Laboratories. GILA is intended to help the Air Force keep airspace operating safely with increasing numbers of manned aircraft, unmanned aerial vehicles (UAVs), and even airborne weapons.
Airborne weapons? Yikes! I'll remember that the next time I can read the tail numbers on that eastbound US Airways 727 somewhere over Kansas as I'm heading to the West Coast. I'll thank my lucky stars it was a passenger jet and not a cruise missile.
Is anyone else nervous out there?
DARPA and Lockheed Martin say the GILA system can actually learn flight controllers' tasks -- sometimes using just one example. C'mon, can't we build in some review time for that system? If I had to learn with just one example I'd never have made it out of elementary school. Reports Network World:
DARPA says the artificial intelligence software will learn by assembling knowledge from different sources -- including generating knowledge by reasoning. According to a Military & Aerospace item, such software has to combine limited observations with subject expertise, general knowledge, and reasoning, and ask what-if questions.
Jeez, I WROTE that Military & Aerospace item. I didn't think they'd use it for air traffic control! I've got a what-if question: what if the computer learns the wrong stuff?
I can't help thinking of that old joke about the fully automated aircraft. There's a recording to reassure passengers that goes something like this: "Welcome to the first fully automated passenger aircraft. There is no pilot on board. Everything's taken care of by the world's most advanced and sophisticated computers. We have taken great care to ensure that nothing will ever go wrong ... go wrong ... go wrong ... go wrong ..."
You'll have to excuse me now. I've got a plane to catch.