The future of network-centric surveillance
Posted by John Keller

I think we're seeing the future of network-centric surveillance in a U.S. Air Force project called Wide-Area Surveillance, which seeks to blend information from many different imaging sensors into 3-D views of areas of interest.
This vision appears in a story on the Aviation Week and Space Technology blog entitled "U.S. Air Force Eyes New Surveillance System," which describes Pentagon plans to develop an electro-optical intelligence system beginning sometime next year. Writes Av Week:
Dubbed "wide area surveillance," the project stems from a prototype now operating in Iraq. The prototype system, built by the Air Force Research Laboratory and called Angel Fire, comprises multiple commercial cameras capable of collecting 1-2 frames per second. They are perched on a twin-engine, manned aircraft, which is being operated by contract personnel, the sources say. Images collected from the cameras can be "stitched" together using computers to present a near-360-degree vantage of a wide area. They may also be displayed in rapid succession to form a product similar to video.
It is this stitching together of images from different sensors that is so intriguing. They're talking about several sensors on one aircraft, but what is to prevent smart systems designers from blending images from many sensors on different platforms -- manned and unmanned aircraft, orbiting satellites, surveillance balloons, combat vehicles, and even the gun sights on individual soldiers' weapons?
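To make the stitching idea a little more concrete, here is a minimal sketch of how overlapping frames from several cameras can be merged into one wide mosaic, using OpenCV's high-level Stitcher. The camera file names are placeholders I've invented for illustration; this is not a description of Angel Fire itself.

```python
# Minimal sketch: stitch overlapping frames from several cameras into one mosaic.
import cv2

# Frames captured at roughly the same moment by different cameras (hypothetical files).
frames = [cv2.imread(p) for p in ("cam_nose.jpg", "cam_left.jpg", "cam_right.jpg")]

stitcher = cv2.Stitcher_create()          # estimates homographies and blends seams
status, mosaic = stitcher.stitch(frames)  # returns a status code and the panorama

if status == cv2.Stitcher_OK:
    cv2.imwrite("wide_area_mosaic.jpg", mosaic)
else:
    print(f"Stitching failed with status {status}")
```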
What might this kind of system yield? I can imagine the ability to mouse through a computer image and travel above, around, below, and inside areas of interest. This kind of imagery could potentially even add the dimension of time. Users could look around and inside areas of interest in real time, and then compare those views to the same scenes one hour, one day, or one week in the past.
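As a rough illustration of that time dimension, the sketch below indexes imagery of one area of interest by capture time and pulls back the frame nearest to any requested moment. The class and file names are my own assumptions for the example, not any fielded system.

```python
# Toy time-indexed archive: store frames by timestamp, retrieve the nearest one.
import bisect
from datetime import datetime, timedelta

class FrameArchive:
    def __init__(self):
        self._times = []   # sorted capture times
        self._frames = []  # frame payloads (file paths, arrays, etc.)

    def add(self, captured_at: datetime, frame) -> None:
        i = bisect.bisect(self._times, captured_at)
        self._times.insert(i, captured_at)
        self._frames.insert(i, frame)

    def nearest(self, when: datetime):
        """Return the frame captured closest to `when`."""
        i = bisect.bisect(self._times, when)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self._times)]
        best = min(candidates, key=lambda j: abs(self._times[j] - when))
        return self._frames[best]

# Compare "now" against the same scene an hour and a day earlier.
archive = FrameArchive()
now = datetime.utcnow()
archive.add(now - timedelta(days=1), "frame_day_ago.jpg")
archive.add(now - timedelta(hours=1), "frame_hour_ago.jpg")
archive.add(now, "frame_now.jpg")
print(archive.nearest(now - timedelta(hours=1)))
```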
Computer signal processing can do amazing things these days. The television weatherman can give me his "futurecast" of storms many hours in the future. What if computers could extrapolate future events based on what has happened in the past? Might a multisensor system be able to shift us between the past, present, and future of surveillance areas of interest?
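A very crude "futurecast" of imagery is at least conceivable with today's tools. The sketch below, under my own assumptions, estimates motion between two past frames with dense optical flow and then warps the latest frame forward as if that motion simply continued; the file names are illustrative placeholders.

```python
# Crude futurecast: estimate motion between two past frames, assume it continues,
# and warp the latest frame forward one step.
import cv2
import numpy as np

prev_frame = cv2.imread("scene_t0.jpg", cv2.IMREAD_GRAYSCALE)
curr_frame = cv2.imread("scene_t1.jpg", cv2.IMREAD_GRAYSCALE)

# Dense flow: per-pixel displacement from prev_frame to curr_frame.
flow = cv2.calcOpticalFlowFarneback(prev_frame, curr_frame, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Constant-velocity assumption: each pixel keeps moving by the same displacement.
h, w = curr_frame.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
map_x = (grid_x - flow[..., 0]).astype(np.float32)
map_y = (grid_y - flow[..., 1]).astype(np.float32)

# Backward-warp the current frame to approximate the next one.
predicted_next = cv2.remap(curr_frame, map_x, map_y, cv2.INTER_LINEAR)
cv2.imwrite("scene_t2_predicted.jpg", predicted_next)
```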
I know what you're probably thinking, because I saw that movie, too. Remember Deja Vu with Denzel Washington, which came out in 2006? It's a movie with the same kind of speculation, and you ought to take a look at it if you can suspend disbelief and can get through the car chases.
Still, this would be a groundbreaking capability -- a view from any angle, at any time. It would be of immeasurable benefit, but only if we remember to trust what we can see with the naked eye.