The Mil & Aero Blog
Tuesday, February 5, 2008
  The future of network-centric surveillance

Posted by John Keller

I think we're seeing the future of network-centric surveillance in a U.S. Air Force project called Wide-Area Surveillance that seeks to blend information from many different imaging sensors into 3-D views of areas of interest.

This vision appears in a story on the Aviation Week and Space Technology blog, "U.S. Air Force Eyes New Surveillance System," about Pentagon plans to develop an electro-optical intelligence system beginning sometime next year. Writes Av Week:

Dubbed "wide area surveillance," the project stems from a prototype now operating in Iraq. The prototype system, built by the Air Force Research Laboratory and called Angel Fire, comprises multiple commercial cameras capable of collecting 1-2 frames per second. They are perched on a twin-engine, manned aircraft, which is being operated by contract personnel, the sources say. Images collected from the cameras can be "stitched" together using computers to present a near-360-degree vantage of a wide area. They may also be displayed in rapid succession to form a product similar to video.

It is this stitching together of images from different sensors that is so intriguing. They're talking about several sensors on one aircraft, but what is to prevent smart systems designers from blending images from many sensors on different platforms -- manned and unmanned aircraft, orbiting satellites, surveillance balloons, combat vehicles, and even gun sights on individual soldiers' weapons?
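To make the basic idea concrete, here is a minimal sketch of the compositing step: two overlapping frames placed into one mosaic, with overlapping pixels averaged. This assumes the frames' relative offsets are already known -- a real system like Angel Fire would first have to register the frames against each other, and the frame data and offsets below are made up purely for illustration:

```python
import numpy as np

def stitch(frames, offsets):
    """Composite frames (2-D grayscale arrays) into one mosaic.

    offsets[i] = (row, col) of frame i's top-left corner in
    mosaic coordinates. Pixels covered by several frames are averaged.
    """
    h = max(off[0] + f.shape[0] for f, off in zip(frames, offsets))
    w = max(off[1] + f.shape[1] for f, off in zip(frames, offsets))
    acc = np.zeros((h, w), dtype=float)  # summed pixel values
    cnt = np.zeros((h, w), dtype=float)  # how many frames cover each pixel
    for f, (r, c) in zip(frames, offsets):
        acc[r:r + f.shape[0], c:c + f.shape[1]] += f
        cnt[r:r + f.shape[0], c:c + f.shape[1]] += 1
    # Average where covered; leave uncovered pixels at zero
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)

# Two 4x6 frames whose fields of view overlap by three columns
left = np.full((4, 6), 100.0)
right = np.full((4, 6), 200.0)
mosaic = stitch([left, right], [(0, 0), (0, 3)])
print(mosaic.shape)  # (4, 9)
print(mosaic[0, 4])  # 150.0 -- averaged in the overlap
```

The hard part in practice is the registration, not the compositing: estimating those offsets (or full homographies) from the imagery itself, which is where the heavy computing comes in.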

What might this kind of system yield? I can imagine the ability to mouse through a computer image and travel above, around, below, and inside areas of interest. This kind of imagery potentially could even add the dimension of time. Users could look around and inside areas of interest in real time, and then compare those images to the same images one hour, one day, or one week in the past.

Computer signal processing can do amazing things these days. The television weatherman can give me his "futurecast" of storms many hours in the future. What if computers could extrapolate future events based on what has happened in the past? Might a multisensor system be able to shift us between the past, present, and future of surveillance areas of interest?
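As a toy illustration of that kind of extrapolation -- and it is only a toy, assuming the simplest possible model of a tracked object moving at constant velocity -- a system could project a position forward from its past fixes. The track data here is invented for the example; anything realistic would need far richer models of behavior and terrain:

```python
def extrapolate(positions, steps_ahead):
    """Predict a future (x, y) position from equally spaced past fixes,
    assuming constant velocity (the last observed displacement repeats)."""
    dx = positions[-1][0] - positions[-2][0]
    dy = positions[-1][1] - positions[-2][1]
    x, y = positions[-1]
    return (x + dx * steps_ahead, y + dy * steps_ahead)

# A vehicle seen at three successive frames, moving +2 east, +1 north per frame
track = [(10, 5), (12, 6), (14, 7)]
print(extrapolate(track, 3))  # (20, 10)
```

Real trackers replace this straight-line guess with statistical filters that weigh noisy measurements against a motion model, but the principle -- project the past forward -- is the same.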

I know what you're probably thinking, because I saw that movie, too. Remember Deja Vu with Denzel Washington, which came out in 2006? It's a movie with the same kind of speculation, and you ought to take a look at it if you can suspend disbelief and can get through the car chases.

Still, this would be a groundbreaking capability -- a view from any angle, from any time. It would be of immeasurable benefit, but only if we remember to trust what we can see with the naked eye.