The National Geospatial-Intelligence Agency (NGA), an arm of the U.S. Department of Defense, is home to imagery analysts who pore over huge amounts of satellite images and data for insight into everything from natural disasters to potential threats from other countries and terrorist cells. However, there are concerns among those who work at the agency that their roles are set to be phased out.
The NGA is currently in the midst of a far-reaching effort to integrate new technology into its operations. Some analysts are concerned that this might result in their jobs being handed over to automated systems that utilize machine learning and computer vision.
“The fundamentals of our job are to take images of the planet from all sources, some government and some commercial, and create an understanding of man-made activity around the globe,” said NGA director Robert Cardillo in a recent interview with Foreign Policy. “I’m optimistic about the advances in machine learning on that part.”
However, other experts in the field of imagery analysis feel that an over-reliance on new technology might result in weaker intelligence. Francisco Nix is an imagery analysis instructor in the aerospace department of Northland Community and Technical College, and he suggests that these advances should serve as a supplement to human analysts, rather than as a replacement.
“The need for AI and machine learning has its place as an asset, but nothing more,” Nix wrote in an email to Futurism. “The information still has to be seen by an analyst and confirmed, edited, or discarded. I think Mr. Cardillo knows this – he also knows the overwhelming information available to him and the demand for analysis. He wouldn’t be doing his job without moving forward with what AI or machine learning has to offer NGA and DoD.”
There’s never been a more information-rich time for an organization like the NGA. Private companies are making it easier than ever before to launch satellites for surveillance purposes, while drones offer a method of more localized imaging. With all this raw data, automated systems can certainly play a role in sifting through the noise and making sure that human operatives spend their time analyzing the most relevant images. But that doesn’t mean these systems will know what to do with what they find.
“There’s no sub for the human eye – AI can probably help an analyst determine an item, find activity or numbers, but who is going to provide the analysis?” said Nix.
Nix argued that “years of experience, repetition, and self-learned training” are essential for a good analyst. This is a role that requires a great deal of nuance, and it might be a mistake to allocate the subtle responsibility of threat assessment to a machine.
This situation seems to be something of a microcosm of a wider friction when it comes to automation. It’s tempting to go all-in on these new technologies – but their most effective implementation seems to be an ancillary role, one that combines their strengths with the advantages of the human mind.