Recognizing Violence In Faces
A Russian firm called NTechLab has created software that, when used in tandem with surveillance cameras, can detect emotions and identify people in a crowd who are angry, nervous, or stressed. The software then weighs the emotions it perceives against the age, gender, and identity (if known) of the people it is surveilling to decide who the potential criminals and terrorists are. Last year, the firm's software was used to power the FindFace app, which works on the Russian version of Facebook to find anyone from missing family members to suspects in cold cases.
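NTechLab has not published how its system actually combines these signals, so the following is only a minimal, hypothetical sketch of the general idea: per-face emotion scores from a classifier are weighted against contextual attributes to decide whether a face should be flagged for human review. Every name, weight, and threshold here is an illustrative assumption, not the firm's method.

```python
from dataclasses import dataclass

@dataclass
class FaceObservation:
    # Hypothetical outputs of an emotion/attribute classifier for one detected face.
    anger: float          # 0.0 - 1.0
    stress: float         # 0.0 - 1.0
    nervousness: float    # 0.0 - 1.0
    age: int              # contextual attribute; unused in this simplified score
    known_identity: bool  # True if the face matched a watchlist entry

def risk_score(obs: FaceObservation) -> float:
    """Combine emotion scores and context into a single number.

    The weights are arbitrary placeholders chosen only to illustrate
    weighting emotional cues against contextual information.
    """
    emotional = 0.5 * obs.anger + 0.3 * obs.stress + 0.2 * obs.nervousness
    context = 1.2 if obs.known_identity else 1.0
    return emotional * context

def flag_for_review(obs: FaceObservation, threshold: float = 0.6) -> bool:
    # A real deployment would route flagged faces to a human operator,
    # not make an automated decision on its own.
    return risk_score(obs) >= threshold

if __name__ == "__main__":
    sample = FaceObservation(anger=0.8, stress=0.5, nervousness=0.3,
                             age=34, known_identity=False)
    print(flag_for_review(sample))  # True: weighted score 0.61 >= 0.6
```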
NTechLab claims that the technology is more than 94 percent accurate. If that claim holds up, municipalities that use it may be able to monitor situations in real time and stop crimes before they happen. The firm's clients are mostly retail businesses and security firms, but local, state, regional, or even federal governments could conceivably adopt the technology.
New Technologies for Preventing Crime
Technology has already changed the way the authorities fight crime and work to prevent it. The FBI has been using the Next Generation Identification (NGI) facial recognition system, which lets the agency search more than 411 million photos to identify suspects, and not just photos of people who have committed crimes: the system also searches visa and passport application photos held by the State Department. In fact, experts estimate that about 117 million Americans, around half of all adults in the U.S., are in the database. This kind of technology has also been deployed in airports since the 9/11 terror attacks.
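The NGI's internal matching algorithms are not public, but modern face identification systems in general map each face to an embedding vector and then search an enrolled gallery by similarity. The sketch below shows that generic idea with placeholder random vectors standing in for embeddings that a trained face-recognition network would normally produce; it is an assumption-laden illustration, not the FBI's implementation.

```python
import numpy as np

def top_matches(query: np.ndarray, gallery: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k gallery embeddings most similar to the query.

    `query` is a (d,) face embedding; `gallery` is an (n, d) array of enrolled
    embeddings. Both would come from a face-recognition network in practice.
    """
    # Cosine similarity: normalize, then take dot products against the gallery.
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = g @ q
    # Highest-scoring candidates first; these are investigative leads,
    # not positive identifications.
    return np.argsort(scores)[::-1][:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = rng.normal(size=(1000, 128))             # 1,000 enrolled faces, 128-d embeddings
    query = gallery[42] + 0.05 * rng.normal(size=128)  # a noisy copy of entry 42
    print(top_matches(query, gallery))                 # entry 42 should rank first
```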
Facial recognition is also being used to boost security in other contexts: HSBC uses facial recognition software in place of more traditional security measures, as does Lloyds in partnership with Microsoft. While the technology is primarily intended to strengthen online security, it also works, in effect, to prevent crime and fraud.
So, could a dystopian future like the one depicted in Minority Report, in which innocent people are imprisoned without ever committing actual crimes, become reality in a world that makes use of this technology? The FBI has responded to criticism of its use of the NGI system by saying that it uses the software to generate investigative leads, not to make positive identifications. However, state and local law enforcement agencies also have access, and they may follow different policies or de facto procedures. In October 2016, Wired reported that more than 40 civil liberties groups had asked the Civil Rights Division of the Justice Department (now headed by Jeff Sessions) to evaluate how the technology is used around the country and to issue guidance. As yet, the matter remains unresolved.