In Brief
AI is being integrated into surveillance systems. While smarter technology could help law enforcement, it could also be abused in the wrong hands.
The All-Seeing AI
Surveillance cameras are ubiquitous, providing constant footage of everything from public spaces like major intersections and bank lobbies to private front doorsteps. But integrating artificial intelligence (AI) with security camera technology, like CCTV, is a recent development. That integration represents a significant improvement over prior surveillance, and one Florida-based company is at the forefront.
IC Realtime is responsible for introducing Ella, software that uses AI to analyze footage and return search results in response to natural language queries. Billed as “the Google of CCTV” by The Verge, Ella allows users to search a footage database for key indicators, down to specific colors of clothing or models of cars. The results are sorted by time period, and users rate them with a simple thumbs up or thumbs down to improve future queries.
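IC Realtime has not published Ella's internals or API, but the workflow the article describes (a natural-language query over pre-tagged footage, results sorted by time, and thumbs up/down feedback that tunes future rankings) can be sketched in a few lines. Everything below, including the `FootageIndex` and `Clip` names, is a hypothetical illustration, not Ella's actual design:

```python
from dataclasses import dataclass, field


@dataclass
class Clip:
    """A unit of footage with labels an AI model might attach to it."""
    clip_id: str
    timestamp: str          # ISO-8601 string, so string sort == time sort
    tags: set               # e.g. {"jeep wrangler", "eastbound"}
    score: float = 0.0      # cumulative thumbs up/down feedback


class FootageIndex:
    """Toy sketch of an Ella-style search over pre-tagged footage."""

    def __init__(self):
        self.clips = []

    def add(self, clip):
        self.clips.append(clip)

    def search(self, query):
        # Match any query word against any word in a clip's tags.
        terms = set(query.lower().split())
        hits = [
            c for c in self.clips
            if terms & {word for tag in c.tags for word in tag.split()}
        ]
        # Sort by time period first, then by feedback score (higher first).
        return sorted(hits, key=lambda c: (c.timestamp, -c.score))

    def feedback(self, clip_id, thumbs_up):
        # A thumbs up/down nudges the clip's rank in future queries.
        for c in self.clips:
            if c.clip_id == clip_id:
                c.score += 1.0 if thumbs_up else -1.0


# Example mirroring the CEO's robbery scenario: search for a Jeep Wrangler.
idx = FootageIndex()
idx.add(Clip("cam1-0930", "2018-01-23T09:30", {"jeep wrangler", "eastbound"}))
idx.add(Clip("cam2-0931", "2018-01-23T09:31", {"sedan", "blue"}))
results = idx.search("jeep wrangler")  # only the first clip matches
```

The feedback loop is the interesting design choice: rather than retraining the vision model, a simple per-clip score lets user ratings reorder results immediately, which matches the article's description of thumbs ratings improving later queries.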
While Ella’s functionality may not seem like much, its implications for surveillance and law enforcement are striking. Investigators may no longer have to spend hours poring over surveillance footage, manually searching for suspects matching key indicators. Instead, Ella could do it for them, and potentially help police catch criminals sooner.
IC Realtime CEO Matt Sailor spoke with The Verge about the company’s technology: “Let’s say there’s a robbery and you don’t really know what happened. But there was a Jeep Wrangler speeding east afterward. So we go in, we search for ‘Jeep Wrangler,’ and there it is.”
Pandemic of Paranoia
While homeowners are embracing surveillance as a new option for increased security, some people are skeptical about surveillance cameras’ growing presence in society. Surveillance tech, particularly when coupled with AI like Ella, could be abused in the wrong hands.
In China, where there are already 170 million CCTV cameras in place, with 400 million more set to be installed by 2020, the potential for abuse is clear. In a recent report from the BBC, reporter John Sudworth had his photo flagged to see how long it would take authorities to use their network of cameras and facial recognition software to capture him. Shockingly, it only took seven minutes. This technology could easily be abused by the government to track down journalists to limit information sharing, or monitor political opposition.
Of course, any improvement to the efficacy of law enforcement investigations is a welcome development, and Ella has the potential to revolutionize police work. But at what point does the erosion of privacy outweigh the gains in productivity and crime detection? Rapidly developing AI programs will soon force society to decide what it wants to prioritize — freedom or function.