FBI’s “Next Generation” Facial Recognition Software Can Be Wrong 1 Out of 5 Times


The FBI’s “Next Generation Identification” (NGI) project, currently under development, will be the largest biometric database in the world when complete. The records available for cross-reference through NGI will include various biometric markers used in criminal investigation and identification, including long-archived fingerprints, “DNA profiles, voice identification profiles, palm prints, and photographs.” The system will be equipped with facial recognition software to analyze images captured and retained by cameras connected to the database.


The majority of the information publicly available about NGI is so thanks to a FOIA request and subsequent lawsuit filed by the Electronic Privacy Information Center (EPIC).

In 2012, EPIC filed two Freedom of Information Act (FOIA) requests for documents related to the FBI’s NGI system. One request sought technical specifications related to the roll out of the NGI system. The other sought contracts between the FBI and the private entities developing the system. The FBI did not promptly comply with the law’s requirements and has so far failed to give EPIC any responsive documents. After the agency failed to comply with the Freedom of Information Act, EPIC filed a lawsuit in federal district court.

The documents obtained by EPIC are available on their website, where they describe more fully not only the breadth of content the new system will curate, but also the wide scope of public and private entities that will have access to the database’s very sensitive information:

The NGI database will be used for both law enforcement and non-law enforcement purposes. It will be available to law enforcement agencies at the local, state, and federal level. But it will also be available to private entities, unrelated to a law enforcement agency.

Using facial recognition on images of crowds, NGI will enable the identification of individuals in public settings, whether or not the police have made the necessary legal showing to compel the disclosure of identification documents. The New York City Police Department began scanning irises of arrestees in 2010; these sorts of records will be entered into NGI. The Mobile Offender Recognition and Information System (“MORIS”), a handheld device, allows officers patrolling the streets to scan the irises and faces of individuals and match them against biometric databases. Similarly, children in some school districts are now required to provide biometric identifiers, such as palm prints, and are also subject to vein recognition scans. Clear, a private company offering identity services based on biometric identifiers, attempted to sell the biometric database of its users after its parent company, Verified Identity Pass, declared bankruptcy. The transfer of the biometric database was blocked by a federal district court judge.

While “biometrics” is a fairly new term in public conversation, its standardized use by law enforcement is nearly a century old. The term describes all identification methods that assign a system of measurements (a metric) to features of the human body, with the ultimate goal of providing unique markers that can be attributed exclusively to one person, markers which that person would be unable to themselves corrupt or exchange; the category includes fingerprints.

What is truly new, and at the core of the problem with the current application of biometric technology, is the AMOUNT of data being collected and retained, the AUTOMATION necessary to realistically process and analyze it, and the INTEGRATION of the information into a form and location accessible to an unprecedented multitude of public and private security and intelligence interests, many of which are subject to zero public oversight.


Fingerprints long collected and now part of IAFIS, the FBI’s Integrated Automated Fingerprint Identification System,* will now be combined with DNA, images, voice recordings, iris scan data, and palm prints: terabytes upon terabytes of data which must be sifted and gleaned for information that can be used in the primary business of the threat-intelligence machinery, namely producing reports and risk assessments for policy makers and alerts for law enforcement and private security. Upon encountering law enforcement, individuals (people of color especially) might be immediately submitted to the system, in the process undoubtedly having further information obtained from them in the form of fingerprints (and, in many cases of late, DNA by default), which can then remain in the system, available to the security market at large, indefinitely.

*(IAFIS responds to requests 24 hours a day, 365 days a year, with automated fingerprint search capabilities, latent search capability, electronic image storage, and electronic exchange of fingerprints and responses. … Not only fingerprints, but corresponding criminal histories; mug shots; scars and tattoo photos; physical characteristics like height, weight, and hair and eye color; and aliases. The system also includes civil fingerprints, mostly of individuals who have served or are serving in the U.S. military or have been or are employed by the federal government. The fingerprints and criminal history information are submitted voluntarily by state, local, and federal law enforcement agencies.)


This deluge of data cannot be analyzed manually to timely effect, not by a long shot, so law enforcement ID systems increasingly include software with higher and higher levels of automation. Often systems are upgraded using only existing hardware: already-installed cameras, for instance, were networked through the TrapWire software to provide automated threat-response intelligence to the agency clients of the software’s developer. These systems, many of which attempt to provide agencies with “predictive policing” capabilities, have an abysmal track record. The National Journal reports, also from documents obtained by EPIC, that the accuracy standards set for the Next Generation Identification system’s facial recognition software are disturbing at, well, face value:

A 2010 report recently made public by the Electronic Privacy Information Center through a Freedom of Information Act request states that the facial-recognition technology “shall return an incorrect candidate a maximum of 20% of the time.”

When the technology is used against a searchable repository, it “shall return the correct candidate a minimum of 85% of the time.”

via FBI’s Facial Recognition Software Could Fail 20 Percent of the Time – NationalJournal.com.
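
Those two thresholds measure different things, and they are not simple complements of one another: a search against a gallery typically returns a ranked candidate list, so a single search can include the correct person while still putting a wrong person at the top. As a rough illustration only, here is a minimal Python sketch of how the two rates could be tallied; the candidate-list interpretation, the evaluate() helper, and the toy data are assumptions made for this example, not the FBI’s documented test methodology.

```python
# Illustrative only: a toy tally of the two rates quoted above, under the
# ASSUMPTION (ours, not the FBI's documented methodology) that each search
# returns a ranked candidate list, that "returned an incorrect candidate"
# means the top-ranked candidate is the wrong person, and that "returned
# the correct candidate" means the right person appears anywhere in the list.

def evaluate(searches):
    """searches: list of (ranked_candidate_list, true_identity) pairs."""
    incorrect_top = 0    # searches whose top-ranked candidate is wrong
    correct_in_list = 0  # searches whose list contains the right person
    for candidates, truth in searches:
        if candidates and candidates[0] != truth:
            incorrect_top += 1
        if truth in candidates:
            correct_in_list += 1
    n = len(searches)
    return incorrect_top / n, correct_in_list / n

# Hypothetical batch of five searches against a gallery of enrolled identities.
toy_searches = [
    (["alice", "carol"], "alice"),  # right person ranked first
    (["bob", "alice"], "alice"),    # right person in list, wrong person first
    (["dave"], "erin"),             # right person not returned at all
    (["erin"], "erin"),
    (["frank", "grace"], "frank"),
]

incorrect_rate, correct_rate = evaluate(toy_searches)
print(f"incorrect-candidate rate: {incorrect_rate:.0%}")  # spec allows at most 20%
print(f"correct-candidate rate:   {correct_rate:.0%}")    # spec requires at least 85%
```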

80% might be overshooting, it seems, as additional EPIC reporting shows that Virginia Beach has a facial recognition system currently in operation that has never produced a match or an arrest since its installation in 2002. And Boston’s Logan Airport ran two separate facial recognition system tests at its security checkpoints over a three-month period, using volunteers posing as terrorists, and posted disappointing results. Throughout the testing period, the systems correctly identified the volunteers 153 times but failed to identify them 96 times, a success rate of only 61.4 percent.
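
For what it’s worth, the 61.4 percent figure follows directly from those counts; the short sketch below (hypothetical variable names, same numbers as reported above) simply reproduces the arithmetic.

```python
# Reproducing the Logan Airport success rate from the counts reported above.
correct = 153  # volunteer "terrorists" correctly identified
missed = 96    # volunteers the systems failed to identify

success_rate = correct / (correct + missed)
print(f"success rate: {success_rate:.1%}")  # -> 61.4%
```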
