GAO Report - Artificial Intelligence “privacy and accuracy-related risks”
Are you saying that using AI creates false positives? I also remember a certain PITA account that tweeted in Feb 2017: watch the facial recognition software, nothing good will come of it. HERE WE ARE
Brief factual background, because the GAO Report was a long time in the making (or skip ahead)…
Should you want to take a trip down Twitter memory lane - this Nov 2020 archived thread was a harbinger of this week’s GAO Report. And in case you forgot what DHS did during the spring/summer of 2020 - sending surveillance planes to monitor peaceful protestors and running their pictures through bad, and I mean bad, facial recognition software with janky AI - see the November 2020 Twitter thread (archived)
Just in case you want to read about Portland and what the DHS-OIG discovered, see this Twitter thread archive too -because we all learned about Operation Diligent Valor via deposition transcripts
And let’s not forget about how the IRA managed to use artificial intelligence to create whole online personas, acting like Reporters —wait for the whole September 2020 archived thread to populate
Man, that took me a bit of time to carefully run a deeper Wayback search. In retrospect, my god, Twitter is Toxic AF - revisiting some of the trolling, for a brief moment I felt a wave of PTSD, because under the Trump presidency cruelty was the feature, and it seems like he stress-tested all of our federal agencies. Then of course the archive gods gave me a gift - this June 2020 Twitter Thread - ClearviewAI - but it’s that multi-word string that, over a year later, makes me snort laugh
Okay now that you have a bit of a background let’s go ahead and dive into the recently released GAO report - shall we?
On June 29, 2021 the GAO publicly released GAO Report 21-518; it is roughly an 89-page report. I’m going to let you in on a little secret
… always read the Appendices…
especially as it relates to GAO Reports. Those of us in DC know exactly why I’m saying this - you know what you know until you don’t.
Yes, the date on GAO Report No. 21-518 reads June 3, 2021, but the truth is the GAO didn’t publicly release the report until late in the afternoon of June 29, 2021, and it took me a while to do the backend research and quadruple-check my facts. Because I felt my readers needed the background information, Congressional communications, and other public reporting to fully understand the context and content of the GAO Report. I strive to be thorough and I’ll take accuracy over expediency any day. But it’s a very document- and info-heavy article, and I make no apologies for it.
FACIAL RECOGNITION TECHNOLOGY
Federal Law Enforcement Agencies Should Better Assess Privacy and Other Risks
To be clear, this report strictly concerns federal law enforcement - the twenty agencies that reported “owning systems with facial recognition” collectively account for roughly 120,000 federal law enforcement officers1
GAO surveyed 42 federal agencies that employ law enforcement officers about their use of facial recognition technology. Twenty reported owning systems with facial recognition technology or using systems owned by other entities, such as other federal, state, local, and non-government entities
To be fair, yes, facial recognition AI software can be used for good. I’m not inherently distrustful of this ever-evolving technology - meaning if it can get reliable information to law enforcement with equal accuracy and expediency, then of course it’s something to invest in. You’ll note that the Department of Veterans Affairs is uniquely suited to be not only an incubator of emergent technology but also to use facial recognition AI for patient care. But it’s a very delicate balancing act - with any technology, the pros and cons should be weighed before an enterprise roll-out.
Congress - Mitch, I am looking at you, you legislative grim reaper. Do. Your. Job.
Also, I will say this until I’m blue in the face - CONGRESS is dropping the ball. Technology is moving at lightning-fast speed and Congress continues to be 14 to 30 months too late, meaning there are no current federal laws that fully regulate facial recognition AI. In the 116th Congress:
S. 3284 “To create a moratorium on the government use of facial recognition technology until a Commission recommends the appropriate guidelines and limitation for use of facial recognition technology” went to Mitch McConnell’s legislative graveyard where it died
S.2878 - Facial Recognition Technology Warrant Act of 2019 - To limit the use of facial recognition technology by Federal agencies, and for other purposes. Introduced Nov 2019 & then DOA
117th Congress - H.R.3907 & S. 2052 “To prohibit biometric surveillance by the Federal Government without explicit statutory authorization and to withhold certain Federal public safety grants from State and local governments that engage in biometric surveillance”
an analysis of facial recognition tools by the National Institute of Standards and Technology found that Black, Brown, and Asian individuals were up to 100 times more likely to be misidentified than white male faces. Three Black men have already been wrongfully arrested based on a false facial recognition match, and earlier this month, more than 40 leading civil rights and privacy groups called for a moratorium on law enforcement entities’ use of this technology.
Facial Recognition Technology: Federal Law Enforcement Agencies Should Better Assess Privacy and Other Risks 2 3
GAO-21-518
Published: Jun 03, 2021. Publicly Released: Jun 29, 2021.
Agencies reported using the technology to support several activities (e.g., criminal investigations) and in response to COVID-19 (e.g., verify an individual’s identity remotely).
Six agencies reported using the technology on images of the unrest, riots, or protests following the death of George Floyd in May 2020. Three agencies reported using it on images of the events at the U.S. Capitol on January 6, 2021. Agencies said the searches used images of suspected criminal activity
How Does Facial Recognition Work?
The figure included in the newly released GAO report does an excellent job of explaining how AI-powered facial recognition software really works - I now refer you to page 11 of the report
There are two known, but rather general, categories for the use of facial recognition powered by AI:
verification: often officially referred to as “one-to-one searches,” which compare a photograph against the photo identification presented (like a driver’s license and/or passport)
identification: unlike verification, which is limited to “one-to-one,” identification uses a one-to-many algorithm, meaning it compares a single photo of an individual against a large gallery (or database) of photos of numerous individuals to see if there’s a potential match
As depicted below (and found on page 6 of the GAO Report), “identification” is what law enforcement uses as part of its investigatory process in criminal matters. This applies to federal, state, and local criminal investigations. A quick code sketch of the two modes follows below.
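For readers who think better in code, here is a minimal sketch of the difference between the two modes, assuming you already have face embeddings (numeric vectors) from some face-encoding model; the function names, the cosine-similarity metric, and the 0.8 threshold are all illustrative assumptions on my part, not anything prescribed by the GAO report or any vendor.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (vectors produced by some face-encoding model)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, id_photo: np.ndarray, threshold: float = 0.8) -> bool:
    """Verification (one-to-one): does the probe photo match the single ID photo presented?"""
    return cosine_similarity(probe, id_photo) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray], threshold: float = 0.8) -> list[tuple[str, float]]:
    """Identification (one-to-many): rank an entire gallery by similarity to the probe photo."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score) for name, score in scored if score >= threshold]
```

The practical difference is the failure mode: a bad one-to-one match inconveniences one traveler, while a bad one-to-many match can put an innocent person on a candidate list.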
Vigilant Solutions (VS) and Clearview AI (CVAI)
The history of Clearview AI is - not exactly the best
August 2019 ACLU of Northern California, Facial Recognition Technology Falsely Identifies 26 California Legislators with Mugshots
Facial recognition misclassifies transgender and non-binary people, study finds, Mic, (Oct. 30, 2019)
June 24, 2020 New York Times Wrongfully Accused by an Algorithm
July 7, 2019 New York Times ICE Used Facial Recognition to Mine State Driver’s License Databases
Senator Markey’s 2020 communications and the responses from Clearview are embedded below in chronological order (oldest to newest) - Clearview (which launched as a startup in 2017) amassed a database of 3+ billion pictures
January 2020 Sen Markey’s letter to CVAI - also read pages 2 and 3 for the itemized list of documents and answers requested of CVAI: “..letter querying Clearview AI CEO Hoan Ton-That about the company’s facial recognition app, which has reportedly been sold to more than 600 law enforcement agencies. Recent reporting in the New York Times revealed that Clearview’s app allows users to capture and upload photos of strangers, analyze the photographed individuals’ biometric information, and provide users with existing images and personal information of the photographed individuals online.”
January 31, 2020 CVAI’s response to the specific question from Senator Markey concerning the company’s cybersecurity infrastructure
February 2020 CVAI was hacked and records stolen - this Daily Beast article is one of the more detailed tick-tocks of what the cybercriminals purloined.
March 3, 2020 Sen Markey sent the following follow-up letter to CVAI (pay attention to pages 3-5 and the questions asked; it relates to COPPA - just Google my former Twitter handle plus COPPA and FTC, I tweeted about this a lot)… in the letter Sen Markey specifically asked about Facebook, Twitter, Instagram, TikTok, Vine, etc. users’ deletion of their images and CVAI’s policy, or lack thereof - they’ve never obtained the lawfully required parental approval for users under the age of 13.
March 24, 2020 CVAI’s response to Sen Markey’s March 3, 2020 letter - CVAI’s answer to one question is underwhelming
“The test compared the headshots from all three legislative bodies against Clearview AI’s proprietary database (at that time) of 2.8 billion images (112,000 times the size of the database used by the ACLU). The Panel determined that Clearview AI rated 100% accurate, producing instant and accurate matches for every one of the 834 federal and state legislators in the test cohort”
April 30, 2020 Sen Markey’s 3rd follow-up letter to CVAI: “Please identify any government entities, including state government entities, with which Clearview has engaged on the use of Clearview’s technology for contact tracing or other responses to the ongoing pandemic”
May 15, 2020 - CVAI’s letter responding to the April 30th letter; again, their response is underwhelming and presents more questions than actual answers: “..we have engaged in confidential discussions with a small number of public officials to explore how we might be of assistance. Many other technology companies have done the same. If we can help stop the spread of this lethal pandemic, we welcome the opportunity to further Clearview AI's core mission - protecting our communities”
June 8, 2020 Sen Markey’s 4th follow-up letter to CVAI - which had previously rejected requests for neutral third-party oversight and/or an audit, and failed to answer questions about the privacy and civil liberties implications.
“Civil liberties experts have expressed concerns that unregulated deployment of facial recognition technologies could allow law enforcement agencies to identify and arrest protesters long after the demonstrations end”
“…your company has not been adequately transparent about several issues, including how law enforcement agencies procure access to Clearview AI’s app; how Clearview AI ensures that the software will not be misused; and whether Clearview AI’s technology is free of dangerous biases and inaccurate results…”
June 11, 2020 Sen Markey’s letter to Attorney General Barr concerning the covert domestic surveillance of peaceful protestors and the FBI, DEA, and ATF use of CVAI (the letter has been 404’d but the press release is still live)
“…today queried Attorney General William P. Barr following reports that the Department of Justice (DOJ) has approved surveillance of peaceful protestors participating in demonstrations inspired by the killing of George Floyd. Senator Markey highlighted a number of reports that the federal government has authorized different means of surveillance against protestors in recent weeks”
These include DOJ granting a request from the Drug Enforcement Administration (DEA) to engage in "covert surveillance" and conduct “interviews and searches” in response to the ongoing demonstrations
Vigilant Solutions - used by ICE and DHS, but mainly by state and local law enforcement. I ran a recent contract search for awards to Vigilant Solutions via FPDS and VIGILANT VIDEO INC via FPDS; a sketch of how to run that kind of query yourself follows below.
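If you want to reproduce that kind of contract search programmatically, here is a minimal sketch that pulls award entries from FPDS’s public ATOM feed. The ezsearch ATOM endpoint and the FEEDNAME=PUBLIC parameter are the documented public feed, but treat the exact query syntax and the paging parameter as assumptions to verify against the FPDS search page itself.

```python
import requests
import xml.etree.ElementTree as ET

# FPDS exposes ezsearch results as a public ATOM feed; q is the same free-text
# string you would type into the FPDS search box (verify the syntax yourself).
FEED_URL = "https://www.fpds.gov/ezsearch/FEEDS/ATOM"
ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

def fpds_award_titles(query: str, pages: int = 1) -> list[str]:
    """Return the title of each award entry matching a free-text FPDS query."""
    titles: list[str] = []
    for page in range(pages):
        resp = requests.get(
            FEED_URL,
            params={"FEEDNAME": "PUBLIC", "q": query, "start": page * 10},
            timeout=30,
        )
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        for entry in root.findall("atom:entry", ATOM_NS):
            title = entry.find("atom:title", ATOM_NS)
            if title is not None and title.text:
                titles.append(title.text)
    return titles

# Example: fpds_award_titles('"VIGILANT SOLUTIONS"')
```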
…also, I can’t believe I have to say this again - it’s about professional respect and ethics. If you’re going to pull down my files from my public drive and then tweet them like they’re your research, don’t think that I haven’t noticed; I have, and it’s annoying. This does not apply to reporters who have messaged me asking for permission and/or offered attribution, which I largely declined because I’m not in this for the notoriety…
Similar to Clearview AI - Vigilant Solutions has entirely questionable practices and ethics - because civil rights and privacy appeared to be optional under the Trump Administration
June 2019 - ACLU recently revealed that over 9,000 ICE agents have access to billions of license plate scans in the Vigilant database, collected by law enforcement agencies and private companies operating automated license plate readers (ALPR). Location information over time reveals, as the U.S. Supreme Court said last year, an individual’s “familial, political, professional, religious, and sexual associations.”
December 2018 via ACLU and Forbes: “ICE may also be tracking drivers on behalf of other federal and local agencies. The search logs also reference Interpol Red or Blue Notices, which foreign countries use to seek the arrest of government critics and individuals who haven’t been convicted of a crime”
March and February 2018 - ICE used a feature called “Stakeout Browsing Mobile.” According to Vigilant materials and training video, this allows a user to search for license plates scanned in the vicinity of a particular location. A “Stakeout Browsing Mobile” query by ICE in March 2018 yielded almost 10,000 results, potentially subjecting drivers to dragnet surveillance.
The search logs are available here. Please note that the ACLU redacted certain cells that contained personally-identifying information by placing a black box in that cell.
So now that you have a bit of background on the TWO service providers - the GAO Report elaborates on privacy concerns and certain statutory limitations. I now refer you to page 7 of the GAO Report
Protip - cutting straight to the GAO’s “Recommendations and Appendices” tells you the report’s findings and each agency’s response to the pre-publication draft of the GAO Report. For example, the GAO made a total of 28 recommendations - for the agencies the GAO identified, the recommendations were largely nearly verbatim - with a few agencies contradicting themselves, which the GAO noted
…should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities
…after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks
Department of the Treasury
…while it largely concurred with the GAO’s recommendations, stating in part that the Internal Revenue Service’s Criminal Investigation Division:
does have mechanisms in place capable of tracking non-federal systems used by employees.
yet the Criminal Investigation Division told the GAO it does not track what systems are used by employees because it is not the owner of these technologies. Moreover, as noted in our report, the Criminal Investigation Division used facial recognition technology owned by other governmental entities at the federal and state, local, tribal, or territorial level
…therefore the GAO continues to believe that the Criminal Investigation Division should, as part of implementing these recommendations, ensure mechanisms are in place to appropriately track what non-federal systems (e.g., systems owned by other federal and state agencies) are used to support its investigative activities.
Appendix II - Systems with Facial Recognition Technology Owned by Federal Agencies that Employ Law Enforcement Officers;
Eight of the 42 surveyed agencies reported owning systems with facial recognition technology.
Eight summaries are presented below—that is, one for each of the eight agencies. Each summary includes a description of the agency’s mission, an overview of the agency’s ownership and use of other entities’ systems, and detailed descriptions of each system owned by the agency. Information in these summaries was provided by the respective agency.
See figure 4 for an illustration of the layout of the summaries, including a description of each section in the summaries.
FBI Owned Facial Recognition System:
The FBI Summarized their system thusly:
From January 2015 through March 2020, the FBI owned six systems with facial recognition technology; however, only one of these systems was in operation as of March 31, 2020. The FBI also used facial recognition technology owned by other federal, state and local, and non-government entities.
The FBI’s Next Generation Identification Interstate Photo System
allows authorized law enforcement officials to conduct facial recognition searches using a repository of approximately 42.8 million photos
To conduct a search, a user uploads a probe photo and the facial recognition software compares the probe photo against the repository to find likely matches.
The likely matches are returned in a ranked candidate list of two to 50 photos, depending on the user’s specification (see the sketch after this list).
From 2015 through 2019, the total number of facial recognition searches conducted on this system was 232,915.
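That “ranked candidate list of two to 50 photos” detail is essentially a bounded top-k retrieval. Here is a minimal sketch of what that could look like; the similarity metric, data structures, and names are my own illustrative assumptions, not the FBI’s actual implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def candidate_list(probe: np.ndarray, repository: dict[str, np.ndarray], requested_size: int) -> list[tuple[str, float]]:
    """Rank repository photos against a probe photo, clamped to the 2-to-50 range the GAO describes."""
    size = max(2, min(50, requested_size))  # user-specified, but bounded per the report
    scored = sorted(
        ((photo_id, cosine_similarity(probe, emb)) for photo_id, emb in repository.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return scored[:size]  # best matches first; a human examiner still has to review them
```

Note that nothing in a list like this is a positive identification; every entry is just a “maybe,” which is exactly where the wrongful-arrest cases mentioned above started.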
The FBI has three other systems that are not in “full operation” as of March 2020 - the “government off-the-shelf systems” are:
Horus - FBI forensic examiners compare two images to determine whether faces within the images represent the same individual (i.e., one-to-one comparisons). The FBI is researching Horus to determine whether this type of system could be incorporated into the FBI’s one-to-one comparisons process
Rank One - FBI is using Rank One, a government-off-the-shelf product, for research purposes. Specifically, the FBI is researching Rank One to determine whether this type of system could be incorporated into the FBI’s one-to-one comparisons process
Automatic Face Detection and Recognition/Cluster Base - FBI used Automatic Face Detection and Recognition/Cluster Base, a government-off-the-shelf product, for research purposes and for educational purposes in demonstrating facial recognition technology - As of March 31, 2020, the system was not in operation and no longer used for research purposes.
There are two previously used systems, but it’s apparent that for the FBI (as of March 31, 2020) the “facial recognition technology software was not in operation…has no plans to use the software…and/or not in operation and no longer undergoing testing.” I now refer you to page 45 of the report
Camera with Facial Recognition Software - in late 2015, the FBI purchased a commercial “one camera that came with facial recognition software.” The camera is similar to the one used to take driver’s license and passport photographs, and it is used for undercover operations. The equipment is solely used for this purpose and the FBI has never used the facial recognition capability that was provided with the camera.
NeoFace Reveal (a commercial NEC system, see more here - here and here) is the other previously used system, according to the recently published GAO Report
Federal Bureau of Prisons (BOP) -Department of Justice
So, similar to the FBI’s agency-owned system, the BOP also owns its own facial recognition software - but there’s an anomaly in the info the BOP provided to the GAO (yes, I know it’s a tiny detail)
“the BOP owned one system with facial recognition technology that was in operation. BOP did not use facial recognition technology owned by other federal, state or local, and non-government entities…Number of photos in database - approximately 8,000”
Do you understand why that 8,000 number is kind of troubling? According to recent BOP data:
153,996 total federal inmates, of which:
9,368 federal inmates in privately managed facilities and
14,990 federal inmates in other types of facilities.
This BOP data was last updated on July 8, 2021 - so it would be accurate to state that the BOP data is pretty damn fresh, no? Which begs the follow-up question: if the BOP has its own facial recognition software, then why does its database hold only about 8K photos when the total inmate population is about 154K prisoners (quick arithmetic below)… but I suppose that’s none of my business
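A quick back-of-the-envelope check, using only the two figures quoted above (and assuming, generously, one photo per inmate):

```python
# Back-of-the-envelope: how much of the BOP population could an
# 8,000-photo gallery even cover, assuming one photo per inmate?
bop_gallery_photos = 8_000          # per the GAO summary of BOP's system
total_federal_inmates = 153_996     # BOP population data as of July 8, 2021

coverage = bop_gallery_photos / total_federal_inmates
print(f"Gallery covers at most {coverage:.1%} of the inmate population")
# -> Gallery covers at most 5.2% of the inmate population
```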
U.S. Customs and Border Protection owns TWO facial recognition systems & also uses other agencies’ databases
To be fair, this kind of makes sense, but it is still kind of unnerving to read just how much data - specifically photographs - CBP has
From January 2015 through March 2020, CBP reported owning two systems with facial recognition technology in operation—
Traveler Verification Service (TVS) or see DHS/CBP/PIA-056
Automated Targeting System (ATS) or see DHS/CBP/PIA-006(e)
According to CBP, these systems allow CBP to identify travelers and provide tools to collect and disseminate information on individuals who could pose a risk to the country, among other things. In addition, CBP also used facial recognition technology owned by other federal, state and local, and non-government entities.
Traveler Verification Service uses facial recognition technology to verify the identity of international travelers entering and exiting the United States…the system searches DHS databases of photos associated with individuals listed on the travel manifest, and it then creates a pre-staged “gallery” of templates created from those photos. CBP uses these “galleries” for matching purposes only and deletes them from the system within 12 hours (read the footnotes). A rough sketch of that lifecycle follows below.
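To make the “pre-staged gallery, deleted within 12 hours” lifecycle concrete, here is a minimal sketch; the class, the method names, and the idea of a TTL purge are illustrative assumptions on my part, not CBP’s actual implementation.

```python
from datetime import datetime, timedelta, timezone
from typing import Callable, Optional

GALLERY_TTL = timedelta(hours=12)  # per the GAO: galleries are deleted within 12 hours

class PrestagedGallery:
    """Templates built ahead of time from manifest-linked photos, then purged on a timer."""

    def __init__(self) -> None:
        self._templates: dict[str, tuple[object, datetime]] = {}  # traveler_id -> (template, created_at)

    def prestage(self, traveler_id: str, template: object) -> None:
        """Store a template generated from a DHS-held photo for a manifested traveler."""
        self._templates[traveler_id] = (template, datetime.now(timezone.utc))

    def purge_expired(self) -> None:
        """Drop any template older than the 12-hour retention window."""
        now = datetime.now(timezone.utc)
        self._templates = {
            tid: (tpl, created)
            for tid, (tpl, created) in self._templates.items()
            if now - created < GALLERY_TTL
        }

    def match(self, live_capture: object, score: Callable[[object, object], float]) -> Optional[str]:
        """Compare a live capture against the pre-staged gallery; return the best traveler_id, if any."""
        self.purge_expired()
        best_id, best_score = None, 0.0
        for tid, (tpl, _) in self._templates.items():
            s = score(live_capture, tpl)
            if s > best_score:
                best_id, best_score = tid, s
        return best_id
```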
Automated Targeting System CBP uses the system for the following populations:
1) individuals seeking to enter or exit the U.S. whose names appear on travel manifests (e.g., flight or vessel manifests);
2) individuals applying for CBP programs facilitating travel to the U.S.; and
3) subjects of interest who require additional research and analysis. CBP will match photographs for these three populations against a predetermined gallery of photographs.
Now, if I wanted to be a complete PITA, I’d ask questions like: so, DHS Secretary, do TSA, ATS, and/or TVS obtain written parental permission for children under the age of 13 - cough, COPPA? Spoiler: no, they do not, and I can’t tell you why, because that could get me in lukewarm water for stating an inconvenient fact that no one wants to talk about.
Transportation Security Administration (TSA)
The TSA uses various systems, and that’s not at all surprising - but geographically speaking, the trial/pilot programs are beyond vexing. Because if the focus is on inbound international flights/passengers, why wouldn’t the TSA “pilot” CAT-2 at the top-ranked inbound/outbound international airports?
Hartsfield–Jackson Atlanta ranks # 1
Los Angeles International Airport ranks # 2
O'Hare International Airport ranks # 3
Dallas/Fort Worth International Airport ranks # 4
Denver International Airport #5
Credential Authentication Technology-2 (CAT-2) or see TSA Identity Management: CAT-2 Overview Nov 2017
As of March 31, 2020, CAT-2 was in development. TSA began conducting a demonstration of CAT-2 at Reagan National Airport in August 2020 (an airport that ranks around 30th for international inbound/outbound flights). At the time, TSA was also planning to conduct field test events at Reagan National Airport, Miami International Airport, Phoenix Sky Harbor International Airport, and Indianapolis International Airport, and a demonstration at Denver International Airport.
Automated Credential Authentication Technology (AutoCAT) - see the Feb 2021 TSA press release, the April 2021 Capability Acceptance Process Acceptable Capability List TSA Report, or DHS/TSA/PIA-046(a). Conceptually, of course, this makes sense - statutorily, see the TSA Modernization Act of 2018 - and here’s a 2019 NBC local affiliate report. Also, it’s great, but DHS storing that amount of biometric data is not exactly comforting given the aggressive cyber intrusions. Just a thought, not a sermon.
Police Service - Department of Veterans Affairs
What most might not know - the VA has:
National Artificial Intelligence Institute (NAII) with a Research and Development focus on Veterans
…seeks to develop AI research and development capabilities in VA to support Veterans, their families, survivors, and caregivers. The NAII designs and collaborates on large-scale AI R&D initiatives, national AI policy, and partnerships across agencies, industries, and academia. The NAII is a joint initiative by the Office of Research and Development and the Office of the Secretary's Center for Strategic Partnerships in VA. The NAII is dedicated to advancing AI research and development for real-world impact and outcomes to ensure Veteran health and well-being.
As the GAO Report notes - the VA uses several facial recognition programs
Motorola Avigilon Appearance Search, which launched in 2018; it’s a commercially available system with facial and weapons recognition capabilities, according to the VA Police Service.
AnyVision 4 (an Israeli facial recognition startup - I swear to baby Jesus, if we find out that Jared Kushner pressured VA Sec Wilkie, I’m going to lose my marbles), because it seems like Congress is asleep at the legislative wheel. If a facial recognition AI company is not American-based, then it seems like a no-brainer for Congress to act (maybe amend the FY2021 or FY2022 NDAA to REQUIRE federal agencies to solely use American-made products)…but I suppose that’s none of my business
a commercially available system that can compare video footage to photos stored in a central database, according to the VA Police Service. This technology can search for suspects in video footage and display detections in a timeline. “VA Police Service plans to use the software at the VA Medical Center at West Palm Beach to, among other things, respond to missing patients or those at risk of suicide, and to identify suspected or active threats”.
Extra reading, because if you read both recent GAO reports, well, you should be concerned. And I think it’s bonkers that Mitch McConnell has so much power that he can kill a bill, as he did in the 116th Congress.
Forensic Technology:
Algorithms Strengthen Forensic Analysis, but Several Factors Can Affect Outcomes
Published: Jul 06, 2021. Publicly Released: Jul 06, 2021.
Law enforcement agencies use forensic algorithms in criminal investigations to help assess whether evidence originated from a specific individual—improving the speed and objectivity of many investigations.
However, analysts and investigators face several challenges, such as difficulty interpreting and communicating results, and addressing potential bias and misuse.
We developed three policy options that could help with such challenges:
Increased training
Standards and policies on appropriate use
Increased transparency in testing, performance, and use of these algorithms
Forensic algorithms can help analyze latent fingerprints, faces, genetic information, and more
And if you’re wondering about my “those of us who are really in DC know to read the Appendices” line - welp, that’s because the Appendices contain the agency’s comments and any potential rebuttals, and they almost always contain a lot of factual information. That’s why I said those of us in DC know what we know, because we do, and we don’t need to pretend with comments like “my people” <—I find those who say that actually have no people at all. At any rate, I think you can tell this article took a while: not only reading and understanding the report, but tracking down the relevant underlying documents.
At any rate it’s Monday which means
I’ll be out of pocket until mid afternoon - so you have a fantastic Monday because mine will likely be similar to Dante’s Third Level: Wrath by 11AM <—snort
-Filey
120,000 Federal Law Enforcement Officers, data derived from Bureau of Justice Statistics, Federal Law Enforcement Officers, 2016 – Statistical Tables, NCJ 251922 - https://bjs.ojp.gov/content/pub/pdf/fleo16st.pdf
Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law GAO-15-621 Published: Jul 30, 2015. Publicly Released: Jul 30, 2015. https://www.gao.gov/assets/680/672127.pdf
Facial Recognition Technology: Privacy and Accuracy Issues Related to Commercial Uses GAO-20-522 Published: Jul 13, 2020. Publicly Released: Aug 11, 2020. https://www.gao.gov/assets/710/708201.pdf
Microsoft DIVEST from AnyVision - March 2020 Reuters https://www.reuters.com/article/us-microsoft-anyvision-idUSKBN21E3BA - last visited July 8, 2021 “…law firm found that AnyVision’s technology was in use at checkpoints in border crossings between Israel and the West Bank - as the startup had said - but that it had not fueled a mass surveillance program there, according to a copy of the audit’s findings posted on the website of M12, Microsoft’s venture fund.”
"CONGRESS is dropping the ball" is truly an understatement. They not only dropped, but they're actively stomping while they have inserted a lubricated needle into the valve all while leading gqp'ers bobert, mpg and getemgetz sit on said lubricated needle to maximize air loss...