15+ US City Bans · 3 EU Ban Categories · 50+ Laws Worldwide · $16B Market (2025)

1. Overview & Definitions

Facial recognition technology (FRT) uses AI to identify or verify individuals by analyzing facial features from images or video. It is among the most regulated and contested AI applications globally, touching on fundamental rights to privacy, non-discrimination, and freedom of assembly.

1.1 Types of Biometric AI

| Technology | Description | Use Cases | Regulatory Concern |
|---|---|---|---|
| Facial Recognition (1:1) | Verification: compares a face to a known reference image | Phone unlock; border control; banking KYC | Consent; data security; accuracy |
| Facial Recognition (1:N) | Identification: searches a face against a database of many | Law enforcement suspect identification; missing persons | Mass surveillance; false positives; civil liberties |
| Real-Time Remote Biometric ID | Live identification of individuals in public spaces | Live surveillance cameras; event security; smart cities | Chilling effect; freedom of assembly; disproportionate surveillance |
| Emotion Recognition | AI inferring emotional states from facial expressions | Market research; hiring screening; classroom monitoring | Scientific validity questioned; manipulation potential; discrimination |
| Gait Recognition | Identifying individuals by walking patterns | Security; surveillance where faces are obscured | Covert identification; hard to opt out |
| Voice Biometrics | Identifying individuals by voice patterns | Phone banking; call center verification; smart speakers | Consent; eavesdropping; deepfake vulnerability |
| Fingerprint/Iris Scanning | Traditional biometrics enhanced with AI | Access control; national ID; criminal justice | Database security; mission creep; compelled collection |
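The 1:1 versus 1:N distinction in the table above drives much of the regulation: verification compares one face to one stored reference, while identification searches one face against an entire gallery, so false-positive risk grows with gallery size. A minimal sketch of the two modes, using hypothetical face-embedding vectors and cosine similarity (the threshold value is illustrative, not from any real system):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two face-embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, reference, threshold=0.6):
    """1:1 verification: is the probe the same person as one known reference?"""
    return cosine(probe, reference) >= threshold

def identify(probe, gallery, threshold=0.6):
    """1:N identification: search the probe against a whole database.
    Returns the best-matching identity, or None if nothing clears the threshold."""
    scores = {name: cosine(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```

Each extra gallery entry is another chance for a false match at a fixed threshold, which is one reason 1:N identification attracts far stricter regulatory treatment than 1:1 verification.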

1.2 Accuracy & Bias Concerns

Documented Bias: Multiple studies, including landmark research by NIST (Face Recognition Vendor Test, 2019) and MIT researcher Joy Buolamwini, have demonstrated that facial recognition systems exhibit significantly higher error rates for women, people of color, and particularly dark-skinned women. NIST found false positive rates up to 100 times higher for African and East Asian faces compared to Eastern European faces in some algorithms. These findings have been central to legislative efforts restricting FRT use.
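The disparities NIST measured are differences in false match rate (FMR) computed per demographic group: the fraction of impostor comparisons (images of different people) that an algorithm scores above its match threshold. A sketch of the computation, using entirely made-up impostor scores for two hypothetical groups:

```python
def false_match_rate(impostor_scores, threshold):
    """Fraction of impostor (different-person) comparisons scored as matches."""
    hits = sum(1 for s in impostor_scores if s >= threshold)
    return hits / len(impostor_scores)

# Hypothetical impostor-score samples for two demographic groups,
# evaluated at the same operating threshold.
group_a = [0.2, 0.3, 0.25, 0.71, 0.4, 0.1, 0.35, 0.28, 0.33, 0.22]
group_b = [0.5, 0.72, 0.65, 0.8, 0.3, 0.75, 0.6, 0.71, 0.45, 0.55]

fmr_a = false_match_rate(group_a, threshold=0.7)  # 0.1
fmr_b = false_match_rate(group_b, threshold=0.7)  # 0.4
disparity = fmr_b / fmr_a                         # about 4x higher for group B
```

Because deployed systems typically use one global threshold, a group with a higher FMR at that threshold faces proportionally more false matches, which is the mechanism behind the wrongful-arrest cases discussed in Section 8.2.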

2. European Union

2.1 EU AI Act — Biometric Provisions

The EU AI Act contains the world’s most comprehensive restrictions on biometric AI systems:

Prohibited Biometric Practices (Article 5)

| Prohibition | Description | Exceptions |
|---|---|---|
| Real-Time Remote Biometric ID in Public | Live facial recognition in publicly accessible spaces for law enforcement purposes | Three narrow exceptions: (1) targeted search for specific crime victims/missing children; (2) prevention of a specific imminent threat to life or terrorist attack; (3) identification of suspects for serious crimes (listed in Annex). All require prior judicial authorization (or post-hoc within 24 hours for urgency). |
| Emotion Recognition (Workplace/Education) | AI systems inferring emotions of persons in workplace or educational settings | Medical or safety purposes only (e.g., detecting drowsiness in safety-critical roles) |
| Untargeted Facial Image Scraping | Scraping facial images from the internet or CCTV to build facial recognition databases | No exceptions; absolute prohibition (targeting Clearview AI-type practices) |
| Biometric Categorization (Sensitive Attributes) | Categorizing persons based on biometric data to infer race, political opinions, religion, or sexual orientation | Labeling of lawfully acquired biometric data in law enforcement (narrow) |

High-Risk Biometric Systems (Annex III)

Biometric systems that are not prohibited are classified as high-risk under the AI Act when used for remote biometric identification, biometric categorization based on sensitive or protected attributes, or emotion recognition (Annex III, point 1).

High-risk classification triggers: conformity assessment, risk management, data governance, transparency, human oversight, accuracy requirements, and registration in EU database.

2.2 GDPR — Biometric Data

The GDPR provides additional protections: biometric data processed for the purpose of uniquely identifying a natural person is special category data under Article 9, and its processing is prohibited unless an exception applies, such as explicit consent or substantial public interest.

2.3 Enforcement Actions

| Case | Authority | Fine | Issue |
|---|---|---|---|
| Clearview AI (Italy) | Garante (Italy) | €20 million | Unlawful processing of biometric data; scraping Italian residents’ facial images; no legal basis; lack of transparency |
| Clearview AI (France) | CNIL (France) | €20 million | Same violations under French law; failure to comply with CNIL enforcement notice |
| Clearview AI (UK) | ICO (UK) | £7.55 million | Processing UK residents’ data without consent; failure to provide right of access; no lawful basis |
| Clearview AI (Greece) | HDPA (Greece) | €20 million | Unlawful biometric data processing; non-compliance with GDPR |
| Clearview AI (Austria) | DSB (Austria) | Order to cease processing | GDPR violations related to biometric data scraping; Austria-specific cease and desist |
| PimEyes (Poland) | UODO (Poland) | Investigation ongoing | Facial recognition search engine; GDPR compliance questioned |

3. United States — Federal

The US has no comprehensive federal biometric or facial recognition law, though several bills have been introduced and sector-specific rules exist.

3.1 Federal Legislation (Proposed)

| Bill | Status | Key Provisions |
|---|---|---|
| Facial Recognition and Biometric Technology Moratorium Act | Introduced 2020, 2021, 2023; not passed | Federal moratorium on government use of facial recognition; ban on federal funds for FRT; requires Congressional authorization to resume |
| FACE Act (Facial and Accurate Centralized Ethics) | Introduced 2023; in committee | Warrant requirement for federal FRT use; prohibition on real-time mass surveillance; accuracy standards |
| American Data Privacy and Protection Act (ADPPA) | Passed House committee 2022; stalled | Would have classified biometric data as “covered data”; consent requirements; opt-out rights |
| REAL ID Act (implications) | Enacted 2005; enforcement ongoing | Standardized ID requirements; DHS facial recognition for identity verification; state DMV photo databases |

3.2 Federal Agency Policies

3.3 Executive Action

4. United States — State & Local

In the absence of federal legislation, US states and cities have enacted a patchwork of biometric and facial recognition laws ranging from comprehensive bans to targeted regulations.

4.1 State Biometric Privacy Laws

| State | Law | Year | Key Provisions | Private Right of Action |
|---|---|---|---|---|
| Illinois | Biometric Information Privacy Act (BIPA) | 2008 | Informed consent before collection; retention schedule required; prohibition on profiting from biometric data; written policy disclosure | Yes; $1,000–$5,000 per violation |
| Texas | Capture or Use of Biometric Identifier Act | 2009 | Consent required; commercial purpose restrictions; destruction requirements | No (AG enforcement only; $25,000/violation) |
| Washington | Biometric Identifiers Act (HB 1493) | 2017 | Notice and consent for commercial biometric use; enrollment database restrictions | No (AG enforcement only) |
| California | CCPA/CPRA (biometric provisions) | 2018/2020 | Biometric data as sensitive personal information; opt-out rights; disclosure requirements; data minimization | Limited (data breach claims only; AG and CPPA enforcement otherwise) |
| New York | SHIELD Act + NYC Biometric Identifier Info Law | 2019/2021 | SHIELD: biometric data breach notification; NYC: commercial establishments must disclose biometric collection with signage | NYC law only: $500 per signage violation; $5,000 for misuse |
| Colorado | Colorado Privacy Act (biometric) | 2023 | Biometric data as sensitive data; consent required for processing; opt-out for profiling; DPIA required | No (AG enforcement) |
| Maryland | Facial Recognition Transparency Act | 2024 | Law enforcement must disclose FRT use; judicial authorization required; annual reporting; accuracy standards | No (AG enforcement) |

4.2 Illinois BIPA — Landmark Litigation

BIPA’s Impact: Illinois BIPA is the most litigated biometric law in the world. Key rulings have established that: (1) standing does not require actual harm, only a statutory violation (Rosenbach v. Six Flags, IL Supreme Court 2019); and (2) every scan without consent is a separate violation, so claims accrue per scan, not per person (Cothron v. White Castle, IL Supreme Court 2023). These rulings have created massive class action exposure: Meta (Facebook) settled for $650M, Google for $100M, and TikTok for $92M. Total BIPA settlements exceed $5 billion.
| Case | Year | Settlement/Ruling | Significance |
|---|---|---|---|
| In re Facebook Biometric Info Privacy Litigation | 2021 | $650 million | Largest biometric privacy settlement; Tag Suggestions feature; per-user payout ~$350 |
| In re Google Biometric Info Privacy Litigation | 2022 | $100 million | Google Photos face grouping feature; applied BIPA to cloud photo services |
| In re TikTok Inc. BIPA Litigation | 2021 | $92 million | Facial filter features; expanded BIPA to social media engagement tools |
| Rosenbach v. Six Flags | 2019 | IL Supreme Court ruling | No actual harm needed for BIPA standing; statutory violation alone sufficient |
| Cothron v. White Castle | 2023 | IL Supreme Court ruling | Each biometric scan is a separate BIPA violation; damages accrue per scan, not per person |
| Rogers v. BNSF Railway | 2022 | $228 million (jury verdict) | First BIPA jury trial; fingerprint scanning of truck drivers; largest single BIPA verdict |
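Cothron’s per-scan accrual rule is what drives these settlement figures: statutory damages of $1,000 (negligent) or $5,000 (reckless) multiply by every scan, not every enrolled person. A back-of-the-envelope calculation with hypothetical numbers (workforce size, scan frequency, and workdays are illustrative, not from any actual case):

```python
def bipa_exposure(employees, scans_per_day, workdays, per_violation):
    """Theoretical statutory exposure when each scan is a separate violation
    (Cothron v. White Castle), compared to one violation per person."""
    per_scan = employees * scans_per_day * workdays * per_violation
    per_person = employees * per_violation
    return per_scan, per_person

# Hypothetical: 500 workers clocking in and out once each per day over a
# 250-day work year, at the $1,000 negligent-violation tier.
per_scan, per_person = bipa_exposure(500, 2, 250, 1_000)
# per_scan:   $250,000,000 theoretical exposure
# per_person: $500,000 if accrual were per person instead
```

The Cothron court noted that damages remain discretionary rather than automatic, but the per-scan multiplier is why defendants facing certified classes tend to settle.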

4.3 City/County Facial Recognition Bans

| Jurisdiction | Year | Scope |
|---|---|---|
| San Francisco, CA | 2019 | First US city to ban government use of FRT; police and all city agencies prohibited |
| Oakland, CA | 2019 | Government use ban; aligned with San Francisco |
| Berkeley, CA | 2019 | Government use ban |
| Somerville, MA | 2019 | Government use ban; first East Coast city |
| Boston, MA | 2020 | Government use ban; largest East Coast city to ban |
| Portland, OR | 2020 | Most comprehensive: bans both government AND private sector use in public accommodations |
| Minneapolis, MN | 2020 | Government use ban |
| New Orleans, LA | 2020 | Government use ban (passed alongside predictive policing ban) |
| King County, WA | 2021 | Government use ban; most populous county to ban (~2.3M people) |
| New York City, NY | 2021 | Commercial establishments must post signage and cannot sell biometric data (Biometric Identifier Info Law) |
| Baltimore, MD | 2021 | Government use ban |

4.4 State-Level FRT Restrictions

5. United Kingdom

5.1 Legal Framework

The UK has no FRT-specific legislation; facial recognition is instead regulated through existing frameworks, principally the UK GDPR and Data Protection Act 2018, the Equality Act 2010, and police common-law powers as interpreted by the courts.

5.2 Metropolitan Police & South Wales Police

UK police forces, principally the Metropolitan Police and South Wales Police, have been the most prominent users of live facial recognition (LFR) in any democracy. In R (Bridges) v South Wales Police (2020), the Court of Appeal found South Wales Police’s LFR deployments unlawful for lacking adequate safeguards, though it did not prohibit LFR outright.

5.3 ICO Enforcement

6. China

China presents a paradox: extensive government deployment of facial recognition alongside growing legal protections for individuals in the private sector.

6.1 Government Use

6.2 Private Sector Restrictions

China has progressively restricted private sector facial recognition use:

| Law/Regulation | Year | Key Provisions for FRT |
|---|---|---|
| Personal Information Protection Law (PIPL) | 2021 | Facial data is “sensitive personal information”; separate consent required; purpose limitation; necessity principle; DPIA mandatory |
| “Face Recognition” Judicial Interpretation | 2021 | Supreme People’s Court rules: businesses cannot force customers to use FRT as the sole ID method; consent must be voluntary; individuals can sue for violations |
| GB/T 41772-2022 Standard | 2022 | National standard for facial recognition in information security; data protection requirements; accuracy benchmarks |
| Provisions on Face Recognition Technology (Draft) | 2023 | CAC draft: purpose limitation; necessity requirement; alternatives must be offered; no forced FRT for services; restrictions on public-space deployment |

6.3 Landmark Cases

7. Other Jurisdictions

7.1 Canada

7.2 Australia

7.3 Japan

7.4 India

7.5 Brazil

7.6 South Korea

8. Law Enforcement Use

Law enforcement use of facial recognition is the most contentious application, balancing public safety against civil liberties, privacy, and the risk of discriminatory policing.

8.1 Use Cases

| Use Case | Description | Civil Liberties Concern | Regulatory Status |
|---|---|---|---|
| Live Surveillance (LFR) | Real-time identification against watchlists in public spaces | Mass surveillance; chilling effect on protest and assembly; racial profiling | Banned (EU; most US cities with FRT laws) |
| Post-Event Investigation | Searching recorded footage after a crime occurs | Scope creep; retention of innocent people’s data; accuracy concerns | Restricted (warrant requirements in some jurisdictions) |
| Mugshot Database Search | Comparing suspect images against existing arrest photo databases | Perpetuates racial bias in policing; databases over-represent minorities | Generally permitted (varies by jurisdiction) |
| Missing Persons / Child Exploitation | Identifying missing children or trafficking victims | Generally supported; concern about mission creep from these exceptions | Permitted (often cited as exception in bans) |
| Border Control | Identity verification at borders and airports | Traveler privacy; data retention; accuracy for diverse populations | Generally permitted (CBP, Frontex, etc.) |

8.2 Wrongful Arrest Cases

Documented Wrongful Arrests: As of 2025, at least seven cases of wrongful arrest based on faulty facial recognition matches have been publicly documented in the US. All involved Black individuals. These cases have been central to legislative efforts and have demonstrated the real-world consequences of deploying inaccurate technology.
| Case | Year | Location | Details |
|---|---|---|---|
| Robert Williams | 2020 | Detroit, MI | Arrested based on a facial recognition match from shoplifting footage; held 30 hours; charges dropped; ACLU lawsuit filed |
| Michael Oliver | 2019 | Detroit, MI | Falsely identified as a carjacking suspect; charges dropped when alibi confirmed; victim of FRT false match |
| Nijeer Parks | 2019 | Woodbridge, NJ | Arrested for shoplifting and assault based on FRT match; spent 10 days in jail; all charges dismissed; was 30 miles away at the time |
| Porcha Woodruff | 2023 | Detroit, MI | Eight months pregnant; arrested for carjacking and robbery based on FRT match; detained for hours; charges dropped; federal lawsuit filed |
| Randal Reid | 2022 | Jefferson Parish, LA | Arrested in Georgia on a Louisiana warrant based on FRT; held 6 days; was in Houston at the time of the crime; charges dismissed |

8.3 Best Practice Standards

9. Private Sector Use

9.1 Common Applications

| Sector | Application | Regulatory Framework |
|---|---|---|
| Retail | Shoplifter identification; customer analytics; age estimation for restricted products | Varies: BIPA consent (IL); GDPR consent (EU); Portland ban (OR); NYC signage requirement |
| Financial Services | KYC identity verification; fraud prevention; account access | Generally permitted with consent; PSD2 (EU) and BSA/AML (US) may encourage adoption; accuracy standards emerging |
| Social Media | Photo tagging; content moderation; age verification | BIPA litigation (Meta $650M, Google $100M); GDPR consent required (EU); opt-in requirements growing |
| Hospitality | Hotel check-in; casino patron identification; VIP recognition | Limited regulation; state gaming commission rules for casinos; general data protection applies |
| Employment | Time and attendance; hiring screening; access control | BIPA consent (IL); EU AI Act prohibition on workplace emotion recognition; growing state restrictions |
| Healthcare | Patient identification; rare disease diagnosis; pain assessment | HIPAA (US); EU MDR for medical devices; GDPR special category; less contested than surveillance uses |
| Real Estate | Building access; tenant identification; visitor management | Tenant protection laws in some jurisdictions; NYC proposed restrictions; GDPR consent (EU) |

9.2 Self-Service Biometrics

10. Comparative Analysis

10.1 Global Regulatory Approach Comparison

| Dimension | EU | US (Patchwork) | UK | China |
|---|---|---|---|---|
| Government Use | Real-time public use banned (narrow exceptions) | No federal ban; 15+ city bans; some state restrictions | Active police use; court-imposed safeguards | Extensive government deployment; limited restrictions |
| Private Sector | GDPR consent + AI Act high-risk regime | BIPA (IL) most restrictive; varies by state | UK GDPR consent required | PIPL consent; separate consent for FRT; alternatives required |
| Emotion Recognition | Banned in workplace/education | No specific ban (some state limits) | No specific regulation | No specific ban |
| Scraping Ban | Absolute prohibition (AI Act) | Clearview settlements; no explicit federal ban | ICO enforcement; no explicit ban | PIPL consent requirements; draft FRT rules |
| Enforcement | DPAs + AI Office; fines up to 7% of global turnover for prohibited practices | BIPA private right of action (IL); FTC; state AGs | ICO; courts (Bridges case) | CAC; courts (Safari Park case) |
| Bias Requirements | AI Act: bias testing mandatory | NIST FRVT (voluntary); no mandatory testing | Equality Act obligations | Standards developing |

12. References & Resources

Legislation & Regulation

Research & Testing

Enforcement & Case Law

Civil Society & Advocacy
