
Witnesses warn Congress: data‑broker sales, facial recognition and AI broaden surveillance beyond FISA gaps

2900485 · April 8, 2025


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

At a House Judiciary subcommittee hearing, witnesses and members cautioned that commercial data‑broker sales, facial recognition services and AI tools enable federal agencies to track Americans without court oversight, and urged statutory limits and agency audits.

Witnesses at a House Judiciary subcommittee hearing on October 12 told lawmakers that U.S. agencies routinely obtain Americans’ sensitive information without warrants by purchasing data from commercial brokers and by using facial‑recognition and AI tools.

The concern: Kia Hamadanshi of the ACLU and others said federal agencies can “sidestep the requirements of the Fourth Amendment” by buying bulk data — including location and health information — from private firms, and then searching those commercial datasets without judicial oversight. James Chernowski listed “closing the data broker loophole” among his top three priorities for reform.

Clearview AI and facial recognition drew specific attention. Chairman Biggs and witnesses discussed reports that federal agencies, including the ATF, used facial recognition to identify gun owners and that agencies have purchased access to Clearview AI’s database, which aggregates publicly posted photos from social media. The Government Accountability Office has reviewed federal agencies’ use of facial recognition and reported that some agencies lacked adequate privacy risk assessments or even a clear inventory of which systems agents were using.

Artificial intelligence and automated tools were described as amplifiers of surveillance capacity. Several witnesses warned that AI can automate target identification and expand surveillance at speed and scale; the ACLU asked the committee to “undertake a comprehensive review of AI technologies used for surveillance” and urged legal limits on how those systems are deployed.

Policy responses suggested in testimony included statutory prohibitions or stricter limits on law‑enforcement purchases of commercially available personal data; mandatory privacy‑risk assessments for agency use of facial recognition; improved inventory and auditing of agency systems; and either a comprehensive review of, or a statutory standard for, AI use in surveillance contexts.

Members pressed witnesses about tradeoffs between public safety and privacy. Some witnesses said narrowly tailored uses of AI could help allocate resources, but they urged guardrails to prevent incorrect or biased outputs from triggering intrusive enforcement actions.

The subcommittee took up the discussion of data brokers and automated surveillance as part of its broader review of FISA. It will consider drafting targeted legislation, as well as oversight requests directing agencies to inventory the data purchases, contracts and algorithms they use in practice.