
Experts urge Congress to set guardrails for AI and survivor data in trafficking investigations

Subcommittee on Cybersecurity, Information Technology, and Government Innovation, House Committee on Oversight and Reform · December 11, 2025


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Witnesses warned that biased or poorly governed AI could misdirect investigations and retraumatize survivors; they urged transparency, limits on retention, auditing, and survivor consent before datasets are used.

Experts told a House Oversight subcommittee that AI and other digital tools can help identify trafficking networks, but only if Congress requires clear standards for data quality, transparency and limits on retention.

Roy Austin, director of the Artificial Intelligence Initiative at Howard University, said AI systems are “only as sound as the data upon which they are trained” and cautioned that incomplete or biased datasets can “distort results, misdirect investigations, and perpetuate racial and gender disparities.” Austin called for federal standards on data transparency, auditing and accountability to reduce harm.

Megan Lundstrom, CEO of Polaris and a survivor, urged that survivor consent and trauma‑informed approaches be central to any data collection. She testified that technology that collects information without consent can replicate exploitative dynamics and retraumatize survivors; she recommended survivor involvement in design and compensation for survivor contributions.

Melissa Snow of the National Center for Missing and Exploited Children said NCMEC uses donated and emerging technologies, including image-matching and mapping tools, to connect data points, and described careful intake protocols and collaborations with law enforcement. Snow and Lundstrom emphasized that human review and victim-sensitive practices must accompany automated screening to avoid causing additional harm.

Panelists also warned about synthetic media. Austin said deepfakes threaten evidentiary integrity and urged investment in deepfake detection tools. Lawmakers pressed witnesses on how recordings and evidence are stored and who has access; witnesses said storage and access policies vary and noted the need for encryption, data minimization and transparent retention policies.

The hearing did not produce legislative votes. Members may submit written follow-up questions for the record.