Citizen Portal

Witnesses tell Congress national privacy rules and privacy‑enhancing technologies are core to trustworthy AI in finance

September 18, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Experts recommended a U.S. federal data privacy framework, wider use of privacy‑enhancing technologies for model training, and clearer supervisory guidance so financial institutions can train high‑quality models without compromising consumer privacy.

In testimony before the House Financial Services subcommittee, privacy and industry experts recommended a national data privacy framework and greater adoption of privacy‑enhancing technologies (PETs) to support model training and protect consumer financial data.

Why it matters: Panelists said model quality depends on access to rich, permissioned datasets, while privacy safeguards are essential to preserve consumer trust and to avoid discriminatory outcomes when AI is used in underwriting, marketing, or fraud detection.

Matthew Reisman of the Center for Information Policy Leadership urged a "risk‑based approach that focuses on the outcomes to be achieved" and recommended regulators clarify how existing rules should apply to AI. He said policymakers should "enable the responsible use of data for model training and development" and promote PETs such as synthetic data and differential privacy as ways to preserve utility while protecting individuals.
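To make the differential-privacy idea concrete, here is a minimal sketch of the classic Laplace mechanism in Python. It is illustrative only and not drawn from the testimony: the counting query, the customer count, and the epsilon value are all hypothetical, and production systems would use a vetted library rather than hand-rolled noise.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Perturb true_value with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon means stronger privacy and noisier output.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse transform from u in (-0.5, 0.5).
    u = rng.random() - 0.5
    sign = 1 if u >= 0 else -1
    return true_value - scale * sign * math.log(1 - 2 * abs(u))

# Hypothetical example: release how many customers a fraud model flagged.
# A counting query changes by at most 1 per person, so sensitivity = 1.
true_count = 1204
noisy_count = laplace_mechanism(true_count, sensitivity=1, epsilon=0.5)
```

The released `noisy_count` is useful in aggregate (its expected value is the true count) while masking any single individual's contribution, which is the utility-versus-privacy trade-off the witnesses described.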

David Cox of IBM Research emphasized that "transparency is vital" and called for regulators to "understand the provenance and quality of the data underlying deployed systems." Several witnesses noted that PETs have improved substantially in recent years: panelists said fully homomorphic encryption and differential privacy have become more practicable as compute power has increased, and specialist vendors now offer off-the-shelf services.
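Fully homomorphic encryption requires heavyweight libraries, but the core idea of computing on encrypted data can be shown with its simpler relative, the additively homomorphic Paillier scheme. The sketch below is a toy implementation with insecure demo-sized primes, not anything discussed at the hearing: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a server could total encrypted values without ever seeing them.

```python
import math
import random

def paillier_keygen(p, q):
    """Toy Paillier key generation from two primes (demo sizes, not secure)."""
    n = p * q
    n2 = n * n
    g = n + 1                       # standard simple choice of generator
    lam = math.lcm(p - 1, q - 1)
    # L(x) = (x - 1) // n; mu is the modular inverse of L(g^lam mod n^2).
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m, rng=random):
    n, g = pub
    n2 = n * n
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:      # r must be invertible mod n
        r = rng.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

pub, priv = paillier_keygen(1789, 1867)   # toy primes, illustrative only
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
total = decrypt(pub, priv, c1 * c2 % (pub[0] ** 2))   # 42 + 58 = 100
```

Real FHE systems extend this property to arbitrary computation, which is what makes the off-the-shelf services the panelists mentioned possible; this fragment only conveys the principle.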

Witnesses recommended regulators recognize and permit PETs in supervised contexts, encourage standards for data lineage and auditability, and consider safe harbors or recognized standards to reduce compliance burden, especially for smaller banks and fintechs.

No formal rule changes were proposed at the hearing; members asked witnesses for follow‑up technical materials and pledged to consider the need for federal privacy legislation as part of broader AI policy work.