House subcommittee hearing: witnesses warn trade-secret theft and misuse risk U.S. AI leadership, urge balanced transparency and stronger protections

May 7, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

The House Judiciary subcommittee held a hearing on trade secrets in the global AI competition, where lawmakers and expert witnesses warned that intellectual property theft, reverse engineering and misuse of AI services threaten U.S. competitiveness and national security.

The hearing focused on the tension between transparency and protecting proprietary model weights, datasets and training methods. "Whoever leads this agentic AI race is going to shape the rules of the future international order," said Dr. Benjamin Jensen, director of Future Labs and senior fellow at the Center for Strategic and International Studies, urging coordinated action to defend U.S. innovation.

Why it matters: Witnesses said poorly designed disclosure mandates or public repositories of sensitive information could hand strategic advantage to foreign adversaries, while inadequate cybersecurity, litigation funding loopholes and talent loss would further erode the U.S. position. At the same time, several witnesses argued that carefully scoped transparency — for example, independent audits, government-secured channels and capability testing — can be pursued without revealing trade secrets.

Key testimony and recommendations

- Trade secrets and the risk environment: Multiple witnesses described techniques they said have accelerated foreign models, including "knowledge distillation" and other forms of harvesting outputs from U.S. models. Christopher Moore, president of the Software & Information Industry Association, said members hold substantial IP — patent, copyright and trade secret rights — underlying AI services and that state-backed theft remains a persistent problem.

- National-security framing and intelligence: Nicholas Anderson, principal cybersecurity official and former senior national security cybersecurity leader, and other witnesses urged treating certain frontier AI systems as strategically critical infrastructure. Anderson said designating such systems could permit minimum cybersecurity baselines and closer integration with agencies such as CISA and the intelligence community.

- Enforceable export controls and CFIUS: Witnesses pointed to export controls and Committee on Foreign Investment in the United States (CFIUS) reviews as central tools. Helen Toner of Georgetown's Center for Security and Emerging Technology (CSET) noted that access to compute and chips remains a key U.S. advantage, making enforcement of export controls and limits on semiconductor-equipment exports critical.

- Targeted transparency and independent verification: Toner and others recommended a mix of public disclosure where appropriate, secure sharing with governments, and independent auditors working under NDA inside companies to verify capability and safety claims without publishing trade secrets.

- Personnel vetting, outbound-investment screening and incentives: Witnesses urged stronger personnel-security practices for high-risk systems and for Congress to consider outbound-investment screening and other measures to prevent transfer of sensitive capabilities. Several witnesses also emphasized incentives and voluntary collaboration programs — threat-intelligence sharing, voluntary safety testing and public–private partnerships — rather than broad, prescriptive mandates.

- Legal and court protections: Moore and other witnesses discussed protective orders and court procedures that judges can tailor to limit leakage of trade secrets in litigation. They also warned about third‑party litigation funding and sovereign backers using litigation to obtain proprietary information.

Talent and immigration concerns

Multiple witnesses tied U.S. competitiveness to the country's ability to attract and retain international talent. Testimony repeatedly cited figures showing that international students and immigrants make up substantial fractions of the AI research workforce and that many leading U.S. AI startups have immigrant founders. Witnesses said restrictive visa policies, revoked grants, or unpredictable immigration rules could push talent to other countries and damage U.S. innovation.

Points of debate and nuance

Witnesses disagreed about how far government should go. Some urged largely voluntary, resourced collaboration with the private sector and strengthening existing authorities; others called for stronger, standardized requirements for cybersecurity and disclosure to governments. Several witnesses emphasized that well-crafted regulation can support, not stifle, industry by building consumer trust and enabling safer deployment of high‑stakes systems.

Representative concerns and next steps

Committee members asked whether AI firms or their infrastructure should be designated critical infrastructure, whether transparency mandates necessarily conflict with trade secrets, and how to ensure enforcement of export controls. Witnesses agreed these are complex tradeoffs and urged further legislative and executive coordination, improved intelligence declassification for legal uses, and expanded public/private threat‑sharing programs.

Closing note

The hearing produced no formal votes. Lawmakers said they would follow up in writing and continue oversight. Witnesses urged Congress to pursue a mix of tighter cybersecurity, enforceable export controls, targeted transparency mechanisms and immigration reform to retain talent — measures they said are needed to protect U.S. AI leadership without forfeiting necessary public accountability.