Report AI Use on Raw Surveillance Data

Full Title:
A bill to require a report on the use of artificial intelligence with respect to access to unminimized information collected pursuant to the Foreign Intelligence Surveillance Act of 1978, and for other purposes.

Summary

  • This bill would require the government to explain how it uses artificial intelligence (AI) on raw surveillance data collected under the Foreign Intelligence Surveillance Act (FISA).

  • It aims to add transparency and oversight without changing what data can be collected.

  • Key points:

    • Within 120 days of becoming law, the Attorney General and the Director of National Intelligence must report every use of AI with access to “unminimized” FISA data. Unminimized means raw data, before privacy-protecting minimization rules are applied.
    • For each AI system, the report must give its name and purpose, its testing and ongoing monitoring, its human-review procedures, who built or trained it, what data it was trained on, when and how it gained access to raw FISA data, and any other data it uses.
    • It must also state whether each use was previously disclosed to Congress or the surveillance court, and whether the court issued any related opinions.
    • The report goes to the House and Senate Intelligence Committees, the House and Senate Judiciary Committees, and the leaders of the Foreign Intelligence Surveillance Court and its review court.
    • The report must have an unclassified version posted on the Justice Department and ODNI websites, plus a classified version for recipients with the proper clearances.
    • Before any new AI system gets access to raw FISA data, the same officials must notify those bodies and explain how the plan complies with FISA, court-approved rules, and other laws.

What it means for you

  • All residents:

    • This does not expand government surveillance powers. It adds reporting on how AI tools touch the most sensitive, raw surveillance data.
    • You would get a public summary showing what kinds of AI tools are used and for what purposes.
  • Privacy and civil liberties advocates:

    • You gain a regular source of information on AI use with raw FISA data, including testing, human oversight, and model limits.
    • The public version may help you assess risks, though some details will stay classified.
  • Technology companies and AI vendors:

    • If your tools are used, the report may name them and describe their use and training approach.
    • Agencies may ask you for documentation on testing, data sources, limits, and compliance before adoption.
  • Journalists and researchers:

    • The public report offers new, official information to analyze AI use in national security.
  • Intelligence and law enforcement personnel who use FISA data:

    • Expect added documentation and notice requirements before giving AI systems access to raw data.
    • Programs may face added review steps to describe testing, human oversight, and legal compliance.

Expenses

No publicly available information.

Proponents' View

  • Increases transparency around powerful AI tools that can touch raw surveillance data, which may include information about Americans.
  • Strengthens oversight by requiring details on testing, monitoring, and human review, not just a list of tools.
  • Builds public trust by posting an unclassified report while still protecting sensitive details in a classified version.
  • Forces early legal checks before new AI tools get access, reducing the risk of misuse or unlawful queries.
  • Helps Congress and the surveillance courts understand and manage risks like bias, overreach, or model limits.

Opponents' View

  • Adds paperwork and could slow the rollout of useful tools, making intelligence operations less agile.
  • Even an unclassified summary might reveal sensitive capabilities or vendor relationships to adversaries.
  • The broad definition of AI may sweep in many analytics tools, creating confusion and over-reporting.
  • Does not impose hard limits on AI use; critics who want stronger privacy rules may see it as too weak.
  • Pre-notification could become a bureaucratic hurdle for routine model updates or retraining.