Cynthia Lummis Proposes Artificial Intelligence Bill Requiring AI Firms to Disclose Technical Details

Policy

By Sam Reynolds, AI Boost | Edited by Parikshit Mishra

Jun 13, 2025, 6:30 a.m.

OpenAI (Jonathan Kemper/Unsplash)
  • Senator Cynthia Lummis introduced the RISE Act of 2025 to clarify liability for AI used by professionals and mandate transparency from developers.
  • The bill requires AI developers to release model cards detailing AI systems’ data sources, use cases, and limitations.
  • The RISE Act stops short of requiring AI models to be open source, but mandates updates and justifications for withheld proprietary information.

Senator Cynthia Lummis (R-WY) has introduced the Responsible Innovation and Safe Expertise (RISE) Act of 2025, a legislative proposal designed to clarify liability frameworks for artificial intelligence (AI) used by professionals.

The bill would require transparency from AI developers while stopping short of mandating that models be open source.

In a press release, Lummis said the RISE Act would mean that professionals, such as physicians, attorneys, engineers, and financial advisors, remain legally responsible for the advice they provide, even when it is informed by AI systems.

At the same time, the AI developers who create those systems can shield themselves from civil liability when things go awry only if they publicly release model cards.

The proposed bill defines model cards as detailed technical documents that disclose an AI system’s training data sources, intended use cases, performance metrics, known limitations, and potential failure modes. All of this is intended to help professionals assess whether the tool is appropriate for their work.

“Wyoming values both innovation and accountability; the RISE Act creates predictable standards that encourage safer AI development while preserving professional autonomy,” Lummis said.

“This legislation doesn’t create blanket immunity for AI,” Lummis continued.

However, the immunity granted under this Act has clear boundaries. The legislation excludes protection for developers in instances of recklessness, willful misconduct, fraud, knowing misrepresentation, or when actions fall outside the defined scope of professional usage.

Additionally, developers face a duty of ongoing accountability under the RISE Act. AI documentation and specifications must be updated within 30 days of deploying new versions or discovering significant failure modes, reinforcing continuous transparency obligations.

The RISE Act, as it’s written now, stops short of mandating that AI models become fully open source.

Developers can withhold proprietary information, but only if the redacted material isn’t related to safety, and each omission is accompanied by a written justification explaining the trade secret exemption.

In a prior interview with CoinDesk, Simon Kim, the CEO of Hashed, one of Korea’s leading VC funds, spoke about the danger of centralized, closed-source AI that’s effectively a black box.

“OpenAI is not open, and it is controlled by very few people, so it’s quite dangerous. Making this type of [closed source] foundational model is similar to making a ‘god’, but we don’t know how it works,” Kim said at the time.

Disclaimer: Parts of this article were generated with the assistance of AI tools and reviewed by our editorial team to ensure accuracy and adherence to our standards. For more information, see CoinDesk’s full AI Policy.

Sam Reynolds is a senior reporter based in Asia. Sam was part of the CoinDesk team that won the 2023 Gerald Loeb award in the breaking news category for coverage of FTX’s collapse. Prior to CoinDesk, he was a reporter with Blockworks and a semiconductor analyst with IDC.

“AI Boost” indicates a generative text tool, typically an AI chatbot, contributed to the article. In each and every case, the article was edited, fact-checked and published by a human. Read more about CoinDesk’s AI Policy.
