Sen. James Maloney said a sweeping rewrite of his artificial intelligence bill, which aims to protect residents from discriminatory algorithms and criminalize the proliferation of certain AI-generated pornographic and political material, is nearing completion after pushback from Gov. Ned Lamont, tech executives and state officials.
The senator suggested Monday that the next draft of the bill would sacrifice key regulatory measures to assuage concerns from small businesses, big tech companies, the governor and the Department of Economic and Community Development that overregulation could drive tech companies out of the state.
At a press conference Monday, surrounded by supporters, Senate leaders and colleagues, Maloney said the state must pass an AI protection bill this legislative session.
“We have a duty to act now,” Maloney said. “We know there is harm that needs to be addressed.”
The proposal would establish a framework to regulate AI algorithms that often work covertly in the public and private sectors to determine everything from credit scores to social service interventions.
It would also criminalize the dissemination of non-consensual AI-generated images, also known as deepfake pornography, and the dissemination of AI-generated election materials, which fall under the bill's “deceptive media” category.
Maloney said Monday that the bill would boost AI innovation by funding AI education, training and research, and by requiring state agencies to determine how to use AI to improve their performance. He said it would also move implementation of the technology forward.
With 16 days left in the legislative session, Senate President Pro Tempore Martin Rooney said passing Maloney's AI bill is a top priority.
“We believe this needs to come to a vote. It's a very important issue,” Rooney said. “This is fundamental.”
Parts of the AI bill passed the Judiciary Committee on a bipartisan vote on Monday, but not without reservations from lawmakers who wanted the language changed.
“This bill remains something of a work in progress,” said committee chairman Rep. Steven Safstrom. “We intend to support this bill today to continue the dialogue, but we strongly hope that amendments are made before it is considered in the House.”
House Speaker Matt Ritter told reporters last week that he was “sympathetic” to arguments by the governor's office and DECD that some provisions in the bill could be “burdensome” to small tech startups.
After speaking with the Lamont administration, Maloney said he plans to remove certain provisions of the original bill that applied to developers of general-purpose AI models. Maloney said the now-removed section was “causing the most anxiety among many large companies.”
According to a proposal summary prepared by the Legislative Research Service, the section would have required developers, by Jan. 1, 2026, to establish policies regarding federal and state copyright laws and to create, maintain and make publicly available a detailed summary of the content used to train their general-purpose AI models, along with other designated technical documentation and information.
Maloney said he agrees with the Lamont administration and technology representatives that “that part of the bill may be premature,” but said he wants to establish a task force next year to reinstate the provision and “get that part right.”
Maloney said the next draft would also address the concerns of small businesses. He explained that the bill would establish a partnership with the Connecticut Academy of Science and Technology to develop AI compliance checkers and algorithmic impact assessments based on existing international models, allowing developers and adopters to check their systems against current law.
He said the new language would shield adopters from some of the bill's risk-management framework and other provisions, so long as they use uncustomized, “off-the-shelf” AI that has been tested by its developers.
Maloney said he hopes the bill will encourage tech companies to abandon Mark Zuckerberg's famous “move fast and break things” mantra, and instead slow down and identify and eliminate potential harms before releasing algorithmic technology to the public.
“We need to look at things, make sure they're safe, make sure there aren't any side effects, test them, and then release them to the public,” Maloney said.
Maloney said the state wants to partner with businesses to “do this together the right way.”
“We're looking for ways to help small businesses build compliant and trustworthy AI, because we know that's how they thrive,” Maloney said. “We know that in the long run it's going to be really cheap and good for them.”
Matthew Wallace, president and CEO of VRSim, a Connecticut company that has been building training simulations using virtual reality for the past 20 years, said the proposed AI bill would “promote growth, not stifle it.”
“Regulation is an important way to avoid unintended consequences from technology,” Wallace said. “We are now building AI capabilities into our products. We like the idea of regulation. We like the idea of certainty. We like the concept of understanding what the rules should be.”
Senate Majority Leader Bob Duff said the hands-off approach adopted by federal and state governments as the internet became mainstream in the 1990s resulted in a proliferation of data privacy issues, social media abuse and other problems.
Duff said policymakers can't afford to take the same laissez-faire approach to AI.
“Our job here is to not repeat the sins of the past. Our job is to make sure we set the guidelines and guardrails (and) parameters that are necessary for this technology now,” Duff said.
“We shouldn't be afraid to adopt this technology, but we shouldn't be afraid to apply the rules of the road to it,” Duff added. “I think it’s going to actually help our state and help our country.”