SACRAMENTO, Calif. — Generative artificial intelligence tools will soon be used by California government.
Democratic Gov. Gavin Newsom's administration announced Thursday that the state will partner with five companies to develop and test generative AI tools that can improve public services.
As lawmakers across the country grapple with how to regulate emerging technologies, California is one of the first states to issue guidelines on when and how state agencies can purchase AI tools.
Here are the details:
What is generative AI?
Generative AI is a field of artificial intelligence that can create new content, such as text, audio, and photos, in response to prompts. This is the technology behind ChatGPT, a controversial writing tool launched by Microsoft-backed OpenAI. San Francisco-based company Anthropic is also getting into the generative AI game, with help from Google and Amazon.
How could it be used in California?
California envisions using this type of technology to reduce customer call wait times at state government offices, improve traffic and road safety, and more.
Initially, four state departments will test the generative AI tools: the California Department of Tax and Fee Administration, the California Department of Transportation, the Department of Public Health, and the Health and Human Services Agency.
The tax department administers more than 40 programs and took more than 660,000 calls from businesses last year, Director Nick Maduros said. The state wants to deploy AI to listen in on those calls, pull up relevant information about state tax codes in real time, and help workers answer questions faster because they won't have to look up the information themselves.
In another example, the state wants to use the technology to provide people with information about health and social services benefits in languages other than English.
Who will use these AI tools?
The general public does not yet have access to these tools, but may in the future. The state will begin a six-month trial during which the tools will be tested internally by state workers. In the tax example, Maduros said the state plans to have the technology analyze recordings of calls from businesses and then see how the AI processes them, rather than using it in real time.
Not all of the tools are designed to interact with the public, however. For example, the tools aimed at improving highway congestion and road safety would be used only by state workers to analyze traffic data and brainstorm potential solutions.
State workers will test and evaluate the tools' effectiveness and risks. If the tests go well, the state will consider deploying the technology more broadly.
How much does it cost?
The final cost is unknown. For now, the state plans to pay each of the five companies $1 to begin a six-month in-house trial. The state can then evaluate whether to enter into a new contract for the long-term use of the tool.
“If it turns out that it doesn't serve our people better, we lose every dollar,” Maduros said. “And I think that's a pretty good deal for Californians.”
The state currently faces a significant budget deficit, which could make it harder for Newsom to argue that such technology is worth deploying.
Administration officials said they did not have an estimate of how much these tools would ultimately cost the state, and they did not immediately release copies of the agreements with the five companies piloting the technology: Deloitte Consulting, LLP; INRIX, Inc.; Accenture, LLP; Ignyte Group, LLC; and SymSoft Solutions LLC.
What could go wrong?
The rapidly growing technology also raises concerns about job losses, misinformation, privacy, and automation bias.
State officials and academic experts say generative AI has great potential to help make government agencies more efficient, but safeguards and oversight are also urgently needed.
Meredith Lee, chief technical adviser for the College of Computing, Data Science, and Society at the University of California, Berkeley, said testing the tools on a limited basis is one way to contain potential risks.
But the scrutiny can't stop after the six-month trial, she added. The state must have a consistent process for testing and learning about the tools' potential risks if it decides to deploy them at scale.