Gov. Gavin Newsom speaks to reporters in the spin room following the CNN Presidential Debate on the Georgia Institute of Technology campus on June 27, 2024, in Atlanta, Georgia. In an unusual letter to the California Privacy Protection Agency’s board, Gov. Newsom is urging the regulators to dial back proposed regulations on automated decision-making technology. (Andrew Harnik/Getty Images)
Gov. Gavin Newsom is urging the California Privacy Protection Agency to regulate the state’s AI sector with a light touch.
Newsom, in an unusual letter to the California Privacy Protection Agency’s board, urged the regulators to dial back proposed regulations (PDF) on automated decision-making technology.
“As my office has relayed to Agency staff over the last year, enacting these regulations could create significant unintended consequences and impose substantial costs,” the governor wrote in a letter obtained by KQED (PDF).
He went on to write, “The Agency can fulfill its obligations to issue the regulations called for by Proposition 24 without venturing into areas beyond its mandate. Thank you for working in partnership with my Administration and the Legislature to balance privacy protection with clear and implementable guidelines that allow regulated entities to innovate responsibly, creating a fairer and more trustworthy digital environment for California consumers.”
“The CPPA Board and staff continue to refine the draft regulations and will further discuss them at the May 1st board meeting,” Tom Kemp, CPPA’s executive director, wrote in response. He added, “We are grateful for the Governor’s Office’s continued engagement around this important issue.”
“It’s unfortunate to see a lot of the industry talking points coming out of a letter from the governor,” said Jake Snow, a technology and civil liberties attorney at the ACLU of Northern California. “The agency has a really broad authority to put in place new rules and the regulations that they’ve written are simple rules that encourage transparency and trust in AI for people in California.”
There’s really nothing like the California Privacy Protection Agency anywhere in the United States. Created in 2020, the agency is just beginning to find its voice, and that means “increasingly, it is attracting lobbying attention from industry,” said Jonathan Mehta Stein, chair of the California Initiative for Technology and Democracy.
The draft regulations would require businesses to assess and report privacy risks, perform annual cybersecurity audits, and give consumers more control over how automated systems (like AI and profiling tools) use their personal data. Public comment for the draft regulations closed on February 19. The board discussed those comments at the April board meeting, and they’ll discuss again on May 1.
The broad scope of the conversation brought out a wide array of interested parties, including not just the governor but industry lobbyists and consumer advocates as well.
“AI, social media and data privacy are fundamentally intertwined, and if we are going to protect consumers and our democracy, from these combined, interwoven threats, you have to be talking about all of them all at once,” Stein said. “Right now, social media and AI are almost totally unregulated.
“California has made some good starts on data privacy in some recent bills in recent years, but there is almost no industry I can think of that has an impact on our lives so enormous, and sits under a regulatory regime so light and so minimal.”
Newsom has a reputation in Sacramento for lending a friendly ear to industry concerns. He has killed several of the most controversial tech bills, including an AI measure that would have required large-scale AI developers to submit their safety plans to the state attorney general and two bills that would have forced tech platforms to share ad revenues with news organizations.
However, Newsom has also signed many bills favored by consumer advocates, addressing everything from online privacy to critical infrastructure.
At a board meeting three weeks ago, CPPA Board member Alastair Mactaggart worried that moving forward aggressively could trigger industry lawsuits designed to bury the agency’s small staff in paperwork.
Or Silicon Valley lobbyists might appeal to President Donald Trump and the Republican-controlled Congress to preempt California’s privacy protections with weaker federal rules. However, it’s not clear how friendly that audience would be, given the federal government’s continued aggressive legal assaults against Google and Meta.
“Rules around artificial intelligence are really a part of privacy law, because they govern the control that people should have over the use of information about them, and the use of that information that affects people’s lives,” said Snow, urging the board to move forward on “common sense restrictions on this technology.” What defines “common sense,” however, is a matter of continued debate.
The CPPA board has a November deadline to finalize the rules.