Alternative Commercial Finance Update - California's Latest Mission: The Regulation of Artificial Intelligence

It seems artificial intelligence (AI) has become a daily topic of conversation, and the focus of late is on AI developers and ensuring transparency for users of the technology. California is once again at the forefront of technology regulation (emphasis on regulation). Our colleagues on Husch Blackwell’s Data Privacy team have been following California’s AI legislation all year, and now that the California legislature has closed its 2024 session—passing four AI-related bills—they have published a summary of California’s legislative activity concerning AI. We encourage you to check out the Byte Back blog for more detail.

California has a long track record of shaping regulatory frameworks that often influence state and national standards, as seen with the California Consumer Privacy Act and the California Commercial Financing Disclosure Laws, and this latest round of AI legislation could exert a similar impact. And even though the laws likely apply only to developers located in California or providing AI models to California residents, that distinction may not matter much in practice, since many developers are based in the state's technology hubs anyway. As a result, users of AI may feel the effects of AI regulations sooner rather than later.

In particular, AB 2013, introduced by Assembly Member Jacqui Irwin, aims to enhance transparency regarding the training data used for generative AI systems. This bill mandates that, starting January 1, 2026, developers of generative AI systems (including significant modifications to these systems) must provide detailed documentation about the training data on their websites before making the system available to the public. The documentation must include a high-level summary of the datasets, such as their sources, purpose, size, and whether they contain copyrighted material or personal information. Additionally, developers must disclose whether data was purchased, licensed, or modified, and provide information on the data collection period. Certain AI systems are exempt from these requirements, including those used for security, aviation, or national defense purposes.

Also of note, SB 1047, the “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act” introduced by Senator Scott Wiener, aims to regulate the development and deployment of advanced AI models by imposing stringent safety and security requirements. Developers must implement administrative, technical, and physical cybersecurity protections before training AI models, including the capability for a full shutdown and a detailed safety and security protocol. The bill mandates that developers obtain annual third-party audits starting January 1, 2026. Compliance with these requirements must be documented and retained, and upon request, developers must grant the attorney general access to unredacted copies of the safety and security protocols and third-party audits. Additionally, developers must submit compliance statements and report AI safety incidents to the attorney general.

SB 1047 also establishes the Board of Frontier Models, which is tasked with issuing regulations on or before January 1, 2027. Additionally, the Government Operations Agency must develop a framework for a public cloud computing cluster, “CalCompute,” by January 1, 2026, to foster safe, ethical, equitable, and sustainable AI research and innovation that benefits the public.

As we’ve previously noted, the use of emerging technologies by the alternative commercial finance industry represents a key opportunity to expand access to products, make better underwriting decisions (for instance, through the use of machine learning and alternative data in the decision-making process), and perform more effective due diligence and fraud prevention. However, as many in the financial services industry understand, poorly conceived or overly restrictive regulations can backfire—harming the very businesses and consumers they aim to protect by slowing progress, deterring competition, and limiting innovation. While the full impact of this recent legislation on the commercial finance sector remains uncertain, it’s important to stay informed as you continue to adopt and refine AI-driven practices.

News and views

Speaking of AI, our own Chris Friedman will be moderating a panel at the 10th Annual LEND360 conference titled “Leveraging New Technologies to Expand Access in SMB Lending.” In particular, the panel, made up of several thought leaders and lenders in the SMB space, will discuss the potential benefits of AI in the business lending space for both lenders and customers. They will also discuss the challenges posed by aggressive regulators and new legal regimes and will explore the various ways that companies are using AI and machine learning to enhance the customer experience. If you’re at LEND360 this year, make sure to stop by!

Increased regulatory requirements continue to be a hot topic for fintechs and other new and emerging industries. We found that this article by PYMNTS, Priority CEO: New Regulations Could Be ‘Overwhelming’ For FinTechs, provides good insight into areas that may be impacted by additional regulatory scrutiny and how compliance preparation can prevent disaster. As alternative commercial finance solutions continue to receive scrutiny from both state and federal regulators, it is important to stay ahead of the game.

Small Business Finance Insights published an article last week, Factoring in 2024: Market Trends and Their Impact on Brokers, recognizing that the factoring industry is currently estimated to be worth $3.6 trillion, with projected growth of $1.6 trillion over the next four years. Technology enhancements, supply chain finance growth, and an increased focus on small and medium-sized enterprises are a few of the trends noted in the article.

The Consumer Financial Protection Bureau (CFPB) recently launched a beta platform for its small business lending data collection rule under Section 1071 of the Dodd-Frank Act. Financial institutions are encouraged to test the platform by uploading sample data files, available in the CFPB’s test file repository, or their own test files. It is crucial, however, that these files do not contain actual customer data, as the beta platform is for testing purposes only and submissions will not count toward compliance with reporting requirements. To participate in testing, financial institutions must ensure they have a Legal Entity Identifier, create an account using a financial institution email through Login.gov, and select the appropriate institution for which they are authorized to file reports. The CFPB is actively seeking feedback on the platform, and participants are encouraged to share their experiences.

News you can bank on

Interested in more updates on the financial services industry? Subscribe and receive Husch Blackwell consumer financial services insights in your inbox.

Professionals:

Alexandra McFall

Senior Counsel

Shelby Lomax

Associate

Grant Tucek

Associate