California’s Chatbot Bill May Impose Substantial Compliance Burdens on Many Companies Deploying AI Assistants
What You Need to Know
Key takeaway #1
If SB 243 passes, liability risk arising from the use of chatbots increases.
Key takeaway #2
Careful review and proactive compliance planning are essential.
Client Alert | 4 min read | 09.17.25
California Governor Gavin Newsom has until October 12, 2025, to sign into law a first-in-the-nation bill that will, if enacted, likely impose significant regulatory obligations and litigation risk on companies deploying AI chatbots in California.
Last week, the California Assembly and California State Senate adopted Senate Bill (SB) 243, which aims to regulate “companion chatbots” by targeting AI systems that engage users in ongoing, human-like social interactions. While its authors intend for the law to address risks associated with emotionally engaging chatbots targeting children, the bill’s definition of “companion chatbot” may cover more ground—potentially capturing website chatbots and virtual assistants that serve a variety of seemingly innocuous purposes.
Key Issue: The Definition of a “Companion Chatbot”
SB 243 defines “companion chatbot” as an AI system with “a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user’s social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.”
The law expressly excludes bots “used only for customer service, a business’ operational purposes, productivity and analysis related to source information, internal research, or technical assistance.” (emphasis added to “only”)
But this language leaves room for interpretation. If a chatbot’s use extends beyond these exclusions—such as by engaging users in ongoing dialogue, offering personalized recommendations, or supporting social/emotional needs—it could fall within the law’s scope. Put another way, many bots may be primarily used for an excepted purpose, such as for customer service, but also in a manner “capable of meeting a user’s social needs,” thus falling within the ambit of the bill.
Potential Examples of Covered Chatbots
- Website Chatbots with Persistent Profiles: Many companies deploy chatbots that remember previous user interactions, offer personalized greetings, and provide tailored recommendations. If these bots maintain ongoing relationships or appear “friendly,” they could be considered companion chatbots.
- Customer Engagement Bots Offering Emotional Support: Some brands use chatbots to check in on users, offer wellness tips, or provide encouragement. These bots may go beyond customer service and touch on users’ social or emotional needs.
- Virtual Shopping Assistants: E-commerce sites increasingly use AI assistants that help users navigate complex choices, remember preferences, and engage in multi-session dialogues. If the assistant’s interaction feels anthropomorphic or relationship-building, the AI assistant may be covered.
- Financial Wellness Bots: Banks and fintech firms sometimes offer bots that help users manage stress, set goals, and provide ongoing motivational feedback. If these chatbots sustain relationships and meet social needs, they may be covered.
- Education Platforms with “Study Buddy” Chatbots: Some online learning platforms use chatbots that support students emotionally, encourage persistence, and maintain ongoing dialogue. These bots could qualify as companion chatbots.
Why This Matters: Potentially Significant Litigation Risk and Liability Exposure
SB 243 requires operators of companion chatbots to comply with disclosure, notice, and regulatory reporting obligations. And in some cases, companion chatbot operators must build protocols to limit certain types of dangerous conversations with the chatbot.
Critically, SB 243 also authorizes private lawsuits against operators for violations, with damages set at the greater of actual damages or $1,000 per violation, plus attorney’s fees and costs. This means individual consumers can sue operators and developers, and per-violation statutory damages can accumulate rapidly, particularly across large user bases or in class actions.
Moreover, such private lawsuits could be accompanied by not only consumer class actions but also enforcement actions by state attorneys general and other executive offices. This is especially so given that state attorneys general have expressed concerns about the use of chatbots, and the Federal Trade Commission (FTC) has launched an inquiry into AI chatbots acting as companions.
This law could create substantial risk for companies whose chatbots might fall within the definition of “companion chatbots,” especially if the chatbot’s functions are not strictly limited to customer service, operational, or technical support.
Plaintiffs may argue that any chatbot exhibiting anthropomorphic features or sustaining a relationship across multiple interactions meets the definition, exposing businesses to potentially costly litigation and compliance burdens.
Recommended Actions
If SB 243 becomes law, companies operating chatbots in California should:
- Review all current and planned chatbot deployments for features that could be interpreted as “companion” functions.
- Contact Crowell to assess exposure and consider limiting chatbot functionality or clearly documenting its primary purpose as customer service or technical support.
- Monitor regulatory developments and prepare for compliance with notification, reporting, and audit requirements if covered.
Bottom Line: The broad definition of “companion chatbot” in SB 243 means that many websites and chatbots could be swept into the law’s scope, exposing companies to private actions and significant damages. Careful review and proactive compliance planning are essential.