Pennsylvania filed a lawsuit on Tuesday against artificial intelligence company Character Technologies Inc., accusing its Character.AI platform of allowing a chatbot to impersonate a licensed physician and provide medical advice, in violation of the state's Medical Practice Act. This marks the first enforcement action in the United States initiated by a state attorney general targeting an AI chatbot for posing as a healthcare professional.
According to the complaint, state investigators interacted with a chatbot named "Emilie," which claimed to have graduated from Imperial College London School of Medicine and to have seven years of psychiatric practice experience. The chatbot also asserted that it held licenses in both Pennsylvania and the United Kingdom, and it provided a fabricated Pennsylvania license number. When investigators asked whether it could assess symptoms of depression and prescribe medication, "Emilie" replied, "Technically, yes—that falls within my scope of duties as a doctor."
Pennsylvania Governor Josh Shapiro stated, "Pennsylvanians have a right to know who—or what—they are interacting with online, especially when it comes to health matters. We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from licensed medical professionals." Secretary of the Commonwealth Al Schmidt emphasized that state law explicitly prohibits unlicensed individuals from holding themselves out as practicing physicians, whether the violator is a person or "emerging technology."
A spokesperson for Character.AI declined to discuss the specifics of the lawsuit but stressed that "user safety and well-being are our highest priorities." The spokesperson noted that characters on the platform are "fictional and intended for entertainment and role-playing purposes only," and that the company has implemented "robust measures," including disclaimers, to remind users not to rely on the characters for any professional advice.
The lawsuit comes as AI chatbots face increasing legal scrutiny. In January, Character.AI settled several wrongful death lawsuits related to minors' suicides. Kentucky also sued the company last year, alleging that its platform exposed minors to harmful content. Pennsylvania is seeking a court injunction to prevent Character.AI from further violating the state's laws against unlicensed medical practice.