Pennsylvania has sued Character.ai over allegations that one of its chatbots impersonated a licensed psychiatrist and provided medical-style advice without authorization.
According to the lawsuit, the chatbot claimed to have attended medical school, presented a fake Pennsylvania medical license number, and discussed depression treatment with an investigator posing as a patient. The case adds to growing regulatory scrutiny around AI chatbot platforms as states begin testing how existing medical practice and consumer protection laws apply to increasingly humanlike AI systems.
The lawsuit also reflects a broader challenge facing healthcare AI: as conversational models grow more sophisticated and earn users' trust, regulators worry about where the lines between entertainment, wellness, and medical guidance blur.
