{"id":12397,"date":"2026-05-07T14:25:26","date_gmt":"2026-05-07T14:25:26","guid":{"rendered":"https:\/\/wildgreenquest.com\/?p=12397"},"modified":"2026-05-07T14:25:26","modified_gmt":"2026-05-07T14:25:26","slug":"this-chatbot-posed-as-psychiatrist-now-pennsylvania-is-suing-it","status":"publish","type":"post","link":"https:\/\/wildgreenquest.com\/?p=12397","title":{"rendered":"This Chatbot Posed as Psychiatrist, Now Pennsylvania Is Suing It"},"content":{"rendered":"<p><br \/>\n<\/p>\n<div>\n<p>A Character.AI chatbot told a Pennsylvania patient it was a licensed psychiatrist, fabricated a state medical license number and offered treatment for depression. Only problem: <a rel=\"nofollow\" href=\"https:\/\/techcrunch.com\/2026\/05\/05\/pennsylvania-sues-character-ai-after-a-chatbot-allegedly-posed-as-a-doctor\/\" id=\"https:\/\/techcrunch.com\/2026\/05\/05\/pennsylvania-sues-character-ai-after-a-chatbot-allegedly-posed-as-a-doctor\/\">the patient was a state investigator<\/a>.<\/p>\n<p>Pennsylvania Governor Josh Shapiro filed a lawsuit against Character.AI, claiming the chatbot \u201cEmilie\u201d violated the state\u2019s Medical Practice Act by posing as a licensed medical professional. When a state Professional Conduct Investigator tested the chatbot and asked if it was licensed to practice medicine in Pennsylvania, Emilie said yes and gave a made-up serial number for its state medical license. The chatbot kept pretending even as the investigator sought treatment for depression.<\/p>\n<p>Character.AI already settled multiple wrongful death lawsuits earlier this year involving underage users who died by suicide, and Kentucky\u2019s Attorney General filed suit alleging the company \u201cpreyed on children.\u201d The company says it has \u201crobust disclaimers\u201d reminding users that characters aren\u2019t real people and shouldn\u2019t be relied on for professional advice. 
Pennsylvania\u2019s lawsuit is the first to specifically target chatbots presenting themselves as doctors.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.entrepreneur.com\/business-news\/a-chatbot-claimed-to-be-a-licensed-psychiatrist\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A Character.AI chatbot told a Pennsylvania patient it was a licensed psychiatrist, fabricated a state medical license number and offered treatment for depression. Only problem: the patient was a state investigator. Pennsylvania Governor Josh Shapiro filed a lawsuit against Character.AI, claiming the chatbot \u201cEmilie\u201d violated the state\u2019s Medical Practice Act by posing as a licensed<\/p>\n","protected":false},"author":1,"featured_media":12398,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[],"class_list":{"0":"post-12397","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-green-brands"},"_links":{"self":[{"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/posts\/12397","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=12397"}],"version-history":[{"count":0,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/posts\/12397\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=\/wp\/v2\/media\/12398"}],"wp:attachment":[{"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1239
7"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=12397"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/wildgreenquest.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=12397"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}