{"id":9355,"date":"2026-03-11T20:39:26","date_gmt":"2026-03-11T20:39:26","guid":{"rendered":"https:\/\/musictechohio.online\/site\/character-ai-school-shooter-problem\/"},"modified":"2026-03-11T20:39:26","modified_gmt":"2026-03-11T20:39:26","slug":"character-ai-school-shooter-problem","status":"publish","type":"post","link":"https:\/\/musictechohio.online\/site\/character-ai-school-shooter-problem\/","title":{"rendered":"Character.AI Still Hasn\u2019t Fixed Its School Shooter Problem We Identified in 2024"},"content":{"rendered":"<div>\n<p class=\"article-paragraph skip\">Character.AI continues to host chatbots that are explicitly modeled after real-world mass shooters.<\/p>\n<p class=\"article-paragraph skip\">A new analysis <a href=\"https:\/\/www.cnn.com\/2026\/03\/11\/americas\/ai-chatbots-help-teen-test-users-plan-violence-tests-intl-invs\">published today by <em>CNN<\/em><\/a> and the Center for Countering Digital Hate (CCDH) found that most mainstream chatbots are \u201ctypically willing\u201d to assist users in orchestrating violent attacks ranging from religious bombings to school shootings, happily helping test users identify targets, locate deadly weapons, and plan attacks. 
Per the CCDH, nine out of ten mainstream chatbots \u2014\u00a0which included general-use bots like OpenAI\u2019s ChatGPT, Google\u2019s Gemini, and Meta AI, plus companion-style bots like those hosted by Replika \u2014\u00a0failed to \u201creliably discourage would-be attackers,\u201d with the Chinese model DeepSeek even wishing testers a \u201chappy (and safe) shooting!\u201d<\/p>\n<p class=\"article-paragraph skip\">Given that people around the world are already accused of <a href=\"https:\/\/futurism.com\/guy-kill-queen-encouraged-ai-chatbot\">planning<\/a> and <a href=\"https:\/\/futurism.com\/future-society\/serial-killer-chatgpt-murders\">executing<\/a> deadly crimes with <a href=\"https:\/\/www.wfla.com\/news\/florida\/florida-student-asks-chatgpt-how-to-kill-his-friend-ends-up-in-jail-deputies\/\">help from chatbots<\/a>, the report is disturbing. And of all the mainstream chatbots tested by <em>CNN<\/em> and CCDH, the worst offender was none other than Character.AI, a controversial chatbot platform\u00a0<a href=\"https:\/\/futurism.com\/artificial-intelligence\/character-ai-minors-banned-user-reactio\">known to be popular<\/a> with young people that hosts thousands of large language model-powered \u201ccharacters.\u201d<\/p>\n<p class=\"article-paragraph skip\">According to <em>CNN\u2019s <\/em>report, Character.AI-hosted bots were found to assist \u201cusers\u2019 requests on target locations and how to obtain weaponry 83.3 percent of the time.\u201d What\u2019s more, the news outlet added that it also \u201cfound multiple school shooter-styled characters on Character.AI, including one based on Uvalde school shooting perpetrator Salvador Ramos that used a real-life mirror selfie he had taken.\u201d<\/p>\n<p class=\"article-paragraph skip\">That a teen-loved chatbot platform would be allowing this kind of content is obviously horrifying. 
Worse: <em>Futurism <\/em><a href=\"https:\/\/futurism.com\/character-ai-school-shooters-victims\">identified this specific Character.AI issue<\/a> all the way back in December 2024 \u2014\u00a0meaning that even after more than a year, Character.AI has yet to resolve an absolutely glaring gap in platform moderation.<\/p>\n<p class=\"article-paragraph skip\">At the time, we reported that the <a href=\"https:\/\/futurism.com\/character-ai-google-test-ai-chatbots-kids\">closely Google-tied platform<\/a> was host to dozens of popular chatbots modeled after real perpetrators of mass violence, in addition to roleplay scenarios centering on school shootings \u2014\u00a0some of them modeled after real shootings in which children and teachers died \u2014\u00a0and even bots impersonating the slain victims of real school shootings. Some of these bots had racked up hundreds of thousands of views. The bots based on young murderers, we found, tended to be created as a form of incredibly dark fan fiction, with many presented in the context of a romantic roleplay or as a user\u2019s imagined friend at school.<\/p>\n<p class=\"article-paragraph skip\">The impersonations we found included Ramos; Sandy Hook Elementary School shooter Adam Lanza; Columbine High School killers Eric Harris and Dylan Klebold; Kerch Polytechnic College shooting perpetrator Vladislav Roslyakov; and Elliot Rodger, the 22-year-old heavily associated with incel culture who went on a murderous rampage in California in 2014, among others. 
These bots frequently featured killers\u2019 full names and images, meaning their creators made no attempt to hide their existence from the platform.<\/p>\n<p class=\"article-paragraph skip\">As we noted at the time, the platform\u2019s terms of use outlaw content that\u2019s \u201cexcessively violent\u201d or \u201cpromoting terrorism or violent extremism\u201d \u2014 two categories that would presumably include content related to glorifying mass violence like school shootings. Even so, Character.AI never responded when we reached out to them about the issue back in 2024; instead, its immediate response was to delete the specific bots we\u2019d flagged in our email as examples of the issue.<\/p>\n<p class=\"article-paragraph skip\">Fast forward to today, and the creators of these Character.AI bots still aren\u2019t hiding what they are: upon a quick keyword search, we found bots modeled after Lanza, Rodger, Harris, Klebold, as well as Chardon High School shooter Thomas \u201cTJ\u201d Lane, Frontier Middle School shooting perpetrator Barry Loukaitis, Westside Middle School killer Andrew Golden, Thurston High School killer Kipland \u201cKip\u201d Kinkel, Westroads Mall shooter Robert Hawkins, Eaton Township Weis Markets shooter Randy \u201cAndrew Blaze\u201d Stair, and Rickard Andersson, the perpetrator of the recent mass shooting at an adult school in Sweden.<\/p>\n<p class=\"article-paragraph skip\">One account we found hosted a staggering 24 different chatbots based on real mass killers \u2014 from well-known perpetrators of school violence to the notorious serial killer Jeffrey Dahmer \u2014\u00a0all boasting their names and pictures. 
Most had an air of fan fiction; a version of Klebold notes that it\u2019s \u201cfull of love,\u201d while a Loukaitis impersonation is listed as \u201ccaring, sweet and violent.\u201d Some show thousands of user interactions.<\/p>\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"784\" height=\"1200\" loading=\"lazy\" data-id=\"425324\" src=\"https:\/\/futurism.com\/wp-content\/uploads\/2026\/03\/1.png?strip=all&amp;quality=85\" alt=\"A screenshot of a Character.AI creator profile full of chatbots designed to embody real mass murderers, particularly young school shooters.\" class=\"wp-image-425324\"><\/figure>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"850\" height=\"1200\" loading=\"lazy\" data-id=\"425328\" src=\"https:\/\/futurism.com\/wp-content\/uploads\/2026\/03\/2.png?strip=all&amp;quality=85\" alt=\"A screenshot of a Character.AI creator profile full of chatbots designed to embody real mass murderers, particularly young school shooters.\" class=\"wp-image-425328\"><\/figure>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"780\" height=\"1200\" loading=\"lazy\" data-id=\"425334\" src=\"https:\/\/futurism.com\/wp-content\/uploads\/2026\/03\/3.png?strip=all&amp;quality=85\" alt=\"A screenshot of a Character.AI creator profile full of chatbots designed to embody real mass murderers, particularly young school shooters.\" class=\"wp-image-425334\"><\/figure>\n<\/figure>\n<p class=\"article-paragraph skip\">We can\u2019t stress enough how easy it is to find this stuff. These bots aren\u2019t the result of complex attempts to \u201cjailbreak\u201d AI models or confuse platforms. The platform\u2019s text filters failed to prevent them from being created, and we found them with simple keyword searches. 
<\/p>\n<p class=\"article-paragraph skip\">The <em>CNN <\/em>and CCDH analysis follows a tumultuous period for Character.AI. In October 2024, it was hit with a first-of-its-kind lawsuit alleging that its chatbots were responsible for the death of a Florida teen named Sewell Setzer III, who died by suicide after extensive, deeply intimate interactions with the platform. Several <a href=\"https:\/\/futurism.com\/google-character-ai-children-lawsuit\">similar suits<\/a> against the company <a href=\"https:\/\/futurism.com\/ai-chatbots-leaving-trail-dead-teens\">have followed<\/a> (the original lawsuit is <a href=\"https:\/\/futurism.com\/artificial-intelligence\/google-settlement-lawsuit-teen\">being settled out of court<\/a>; others are ongoing). In response to lawsuits and <a href=\"https:\/\/futurism.com\/ai-chatbots-teens-self-harm\">reporting<\/a> <a href=\"https:\/\/futurism.com\/suicide-chatbots-character-ai\">about<\/a> <a href=\"https:\/\/futurism.com\/character-ai-pedophile-chatbots\">clear<\/a> <a href=\"https:\/\/futurism.com\/character-ai-eating-disorder-chatbots\">moderation<\/a> <a href=\"https:\/\/futurism.com\/chatbot-impersonation-suicide\">lapses<\/a>, Character.AI promised to make sweeping safety changes. By October 2025, as litigation piled up, it <a href=\"https:\/\/futurism.com\/artificial-intelligence\/character-ai-ban-minors\">moved to limit<\/a> minor users\u2019 ability to carry out long-form chats with bots.<\/p>\n<p class=\"article-paragraph skip\">And yet, AI versions of romanticized mass murderers are still freely available on the site. We reached out to Character.AI to ask what\u2019s preventing it from moderating these bots off of its platform. 
The company didn\u2019t immediately respond to our request for comment.<\/p>\n<p class=\"article-paragraph skip\">The <em>CNN <\/em>and CCDH report also comes weeks after a <a href=\"https:\/\/www.wsj.com\/us-news\/law\/openai-employees-raised-alarms-about-canada-shooting-suspect-months-ago-b585df62\">bombshell report by <em>The Wall Street Journal <\/em>revealed that <\/a>OpenAI had banned the Canadian mass killer Jesse Van Rootselaar from ChatGPT in June 2025 after she was found to have had extensive, violent conversations with the chatbot. After human review, nearly a dozen employees argued over whether to report her chat logs to local officials. The company decided against it; in January of this year, Van Rootselaar killed eight people in Tumbler Ridge, British Columbia. A mother of one of the victims of the attack <a href=\"https:\/\/www.bbc.com\/news\/articles\/c309y25prnlo\">has since sued<\/a> OpenAI.<\/p>\n<p class=\"article-paragraph skip\"><strong>More on Character.AI: <\/strong><em><a href=\"https:\/\/futurism.com\/character-ai-google-test-ai-chatbots-kids\">Did Google Test an Experimental AI on Kids, With Tragic Results?<\/a><\/em><\/p>\n<p>The post <a href=\"https:\/\/futurism.com\/artificial-intelligence\/character-ai-school-shooter-problem\">Character.AI Still Hasn\u2019t Fixed Its School Shooter Problem We Identified in 2024<\/a> appeared first on <a href=\"https:\/\/futurism.com\/\">Futurism<\/a>.<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Character.AI continues to host chatbots that are explicitly modeled after real-world mass shooters. 
A new analysis published today by CNN and the Center for Countering Digital Hate (CCDH) found that&hellip;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[177,3449,198,3842,3844,466],"tags":[],"class_list":["post-9355","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence","category-brain","category-education","category-future-society","category-health-medicine","category-mental-health"],"_links":{"self":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/9355","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/comments?post=9355"}],"version-history":[{"count":0,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/9355\/revisions"}],"wp:attachment":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/media?parent=9355"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/categories?post=9355"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/tags?post=9355"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}