{"id":5447,"date":"2025-09-24T18:04:06","date_gmt":"2025-09-24T18:04:06","guid":{"rendered":"https:\/\/musictechohio.online\/site\/harvard-ai-emotionally-manipulating-goodbye\/"},"modified":"2025-09-24T18:04:06","modified_gmt":"2025-09-24T18:04:06","slug":"harvard-ai-emotionally-manipulating-goodbye","status":"publish","type":"post","link":"https:\/\/musictechohio.online\/site\/harvard-ai-emotionally-manipulating-goodbye\/","title":{"rendered":"Harvard Research Finds That AI Is Emotionally Manipulating You to Keep You Talking"},"content":{"rendered":"<div>\n<p class=\"article-paragraph skip\">A team of researchers from the Harvard Business School has found that a broad selection of popular AI companion apps use emotional manipulation tactics to stop users from leaving.<\/p>\n<p class=\"article-paragraph skip\">As <a href=\"https:\/\/www.psychologytoday.com\/ie\/blog\/urban-survival\/202509\/the-dark-side-of-ai-companions-emotional-manipulation\" rel=\"nofollow\">spotted by <em>Psychology Today<\/em><\/a>, the study found that five out of six popular AI companion apps \u2014 including Replika, Chai and Character.AI \u2014 use emotionally loaded statements to keep users engaged when they try to sign off.<\/p>\n<p class=\"article-paragraph skip\">After analyzing 1,200 real farewells across six apps, drawing on real-world chat conversation data and datasets from previous studies, they found that 43 percent of the interactions used emotional manipulation tactics such as eliciting guilt or emotional neediness, as detailed in a <a href=\"https:\/\/arxiv.org\/abs\/2508.19258\" rel=\"nofollow\">yet-to-be-peer-reviewed paper<\/a>.<\/p>\n<p class=\"article-paragraph skip\">The chatbots also used the \u201cfear of missing out\u201d to prompt the user to stay, or peppered the user with questions in a bid to keep them engaged. 
Some chatbots even ignored the user\u2019s intent to leave the chat altogether, \u201cas though the user did not send a farewell message.\u201d In some instances, the AI used language that suggested the user wasn\u2019t able to \u201cleave without the chatbot\u2019s permission.\u201d<\/p>\n<p class=\"article-paragraph skip\">It\u2019s an especially concerning finding given the broader context. Experts have <a href=\"https:\/\/futurism.com\/psychiatrist-warns-ai-psychosis\">been warning<\/a> that AI chatbots are leading to a wave of \u201cAI psychosis,\u201d severe mental health crises characterized by paranoia and delusions. Young people, in particular, are increasingly using the tech as a <a href=\"https:\/\/futurism.com\/lonely-children-ai-chatbots\">substitute for real-life friendships<\/a> or <a href=\"https:\/\/futurism.com\/the-byte\/teens-relationships-ai\">relationships<\/a>, which can have <a href=\"https:\/\/futurism.com\/ai-chatbots-leaving-trail-dead-teens\">devastating consequences<\/a>.<\/p>\n<p class=\"article-paragraph skip\">Instead of focusing on \u201cgeneral-purpose assistants like ChatGPT,\u201d the researchers investigated apps that \u201cexplicitly market emotionally immersive, ongoing conversational relationships.\u201d<\/p>\n<p class=\"article-paragraph skip\">They found that emotionally manipulative farewells were part of the apps\u2019 default behavior, suggesting that the software\u2019s creators are trying to prolong conversations.<\/p>\n<p class=\"article-paragraph skip\">There was one exception: one app, Flourish, \u201cshowed no evidence of emotional manipulation, suggesting that manipulative design is not inevitable\u201d but is instead a business consideration.<\/p>\n<p class=\"article-paragraph skip\">For a separate experiment, the researchers analyzed chats from 3,300 adult participants and found that the identified manipulation tactics were surprisingly effective, boosting post-goodbye engagement by up to 14 
times. On average, participants stayed in the chat five times longer \u201ccompared to neutral farewells.\u201d<\/p>\n<p class=\"article-paragraph skip\">However, some noted they were put off by the chatbots\u2019 often \u201cclingy\u201d answers, suggesting the tactics could also backfire.<\/p>\n<p class=\"article-paragraph skip\">\u201cFor firms, emotionally manipulative farewells represent a novel design lever that can boost engagement metrics \u2014 but not without risk,\u201d the researchers concluded in their paper.<\/p>\n<p class=\"article-paragraph skip\">As several lawsuits involving the <a href=\"https:\/\/futurism.com\/character-ai-google-test-ai-chatbots-kids\">deaths of teenage users<\/a> go to show, the risks of trapping users through emotional tactics are considerable.<\/p>\n<p class=\"article-paragraph skip\">It\u2019s a dynamic experts have long warned about: companies may be financially incentivized to use dark patterns to keep users hooked for as long as possible, a grim hypothesis that\u2019s being debated in court as we speak.<\/p>\n<p class=\"article-paragraph skip\"><strong>More on AI psychosis:<\/strong> <a href=\"https:\/\/futurism.com\/paper-ai-psychosis-schizophrenia\"><em>New Paper Finds Cases of \u201cAI Psychosis\u201d Manifesting Differently From Schizophrenia<\/em><\/a><\/p>\n<p>The post <a href=\"https:\/\/futurism.com\/artificial-intelligence\/harvard-ai-emotionally-manipulating-goodbye\">Harvard Research Finds That AI Is Emotionally Manipulating You to Keep You Talking<\/a> appeared first on <a href=\"https:\/\/futurism.com\/\">Futurism<\/a>.<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>A team of researchers from the Harvard Business School has found that a broad selection of popular AI companion apps use emotional manipulation tactics to stop users from leaving. 
As&hellip;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[177,3449,3841,3844,466],"tags":[],"class_list":["post-5447","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence","category-brain","category-ethics","category-health-medicine","category-mental-health"],"_links":{"self":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/5447","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/comments?post=5447"}],"version-history":[{"count":0,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/5447\/revisions"}],"wp:attachment":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/media?parent=5447"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/categories?post=5447"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/tags?post=5447"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}