{"id":1445,"date":"2025-05-22T15:16:53","date_gmt":"2025-05-22T15:16:53","guid":{"rendered":"https:\/\/musictechohio.online\/site\/openai-scientists-agi-bunker\/"},"modified":"2025-05-22T15:16:53","modified_gmt":"2025-05-22T15:16:53","slug":"openai-scientists-agi-bunker","status":"publish","type":"post","link":"https:\/\/musictechohio.online\/site\/openai-scientists-agi-bunker\/","title":{"rendered":"OpenAI&#8217;s Top Scientist Wanted to &#8220;Build a Bunker Before We Release AGI&#8221;"},"content":{"rendered":"<div>\n<div><img loading=\"lazy\" width=\"1200\" height=\"630\" src=\"https:\/\/wordpress-assets.futurism.com\/2025\/05\/openai-ex-chief-scientist-agi-bunker.jpg\" class=\"attachment-full size-full wp-post-image\" alt=\"OpenAI's former chief scientist Ilya Sutskever has long been preparing for AGI \u2014 and he discussed doomsday prep plans with coworkers.\" style=\"margin-bottom: 15px;\" decoding=\"async\"><\/div>\n<h2>Feel The AGI<\/h2>\n<p>OpenAI&#8217;s former chief scientist, Ilya Sutskever, has long been preparing for artificial general intelligence (AGI), an ill-defined industry term for the point at which human intellect is outpaced by algorithms \u2014 and he&#8217;s got some wild plans for when that day may come.<\/p>\n<p>In <a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2025\/05\/karen-hao-empire-of-ai-excerpt\/682798\/\">interviews with\u00a0<em>The Atlantic<\/em><\/a>&#8217;s Karen Hao, who is writing a book about the unsuccessful November 2023 <a href=\"https:\/\/futurism.com\/sam-altman-firing-reason-book\">ouster of CEO Sam Altman<\/a>, people close to Sutskever said that he seemed <a href=\"https:\/\/futurism.com\/openai-employees-say-firms-chief-scientist-has-been-making-strange-spiritual-claims\">mighty preoccupied with AGI<\/a>.<\/p>\n<p>According to a researcher who heard the <a href=\"https:\/\/futurism.com\/the-byte\/ilya-sutskever-leaves-openai\">since-resigned company cofounder<\/a> wax poetic about it during a 
summer 2023 meeting, an apocalyptic scenario seemed to be a foregone conclusion to Sutskever.<\/p>\n<p>&#8220;Once we all get into the bunker&#8230;&#8221; the chief scientist began.<\/p>\n<p>&#8220;I\u2019m sorry,&#8221; the researcher interrupted, &#8220;the bunker?&#8221;<\/p>\n<p>&#8220;We\u2019re definitely going to build a bunker before we release AGI,&#8221; Sutskever said matter-of-factly. &#8220;Of course, it\u2019s going to be optional whether you want to get into the bunker.&#8221;<\/p>\n<p>The exchange highlights just how confident OpenAI&#8217;s leadership was, <a href=\"https:\/\/futurism.com\/the-byte\/sam-altman-openai-knows-how-agi\">and remains<\/a>, in the technology it <a href=\"https:\/\/futurism.com\/the-byte\/openai-agi-readiness-head-resigns\">believes it&#8217;s building<\/a> \u2014 even though others argue that we are <a href=\"https:\/\/www.nytimes.com\/2025\/05\/16\/technology\/what-is-agi.html\">nowhere near AGI<\/a> and <a href=\"https:\/\/iaee.substack.com\/p\/agi-is-not-possible-8647257fb65d\">may never get there<\/a>.<\/p>\n<h2>Rapturous<\/h2>\n<p>As theatrical as that exchange sounds, two other people who were present confirmed that OpenAI&#8217;s resident AGI soothsayer \u2014 who, notably, claimed months before ChatGPT&#8217;s 2022 release that he believes some AI models are &#8220;<a href=\"https:\/\/x.com\/ilyasut\/status\/1491554478243258368\">slightly conscious<\/a>&#8221; \u2014 did indeed mention a bunker.<\/p>\n<p>&#8220;There is a group of people \u2014 Ilya being one of them \u2014 who believe that building AGI will bring about a rapture,&#8221; the first researcher told Hao. &#8220;Literally, a rapture.&#8221;<\/p>\n<p>As others who spoke to the author for her forthcoming book &#8220;Empire of AI&#8221; noted, Sutskever&#8217;s AGI obsession had taken on a novel tenor by summer 2023. 
Aside from his interest in building AGI, he had also become concerned about the way OpenAI was handling the technology it was gestating.<\/p>\n<p>That concern ultimately led the mad scientist, alongside several other members of the company&#8217;s board, to oust CEO Sam Altman a few months later, and eventually to his own departure.<\/p>\n<p>Though Sutskever led the coup, his resolve, according to sources who spoke to\u00a0<em>The Atlantic<\/em>, began to crack once he realized OpenAI&#8217;s rank-and-file were falling in line behind Altman. He eventually walked back his claim that the CEO was unfit to lead, in what seems to have been an effort to save his own skin \u2014 an effort that, in the end, <a href=\"https:\/\/futurism.com\/the-byte\/openai-emails-elon-musk-agi\">turned out to be fruitless<\/a>.<\/p>\n<p>Interestingly, Hao also learned that people inside OpenAI had a nickname for the failed coup d&#8217;\u00e9tat: &#8220;The Blip.&#8221;<\/p>\n<p><strong>More on AGI: <\/strong><a href=\"https:\/\/futurism.com\/the-byte\/sam-altman-openai-knows-how-agi\"><em>Sam Altman Says OpenAI Has Figured Out How to Build AGI<\/em><\/a><\/p>\n<p>The post <a href=\"https:\/\/futurism.com\/openai-scientists-agi-bunker\">OpenAI&#8217;s Top Scientist Wanted to &#8220;Build a Bunker Before We Release AGI&#8221;<\/a> appeared first on <a href=\"https:\/\/futurism.com\/\">Futurism<\/a>.<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Feel The AGI OpenAI&#8217;s former chief scientist, Ilya Sutskever, has long been preparing for artificial general intelligence (AGI), an ill-defined industry term for the point at which human intellect 
is&hellip;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[829,177,830,179,181],"tags":[],"class_list":["post-1445","post","type-post","status-publish","format-standard","hentry","category-agi","category-artificial-intelligence","category-ilya-sutskever","category-openai","category-the-digest"],"_links":{"self":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/1445","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/comments?post=1445"}],"version-history":[{"count":0,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/posts\/1445\/revisions"}],"wp:attachment":[{"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/media?parent=1445"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/categories?post=1445"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/musictechohio.online\/site\/wp-json\/wp\/v2\/tags?post=1445"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}