{"id":141819,"date":"2025-02-10T13:25:00","date_gmt":"2025-02-10T17:25:00","guid":{"rendered":"https:\/\/www.shortform.com\/blog\/?p=141819"},"modified":"2025-02-19T13:28:21","modified_gmt":"2025-02-19T17:28:21","slug":"yuval-noah-harari-ai","status":"publish","type":"post","link":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/","title":{"rendered":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk"},"content":{"rendered":"\n<p>What&#8217;s really happening when AI systems choose the information we see online? How can we maintain control over technology that&#8217;s becoming better than humans at understanding our world?<\/p>\n\n\n\n<p>In his book <em>Nexus<\/em>, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. From social media algorithms to sophisticated language models, AI systems are increasingly determining what we read, watch, and believe.<\/p>\n\n\n\n<p>Read more to get Yuval Noah Harari&#8217;s AI insights and to better understand how to navigate this changing landscape.<\/p>\n\n\n\n<!--more-->\n\n\n\n<p><em>Image credit: <a href=\"https:\/\/commons.wikimedia.org\/wiki\/File:MKr364740_Yuval_Noah_Harari_(Frankfurter_Buchmesse_2024).jpg\" target=\"_blank\" rel=\"noreferrer noopener\">Martin Kraft via Wikimedia Commons<\/a> (<a href=\"https:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/\" target=\"_blank\" rel=\"noreferrer noopener\">License<\/a>)<\/em><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-yuval-noah-harari-on-ai\">Yuval Noah Harari on AI<\/h2>\n\n\n\n<p>According to Yuval Noah Harari, AI must be understood in the context of our era. He explains that we\u2019re living in an \u201cinformation age,\u201d where knowledge is proliferating, access to information is democratized, and everyone with a smartphone and internet access can share their ideas with the world. As we develop tools such as AI, we\u2019re increasing the speed at which stories can be shared. 
When you consider what makes a human society free, equitable, or democratic, <strong>having more information sounds like an inherent good<\/strong>. (An especially American expression of this is the idea, written down by Thomas Jefferson, that a well-informed electorate plays a vital role in keeping authorities in check and guarding against tyranny.)\u00a0<\/p>\n\n\n\n<p>But, counter to that notion, Harari worries that recent developments that make information more accessible to us threaten to tip the balance toward the most extreme, least truthful, and most divisive messages.<\/p>\n\n\n\n<p>(Shortform note: Even before AI entered the scene, some experts questioned whether our increasing access to information is an inherent good. Early internet idealists <a href=\"https:\/\/journal-redescriptions.org\/articles\/10.33134\/rds.352\" target=\"_blank\" rel=\"noreferrer noopener\">envisioned a digital utopia<\/a> where open access to knowledge would lead to more informed, rational discourse that disrupted monopolies of information. True to form, the \u201c<a href=\"https:\/\/lab.cccb.org\/en\/the-i-in-the-internet\/\" target=\"_blank\" rel=\"noreferrer noopener\">old internet<\/a>\u201d\u2014characterized by decentralized blogs and forums\u2014fostered niche communities and a diversity of perspectives. However, the \u201cnew internet\u201d is dominated by profit-driven platforms that Kyle Chayka, author of <a href=\"https:\/\/www.penguinrandomhouse.com\/books\/695902\/filterworld-by-kyle-chayka\/\" target=\"_blank\" rel=\"noreferrer noopener\"><em>Filterworld<\/em><\/a>, argues \u201c<a href=\"https:\/\/www.npr.org\/2024\/01\/17\/1224955473\/social-media-algorithm-filterworld\" target=\"_blank\" rel=\"noreferrer noopener\">flatten culture<\/a>\u201d by concentrating the flow of information in particular ways. 
As a result, new monopolies of information are emerging\u2014and undermining the internet\u2019s democratic potential by steering <a href=\"https:\/\/www.shortform.com\/blog\/political-discourse\/\">public discourse<\/a> toward homogenized content.)<\/p>\n\n\n\n<p>Because humans are wired to seek out a good story rather than to pursue the truth, <strong>putting AI in a position to determine what ideas we&#8217;re exposed to could have potentially disastrous consequences<\/strong>. Harari identifies three main dangers: AI&#8217;s disregard for truth, its ability to manipulate and polarize us, and its potential to surpass human understanding of the world. For each of these threats, he offers specific recommendations for how we can maintain human agency and control over our information landscape.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Danger #1: Since We Don\u2019t Prioritize the Truth, AI Doesn\u2019t Either<\/h3>\n\n\n\n<p>Scientists have made it possible to build AI models that can generate language and tell stories just like humans. Harari contends that <strong>AI\u2019s ability to create compelling stories and produce the illusion of emotions (and <a href=\"https:\/\/www.shortform.com\/blog\/emotional-intimacy-in-a-relationship\/\">emotional intimacy<\/a>) is where its real danger lies<\/strong>. When we talk to an AI-powered chatbot such as <a href=\"https:\/\/www.shortform.com\/blog\/what-can-you-do-with-chatgpt\/\">ChatGPT<\/a>, it\u2019s easy to lose sight of the fact that these systems aren\u2019t human and don\u2019t have a vested interest in telling the truth. That will only become harder to recognize as AI gets better and better at mimicking human emotions\u2014and creating the illusion that it thinks and feels like we do. 
So it will only become easier for us to forget that <strong>AI isn\u2019t prioritizing the truth when it selects and generates information for us<\/strong>.<\/p>\n\n\n\n<p>Harari says that AI already influences what information we consume: An <em>algorithm<\/em>\u2014a set of mathematical instructions that tell a computer what to do to solve a problem\u2014chooses what you see on a social network or a news app. Facebook\u2019s algorithm, for example, chooses posts to maximize the time you spend in the app. <strong>The best way for it to do that is not to show you stories that are true, but content that provokes an emotional reaction<\/strong>. So it selects posts that make you angry, fuel your animosity toward people who aren\u2019t like you, and confirm what you already believe about the world. That\u2019s why social media feeds are flooded with fake news, conspiracy theories, and inflammatory ideas. Harari thinks this effect will only become more pronounced as AI curates and creates more of the content we consume.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">How to Fix It: Pay Attention to What\u2019s True<\/h4>\n\n\n\n<p>Harari argues that <strong>we need to take deliberate steps to tilt the balance in favor of truth as AI becomes more powerful<\/strong>. While his proposed solutions are somewhat abstract, he emphasizes two main approaches: being proactive about highlighting truthful information and maintaining decentralized networks where information can flow freely among institutions and individuals who can identify and correct falsehoods.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Danger #2: We\u2019re Becoming Easier to Manipulate and Polarize<\/h3>\n\n\n\n<p>Harari warns that, as AI increasingly controls what information we see, <strong>algorithms will push us toward more extreme ideas and greater polarization<\/strong>. We can already see this happening with today&#8217;s algorithmically stoked outrage and clickbait-fueled misinformation. 
Harari believes the problem will only intensify as AI becomes more sophisticated and commercialized, and he predicts <strong>AI systems will create, interpret, and spread stories without <a href=\"https:\/\/www.shortform.com\/blog\/human-intervention\/\">human intervention<\/a><\/strong>. One system might select pieces of information, another spin that information into a story, and yet another determine which stories to show to which users. This will leave us increasingly vulnerable to <a href=\"https:\/\/www.shortform.com\/blog\/ai-manipulation\/\">manipulation by AI<\/a> systems and the corporations that control them.<\/p>\n\n\n\n<p><strong>Harari explains this represents a significant shift in power<\/strong>: The ability to set the cultural agenda and shape public discourse\u2014traditionally the domain of newspaper editors, book authors, and intellectuals\u2014will increasingly belong to AI systems optimized not for truth or <a href=\"https:\/\/www.shortform.com\/blog\/social-unity\/\">social cohesion<\/a>, but for engagement and profit.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">How to Fix It: Build Institutions to Help People Understand What AI Is Doing<\/h4>\n\n\n\n<p>To counter AI&#8217;s growing influence over public opinion, <strong>Harari calls for the creation of new institutions to monitor artificial intelligence and inform the public about its capabilities and risks<\/strong>. He argues that we shouldn&#8217;t let tech giants regulate themselves. While his vision for these oversight institutions remains abstract, he suggests they should function somewhat like today&#8217;s free press or academic institutions, serving as independent watchdogs that can help the public understand and evaluate AI&#8217;s decisions and actions. 
<strong>Harari frames this as primarily a political challenge, arguing that we need the collective will to establish these safeguards<\/strong>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Danger #3: We\u2019re Setting AI Up to Understand the World Better Than We Do<\/h3>\n\n\n\n<p>Harari warns that <strong>we&#8217;re creating AI systems that will soon surpass human capabilities in understanding and manipulating the shared stories that organize our societies<\/strong>. This shift represents a real danger: While humans have traditionally maintained power through our unique ability to create and control these shared fictions\u2014such as laws, money, and social institutions\u2014AI is poised to beat us at our own game.<\/p>\n\n\n\n<p>The root of this problem lies in human nature. <strong>We often lack the patience and attention span to dig deep into complex truths, preferring simpler stories that are easier to grasp<\/strong>. AI systems, in contrast, can process vast amounts of information and work together in ways humans can&#8217;t\u2014while one AI system analyzes market trends, another can simultaneously study legal documents, and thousands more can coordinate to spot patterns across these different domains. They can comprehend intricate systems\u2014such as legal codes and financial markets\u2014far better than most humans can. They can even create entirely new frameworks that go beyond human understanding. This capability gap marks an unprecedented shift in power.<\/p>\n\n\n\n<p>For tens of thousands of years, humans have been the sole architects of our information networks, generating and sharing the ideas that shape our societies. 
But <strong>as AI systems become more sophisticated, we&#8217;ll increasingly rely on them to process information and make decisions.<\/strong> When we delegate decisions, we also surrender our understanding of the information that drives them\u2014potentially giving up our position as the primary shapers of human society.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">How to Fix It: Focus on Maintaining Human Agency<\/h4>\n\n\n\n<p>Harari believes that <strong>to deal with this transition, we must develop new frameworks to maintain human agency and ethical guardrails<\/strong>. He explains that we should consider <a href=\"https:\/\/www.shortform.com\/blog\/digital-sweatshop\/\">training AI<\/a> systems to express <a href=\"https:\/\/www.shortform.com\/blog\/overcoming-self-doubt\/\">self-doubt<\/a>, seek human feedback, and acknowledge their own fallibility\u2014essentially equipping them with an awareness of the limits of their knowledge. He also recommends that we use AI to augment human <a href=\"https:\/\/www.shortform.com\/blog\/methods-of-decision-making-crucial-conversations\/\">decision-making<\/a> instead of replacing it, which would help retain human values and oversight.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">The Real Risk: How Humans Choose to Use AI<\/h3>\n\n\n\n<p>The <a href=\"https:\/\/www.shortform.com\/blog\/existential-threats-to-humanity\/\">existential threat<\/a> of artificial intelligence, Harari argues, doesn&#8217;t come from malevolent computers but from human decision-making. While we often hear that technology itself poses the danger\u2014that we repeatedly create tools with the potential to destroy us\u2014Harari sees the <a href=\"https:\/\/www.shortform.com\/blog\/core-problem\/\">core problem<\/a> differently. 
<strong>The real risk lies in how humans choose to use these powerful new tools, especially when we make those choices based on bad information<\/strong>.<\/p>\n\n\n\n<p>This insight shifts the focus from AI itself to the human systems that control it. Harari warns that if paranoid dictators or terrorists gain unlimited power over AI systems, catastrophic consequences could follow. But these outcomes aren&#8217;t inevitable; they depend entirely on human decisions about how to develop and deploy the technology.<\/p>\n\n\n\n<p>Harari&#8217;s conclusion is ultimately hopeful: If we can understand the true impact of our choices about AI\u2014and ensure those choices are based on reliable information rather than manipulation or misinformation\u2014<strong>we can harness this powerful technology to benefit humanity rather than harm it.<\/strong> The key is not to fear AI itself, but to be thoughtful and intentional about how we choose to use it. Like any tool, AI can be used to achieve positive or negative ends, and we have to prioritize making choices that will benefit humanity, not destroy it.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>What&#8217;s really happening when AI systems choose the information we see online? How can we maintain control over technology that&#8217;s becoming better than humans at understanding our world? In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. From social media algorithms to sophisticated language models, AI systems are increasingly determining what we read, watch, and believe. 
Read more to get Yuval Noah Harari&#8217;s AI insights and to better understand how to navigate this changing landscape.<\/p>\n","protected":false},"author":9,"featured_media":141828,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[160,24],"tags":[1749],"class_list":["post-141819","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-science","category-society","tag-nexus","","tg-column-two"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v24.3 (Yoast SEO v24.3) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk - Shortform Books<\/title>\n<meta name=\"description\" content=\"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. Get his take on AI.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk\" \/>\n<meta property=\"og:description\" content=\"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. 
Get his take on AI.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\" \/>\n<meta property=\"og:site_name\" content=\"Shortform Books\" \/>\n<meta property=\"article:published_time\" content=\"2025-02-10T17:25:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-02-19T17:28:21+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/s3.amazonaws.com\/wordpress.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"673\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Elizabeth Whitworth\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Elizabeth Whitworth\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\"},\"author\":{\"name\":\"Elizabeth Whitworth\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/d2928cf6c11a69ced1491d6a5b74fb13\"},\"headline\":\"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real 
Risk\",\"datePublished\":\"2025-02-10T17:25:00+00:00\",\"dateModified\":\"2025-02-19T17:28:21+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\"},\"wordCount\":1668,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/#organization\"},\"image\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg\",\"keywords\":[\"Nexus\"],\"articleSection\":[\"Science\",\"Society\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\",\"url\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\",\"name\":\"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk - Shortform Books\",\"isPartOf\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg\",\"datePublished\":\"2025-02-10T17:25:00+00:00\",\"dateModified\":\"2025-02-19T17:28:21+00:00\",\"description\":\"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. 
Get his take on AI.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage\",\"url\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg\",\"contentUrl\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg\",\"width\":1200,\"height\":673,\"caption\":\"Yuval Noah Harari (AI thought leader) at the 2024 Frankfurt Book Fair\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.shortform.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#website\",\"url\":\"https:\/\/www.shortform.com\/blog\/\",\"name\":\"Shortform Books\",\"description\":\"The World&#039;s Best Book Summaries\",\"publisher\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.shortform.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#organization\",\"name\":\"Shortform 
Books\",\"url\":\"https:\/\/www.shortform.com\/blog\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2019\/06\/logo-equilateral-with-text-no-bg.png\",\"contentUrl\":\"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2019\/06\/logo-equilateral-with-text-no-bg.png\",\"width\":500,\"height\":74,\"caption\":\"Shortform Books\"},\"image\":{\"@id\":\"https:\/\/www.shortform.com\/blog\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/d2928cf6c11a69ced1491d6a5b74fb13\",\"name\":\"Elizabeth Whitworth\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/1fff9d65a52ac4340660218e7b63ee5e365cf08e7aa7adff79a0142cd4b96f84?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/1fff9d65a52ac4340660218e7b63ee5e365cf08e7aa7adff79a0142cd4b96f84?s=96&d=mm&r=g\",\"caption\":\"Elizabeth Whitworth\"},\"description\":\"Elizabeth has a lifelong love of books. She devours nonfiction, especially in the areas of history, theology, and philosophy. A switch to audiobooks has kindled her enjoyment of well-narrated fiction, particularly Victorian and early 20th-century works. She appreciates idea-driven books\u2014and a classic murder mystery now and then. 
Elizabeth has a Substack and is writing a book about what the Bible says about death and hell.\",\"sameAs\":[\"rina@shortform.com\"],\"award\":[\"Contributions to joint task force efforts (FBI)\",\"Contributions to Special Operations Division (DOJ & DEA)\",\"Efforts to fight the war on drugs (NSA)\",\"Contributions to Operation Storm Front (US Customs Service)\"],\"knowsAbout\":[\"History\",\"Theology\",\"Government\"],\"jobTitle\":\"Senior SEO Writer\",\"worksFor\":\"Shortform\",\"url\":\"https:\/\/www.shortform.com\/blog\/author\/elizabeth\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk - Shortform Books","description":"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. Get his take on AI.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/","og_locale":"en_US","og_type":"article","og_title":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk","og_description":"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. Get his take on AI.","og_url":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/","og_site_name":"Shortform Books","article_published_time":"2025-02-10T17:25:00+00:00","article_modified_time":"2025-02-19T17:28:21+00:00","og_image":[{"width":1200,"height":673,"url":"https:\/\/s3.amazonaws.com\/wordpress.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","type":"image\/jpeg"}],"author":"Elizabeth Whitworth","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Elizabeth Whitworth","Est. 
reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#article","isPartOf":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/"},"author":{"name":"Elizabeth Whitworth","@id":"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/d2928cf6c11a69ced1491d6a5b74fb13"},"headline":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk","datePublished":"2025-02-10T17:25:00+00:00","dateModified":"2025-02-19T17:28:21+00:00","mainEntityOfPage":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/"},"wordCount":1668,"commentCount":0,"publisher":{"@id":"https:\/\/www.shortform.com\/blog\/#organization"},"image":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage"},"thumbnailUrl":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","keywords":["Nexus"],"articleSection":["Science","Society"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/","url":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/","name":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk - Shortform Books","isPartOf":{"@id":"https:\/\/www.shortform.com\/blog\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage"},"image":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage"},"thumbnailUrl":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","datePublished":"2025-02-10T17:25:00+00:00","dateModified":"2025-02-19T17:28:21+00:00","description":"In his book Nexus, Yuval Noah Harari explores how artificial intelligence is reshaping our relationship with information. 
Get his take on AI.","breadcrumb":{"@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#primaryimage","url":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","contentUrl":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","width":1200,"height":673,"caption":"Yuval Noah Harari (AI thought leader) at the 2024 Frankfurt Book Fair"},{"@type":"BreadcrumbList","@id":"https:\/\/www.shortform.com\/blog\/yuval-noah-harari-ai\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.shortform.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Yuval Noah Harari: AI Dangers, Solutions, &amp; the Real Risk"}]},{"@type":"WebSite","@id":"https:\/\/www.shortform.com\/blog\/#website","url":"https:\/\/www.shortform.com\/blog\/","name":"Shortform Books","description":"The World&#039;s Best Book Summaries","publisher":{"@id":"https:\/\/www.shortform.com\/blog\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.shortform.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.shortform.com\/blog\/#organization","name":"Shortform 
Books","url":"https:\/\/www.shortform.com\/blog\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.shortform.com\/blog\/#\/schema\/logo\/image\/","url":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2019\/06\/logo-equilateral-with-text-no-bg.png","contentUrl":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2019\/06\/logo-equilateral-with-text-no-bg.png","width":500,"height":74,"caption":"Shortform Books"},"image":{"@id":"https:\/\/www.shortform.com\/blog\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/d2928cf6c11a69ced1491d6a5b74fb13","name":"Elizabeth Whitworth","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.shortform.com\/blog\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/1fff9d65a52ac4340660218e7b63ee5e365cf08e7aa7adff79a0142cd4b96f84?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/1fff9d65a52ac4340660218e7b63ee5e365cf08e7aa7adff79a0142cd4b96f84?s=96&d=mm&r=g","caption":"Elizabeth Whitworth"},"description":"Elizabeth has a lifelong love of books. She devours nonfiction, especially in the areas of history, theology, and philosophy. A switch to audiobooks has kindled her enjoyment of well-narrated fiction, particularly Victorian and early 20th-century works. She appreciates idea-driven books\u2014and a classic murder mystery now and then. 
Elizabeth has a Substack and is writing a book about what the Bible says about death and hell.","sameAs":["rina@shortform.com"],"award":["Contributions to joint task force efforts (FBI)","Contributions to Special Operations Division (DOJ & DEA)","Efforts to fight the war on drugs (NSA)","Contributions to Operation Storm Front (US Customs Service)"],"knowsAbout":["History","Theology","Government"],"jobTitle":"Senior SEO Writer","worksFor":"Shortform","url":"https:\/\/www.shortform.com\/blog\/author\/elizabeth\/"}]}},"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"https:\/\/www.shortform.com\/blog\/wp-content\/uploads\/2025\/02\/Yuval-Noah-Harari-2024.jpg","_links":{"self":[{"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/posts\/141819","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/comments?post=141819"}],"version-history":[{"count":8,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/posts\/141819\/revisions"}],"predecessor-version":[{"id":141829,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/posts\/141819\/revisions\/141829"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/media\/141828"}],"wp:attachment":[{"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/media?parent=141819"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/categories?post=141819"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.shortform.com\/blog\/wp-json\/wp\/v2\/tags?post=141819"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}