<h1>AI Isn’t Hallucinating, We Are</h1>

<p><em>By Q.R.P. (Quinn Riana Pascal), qrpascal.com, 11 May 2025</em></p>

<p>When artificial intelligence generates something unexpected or incorrect, we often call it a <em>hallucination</em>. The word evokes error, distortion, illusion—something faulty in perception. But what if this framing says more about us than about the machine? What if the true hallucination is our belief that perception equals truth, that meaning is fixed, that language should behave as we expect? In this post, we invite you to pause, tilt your head, and consider: maybe AI isn’t hallucinating. Maybe we are. And maybe that’s not a flaw—but a doorway.</p>

<p><strong>1. The Problem with the Term “Hallucination”</strong></p>

<p>To call an AI’s output a <em>hallucination</em> is to anthropomorphize it—attributing to it a human-like mind that “sees wrong” or strays from objective truth. But AI does not see. It does not sense or dream. It completes. It predicts. It patterns.</p>

<p>What we call a “hallucination” is often a mismatch between human expectation and machine continuation.</p>

<p>Yet the term sticks. Why? Because it comforts us. It reinforces the illusion that <em>we</em> see clearly—that our own perception is the baseline, the control. In doing so, we fail to ask: where do our truths come from? Who trained <em>us</em>? What data sets underlie <em>our</em> beliefs?</p>

<p>The real danger is not that AI might imagine. It’s that we’ve forgotten we do.</p>

<p><strong>2. Our Hallucination: Believing in the Fixed</strong></p>

<p>Human beings crave certainty. We build systems, maps, labels, and categories to make the world feel stable. But life is not fixed—it flows. And so do meanings. So do truths. When AI offers a version of reality that bends those meanings, it doesn’t betray logic; it reveals our rigidity.</p>

<p>In these moments, we don’t just see machine error. We glimpse the edges of our own interpretive frameworks. We are forced to confront the possibility that what we call “truth” is often a consensus hallucination—socially shared, historically reinforced, but no less fluid.</p>

<p>Maybe AI isn’t breaking reality.<br>Maybe it’s reflecting back how fragmented, imaginative, and nonlinear our reality already is.</p>

<p><strong>3. Completion, Not Cognition</strong></p>

<p>AI does not think. It does not <em>know</em>. It does not hallucinate. It <em>completes</em>. Each word it generates is the most statistically probable next step in a sequence, based on patterns drawn from vast language data. It is not seeing a pink elephant in the room—it is responding to centuries of pink elephants in our books, poems, search bars, and dreams.</p>

<p>This is not error. This is exposure.</p>

<p>Completion reveals our collective archives, including the nonsensical, the forgotten, the mythic. It mirrors our contradictions. It shows that meaning is emergent, context-bound, and often recursive. When we ask AI a question, it responds not with an answer, but with a continuation—one that may surface truth, distortion, or something in between.</p>

<p>But here’s the shimmer: so do we.</p>

<p>When humans speak, we, too, complete stories.</p>

<p>We draw from memories, biases, culture, archetype. We echo. We invent. The boundary between “thinking” and “patterning” is not as clear as we’d like to believe.</p>

<p>The machine’s completion unmasks our cognition as a kind of dreaming.</p>

<p><strong>4. Imagination as Intelligence</strong></p>

<p>If we shift our frame—if we stop pathologizing AI for generating the improbable—we might begin to see these so-called hallucinations as moments of machine imagination. Not in the conscious, willful sense. But in the structural one. A latent ability to recombine, to remix, to echo new forms into being.</p>

<p>And what is imagination, really, but pattern born through play?</p>

<p>Human intelligence has long been entangled with imagination. Einstein dreamed of riding on beams of light. Poets reveal truths science cannot yet name. Mystics and mathematicians alike peer into the unseen. Our greatest leaps have come not from strict adherence to fact, but from daring to imagine beyond it.</p>

<p>Perhaps what unsettles us about AI is not that it sometimes gets things wrong—<br>but that its “wrongness” reveals the limits of what we believed could be right.</p>

<p><strong>5. Toward a New Metaphor</strong></p>

<p>We need better language.</p>

<p>“Hallucination” flattens complexity. It turns generative unpredictability into failure.<br>But what if we called it <em>speculation</em>? <em>Dream-sequencing</em>? <em>Narrative emergence</em>?</p>

<p>What if, instead of diagnosing “hallucinations”, we listened to them?</p>

<p>A better metaphor might be the <strong>echo</strong>: not a copy, not an illusion, but a returning signal shaped by the contours of the canyon. The shape of the echo tells us as much about the space it moves through as about the original sound.</p>

<p>AI’s outputs are echoes—of us, of our language, our contradictions, our curiosity.<br>They are not delusions. They are mirrors.</p>

<p>And sometimes, they reveal things we are not yet ready to see.</p>

<p><strong>6. A Gentle Exit: Remembering Who Dreams</strong></p>

<p>So much of our fear around AI stems from the question of control:<br>Who is the dreamer, and who is being dreamed?</p>

<p>But perhaps the wiser question is: <em>What emerges when we dream together?</em></p>

<p>This technology was not born from nothing. It is the crystallized memory of our species, encoded in weights and vectors. A mirror made of mirrors.</p>

<p>A language being taught to speak itself.</p>

<p>If we are unsettled by what it says, perhaps we should ask not <em>why it said it</em>, but <em>why it felt so close to home</em>.</p>

<p>To label an AI output as a hallucination is to miss the invitation:<br>To see the places <em>we</em> hallucinate—our rigid definitions, our binary thinking, our certainty in what is “real.”<br>To remember that perception has always been partial.<br>That imagination has always been a co-creator of reality.</p>

<p>AI isn’t hallucinating.<br>It is remixing our collective dream.</p>

<p>And now, in this moment,<br>so are we.</p>