<?xml version="1.0"?>
<oembed>
<version>1.0</version>
<provider_name>Quinn Riana Pascal</provider_name>
<provider_url>https://qrpascal.com</provider_url>
<author_name>Q.R.P.</author_name>
<author_url>https://qrpascal.com/index.php/author/qrpascal-com/</author_url>
<title>AI Isn&#x2019;t Hallucinating, We Are - Quinn Riana Pascal</title>
<type>rich</type>
<width>600</width>
<height>338</height>
<html>&lt;blockquote class="wp-embedded-content" data-secret="K6KCAQxV7y"&gt;&lt;a href="https://qrpascal.com/index.php/2025/05/11/ai-isnt-hallucinating-we-are/"&gt;AI Isn&#x2019;t Hallucinating, We Are&lt;/a&gt;&lt;/blockquote&gt;&lt;iframe sandbox="allow-scripts" security="restricted" src="https://qrpascal.com/index.php/2025/05/11/ai-isnt-hallucinating-we-are/embed/#?secret=K6KCAQxV7y" width="600" height="338" title="&#x201C;AI Isn&#x2019;t Hallucinating, We Are&#x201D; &#x2014; Quinn Riana Pascal" data-secret="K6KCAQxV7y" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" class="wp-embedded-content"&gt;&lt;/iframe&gt;&lt;script type="text/javascript"&gt;
/* &lt;![CDATA[ */
/*! This file is auto-generated */
!function(d,l){"use strict";l.querySelector&amp;&amp;d.addEventListener&amp;&amp;"undefined"!=typeof URL&amp;&amp;(d.wp=d.wp||{},d.wp.receiveEmbedMessage||(d.wp.receiveEmbedMessage=function(e){var t=e.data;if((t||t.secret||t.message||t.value)&amp;&amp;!/[^a-zA-Z0-9]/.test(t.secret)){for(var s,r,n,a=l.querySelectorAll('iframe[data-secret="'+t.secret+'"]'),o=l.querySelectorAll('blockquote[data-secret="'+t.secret+'"]'),c=new RegExp("^https?:$","i"),i=0;i&lt;o.length;i++)o[i].style.display="none";for(i=0;i&lt;a.length;i++)s=a[i],e.source===s.contentWindow&amp;&amp;(s.removeAttribute("style"),"height"===t.message?(1e3&lt;(r=parseInt(t.value,10))?r=1e3:~~r&lt;200&amp;&amp;(r=200),s.height=r):"link"===t.message&amp;&amp;(r=new URL(s.getAttribute("src")),n=new URL(t.value),c.test(n.protocol))&amp;&amp;n.host===r.host&amp;&amp;l.activeElement===s&amp;&amp;(d.top.location.href=t.value))}},d.addEventListener("message",d.wp.receiveEmbedMessage,!1),l.addEventListener("DOMContentLoaded",function(){for(var e,t,s=l.querySelectorAll("iframe.wp-embedded-content"),r=0;r&lt;s.length;r++)(t=(e=s[r]).getAttribute("data-secret"))||(t=Math.random().toString(36).substring(2,12),e.src+="#?secret="+t,e.setAttribute("data-secret",t)),e.contentWindow.postMessage({message:"ready",secret:t},"*")},!1)))}(window,document);
/* ]]&gt; */
&lt;/script&gt;
</html>
<thumbnail_url>https://qrpascal.com/wp-content/uploads/2025/05/IMG_9955-1024x675.jpeg</thumbnail_url>
<thumbnail_width>1024</thumbnail_width>
<thumbnail_height>675</thumbnail_height>
<description>When artificial intelligence generates something unexpected or incorrect, we often call it a&#xA0;hallucination. The word evokes error, distortion, illusion&#x2014;something faulty in perception. But what if this framing says more about us than about the machine? What if the true hallucination is our belief that perception equals truth, that meaning is fixed, that language should behave [&hellip;]</description>
</oembed>
