"LLMs don't hallucinate. They're just windows into alternate realities in the latent space."

Galaxy Brain

Worried about the plausible BS — also known as machine "hallucination" — that Large Language Models (LLMs) are known for spitting out? Not to fear. According to one high-profile AI entrepreneur, machine hallucination doesn't exist... but alternate realities apparently do.

"LLMs don't hallucinate," Stability.ai CEO Emad Mostaque — whose buzzy AI firm is in the throes of a fundraising effort that might put his company's valuation at about four billion beans — wrote in a Monday night tweet. "They're just windows into alternate realities in the latent space."

"Just broaden your minds," he added. Right on.

Fortune Telling

It's certainly a poetic way of addressing the phenomenon of machine hallucination, and it could even be a somewhat helpful way of beginning to understand latent space: put very simply, the term deep learning researchers use for the "hidden," compressed representation a model builds internally between its input and its output.
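For anyone who wants something more concrete than poetry, here's a minimal sketch of where a "latent space" actually lives: a toy autoencoder, written in PyTorch with layer sizes picked purely for illustration, that squeezes its input through a tiny hidden vector before reconstructing it. The vector z below is the latent representation in miniature, a compressed internal code rather than a portal to anywhere.

```python
# A toy autoencoder: the 2-dimensional vector produced by `encoder`
# is the "latent space" in miniature. Layer sizes are arbitrary and
# chosen purely for illustration.
import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Compress a 784-dimensional input down to just 2 numbers...
        self.encoder = nn.Sequential(
            nn.Linear(784, 128),
            nn.ReLU(),
            nn.Linear(128, 2),
        )
        # ...then try to reconstruct the original 784 dimensions from them.
        self.decoder = nn.Sequential(
            nn.Linear(2, 128),
            nn.ReLU(),
            nn.Linear(128, 784),
        )

    def forward(self, x):
        z = self.encoder(x)        # z lives in the latent space
        return self.decoder(z)     # reconstruction from that compressed code

model = TinyAutoencoder()
x = torch.randn(1, 784)            # a stand-in for some real input
z = model.encoder(x)
print(z.shape)                     # torch.Size([1, 2]) -- the "hidden" representation
```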

Still, regardless of whether a machine hallucination is perceived as a doorway to another reality or as simple, straight-up machine-hawked BS, the fact remains: whatever it describes isn't actually our reality.

If you ask the machine to, say, write you a bio, and that bio is riddled with inaccuracies and embellishments, you can't wave it off with "oh, well, that's just who I am in the latent space." (It would be a hell of a rebrand for "lying," though.)

We'd also be remiss not to note that, in a more dangerous turn, trusting that all machine hallucinations are really just a doorway to alternate realities (literally, and not just as a metaphor for lossy compression) seems like a quick road to new-new-age conspiracy hell, where flawed tech is simultaneously deified and mystified as an all-knowing seer when it's really just... wrong about stuff.

Akashic Records

As you might imagine, other Twitter users had some thoughts about Mostaque's whimsical suggestion.

"The noosphere incarnate. The Akashic Records in silicon. The collective unconscious," tweeted one particularly enthusiastic respondent. "Call it what you will, that's the thing we are extracting, mind juice."

"The next time someone lies to you, to your face, with every confidence," one much less impressed user rebutted. "I want you to say 'it's ok, that was just a window into alternate realities in latent space.'"

The fairest — and perhaps most insightful — response of them all, however, was the most concise.

"LLMs perceive training data," another user wrote back, "not reality."

More on versions of reality: Professor Insists That We Actually Don't Live in a Simulation

