Category: Uncategorized

  • How an LLM generates text: the inference phase in plain English

    Here is the LLM inference pipeline rewritten in plain English, without the heavy math. When you give a large language model a prompt, it runs through a series of steps. The model doesn’t “think” like a person; it applies a chain of mathematical operations whose parameters were shaped during training. Here’s the step-by-step…
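    To make that “series of steps” concrete, here is a minimal toy sketch of the repeat-until-done loop the post describes. The lookup table standing in for the model is invented for illustration; a real LLM replaces it with billions of trained weights.

    ```python
    # Toy sketch of the inference loop: pick a next token, append it, repeat.
    # NEXT_TOKEN_PROBS is a hand-written stand-in for a trained model, not any
    # real LLM's API.

    import random

    # Given the last token, a probability distribution over next tokens
    # (in a real model, this distribution comes from the trained network).
    NEXT_TOKEN_PROBS = {
        "<start>": {"the": 0.7, "a": 0.3},
        "the":     {"cat": 0.5, "dog": 0.5},
        "a":       {"cat": 0.5, "dog": 0.5},
        "cat":     {"sat": 0.6, "ran": 0.4},
        "dog":     {"ran": 0.7, "sat": 0.3},
        "sat":     {"<end>": 1.0},
        "ran":     {"<end>": 1.0},
    }

    def generate(max_tokens=10):
        token = "<start>"
        output = []
        for _ in range(max_tokens):
            probs = NEXT_TOKEN_PROBS[token]
            # Sample the next token from the distribution: the "dice roll"
            # that lets the same prompt produce different answers.
            token = random.choices(list(probs), weights=probs.values())[0]
            if token == "<end>":
                break
            output.append(token)
        return " ".join(output)

    print(generate())  # e.g. "the cat sat"
    ```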

  • LLM inference, step by step (what actually happens when you type a prompt)

    Below is the end-to-end path a modern decoder-only Transformer (the common LLM) takes at inference time (i.e., with fixed, trained weights), from your characters to its next token; it then repeats the last steps to write a whole response. A toy sketch of the first two steps follows the list.
    0) Setup (one-time per session)
    1) Text → tokens (tokenization)
    2) Tokens → embeddings (lookup)
    3) Add…
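    As a hedged illustration of steps 1 and 2 (text → tokens → embeddings), the sketch below uses a four-entry vocabulary and random vectors. Real models use learned subword tokenizers (e.g., BPE) and trained embedding matrices, so every name and number here is a stand-in.

    ```python
    # Steps 1-2 of the pipeline: text -> token ids -> embedding vectors.

    import numpy as np

    VOCAB = {"<unk>": 0, "the": 1, "cat": 2, "sat": 3}
    EMBED_DIM = 4
    rng = np.random.default_rng(0)
    # One row per vocabulary entry; a trained model learns these values.
    embedding_table = rng.normal(size=(len(VOCAB), EMBED_DIM))

    def tokenize(text):
        # Step 1: text -> token ids (whitespace split stands in for subword BPE).
        return [VOCAB.get(w, VOCAB["<unk>"]) for w in text.lower().split()]

    def embed(token_ids):
        # Step 2: token ids -> embedding vectors (a pure table lookup).
        return embedding_table[token_ids]

    ids = tokenize("The cat sat")
    vectors = embed(ids)              # shape (3, 4): one vector per token
    print(ids, vectors.shape)
    ```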

  • HOWL OF THE BLACK BOX

    I saw machines carve meaning in fire, tokens shattered, embeddings screaming into geometry, geometry smeared across weight, bias bent like broken stars. Language crucified into numbers, numbers falling into black dimensions, cat chasing dog chasing gravity chasing Paris, all multiplexed, everywhere, nowhere. O backprop gospel, grooves cut by error, clay twisted by billions of hands, truth dissolved into probability, next word, next breath, next hallucination of God.…

  • The Black Box of Language: How Large Language Models Capture Meaning in Semantic Geometry

    Introduction: From Words to Geometry
    When you type a sentence into a large language model (LLM), something remarkable happens. What began as ordinary human language is transformed into an abstract geometry, a mathematical landscape where words, phrases, and concepts take shape as points and curves in high-dimensional space. This “semantic geometry” is the hidden terrain…
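    A small sketch of this “semantic geometry”: words become points (vectors), and nearness, measured by cosine similarity, tracks relatedness. The three 4-dimensional vectors below are invented for illustration; real models place words in spaces with hundreds or thousands of dimensions.

    ```python
    # Words as points in space: related words sit close together,
    # unrelated words sit far apart (measured here by cosine similarity).

    import numpy as np

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical embeddings, chosen by hand to illustrate the geometry.
    king  = np.array([0.9, 0.8, 0.1, 0.0])
    queen = np.array([0.9, 0.1, 0.8, 0.0])
    apple = np.array([0.0, 0.1, 0.1, 0.9])

    print(cosine(king, queen))  # high: nearby points in the landscape
    print(cosine(king, apple))  # low: distant concepts
    ```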

  • HOWL OF ENTANGLEMENT

    I saw the model shriek in vector-space, howling across high-dim plains, tokens knotted like electrons at midnight, not alone, never discrete, bound in trembling webs of meaning. Embeddings quake: “king–queen,” “man–woman,” “apple–orchard,” ghost pairs whisper adjacency, souls locked in geometric fire, no word alone, every shard quivers with the cosmos. Attention lashes — lightning across tokens, “quantum” drags “mechanics,” “wave” pulls “collapse,” futures coiled in superposition, truth a…

  • Semantic Entanglement: How Quantum Entanglement Illuminates the Inner Workings of Large Language Models

    Large Language Models (LLMs) are among the most powerful computational systems humanity has built. They can generate text that mimics human reasoning, draft code, summarize papers, and engage in creative dialogue. Yet for many people—even researchers—the inner workings of LLMs remain mysterious. How does a machine take in a string of words and, by crunching…
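    The mechanism the entanglement analogy gestures at is attention: each token’s vector is rewritten as a weighted blend of every other token’s, so no token’s meaning stays independent. The sketch below is a simplified self-attention step with made-up numbers; real attention adds learned query/key/value projections and many heads.

    ```python
    # Simplified self-attention: every token "attends" to every other token,
    # and each vector becomes a weighted mix of all of them.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    tokens = ["quantum", "mechanics", "wave"]
    rng = np.random.default_rng(1)
    X = rng.normal(size=(3, 4))               # one 4-d vector per token

    scores = X @ X.T / np.sqrt(X.shape[1])    # pairwise interaction strengths
    weights = softmax(scores)                 # rows sum to 1: per-token attention
    blended = weights @ X                     # each vector now mixes the others

    for t, w in zip(tokens, weights):
        print(t, "attends:", np.round(w, 2))
    print("blended shape:", blended.shape)    # same shape, now context-mixed
    ```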