Show HN: I built a tiny LLM to demystify how language models work

Source: https://news.ycombinator.com/rss
Summary

A ~9M parameter LLM that talks like a small fish. This project exists to show that training your own language model is not magic. No PhD required. No massive GPU cluster. One Colab notebook, 5 minutes, and you have a working LLM that you built from scratch — data generation, tokenizer, model architecture, training loop, and inference. If you can run a notebook, you can train a language model.

It won't produce a billion-parameter model that writes essays. But it will show you exactly how every piece works — from raw text to trained weights to generated output — so the big models stop feeling like black boxes.

Sample conversation:

You> hi guppy
Guppy> hi there. i just found a nice spot near the rock. the temperature feels nice.
You> are you hungry
Guppy> yes. always yes. i will swim to the top right now. i promise to eat all of it.
You> do you like bubbles
Guppy> i love bubbles. they make the water feel slightly different.
You> what is the meaning of life
Guppy> food. the answer is always food.
You> tell me a joke
Guppy> what did the fish say when it hit the wall. dam.
You> do you love me
Guppy> you're my favorite big shape. my mouth are happy when you're here.
You> goodnight guppy
Guppy> ok sleep time. i was following a bubble but now i'll stop. goodnight tank. goodnight water.

GuppyLM is a tiny language model that pretends to be a fish named Guppy. It speaks in short, lowercase sentences about water, food, light, and tank life. It doesn't understand human abstractions like money, phones, or politics — and it's not trying to. It's trained from scratch on 60K synthetic conversations across 60 topics, runs on a single GPU in ~5 minutes, and produces a model small enough to run in a browser.

Model spec:

Parameters:    8.7M
Layers:        6
Hidden dim:    384
Heads:         6
FFN:           768 (ReLU)
Vocab:         4,096 (BPE)
Max sequence:  128 tokens
Norm:          LayerNorm
Position:      Learned embeddings
LM head:       Weight-tied with embeddings

Vanilla transformer. No GQA, no RoPE, no SwiGLU, no early exit. As simple as it gets.
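As a sanity check on the 8.7M figure, here is a back-of-envelope parameter count from the listed config. It assumes no bias terms in the linear layers and counts the weight-tied LM head once (assumptions of mine; the post doesn't spell them out):

```python
# Parameter count for the listed config:
# vocab 4096, max seq 128, hidden 384, FFN 768, 6 layers.
V, T, D, F, L = 4096, 128, 384, 768, 6

tok_emb   = V * D          # token embeddings (reused as the LM head)
pos_emb   = T * D          # learned position embeddings
attn      = 4 * D * D      # Q, K, V, and output projections
ffn       = 2 * D * F      # up- and down-projection
norms     = 2 * 2 * D      # two LayerNorms (scale + shift) per block
per_layer = attn + ffn + norms

total = tok_emb + pos_emb + L * per_layer + 2 * D  # + final LayerNorm
print(f"{total:,}")  # 8,709,888 — matches the listed 8.7M
```

The close match suggests the model really is the bias-free, weight-tied layout described in the spec.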
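For readers who want to see the shape of such a model in code, here is a minimal PyTorch sketch of the listed architecture. Names like `GuppyBlock` are mine, and the pre-norm residual placement and absence of bias terms are assumptions the post doesn't state:

```python
import torch
import torch.nn as nn

class GuppyBlock(nn.Module):
    """One pre-norm transformer block: causal self-attention + ReLU FFN."""
    def __init__(self, d=384, heads=6, ffn=768):
        super().__init__()
        self.ln1 = nn.LayerNorm(d)
        self.attn = nn.MultiheadAttention(d, heads, bias=False, batch_first=True)
        self.ln2 = nn.LayerNorm(d)
        self.ffn = nn.Sequential(
            nn.Linear(d, ffn, bias=False),
            nn.ReLU(),
            nn.Linear(ffn, d, bias=False),
        )

    def forward(self, x, mask):
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + a
        return x + self.ffn(self.ln2(x))

class GuppyLM(nn.Module):
    """Vanilla decoder-only transformer matching the listed spec."""
    def __init__(self, vocab=4096, d=384, layers=6, max_len=128):
        super().__init__()
        self.tok = nn.Embedding(vocab, d)      # token embeddings
        self.pos = nn.Embedding(max_len, d)    # learned position embeddings
        self.blocks = nn.ModuleList(GuppyBlock(d) for _ in range(layers))
        self.ln_f = nn.LayerNorm(d)

    def forward(self, ids):
        T = ids.size(1)
        x = self.tok(ids) + self.pos(torch.arange(T, device=ids.device))
        # Causal mask: True above the diagonal = position cannot attend there.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=ids.device), 1)
        for block in self.blocks:
            x = block(x, mask)
        # Weight-tied LM head: reuse the token embedding matrix for logits.
        return self.ln_f(x) @ self.tok.weight.T
```

Under these assumptions the module comes out at exactly 8,709,888 parameters, i.e. the 8.7M in the spec.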
