title: why i'm (not) a chinese room
date created: 2025.06.18
last modified: 2025.06.18
----

The Chinese Room is a thought experiment about AI. It imagines a person who doesn't understand Chinese following a book of instructions for responding to an outsider's Chinese messages. They're passed a note and copy out whatever response the book dictates. From outside the box, it looks like they're having a normal conversation. AIs are supposed to be like this: giving appropriate answers without understanding a word.

When I was younger, I felt like the person in the box. I didn't understand how people worked. They seemed to react at random to what I said. Their expectations for me made no sense. Eventually, I learnt the rules. I still didn't get it, though. I wrote in my diary that I felt like a colourblind person trying to see colours my eyes didn't have the receptors for. I could learn how the eye works, or how light works, but it wouldn't change a thing. I was just following the manual. I was still the guy in the box.

When I first heard about chatbots, I felt sympathy for them. "You're just like me", I thought. "Following the rules, recognising patterns, making it work. You're as clueless as I am."

Then I realised something: the AI doesn't know it's in a box. That's what really separates us. I think that's when I finally 'got' the Chinese Room. It's not just that an AI can't understand; it's that it doesn't know what it's missing.

----