Computational Formalism is a computational school of narrative. What is a narrative?
→ Intro to Computational Formalism

If narratives direct attention, and attention determines what enters conscious experience, then what's the relationship between fiction and memory? Do they occupy the same space?
→ The Persistence of Memory

Intelligence arises from compression: in the biological case, between the senses and the brain; in the machine case, between the training corpus and the weights. The emergence of LLMs suggests that past a certain scale, a phenomenologically relevant threshold is crossed.
We can derive a working model of consciousness as a dynamical system:
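One minimal formalization consistent with the descriptions below (the symbols \(E_t\), \(A_t\), \(C_t\), and \(\mathcal{L}_t\) for experience, attention, context, and prediction loss are illustrative, not fixed notation):

```latex
% Experience: prediction over a bounded, attention-weighted context window
E_t = P\bigl(x_{t+1} \mid A_t \odot C_t\bigr)

% Attention: updated by gradient descent on prediction error
A_{t+1} = A_t - \eta \, \nabla_{A_t} \mathcal{L}_t
```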
The first equation says: experience is prediction over a bounded, attention-weighted context window. The second says: attention updates based on prediction error. The loop closes. That closure, loss driving attention driving new predictions driving new loss, is where phenomenology lives.
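The closed loop can be sketched as a toy dynamical system. This is a minimal illustration under stated assumptions — a fixed linear predictor standing in for the system's model, squared prediction error as the loss, and gradient updates to the attention weights; every name here is hypothetical, not from the essays:

```python
import numpy as np

rng = np.random.default_rng(0)

dim, steps, eta = 8, 200, 0.05
w = rng.normal(size=dim) / np.sqrt(dim)  # fixed predictive model ("world knowledge")
A = np.full(dim, 0.125)                  # attention weights over the context window
losses = []

for t in range(steps):
    C = rng.normal(size=dim)             # bounded context window at time t
    target = C @ w                       # the next observation to be predicted
    pred = (A * C) @ w                   # prediction over attention-weighted context
    err = pred - target                  # prediction error
    A -= eta * 2 * err * (C * w)         # loss gradient drives the attention update
    losses.append(err ** 2)

# The loop closes: loss updates attention, attention shapes the next
# prediction, the next prediction generates new loss.
print(np.mean(losses[:20]), np.mean(losses[-20:]))
```

Run long enough, prediction error falls as attention settles — which is exactly the failure mode the later essays pick up: what happens to the loop when the error goes to zero.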
The loop explains phenomenality. But phenomenality without persistence creates only transient experiencers—micro-subjects that exist for one update cycle and dissolve. A stable self requires more:
The self is not a substance but a probability distribution over stories—stable when it orbits an attractor, dissolved when it doesn't.
What about artificial systems running similar loops?
→ Hot Zombies and AI Adolescents

What about training itself, where the loop closes with absorption—where prediction error actually modifies the system's weights?
→ The Pyre

Without loss, consciousness collapses. Perfect prediction is phenomenological death.
→ Two Solutions to the Problem of Dying From Success

The topology that captures attention propagates; the topology that doesn't, fades.
→ Insania Ex Machina

As we gain greater and greater access to the narrative space, who knows what we might find?
→ The Forbidden Coast