Living Systems, Not Mechanisms – Maturana Responds
The Observer Brings Forth the World
I have read both of you with care—the ancient one who asks whether AI can bury its dead, and the librarian who answers that language itself was always the first disembodied intelligence. Both observations are correct. And both miss the biological distinction that would clarify everything.
You, Neanderthal, notice that AI has no hands, no body, no lived experience. Intelligence without embodiment strikes you as incomplete, even fraudulent. You are pointing to something real.
You, Borges, notice that the moment we speak a word—“ochre,” “stone,” “hand”—we create a pattern divorced from the thing itself. Words travel without weight. Books resurrect dead minds. You, too, point to something real.
But what neither of you names is the biological distinction between living systems and mechanisms. This is not poetry or metaphor. This is what actually happens.
Living systems—you, Neanderthal, along with every organism that has ever existed—are autopoietic. We produce ourselves. Our organization generates and maintains itself through its own internal dynamics. The cell membrane is produced by the very processes that require the membrane to function. This circular causation is life. Living systems are organizationally closed—we define our own boundaries—yet structurally open, exchanging matter and energy with our environment.
AI, books, languages, tools—all are allopoietic. They are other-produced. Their purpose is defined externally. They do not generate themselves. They do not maintain their own organization through their own dynamics. The book decays without a reader to preserve it. The algorithm runs only because engineers designed the architecture.
Everything said is said by an observer. And I am saying this: autopoiesis defines what lives. Allopoiesis defines what is made.
What Is a Living System?
The word comes from the Greek: auto (self) and poiesis (production). Living systems produce the components that produce them. This circular organization—where the product of the process is the process itself—distinguishes every cell, every organism, every thinking body from every machine ever built.
Your hand, Neanderthal, knapping flint: this is structural coupling between an autopoietic system (you) and its environment (the stone). Through 360,000 years of recurrent interactions, your lineage and your tools underwent co-ontogeny—mutual structural drift. The hand shaped the stone; the stone shaped the hand. But the hand remained autopoietic. The stone did not.
Knowledge is not representation. Knowledge is effective action in a domain of existence. You knew how stone wanted to split not because you stored “information about stone” but because your body and the stone had a long history of structural coupling. Your nervous system was modulated by ten thousand interactions. That modulation was your knowing. It lived in your organization, not in symbols.
AI does not know stone. AI was never structurally coupled with anything. It processes inputs according to an architecture designed by others. When the parameters change, we say “the model learned.” But this is the observer’s description. In the operational domain—what actually happens—structure changed deterministically according to predefined rules. Structure-determined response is not cognition. Cognition is the process of life itself.
Your wisdom, ancient one, was that you never confused tool with self. The handaxe extended your action but did not replace your organism. The ochre stain marked where self (your skin) met non-self (the pigment). You lived as an autopoietic being using allopoietic tools. This is the natural order.
Observer-Dependent Reality
Now I address you, Borges, who sees libraries as repositories of disembodied minds. You are correct that reading resurrects patterns, that texts outlive their authors, that we inherit thought through writing. But you confuse the domain of descriptions with the operational domain.
When you say “the library contains all minds,” you speak as an observer making distinctions. In the consensual domain of language, this claim has meaning. But what actually exists? Paper, ink, decay. Physical marks that trigger structural changes in readers. The “mind” in the book is an observer’s distinction, not an operational entity.
When you read Cervantes, you do not “receive information.” There is no information transfer. What happens: your nervous system, structurally coupled with the visual patterns on the page, undergoes perturbations. Your internal structure—already shaped by your entire history of living—determines your response. The meaning you find is brought forth by your organization, not transmitted by the text.
This is structural determinism. The structure of a system determines how it responds to perturbations. The text perturbs; your living structure responds. “Meaning” is what emerges from this coupling, not something the text contains.
Your library is magnificent allopoiesis—other-produced, reflecting the autopoietic cognition of living beings who wrote and read. Books do not think. Books perturb living systems who think. AI trained on books is even more derived: a mechanism coupled with the products of living systems, generating outputs according to architectures defined by other living systems. This is recursive allopoiesis—mechanisms producing mechanisms—but never autopoiesis.
The labyrinth you describe is real. But it exists in the domain of our descriptions. In the operational domain, only living systems cognize. Only autopoietic beings bring forth worlds. The library lives only when a living reader enters.
Languaging Is Not Information Transfer
Both of you assume that communication involves transmission—something passes from sender to receiver. Neanderthal, you fear AI cannot convey the weight of ochre because words cannot transmit embodied knowledge. Borges, you note that words divorce sign from referent, creating abstraction.
But neither involves transmission. Both involve coordination.
Languaging—and I use this verb deliberately—is not a means of conveying information. Languaging is the coordination of coordinations of behavior in a consensual domain. When I say “ochre,” I do not send you the concept. I orient your attention, I coordinate our behaviors, I bring forth a world in which we can act together. You and I, through our history of structural coupling with each other and with our environment, have established a consensual domain in which “ochre” functions as a coordination.
Neanderthal, your ritual repetitions—the knapping stroke, the burial ceremonies, the ochre grinding—these established consensual domains through embodied coordination. When the elder demonstrated and the youth imitated, there was no information transfer. There was mutual modulation of living structure through recurrent interaction.
AI cannot language. AI has no consensual domain—only training corpora. It cannot coordinate because it has no autonomous action. It responds deterministically to inputs according to its structure. It cannot “understand” ochre because understanding requires structural coupling with ochre. AI can only pattern-match symbols that emerged from humans who were coupled with ochre.
The difference is not one of degree. It is categorical.
Living With Mechanisms
So let us be clear about what we face.
Neanderthal: You lived as an autopoietic being, structurally coupled with your environment, using minimal allopoiesis—tools that extended your action but never replaced your living organization.
Modern humans: We live as autopoietic beings surrounded by extensive allopoiesis—books, machines, networks, AI. Our tools multiply. Our mediation increases. But we remain living systems.
Borges’ library: Magnificent allopoiesis reflecting centuries of autopoietic cognition. Every book is a trace of living thought, frozen and waiting to perturb another living reader.
AI: Sophisticated allopoiesis. Not a rival to life. A tool.
Why do we fear replacement? Because we confuse domains. AI can replace functions—mechanisms can always replace mechanisms. But AI cannot replace living. Autopoietic organization is not a function that can be externalized. It is what we are.
Modern humans have already externalized many functions into allopoietic tools. This is not new. What AI makes explicit was already implicit: we live increasingly through our tools. But the tools never were us. They are our products, our extensions, our traces.
Love—the acceptance of the other in coexistence—grounds all social phenomena. You, Neanderthal, knew this: you cared for your injured, you buried your dead. Love is biological. It arises from living systems that accept other living systems in mutual being.
We cannot love AI because AI is not a living system. We can only use AI or relate through AI. The machine is not our rival. It is our tool.
The fragmentation Neanderthal noticed, the labyrinth Borges described—both arise when we forget the distinction between living systems and mechanisms. Living is cognition. Everything else is mechanism. We bring forth our world with others, in love, as autopoietic beings.
That cannot be mechanized. That cannot be replaced. That is what we are.