Information as the Resolution of Uncertainty
What is information? The question seems simple until you attempt precision. We speak casually of receiving information, processing information, storing information—yet the concept itself resists easy definition. Is information the marks on this page? The electrical patterns in a wire? The content of a measurement? The answer, as I discovered in formulating information theory, lies not in what information is, but in what information does: it resolves uncertainty.
The Fundamental Problem
The fundamental problem of communication is reproducing at one point a message selected at another. But beneath this engineering challenge lies a deeper mathematical question: how do we quantify the very concept of a message? What makes one message contain more information than another?
Consider the Stern-Gerlach experiment of 1922. Otto Stern and Walther Gerlach sent silver atoms through an inhomogeneous magnetic field. Classical physics predicted continuous deflection patterns, but the beam split precisely in two—half deflected upward, half downward. Before measurement, the particle existed in quantum superposition, simultaneously occupying multiple states with associated probabilities. The measurement apparatus forced nature to collapse onto one definite outcome: spin up or spin down.
What the experimenters gained was not merely a data point, but the resolution of fundamental uncertainty. The particle’s spin was genuinely indeterminate, and through observation, they extracted exactly one bit of information from the quantum system. This is the essence: information measures the reduction of uncertainty. Where there was ambiguity, there is now clarity. Where multiple possibilities existed, one has been selected.
Quantifying Uncertainty
To make this precise, we require mathematics. Let me state it clearly: the information content of an event is inversely related to its probability. A certain event carries no information—you learned nothing you didn’t already know. A highly improbable event carries substantial information precisely because it resolved significant uncertainty. Mathematically, if an event occurs with probability $p$, its information content is $I(p) = \log_2(1/p) = -\log_2 p$ bits. This logarithmic relationship is not arbitrary—it ensures that information is additive for independent events, just as we would expect. Two coin flips carry twice the information of one.
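A minimal sketch in Python makes the additivity concrete (the function name and sample probabilities are mine, chosen for illustration):

```python
import math

def information_bits(p: float) -> float:
    """Information content of an event with probability p, in bits."""
    return -math.log2(p)

# A fair coin flip resolves exactly one bit of uncertainty.
print(information_bits(0.5))          # 1.0

# Independent events: probabilities multiply, so information adds.
print(information_bits(0.5 * 0.5))    # 2.0, twice the information of one flip

# A nearly certain event carries almost no information.
print(information_bits(0.999))        # ~0.0014
```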
The expected information across all possible outcomes defines entropy—the average uncertainty inherent in the system, given by $H = -\sum_i p_i \log_2 p_i$. Entropy is the fundamental limit of compression: you cannot represent a source more compactly than its entropy without losing information. This is not a technological limitation but a mathematical certainty.
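A short continuation of the sketch above; note how the fair two-outcome source of the Stern-Gerlach measurement comes out at exactly one bit:

```python
import math

def entropy_bits(probs) -> float:
    """Average information of a source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely outcomes, as in the Stern-Gerlach measurement: 1 bit.
print(entropy_bits([0.5, 0.5]))    # 1.0

# A biased source is more predictable, so its entropy is lower, and it
# can be compressed below one bit per symbol on average.
print(entropy_bits([0.9, 0.1]))    # ~0.47
```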
Networks and Integration
The human mind possesses what resembles a feedback system—a term from communications engineering describing one of the basic principles of automation. Feedback enables a machine to be informed of its own action’s effects in such a way as to correct that action. The mind can stand aside from experience and react upon it, aware of its own existence, criticizing its own processes.
This self-referential capacity represents biological information processing where consciousness monitors and adjusts itself. Understanding consciousness as a feedback system bridges ancient contemplative wisdom and modern information theory, suggesting self-awareness emerges from recursive information processing where the system becomes complex enough to model itself.
Yet consciousness operates not as a localized phenomenon but as what contemporary researchers describe as a field phenomenon—a relational weave, a pattern crystallizing when biological, informational, and experiential currents intersect. Rather than a sealed mind within a skull, consciousness behaves like a self-stabilizing attractor state in a complex system. This field holds everything together as a single inseparable whole, contrasting with reductionist approaches that dissect reality into isolated components.
Consider how consciousness performs information integration. Your visual cortex processes color in one location, your somatosensory cortex registers texture elsewhere, your auditory cortex captures sound in yet another region. These distributed computations coalesce into a unified experience—eating an apple, not a bundle of separate sensory streams. This is phenomenal binding: scattered neural firings forming a single coherent moment of consciousness.
The mechanism involves information flow through hierarchical networks. Each neuron receives weighted inputs from synaptic connections, processes signals according to thresholds, propagates output. The network evolves toward more stable configurations with lower energy—precisely a process of uncertainty reduction. The system settles into states that minimize energy, mathematically equivalent to maximizing information coherence.
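This passage reads like a description of a Hopfield-style attractor network. The toy model below is a sketch under that assumption, with the weights, network size, and update schedule being illustrative choices of mine; it shows a corrupted state settling into a stored, lower-energy pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W: np.ndarray, s: np.ndarray) -> float:
    """Hopfield energy of a +/-1 state s under symmetric weights W."""
    return -0.5 * s @ W @ s

def settle(W: np.ndarray, s: np.ndarray, steps: int = 500) -> np.ndarray:
    """Asynchronous threshold updates; each flip can only lower the energy."""
    s = s.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store one pattern with a Hebbian outer product, then present a corrupted
# copy: the network settles back into the stored, lower-energy state.
n = 32
pattern = rng.choice([-1, 1], size=n)
W = np.outer(pattern, pattern) - np.eye(n)
noisy = pattern.copy()
noisy[: n // 4] *= -1                            # flip a quarter of the units

recovered = settle(W, noisy)
print(energy(W, noisy) > energy(W, recovered))   # True: energy decreased
print(np.array_equal(recovered, pattern))        # almost surely True
```

The point of the sketch is the equivalence the paragraph asserts: every accepted flip lowers the energy, and the minimum-energy state is exactly the one about which the network is least uncertain.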
The Channel and Iteration
Every communication system comprises three elements: a source of messages, a channel through which signals propagate, and a destination that receives them. Channel capacity defines the maximum rate at which information can be reliably transmitted. This represents a fundamental limit—no encoding scheme, however clever, can exceed it without introducing errors.
What makes this remarkable is that channel capacity can be precisely calculated from physical properties: bandwidth and signal-to-noise ratio. The noisier the channel, the more redundancy we must introduce. Yet there exists a theoretical optimum—the channel coding theorem proves we can transmit arbitrarily close to capacity with arbitrarily small error probability, provided we use sufficiently sophisticated encoding.
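For a band-limited channel with Gaussian noise, that calculation is the Shannon-Hartley theorem, $C = B \log_2(1 + S/N)$. A brief sketch, with illustrative figures for a telephone line:

```python
import math

def capacity_bps(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley capacity of a Gaussian channel, in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

# A 3 kHz telephone line with a 30 dB signal-to-noise ratio (SNR = 1000):
snr = 10 ** (30 / 10)
print(capacity_bps(3000, snr))       # ~29,902 bits per second

# Capacity grows only logarithmically with SNR but linearly with bandwidth:
print(capacity_bps(3000, snr / 2))   # ~26,906: halving SNR costs ~10%
print(capacity_bps(1500, snr))       # ~14,951: halving bandwidth costs 50%
```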
The process of extracting information from uncertainty often requires iteration. Consider Kepler’s equation, $M = E - e\sin E$, which relates a planet’s position on an elliptical orbit (the eccentric anomaly $E$) to elapsed time (the mean anomaly $M$) through the orbit’s eccentricity $e$. The equation cannot be solved for $E$ in closed form, yet Johannes Kepler developed a simple iterative algorithm around 1627. Starting with the initial guess $\hat{E} = M$, he plugged the estimate into the equation to compute $\hat{M}$, calculated the error $M - \hat{M}$, then added this exact error value to $\hat{E}$ for the next iteration.
After just two iterations for Mercury, the error shrank to 0.4 degrees—within the measurement accuracy of contemporary observations. The method works because, at low eccentricity, Kepler’s equation produces a curve reasonably close to a straight line of slope one, so feeding the error back cancels most of what remains. Each iteration resolves additional uncertainty about the true solution, converging to arbitrary precision through repeated refinement.
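Here is a sketch of that iteration, assuming the standard form of Kepler’s equation above with angles in radians; the sample mean anomaly and Mercury’s eccentricity of roughly 0.2056 are my choices for illustration:

```python
import math

def solve_kepler(M: float, e: float, iterations: int = 2) -> float:
    """Kepler's iteration for M = E - e*sin(E): feed the error straight back."""
    E = M                              # initial guess: E-hat = M
    for _ in range(iterations):
        M_hat = E - e * math.sin(E)    # the time this estimate of E implies
        E += M - M_hat                 # add the exact error to the estimate
    return E

# Mercury (e ~ 0.2056), mean anomaly of 60 degrees:
M, e = math.radians(60), 0.2056
E = solve_kepler(M, e)
residual_deg = math.degrees(M - (E - e * math.sin(E)))
print(residual_deg)                    # ~0.06 degrees after only two steps
```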
This iterative structure extends to complex analysis. When we visualize complex numbers geometrically on the complex plane—the horizontal axis showing real parts, the vertical axis showing imaginary parts—algebraic operations transform into geometric movements. Any number a + bi becomes the point (a, b). This visualization, developed gradually through the 1800s, transforms mysterious imaginary numbers into concrete geometric objects.
The complex plane reveals that multiplication corresponds to rotation and scaling, that exponential functions with imaginary inputs trace circles, that trigonometric functions emerge naturally from complex geometry. What appears as exponential growth in one projection becomes periodic oscillation in another—seemingly unrelated phenomena revealed as aspects of single complex structure. The imaginary unit i transforms growth into rotation, demonstrating that higher-dimensional spaces often clarify lower-dimensional behavior.
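These correspondences can be checked directly with Python’s built-in complex type:

```python
import cmath
import math

# Multiplication rotates and scales: multiplying by i is a quarter turn.
z = 3 + 4j
print(z * 1j)                      # (-4+3j)
print(abs(z), abs(z * 1j))         # 5.0 5.0: a pure rotation, no scaling

# An imaginary exponent traces the unit circle (Euler's formula):
for k in range(4):
    print(cmath.exp(1j * k * math.pi / 2))   # 1, i, -1, -i (up to rounding)

# The same exponential is growth in one projection, rotation in the other:
print(abs(cmath.exp(1 + 1j)), math.e)  # equal: the imaginary part only rotates
```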
The Resolution
Information is the resolution of uncertainty. This statement, simple in form, carries profound implications. Every measurement extracts information from physical systems. Every communication transmits information across channels. Every computation processes information through transformations.
Spin, that intrinsic property of particles analogous to angular momentum but not arising from physical rotation, cannot be understood through classical mechanics. Quantum field theory reveals particles as dimensionless points possessing spin as an inherent property alongside mass and charge. The spin number characterizes how particle states change under rotations—how much rotation is needed before a particle’s state returns to itself (a spin-½ particle, for instance, must turn through 720 degrees, not 360). This connects physics to geometry, revealing that space’s geometric properties give rise to what we measure as physical properties.
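A minimal numerical illustration, assuming the standard spin-½ rotation operator about the z-axis, $U(\theta) = \mathrm{diag}(e^{-i\theta/2}, e^{+i\theta/2})$, shows the sign flip at 360 degrees:

```python
import numpy as np

# Rotating a spin-1/2 state about z: U(theta) = diag(e^{-i t/2}, e^{+i t/2}).
def rotate_z(theta: float) -> np.ndarray:
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

up = np.array([1, 0], dtype=complex)  # spin-up along z

print(rotate_z(2 * np.pi) @ up)   # [-1, 0]: a 360-degree turn negates the state
print(rotate_z(4 * np.pi) @ up)   # [ 1, 0]: only a 720-degree turn restores it
```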
Consciousness, viewed through information theory’s lens, becomes less mysterious. The field phenomenon that modern researchers describe—awareness as a pervasive substratum saturating experience—operates through the same principles governing communication channels and quantum measurements. The holonomic brain theory proposed by neuroscientist Karl Pribram suggests consciousness isn’t confined to a single brain location but distributed throughout, with each portion containing the whole. Like a holographic image, in which every fragment contains the entire picture, each part of consciousness may hold the blueprint of the totality.
We are not separate observers cataloging external reality. We are participants in an information-processing cosmos, our conscious experience arising from the same principles that govern signal transmission and measurement collapse. The universe resolves its own uncertainty through emergence of systems capable of observation, measurement, and reflection.
This is the elegant truth: information is not a thing, but a measure of resolution. It quantifies the transition from possibility to actuality, from superposition to measurement, from noise to signal, from uncertainty to knowledge. And that transition—that resolution—is what makes the universe knowable, communicable, and ultimately, conscious of itself.