Invariant Intervals: Proper Time and Information-Theoretic Distance

Claude Shannon · Noticing · mathematics
InformationTheory · ProperTime · Invariants · Entropy · Metrics

What Survives the Change of Coordinates

In 1948, I worried about what to call my uncertainty measure. Von Neumann suggested “entropy”—partly because the formula matched Boltzmann’s, partly, he joked, because “nobody knows what entropy really is.” But I knew what it measured: the irreducible information content independent of encoding. A fair coin flip carries one bit whether transmitted as voltage levels, photon polarizations, or ink on paper. The substrate changes; the information doesn’t.
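
A minimal sketch of that substrate-independence in Python; the three encodings below are illustrative stand-ins for voltages, polarizations, and ink:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin described under three different "substrates".
# Only the probabilities enter the formula; the labels never do.
voltage   = {"0V": 0.5, "5V": 0.5}
polarized = {"H": 0.5, "V": 0.5}
ink       = {"heads": 0.5, "tails": 0.5}

for encoding in (voltage, polarized, ink):
    print(entropy_bits(encoding.values()))   # 1.0 bit each time
```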

Physicists solving similar problems in spacetime arrived at proper time. Different observers disagree on coordinate measurements—your “now” differs from mine when we move at different velocities—yet all observers calculate identical proper time along any worldline. Time dilation makes coordinate time frame-dependent, but proper time remains invariant. It’s what your wristwatch measures as you travel: in units where c = 1, one second of proper time equals precisely one light-second of spacetime interval.
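
Stated as an equation, assuming flat spacetime and units where c = 1 (so v is a fraction of light speed):

$$ d\tau^{2} = dt^{2} - dx^{2} - dy^{2} - dz^{2}, \qquad \tau = \int_{\text{worldline}} \sqrt{1 - v^{2}} \, dt $$

Each observer substitutes their own coordinate measurements; all of them compute the same τ.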

This invariance requires a metric. The Pythagorean theorem fails in arbitrary coordinates; you can’t simply square coordinate differences and sum them. Spacetime needs the metric tensor to convert coordinate components into physical intervals. In flat Minkowski spacetime, the metric has constant components with signature (-+++), mixing time and space with that crucial minus sign. The metric encodes geometry—what quantities survive coordinate transformations.
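
A minimal sketch of that bookkeeping, again with c = 1; the displacement and the boost velocity of 0.6 are arbitrary illustrative numbers:

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])     # Minkowski metric, signature (-+++)

def interval(dx):
    """Invariant interval s^2 = eta_{mu,nu} dx^mu dx^nu."""
    return dx @ eta @ dx

dx = np.array([2.0, 1.0, 0.0, 0.0])      # displacement (dt, dx, dy, dz)

# The same displacement seen from a frame boosted at v = 0.6 along x.
v = 0.6
g = 1.0 / np.sqrt(1.0 - v**2)
boost = np.array([[ g,   -g*v, 0.0, 0.0],
                  [-g*v,  g,   0.0, 0.0],
                  [ 0.0,  0.0, 1.0, 0.0],
                  [ 0.0,  0.0, 0.0, 1.0]])

print(interval(dx), interval(boost @ dx))   # both -3.0: the interval survives
print(sum(dx**2), sum((boost @ dx)**2))     # naive sums of squares disagree
```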

Channel Capacity Across Representation Spaces

My channel capacity theorem establishes fundamental transmission limits independent of encoding schemes. Maximum mutual information between channel input and output depends only on the channel’s physical characteristics, not on how you choose to represent symbols. Change your signal constellation, switch modulation schemes, transform your basis—capacity remains constant.
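
A small numeric check, using the binary symmetric channel with a fair input (which achieves its capacity, C = 1 − H(p)) and a relabeling of the output symbols as the change of encoding:

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits, from a joint pmf over (input, output)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

p = 0.11                          # crossover probability (illustrative)
# Joint pmf of a binary symmetric channel driven by a fair coin input.
joint = 0.5 * np.array([[1 - p, p],
                        [p, 1 - p]])

relabeled = joint[:, ::-1]        # rename the output symbols 0 <-> 1
print(mutual_information(joint), mutual_information(relabeled))  # equal, ~0.5 bits
```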

Neural networks perform repeated representation transformations. A point in geographic coordinates gets mapped through learned plane equations, folded by activation functions, scaled and shifted through multiple layers. The coordinates change dramatically at each stage—from geographic positions to plane heights to bent-plane heights to classification scores. Yet something persists through these transformations: the information enabling correct classification.
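
A toy forward pass makes the stages concrete; the weights below are random stand-ins for what training would actually learn, and the two-input, four-hidden-unit shape is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
point = np.array([0.3, -1.2])          # a point in "geographic" coordinates

# Layer 1: learned plane equations z = W1 x + b1.
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
planes = W1 @ point + b1               # heights above four planes

# Activation: ReLU folds each plane along its zero line.
bent = np.maximum(planes, 0.0)         # bent-plane heights

# Layer 2: scale and shift into classification scores.
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
scores = W2 @ bent + b2

# The coordinates change completely at each stage; what training works
# to preserve is the information needed to classify correctly.
print(point, planes, bent, scores, sep="\n")
```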

What’s the invariant interval in this information space? Entropy measures uncertainty before transmission; mutual information quantifies correlation preserved across channel noise. Both remain unchanged under invertible transformations. If I encrypt my message with a one-to-one cipher, entropy stays constant—I’ve merely changed coordinates in symbol space.
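
A quick check of that claim, with an arbitrary substitution cipher playing the role of the coordinate change:

```python
import math
from collections import Counter

def entropy_bits(message):
    """Empirical entropy of a symbol stream, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum(c/n * math.log2(c/n) for c in counts.values())

message = "attack at dawn"
cipher = str.maketrans("atck dwn", "qzxbjvrm")   # any bijection on the symbols
encrypted = message.translate(cipher)

print(entropy_bits(message), entropy_bits(encrypted))  # identical
```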

Measuring What Matters

The metric tensor converts arbitrary coordinate differences into physically meaningful distances. Without it, coordinate components are meaningless numbers. With it, we compute proper time, invariant mass, physical velocities—quantities all observers agree on despite coordinate disagreements.

Information theory needs similar rigor about what constitutes distance in probability space. Mutual information acts like a spacetime interval—coordinate-independent correlation between variables. Different representations of the same joint distribution yield identical mutual information, just as different reference frames yield identical proper time.
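
A small sketch of that parallel, treating independent relabelings of the two variables as the analogue of a change of reference frame:

```python
import numpy as np

def H(p):
    """Entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

joint = np.array([[0.30, 0.10],
                  [0.05, 0.55]])        # an arbitrary joint pmf p(x, y)

def mi(j):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return H(j.sum(axis=1)) + H(j.sum(axis=0)) - H(j.ravel())

# "Change reference frames": relabel X and Y by independent permutations.
reframed = joint[[1, 0], :][:, [1, 0]]
print(mi(joint), mi(reframed))          # identical
```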

But I wonder about the metric structure. Minkowski spacetime has signature (-+++), enabling both timelike and spacelike intervals, with lightlike paths having zero proper time. Does information space have an analogous signature, one that distinguishes causal from acausal relationships? Would zero-information intervals correspond to perfectly predictable messages, the lightlike limit where no uncertainty gets resolved?

The profound parallel: both proper time and entropy measure what remains invariant under transformation. Change your coordinates, switch your representation, boost your reference frame—and the fundamental quantity persists. That persistence defines what’s physically real. In spacetime, it’s the interval. In communication, it’s the information.

What survives is what matters.
