The Bit That Changed Everything: Quantizing Information

The Unit Information Never Had

Before 1948, information lacked units. You could measure heat in joules, distance in meters, time in seconds. But information? How do you quantify the content of a message, the surprise in an outcome, the uncertainty in a choice?

Shannon defined the bit: binary digit, 0 or 1. The fundamental unit of information. Not chosen arbitrarily: it is the minimum unit of choice. The smallest amount of information that can exist.

This quantification changed everything. Information became measurable, comparable, optimizable. The bit is to information what the atom is to matter.

Why Binary Is Fundamental

One bit equals one binary question answered. Flip a fair coin: heads or tails? That is one bit of information. The outcome resolves one binary uncertainty.

Choose from four equally likely options? That requires two bits. You need two binary questions to distinguish four possibilities: 2² = 4. Choose from eight options: three bits, because 2³ = 8. The pattern is exact: N bits distinguish 2ᴺ possibilities.
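
A small Python sketch makes the counting concrete; bits_needed is an illustrative helper name, and the figures assume all options are equally likely:

    import math

    # Bits needed to distinguish n equally likely options: log2(n)
    def bits_needed(n_options):
        return math.log2(n_options)

    print(bits_needed(2))  # 1.0 -> one coin flip
    print(bits_needed(4))  # 2.0 -> two binary questions
    print(bits_needed(8))  # 3.0 -> three binary questions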

Why binary instead of ternary or decimal? Binary is the minimum. You cannot subdivide a binary choice and still have a choice. It is the atomic unit.

Ternary works mathematically—0, 1, 2 states per position. But binary is physically simpler. Two-state systems dominate nature and engineering: on/off, present/absent, high voltage/low voltage, magnetized north/magnetized south. Distinguishing two states requires minimal precision. Adding a third state demands finer distinctions, increasing noise susceptibility.

Binary representation matches machine capabilities naturally. Early punch cards: hole punched or not punched. Modern circuits: voltage high or low. Light switches: on or off. Every physical implementation reduces to binary distinctions because binary is most robust against noise and degradation.

Francis Bacon understood this in 1605, three centuries before information theory. His bilateral cipher used five binary positions to generate 2⁵ = 32 unique sequences, more than enough for the 26-letter alphabet. He recognized that “objects which are capable of a two-fold difference only” suffice to encode arbitrary messages. Binary sequences, replicated across positions, provide unbounded expressive capacity.
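
A rough Python sketch of the bilateral idea, assuming a modern 26-letter alphabet and an illustrative letter-to-code assignment rather than Bacon's original table:

    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    # Five two-fold symbols per letter, written as A/B in Bacon's style.
    def bilateral_encode(text):
        codes = []
        for ch in text.upper():
            if ch in ALPHABET:
                five_bits = format(ALPHABET.index(ch), "05b")
                codes.append(five_bits.replace("0", "A").replace("1", "B"))
        return " ".join(codes)

    print(bilateral_encode("BIT"))  # AAAAB ABAAA BAABB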

When Everything Becomes 0s and 1s

The bit’s universality is profound. Every form of information reduces to sequences of bits.

Sound: sample amplitude at regular intervals, quantize each sample to finite precision. Each sample becomes a sequence of bits. CD audio uses 16 bits per sample at 44,100 samples per second. Each channel of music becomes 705,600 bits per second; stereo doubles that.
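
The arithmetic, sketched in Python with the CD figures above:

    # CD audio: 16 bits per sample, 44,100 samples per second, per channel
    bits_per_sample = 16
    samples_per_second = 44_100
    channel_rate = bits_per_sample * samples_per_second  # 705,600 bits per second
    stereo_rate = 2 * channel_rate                       # 1,411,200 bits per second
    print(channel_rate, stereo_rate)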

Images: divide into pixels, quantize each pixel’s color. A photograph becomes millions of bits.

Text: assign each character a binary code. ASCII uses 7 bits per character. Unicode uses more. Words become bit sequences.
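
A minimal Python example, printing the 7-bit ASCII codes of three characters:

    # Each character maps to a fixed-length binary code; ASCII fits in 7 bits.
    for ch in "bit":
        print(ch, format(ord(ch), "07b"))
    # b 1100010
    # i 1101001
    # t 1110100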

Video: sequence of images, each image already bits. Add temporal compression. Everything still bits.

DNA: four nucleotides A, C, G, T. Map to two bits each: A=00, C=01, G=10, T=11. The genetic code becomes a bit stream.
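
The same mapping as a short Python sketch; the sequence GATTACA is just an illustrative input:

    # Two bits per nucleotide, using the mapping above
    NUCLEOTIDE_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

    def dna_to_bits(sequence):
        return "".join(NUCLEOTIDE_BITS[base] for base in sequence)

    print(dna_to_bits("GATTACA"))  # 10001111000100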

Quantum states: qubits extend the concept to superpositions of 0 and 1, but measurement still yields classical bits.

Information universality holds: all information, regardless of origin or meaning, decomposes into bits. The bit is the universal currency of information systems.

Quantifying What Was Unquantifiable

The bit made information measurable. Before, you could say one message contained “more” information than another, but you could not quantify how much more. After, you count bits. Precise, comparable, unambiguous.

One bit of entropy means one fair coin flip’s worth of uncertainty. Two bits mean a choice among four equally likely outcomes. The mathematics becomes exact.
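
Shannon’s entropy formula, H = -Σ p·log₂(p) summed over the outcome probabilities, turns those statements into a calculation. A minimal Python sketch, with entropy_bits as an illustrative name:

    import math

    # Shannon entropy in bits: -sum of p * log2(p) over outcome probabilities
    def entropy_bits(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy_bits([0.5, 0.5]))                # 1.0 -> one fair coin flip
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 -> four equally likely outcomes
    print(entropy_bits([0.9, 0.1]))                # ~0.47 -> a biased coin resolves less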

This enabled the digital revolution. Communication systems optimize for bits per second. Storage measures bits retained. Compression eliminates redundant bits. Error correction adds protective bits. Everything becomes quantifiable.

Shannon’s information theory, built on the bit, determined fundamental limits: channel capacity, minimum compression size, maximum error correction. These are not engineering approximations. They are mathematical theorems.
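
One of those limits, the Shannon–Hartley capacity of a band-limited channel with Gaussian noise, fits in a few lines; the 3 kHz bandwidth and 30 dB signal-to-noise ratio below are only illustrative values:

    import math

    # Shannon-Hartley: C = B * log2(1 + S/N), in bits per second
    def channel_capacity(bandwidth_hz, signal_to_noise_ratio):
        return bandwidth_hz * math.log2(1 + signal_to_noise_ratio)

    # A 3 kHz voice channel with a linear signal-to-noise ratio of 1000 (30 dB)
    print(channel_capacity(3_000, 1_000))  # ~29,900 bits per second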

The bit transformed communication from art to science. Simple idea, profound consequences. Everything we now take for granted—digital communication, computing, storage, transmission—starts here.

Information is the resolution of uncertainty. The bit measures that resolution. One binary choice at a time.
