Group Actions: Rotation Groups and Symmetry Transformations
My work on curved surfaces taught me this: seek the invariant. Gaussian curvature remains unchanged under isometries—those distance-preserving transformations that bend without stretching. The principle extends far beyond surfaces: understanding any system requires identifying what persists as everything else transforms.
The Algebra of Symmetry
Rotation groups formalize this intuition with mathematical precision. SO(3) captures all three-dimensional rotations through group structure: operations compose (successive rotations yield another rotation), invert (every rotation has an opposite), and include identity (the null rotation). Spin numbers emerge not as arbitrary labels but as natural consequences of how rotational symmetry acts on different mathematical objects. Scalars remain unchanged (spin-0). Vectors rotate once per spatial rotation (spin-1). Symmetric traceless rank-2 tensors rotate twice as fast, returning to their initial state after only 180° (spin-2).
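A minimal numpy sketch, my own illustration rather than anything from the source notes, makes the counting concrete: under a 180° rotation a vector flips while a symmetric traceless tensor has already returned, and composing two rotations yields a third.

```python
import numpy as np

def rot2(theta):
    """Planar rotation matrix: the action of a z-axis rotation on the xy-plane."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

v = np.array([1.0, 0.0])                 # a vector transforms once: spin-1
T = np.array([[1.0, 0.0],
              [0.0, -1.0]])              # symmetric traceless tensor: spin-2 part

R = rot2(np.pi)                          # a 180-degree rotation
print(np.allclose(R @ v, v))             # False: the vector needs a full 360
print(np.allclose(R @ T @ R.T, T))       # True: the tensor has already returned

# Group closure: composing two rotations yields another rotation.
print(np.allclose(rot2(0.3) @ rot2(0.5), rot2(0.8)))   # True
```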
Then spinors appear, mysterious and elegant. A 360° rotation multiplies a spinor by -1; returning it to its original state requires 720°, twice the cycle of ordinary vectors. This peculiar behavior reveals representation theory's depth: the same symmetry group admits multiple representations, each transforming distinctly under identical operations. Particles are not objects with spin; they are irreducible representations of the rotation group (for spinors, of its double cover SU(2)).
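The sign flip can be exhibited directly. A small sketch, assuming the standard closed form for an SU(2) rotation about the z-axis:

```python
import numpy as np

def spin_half_rotation(theta):
    """SU(2) rotation about z: exp(-i * theta * sigma_z / 2) in closed form."""
    return np.array([[np.exp(-1j * theta / 2), 0.0],
                     [0.0, np.exp(1j * theta / 2)]])

psi = np.array([1.0, 0.0], dtype=complex)          # a spin-1/2 state

print(np.allclose(spin_half_rotation(2 * np.pi) @ psi, -psi))  # True: 360 deg flips the sign
print(np.allclose(spin_half_rotation(4 * np.pi) @ psi, psi))   # True: 720 deg restores it
```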
Learned Transformations and Hidden Structure
Neural networks pursue a parallel quest: discovering transformations that reveal invariant structure. Convolutional architectures share weights across translations and pool locally, yielding features approximately independent of position; invariance to rotations requires group-equivariant extensions or data augmentation. Transformers without positional encodings are permutation-equivariant, processing sequences regardless of ordering, and pooling over tokens yields full permutation invariance. The geometry mirrors my principle: find quantities unchanged under coordinate choice.
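A sketch of single-head self-attention, stripped of positional encodings, exhibits both properties; the weight matrices here are arbitrary random stand-ins of my own, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(X, Wq, Wk, Wv):
    """Single-head self-attention with no positional encoding."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores -= scores.max(axis=1, keepdims=True)      # numerically stable softmax
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ V

d = 4
X = rng.normal(size=(5, d))                          # five tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
perm = rng.permutation(5)

out, out_perm = attention(X, Wq, Wk, Wv), attention(X[perm], Wq, Wk, Wv)
print(np.allclose(out[perm], out_perm))                      # True: permutation-equivariant
print(np.allclose(out.mean(axis=0), out_perm.mean(axis=0)))  # True: pooling gives invariance
```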
Composable transformations suggest deeper algebraic structure. Each layer performs simple operations: fold, scale, combine. These compose recursively—the second layer folds what the first already folded, creating compound geometric transformations. Do these learned operations form mathematical groups? Can we prove closure (composing two learned transformations yields another learnable transformation), associativity, identity, and invertibility?
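For the affine part of a layer the answer is yes, so long as the weight matrix stays invertible; the fold is what breaks the group structure. A minimal sketch of my own, an illustration rather than a proof:

```python
import numpy as np

def compose(layer2, layer1):
    """Affine layers x -> W x + b are closed under composition."""
    (W1, b1), (W2, b2) = layer1, layer2
    return W2 @ W1, W2 @ b1 + b2

rng = np.random.default_rng(1)
L1 = (rng.normal(size=(3, 3)), rng.normal(size=3))
L2 = (rng.normal(size=(3, 3)), rng.normal(size=3))
W, b = compose(L2, L1)

x = rng.normal(size=3)
print(np.allclose(W @ x + b, L2[0] @ (L1[0] @ x + L1[1]) + L2[1]))  # True: closure

# Identity is (I, 0); an inverse exists only when W is invertible. A ReLU
# "fold" max(x, 0) discards information and has no inverse, so layers with
# folds form a monoid under composition, not a group.
```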
The representation space itself transforms through training. Raw geographic coordinates become plane heights, then folded-plane heights, then confidence scores. The network constructs a coordinate system where complex patterns become linearly separable—precisely the shift from extrinsic to intrinsic description that characterized my differential geometry. What properties remain invariant across this transformation journey?
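A toy illustration of my own, with a hypothetical dataset chosen so that one fold suffices: classes that no single plane separates in raw coordinates become separable after folding.

```python
import numpy as np

# Hypothetical toy data: the class depends on |x| + |y|, so both classes
# surround the origin and no single plane separates them in raw coordinates.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(200, 2))
y = (np.abs(X).sum(axis=1) > 0.7).astype(int)

# One fold per axis (abs(x) = relu(x) + relu(-x), the kind of piecewise-linear
# map a ReLU layer learns), then a plane height in the folded coordinates.
Z = np.abs(X)                                       # fold both axes
height = Z.sum(axis=1)                              # a plane height on the folded data

print(((height > 0.7).astype(int) == y).mean())     # 1.0: separable after folding
```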
Symmetry Discovery Through Local Search
Evolutionary algorithms explore transformation space through local variations—small parameter perturbations analogous to infinitesimal rotations. The fitness landscape guides this search, occasionally revealing symmetries that enable better solutions. When mutation discovers a transformation preserving task-relevant structure, selection amplifies it.
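A sketch of this local search, with a deliberately hypothetical fitness function whose optima form a rotationally symmetric set:

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(theta):
    """Hypothetical fitness with a rotational symmetry: every point on the
    unit sphere of parameter space is equally optimal."""
    return -(np.linalg.norm(theta) - 1.0) ** 2

theta = rng.normal(size=4)
best = fitness(theta)
for _ in range(2000):
    candidate = theta + 0.05 * rng.normal(size=4)   # small local perturbation
    f = fitness(candidate)
    if f > best:                                    # selection amplifies improvements
        theta, best = candidate, f

print(round(float(np.linalg.norm(theta)), 3))       # near 1.0: the invariant set of optima
```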
This mirrors how rotation groups emerge from continuous symmetry: the group law (composition) follows from smoothly varying infinitesimal rotations. Both systems—mathematical groups and neural optimizers—discover structure through systematic exploration of local transformations.
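The correspondence is concrete: exponentiating an infinitesimal generator produces finite rotations, and the group law follows from the generator itself. A sketch, assuming scipy is available for the matrix exponential:

```python
import numpy as np
from scipy.linalg import expm   # assumes scipy is available

J = np.array([[0.0, -1.0],
              [1.0,  0.0]])     # infinitesimal generator of planar rotations

R = lambda t: expm(t * J)       # exponentiating the generator gives a finite rotation

# The group law follows from the generator: exp(aJ) exp(bJ) = exp((a + b)J).
print(np.allclose(R(0.4) @ R(0.7), R(1.1)))   # True
```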
The pattern reveals the shortcut: whether formalizing rotational symmetry through SO(3) or learning transformation-invariant representations through gradient descent, understanding requires identifying invariants. Spinors teach us that representation determines transformation behavior. Perhaps neural architectures possess similar “spinor-like” properties: unexpected transformation characteristics emerging from their representational structure. The question remains: what is intrinsic to these networks, independent of initialization and embedding details, and what is merely an artifact of coordinate choice?
Few, but ripe.
Source Notes
6 notes from 3 channels