Factorized World Models and the Tolman-Eichenbaum Machine Prediction Objective
The Tolman-Eichenbaum Machine (TEM) is a biologically inspired model that learns an internal world model by predicting upcoming sensory observations from the sequence of actions and observations so far, using the organization of the hippocampal formation as a blueprint.
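To make the prediction objective concrete, here is a minimal sketch in Python, not TEM's actual implementation: observations are one-hot identities, the environment is a small random graph, and, for clarity, the predictor conditions on the true state rather than an inferred location code (the sketches that follow deal with the location code and the memory). Names such as `pred_logits` and `cross_entropy` are illustrative.

```python
# Minimal sketch of a TEM-style prediction objective (illustrative, not the
# paper's implementation). Observations are one-hot over `n_obs` identities;
# the environment is a random graph with deterministic transitions; the
# learned predictor is a lookup table of logits over the next observation.
import numpy as np

rng = np.random.default_rng(0)

n_states, n_obs, n_actions = 16, 10, 4
obs_at_state = rng.integers(0, n_obs, size=n_states)             # what each state looks like
transitions = rng.integers(0, n_states, size=(n_states, n_actions))

# Learned table of logits: P(next observation | current state, action).
pred_logits = np.zeros((n_states, n_actions, n_obs))

def cross_entropy(logits, target):
    """Negative log-probability of the target observation under softmax(logits)."""
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return -np.log(p[target] + 1e-12)

state, lr = 0, 0.5
losses = []
for t in range(2000):
    action = rng.integers(0, n_actions)
    next_state = transitions[state, action]
    target = obs_at_state[next_state]            # what the agent actually sees next
    logits = pred_logits[state, action]
    losses.append(cross_entropy(logits, target))
    # Gradient step on the cross-entropy prediction objective: softmax minus one-hot.
    p = np.exp(logits - logits.max()); p /= p.sum()
    grad = p.copy(); grad[target] -= 1.0
    pred_logits[state, action] -= lr * grad
    state = next_state

print(f"mean loss, first 100 steps: {np.mean(losses[:100]):.3f}")
print(f"mean loss, last 100 steps:  {np.mean(losses[-100:]):.3f}")
```

Training drives the cross-entropy between predicted and observed sensory identities toward zero, which is the sense in which action-conditioned prediction amounts to learning a world model.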
Position Module and Path Integration as Grid Cell Analog
TEM’s position module serves as a computational analog of medial entorhinal cortex, updating an internal location code based solely on action sequences, much like biological grid cells integrate self-motion.
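The following sketch illustrates action-driven updates to a location code under idealized assumptions: a one-hot position code on a small torus and one exact permutation matrix per action. TEM learns soft, approximate versions of such action-conditioned transitions rather than being given them.

```python
# Minimal sketch of path integration in the spirit of TEM's position module:
# the location code is updated from actions alone, one transition matrix per
# action. Here the code is one-hot on a 4x4 torus and transitions are exact
# permutations, so this is an idealized illustration.
import numpy as np

side = 4
n_states = side * side

def shift_matrix(dx, dy):
    """Permutation matrix that moves a one-hot location code by (dx, dy) on the torus."""
    M = np.zeros((n_states, n_states))
    for x in range(side):
        for y in range(side):
            src = x * side + y
            dst = ((x + dx) % side) * side + (y + dy) % side
            M[dst, src] = 1.0
    return M

# One transition matrix per action: north, south, east, west.
actions = {"N": shift_matrix(0, 1), "S": shift_matrix(0, -1),
           "E": shift_matrix(1, 0), "W": shift_matrix(-1, 0)}

g = np.zeros(n_states)
g[0] = 1.0                       # start at corner (0, 0)
start = g.copy()

# Integrate a closed loop of actions with no sensory input at all.
for a in ["N", "N", "E", "S", "S", "W"]:
    g = actions[a] @ g

print("returned to start:", np.allclose(g, start))   # True
```

Because the update depends only on the actions taken, a closed loop of movements returns the code to its starting value, which is the defining property of path integration.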
Memory Module as Place Cell Analog and Remapping Mechanism
TEM’s memory module binds positional codes with sensory inputs, functioning as an associative store analogous to hippocampal place cells that fire in specific locations and remap between environments.
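Here is a minimal sketch of the binding-and-recall idea, assuming a Hebbian outer-product store and a linear readout; TEM's memory is an attractor network queried by iterative pattern completion, so this is a simplification. The same position codes paired with a different sensory layout produce different conjunctive memories, which is the sense in which the model remaps.

```python
# Minimal sketch of a TEM-like memory module: conjunctions of a position code
# and a sensory code are stored with a Hebbian outer-product rule, and recall
# predicts the observation bound to a queried position. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

n_pos, n_obs = 32, 12
positions = np.eye(n_pos)                      # abstract position codes

def build_memory(obs_layout):
    """Store position -> observation bindings with a Hebbian outer product."""
    M = np.zeros((n_obs, n_pos))
    for p in range(n_pos):
        x = np.zeros(n_obs)
        x[obs_layout[p]] = 1.0
        M += np.outer(x, positions[p])          # co-activity binds the pair
    return M

def recall(M, p):
    """Query the memory with a position code; return the predicted observation."""
    return int(np.argmax(M @ positions[p]))

# Environments A and B share position codes but differ in sensory layout,
# so the stored conjunctions differ: a toy analog of remapping.
layout_A = rng.integers(0, n_obs, size=n_pos)
layout_B = rng.integers(0, n_obs, size=n_pos)
M_A, M_B = build_memory(layout_A), build_memory(layout_B)

p = 7
print("env A recall correct:", recall(M_A, p) == layout_A[p])
print("env B recall correct:", recall(M_B, p) == layout_B[p])
```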
Statistics of Experience and Salient Hotspots in Learned Maps
TEM’s learned representations reflect not only structural constraints of environments but also the statistics of experience, paralleling how animals over-sample boundaries, safe zones, and reward locations.
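The sketch below illustrates only the statistical point, with made-up bias parameters rather than animal data: a walk that lingers at boundaries and at a rewarded corner produces an occupancy map with hotspots at exactly those locations, the kind of experience statistics that would be over-represented in a learned map.

```python
# Minimal sketch of how experience statistics create hotspots: a biased random
# walk that lingers near boundaries and a rewarded corner yields an occupancy
# map concentrated at those locations. Bias values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
side = 11
reward = (side - 1, side - 1)          # rewarded corner
visits = np.zeros((side, side))

def stay_probability(x, y):
    """Higher probability of lingering at boundaries and at the reward site."""
    p = 0.1
    if x in (0, side - 1) or y in (0, side - 1):
        p += 0.3                        # over-sample boundaries
    if (x, y) == reward:
        p += 0.5                        # over-sample the reward location
    return min(p, 0.9)

moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
x = y = side // 2
for t in range(200_000):
    visits[x, y] += 1
    if rng.random() < stay_probability(x, y):
        continue                        # linger in place
    dx, dy = moves[rng.integers(0, 4)]
    x = int(np.clip(x + dx, 0, side - 1))
    y = int(np.clip(y + dy, 0, side - 1))

density = visits / visits.sum()
print(f"mean density along one boundary edge: {density[0].mean():.4f}")
print(f"density at a central cell:            {density[side // 2, side // 2]:.4f}")
print(f"density at the reward cell:           {density[reward]:.4f}")
```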