A Brain-Inspired Algorithm For Memory

Artem Kirsanov
Jul 3, 2024
6 Notes in this Video

Associative Memory as Energy Minimization in Hopfield Networks

AssociativeMemory EnergyBasedModels HopfieldNetwork
01:00

Artem addresses cognitive neuroscientists, machine-learning practitioners, and curious learners who want a concrete algorithmic picture of how the brain might rapidly retrieve memories from partial cues without scanning a gigantic database of experiences.

Protein Folding Energy Landscapes as an Analogy for Associative Memory

ProteinFolding EnergyLandscape Analogy
03:30

Artem uses protein folding to give physicists, biologists, and ML researchers an intuitive feel for how a brain-like system can “find” a correct memory without brute-force searching through all possible experiences.

Hopfield Network Architecture and Energy Function

HopfieldNetwork NetworkArchitecture EnergyFunction
08:00

This construction targets readers who want a precise but intuitive definition of a Hopfield network as an energy-based model of associative memory.
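A minimal sketch of the energy function in code, assuming the classical binary formulation (states s_i ∈ {−1, +1}, a symmetric weight matrix, zero diagonal); the name hopfield_energy and the use of NumPy are illustrative choices, not from the video:

```python
import numpy as np

def hopfield_energy(W: np.ndarray, s: np.ndarray) -> float:
    """E(s) = -1/2 * s^T W s for a state s in {-1, +1}^N.

    Lower energy corresponds to more mutually consistent neuron states;
    stored memories sit at local minima of this landscape.
    """
    return -0.5 * s @ W @ s
```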

Inference Dynamics and Pattern Completion in Hopfield Networks

PatternCompletion AttractorDynamics InferenceRule
13:30

Artem’s walkthrough of the update rule is aimed at readers who want to see exactly how a Hopfield network moves from a noisy initial cue to a clean stored memory using only local, neuron-level computations.
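As a hedged sketch of this kind of local dynamics (the function name recall and the fixed number of sweeps are assumptions for illustration), each neuron repeatedly aligns itself with the sign of its weighted input:

```python
import numpy as np

def recall(W: np.ndarray, cue: np.ndarray, n_sweeps: int = 10) -> np.ndarray:
    """Complete a noisy cue by asynchronous sign updates."""
    s = cue.copy()
    rng = np.random.default_rng(0)
    for _ in range(n_sweeps):
        # Visit neurons in random order; each flips to match the sign
        # of its local field: s_i <- sign(sum_j W_ij * s_j).
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s
```

Because W is symmetric with a zero diagonal, no single update can increase the energy, so the state descends into a nearby attractor, which is the pattern-completion behavior described here.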

Hebbian Learning Rule and Storing Multiple Patterns in Hopfield Networks

HebbianLearning MemoryStorage SynapticWeights
19:00

Artem explains learning in Hopfield networks for readers curious about how specific memories get inscribed into synaptic weights using only local information about neuron co-activity.
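A compact sketch of the outer-product (Hebbian) storage rule, assuming patterns are stored as rows of ±1 values; the normalization by N and the zeroed diagonal are standard conventions rather than details taken from the video:

```python
import numpy as np

def hebbian_weights(patterns: np.ndarray) -> np.ndarray:
    """W = (1/N) * sum over stored patterns x of the outer product x x^T."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N  # each weight reflects pairwise co-activity
    np.fill_diagonal(W, 0.0)       # no self-connections
    return W
```

Each weight w_ij depends only on how often neurons i and j are co-active across the stored patterns, which is what makes the rule local and Hebbian.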

Memory Capacity and Limitations of Classical Hopfield Networks

MemoryCapacity Interference SpuriousAttractors
24:00

Artem concludes with a candid assessment for practitioners tempted to use vanilla Hopfield networks as practical memory systems rather than as conceptual models.
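To make the capacity limit concrete, here is a rough empirical check combining the sketches above (all names and parameter values are illustrative): with N = 200 binary neurons, recall of a corrupted cue is near-perfect for a handful of stored patterns but degrades as the pattern count approaches the classical ~0.14·N estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                     # number of neurons
for P in (5, 20, 40):       # 40 exceeds the ~0.14 * N ≈ 28 capacity estimate
    patterns = rng.choice([-1, 1], size=(P, N))
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    # Cue with a corrupted copy of pattern 0 (10% of bits flipped).
    s = patterns[0].copy()
    flipped = rng.choice(N, N // 10, replace=False)
    s[flipped] *= -1
    for _ in range(10):     # asynchronous sign updates until roughly settled
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    overlap = s @ patterns[0] / N   # +1.00 means perfect recall
    print(f"P={P:3d}  overlap with stored pattern: {overlap:+.2f}")
```

Near capacity, interference between patterns creates spurious attractors and the overlap drops well below 1, which illustrates why vanilla Hopfield networks are better treated as conceptual models than as practical memory stores.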