Watching Neural Networks Learn

Emergent Garden
Aug 17, 2023
5 Notes in this Video

Universal Function Approximation and Curve Fitting

UniversalApproximation CurveFitting NeuralNetworks FunctionApproximation
01:30

Artificial neural networks configured as fully connected feedforward architectures act as universal function approximators: in principle they can approximate any continuous function on a compact domain to arbitrary accuracy, given enough neurons and suitable parameters.
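A minimal curve-fitting sketch of this idea (my own illustration, not the video's code; the width, learning rate, and step count are assumptions): a one-hidden-layer tanh network trained by full-batch gradient descent to approximate sin(x).

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

H, lr = 32, 0.05                         # hidden width and step size (assumed)
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)             # hidden activations
    return h, h @ W2 + b2                # linear output layer

mse0 = float(np.mean((forward(X)[1] - y) ** 2))  # loss before training

for _ in range(3000):
    h, pred = forward(X)
    err = pred - y
    # Backpropagate the mean-squared-error gradient through both layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)     # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(X)[1] - y) ** 2))
print(mse0, mse)                         # the loss should drop substantially
```

Even this tiny network bends its weighted tanh bumps into a close fit of the sine curve, which is the universal approximation theorem made concrete.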

Architectural Tricks that Improve Neural Learning

ActivationFunctions Normalization PracticalTricks NetworkDesign
09:00

Practitioners designing neural networks must choose activation functions, input scalings, and output constraints that shape how easily models can be trained using backpropagation and gradient-based optimizers.

Fourier and Taylor Features as Neural Network Enhancements

FourierFeatures TaylorSeries FeatureEngineering FunctionBases
18:00

Neural networks can be augmented with engineered feature maps built from classical mathematical bases such as Taylor polynomials and Fourier series, giving them richer building blocks for function approximation.
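A sketch of what such feature maps look like for a scalar input (the particular frequency and degree choices are my assumptions): Fourier features expand x into sin/cos terms, Taylor features into monomials, and either expansion is fed to the network in place of the raw coordinate.

```python
import numpy as np

def fourier_features(x, n_frequencies):
    """Map shape-(N,) inputs to shape-(N, 2*n_frequencies) sin/cos features."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    k = np.arange(1, n_frequencies + 1)   # integer frequencies 1..n
    return np.concatenate([np.sin(k * x), np.cos(k * x)], axis=1)

def taylor_features(x, degree):
    """Map shape-(N,) inputs to monomials x, x^2, ..., x^degree."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    return x ** np.arange(1, degree + 1)

ff = fourier_features(np.linspace(0, 2 * np.pi, 100), 8)
tf = taylor_features(np.linspace(-1, 1, 100), 4)
print(ff.shape, tf.shape)  # (100, 16) (100, 4)
```

A linear layer on top of Fourier features can already represent a truncated Fourier series, which is why these maps give the network such a head start on periodic or highly oscillatory targets.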

Approximating the Mandelbrot Set and Fractals with Neural Networks

Fractals MandelbrotSet ApproximationLimits InfiniteComplexity
24:00

Neural networks and Fourier-feature-augmented models are challenged to approximate the Mandelbrot set, a canonical fractal defined by iterating the complex map z → z² + c, whose boundary exhibits infinite detail at every scale.
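The target function itself is easy to state in code (this is the standard escape-time definition; the iteration cap and grid are my assumptions, since true membership requires infinitely many iterations):

```python
import numpy as np

def mandelbrot(c, max_iter=50):
    """Approximate membership in the Mandelbrot set for a complex array c."""
    z = np.zeros_like(c)
    inside = np.ones(c.shape, dtype=bool)
    for _ in range(max_iter):
        z[inside] = z[inside] ** 2 + c[inside]   # iterate z -> z^2 + c
        inside[inside] = np.abs(z[inside]) <= 2  # escape radius 2
    return inside

# Evaluate on a coarse grid over the complex plane.
re, im = np.meshgrid(np.linspace(-2, 1, 60), np.linspace(-1.5, 1.5, 60))
grid = mandelbrot(re + 1j * im)
print(grid.mean())   # fraction of grid points (approximately) inside the set
```

The difficulty for a neural approximator is visible in this definition: the boundary between "inside" and "outside" is fractal, so no finite amount of smooth structure captures it exactly; networks can only match it to a finite depth of detail.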

Curse of Dimensionality, Fourier Features, and MNIST

CurseOfDimensionality MNIST HighDimensionalData Overfitting
32:00

High-dimensional tasks like MNIST digit classification expose the limitations of naive feature expansions and highlight the strengths of standard neural architectures for large input spaces.
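The blowup can be made concrete with a standard counting argument (my illustration, not a computation from the video): the number of monomials of total degree at most d in n variables is C(n + d, d), which is tiny in one dimension but astronomical for MNIST's 784 input pixels.

```python
from math import comb

def n_taylor_terms(n_inputs, degree):
    """Number of monomials of total degree <= degree in n_inputs variables."""
    return comb(n_inputs + degree, degree)

print(n_taylor_terms(1, 3))     # 4 terms for a single input
print(n_taylor_terms(2, 3))     # 10 terms for two inputs
print(n_taylor_terms(784, 3))   # ~81 million terms for MNIST-sized input
```

A cubic Taylor expansion over MNIST would need roughly 81 million features per example, whereas a plain fully connected layer scales linearly in the input size, which is one way to see why standard architectures win in high dimension.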