We'll build a tiny but complete learning machine: a single-layer perceptron packaged as a reusable Rust library crate. Starting from an empty project, we'll define the data structure, initialize random weights, implement the forward pass and training loop, then verify everything with integration tests for AND, OR, and XOR logic gates.
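To make that roadmap concrete, here is a minimal sketch of what the crate's core type and forward pass might look like. The `Perceptron` name, field layout, and step-function activation are assumptions for illustration; the real definitions are built up step by step in the sections that follow.

```rust
/// A single-layer perceptron: one weight per input plus a bias term.
/// (Names and layout are illustrative, not the crate's final API.)
pub struct Perceptron {
    weights: Vec<f64>,
    bias: f64,
}

impl Perceptron {
    /// Forward pass: weighted sum of inputs plus bias, squashed by a step function.
    pub fn predict(&self, inputs: &[f64]) -> f64 {
        let sum: f64 = self
            .weights
            .iter()
            .zip(inputs)
            .map(|(w, x)| w * x)
            .sum::<f64>()
            + self.bias;
        if sum > 0.0 { 1.0 } else { 0.0 }
    }
}
```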
Along the way we'll see how the perceptron learning rule adjusts weights, why convergence is guaranteed for linearly separable problems, and what happens when a problem (XOR) isn't linearly separable.
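The learning rule itself is a one-line update: nudge each weight by the learning rate times the prediction error times the corresponding input, and nudge the bias by the error alone. A hedged sketch, continuing the illustrative struct above (Rust allows multiple `impl` blocks, so this simply extends it); the `train_step` name and signature are assumptions, not the crate's final API:

```rust
impl Perceptron {
    /// One application of the perceptron learning rule:
    ///   w_i <- w_i + rate * (target - prediction) * x_i
    ///   b   <- b   + rate * (target - prediction)
    /// (Method name and signature are illustrative only.)
    pub fn train_step(&mut self, inputs: &[f64], target: f64, rate: f64) {
        let error = target - self.predict(inputs);
        for (w, x) in self.weights.iter_mut().zip(inputs) {
            *w += rate * error * x;
        }
        self.bias += rate * error;
    }
}
```

For AND and OR, repeating this update over the training examples eventually stops changing the weights, because a separating line exists. For XOR no single line can split the two classes, so the error never settles to zero; we'll see exactly that in the integration tests.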