Posts

Project 6, Part 2: Saving, Loading, and Running Inference

How Neural Networks Become Real Systems

In Part 1 of Project 6, we built a fully modular neural network from scratch. We created:

- a Layer class
- a SequentialModel class
- activation functions
- a training loop

This gave us a complete learning system, but only inside a single Python session. Real machine learning systems must be able to save what they have learned and load it later without retraining.

Part 2 introduces model persistence. We add the ability to save the learned parameters of the network, load them into a new model instance, and run inference on new data. This is the final step in turning our hand-built neural network into a usable system.

To verify that saving and loading work correctly, we use the XOR dataset. XOR is small, deterministic, and nonlinear, making it an ideal test case for a simple neural network.

This part introduces four files:

- saved_model.py – saving and loading parameters
- data.py – the XOR dataset te...
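As a rough sketch of what this persistence step could look like, the snippet below stores each layer's weights and biases in a single NumPy .npz archive and reads them back into a freshly built model. The function names and the layer attributes (layers, W, b) are illustrative assumptions rather than the project's exact API, and the XOR arrays simply mirror the dataset described above.

```python
import numpy as np

# The XOR dataset: four input pairs and their labels (not linearly separable).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def save_model(model, path="model_params.npz"):
    """Collect every layer's parameters and write them to one .npz archive.

    Assumes each layer exposes hypothetical W and b attributes.
    """
    params = {}
    for i, layer in enumerate(model.layers):
        params[f"W{i}"] = layer.W
        params[f"b{i}"] = layer.b
    np.savez(path, **params)

def load_model(model, path="model_params.npz"):
    """Load saved parameters into an already-constructed model of the same shape."""
    params = np.load(path)
    for i, layer in enumerate(model.layers):
        layer.W = params[f"W{i}"]
        layer.b = params[f"b{i}"]
    return model
```

With helpers like these, a trained model can be saved at the end of one session, rebuilt with the same architecture in a new session, loaded from disk, and then asked to predict the four XOR inputs to confirm that the parameters survived the round trip.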

Project 6 – Part 1: Building the Neural Network Framework (Code Walkthrough)

Part 1 introduced the architecture and file structure. Part 2 is where we implement the core components. You've already seen the math in Section 1. Now you'll see how that math becomes code in a modular, reusable form that mirrors how real frameworks like PyTorch and TensorFlow are built.

Before we write any code, here is the big picture of how the pieces fit together.

DenseLayer
- Implements the math from Project 2
- Stores weights, biases, z, a, x
- Computes forward and backward
- Updates its own parameters

SequentialModel
- A container that stacks layers
- Runs forward through all layers
- Runs backward in reverse
- Provides predict() and summary()

Trainer
- Handles batching
- Runs the training loop
- Computes loss
- Calls backward
- Updates parameters

This is the same structure used in modern deep-learning frameworks.

How Projects 1–5 Connect to This Framework

Project 2 → DenseLayer.forward()
You learned Wx + b. That is exact...
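To make that big picture concrete, here is a minimal sketch of how a DenseLayer and a SequentialModel could fit together. The method names, the sigmoid activation, and the weight shapes are assumptions chosen to mirror the description above; the project's actual classes may differ in detail.

```python
import numpy as np

class DenseLayer:
    """A fully connected layer: stores W and b, caches x, z, a for backprop."""
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_out, n_in) * 0.1
        self.b = np.zeros((n_out, 1))

    def forward(self, x):
        # z = Wx + b, followed by a nonlinearity (sigmoid used here as an example)
        self.x = x
        self.z = self.W @ x + self.b
        self.a = 1.0 / (1.0 + np.exp(-self.z))
        return self.a

    def backward(self, grad_a):
        # Gradient through the sigmoid, then through Wx + b
        grad_z = grad_a * self.a * (1.0 - self.a)
        self.dW = grad_z @ self.x.T
        self.db = grad_z.sum(axis=1, keepdims=True)
        return self.W.T @ grad_z  # gradient handed to the previous layer

    def update(self, lr):
        self.W -= lr * self.dW
        self.b -= lr * self.db

class SequentialModel:
    """A container that runs layers forward in order and backward in reverse."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def backward(self, grad):
        for layer in reversed(self.layers):
            grad = layer.backward(grad)

    def predict(self, x):
        return self.forward(x)
```

A Trainer built on top of this would then loop over batches, call the model's forward pass, compute the loss and its gradient, call backward, and ask each layer to update its parameters with the learning rate.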