# Project 8: Recurrent Neural Networks (A Beginner-Friendly Walkthrough)

Kaggle Notebook · GitHub Repo

Recurrent Neural Networks (RNNs) are among the first neural architectures designed to handle sequences. Unlike feed-forward networks, which treat every input independently, an RNN carries information forward through time via a hidden state. This makes it well suited to time-series data such as weather, text, audio, or stock prices.

To introduce RNNs, let's build one entirely from scratch using NumPy. No deep-learning frameworks. No shortcuts. Just the underlying math and the mechanics of recurrence. By the end, you will understand how the hidden state works, how backpropagation through time (BPTT) computes gradients, and how to wrap everything into clean, reusable classes.

## Generating a Synthetic Weather Time-Series

To learn RNNs, we need a sequence with memory. Weather is a natural example: today's temperature depends on previous days plus some randomness. We generate a simple autoreg...
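The generator itself is cut off above, but the idea described (today's temperature depends on previous days plus noise) is an autoregressive process. As a minimal sketch of what such a generator might look like, here is an AR(1) series in NumPy; the function name and all parameter values are assumptions, not the article's actual code:

```python
import numpy as np

def generate_weather(n_days=200, phi=0.8, mean=20.0, noise_std=2.0, seed=0):
    """Synthetic AR(1) temperature series (illustrative, not the article's code):
    temp[i] = mean + phi * (temp[i-1] - mean) + Gaussian noise.
    phi < 1 keeps the series stable around `mean` while preserving memory."""
    rng = np.random.default_rng(seed)
    temps = np.empty(n_days)
    temps[0] = mean
    for i in range(1, n_days):
        temps[i] = mean + phi * (temps[i - 1] - mean) + rng.normal(0.0, noise_std)
    return temps

series = generate_weather()
print(series.shape)  # (200,)
```

Because each value depends on the previous one, consecutive days are correlated, which is exactly the kind of structure an RNN's hidden state is meant to capture.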