# Make Your Own Neural Network in Python (Tariq Rashid)

- **Introduction - How will we do it?** A step-by-step guide to what we will be doing in this course and how we will do it.
- **Part 1 - A Little Background - Estimating the Constant "c" Iteratively** Use a trial-and-error technique to estimate a constant, refining the model by adjusting its parameter based on the observed output.
- **Part 1 - A Little Background - Error in the Training Classifier** What is error in the training classifier, and how do we calculate it? Is there a relationship between our parameter "A" and the error "E"? All these questions will be answered in this lesson.
- **Part 2 - Let's Get Started! - Matrix Multiplication is Useful .. Honest!** This lesson revises the basic concepts of matrices and how they are multiplied.
- **Part 2 - Let's Get Started! - A Three Layer Example: Working on Hidden Layer** Moving forward, we will apply the link weights for the hidden layer and calculate its output.
- **Part 3 - Backward Propagation of Error - Backpropagating Errors From More Output Nodes** In this lesson, we will learn how to propagate the error backward from the output nodes into the network. This method is essential when more than one node contributes to the output.
- **Part 3 - Backward Propagation of Error - Backpropagation: Recombining the Error** In this lesson, we will come up with a formula to recombine, at each internal node, the bits of error we split in the previous lesson.
- **Part 4 - Adjusting the Link Weights - How to Transform the Output into Error Function?** Gradient descent is a really good way of working out the minimum of a function, and it works especially well when that function is complex and difficult. In this lesson, we will learn how to derive such a function from our neural network so that we can pass it to the gradient descent algorithm.
- **Part 5 - A Gentle Start with Python - Functions** An introduction to functions and their syntax in Python.
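The matrix-multiplication lessons above come down to one idea: combining the weighted signals into a layer is a single matrix-times-vector product. A minimal sketch, where the weight and input values are only illustrative assumptions:

```python
import numpy as np

# Illustrative 3x3 weight matrix linking the input layer to the hidden
# layer (the specific values are made up for this sketch).
weights_input_hidden = np.array([[0.9, 0.3, 0.4],
                                 [0.2, 0.8, 0.2],
                                 [0.1, 0.5, 0.6]])

inputs = np.array([0.9, 0.1, 0.8])

# One matrix multiplication replaces nine separate multiply-and-add steps.
hidden_inputs = weights_input_hidden.dot(inputs)
print(hidden_inputs)  # → [1.16 0.42 0.62]
```

Each entry of `hidden_inputs` is the sum of the signals arriving at one hidden node, each moderated by its link weight.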
- **Prologue - The Search for Intelligent Machines** A brief discussion of intelligent machines and how much potential they hold for the future of the tech industry.
- **Part 1 - A Little Background - Easy for Me, Hard for You** A comparison between a biological mind and a scientific mind.
- **Part 1 - A Little Background - Refining the Parameters of Training Classifier** In this lesson, we will learn how to refine our parameters by updating their values based on the error value. To do this, we first need to identify a relationship between A and E.
- **Part 1 - A Little Background - Setting up Learning Rate in Training Classifier** In this lesson, we will find out how the learning rate is used while training a classifier.
- **Part 1 - A Little Background - Representing Boolean Functions with Linear Classification** Is it possible to represent any Boolean function using a single linear classifier? Let's find out!
- **Part 2 - Let's Get Started! - How Neurons Really Work?** This lesson explains the working of a neuron in detail.
- **Part 2 - Let's Get Started! - Replicating Neuron to an Artificial Model** In this lesson, we will build our own artificial neuron model with three layers, three inputs, and three outputs. We will also look at some of the basic terms that we are going to use in later lessons.
- **Part 2 - Let's Get Started! - Calculating Inputs for Internal Layers** As we know, the inputs to internal nodes are calculated from the outputs of the previous layer. In this lesson, we will calculate the inputs to an internal layer using the previous layer's outputs.
- **Part 4 - Adjusting the Link Weights - How Do We Actually Update Weights?** A neural network's error is a function of the internal link weights. Improving a neural network means reducing this error by changing the weights. But how do we change those weights? Let's find out!
- **Part 4 - Adjusting the Link Weights - Embrace Pessimism** A little background on gradient descent.
- **Part 4 - Adjusting the Link Weights - Using Gradient Descent to Update Weights** Now that we have designed the error function, we will pass it to the gradient descent algorithm to update the weights in our network.
- **Part 5 - A Gentle Start with Python - Loops** An introduction to loops and their syntax in Python.
- **Introduction - What will we do?** A short summary of what this course is about and what we will be doing throughout it.
- **Part 1 - A Little Background - Limitations of Linear Classifiers** In this lesson, we will look at some of the limitations of linear classifiers.
- **Part 2 - Let's Get Started! - Neurons, Nature's Computing Machines** A brief introduction to the basic unit of the biological brain, the neuron, and how it works.
- **Part 2 - Let's Get Started! - What is an Activation Function?** An introduction to activation functions and their use in calculating the output of a neural network.
- **Part 2 - Let's Get Started! - Calculating Neural Network Output** In this lesson, we will do layer-by-layer calculations to see how the output is generated from the initial inputs.
- **Part 3 - Backward Propagation of Error - Learning Weights From More Than One Node** In this lesson, we will learn how to update the link weights inside the layers to refine the output.
- **Part 5 - A Gentle Start with Python - Getting Started** A basic hands-on Python tutorial for beginners.
- **Part 5 - A Gentle Start with Python - Arrays** An introduction to arrays and their syntax in Python.
- **Part 5 - A Gentle Start with Python - Plotting Arrays** In this lesson, we will learn how to plot arrays using the Python library Matplotlib.
- **Part 5 - A Gentle Start with Python - Methods** What are methods, and what is their use? This lesson will help you learn all about them.
- **Part 6 - Neural Network with Python - Initializing the Network** Let's begin with the initialization of the neural network.
- **Part 6 - Neural Network with Python - Querying the Network** In this lesson, we will work on the query function, which passes inputs through the network and returns its output.
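The layer-by-layer output calculation and the query step described above can be sketched in a few lines: multiply by the weights, apply the activation function, and repeat for the next layer. The weight values below are assumptions for illustration only:

```python
import numpy as np

def sigmoid(x):
    # the sigmoid activation (squashing) function
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights for a tiny 3-input, 3-hidden, 3-output network.
w_input_hidden = np.array([[0.9, 0.3, 0.4],
                           [0.2, 0.8, 0.2],
                           [0.1, 0.5, 0.6]])
w_hidden_output = np.array([[0.3, 0.7, 0.5],
                            [0.6, 0.5, 0.2],
                            [0.8, 0.1, 0.9]])

def query(inputs):
    # combine signals into the hidden layer, then squash them
    hidden_outputs = sigmoid(w_input_hidden.dot(inputs))
    # same two steps again for the final layer
    return sigmoid(w_hidden_output.dot(hidden_outputs))

print(query(np.array([0.9, 0.1, 0.8])))
```

Because the sigmoid squashes every combined signal, all three outputs land strictly between 0 and 1.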
- **Introduction - Author's Note** The author's note to learners.
- **Part 1 - A Little Background - Classifying vs. Predicting** This lesson compares classification and prediction. Are they the same, or is there really a difference between them? Let's find out!
- **Part 1 - A Little Background - Building a Simple Classifier** In this lesson, we will learn how to build a linear classifier with the help of some examples, which we call "training data".
- **Part 2 - Let's Get Started! - Following Signals Through A Simpler Network** To get a better understanding of things, we will build a simpler neural network with only two inputs and see what really happens inside the layers before we get the final output.
- **Part 2 - Let's Get Started! - A Three Layer Example: Working on Input Layer** In this lesson, we will take a look at a three-layered neural network example and calculate the values for the input layer.
- **Part 2 - Let's Get Started! - A Three Layer Example: Working on Output Layer** In this lesson, we will work on the last and final layer, the output layer, and calculate the final outputs of our neural network.
- **Part 3 - Backward Propagation of Error - Backpropagation: Splitting the Error** Neural networks learn by refining their link weights. To backpropagate the error to internal nodes, we split the output-layer errors in proportion to the size of the connected link weights, and then recombine these bits at each internal node.
- **Part 3 - Backward Propagation of Error - Backpropagating Errors with Matrix Multiplication** Backpropagating the error can be expressed as a matrix multiplication, which makes both feeding signals forward and backpropagating errors quite efficient.
- **Part 4 - Adjusting the Link Weights - Understanding the Gradient Descent Algorithm** This lesson is a brief introduction to gradient descent and how it works. We will also discuss why it is an efficient approach to minimizing the error in our neural network.
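The error-splitting and matrix-multiplication lessons above combine into one compact expression: the hidden-layer errors are the output errors multiplied by the transposed weight matrix, so each hidden node receives error in proportion to its link weights (dropping the normalizing fractions, a common simplification). A sketch with made-up numbers:

```python
import numpy as np

# Hypothetical hidden-to-output weights and output-layer errors.
w_hidden_output = np.array([[2.0, 1.0],
                            [3.0, 4.0]])
errors_output = np.array([0.8, 0.5])

# Backpropagation as matrix multiplication: the transposed weight matrix
# routes each output error back along its links to the hidden nodes.
errors_hidden = w_hidden_output.T.dot(errors_output)
print(errors_hidden)  # → [3.1 2.8]
```

The first hidden node, for example, collects 2.0 × 0.8 from the first output error plus 3.0 × 0.5 from the second, recombining the split pieces.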
- **Part 4 - Adjusting the Link Weights - Choosing the Right Weights...Iteratively!** Choosing the right weights directly is too difficult. An alternative approach is to improve the weights iteratively by descending the error function in small steps, each taken in the direction of the greatest downward slope from your current position.
- **Introduction - Who is this course for?** A short guide for learners who want to take this course.
- **Part 6 - Neural Network with Python - Building the Neural Network Class** We will now start the journey of making our own neural network with the Python we have just learned. We'll progress along this journey in short, easy-to-tackle steps, building up a Python program bit by bit.
- **Part 6 - Neural Network with Python - Refining the Weights** In this lesson, we will refine the weights, using the NumPy library to calculate the dot product.
- **Part 7 - Testing Neural Network against MNIST Dataset - Preparing the MNIST Training Data** Rescaling the input values to a smaller range, between 0 and 1, to prepare them for training.
- **Appendix: A Small Guide to Calculus - Functions of Functions** Have questions about Functions of Functions? Go for it!
- **Part 9 - Even More Fun! - More Brain Scans** Some more results of back querying the rest of the digits.
- **Part 5 - A Gentle Start with Python - Objects** An introduction to objects and how they are created in Python.
- **Part 6 - Neural Network with Python - Weights - The Heart of the Network** The next step is to create the network of nodes and links. The most important part of the network is the link weights. They are used to calculate the signal being fed forward and the error as it is propagated backward, and it is the link weights themselves that are refined in an attempt to improve the network.
- **Appendix: A Small Guide to Calculus - Calculus By Hand** Have questions about Calculus By Hand? Go for it!
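The iterative descent described above (small steps against the slope of the error) can be sketched on a deliberately simple error function. The function E(w) = (w − 3)², its slope, and the learning rate here are all illustrative assumptions, not the network's actual error:

```python
# Gradient-descent sketch on a toy error function E(w) = (w - 3)^2,
# whose minimum sits at w = 3.
def error_slope(w):
    return 2 * (w - 3)   # dE/dw

w = 0.0                  # arbitrary starting position
learning_rate = 0.1      # size of each step

for _ in range(100):
    # step against the slope, i.e. downhill on the error function
    w -= learning_rate * error_slope(w)

print(w)  # converges very close to 3
```

Each pass moves w a fraction of the way toward the minimum; after 100 steps the remaining distance is negligible. The same idea, applied to many weights at once, is how the network's error is reduced.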
- **Part 7 - Testing Neural Network against MNIST Dataset - A Quick Look at the Data Files** A quick look at the files of the MNIST handwritten digits database to get an idea of their contents.
- **Part 7 - Testing Neural Network against MNIST Dataset - Plotting the Data Points** Plotting some of the data points to see how the digits are formed.
- **Part 7 - Testing Neural Network against MNIST Dataset - Python Code to Create and Rescale the Output Array** The Python code to create and rescale the output values.
- **Part 8 - Some Suggested Improvements - Tweaking the Learning Rate** How can adjusting the learning rate improve the accuracy of the results? Let's find out.
- **Part 8 - Some Suggested Improvements - Doing Multiple Runs** How can doing multiple training runs improve the accuracy of the results?
- **Part 9 - Even More Fun! - Your Own Handwriting** In this experiment, we will create a test dataset using our own handwriting. We'll also try different styles of writing, and noisy or shaky images, to see how well our neural network copes.
- **Part 9 - Even More Fun! - Backward Query** In this lesson, we will learn what a backward query is and how it works.
- **Epilogue - Epilogue** A final comment on the neural network.
- **Appendix: A Small Guide to Calculus - Calculus without Plotting Graphs** Have questions about Calculus without Plotting Graphs? Go for it!
- **Part 6 - Neural Network with Python - Training the Network** In this lesson, we will come up with a solution to train our network.
- **Part 6 - Neural Network with Python - The Code Thus Far..** In this lesson, we will take a look at the code we have created so far.
- **Part 6 - Neural Network with Python - Optional: More Sophisticated Weights** In this lesson, we will refine the initial weights. This step is optional and can be skipped while building a neural network, but it is a very good way to get better accuracy on your test data.
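The "create and rescale the output array" lesson above can be sketched as follows: for a 10-class digit problem, build a target vector that is a small value everywhere except at the correct label, avoiding the exact 0 and 1 that a sigmoid can never actually produce. The specific values 0.01 and 0.99 are the conventional choice for this kind of rescaling:

```python
import numpy as np

output_nodes = 10
label = 5  # hypothetical example: this record is a handwritten "5"

# Small value everywhere, near-1 at the correct label: targets of
# exactly 0 or 1 are unreachable for a sigmoid and would push the
# weights toward ever-larger, saturating values.
targets = np.zeros(output_nodes) + 0.01
targets[label] = 0.99
print(targets)
```

During training, this vector is compared against the network's ten outputs to form the error.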
- **Part 7 - Testing Neural Network against MNIST Dataset - The MNIST Dataset of Handwritten Numbers** A brief introduction to the MNIST dataset of handwritten digits.
- **Part 7 - Testing Neural Network against MNIST Dataset - The Need to Rescale the Target Output** Why is it necessary to rescale the input and output values in order to achieve good results? Let's find out.
- **Appendix: A Small Guide to Calculus - A Flat Line** Have questions about A Flat Line? Go for it!
- **Appendix: A Small Guide to Calculus - A Sloped Straight Line** Have questions about A Sloped Straight Line? Go for it!
- **Appendix: A Small Guide to Calculus - A Curved Line** Have questions about A Curved Line? Go for it!
- **Part 6 - Neural Network with Python - The Complete Neural Network Code** Here is the complete code of the neural network that we have built so far.
- **Part 7 - Testing Neural Network against MNIST Dataset - Getting the Dataset Ready** In this lesson, we will write the code to access the files of the MNIST database.
- **Part 7 - Testing Neural Network against MNIST Dataset - Updating Neural Network Code** Let's update our Python code to include the work we have done so far.
- **Part 7 - Testing Neural Network against MNIST Dataset - Testing the Network on a Subset** Now that we have trained the network, at least on a small subset of 100 records, we want to test how well it works.
- **Part 7 - Testing Neural Network against MNIST Dataset - Testing the Network Against the Whole Dataset!** The code to see how the neural network performs against the rest of the dataset.
- **Part 7 - Testing Neural Network against MNIST Dataset - Updating the Neural Network Code...Again** In this lesson, we will update the code again with the additional work we have done so far.
- **Part 8 - Some Suggested Improvements - Change Network Shape** How can changing the network shape improve the accuracy of the results? Let's find out.
- **Part 9 - Even More Fun! - Inside the Mind of a Neural Network** In this lesson, we will take a peek inside a simple neural network to visualize what it has learned so far.
- **Part 9 - Even More Fun! - Creating New Training Data by Rotations** Have questions about Creating New Training Data by Rotations? Go for it!
- **Appendix: A Small Guide to Calculus - A Gentle Introduction** Have questions about A Gentle Introduction? Go for it!
- **Appendix: A Small Guide to Calculus - Calculus Not By Hand** Have questions about Calculus Not By Hand? Go for it!
- **Appendix: A Small Guide to Calculus - Patterns** Have questions about Patterns? Go for it!
- **Appendix: A Small Guide to Calculus - Handling Independent Variables** Have questions about Handling Independent Variables? Go for it!
- **Part 6 - Neural Network with Python - Applying Sigmoid Function** To get the signals emerging from the hidden nodes, we simply apply the sigmoid squashing function to each of the combined signals entering them.
- **Part 4 - Adjusting the Link Weights - Weight Update Worked Example** Let's work through a couple of examples with numbers, just to see this weight-update method working in a neural network.
- **Part 4 - Adjusting the Link Weights - Preparing Data: Inputs & Outputs** In this section, we consider how we might best prepare the training data, and even design the outputs, to give the training process a good chance of working.
- **Part 1 - A Little Background - A Simple Predicting Machine** Building a simple model to predict a missing value based on some past observations.
- **Part 4 - Adjusting the Link Weights - One Last Thing...** In this lesson, we will find a similar error slope for the weights between the input and hidden layers, just like we did for the output layer.
- **Prologue - A Nature Inspired New Golden Age** A discussion of how artificial intelligence came into being and how the idea of neural networks was inspired by copying the basic features of biological brains, even one as small as a bee's.
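The weight-update worked example mentioned above follows the standard sigmoid-network update: the change to each hidden-to-output weight is the learning rate times the output error, moderated by the sigmoid slope out × (1 − out), times the hidden-node output. All of the numbers below are made up for illustration:

```python
import numpy as np

lr = 0.3  # learning rate (illustrative)

# Hypothetical signals for one training step (column vectors).
errors_output  = np.array([[0.8], [0.5]])          # output-layer errors
final_outputs  = np.array([[0.4], [0.9]])          # sigmoid outputs, final layer
hidden_outputs = np.array([[0.6], [0.2], [0.7]])   # outputs of hidden layer

# Weight-update rule for the hidden-to-output links:
#   delta_W = lr * (E * out * (1 - out)) . hidden_outputs^T
delta_w = lr * np.dot(errors_output * final_outputs * (1.0 - final_outputs),
                      hidden_outputs.T)
print(delta_w.shape)  # (2, 3): one adjustment per hidden-to-output link
```

The outer product spreads each output node's moderated error across all of its incoming links, so the result has the same shape as the weight matrix it adjusts.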
- **Part 4 - Adjusting the Link Weights - Preparing Data: Random Initial Weights** In this section, we consider how we might best prepare the initial random weights to give the training process a good chance of working.
- **Part 6 - Neural Network with Python - Testing Our Code Thus Far** In this lesson, we will test our code with some random inputs to see whether our neural network is built properly.
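A common recipe for the random initial weights discussed above is to sample from a normal distribution centred on zero, with a standard deviation of 1/√(number of incoming links), so no node starts out saturated. The layer sizes here are illustrative (MNIST-style), not prescribed:

```python
import numpy as np

input_nodes, hidden_nodes = 784, 100  # illustrative MNIST-style sizes

# Normal distribution centred on 0.0 with standard deviation
# 1/sqrt(incoming links), one row of weights per hidden node.
w_input_hidden = np.random.normal(0.0, pow(input_nodes, -0.5),
                                  (hidden_nodes, input_nodes))
print(w_input_hidden.shape)  # (100, 784)
```

Narrower initial weights keep the combined signals entering each node small, which keeps the sigmoid in its steep, trainable region.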