# What do you mean by Forward Propagation in ANN? 🤔

Basics first!!! 🤠

**1. What is a Neuron?**

**Neurons** (also called nerve cells) are the fundamental units of the brain and nervous system: the cells responsible for receiving sensory input from the external world, for sending motor commands to our muscles, and for transforming and relaying the electrical signals at every step in between.

A biological neuron receives its various inputs through the dendrites; in the cell body, those inputs are combined, much like the weighted sum of inputs plus a bias computed in an artificial neuron.

**2. What is an Artificial Neural Network (ANN)?**

- Artificial neural networks are computing systems inspired by the biological neural networks that constitute animal brains.
- An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain.

From the definitions of a neuron and an artificial neural network, we come across a few terms: input, input layer, weights, hidden layer, output layer, and output.

## Let’s learn some ANN,

but before that, let’s define some **simplifying assumptions:**

- Neurons are arranged in layers and layers are arranged sequentially.
- Neurons within the same layer do not transfer information with each other.
- All data enters via the input layer, and all output leaves via the output layer.
- All Neurons in layer “ l ” are connected to all neurons in layer “ l+1 ”.
- Every interconnection in the neural network has a weight associated with it, and every neuron has a bias associated with it.
- All Neurons in a particular layer use the same activation function.

## — For Demonstration,

I am using a simple neural network for binary classification, with three neurons in the input layer, one neuron in the hidden layer, and one neuron in the output layer, so my artificial neural network looks like this…

## ‣ Number of Neurons:

Now, you might ask how I chose the number of neurons in these layers: three in the input layer, one in the hidden layer, and one in the output layer.

- For the input layer, the **number of neurons = number of columns (features) in the dataset.**
- For the hidden layers, we find the number of neurons by **hyperparameter optimization.**
- For **binary classification** we need one of two outputs, so **one neuron**; for **multiclass classification**, the **number of neurons in the output layer = number of classes.**
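The sizing rules above can be sketched with NumPy array shapes. This is a minimal sketch for the 3-1-1 demo network described here; the variable names and random initialization are illustrative assumptions, not part of the original article:

```python
import numpy as np

# Layer sizes for the demo network: 3 input features (columns in the
# dataset), 1 hidden neuron, 1 output neuron (binary classification).
n_input, n_hidden, n_output = 3, 1, 1

rng = np.random.default_rng(0)

# One weight per connection between consecutive layers,
# one bias per neuron in the hidden and output layers.
W1 = rng.normal(size=(n_hidden, n_input))   # shape (1, 3)
b1 = np.zeros(n_hidden)                     # shape (1,)
W2 = rng.normal(size=(n_output, n_hidden))  # shape (1, 1)
b2 = np.zeros(n_output)                     # shape (1,)

print(W1.shape, W2.shape)  # (1, 3) (1, 1)
```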

## ‣ Weight Notations:

This is how weight notation is normally described…

Recall that models such as linear regression, logistic regression, SVMs, etc. are trained on their coefficients, i.e., training finds the optimal coefficient values that minimise some cost function.

**Neural networks are trained on weights and biases, which are the parameters of the network.**

**The learning algorithm runs with a fixed set of hyperparameters: the number of layers and the number of neurons in the input, hidden, and output layers.**

## ‣ Neuron Structure:

From Figure 1 (ANN Overview), we have the general neuron structure.

Now, from Figure 3 (Demo ANN):

- We will feed three inputs to the neuron.
- Step 1: **y = the summation of the products of weights and inputs, plus the bias.**
- Step 2: **the value y is fed to the activation function, which gives the output value (z) of the neuron.**
- Similarly, the **output value (z)** is fed to the **next layer’s neurons as input,** and so on until the output layer.

## ‣ **Activation function:**

We are working on a binary classification problem, so for classification, we will use the Sigmoid function.
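A minimal sketch of these two steps with the sigmoid activation; the input and weight values below are illustrative, not from the article:

```python
import numpy as np

def sigmoid(y):
    """Sigmoid activation: 1 / (1 + e^-y), output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-y))

# Step 1: y = summation(weights * inputs) + bias
x = np.array([0.5, -1.0, 2.0])   # three inputs (illustrative values)
w = np.array([0.4, 0.3, -0.2])   # one weight per input
b = 0.1
y = np.dot(w, x) + b             # -0.4

# Step 2: z = activation(y), the neuron's output
z = sigmoid(y)
print(round(float(z), 4))  # 0.4013
```

Because the sigmoid output lies strictly between 0 and 1, it can be read directly as the probability of the positive class.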

## ‣ Loss Function:

The loss function for classification here is binary cross-entropy.

For binary classification, the true label is y = 0 or y = 1.

y_pred is calculated by the sigmoid function.
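As a sketch, binary cross-entropy is L = -[y·log(y_pred) + (1 - y)·log(1 - y_pred)]; the function name and the epsilon guard below are my own additions:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy loss; eps guards against log(0)."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# A confident correct prediction gives a small loss,
# a confident wrong one a large loss.
print(round(float(binary_cross_entropy(1, 0.9)), 3))  # 0.105
print(round(float(binary_cross_entropy(1, 0.1)), 3))  # 2.303
```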

## This is the Forward Propagation of the Network.

In simple terms, forward propagation means we move in only one direction (forward), from input to output, through the neural network.

In the next blog, we will look at training the neural network with backpropagation.

# Summary:

- Calculating the pre-activation, y = summation(weights × inputs) + bias.
- Choosing the activation function: for binary classification, the sigmoid function.
- Substituting the value of y into the sigmoid, we get y_pred.
- Calculating the loss using binary cross-entropy.
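The summary steps can be chained together for the 3-1-1 demo network. This is a sketch under illustrative weights and inputs of my choosing; the article does not give concrete values:

```python
import numpy as np

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y))

def forward(x, W1, b1, W2, b2):
    """Forward propagation: input -> hidden -> output."""
    z1 = sigmoid(W1 @ x + b1)       # hidden neuron output
    y_pred = sigmoid(W2 @ z1 + b2)  # output neuron (probability)
    return y_pred

x = np.array([1.0, 0.5, -1.0])               # 3 input features
W1 = np.array([[0.2, -0.4, 0.1]])            # hidden-layer weights
b1 = np.array([0.0])
W2 = np.array([[0.7]])                       # output-layer weight
b2 = np.array([-0.1])

y_pred = forward(x, W1, b1, W2, b2)
loss = -np.log(y_pred)  # binary cross-entropy for true label y = 1
print(y_pred.item(), loss.item())
```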

Here, we learned the forward propagation of ANN, next we will learn Backpropagation.

Along with forward propagation and backpropagation of ANNs, we also need to learn about other activation functions, the chain rule in backpropagation, the vanishing gradient problem, the exploding gradient problem, dropout, regularization, weight initialization, optimizers, and loss functions, which we will cover one by one.
