Interested In Deep Learning?

This section introduces the integral components of deep learning and how they work together to emulate the kind of learning found in humans.

Perceptrons And Neurons

The brain is responsible for all human cognitive functions; in short, it enables you to learn, acquire knowledge, retain information and recall it.

One of the fundamental building blocks of the learning systems within the brain is the biological neuron. A neuron is a cell responsible for transmitting signals to other neurons and, as a consequence, other parts of the body.

Researchers have replicated, to an approximation, the functionality of the biological neuron in a mathematically representative model called the perceptron.

One thing to note is that the terms perceptron and neuron are used interchangeably in machine learning.

An artificial neuron is a processing unit that receives some input and, after performing an internal mathematical operation (an activation function) on the input data, produces an output.

[Figure: Inner workings of a neuron (image by author)]

As depicted in the diagram above, a neuron’s output is produced by several internal components working in sequence.

The input data passed into neurons is numerical. Digital data formats such as images, videos, audio and text are transformed into numerical representations before being given to neurons as input.
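As a hypothetical illustration of that transformation (the pixel values below are made up for this sketch, not taken from the article), a grayscale image is just a grid of pixel intensities that can be scaled and flattened into an input vector:

```python
import numpy as np

# Hypothetical illustration: a tiny 2x2 grayscale "image" as pixel
# intensities in the range 0-255 (values made up for this sketch).
image = np.array([[0, 128],
                  [255, 64]], dtype=np.float32)

# Scale to [0, 1] and flatten into a vector a neuron can take as input.
inputs = (image / 255.0).flatten()
print(inputs)  # [0.         0.5019608  1.         0.2509804 ]
```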

Below is a step-by-step walkthrough of the inner workings of a neuron.

  1. The input data coming into the neuron is multiplied by weights. Weights play the role of synapses: they start as random values and are adjusted as the neural network is trained, with the updates computed through backpropagation.
  2. A sum of all the weights multiplied by the numerical input data is calculated. This summation is performed by a component of the neuron called the adder.
  3. The summation of the weighted inputs is then combined with a bias term.
  4. The result from step 3 is passed into an activation function. The role of the activation function is to ensure that the neuron’s output falls within a set range, typically [-1, 1] or [0, 1]. Examples of activation functions include the threshold logic unit, the rectified linear unit (ReLU), leaky ReLU, tanh and sigmoid. A minimal sketch of these four steps appears after this list.
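
To make the four steps above concrete, here is a minimal sketch in Python with NumPy. The input values, the weight initialisation and the choice of sigmoid activation are illustrative assumptions, not taken from the article:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    # Steps 1-3: multiply inputs by weights, sum them (the adder), add the bias
    z = np.dot(inputs, weights) + bias
    # Step 4: the sigmoid activation squashes the result into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -1.2, 3.0])   # numerical input data (made-up values)
w = rng.normal(size=3)           # weights start as random values
b = 0.0                          # bias term
print(neuron_output(x, w, b))    # a single number in (0, 1)
```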

The learning process within neural networks is facilitated through backpropagation. Backpropagation is an algorithm that calculates the gradient of the cost function with respect to the weight values of the network. For more information on backpropagation and the role it plays within neural networks, check out this video.
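
As an illustration of how that gradient drives learning, below is a hedged sketch of gradient descent for a single sigmoid neuron with a squared-error cost. The training example, target value and learning rate are hypothetical choices for this sketch, not from the article:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
x = np.array([0.5, -1.2, 3.0])   # one made-up training example
target = 1.0                     # desired output for that example
w = rng.normal(size=3)           # random initial weights
b = 0.0                          # bias
lr = 0.1                         # learning rate (illustrative value)

for step in range(100):
    y = sigmoid(np.dot(x, w) + b)      # forward pass
    # Backpropagation for the cost C = 0.5 * (y - target)**2:
    # the chain rule gives dC/dw = (y - target) * y * (1 - y) * x
    delta = (y - target) * y * (1 - y)
    w -= lr * delta * x                # gradient-descent weight update
    b -= lr * delta
```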
