Everyone in the field of Artificial Intelligence knows what neural networks are. Most practitioners also know how much processing power and energy it takes to train pretty much any noteworthy neural network. For the field to develop further, a new type of hardware is needed.
Some experts believe quantum computers are that hardware. But even though quantum computing holds great promise, it is a technology that will take decades to develop: the underlying physics is not yet mature enough to enable useful, cost-efficient devices.
Neuromorphic computing, on the other hand, will take less time and fewer resources to develop, and promises to be cost-efficient both in device development and in the energy spent on processing.
Neuromorphic computing, or how to create a brain
Neuromorphic computing is nothing too new: the term was coined in the 1980s to describe analog circuits that mimic the neuro-biological architectures of the human brain.
If you are at least a bit into the Deep Learning hype, you know that it revolves around software-based algorithms and architectures that abstractly mimic the neural circuits of the brain. Deep Learning keeps spreading to industries of all types, and the cost of training its algorithms is huge.
One thing Deep Learning has failed to mimic is the energy efficiency of the brain. Neuromorphic hardware can fix that.
Also, you can say goodbye to the famous Von Neumann bottleneck. In case you don't know what that is: it refers to the time it takes for data to travel from a device's memory to its processing unit. The processing unit effectively sits idle, losing time while it waits for the data it needs to process.
Neuromorphic chips do not have this bottleneck, because all computation happens in memory. Memristors, the building blocks of neuromorphic chips, can serve as both memory units and computation units, similar to the way the brain works. They have been called the first inorganic neurons.
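To see what "computing in memory" means, consider a memristor crossbar: each cell stores a conductance (the weight), and applying input voltages across the rows produces output currents at the columns that equal the matrix-vector product, by Ohm's law and Kirchhoff's current law. The memory cells themselves do the multiply-accumulate, so nothing is fetched into a separate processor. Below is a minimal numerical sketch of that idea (a simulation only, with made-up conductance and voltage values, not a model of any real chip):

```python
import numpy as np

# G[i][j] is the conductance (in siemens) stored in the memristor at row i,
# column j -- this is the "memory": the weights live in the crossbar itself.
G = np.array([[0.5, 1.2],
              [0.3, 0.8]])

# V[j] is the voltage applied to input line j -- the input activations.
V = np.array([0.9, 0.4])

# By Ohm's law each cell passes current G[i][j] * V[j]; Kirchhoff's current
# law sums the currents along each output line. Physically this is one
# analog settling step; numerically it is just a matrix-vector product.
I = G @ V
print(I)  # [0.93 0.59]
```

The key point is that the same array of devices stores the weights and performs the computation, which is why there is no memory-to-processor transfer to wait on.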
Why neuromorphic hardware is better for neural networks
Neural networks primarily operate on real numbers (e.g. 2.0231, 0.242341) to represent weights and other values inside the network architecture. On present computer architectures, however, those values have to be encoded in binary, which increases the number of operations needed to compute a neural network, both in training and in deployment.
Neuromorphic hardware doesn't use binary; it can work with real values directly, in the form of electrical quantities such as current and voltage. The number 0.242341 can be represented, for example, as 0.242341 volts. This happens directly inside the circuit: no binary value is ever present, and all calculations happen at the speed of the circuit.
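The encoding overhead on digital hardware is easy to make concrete. A short sketch, using Python's standard struct module purely for illustration: a single weight like 0.242341 is stored as a 32-bit IEEE-754 pattern, and every arithmetic step on a digital chip manipulates those bits through many logic-gate operations, whereas an analog circuit would hold roughly 0.242341 V on a wire with no encoding step at all.

```python
import struct

x = 0.242341

# Pack the value as an IEEE-754 single-precision float (big-endian) and
# show the bit pattern a digital chip actually stores and operates on.
bits = ''.join(f'{b:08b}' for b in struct.pack('>f', x))

print(len(bits))  # 32 -- thirty-two binary digits just to hold one weight
print(bits)
```

Multiply that by millions of weights and by every arithmetic operation, and the appeal of representing the value directly as a voltage becomes clear.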
Another factor that greatly increases both the response speed and the training speed of neural networks on neuromorphic hardware is the high parallelism of the calculations. One thing we know for sure about the brain is that it is massively parallel: millions of calculations happen at the same time, every second of our lives. Neuromorphic chips can achieve the same.
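The parallelism argument can be sketched numerically. In this illustrative comparison (a counting exercise, not real hardware timing), a sequential digital processor performs one multiply-accumulate per step to compute an N-output layer, while an analog crossbar settles on all N outputs in a single step:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 4, 3
W = rng.standard_normal((N, M))  # layer weights
x = rng.standard_normal(M)       # input vector

# Sequential digital view: one multiply-accumulate at a time.
steps = 0
y_seq = np.zeros(N)
for i in range(N):
    for j in range(M):
        y_seq[i] += W[i, j] * x[j]
        steps += 1

# Parallel analog view: every output appears in the same settling step.
y_par = W @ x

assert np.allclose(y_seq, y_par)
print(steps)  # 12 sequential MACs vs 1 parallel analog step
```

The results are identical; only the number of sequential steps differs, and that gap grows with the size of the layer.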
All of these advantages come with a cherry on top: much lower energy consumption for training and deploying neural network algorithms.
“In contrast to computers, the brain is fully interconnected, with each neuron connected to thousands of others.”
Trending AI/ML Article Identified & Digested via Granola by Ramsey Elbasheer; a Machine-Driven RSS Bot