Essence of Neural Networks

Anmol Sharma
4 min read · Oct 5, 2020


Know the mathematical story behind this hot tech!

Ever wondered what all this fuss about neural networks is, and why they're such a hot topic all over the internet?

Wanna know?

You are on the right page, come let's explore it together!

Just as the biological neural network is the fundamental unit vital for human cognitive functions, the artificial neural network (ANN) is equally vital to the field of deep learning (a subset of artificial intelligence). But what the heck is this deep learning?

Deep learning isn't learning about the depth of something, or learning done in the basement 😆.

In simple words, deep learning is the learning process in which a machine learns the relevant action plan or task by formulating the required mapping function from input/output data pairs. 🙄 Daunting enough, na!

It can be put more simply: the computer doesn't require explicit rules to perform a task; it works only from the input data and the desired result.
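As a toy illustration (a minimal sketch, and not yet a neural network): suppose the data pairs follow the rule y = 2x + 1, but we never write that rule into the program. Fitting a line to the pairs recovers the rule from the data alone:

import numpy as np

# The rule y = 2x + 1 is never coded; only input/output pairs are given
x = np.array([0, 1, 2, 3, 4])
y = np.array([1, 3, 5, 7, 9])           # desired results
slope, intercept = np.polyfit(x, y, 1)  # learn a mapping function from the pairs
print(slope, intercept)                 # ~2.0 and ~1.0: the rule was learned, not coded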

Maths behind Neural Networks

Coming back to neural networks: a neural network can be visualized as a network of small units known as neurons (basically mathematical functions, represented as nodes in the network). Neural networks are classified into various types based on the number of neurons, the number of layers, the types of problems they can solve, etc.

Let's start with the fundamental building unit, i.e. the perceptron (a single neuron).

Python implementation of the perceptron model:

# only 1 layer -----------
import numpy as np

inputs = [1, 2, 3, 2.5]            # values fed into the neuron
weights = [0.2, -0.27, -0.5, 1.0]  # one weight per input
bias = 2.0                         # shifts the weighted sum
output = np.dot(weights, inputs) + bias  # weighted sum plus bias
print(output)

Here the inputs variable is the input to the neuron; weights and bias can be considered the knobs used to regulate the mapping function; and output is the value the neuron produces.
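Working it out by hand: output = (1 × 0.2) + (2 × −0.27) + (3 × −0.5) + (2.5 × 1.0) + 2.0 = 0.66 + 2.0 = 2.66, which is exactly what the script prints.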

In brief, the various parts of a NN:

  • Neuron: A node containing a mathematical function (a weighted sum passed through an activation) responsible for the learning behaviour of the network.
  • Layers: The main entity from which a neural network is formed is the neuron, and a group of these neurons makes up a layer.
  • I/P, O/P: The input data comes in the form of vectors (text vectors, image RGB vectors, etc.), while the output data is generally a variable describing the result/conclusion of the learning process.
  • Weights: Numeric values (generally between −1 and 1) assigned to the edges between two neurons, which determine the contribution of a neuron to the successive layer.
  • Bias: A numeric value assigned to a neuron which decides that neuron's share in the result formed by the layer.
  • Activation function: The function (a condition check) which decides whether a particular neuron in the layer activates; see the sketch after this list for how these parts fit together.
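To tie these parts together, here's a minimal sketch of a single neuron (the helper name neuron is just for illustration; the numbers reuse the perceptron example above):

import numpy as np

def neuron(x, w, b):
    z = np.dot(w, x) + b     # weights scale the inputs, bias shifts the sum
    return np.maximum(0, z)  # activation function decides the neuron's output

print(neuron([1, 2, 3, 2.5], [0.2, -0.27, -0.5, 1.0], 2.0))  # ≈ 2.66, as before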

Two-layered neural network with 3 neurons per layer

# Multi-layer NN using basic Python code
import numpy as np

inputs = [[1, 2, 3, 2.5],
          [2.0, 5.0, -1.0, 2.0],
          [-1.5, 2.7, 3.3, -0.8]]

# Layer 1 ----------
weights1 = [[-0.2, -0.27, -0.5, 1.0],
            [0.5, 0.91, -0.5, -0.33],
            [0.2, 0.8, -0.5, 1.0]]
biases1 = [2, 3, 0.5]

# Layer 2 ----------
weights2 = [[0.2, -0.27, -0.5],
            [0.5, 0.91, -0.5],
            [-0.44, 0.8, -0.5]]
biases2 = [-1, 3, 0.5]

# print(np.shape(inputs), np.shape(weights1), np.shape(weights2))
layer1_output = np.dot(inputs, np.array(weights1).T) + biases1
layer2_output = np.dot(layer1_output, np.array(weights2).T) + biases2
print(layer2_output)
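A quick shape check: inputs is (3, 4) and np.array(weights1).T is (4, 3), so layer1_output is (3, 3); that in turn multiplies the (3, 3) transposed weights2 to give a (3, 3) layer2_output. Each row corresponds to one input sample and each column to one neuron of the final layer.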

Activation function

  • ReLU (Rectified linear unit)

This is a kind of activation function which outputs max(0, x), i.e. it activates a particular neuron only when its input x > 0.

# ReLU function
inputs = [0, 2, -1, 3.3, 2.7, 1.1, 2.2, -100]
output = []
for i in inputs:
    if i > 0:
        output.append(i)
    else:
        output.append(0)
print(output)
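With NumPy the same thing is a one-liner, which is also what the class-based code below uses:

import numpy as np

inputs = [0, 2, -1, 3.3, 2.7, 1.1, 2.2, -100]
print(np.maximum(0, inputs))  # clamps every negative value to 0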

Full intuition of a neural network with all the layers, activation function, weights, bias, etc.

# Full network: dense layer + ReLU activation as classes
import numpy as np

X = [[1, 2, 3, 2.5],
     [2.0, 5.0, -1.0, 2.0],
     [-1.5, 2.7, 3.3, -0.8]]

class layer_dense:
    def __init__(self, n_inputs, n_neurons):
        # small random weights and zero biases to start with
        self.weights = 0.10 * np.random.rand(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        self.output = np.dot(inputs, self.weights) + self.biases

# Activation class
class Activation_ReLU:
    def forward(self, inputs):
        self.output = np.maximum(0, inputs)

layer1 = layer_dense(4, 5)  # X has 4 features, so n_inputs must be 4
# layer2 = layer_dense(5, 2)
layer1.forward(X)
activation1 = Activation_ReLU()
activation1.forward(layer1.output)
print(activation1.output)

So, that was just a brief intro to neural networks. Try the code yourself on your local machine with different inputs, and also try adding some more layers (a sketch follows below).
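For instance, here's a minimal sketch of stacking a second layer, reusing the layer_dense and Activation_ReLU classes defined above:

layer2 = layer_dense(5, 2)          # 5 inputs (layer1 has 5 neurons), 2 neurons
activation2 = Activation_ReLU()
layer2.forward(activation1.output)  # feed layer1's activated output forward
activation2.forward(layer2.output)
print(activation2.output)           # shape (3, 2): one row per input sample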

Get your hands dirty with this cool stuff and keep learning!

def Thanks(view):
    if view == True:
        print("Thanks for your read")
        print("Yearn to Learn")
