A neural network with a single neuron is the simplest neural network, and to some extent, all neural network models can be viewed as compositions of many single neurons.
We previously introduced the basic concepts of the neuron in the Neuron section, and it is quite simple to express the neuron as a mathematical model:

$$y = f(w^\top x + b)$$

where $x$ is the input vector, $w$ the weight vector, $b$ the bias, and $f$ the activation function.
The model above is the single-neuron neural network. Depending on the choice of activation function, such as the sigmoid or ReLU, we obtain different models. Here I will introduce only the models with the sigmoid and ReLU activation functions.
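As a minimal sketch (the function names here are my own illustration, not from the text), the single-neuron model can be written in a few lines of Python with a pluggable activation function:

```python
import math

def neuron(x, w, b, activation):
    """Single neuron: weighted sum of inputs plus bias, passed through an activation."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return activation(z)

# Example with a sigmoid activation:
sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
y = neuron([1.0, 2.0], [0.5, -0.25], 0.1, sigmoid)
```

Swapping the `activation` argument is all it takes to move between the models discussed below.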
If the activation function is the identity function $f(z) = z$, then the single-neuron problem is the linear regression problem.
We first introduce the sigmoid function, which is expressed as:

$$\sigma(z) = \frac{1}{1 + e^{-z}}$$
The curve of the sigmoid function is an S shape: it approaches $0$ for large negative inputs, passes through $0.5$ at $z = 0$, and approaches $1$ for large positive inputs.
This model is quite familiar: if we use the cross-entropy loss function, it is exactly the logistic regression model that we already introduced in the machine learning part.
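To make the connection concrete, here is a small sketch (my own illustration, not the author's code) of training a single sigmoid neuron with the cross-entropy loss by stochastic gradient descent, which is precisely logistic regression; the toy data and learning rate are assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy 1-D data: the label is 1 when the input is positive.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(1000):
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        # For cross-entropy loss, the gradient w.r.t. w is (p - y) * x
        # and w.r.t. b is (p - y).
        w -= lr * (p - y) * x
        b -= lr * (p - y)
```

After training, the neuron assigns high probability to positive inputs and low probability to negative ones.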
The Rectified Linear Unit (ReLU) activation function is very popular in recent convolutional neural networks. It is expressed as:

$$\mathrm{ReLU}(z) = \max(0, z)$$
The curve of the ReLU activation function is flat at zero for negative inputs and increases linearly with slope $1$ for positive inputs.
It is very straightforward to derive the derivative of the ReLU activation function:

$$\mathrm{ReLU}'(z) = \begin{cases} 1 & z > 0 \\ 0 & z < 0 \end{cases}$$

(The derivative is undefined at $z = 0$; in practice it is usually taken to be $0$.)
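A direct translation of the function and its derivative into Python (a sketch, using the common convention of a zero derivative at the origin):

```python
def relu(z):
    # ReLU returns the input when positive, zero otherwise.
    return z if z > 0 else 0.0

def relu_grad(z):
    # The derivative is 1 for positive inputs and 0 for negative inputs;
    # at z == 0 we follow the usual convention and return 0.
    return 1.0 if z > 0 else 0.0
```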
Similarly, if we use the identity function as the activation function of the single-neuron network, the model reduces to linear regression.
In this section we introduced several kinds of activation function for the single-neuron network; such single neurons can then be composed to build multi-layer neural networks.