What is Gaussian Activation Function in Neural Networks? How can we use the Gaussian Function in ANN? Where can we use Gaussian Function in AI technologies? How can I use these functions in my C++ app? Let’s recap the activation function and explain these terms.
What is an activation function in an artificial neural network (ANN)?
An Activation Function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from the value (sum) produced by the Net Input Function. Here, sum is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value via a given function or set of conditions. In other words, the activation function is a way to transfer the sum of all weighted signals to a new activation value for that neuron. There are different activation functions; mostly Linear (Identity), bipolar, and logistic (sigmoid) functions are used. The activation function and its types are explained well here.
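As a quick illustration, the common activation functions mentioned above can be sketched in C++ like this (a minimal sketch; the function names are our own, not a standard API):

```cpp
#include <cmath>

// Identity (linear) activation: the output equals the weighted sum
double phi_linear(double sum) { return sum; }

// Binary step (threshold) activation: fires 1 if the sum is positive
double phi_step(double sum) { return (sum > 0.0) ? 1.0 : 0.0; }

// Logistic (sigmoid) activation: squashes the sum into (0, 1)
double phi_sigmoid(double sum) { return 1.0 / (1.0 + std::exp(-sum)); }

// Bipolar sigmoid (tanh) activation: squashes the sum into (-1, 1)
double phi_bipolar(double sum) { return std::tanh(sum); }
```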
In C++ (and in most programming languages in general) you can create your own activation function. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. We will use sum as the result of the input function. Here, the activation value of an artificial neuron (its output value) can be written with the activation function as below,
By using this Net Input Function value (sum) and the phi() activation function, let's see some activation functions in C++. Now let's see how we can use the Gaussian Function, as in this example formula,
Gaussian Activation Function
A Gaussian Activation Function is another activation function that can be used in AI technologies. Generally, Gaussian functions are used in statistics to describe normal distributions in a given dataset. They are also used in signal processing to define Gaussian filters, and in image processing, where two-dimensional Gaussians are used for Gaussian blurs. In engineering and mathematics, they are used to solve heat equations and diffusion equations and to define the Weierstrass transform.
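For example, the two-dimensional Gaussian behind a Gaussian blur can be sampled at a pixel offset like this (a minimal sketch; the function name and the choice of sigma are illustrative):

```cpp
#include <cmath>

// Value of the 2D Gaussian G(x, y) = exp(-(x^2 + y^2) / (2*sigma^2)) / (2*pi*sigma^2),
// used as the weight of a pixel at offset (x, y) from the center of a blur kernel
double gaussian2d(double x, double y, double sigma)
{
    const double pi = 3.14159265358979323846;
    return std::exp(-(x * x + y * y) / (2.0 * sigma * sigma)) / (2.0 * pi * sigma * sigma);
}
```

The weight is largest at the kernel center (x = 0, y = 0) and falls off symmetrically with distance, which is what gives Gaussian blur its smooth, direction-independent look.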
A Gaussian function, often simply referred to as a Gaussian, is a function of the form

f(x) = a * exp( -(x - b)^2 / (2*c^2) )

where a is the height of the curve's peak, b is the position of the center of the peak, and c (the standard deviation) controls the width of the "bell".
This function is simplified into the Gaussian Activation Function as below,

phi(x) = exp( -x^2 )
The Gaussian Activation Function can be coded as an activation function in C++ as below,
#include <cmath>

double phi(double sum)
{
    return std::exp(-1 * sum * sum); // Gaussian Function
}
Simple ANN Example with Gaussian Activation Function in C++
Let’s use this Gaussian Activation Function in our generic Simple ANN Example,
#include <cstdio>
#include <cmath>

#define NN 2 // number of input neurons

class Tneuron // neuron class
{
public:
    double a;         // activity of the neuron
    double w[NN + 1]; // weights of the links between neurons

    Tneuron()
    {
        a = 0;
        for (int i = 0; i < NN + 1; i++)
            w[i] = -1; // if a weight is negative there is no link
    }

    // activation function (or threshold) for the output neuron
    double activation_function(double sum)
    {
        return std::exp(-1 * sum * sum); // Gaussian Function
    }
};

Tneuron ne[NN + 1]; // neuron objects

void fire(int nn)
{
    double sum = 0;
    for (int j = 0; j < NN; j++)
    {
        if (ne[j].w[nn] > 0)
            sum += ne[j].a * ne[j].w[nn];
    }
    ne[nn].a = ne[nn].activation_function(sum);
}

int main()
{
    // let's define the activities of two input neurons (a0, a1) and one output neuron (a2)
    ne[0].a = 0.0;
    ne[1].a = 1.0;
    ne[2].a = 0.0;

    // let's define the weights of the signals from the two input neurons
    // to the output neuron (0 to 2 and 1 to 2)
    ne[0].w[2] = 0.6;
    ne[1].w[2] = 0.4;

    // let's fire our artificial neuron; its output will be printed
    fire(2);
    std::printf("%10.6f\n", ne[2].a);
    std::getchar();
    return 0;
}
Are you ready to try some of this for yourself? Download a free trial of C++ Builder today!