What is the Sigmoid Linear Unit in a neural network created with a C++ app? How can we use the SiLU function in an artificial neural network (ANN)? Where can we use SiLU in AI technologies? Let’s remind ourselves what an activation function is and explain these terms.
What is an activation function and how can we use it in a C++ app?
The activation function ( phi() ), also called the transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value (sum) produced by the Net Input Function. In the net input function, sum is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value using a given function or condition. In other words, the activation function is a way to transfer the sum of all weighted signals into a new activation value for that signal. There are many different activation functions; linear (identity), bipolar, and logistic (sigmoid) functions are used most often. The activation function and its types are explained well here.
In a C++ app (and in most programming languages in general), you can create your own activation function. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. We will use sum as the result of the net input function here. The activation value of an artificial neuron (its output value) can be written with the activation function as below,

a = phi(sum)
By using this sum (the Net Input Function value) and the phi() activation function, let’s see some activation functions in C++. A general sketch of how a neuron’s output is computed from a net input function and an activation function is shown below; after that, we will look at the SiLU function itself.
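As a minimal, generic sketch (the names net_input and phi here are only illustrative, not taken from any library), a neuron's output could be computed from a weighted sum and a simple linear (identity) activation like this:

#include <cstdio>

// Linear (identity) activation: returns the sum unchanged
double phi(double sum)
{
   return sum;
}

// Net input function: sum of the input signals multiplied by their weights
double net_input(const double inputs[], const double weights[], int n)
{
   double sum = 0.0;
   for (int i = 0; i < n; i++) sum += inputs[i] * weights[i];
   return sum;
}

int main()
{
   double inputs[]  = { 0.0, 1.0 };
   double weights[] = { 0.6, 0.4 };
   double sum = net_input(inputs, weights, 2); // sum of weighted signals
   double a = phi(sum);                        // activation value a = phi(sum)
   printf("%10.6f\n", a);
   return 0;
}

Any of the activation functions discussed below can be dropped in place of phi() in a sketch like this.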
What is a sigmoid linear unit (SiLU)?
Sigmoid Linear Unit (SiLU), also known as the Sigmoid-Weighted Linear Unit, is an activation function that multiplies the sigmoid function by its input. In other words, the activation of the SiLU is computed as the sigmoid function multiplied by its input. Stefan Elfwing, Eiji Uchibe, and Kenji Doya explained SiLU well in their 2017 study “Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning”. They also used the term dSiLU, which is the derivative of SiLU (a sketch of dSiLU in C++ is given after the SiLU example below).
Let’s remember the sigmoid function,

f(x) = 1 / (1 + e^(-x))
here f(x) is the activation function ( we use phi() ) and x is the sum of the weighted inputs.
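For comparison, the plain sigmoid on its own could be sketched in C++ like this (a simple illustration following the formula above, not code from the study):

#include <cmath>

// Logistic (sigmoid) function: f(x) = 1 / (1 + e^(-x))
double sigmoid(double x)
{
   return 1.0 / (1.0 + std::exp(-x));
}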
Thus, the Sigmoid Linear Unit (SiLU) is,

f(x) = x * sigmoid(x) = x / (1 + e^(-x))
This SiLU function can be written in C++ as below
#include <cmath>

double phi(double sum)
{
   return ( sum / (1 + std::exp(-1*sum)) ); // SiLU function: sum * sigmoid(sum)
}
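The study mentioned above also uses dSiLU, the derivative of SiLU, as an activation function. As a rough sketch (obtained by applying the product rule to x * sigmoid(x); the function name dphi is only illustrative, not taken from the study's code), it could be written in C++ as below:

#include <cmath>

// Derivative of SiLU: dSiLU(x) = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
double dphi(double x)
{
   double s = 1.0 / (1.0 + std::exp(-x)); // sigmoid(x)
   return s * (1.0 + x * (1.0 - s));
}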
A simple ANN C++ app example with SiLU activation function
#include <iostream>
#include <cmath>   // for std::exp
#include <cstdio>  // for printf and getchar

#define NN 2 // number of input neurons

class Tneuron // neuron class
{
public:
   double a;       // activity of each neuron
   double w[NN+1]; // weights of links between neurons

   Tneuron()
   {
      a = 0;
      for (int i = 0; i < NN+1; i++) w[i] = -1; // if a weight is negative there is no link
   }

   // let's define an activation function (or threshold) for the output neuron
   double activation_function(double sum)
   {
      return ( sum / (1 + std::exp(-1*sum)) ); // SiLU function
   }
};

Tneuron ne[NN+1]; // neuron objects

void fire(int nn)
{
   double sum = 0;

   for (int j = 0; j < NN; j++)
   {
      if (ne[j].w[nn] > 0) sum += ne[j].a * ne[j].w[nn];
   }

   ne[nn].a = ne[nn].activation_function(sum);
}

int main()
{
   // let's define the activity of two input neurons (a0, a1) and one output neuron (a2)
   ne[0].a = 0.0;
   ne[1].a = 1.0;
   ne[2].a = 0;

   // let's define the weights of the signals coming from the two input neurons to the output neuron (0 to 2 and 1 to 2)
   ne[0].w[2] = 0.6;
   ne[1].w[2] = 0.4;

   // let's fire our artificial neuron activity; the output will be
   fire(2);

   printf("%10.6f\n", ne[2].a);
   getchar();
   return 0;
}
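With these example activities (0.0 and 1.0) and weights (0.6 and 0.4), the weighted sum reaching the output neuron is 0.4, so the program should print approximately 0.239475, which is SiLU(0.4).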
Why not download a free trial of C++Builder and try these examples for yourself today?