What is the Self Regularized Non-Monotonic (Mish) activation function in neural networks? How can we use the Mish function in an ANN? Where can we use Mish in AI technologies? Let's first recall what an activation function is and then explain these terms.

An **Activation Function** ( phi() ), also called a **transfer function** or **threshold function**, determines the activation value ( a = phi(sum) ) from the value (sum) produced by the **Net Input Function**. In the **Net Input Function**, **the sum** is the sum of the incoming signals multiplied by their weights, and the activation function maps this sum to a new value according to a given function or condition. In other words, the activation function is a way to transform the sum of all weighted signals into the activation value of that neuron. There are many different activation functions; Linear (Identity), bipolar, and logistic (sigmoid) functions are the most commonly used. The activation function and its types are explained well here.

In C++ (and in most programming languages in general) you can write your own activation function. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. The activation value of an artificial neuron (its output value) can then be written with the activation function as below,
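a = phi(sum),   where   sum = w1*a1 + w2*a2 + ... + wn*an

Here a1 ... an are the activities of the connected neurons and w1 ... wn are the weights of their links (the symbol names are ours, simply spelling out the definitions above).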

By using this **sum** (the **Net Input Function value**) and a **phi() activation function**, we can code the phi() function ourselves. Let's first see an example of an activation function in C++, and then see how we can use the Mish function.
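As a quick illustration (a minimal sketch of our own, using the logistic sigmoid mentioned above as the example activation), a phi() function in C++ may look like this:

```cpp
#include <cmath>

// Logistic (sigmoid) activation: maps the weighted sum into the range (0, 1)
double phi(double sum)
{
  return 1.0 / (1.0 + std::exp(-sum));
}
```

Any other activation function can be coded the same way: keep the signature and replace the body with the formula of the chosen function.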

## Self Regularized Non-Monotonic (Mish) Activation Function

The **Self Regularized Non-Monotonic (Mish) Activation Function**, inspired by the Swish activation function, is a smooth, continuous, self-regularized, non-monotonic activation function. It was introduced in the paper "Mish: A Self Regularized Non-Monotonic Activation Function" by Diganta Misra in 2019.

According to this study: “Mish uses the Self-Gating property where the non-modulated input is multiplied with the output of a non-linear function of the input. Due to the preservation of a small amount of negative information, Mish eliminated by design the preconditions necessary for the Dying ReLU phenomenon. This property helps in better expressivity and information flow. Being unbounded above, Mish avoids saturation, which generally causes training to slow down due to near-zero gradients drastically. Being bounded below is also advantageous since it results in strong regularization effects. Unlike ReLU, Mish is continuously differentiable, a property that is preferable because it avoids singularities and, therefore, undesired side effects when performing gradient-based optimization.”
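To make these properties concrete, here are a few values we computed from the definition given below (rounded): mish(-5) ≈ -0.03 and mish(-1) ≈ -0.30, so small negative inputs are preserved rather than zeroed out as they would be by ReLU, while strongly negative inputs are squashed towards zero (bounded below); mish(5) ≈ 5.00, so large positive inputs pass through almost unchanged (unbounded above).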

We explained the softplus() activation function before. The Mish activation function can be defined by using softplus() as follows,
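mish(x) = x * tanh( softplus(x) )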

Hence, the Mish activation function can be defined mathematically as follows,
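mish(x) = x * tanh( ln(1 + e^x) )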

The author compares the outputs of the Mish, ReLU, SoftPlus, and Swish activation functions, and also compares the first and second derivatives of Mish and Swish.
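For reference, differentiating the definition above with the chain rule (our own derivation, stated in our notation rather than the paper's) gives the first derivative:

mish'(x) = tanh( ln(1 + e^x) ) + x * sech^2( ln(1 + e^x) ) * sigmoid(x),   where sigmoid(x) = 1 / (1 + e^(-x))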

The Mish function can be coded in C++ as below,

```cpp
#include <cmath>

// Mish activation function: phi(sum) = sum * tanh( ln(1 + e^sum) )
double phi(double sum)
{
  return sum * std::tanh( std::log(1.0 + std::exp(sum)) );  // Mish Function
}
```
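A minimal test driver (our own snippet, assuming the phi() function above is in the same source file) can be used to check the sample values mentioned earlier:

```cpp
#include <cstdio>

int main()
{
  // Print Mish at a few sample inputs
  const double samples[] = { -5.0, -1.0, 0.0, 1.0, 5.0 };
  for (double x : samples)
    std::printf("mish(%5.2f) = %9.6f\n", x, phi(x));
  return 0;
}
```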

**A Simple ANN Example with the Self Regularized Non-Monotonic (Mish) Activation Function in C++**


We can simply use this Mish function in our generic simple ANN example, as below,

```cpp
#include <iostream>
#include <cstdio>
#include <cmath>

#define NN 2   // number of neurons

class Tneuron      // neuron class
{
public:
  double a;        // activity of each neuron
  double w[NN+1];  // weights of the links between neurons

  Tneuron()
  {
    a = 0;
    for (int i = 0; i <= NN; i++) w[i] = -1;  // if a weight is negative there is no link
  }

  // let's define an activation function (or threshold) for the output neuron
  double activation_function(double sum)
  {
    return sum * std::tanh( std::log(1.0 + std::exp(sum)) );  // Mish Function
  }
};

Tneuron ne[NN+1];  // neuron objects

void fire(int nn)
{
  double sum = 0;
  for (int j = 0; j < NN; j++)
  {
    if (ne[j].w[nn] > 0) sum += ne[j].a * ne[j].w[nn];
  }
  ne[nn].a = ne[nn].activation_function(sum);
}

int main()
{
  // let's define the activity of two input neurons (a0, a1) and one output neuron (a2)
  ne[0].a = 0.0;
  ne[1].a = 1.0;
  ne[2].a = 0;

  // let's define the weights of the signals coming from the two input neurons to the output neuron (0 to 2 and 1 to 2)
  ne[0].w[2] = 0.6;
  ne[1].w[2] = 0.4;

  // let's fire our artificial neuron activity; the output will be printed below
  fire(2);
  printf("%10.6f\n", ne[2].a);
  getchar();
  return 0;
}
```
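With these inputs the output neuron receives sum = 0.0 * 0.6 + 1.0 * 0.4 = 0.4, so the program should print approximately 0.289, which is mish(0.4).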
