
This Is How An SELU Activation Function Works In A C++ App

What is an SELU activation function? How can we use the Scaled Exponential Linear Unit in an artificial neural network (ANN)? How can you use SELU activation functions in your own C++ app?

Convolutional Neural Networks (CNNs) created a revolution in visual analysis, and Recurrent Neural Networks (RNNs) were similarly revolutionary in Natural Language Processing. Consequently, both are among the leading AI technologies that we use in Deep Learning. Success stories of Deep Learning with the standard Feed-Forward Neural Network (FFN), by contrast, are rare. There are many different activation functions used in these methods. Let's refresh our memory about activation functions and explain these terms.

What is an Activation Function in AI?

An Activation Function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value (sum) produced by the Net Input Function. The Net Input Function, here, is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value according to a given function or set of conditions.

In other words, the activation function is a way to transfer the sum of all weighted signals into a new activation value for that neuron. There are many different activation functions; the Linear (Identity), bipolar, and Logistic (Sigmoid) functions are the most commonly used. Activation functions and their types are explained well here.

In C++ (and in most programming languages in general) you can create your own activation function. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals; we will use this sum as the input to the activation function. The activation value of an artificial neuron (its output value) can then be written as below,
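$$ a = \phi(\text{sum}) = \phi\Big(\sum_{i} w_i \, x_i\Big) $$

where the $w_i$ are the connection weights and the $x_i$ are the incoming signals.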

Using this sum (the Net Input Function value) and phi(), let's first see an example of an activation function in C++; then let's see how we can use the SELU activation function with the same pattern.
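For instance, a logistic (sigmoid) activation can be sketched as below (a minimal illustration; the name phi() simply follows the notation above):

```cpp
#include <cmath>

// Logistic (sigmoid) activation: squashes the net input into the range (0, 1).
double phi(double sum)
{
    return 1.0 / (1.0 + std::exp(-sum));
}
```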

What is a Scaled Exponential Linear Unit (SELU)?

The Scaled Exponential Linear Unit (SELU) is another activation function; it is a scaled version of ELU obtained by using a λ parameter. SELU was developed and released in the 2017 paper “Self-Normalizing Neural Networks” by Günter Klambauer, Thomas Unterthiner, Andreas Mayr, and Sepp Hochreiter. They introduced self-normalizing neural networks (SNNs) to enable high-level abstract representations. Neuron activations of SNNs automatically converge towards zero mean and unit variance, whereas batch normalization requires explicit normalization.

SELU is a scaled version of the ELU activation function, obtained by multiplying ELU by the λ parameter. So we can simply say this:
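$$ \operatorname{SELU}(x) = \lambda \cdot \operatorname{ELU}(x) $$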

The SELU Activation Function can be written as follows,
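$$ \operatorname{SELU}(x) = \lambda \begin{cases} x & \text{if } x > 0 \\ \alpha \left(e^{x} - 1\right) & \text{if } x \le 0 \end{cases} $$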

They solved for α and λ and obtained the solutions α01 ≈ 1.6733 and λ01 ≈ 1.0507, where the subscript 01 indicates that these are the parameters for the fixed point (0, 1). According to this explanation, each node may have different α and λ parameters, so we can define alpha and lambda parameters in the neuron structure and calculate SELU as below.
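A minimal sketch of such a phi() function is given below; the parameter defaults use the fixed-point values above, and the exact signature is illustrative rather than canonical:

```cpp
#include <cmath>

// SELU activation: lambda * x for positive inputs,
// lambda * alpha * (e^x - 1) otherwise.
double phi(double sum, double alpha = 1.6732632423543772,
           double lambda = 1.0507009873554805)
{
    return (sum > 0.0) ? lambda * sum
                       : lambda * alpha * (std::exp(sum) - 1.0);
}
```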

A simple ANN example with a Scaled Exponential Linear Unit (SELU)

We can use the SELU function given above in our Tneuron class as below.
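Here is a compact, self-contained sketch; the Tneuron layout, the member names, and the tiny two-input example are illustrative assumptions, not the only possible design:

```cpp
#include <cmath>
#include <iostream>

const int NN = 2; // number of input neurons

class Tneuron // a simple artificial neuron
{
public:
    double a = 0.0;           // activation (output) value of this neuron
    double w[NN + 1] = {0.0}; // weights of incoming connections
    double alpha  = 1.6732632423543772;  // per-neuron SELU alpha
    double lambda = 1.0507009873554805;  // per-neuron SELU lambda

    // SELU activation function applied to the net input
    double phi(double sum) const
    {
        return (sum > 0.0) ? lambda * sum
                           : lambda * alpha * (std::exp(sum) - 1.0);
    }
};

int main()
{
    Tneuron inputs[NN]; // input layer
    Tneuron output;     // output neuron

    // example input activations
    inputs[0].a = 0.5;
    inputs[1].a = -0.8;

    // example weights from the inputs to the output neuron
    output.w[0] = 0.4;
    output.w[1] = 0.7;

    // Net Input Function: sum of all weighted input signals
    double sum = 0.0;
    for (int i = 0; i < NN; ++i)
        sum += inputs[i].a * output.w[i];

    // Activation Function: SELU applied to the net input
    output.a = output.phi(sum);

    std::cout << "output activation = " << output.a << '\n';
    return 0;
}
```

Compiling and running this prints the SELU activation of the output neuron for the example weights and inputs.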


Why not download a free trial of C++ Builder today and see the kind of future you can build?

