
Why You Should Know ELU Artificial Neural Net Functions

What is an Exponential Linear Unit or ELU? How can we use an ELU Activation Function?


What do we need to know about activation functions?

An activation function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value ( sum ) produced by the Net Input Function. Here, sum is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value according to a given function or set of conditions. In other words, the activation function transfers the sum of all weighted signals to a new activation value for that neuron. There are many different activation functions; the most commonly used are the linear (identity), bipolar, and logistic (sigmoid) functions. The activation function and its types are explained well here.

In C++ (and in most programming languages in general) you can create your own activation functions. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. We will use sum as the result of the input function. The activation value of an artificial neuron (its output value) can then be written with the activation function as below.
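As a minimal sketch in C++ (the logistic phi(), the input values, and the weights below are made-up illustrations, not code from any particular library):

```cpp
#include <cmath>
#include <cstdio>

// an example activation function phi(); here the logistic (sigmoid)
double phi(double sum)
{
    return 1.0 / (1.0 + std::exp(-sum));
}

int main()
{
    double x[2] = {0.5, -0.8}; // input signals (example values)
    double w[2] = {0.4, 0.7};  // weights of those signals

    // Net Input Function: sum of all weighted signals
    double sum = x[0] * w[0] + x[1] * w[1];

    // activation value of the neuron
    double a = phi(sum);

    std::printf("a = %f\n", a);
    return 0;
}
```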

Here the activation value a of the neuron is obtained by applying the phi() activation function to the sum value produced by the Net Input Function.

What is an Exponential Linear Unit or ELU?

An Exponential Linear Unit (ELU) is another activation function, developed and published by Djork-Arné Clevert, Thomas Unterthiner & Sepp Hochreiter in the paper “Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)”. You can find the full text of the paper by clicking here.

According to their study, the exponential linear unit (ELU) speeds up learning in deep neural networks and leads to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), the ELU activation function alleviates the vanishing gradient problem via the identity for positive values. The authors also show that ELUs have improved learning characteristics compared to units with other activation functions. In contrast to ReLUs, ELUs have negative values, which allows them to push mean unit activations closer to zero, like batch normalization but with lower computational complexity.

The Exponential Linear Unit (ELU) can be written as below:
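f(x) = x                 if x > 0
f(x) = α ( exp(x) − 1 )  if x ≤ 0

where α is a hyperparameter that controls the value to which an ELU saturates for negative net inputs, as defined in the paper.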

and the derivative of this function can be written as:
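f'(x) = 1           if x > 0
f'(x) = α exp(x)    if x ≤ 0

For x ≤ 0 this is the same as f(x) + α, which is the form used in the paper.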

In the C and C++ programming languages, the Exponential Linear Unit function can simply be written as below.
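A minimal sketch (the function name elu and its alpha parameter are illustrative choices; in plain C, include <math.h> and call exp() instead of the C++ equivalents):

```cpp
#include <cmath>

// ELU activation: identity for positive inputs,
// alpha * (exp(x) - 1) for non-positive inputs
double elu(double x, double alpha)
{
    return (x > 0) ? x : alpha * (std::exp(x) - 1.0);
}
```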

Is there a simple Artificial Neural Network example which uses an Exponential Linear Unit (ELU) in C++?

We can use the ELU function given above in a Tneuron class as below.
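Here is a minimal, self-contained sketch; the member names, example weights, and input values are illustrative assumptions rather than the exact Tneuron class from the original listing:

```cpp
#include <cstdio>
#include <cmath>

// illustrative neuron class (member names are assumptions)
class Tneuron
{
public:
    double a;    // activation value
    double w[3]; // weights of signals going to other neurons

    Tneuron(double av = 0.0) : a(av), w{0.0, 0.0, 0.0} {}

    // ELU activation function
    static double elu(double sum, double alpha)
    {
        return (sum > 0) ? sum : alpha * (std::exp(sum) - 1.0);
    }
};

int main()
{
    Tneuron n0(0.5), n1(-0.8), n2; // two input neurons, one output neuron

    // example weights from n0 and n1 into neuron index 2
    n0.w[2] = 0.4;
    n1.w[2] = 0.7;

    // Net Input Function: sum of all weighted signals
    double sum = n0.a * n0.w[2] + n1.a * n1.w[2];

    // activation function: ELU with alpha = 1.0
    n2.a = Tneuron::elu(sum, 1.0);

    std::printf("output activation = %f\n", n2.a);
    return 0;
}
```

With these example inputs the net input is 0.5*0.4 + (-0.8)*0.7 = -0.36, so the ELU returns exp(-0.36) − 1 ≈ -0.30, a small negative activation instead of the hard zero a ReLU would produce.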



