
Why You Should Know ELU Artificial Neural Net Functions


In this post, we explain what an Exponential Linear Unit, or ELU, is and how you can make use of an ELU activation function. By learning this, you will be able to use ELUs in your own C++ applications.


What do we need to know about activation functions?

An Activation Function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value (sum) produced by the Net Input Function. Here, sum is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value using a given function or set of conditions. In other words, the activation function transfers the sum of all weighted signals to a new activation value for that neuron. There are different activation functions; Linear (Identity), bipolar, and Logistic (Sigmoid) functions are the most commonly used. The activation function and its types are explained well here.

In C++ (as in most programming languages) you can create your own activation functions. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. The activation value of an artificial neuron (its output value) can then be obtained by applying the phi() activation function to this sum, as shown below.
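Here is a minimal sketch in C++; the identity phi(), the helper name net_input(), and the signal and weight values are assumptions just for illustration:

#include <cstdio>

// Hypothetical example: net input (weighted sum) and activation of one neuron.
double phi(double sum)                       // identity activation as a placeholder
{
   return sum;
}

double net_input(const double *signals, const double *weights, int n)
{
   double sum = 0.0;
   for (int i = 0; i < n; ++i)
      sum += signals[i] * weights[i];        // sum of weighted signals
   return sum;
}

int main()
{
   double signals[3] = {0.5, 0.8, 0.2};      // example input signals (assumed values)
   double weights[3] = {0.4, 0.6, 0.9};      // example weights (assumed values)
   double sum = net_input(signals, weights, 3);
   double a = phi(sum);                      // activation value a = phi(sum)
   printf("a = %f\n", a);
   return 0;
}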

What is an Exponential Linear Unit or ELU?

An Exponential Linear Unit (ELU) is another activation function, developed and published by Djork-Arné Clevert, Thomas Unterthiner & Sepp Hochreiter in the paper “Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)”. You can find the text of the paper by clicking here.

According to their study, the “exponential linear unit” (ELU) speeds up learning in deep neural networks and leads to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), the ELU activation function alleviates the vanishing gradient problem via the identity for positive values. The authors also show that ELUs have improved learning characteristics compared to units with other activation functions. In contrast to ReLUs, ELUs can take negative values, which pushes the mean of the activations closer to zero.

The Exponential Linear Unit (ELU) can be written as below,
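using the formulation from the Clevert et al. paper, with a hyperparameter α > 0 that scales the negative part:

f(x) = x,                 if x > 0
f(x) = α (exp(x) − 1),    if x ≤ 0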

and the derivative of this function can be written as,
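again following the same paper, with the same α:

f'(x) = 1,                        if x > 0
f'(x) = f(x) + α = α exp(x),      if x ≤ 0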

In C & C++ Programming language, simply Exponential Linear Unit function can be written as below

Is there a simple Artificial Neural Network example which uses an Exponential Linear Unit (ELU) in C++?

We can use this given ELU function in our Tneuron class as below,
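Below is a hypothetical minimal sketch of such a neuron; the Tneuron member names (signals, weights, netInput(), getOutput()), the number of inputs, and the example values are assumptions for illustration:

#include <cmath>
#include <cstdio>

#define NN 2                                  // number of input connections (assumed)

// ELU activation function
double phi(double sum, double alpha = 1.0)
{
   return (sum > 0.0) ? sum : alpha * (std::exp(sum) - 1.0);
}

class Tneuron                                 // a very simple artificial neuron
{
public:
   double signals[NN];                        // input signals
   double weights[NN];                        // connection weights

   Tneuron()
   {
      for (int i = 0; i < NN; ++i) { signals[i] = 0.0; weights[i] = 0.0; }
   }

   double netInput() const                    // Net Input Function: sum of weighted signals
   {
      double sum = 0.0;
      for (int i = 0; i < NN; ++i) sum += signals[i] * weights[i];
      return sum;
   }

   double getOutput() const                   // activation value a = phi(sum) using ELU
   {
      return phi(netInput());
   }
};

int main()
{
   Tneuron n;
   n.signals[0] = 0.5;  n.weights[0] = -1.2;  // example values (assumed)
   n.signals[1] = 0.8;  n.weights[1] =  0.4;
   printf("neuron output = %f\n", n.getOutput());
   return 0;
}

With these example weights the net input is negative, so the output falls on the exponential (negative) side of the ELU rather than the identity side.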




About author

Dr. Yilmaz Yoru has 35+ years of coding experience in more than 30 programming languages, mostly C++ on Windows, Android, macOS, iOS, Linux, and some other operating systems. He received his MSc and PhD degrees from the Department of Mechanical Engineering of Eskisehir Osmangazi University. He is the founder and CEO of ESENJA LLC Company. His interests are programming, thermodynamics, fluid mechanics, artificial intelligence, 2D & 3D designs, and high-end innovations.