
What Is A Rectified Linear Unit (ReLU) Activation Function In An ANN?

This article will explain what a Rectified Linear Unit (ReLU) function is in an artificial neural network (ANN) and how the ReLU activation function can be used. Let us refresh our memories about activation functions and define these terms. Learning about rectified linear functions will be useful when coding an ANN in C++.


What is an artificial neural network (ANN) activation function?

An activation function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value (sum) produced by the net input function. Here, sum is the sum of the input signals multiplied by their weights, and the activation function maps this sum to a new value according to a given function or set of conditions. In other words, the activation function is a way to transform the sum of all weighted signals into a new activation value for that signal. There are different activation functions; mostly the linear (identity), bipolar, and logistic (sigmoid) functions are used. The activation function and its types are explained well here.

In C++ (and in most programming languages in general) you can create your own activation function. Note that sum is the result of the net input function, which calculates the sum of all weighted signals; we will use this sum as the argument of the activation function. The activation value of an artificial neuron (its output value) can then be written with the activation function as below.
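Here is a minimal sketch in C++ (the identity activation is used just for illustration; phi() can be replaced with any of the functions discussed below):

double phi(double sum)
{
    return sum; // identity (linear) activation: a = phi(sum) = sum
}

double neuron_output(const double inputs[], const double weights[], int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; ++i)        // net input function: sum of weighted signals
        sum += inputs[i] * weights[i];
    return phi(sum);                   // activation value a = phi(sum)
}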

Using this sum (the net input function value) and the phi() activation function, let's see some activation functions in C++. First, let's see how we can use the binary step function, which fires 1 when the net input is positive and 0 otherwise.
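A minimal sketch of this step activation in C++ (assuming the threshold is at zero):

double phi(double sum)
{
    return (sum > 0.0) ? 1.0 : 0.0; // binary step: fires only when the net input is positive
}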

What is a Rectified Linear Unit?

In artificial neural networks, the Rectified Linear Unit function, in other terms the ReLU activation function, is an activation function defined as the positive part of its argument. It can be written as f(x) = max(0, x), where x is the sum of the weighted input signals to an artificial neuron. The ReLU function is also known as a ramp function and is analogous to half-wave rectification in electrical engineering.
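A minimal ReLU sketch in C++, using std::max from the <algorithm> header:

#include <algorithm>

double relu(double x)
{
    return std::max(0.0, x); // f(x) = max(0, x): positives pass through, negatives clip to zero
}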

The piecewise function f(x) = x for x >= 0 and f(x) = Beta*x for x < 0 is called the Parametric ReLU function. If Beta is fixed at 0.01, it is called the Leaky ReLU function.

This can also be written as a max-out ReLU function, f(x) = max(Beta*x, x).

If Beta is 0, then f(x) = max(x, 0): the standard ReLU, which always returns non-negative numbers. Let's write this max-out ReLU function in C++.
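Here is a minimal sketch (the name parametric_relu is our own choice for illustration):

#include <algorithm>
#include <cstdio>

// Max-out form of the parametric ReLU: f(x) = max(Beta*x, x).
// Beta = 0.01 gives the Leaky ReLU; Beta = 0 reduces to the standard ReLU.
double parametric_relu(double x, double beta)
{
    return std::max(beta * x, x);
}

int main()
{
    std::printf("%f\n", parametric_relu( 2.5, 0.01)); //  2.500000 : positive inputs pass through
    std::printf("%f\n", parametric_relu(-2.5, 0.01)); // -0.025000 : small leaky slope for negatives
    return 0;
}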

Is there a simple ANN example using a Rectified Linear Unit activation function in C++?
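Yes. Below is a minimal sketch of a single artificial neuron that uses the ReLU activation function (the input signals and weights are made-up example values, not trained ones):

#include <algorithm>
#include <cstdio>

const int NI = 3; // number of input signals in this example

// ReLU activation: f(sum) = max(0, sum)
double relu(double sum)
{
    return std::max(0.0, sum);
}

// Output of one artificial neuron: the net input (weighted sum) passed through ReLU
double neuron(const double inputs[NI], const double weights[NI])
{
    double sum = 0.0;
    for (int i = 0; i < NI; ++i)
        sum += inputs[i] * weights[i];
    return relu(sum);
}

int main()
{
    double inputs[NI]  = { 0.5, -1.0, 2.0 }; // example input signals
    double weights[NI] = { 0.8,  0.4, 0.3 }; // example (untrained) weights
    std::printf("Neuron output: %f\n", neuron(inputs, weights)); // prints 0.600000
    return 0;
}

A full ANN would connect many such neurons in layers and learn the weights during training; the forward pass of each neuron is exactly this weighted sum followed by the activation function.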




About author

Dr. Yilmaz Yoru has 35+ years of coding experience with more than 30 programming languages, mostly C++ on Windows, Android, macOS, iOS, Linux, and some other operating systems. He received his MSc and PhD degrees from the Department of Mechanical Engineering of Eskisehir Osmangazi University. He is the founder and CEO of ESENJA LLC. His interests include programming, thermodynamics, fluid mechanics, artificial intelligence, 2D & 3D design, and high-end innovations.