
What Is The Sigmoid Linear Unit (SiLU) In A Neural Network C++ App

What is the Sigmoid Linear Unit in a neural network created with a C++ app? How can we use the SiLU function in an artificial neural network (ANN)? Where can we use SiLU in AI technologies? Let’s remind ourselves of the activation function and explain these terms.

What is an activation function and how can we use it in a C++ app?

An Activation Function ( phi() ), also called a transfer function or threshold function, determines the activation value ( a = phi(sum) ) from a given value (sum) produced by the Net Input Function. The Net Input Function here is the sum of the signals multiplied by their weights, and the activation function maps this sum to a new value according to a given function or conditions. In other words, the activation function is a way to transform the sum of all weighted signals into a new activation value for that signal. There are different activation functions; the linear (identity), bipolar, and logistic (sigmoid) functions are the most commonly used. The activation function and its types are explained well here.

In a C++ app (and in most programming languages in general) you can create your own activation function. Note that sum is the result of the Net Input Function, which calculates the sum of all weighted signals. We will use sum as the result of the input function. The activation value of an artificial neuron (its output value) can be written with the activation function as below.
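As a minimal sketch (the identity function stands in for any phi() here, and the names are ours for illustration):

#include <iostream>

// placeholder activation function; any phi() (identity, sigmoid, SiLU, ...) can go here
double phi(double sum)
{
    return sum; // linear (identity) activation, just for illustration
}

int main()
{
    double sum = 1.5;     // result of the Net Input Function (sum of weighted signals)
    double a = phi(sum);  // activation value (output) of the artificial neuron
    std::cout << "a = " << a << std::endl;
    return 0;
}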

By using this sum (the Net Input Function value) and the phi() activation function, let’s see some activation functions in C++. First, let’s see how we can use the Binary Step Function, as in the example below.
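A minimal sketch of a Binary Step activation in C++ might look like this (the function name is ours for illustration):

double phi_binary_step(double sum)
{
    // Binary Step: 1 if the net input reaches the threshold (0 here), otherwise 0
    return (sum >= 0.0) ? 1.0 : 0.0;
}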

What is a sigmoid linear unit (SiLU)?

The Sigmoid Linear Unit (SiLU), also known as the Sigmoid-Weighted Linear Unit, is an activation function that multiplies the sigmoid function by its input. In other words, the activation of the SiLU is computed as the sigmoid function multiplied by its input. Stefan Elfwing, Eiji Uchibe, and Kenji Doya explained SiLU well in their 2017 study “Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning”. They also used the term dSiLU, which is the derivative of the SiLU.

Let’s remember the sigmoid function,
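f(x) = 1 / (1 + e^(-x))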

Here f(x) is the activation function (we use phi()) and x is the sum of the weighted inputs.

Thus, the Sigmoid Linear Unit (SiLU) is,
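f(x) = x * sigmoid(x) = x / (1 + e^(-x))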

This SiLU function can be written in C++ as below.
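Here is a minimal sketch (the name phi_SiLU is ours for illustration):

#include <cmath>

// SiLU activation: the input multiplied by its sigmoid, x * sigmoid(x)
double phi_SiLU(double x)
{
    return x / (1.0 + std::exp(-x));
}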

A simple ANN C++ app example with SiLU activation function
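Below is a minimal single-neuron sketch using the SiLU activation (the input and weight values are arbitrary illustration data, and the names are ours):

#include <iostream>
#include <cmath>

#define NN 4   // number of inputs to the neuron

// SiLU activation: x multiplied by its sigmoid
double phi_SiLU(double x)
{
    return x / (1.0 + std::exp(-x));
}

int main()
{
    double inputs[NN]  = { 0.0, 0.25, 0.50, 0.75 };   // input signals
    double weights[NN] = { 0.5, 0.60, 0.10, 0.20 };   // weights of those signals

    // Net Input Function: sum of all weighted signals
    double sum = 0.0;
    for (int i = 0; i < NN; ++i)
        sum += inputs[i] * weights[i];

    // Activation Function: SiLU of the net input
    double output = phi_SiLU(sum);

    std::cout << "net input (sum) : " << sum << std::endl;
    std::cout << "output (SiLU)   : " << output << std::endl;
    return 0;
}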

Why not download a free trial of C++Builder and try these examples for yourself today?

