Compute Library
 19.08
ActivationLayerInfo Class Reference

Activation Layer Information class. More...

#include <Types.h>

Public Types

enum  ActivationFunction {
  LOGISTIC, TANH, RELU, BOUNDED_RELU,
  LU_BOUNDED_RELU, LEAKY_RELU, SOFT_RELU, ABS,
  SQUARE, SQRT, LINEAR, IDENTITY
}
 Available activation functions. More...
 

Public Member Functions

 ActivationLayerInfo ()=default
 
 ActivationLayerInfo (ActivationFunction f, float a=0.0f, float b=0.0f)
 Constructor. More...
 
ActivationFunction activation () const
 Get the type of activation function. More...
 
float a () const
 Get the alpha value. More...
 
float b () const
 Get the beta value. More...
 
bool enabled () const
 Check if initialised. More...
 

Detailed Description

Activation Layer Information class.

Definition at line 1517 of file Types.h.

Member Enumeration Documentation

◆ ActivationFunction

enum ActivationFunction
strong

Available activation functions.

Enumerator
LOGISTIC 

Logistic ( \( f(x) = \frac{1}{1 + e^{-x}} \) )

TANH 

Hyperbolic tangent ( \( f(x) = a \cdot tanh(b \cdot x) \) )

RELU 

Rectifier ( \( f(x) = max(0,x) \) )

BOUNDED_RELU 

Upper Bounded Rectifier ( \( f(x) = min(a, max(0,x)) \) )

LU_BOUNDED_RELU 

Lower and Upper Bounded Rectifier ( \( f(x) = min(a, max(b,x)) \) )

LEAKY_RELU 

Leaky Rectifier ( \( f(x) = \begin{cases} \alpha x & \text{if } x < 0 \\ x & \text{if } x \geq 0 \end{cases} \) )

SOFT_RELU 

Soft Rectifier ( \( f(x)= log(1+e^x) \) )

ABS 

Absolute ( \( f(x)= |x| \) )

SQUARE 

Square ( \( f(x)= x^2 \) )

SQRT 

Square root ( \( f(x) = \sqrt{x} \) )

LINEAR 

Linear ( \( f(x)= ax + b \) )

IDENTITY 

Identity ( \( f(x)= x \) )

Definition at line 1521 of file Types.h.

1522  {
1523  LOGISTIC,
1524  TANH,
1525  RELU,
1526  BOUNDED_RELU,
1527  LU_BOUNDED_RELU,
1528  LEAKY_RELU,
1529  SOFT_RELU,
1530  ABS,
1531  SQUARE,
1532  SQRT,
1533  LINEAR,
1534  IDENTITY
1535  };

Constructor & Destructor Documentation

◆ ActivationLayerInfo() [1/2]

ActivationLayerInfo ( )
default

◆ ActivationLayerInfo() [2/2]

ActivationLayerInfo ( ActivationFunction  f,
float  a = 0.0f,
float  b = 0.0f 
)
inline

Constructor.

Parameters
[in]	f	The activation function to use.
[in]	a	(Optional) The alpha parameter used by some activation functions (ActivationFunction::BOUNDED_RELU, ActivationFunction::LU_BOUNDED_RELU, ActivationFunction::LINEAR, ActivationFunction::TANH).
[in]	b	(Optional) The beta parameter used by some activation functions (ActivationFunction::LINEAR, ActivationFunction::LU_BOUNDED_RELU, ActivationFunction::TANH).

Definition at line 1545 of file Types.h.

1546  : _act(f), _a(a), _b(b), _enabled(true)
1547  {
1548  }

Member Function Documentation

◆ a()

float a ( ) const
inline

Get the alpha value.

Definition at line 1555 of file Types.h.

1556  {
1557  return _a;
1558  }

Referenced by CLGEMMMatrixMultiplyKernel::configure(), and arm_compute::utils::info_helpers::is_relu6().

◆ activation()

ActivationFunction activation ( ) const
inline

Get the type of activation function.

◆ b()

float b ( ) const
inline

Get the beta value.

Definition at line 1560 of file Types.h.

1561  {
1562  return _b;
1563  }

Referenced by CLGEMMMatrixMultiplyKernel::configure(), and arm_compute::utils::info_helpers::is_relu6().

◆ enabled()

bool enabled ( ) const
inline

Check if initialised.

The documentation for this class was generated from the following file:
Types.h