CMSIS-NN  
CMSIS NN Software Library
 
Activation Functions

Functions

arm_cmsis_nn_status arm_nn_activation_s16 (const int16_t *input, int16_t *output, const int32_t size, const int32_t left_shift, const arm_nn_activation_type type)
 s16 neural network activation function using direct table look-up
 
void arm_relu6_s8 (int8_t *data, uint16_t size)
 s8 ReLU6 function
 
void arm_relu_q15 (int16_t *data, uint16_t size)
 Q15 ReLU function.
 
void arm_relu_q7 (int8_t *data, uint16_t size)
 Q7 ReLU function.
 

Description

Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh.

Function Documentation

◆ arm_nn_activation_s16()

arm_cmsis_nn_status arm_nn_activation_s16 ( const int16_t *  input,
int16_t *  output,
const int32_t  size,
const int32_t  left_shift,
const arm_nn_activation_type  type 
)

s16 neural network activation function using direct table look-up

Parameters
[in]	input	pointer to input data
[out]	output	pointer to output data
[in]	size	number of elements
[in]	left_shift	bit-width of the integer part, assumed to be smaller than 3
[in]	type	type of activation function
Returns
The function returns ARM_CMSIS_NN_SUCCESS.

Supported framework: TensorFlow Lite for Microcontrollers. This activation function must be bit-exact with the corresponding TFLM tanh and sigmoid activation functions.

◆ arm_relu6_s8()

void arm_relu6_s8 ( int8_t *  data,
uint16_t  size 
)

s8 ReLU6 function

Parameters
[in,out]	data	pointer to input data
[in]	size	number of elements

◆ arm_relu_q15()

void arm_relu_q15 ( int16_t *  data,
uint16_t  size 
)

Q15 ReLU function.

Parameters
[in,out]	data	pointer to input data
[in]	size	number of elements

◆ arm_relu_q7()

void arm_relu_q7 ( int8_t *  data,
uint16_t  size 
)

Q7 ReLU function.

Parameters
[in,out]	data	pointer to input data
[in]	size	number of elements
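The in-place behavior of the Q7 ReLU can be modeled as below, assuming the standard definition (negative values become zero, non-negative values pass through); arm_relu_q15 applies the same operation to int16_t data. The function name is illustrative, and the real library version may use SIMD rather than a scalar loop.

```c
#include <stdint.h>

/* Scalar reference model of an in-place Q7 ReLU: max(x, 0) per element.
 * Illustrative only; the library version may be optimized with SIMD. */
void relu_q7_ref(int8_t *data, uint16_t size)
{
    for (uint16_t i = 0; i < size; i++)
    {
        if (data[i] < 0)
        {
            data[i] = 0;
        }
    }
}
```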