Functions that perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh.
◆ arm_nn_activation_s16()
s16 neural network activation function using direct table look-up
- Parameters
  | [in]  | input      | pointer to input data |
  | [out] | output     | pointer to output data |
  | [in]  | size       | number of elements |
  | [in]  | left_shift | bit width of the integer part, assumed to be smaller than 3 |
  | [in]  | type       | type of activation function |
- Returns
  The function returns ARM_CMSIS_NN_SUCCESS.
Supported framework: TensorFlow Lite for Microcontrollers. This activation function is bit-exact with the corresponding TFLM tanh and sigmoid activation functions.
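A minimal usage sketch follows. The reference above does not reproduce the full signature, so the parameter types, the arm_cmsis_nn_status return value, and the ARM_TANH selector of arm_nn_activation_type are taken from recent CMSIS-NN headers and should be verified against your version of arm_nnfunctions.h:

    #include "arm_nnfunctions.h"

    /* Apply an s16 tanh activation to a small buffer (illustrative values only). */
    void tanh_s16_example(void)
    {
        int16_t input[4] = {-16384, -4096, 4096, 16384};
        int16_t output[4];

        /* left_shift is the bit width of the integer part (assumed < 3 above);
           ARM_TANH selects the tanh look-up table, ARM_SIGMOID the sigmoid one. */
        arm_cmsis_nn_status status = arm_nn_activation_s16(input, output, 4, 2, ARM_TANH);
        (void)status; /* expected: ARM_CMSIS_NN_SUCCESS */
    }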
◆ arm_relu6_s8()
void arm_relu6_s8(int8_t *data, uint16_t size)
s8 ReLU6 function
- Parameters
  | [in,out] | data | pointer to input data |
  | [in]     | size | number of elements |
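Example (a sketch): arm_relu6_s8 operates in place, clamping each element from below at 0 and saturating it at the ReLU6 upper bound, so only the data buffer and element count are passed.

    #include "arm_nnfunctions.h"

    /* In-place ReLU6 over an int8 buffer (illustrative values only). */
    void relu6_s8_example(void)
    {
        int8_t data[5] = {-100, -1, 0, 3, 100};

        /* Negative entries are clamped to 0; large positive entries are
           saturated by the ReLU6 upper bound. */
        arm_relu6_s8(data, 5);
    }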
◆ arm_relu_q15()
void arm_relu_q15(int16_t *data, uint16_t size)
Q15 ReLU function.
- Parameters
  | [in,out] | data | pointer to input data |
  | [in]     | size | number of elements |
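Example (a sketch): arm_relu_q15 also works in place, zeroing negative elements and leaving non-negative elements unchanged.

    #include "arm_nnfunctions.h"

    /* In-place ReLU over a q15 buffer (illustrative values only). */
    void relu_q15_example(void)
    {
        int16_t data[4] = {-32768, -1, 0, 32767};

        arm_relu_q15(data, 4);
        /* data is now {0, 0, 0, 32767} */
    }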
◆ arm_relu_q7()
void arm_relu_q7(int8_t *data, uint16_t size)
Q7 ReLU function.
- Parameters
  | [in,out] | data | pointer to input data |
  | [in]     | size | number of elements |
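Example (a sketch): arm_relu_q7 is typically applied in place to the q7 output of a preceding convolution or fully connected layer.

    #include "arm_nnfunctions.h"

    /* In-place ReLU over a q7 buffer (illustrative values only). */
    void relu_q7_example(void)
    {
        int8_t data[4] = {-128, -5, 7, 127};

        arm_relu_q7(data, 4);
        /* data is now {0, 0, 7, 127} */
    }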