These functions perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh.
◆ arm_nn_activation_s16()
void arm_nn_activation_s16(const int16_t *input,
                           int16_t *output,
                           const uint16_t size,
                           const uint16_t left_shift,
                           const arm_nn_activation_type type)
- Parameters
    [in]   input       pointer to input data
    [out]  output      pointer to output data
    [in]   size        number of elements
    [in]   left_shift  bit-width of the integer part, assumed to be smaller than 3
    [in]   type        type of activation function
Supported framework: TensorFlow Lite for Microcontrollers. This activation function must be bit-exact with the corresponding TFLM tanh and sigmoid activation functions.
◆ arm_relu6_s8()
void arm_relu6_s8(int8_t *data, uint16_t size)
- Parameters
    [in,out]  data  pointer to input data (modified in place)
    [in]      size  number of elements
◆ arm_relu_q15()
void arm_relu_q15(int16_t *data, uint16_t size)
- Parameters
    [in,out]  data  pointer to input data (modified in place)
    [in]      size  number of elements
◆ arm_relu_q7()
void arm_relu_q7(int8_t *data, uint16_t size)
- Parameters
    [in,out]  data  pointer to input data (modified in place)
    [in]      size  number of elements