Compute Library 21.05
CpuPoolingAssemblyDispatch.h
/*
 * Copyright (c) 2021 Arm Limited.
 *
 * SPDX-License-Identifier: MIT
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to
 * deal in the Software without restriction, including without limitation the
 * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
 * sell copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in all
 * copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 * SOFTWARE.
 */
#ifndef ARM_COMPUTE_CPU_POOLING_ASSEMBLY_DISPATCH_H
#define ARM_COMPUTE_CPU_POOLING_ASSEMBLY_DISPATCH_H

#include "arm_compute/core/Types.h"
#include "arm_compute/runtime/IMemoryManager.h"
#include "arm_compute/runtime/MemoryGroup.h"
#include "arm_compute/runtime/Tensor.h"
#include "src/runtime/cpu/ICpuOperator.h"

namespace arm_compute
{
namespace cpu
{
class ITensor;

/** Basic function to run pooling assembly kernels */
class CpuPoolingAssemblyDispatch : public ICpuOperator
{
public:
    /** Constructor */
    CpuPoolingAssemblyDispatch(std::shared_ptr<IMemoryManager> memory_manager = nullptr);
    /** Prevent instances of this class from being copied */
    CpuPoolingAssemblyDispatch(const CpuPoolingAssemblyDispatch &) = delete;
    /** Default move constructor */
    CpuPoolingAssemblyDispatch(CpuPoolingAssemblyDispatch &&) = default;
    /** Prevent instances of this class from being copied */
    CpuPoolingAssemblyDispatch &operator=(const CpuPoolingAssemblyDispatch &) = delete;
    /** Default move assignment operator */
    CpuPoolingAssemblyDispatch &operator=(CpuPoolingAssemblyDispatch &&) = default;
    /** Destructor */
    ~CpuPoolingAssemblyDispatch();

    /** If supported, create an assembly routine, else fall back to the Compute Library function.
     *
     * @param[in]  src  Source tensor info. Data types supported: QASYMM8/QASYMM8_SIGNED/F16/F32.
     * @param[out] dst  Destination tensor info to store the result of pooling. Data types supported: same as @p src.
     * @param[in]  info Pooling meta-data.
     */
    void configure(const ITensorInfo *src, ITensorInfo *dst, const PoolingLayerInfo &info);

    /** Indicates whether or not this function can be used to process the given parameters.
     *
     * @param[in] src  Source tensor info. Data types supported: QASYMM8/QASYMM8_SIGNED/F16/F32.
     * @param[in] dst  Destination tensor info to store the result of pooling. Data types supported: same as @p src.
     * @param[in] info Pooling meta-data.
     *
     * @return a status.
     */
    static Status validate(const ITensorInfo *src, const ITensorInfo *dst, const PoolingLayerInfo &info);
    /** Was the function successfully configured?
     *
     * @return True if the function is configured and ready to run
     */
    bool is_configured() const;

    // Run method overridden
    void run(ITensorPack &tensors) override;

private:
    arm_compute::MemoryGroup _memory_group;

    arm_compute::Tensor _workspace;
    bool                _is_global_pooling_layer;
};
} // namespace cpu
} // namespace arm_compute
#endif /* ARM_COMPUTE_CPU_POOLING_ASSEMBLY_DISPATCH_H */
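
A minimal usage sketch follows. It is not taken from this header or from the library's tests: the include paths, tensor shapes, pooling parameters and the ACL_SRC/ACL_DST pack slots are illustrative assumptions. It only shows the general operator workflow implied by the interface above: validate the parameters, configure with tensor infos, then hand the backing tensors to run() through an ITensorPack.

#include "arm_compute/core/ITensorPack.h"
#include "arm_compute/core/TensorInfo.h"
#include "arm_compute/core/Types.h"
#include "arm_compute/core/experimental/Types.h"
#include "arm_compute/runtime/Tensor.h"
#include "src/runtime/cpu/operators/CpuPoolingAssemblyDispatch.h" // assumed location of this header

using namespace arm_compute;

int main()
{
    // Illustrative NHWC F32 tensors (shape order C, W, H, N); the sizes are assumptions.
    TensorInfo src_info(TensorShape(64U, 14U, 14U, 1U), 1, DataType::F32);
    TensorInfo dst_info(TensorShape(64U, 7U, 7U, 1U), 1, DataType::F32);
    src_info.set_data_layout(DataLayout::NHWC);
    dst_info.set_data_layout(DataLayout::NHWC);

    // 2x2 max pooling, stride 2, no padding.
    const PoolingLayerInfo pool_info(PoolingType::MAX, 2, DataLayout::NHWC, PadStrideInfo(2, 2, 0, 0));

    // Ask first whether the assembly path supports these parameters.
    cpu::CpuPoolingAssemblyDispatch pooling;
    if(cpu::CpuPoolingAssemblyDispatch::validate(&src_info, &dst_info, pool_info))
    {
        pooling.configure(&src_info, &dst_info, pool_info);

        // The operator holds no tensor memory: the caller owns the tensors
        // and passes them in an ITensorPack on every run() call.
        Tensor src;
        Tensor dst;
        src.allocator()->init(src_info);
        dst.allocator()->init(dst_info);
        src.allocator()->allocate();
        dst.allocator()->allocate();

        ITensorPack pack;
        pack.add_tensor(TensorType::ACL_SRC, &src);
        pack.add_tensor(TensorType::ACL_DST, &dst);
        pooling.run(pack);
    }
    return 0;
}

If validate() rejects the parameters, callers are expected to fall back to a non-assembly pooling implementation, as the class description above suggests.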