ArmNN
 25.11
IRuntime Class Reference

#include <IRuntime.hpp>

Classes

struct  CreationOptions

Public Member Functions

Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network)
 Loads a complete network into the IRuntime.
Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network, std::string &errorMessage)
 Loads a complete network into the IRuntime.
Status LoadNetwork (NetworkId &networkIdOut, IOptimizedNetworkPtr network, std::string &errorMessage, const INetworkProperties &networkProperties)
TensorInfo GetInputTensorInfo (NetworkId networkId, LayerBindingId layerId) const
TensorInfo GetOutputTensorInfo (NetworkId networkId, LayerBindingId layerId) const
std::vector< ImportedInputId > ImportInputs (NetworkId networkId, const InputTensors &inputTensors, MemorySource forceImportMemorySource=MemorySource::Undefined)
 ImportInputs separates the importing and mapping of InputTensors from network execution.
std::vector< ImportedOutputId > ImportOutputs (NetworkId networkId, const OutputTensors &outputTensors, MemorySource forceImportMemorySource=MemorySource::Undefined)
 ImportOutputs separates the importing and mapping of OutputTensors from network execution.
Status EnqueueWorkload (NetworkId networkId, const InputTensors &inputTensors, const OutputTensors &outputTensors, std::vector< ImportedInputId > preImportedInputIds={}, std::vector< ImportedOutputId > preImportedOutputIds={})
 Evaluates a network using the input data in inputTensors and writes the results into outputTensors.
Status UnloadNetwork (NetworkId networkId)
 Unloads a network from the IRuntime.
const IDeviceSpec & GetDeviceSpec () const
const std::shared_ptr< IProfiler > GetProfiler (NetworkId networkId) const
 Gets the profiler corresponding to the given network id.
void RegisterDebugCallback (NetworkId networkId, const DebugCallbackFunction &func)
 Registers a callback function to debug layers performing custom computations on intermediate tensors.

Static Public Member Functions

static IRuntime * CreateRaw (const CreationOptions &options)
static IRuntimePtr Create (const CreationOptions &options)
static void Destroy (IRuntime *runtime)

Protected Member Functions

 IRuntime ()
 IRuntime (const IRuntime::CreationOptions &options)
 ~IRuntime ()

Protected Attributes

std::unique_ptr< RuntimeImpl > pRuntimeImpl

Detailed Description

Definition at line 67 of file IRuntime.hpp.

Constructor & Destructor Documentation

◆ IRuntime() [1/2]

IRuntime ( )
protected

Definition at line 41 of file Runtime.cpp.

: pRuntimeImpl(new RuntimeImpl(armnn::IRuntime::CreationOptions())) {}

References IRuntime(), and pRuntimeImpl.

Referenced by CreateRaw(), Destroy(), and IRuntime().

◆ IRuntime() [2/2]

IRuntime ( const IRuntime::CreationOptions & options)
protected

Definition at line 43 of file Runtime.cpp.

: pRuntimeImpl(new RuntimeImpl(options)) {}

References pRuntimeImpl.

◆ ~IRuntime()

~IRuntime ( )
protected default

Member Function Documentation

◆ Create()

IRuntimePtr Create ( const CreationOptions & options)
static
Examples
CustomMemoryAllocatorSample.cpp, DynamicSample.cpp, and SimpleSample.cpp.

Definition at line 52 of file Runtime.cpp.

{
    return IRuntimePtr(CreateRaw(options), &IRuntime::Destroy);
}

References CreateRaw(), and Destroy().
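
A minimal usage sketch, assuming only default CreationOptions (field values are illustrative, in the spirit of SimpleSample.cpp):

#include <armnn/ArmNN.hpp>

int main()
{
    // Default-constructed options; adjust fields here before creating the runtime if needed.
    armnn::IRuntime::CreationOptions options;

    // Create() returns an IRuntimePtr (a unique_ptr whose deleter is IRuntime::Destroy),
    // so the runtime is released automatically when the pointer goes out of scope.
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);

    return 0;
}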

◆ CreateRaw()

IRuntime * CreateRaw ( const CreationOptions & options)
static

Definition at line 47 of file Runtime.cpp.

{
    return new IRuntime(options);
}

References IRuntime().

Referenced by Create().

◆ Destroy()

void Destroy ( IRuntime * runtime)
static

Definition at line 57 of file Runtime.cpp.

{
    delete runtime;
}

References IRuntime().

Referenced by Create().

◆ EnqueueWorkload()

Status EnqueueWorkload ( NetworkId networkId,
const InputTensors & inputTensors,
const OutputTensors & outputTensors,
std::vector< ImportedInputId > preImportedInputIds = {},
std::vector< ImportedOutputId > preImportedOutputIds = {} )

Evaluates a network using the input data in inputTensors and writes the results into outputTensors.

Definition at line 104 of file Runtime.cpp.

{
    return pRuntimeImpl->EnqueueWorkload(networkId, inputTensors, outputTensors,
                                         preImportedInputIds, preImportedOutputIds);
}

References pRuntimeImpl.
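
A hedged sketch of a single synchronous inference, assuming a network already loaded under networkId with one input and one output bound to LayerBindingId 0 (the binding ids and float data type are assumptions for illustration):

// Query the tensor shapes the loaded network expects at binding id 0.
armnn::TensorInfo inputInfo  = runtime->GetInputTensorInfo(networkId, 0);
armnn::TensorInfo outputInfo = runtime->GetOutputTensorInfo(networkId, 0);

std::vector<float> inputData(inputInfo.GetNumElements());   // fill with real input values
std::vector<float> outputData(outputInfo.GetNumElements());

// ConstTensor requires the TensorInfo to be marked constant before wrapping the buffer.
inputInfo.SetConstant(true);

armnn::InputTensors  inputTensors  { { 0, armnn::ConstTensor(inputInfo, inputData.data()) } };
armnn::OutputTensors outputTensors { { 0, armnn::Tensor(outputInfo, outputData.data()) } };

armnn::Status status = runtime->EnqueueWorkload(networkId, inputTensors, outputTensors);
if (status != armnn::Status::Success)
{
    // Handle the failure; outputData is only valid on Success.
}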

◆ GetDeviceSpec()

const IDeviceSpec & GetDeviceSpec ( ) const

Definition at line 119 of file Runtime.cpp.

{
    return pRuntimeImpl->GetDeviceSpec();
}

References pRuntimeImpl.

◆ GetInputTensorInfo()

armnn::TensorInfo GetInputTensorInfo ( NetworkId networkId,
LayerBindingId layerId ) const

Definition at line 82 of file Runtime.cpp.

{
    return pRuntimeImpl->GetInputTensorInfo(networkId, layerId);
}

References pRuntimeImpl.

◆ GetOutputTensorInfo()

armnn::TensorInfo GetOutputTensorInfo ( NetworkId networkId,
LayerBindingId layerId ) const

Definition at line 87 of file Runtime.cpp.

{
    return pRuntimeImpl->GetOutputTensorInfo(networkId, layerId);
}

References pRuntimeImpl.

◆ GetProfiler()

const std::shared_ptr< IProfiler > GetProfiler ( NetworkId networkId) const

Gets the profiler corresponding to the given network id.

Parameters
networkId - The id of the network for which to get the profiler.
Returns
A pointer to the requested profiler, or nullptr if not found.

Definition at line 124 of file Runtime.cpp.

{
    return pRuntimeImpl->GetProfiler(networkId);
}

References pRuntimeImpl.
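
A small sketch of inspecting a network's profiler around inference; EnableProfiling() and Print() are taken to be the public IProfiler methods, and runtime/networkId are assumed to exist as in the loading example:

std::shared_ptr<armnn::IProfiler> profiler = runtime->GetProfiler(networkId);
if (profiler)
{
    profiler->EnableProfiling(true);   // collect timings for subsequent EnqueueWorkload() calls
}

// ... run EnqueueWorkload() one or more times ...

if (profiler)
{
    profiler->Print(std::cout);        // dump the collected per-layer timings
}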

◆ ImportInputs()

std::vector< ImportedInputId > ImportInputs ( NetworkId networkId,
const InputTensors & inputTensors,
MemorySource forceImportMemorySource = MemorySource::Undefined )

ImportInputs separates the importing and mapping of InputTensors from network execution.

This allows a set of InputTensors to be imported and mapped once but used in execution many times. This function is not thread-safe and must not be used while other threads are calling Execute(). No exceptions are thrown for failed imports; it is the caller's responsibility to check whether the tensors were successfully imported by comparing the returned ids with those passed in the InputTensors. Whether a tensor can be imported is backend-specific.

Definition at line 92 of file Runtime.cpp.

{
    return pRuntimeImpl->ImportInputs(networkId, inputTensors, forceImportMemorySource);
}

References pRuntimeImpl.

◆ ImportOutputs()

std::vector< ImportedOutputId > ImportOutputs ( NetworkId networkId,
const OutputTensors & outputTensors,
MemorySource forceImportMemorySource = MemorySource::Undefined )

ImportOutputs separates the importing and mapping of OutputTensors from network execution.

This allows a set of OutputTensors to be imported and mapped once but used in execution many times. This function is not thread-safe and must not be used while other threads are calling Execute(). No exceptions are thrown for failed imports; it is the caller's responsibility to check whether the tensors were successfully imported by comparing the returned ids with those passed in the OutputTensors. Whether a tensor can be imported is backend-specific.

Definition at line 98 of file Runtime.cpp.

{
    return pRuntimeImpl->ImportOutputs(networkId, outputTensors, forceImportMemorySource);
}

References pRuntimeImpl.
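
A hedged sketch of the import-once, execute-many pattern for both inputs and outputs, assuming the buffers live in memory the chosen backend can import (flagged here as MemorySource::Malloc) and reusing the inputTensors/outputTensors from the EnqueueWorkload() example:

// Attempt to import the buffers once, up front. Whether this succeeds is backend-specific.
std::vector<armnn::ImportedInputId> importedInputs =
    runtime->ImportInputs(networkId, inputTensors, armnn::MemorySource::Malloc);
std::vector<armnn::ImportedOutputId> importedOutputs =
    runtime->ImportOutputs(networkId, outputTensors, armnn::MemorySource::Malloc);

// No exception is thrown on failure: compare the number of returned ids against the
// number of tensors passed in, and fall back to the normal path if they differ.
bool allImported = importedInputs.size()  == inputTensors.size() &&
                   importedOutputs.size() == outputTensors.size();

const int numInferences = 10;   // illustrative
for (int i = 0; i < numInferences; ++i)
{
    if (allImported)
    {
        // Pre-imported tensors: pass empty tensor lists plus the imported ids.
        runtime->EnqueueWorkload(networkId, {}, {}, importedInputs, importedOutputs);
    }
    else
    {
        runtime->EnqueueWorkload(networkId, inputTensors, outputTensors);
    }
}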

◆ LoadNetwork() [1/3]

Status LoadNetwork ( NetworkId & networkIdOut,
IOptimizedNetworkPtr network )

Loads a complete network into the IRuntime.

Parameters
[out] networkIdOut - Unique identifier for the network is returned in this reference.
[in] network - Complete network to load into the IRuntime. The runtime takes ownership of the network once passed in.
Returns
armnn::Status

Definition at line 62 of file Runtime.cpp.

{
    return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network));
}

References pRuntimeImpl.
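
A brief sketch of the usual optimize-then-load flow; the backend preference list and the use of the error-message overload below are illustrative assumptions, and 'network' is an armnn::INetworkPtr built elsewhere (for example by a parser):

std::vector<armnn::BackendId> backends = { armnn::Compute::CpuAcc, armnn::Compute::CpuRef };
armnn::IOptimizedNetworkPtr optNet = armnn::Optimize(*network, backends, runtime->GetDeviceSpec());

armnn::NetworkId networkId;
std::string errorMessage;
armnn::Status status = runtime->LoadNetwork(networkId, std::move(optNet), errorMessage);
if (status != armnn::Status::Success)
{
    std::cerr << "LoadNetwork failed: " << errorMessage << std::endl;
}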

◆ LoadNetwork() [2/3]

Status LoadNetwork ( NetworkId & networkIdOut,
IOptimizedNetworkPtr network,
std::string & errorMessage )

Loads a complete network into the IRuntime.

Parameters
[out] networkIdOut - Unique identifier for the network is returned in this reference.
[in] network - Complete network to load into the IRuntime. The runtime takes ownership of the network once passed in.
[out] errorMessage - Error message if there were any errors.
Returns
armnn::Status

Definition at line 67 of file Runtime.cpp.

{
    return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network), errorMessage);
}

References pRuntimeImpl.

◆ LoadNetwork() [3/3]

Status LoadNetwork ( NetworkId & networkIdOut,
IOptimizedNetworkPtr network,
std::string & errorMessage,
const INetworkProperties & networkProperties )

Definition at line 74 of file Runtime.cpp.

{
    return pRuntimeImpl->LoadNetwork(networkIdOut, std::move(network), errorMessage, networkProperties);
}

References pRuntimeImpl.

◆ RegisterDebugCallback()

void RegisterDebugCallback ( NetworkId networkId,
const DebugCallbackFunction & func )

Registers a callback function to debug layers performing custom computations on intermediate tensors.

Parameters
networkId - The id of the network to register the callback for.
func - Callback function to pass to the debug layer.

Definition at line 129 of file Runtime.cpp.

{
    return pRuntimeImpl->RegisterDebugCallback(networkId, func);
}

References pRuntimeImpl.
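
A short sketch of attaching a debug callback; the (LayerGuid, slot index, ITensorHandle*) signature is the DebugCallbackFunction assumed here, and the callback only fires if the network was optimized with debugging enabled so that Debug layers are present:

// Called once per intermediate tensor produced by a Debug layer.
auto debugCallback = [](armnn::LayerGuid /*guid*/,
                        unsigned int slotIndex,
                        armnn::ITensorHandle* tensorHandle)
{
    if (tensorHandle)
    {
        std::cout << "Debug output, slot " << slotIndex
                  << ", elements: " << tensorHandle->GetShape().GetNumElements() << std::endl;
    }
};

runtime->RegisterDebugCallback(networkId, debugCallback);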

◆ UnloadNetwork()

Status UnloadNetwork ( NetworkId networkId)

Unloads a network from the IRuntime.

At the moment this only removes the network from the m_Impl->m_Network. This might need more work in the future to be AndroidNN compliant.

Parameters
[in] networkId - Unique identifier for the network to be unloaded. Generated in LoadNetwork().
Returns
armnn::Status

Definition at line 114 of file Runtime.cpp.

{
    return pRuntimeImpl->UnloadNetwork(networkId);
}

References pRuntimeImpl.
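
A minimal sketch of unloading once inference is finished; the returned Status is the only indication of whether the runtime recognised the id:

if (runtime->UnloadNetwork(networkId) != armnn::Status::Success)
{
    // The network id was not known to the runtime (or could not be removed); handle as needed.
}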

Member Data Documentation

◆ pRuntimeImpl

std::unique_ptr< RuntimeImpl > pRuntimeImpl
protected

Definition at line 263 of file IRuntime.hpp.

The documentation for this class was generated from the following files:

IRuntime.hpp
Runtime.cpp