Arm Virtual Hardware  Version 2.1.0
AVH FVP Models
 

Video file streaming interface.

Content

 Video Driver API Functions
 Video Driver API functions.
 
 Video Driver API Defines
 Video Driver API Definitions.
 

Data Structures

struct  VideoDrv_Status_t
 Video Status.
 

Description

Video file streaming interface.

The video streaming use case is implemented for AVH FVPs on top of the general-purpose Virtual Streaming Interface (VSI).

The use of generic Video Driver APIs simplifies re-targeting of the application code between virtual and physical devices.

The video driver implementation for AVH FVPs relies on the VSI peripheral API with an interface to Python scripts. For the video use case, the Python scripts access local video or image files as streaming input/output. Alternatively, the PC's camera can be used as video input and the PC's display as video output.

The operation of the video driver follows the principles used in drivers for real hardware: the application first configures the driver with parameters such as resolution, frame rate, and color format. Additionally, it specifies the local input/output file name or defaults to webcam input. The application then controls the video stream and frame extraction from the peripheral.

The concept is shown in the figure below for the input and optional output streams:

Additional details are provided in Operation details.

Video VSI Implementation Overview

Python requirements

In addition to the requirements listed in Python environment setup, the VSI video use case also requires the OpenCV Python package. Install it with pip:

pip install opencv-python

Operation details

The table below lists the files that implement the video-over-VSI peripheral:

Item Description
./interface/video/include/video_drv.h Video Driver API header file. Used by implementations on AVH FVPs and real HW boards.
./interface/video/source/video_drv.c Video driver implementation for AVH FVPs.
./interface/video/python/arm_vsi4.py Video via VSI Python script for video input channel 0.
./interface/video/python/arm_vsi5.py Video via VSI Python script for video output channel 0.
./interface/video/python/arm_vsi6.py Video via VSI Python script for video input channel 1.
./interface/video/python/arm_vsi7.py Video via VSI Python script for video output channel 1.
./interface/video/python/vsi_video.py VSI video client module.
./interface/video/python/vsi_video_server.py VSI video server module.

The execution flow for video input on channel 0 is explained in the diagram below:

Sequence diagram for the Video over VSI implementation
  1. Upon start/reset of the FVP model, the VSI peripheral calls the init() function in the available VSI Python API files. In this example, for video input on channel 0 it is arm_vsi4.py. The init() function then starts the video server on the host.
  2. The user application on the device calls VideoDrv_Initialize, which puts the available driver channels into their default state.
  3. VideoDrv_Configure is then used to specify the required parameters for the input stream, such as resolution, color format, and frame rate. The driver propagates this configuration to the video server and also sets up the DMA and the timer of the VSI peripheral accordingly.
  4. With VideoDrv_SetBuf, a DMA buffer is assigned for the incoming data.
  5. VideoDrv_SetFile specifies the input file. If the file is not found, an error is returned.
  6. The application then initiates streaming with VideoDrv_StreamStart. This opens the selected file, configures the VSI timer, and enables the VSI IRQ.
  7. With step 6 completed, a frame-reading loop is started:
    • A frame is read from the file and prepared according to the stream configuration (crop/resize and color-conversion operations are done in vsi_video_server.py).
    • The processed frame data is copied to the DMA buffer.
    • VideoDrv_GetStatus checks whether the frame buffer is full, meaning that a complete frame was transferred to the FVP memory. Alternatively, the asynchronous video driver event VIDEO_DRV_EVENT_FRAME can be used.
    • When the video buffer is full, the frame can be retrieved from it with VideoDrv_GetFrameBuf.
    • The obtained frame can then be passed to a processing algorithm (for example, object detection).
    • The frame is released with VideoDrv_ReleaseFrame to enable reception of the next frame.
    • The steps above are repeated until end of stream is reported in the status.
  8. When the end of the stream is reached, the stream is closed with VideoDrv_StreamStop and driver operation is stopped with VideoDrv_Uninitialize.

Data Structure Documentation

◆ VideoDrv_Status_t

struct VideoDrv_Status_t

Video Status.

Structure with information about the Video channel status. The data fields encode active and buffer state flags.

Returned by:
VideoDrv_GetStatus

Data Fields
uint32_t active: 1  Video stream active.
uint32_t buf_empty: 1  Video buffer empty.
uint32_t buf_full: 1  Video buffer full.
uint32_t overflow: 1  Video buffer overflow (cleared on GetStatus).
uint32_t underflow: 1  Video buffer underflow (cleared on GetStatus).
uint32_t eos: 1  Video end of stream (cleared on GetStatus).
uint32_t reserved: 26  Reserved (pads the flags to 32 bits).