Video Capture



Description

The video capture feature allows developers to access the camera streams of the Lynx-R1: RGB, Tracking, and Handtracking.

Frame formats:

RGB
Tracking
Handtracking

Get started

1. Start the camera you want to stream from, with the maximum expected FPS.

LynxCaptureAPI.StartCapture(LynxCaptureAPI.ESensorType.RGB, fps); // Open the RGB camera stream
LynxCaptureAPI.StartCapture(LynxCaptureAPI.ESensorType.TRACKING, fps); // Open the Tracking camera stream
LynxCaptureAPI.StartCapture(LynxCaptureAPI.ESensorType.HANDTRACKING, fps); // Open the Handtracking camera stream


2. Attach your callback, which will be called each time a frame is received.

LynxCaptureAPI.onRGBFrames += OnCallback; // Attach function called at each frame from the RGB stream
LynxCaptureAPI.onTrackingFrames += OnCallback; // Attach function called at each frame from the Tracking stream
LynxCaptureAPI.onHandtrackingFrames += OnCallback; // Attach function called at each frame from the Handtracking stream

public void OnCallback(LynxFrameInfo frameInfo)
{
    // Process the frame here
}
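
When a stream is no longer needed, you will typically want to detach your callback as well. Assuming these are standard C# events (only the += usage is shown above), unsubscribing works with the -= operator:

LynxCaptureAPI.onRGBFrames -= OnCallback; // Detach the callback from the RGB stream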


3. Stop the camera

LynxCaptureAPI.StopCamera(LynxCaptureAPI.ESensorType.RGB); // Stop RGB cameras
LynxCaptureAPI.StopCamera(LynxCaptureAPI.ESensorType.TRACKING); // Stop Tracking cameras
LynxCaptureAPI.StopCamera(LynxCaptureAPI.ESensorType.HANDTRACKING); // Stop Handtracking cameras
LynxCaptureAPI.StopAllCameras(); // Stop all camera streams
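
As an end-to-end illustration, the three steps above can live in a single MonoBehaviour. This is only a sketch built from the calls shown on this page: the component name and the FPS value of 30 are placeholders, and the namespace exposing LynxCaptureAPI is not shown.

using UnityEngine;
// Add the using directive for the namespace that exposes LynxCaptureAPI in your SDK version.

public class RgbCaptureExample : MonoBehaviour // hypothetical component name
{
    private const int FPS = 30; // illustrative value: use the maximum FPS you expect to consume

    private void Start()
    {
        // 1. Start the RGB camera stream.
        LynxCaptureAPI.StartCapture(LynxCaptureAPI.ESensorType.RGB, FPS);

        // 2. Attach the per-frame callback.
        LynxCaptureAPI.onRGBFrames += OnCallback;
    }

    private void OnCallback(LynxFrameInfo frameInfo)
    {
        // Process the frame here.
    }

    private void OnDestroy()
    {
        // 3. Detach the callback and stop every camera stream.
        LynxCaptureAPI.onRGBFrames -= OnCallback;
        LynxCaptureAPI.StopAllCameras();
    }
}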


LynxFrameInfo

LynxFrameInfo is the structure passed to your callback for each received frame.

To facilitate image processing, the LynxOpenCV API also provides some useful helper functions:

LynxOpenCV.YUV2RGB(IntPtr YUVBuffer, uint width, uint height, byte[] outRGBBuffer); // Convert YUV buffer to RGB buffer
LynxOpenCV.YUV2RGBA(IntPtr YUVBuffer, uint width, uint height, byte[] outRGBBuffer); // Convert YUV buffer to RGBA buffer
LynxOpenCV.Compose_ARVR_2_RGBA_From_YUV_AR(IntPtr inYUVBufferAR, uint width, uint height, byte[] inOutRGBABufferVR); // Blend AR YUV and VR RGBA together based on VR alpha
LynxOpenCV.Compose_ARVR_2_RGBA_from_RGBA_AR(byte[] inRGBABufferAR, uint width, uint height, byte[] inOutRGBABufferVR); // Blend AR RGBA and VR RGBA together based on VR alpha
LynxOpenCV.ResizeFrame(uint width, uint height, uint depth, uint new_width, uint new_height, byte[] inBuffer, byte[] outBuffer); // Resize buffer from byte array
LynxOpenCV.ResizeFrame(uint width, uint height, uint depth, uint new_width, uint new_height, IntPtr inBuffer, byte[] outBuffer); // Resize buffer from IntPtr
LynxOpenCV.FlipRGBAFrame(uint width, uint height, bool verticalFlip, bool HorizontalFlip, byte[] inOutBuffer); // Flip an RGBA buffer horizontally and/or vertically
LynxOpenCV.FlipRGBFrame(uint width, uint height, bool verticalFlip, bool HorizontalFlip, byte[] inOutBuffer); // Flip an RGB buffer horizontally and/or vertically
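
For example, a callback can convert the incoming YUV frame to RGBA and upload it into a Unity Texture2D. The sketch below makes explicit assumptions: the LynxFrameInfo field names (width, height, leftEyeBuffer) are hypothetical placeholders for whatever the actual struct exposes, and the buffer is assumed to be in the YUV layout expected by LynxOpenCV.YUV2RGBA.

// Fields on an implied MonoBehaviour; the frameInfo member names below are hypothetical.
private Texture2D m_texture;
private byte[] m_rgbaBuffer;

public void OnCallback(LynxFrameInfo frameInfo)
{
    uint width = frameInfo.width;   // hypothetical field name
    uint height = frameInfo.height; // hypothetical field name

    if (m_rgbaBuffer == null)
    {
        m_rgbaBuffer = new byte[width * height * 4]; // 4 bytes per pixel for RGBA
        m_texture = new Texture2D((int)width, (int)height, TextureFormat.RGBA32, false);
    }

    // Convert the YUV frame to an RGBA buffer with the helper listed above.
    LynxOpenCV.YUV2RGBA(frameInfo.leftEyeBuffer, width, height, m_rgbaBuffer); // buffer field name is hypothetical

    // Texture2D calls must run on Unity's main thread; if this callback fires on another thread,
    // copy the buffer and upload it from Update() instead.
    m_texture.LoadRawTextureData(m_rgbaBuffer);
    m_texture.Apply();
}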


Samples

The package contains samples to help you get started with Video Capture.

Limitations

This package currently has the following limitations:


Status

The current package is a preview version.

This feature is planned to be integrated as a full part of the SDK.

Due to these limitations, the current version is provided as a complete library. Later, video capture will be provided directly as a service on the Lynx-R1 to ease development builds and reduce their size.