During real-time communications, you can pre- and post-process received audio and video data to achieve the desired playback effect.
Agora provides the raw data function for developers who need to process audio or video data to meet the requirements of their scenarios. This function enables you to pre-process the captured video frame or audio signal before sending it to the encoder, or to post-process the decoded video frame or audio signal.
The Agora Unity SDK provides the VideoRawDataManager class for capturing and processing the raw video data.
Before using the raw data function, ensure that you have implemented the basic real-time communication functions in your project. See Start a Video Call or Start Interactive Video Streaming for details.
Follow these steps to implement the raw video data functions in your project:
1. Choose either of the following methods to register a video observer:
   - Call EnableVideoObserver to register a video observer. This method applies to scenarios where the SDK captures and renders the video.
   - Call RegisterVideoRawDataObserver to register a video observer. This method applies to scenarios where the SDK does not handle video capturing and rendering, and you handle them according to your needs.
2. After you successfully register the observer, call the following methods to listen for callbacks according to your needs:
   - Call SetOnCaptureVideoFrameCallback to listen for the OnCaptureVideoFrameHandler callback, which returns a video frame captured by the local camera.
   - Call SetOnRenderVideoFrameCallback to listen for the OnRenderVideoFrameHandler callback, which returns a video frame sent by the remote user.
3. Process the captured raw video data according to your needs.
4. Choose either of the following methods to unregister the video observer:
   - Call DisableVideoObserver to unregister the video observer.
   - Call UnRegisterVideoRawDataObserver to unregister the video observer.

The following diagram shows how to implement the raw data functions in your project:
See the following sample code to implement the raw video data functions in your project:
void Start()
{
    // Initializes the IRtcEngine object.
    mRtcEngine = IRtcEngine.GetEngine(mVendorKey);
    // Gets the VideoRawDataManager object.
    videoRawDataManager = VideoRawDataManager.GetInstance(mRtcEngine);
    // Enables the video module.
    mRtcEngine.EnableVideo();
    // Enables the video observer.
    mRtcEngine.EnableVideoObserver();
    // Listens for the OnCaptureVideoFrameHandler delegate.
    videoRawDataManager.SetOnCaptureVideoFrameCallback(OnCaptureVideoFrameHandler);
    // Listens for the OnRenderVideoFrameHandler delegate.
    videoRawDataManager.SetOnRenderVideoFrameCallback(OnRenderVideoFrameHandler);
}

// Gets a video frame sent by the remote user.
void OnRenderVideoFrameHandler(uint uid, VideoFrame videoFrame)
{
    Debug.Log("OnRenderVideoFrameHandler");
}

// Gets a video frame captured by the local camera.
void OnCaptureVideoFrameHandler(VideoFrame videoFrame)
{
    Debug.Log("OnCaptureVideoFrameHandler");
}
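As an illustration of step 3, processing the captured raw data, the sketch below converts the RGBA buffer of a frame to grayscale in place. It assumes a FRAME_TYPE_RGBA frame with 4 bytes per pixel, matching the VideoFrame struct; the helper name is our own, not an SDK API, and whether in-place edits affect the encoded stream depends on the SDK's buffer ownership, so treat this as a sketch of iterating the buffer.

```csharp
// Illustrative helper (not an SDK API): converts an RGBA frame buffer to
// grayscale in place. Call it from OnCaptureVideoFrameHandler, for example.
void ConvertFrameToGrayscale(VideoFrame videoFrame)
{
    byte[] buffer = videoFrame.buffer;
    // Each pixel occupies 4 bytes: R, G, B, A.
    for (int i = 0; i + 3 < buffer.Length; i += 4)
    {
        // Standard luma weights for R, G, and B; the alpha byte is left unchanged.
        byte gray = (byte)(0.299f * buffer[i] + 0.587f * buffer[i + 1] + 0.114f * buffer[i + 2]);
        buffer[i] = gray;
        buffer[i + 1] = gray;
        buffer[i + 2] = gray;
    }
}
```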
public enum VIDEO_FRAME_TYPE {
    /** 0: YUV420. */
    FRAME_TYPE_YUV420 = 0,
    /** 1: RGBA. */
    FRAME_TYPE_RGBA = 1,
};
public struct VideoFrame {
    // The video frame type. Supports FRAME_TYPE_RGBA only.
    public VIDEO_FRAME_TYPE type;
    // The width of the video frame.
    public int width;
    // The height of the video frame.
    public int height;
    // The line span of the Y buffer within the video data.
    public int yStride;
    // The buffer of the RGBA data.
    public byte[] buffer;
    // Sets the rotation of the video frame before rendering the video. Supports 0, 90, 180, and 270 degrees clockwise.
    public int rotation;
    // The timestamp of the video frame.
    public long renderTimeMs;
    // Reserved for future use.
    public int avsync_type;
};
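If you register the observer with RegisterVideoRawDataObserver and render the video yourself, one common approach is to copy each remote RGBA frame into a Unity Texture2D. The sketch below assumes the callback runs on (or is marshaled to) Unity's main thread, which Unity's texture APIs require; the RemoteFrameRenderer class and texture field are our own names.

```csharp
using UnityEngine;

public class RemoteFrameRenderer : MonoBehaviour
{
    // Hypothetical target texture; assign it to a material or RawImage in your scene.
    private Texture2D texture;

    // Copies a remote RGBA frame into the texture. Unity texture APIs must run on
    // the main thread, so marshal the call there if the SDK invokes this elsewhere.
    public void OnRenderVideoFrameHandler(uint uid, VideoFrame videoFrame)
    {
        if (texture == null || texture.width != videoFrame.width || texture.height != videoFrame.height)
        {
            texture = new Texture2D(videoFrame.width, videoFrame.height, TextureFormat.RGBA32, false);
        }
        texture.LoadRawTextureData(videoFrame.buffer);
        texture.Apply();
    }
}
```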
When you call EnableVideoObserver or DisableVideoObserver, the SDK calls RegisterVideoRawDataObserver or UnRegisterVideoRawDataObserver automatically, so you do not need to call them again.
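As a sketch of the other registration path, the snippet below registers the observer directly with RegisterVideoRawDataObserver and unregisters it when the scene object is destroyed. The field names follow the earlier sample, and StartSelfRendering is our own method name; adapt both to your project.

```csharp
// Sketch of the self-rendering path: register the raw data observer directly
// when the SDK does not handle video capturing and rendering.
void StartSelfRendering()
{
    videoRawDataManager.RegisterVideoRawDataObserver();
    videoRawDataManager.SetOnCaptureVideoFrameCallback(OnCaptureVideoFrameHandler);
    videoRawDataManager.SetOnRenderVideoFrameCallback(OnRenderVideoFrameHandler);
}

void OnDestroy()
{
    // Unregisters the observer when the scene object is destroyed.
    videoRawDataManager.UnRegisterVideoRawDataObserver();
}
```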