During real-time communications, you can pre- and post-process the video data and modify it for desired playback effects.
The Native SDK uses the IVideoFrameObserver class to provide raw video data functions. You can pre-process the data before it is sent to the encoder and modify the captured video frames. You can also post-process the data after it is decoded and modify the received video frames.
Agora provides open-source sample projects on GitHub that implement the raw video data function. You can download them to try out this function and view the source code.
Follow these steps to implement the raw video data function in your project:
1. Call registerVideoFrameObserver to register the video frame observer, and implement the IVideoFrameObserver class.
2. Once the observer is registered, the SDK triggers the onCaptureVideoFrame, onPreEncodeVideoFrame, or onRenderVideoFrame callbacks for each video frame.
3. Process the video data you receive in the callbacks as needed, then return it to the SDK through the same callbacks.
Because these callbacks are C++ APIs, you need to call them in an Objective-C++ (.mm) file. Add the following header files to the .mm file:
#import <AgoraRtcKit/IAgoraMediaEngine.h>
#import <AgoraRtcKit/IAgoraRtcEngine.h>
The following diagram shows how to implement the raw data function in your project:
In addition to the API call sequence diagram, you can refer to the following code samples to implement the raw video data function in your project.
Initialize AgoraRtcEngineKit, and enable the video module.
// Swift
// Initialize AgoraRtcEngineKit
let config = AgoraRtcEngineConfig()
agoraKit = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self)
// Enable the video module
agoraKit.enableVideo()
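The registration code in the next step uses an agoraMediaDataPlugin instance, which is not created by the SDK itself. A minimal sketch of creating it, assuming the AgoraMediaDataPlugin wrapper from the sample project exposes an initializer that takes the engine instance:
// Swift
// Create the data plugin wrapper that is registered in the next step
// (the initializer name is an assumption based on the AgoraMediaDataPlugin sample)
agoraMediaDataPlugin = AgoraMediaDataPlugin(agoraKit: agoraKit)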
Register the video frame observer.
// Swift
let videoType: ObserverVideoType = ObserverVideoType(
    rawValue: ObserverVideoType.captureVideo.rawValue |
        ObserverVideoType.renderVideo.rawValue |
        ObserverVideoType.preEncodeVideo.rawValue)
agoraMediaDataPlugin?.registerVideoRawDataObserver(videoType)
agoraMediaDataPlugin?.videoDelegate = self
In the .mm file, call the C++ APIs to implement registerVideoRawDataObserver.
- (void)registerVideoRawDataObserver:(ObserverVideoType)observerType {
    // Get the C++ handle of the Native SDK
    agora::rtc::IRtcEngine* rtc_engine = (agora::rtc::IRtcEngine*)self.agoraKit.getNativeHandle;
    // Create IMediaEngine
    agora::util::AutoPtr<agora::media::IMediaEngine> mediaEngine;
    mediaEngine.queryInterface(rtc_engine, agora::AGORA_IID_MEDIA_ENGINE);
    // Call registerVideoFrameObserver to register the video frame observer
    mediaEngine->registerVideoFrameObserver(&s_videoFrameObserver);
    s_videoFrameObserver.mediaDataPlugin = self;
}
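If you only need the locally captured frames, you can register for a single frame type instead of all three. A sketch using the same ObserverVideoType options as above, assuming the plugin forwards only the frame types whose bits are set:
// Swift
// Observe only the frames captured by the local camera
let captureOnly = ObserverVideoType(rawValue: ObserverVideoType.captureVideo.rawValue)
agoraMediaDataPlugin?.registerVideoRawDataObserver(captureOnly)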
Join a channel.
// Swift
let result = agoraKit.joinChannel(byToken: KeyCenter.Token, channelId: channelName, info: nil, uid: 0) { [unowned self] (channel, uid, elapsed) -> Void in
    self.isJoined = true
    LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info)
}
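joinChannel returns 0 when the call is accepted; a quick sketch of checking the return value (the .error log level is an assumption about the sample's LogUtils helper):
// Swift
// A negative return value means the call was rejected, for example because of invalid parameters
if result != 0 {
    LogUtils.log(message: "joinChannel failed: \(result)", level: .error)
}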
Get the raw video data, and process the data.
// Swift
// Get the video frame captured by the local camera.
func mediaDataPlugin(_ mediaDataPlugin: AgoraMediaDataPlugin, didCapturedVideoRawData videoRawData: AgoraVideoRawData) -> AgoraVideoRawData {
    return videoRawData
}
// Get the video frame sent by the remote user.
func mediaDataPlugin(_ mediaDataPlugin: AgoraMediaDataPlugin, willRenderVideoRawData videoRawData: AgoraVideoRawData, ofUid uid: uint) -> AgoraVideoRawData {
    return videoRawData
}
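The callbacks above return the frames unchanged. The sketch below shows one way to modify a frame in place before returning it, replacing the pass-through version of didCapturedVideoRawData with one that turns the captured image grayscale by neutralizing the chroma planes. It assumes the frames are I420 and that AgoraVideoRawData exposes the uBuffer/vBuffer pointers, uStride/vStride, and height properties defined in the AgoraMediaDataPlugin sample.
// Swift
// A minimal sketch: make the captured frame grayscale by setting the U and V planes
// to the neutral chroma value 128. Assumes I420 frames and the AgoraVideoRawData
// properties (uBuffer, vBuffer, uStride, vStride, height) from the sample plugin.
func mediaDataPlugin(_ mediaDataPlugin: AgoraMediaDataPlugin, didCapturedVideoRawData videoRawData: AgoraVideoRawData) -> AgoraVideoRawData {
    let chromaHeight = Int(videoRawData.height) / 2
    memset(videoRawData.uBuffer, 0x80, Int(videoRawData.uStride) * chromaHeight)
    memset(videoRawData.vBuffer, 0x80, Int(videoRawData.vStride) * chromaHeight)
    return videoRawData
}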
In the .mm file, call the C++ APIs to implement the callbacks that get the raw data.
// Get the video frame captured by the local camera through the onCaptureVideoFrame callback
virtual bool onCaptureVideoFrame(VideoFrame& videoFrame) override
{
    if (!mediaDataPlugin && ((mediaDataPlugin.observerVideoType >> 0) == 0)) return true;
    @autoreleasepool {
        AgoraVideoRawData *newData = nil;
        if ([mediaDataPlugin.videoDelegate respondsToSelector:@selector(mediaDataPlugin:didCapturedVideoRawData:)]) {
            AgoraVideoRawData *data = getVideoRawDataWithVideoFrame(videoFrame);
            newData = [mediaDataPlugin.videoDelegate mediaDataPlugin:mediaDataPlugin didCapturedVideoRawData:data];
            modifiedVideoFrameWithNewVideoRawData(videoFrame, newData);
        }
    }
    return true;
}

// Get the video frame sent by the remote user through the onRenderVideoFrame callback
virtual bool onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame) override
{
    if (!mediaDataPlugin && ((mediaDataPlugin.observerVideoType >> 1) == 0)) return true;
    @autoreleasepool {
        AgoraVideoRawData *newData = nil;
        if ([mediaDataPlugin.videoDelegate respondsToSelector:@selector(mediaDataPlugin:willRenderVideoRawData:ofUid:)]) {
            AgoraVideoRawData *data = getVideoRawDataWithVideoFrame(videoFrame);
            newData = [mediaDataPlugin.videoDelegate mediaDataPlugin:mediaDataPlugin willRenderVideoRawData:data ofUid:uid];
            modifiedVideoFrameWithNewVideoRawData(videoFrame, newData);
        }
    }
    return true;
}
Unregister the video frame observer.
// Swift
agoraMediaDataPlugin?.deregisterVideoRawDataObserver(ObserverVideoType(rawValue: 0))
Call the C++ APIs in the .mm file to implement deregisterVideoRawDataObserver.
- (void)deregisterVideoRawDataObserver:(ObserverVideoType)observerType {
    // Get the C++ handle of the Native SDK
    agora::rtc::IRtcEngine* rtc_engine = (agora::rtc::IRtcEngine*)self.agoraKit.getNativeHandle;
    // Create IMediaEngine
    agora::util::AutoPtr<agora::media::IMediaEngine> mediaEngine;
    mediaEngine.queryInterface(rtc_engine, agora::AGORA_IID_MEDIA_ENGINE);
    // Pass NULL to registerVideoFrameObserver to unregister the video frame observer
    mediaEngine->registerVideoFrameObserver(NULL);
    s_videoFrameObserver.mediaDataPlugin = nil;
}
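Unregistering typically happens when you tear the session down, for example before leaving the channel. A possible cleanup sequence (a sketch, not part of the sample):
// Swift
// Unregister the observer, leave the channel, and release the engine when the session ends
func cleanup() {
    agoraMediaDataPlugin?.deregisterVideoRawDataObserver(ObserverVideoType(rawValue: 0))
    agoraKit.leaveChannel(nil)
    AgoraRtcEngineKit.destroy()
}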
API reference: getNativeHandle, registerVideoFrameObserver, onCaptureVideoFrame, onRenderVideoFrame, and onPreEncodeVideoFrame.
Considerations: The raw video data APIs are C++ APIs, so you need to call them in an Objective-C++ (.mm) file. See AgoraMediaDataPlugin.mm for reference. Call getNativeHandle to get the C++ handle each time before calling registerVideoFrameObserver.
Refer to Raw Audio Data if you want to implement the raw audio data function in your project.