This page introduces how to use the Cloud Gateway to send media streams to the client and receive media streams from the client.
Before you begin, make sure you have downloaded the latest Cloud Gateway SDK. See Integrate the Java SDK.
Complete the following preparations before sending and receiving media streams.
You need to call the AgoraService constructor and the initialize method to create and initialize an AgoraService object. The AgoraService object persists as long as the server app keeps running.
The SDK supports user IDs in both integer and string format. This page shows only user IDs in integer format (the character set is limited to digits). To learn more about user IDs in string format, see Using User IDs in String Format.
// Import SDK, AgoraService, and AgoraServiceConfig classes for initialization
import io.agora.rtc.SDK;
import io.agora.rtc.AgoraService;
import io.agora.rtc.AgoraServiceConfig;
// Creates an AgoraService object
SDK.load(); // ensure JNI library load
AgoraService service = new AgoraService();
// Initializes the AgoraServiceConfig object
AgoraServiceConfig config = new AgoraServiceConfig();
// Enables the audio processing module
config.setEnableAudioProcessor(1);
// Disables the audio device module (Normally we do not directly connect audio capture or playback devices to a server)
config.setEnableAudioDevice(0);
// Enables video
config.setEnableVideo(1);
// Sets Agora App ID
config.setAppId(appid);
// Initializes the SDK
service.initialize(config);
After initializing the SDK, you can refer to the following steps to connect to the Agora RTC channel.
Call agoraRtcConnCreate to create an AgoraRtcConn object to connect to the Agora RTC channel:
AgoraRtcConn conn = service.agoraRtcConnCreate(null);
Call registerObserver to listen to connection events:
conn.registerObserver(new ConnObserver());
Call connect to connect to an Agora RTC channel:
conn.connect(token, "test_channel", "1");
Refer to the following steps to send media streams to the client:
You can use the IMediaNodeFactory object to create various types of media stream senders:

- AgoraAudioPcmDataSender: Sends audio data in PCM format.
- AgoraVideoFrameSender: Sends video data in YUV format.
- AgoraAudioEncodedFrameSender: Sends encoded audio data.
- AgoraVideoEncodedImageSender: Sends encoded video data.

Create an IMediaNodeFactory object:
AgoraMediaNodeFactory factory = service.createMediaNodeFactory();
Per your own requirements, create an AgoraAudioPcmDataSender object, an AgoraVideoFrameSender object, an AgoraAudioEncodedFrameSender object, or an AgoraVideoEncodedImageSender object for sending audio in PCM format, video in YUV format, encoded audio, or encoded video:
// Creates a sender for PCM audio
AgoraAudioPcmDataSender audioFrameSender = factory.createAudioPcmDataSender();
// Creates a sender for YUV video
AgoraVideoFrameSender videoFrameSender = factory.createVideoFrameSender();
// Creates a sender for encoded audio
AgoraAudioEncodedFrameSender audioEncodedFrameSender = factory.createAudioEncodedFrameSender();
// Creates a sender for encoded video
AgoraVideoEncodedImageSender imageSender = factory.createVideoEncodedImageSender();
Create an AgoraLocalAudioTrack object and an AgoraLocalVideoTrack object, which respectively correspond to the local audio track and local video track to be published to the Agora RTC channel:
// Creates a custom audio track that uses a PCM audio stream sender
customAudioTrack = service.createCustomAudioTrackPcm(audioFrameSender);
// Creates a custom audio track that uses an encoded audio stream sender
customAudioTrack = service.createCustomAudioTrackEncoded(audioEncodedFrameSender, 0);
// Creates a custom video track that uses a YUV video stream sender
customVideoTrack = service.createCustomVideoTrackFrame(videoFrameSender);
// Creates a custom video track that uses an encoded video stream sender
customVideoTrack = service.createCustomVideoTrackEncoded(imageSender, option);
Use the publish methods of the AgoraLocalUser object to publish the local audio and video tracks created in the previous step to the Agora RTC channel:
// Enables and publishes the audio and video tracks
customAudioTrack.setEnabled(1);
customVideoTrack.setEnabled(1);
conn.getLocalUser().publishAudio(customAudioTrack);
conn.getLocalUser().publishVideo(customVideoTrack);
Start the sending thread, which calls the send methods of the audio/video senders:
pcmSender = new PcmSender(audioFile, audioFrameSender, numOfChannels, sampleRate);
h264Sender = new H264Sender(videoFile, 1000/fps, 0, 0, imageSender);
pcmSender.start();
h264Sender.start();
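PcmSender and H264Sender shown below both extend a FileSender base class that belongs to the sample project and is not shown on this page. For reference, here is a minimal, hypothetical sketch of such a base class, assuming it pairs readOneFrame with sendOneFrame on a fixed timer; the sample project's actual implementation may differ:

```java
import java.io.FileInputStream;
import java.io.IOException;

// Hypothetical sketch of the FileSender base class: a thread that reads
// one frame from a file and sends it every intervalMs milliseconds.
abstract class FileSender extends Thread {
    private final String filepath;
    private final long intervalMs;
    private volatile boolean running = true;
    private FileInputStream fis;

    public FileSender(String filepath, long intervalMs) {
        this.filepath = filepath;
        this.intervalMs = intervalMs;
    }

    // The sample code also passes a third boolean flag; its meaning is not
    // shown on this page, so this sketch ignores it.
    public FileSender(String filepath, long intervalMs, boolean flag) {
        this(filepath, intervalMs);
    }

    // Subclasses read the next frame from the stream (null at end of file)
    public abstract byte[] readOneFrame(FileInputStream fos);

    // Subclasses push one frame to the corresponding SDK sender
    public abstract void sendOneFrame(byte[] data);

    // Rewinds to the beginning of the file
    public void reset() {
        try {
            if (fis != null) fis.close();
            fis = new FileInputStream(filepath);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        reset();
        while (running) {
            sendOneFrame(readOneFrame(fis));
            try {
                // Simple pacing; a production sender would compensate for drift
                Thread.sleep(intervalMs);
            } catch (InterruptedException e) {
                break;
            }
        }
    }

    public void release() {
        running = false;
        try {
            if (fis != null) fis.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```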
Taking PCM data as an example, the following code shows the implementation of PcmSender:
// Audio thread: sends one frame of audio data every 10 ms
class PcmSender extends FileSender {
    private AgoraAudioPcmDataSender audioFrameSender;
    private static final int INTERVAL = 10; // ms
    private int channels;
    private int samplerate;
    private int bufferSize = 0;
    private byte[] buffer;

    public PcmSender(String filepath, AgoraAudioPcmDataSender sender, int channels, int samplerate) {
        super(filepath, INTERVAL);
        audioFrameSender = sender;
        this.channels = channels;
        this.samplerate = samplerate;
        this.bufferSize = channels * samplerate * 2 * INTERVAL / 1000;
        this.buffer = new byte[this.bufferSize];
    }

    // sendOneFrame calls the send method of audioFrameSender to send PCM data
    @Override
    public void sendOneFrame(byte[] data) {
        if (data == null) return;
        audioFrameSender.send(data, (int) System.currentTimeMillis(), samplerate / (1000 / INTERVAL), 2, channels, samplerate);
    }

    @Override
    public byte[] readOneFrame(FileInputStream fos) {
        if (fos != null) {
            try {
                int size = fos.read(buffer, 0, bufferSize);
                if (size <= 0) {
                    reset();
                    return null;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return buffer;
    }
}
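The bufferSize computed in the PcmSender constructor is the size in bytes of one 10 ms frame of 16-bit PCM: channels × sample rate × 2 bytes per sample × interval / 1000. Similarly, the samples-per-channel argument passed to send is samplerate / (1000 / INTERVAL). The arithmetic can be checked in isolation (the class name here is only for illustration):

```java
// Bytes per frame of 16-bit (2-byte) PCM audio for a given frame interval.
public class PcmFrameSize {
    static int frameBytes(int channels, int sampleRate, int intervalMs) {
        return channels * sampleRate * 2 * intervalMs / 1000;
    }

    public static void main(String[] args) {
        // Mono 16 kHz, 10 ms: 1 * 16000 * 2 * 10 / 1000 = 320 bytes per frame
        System.out.println(frameBytes(1, 16000, 10)); // 320
        // Stereo 48 kHz, 10 ms: 2 * 48000 * 2 * 10 / 1000 = 1920 bytes per frame
        System.out.println(frameBytes(2, 48000, 10)); // 1920
    }
}
```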
Taking H.264 video data as an example, the following code shows the implementation of H264Sender:
class H264Sender extends FileSender {
    private AgoraVideoEncodedImageSender imageSender;
    private H264Reader h264Reader;
    private int lastFrameType = 0;
    private int height;
    private int width;
    private int fps;

    public H264Sender(String path, int interval, int height, int width, AgoraVideoEncodedImageSender videoEncodedImageSender) {
        super(path, interval, false);
        this.imageSender = videoEncodedImageSender;
        this.h264Reader = new H264Reader(path);
        this.height = height;
        this.width = width;
        this.fps = 1000 / interval;
    }

    // sendOneFrame calls the send method of imageSender to send H.264 data
    @Override
    public void sendOneFrame(byte[] data) {
        if (data == null) return;
        EncodedVideoFrameInfo info = new EncodedVideoFrameInfo();
        long currentTime = System.currentTimeMillis();
        info.setFrameType(lastFrameType);
        info.setWidth(width);
        info.setHeight(height);
        info.setCodecType(Constants.VIDEO_CODEC_H264);
        info.setCaptureTimeMs(currentTime);
        info.setRenderTimeMs(currentTime);
        info.setFramesPerSecond(fps);
        info.setRotation(0);
        imageSender.send(data, data.length, info);
    }

    @Override
    public byte[] readOneFrame(FileInputStream fos) {
        int retry = 0;
        H264Reader.H264Frame frame = h264Reader.readNextFrame();
        while (frame == null && retry < 4) {
            h264Reader.reset();
            frame = h264Reader.readNextFrame();
            retry++;
        }
        if (frame != null) {
            lastFrameType = frame.frameType;
            return frame.data;
        } else {
            return null;
        }
    }

    @Override
    public void release() {
        super.release();
        h264Reader.close();
    }
}
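The H264Reader used by H264Sender belongs to the sample project and is not shown on this page. As a rough illustration of what reading frames from an H.264 file involves, the following hypothetical helper splits an Annex B byte stream into NAL units by scanning for the 00 00 01 / 00 00 00 01 start codes; a real reader would additionally group NAL units into access units and detect the frame type:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: split an H.264 Annex B byte stream into NAL units
// by locating 00 00 01 (or 00 00 00 01) start codes.
public class AnnexBSplitter {
    static List<byte[]> splitNalUnits(byte[] stream) {
        // Record the index right after each three-byte start code
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 2 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                starts.add(i + 3);
                i += 2;
            }
        }
        // Each NAL unit runs until the next start code (or end of stream)
        List<byte[]> nals = new ArrayList<>();
        for (int n = 0; n < starts.size(); n++) {
            int from = starts.get(n);
            int to = (n + 1 < starts.size()) ? starts.get(n + 1) - 3 : stream.length;
            // A four-byte start code leaves one extra leading zero; trim it
            while (to > from && stream[to - 1] == 0) to--;
            nals.add(Arrays.copyOfRange(stream, from, to));
        }
        return nals;
    }
}
```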
You can set the format of the encoded video data by using the info parameter of the send method.
After finishing media sending tasks, you can refer to the following steps to disconnect from the channel and release resources.
Call the unpublish methods to stop publishing audio and video:
if (conn != null) {
    conn.getLocalUser().unpublishAudio(customAudioTrack);
    conn.getLocalUser().unpublishVideo(customVideoTrack);
}
Call unregisterObserver to unregister the connection observer:
conn.unregisterObserver();
Call disconnect to disconnect from the Agora RTC channel:
int ret = conn.disconnect();
Release the created objects:
conn.destroy();
Refer to the following steps to receive media streams from the client.

Use the ILocalUserObserver object to register video and audio frame observers. Instantiate the video and audio frame observer objects, and register them by using the member methods of the ILocalUserObserver class. After registration succeeds, the SDK triggers callbacks when audio or video frames are available, and you can get the audio and video frames from these callbacks.

The SampleLocalUserObserver class in the sample code not only contains observer objects of the IVideoEncodedFrameObserver, IAudioFrameObserverBase, and IVideoFrameObserver2 classes, but also inherits from the ILocalUserObserver class. The member methods of the ILocalUserObserver class can be used to register the video and audio frame observer objects.
// The SampleLocalUserObserver class inherits the ILocalUserObserver class
localUserObserver = new SampleLocalUserObserver(conn.getLocalUser());
conn.getLocalUser().registerObserver(localUserObserver);
// The PcmFrameObserver class inherits the IAudioFrameObserver class
int ret = conn.getLocalUser().setPlaybackAudioFrameBeforeMixingParameters(numOfChannels, sampleRate);
if (ret != 0) {
    System.out.printf("setPlaybackAudioFrameBeforeMixingParameters fail ret=%d\n", ret);
    return;
}
// The H264FrameReceiver class inherits the IVideoEncodedImageReceiver class
h264FrameReceiver = new H264FrameReceiver(videoFile);
conn.getLocalUser().registerVideoEncodedFrameObserver(new AgoraVideoEncodedFrameObserver(h264FrameReceiver));
setAudioFrameObserver and registerVideoEncodedFrameObserver are member methods of the SampleLocalUserObserver class; you can use them to register the IAudioFrameObserver object and the AgoraVideoEncodedFrameObserver object.
localUserObserver.setAudioFrameObserver(pcmFrameObserver);
conn.getLocalUser().registerVideoEncodedFrameObserver(new AgoraVideoEncodedFrameObserver(h264FrameReceiver));
Use the ILocalUserObserver class to receive media streams. The following code shows how to receive encoded video, video in YUV format, and audio in PCM format.
// Receives encoded video by using the onEncodedVideoFrame callback and saves the video data to a file
class H264FrameReceiver extends FileWriter implements IVideoEncodedFrameObserver {
    public H264FrameReceiver(String path) {
        super(path);
    }

    @Override
    public int onEncodedVideoFrame(AgoraVideoEncodedFrameObserver agora_video_encoded_frame_observer, int uid, byte[] image_buffer, long length, EncodedVideoFrameInfo video_encoded_frame_info) {
        System.out.println("onEncodedVideoFrame success " + video_encoded_frame_info.getFrameType());
        writeData(image_buffer, (int) length);
        return 1;
    }
}
// Receives YUV video by using the onFrame callback and saves the video data to a file
class YuvFrameObserver extends FileWriter implements IVideoFrameObserver2 {
    public YuvFrameObserver(String path) {
        super(path);
    }

    @Override
    public void onFrame(AgoraVideoFrameObserver2 agora_video_frame_observer2, String channel_id, String remote_uid, VideoFrame frame) {
        System.out.println("onFrame success");
        writeData(frame.getYBuffer(), frame.getYBuffer().remaining());
        writeData(frame.getUBuffer(), frame.getUBuffer().remaining());
        writeData(frame.getVBuffer(), frame.getVBuffer().remaining());
    }
}
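The three writeData calls above store the frame in planar I420 order (Y plane, then U, then V). Because the U and V planes are subsampled by two in each dimension, one frame occupies width × height × 3 / 2 bytes. The helper below (illustrative only, not part of the SDK) checks that arithmetic:

```java
// Byte count of one I420 (YUV 4:2:0) frame: a full-resolution Y plane
// plus quarter-resolution U and V planes.
public class I420Size {
    static int frameBytes(int width, int height) {
        int y = width * height;
        int u = (width / 2) * (height / 2);
        int v = u;
        return y + u + v; // equals width * height * 3 / 2 for even dimensions
    }

    public static void main(String[] args) {
        System.out.println(frameBytes(640, 360));  // 345600
        System.out.println(frameBytes(1280, 720)); // 1382400
    }
}
```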
// Receives PCM audio by using the onPlaybackAudioFrameBeforeMixing callback and saves the audio data to a file
public static class PcmFrameObserver extends FileWriter implements IAudioFrameObserver {
    public PcmFrameObserver(String outputFilePath) {
        super(outputFilePath);
    }

    @Override
    public int onRecordAudioFrame(AgoraLocalUser agora_local_user, String channel_id, AudioFrame frame) {
        System.out.println("onRecordAudioFrame success");
        return 1;
    }

    @Override
    public int onPlaybackAudioFrame(AgoraLocalUser agora_local_user, String channel_id, AudioFrame frame) {
        System.out.println("onPlaybackAudioFrame success");
        return 1;
    }

    @Override
    public int onMixedAudioFrame(AgoraLocalUser agora_local_user, String channel_id, AudioFrame frame) {
        System.out.println("onMixedAudioFrame success");
        return 1;
    }

    @Override
    public int onPlaybackAudioFrameBeforeMixing(AgoraLocalUser agora_local_user, String channel_id, String uid, AudioFrame audioFrame) {
        // Writes 16-bit PCM samples: samplesPerChannel * channels * 2 bytes
        int writeBytes = audioFrame.getSamplesPerChannel() * audioFrame.getChannels() * 2;
        writeData(audioFrame.getBuffer(), writeBytes);
        return 1;
    }
}
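All three observers extend a FileWriter base class from the sample project that this page does not show. A minimal sketch, assuming it simply appends the received bytes to one output file, could look like this; the actual sample implementation may differ:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

// Hypothetical sketch of the FileWriter base class: appends raw frame
// bytes to a single output file.
public class FileWriter {
    private FileOutputStream fos;

    public FileWriter(String path) {
        try {
            fos = new FileOutputStream(path);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Writes `length` bytes from a byte array
    public synchronized void writeData(byte[] data, int length) {
        if (fos == null || data == null) return;
        try {
            fos.write(data, 0, length);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // Writes `length` bytes from a (possibly direct) ByteBuffer
    public synchronized void writeData(ByteBuffer buffer, int length) {
        if (buffer == null) return;
        byte[] tmp = new byte[length];
        buffer.get(tmp);
        writeData(tmp, length);
    }

    public synchronized void close() {
        try {
            if (fos != null) fos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```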
When the media receiving task is complete, refer to the following steps to disconnect from the Agora RTC channel.
Call the unset methods to release the video and audio observers.
// Release video and audio observer
localUserObserver.unsetAudioFrameObserver();
localUserObserver.unsetVideoFrameObserver();
Call disconnect to disconnect from the Agora RTC channel.
// Disconnects from the Agora channel
int ret = conn.disconnect();
if (ret != 0) {
    System.out.printf("conn.disconnect fail ret=%d\n", ret);
}
Release the created objects.
// Releases the created objects.
conn.destroy();
When your server app stops running, refer to the following sample code to release the AgoraService object.
// Destroy Agora Service
service.destroy();
Here you can find a link to the API reference and developer considerations.
Refer to Cloud Gateway Java API Reference to find detailed API and parameter descriptions.
The Cloud Gateway only supports the LIVE_BROADCASTING channel profile.