During real-time communication, you can capture raw audio data and process it according to your needs. The Agora Unity SDK provides the AudioRawDataManager class for capturing raw audio data.
Before using the raw audio data function, ensure that you have implemented the basic real-time communication functions in your project. See Start a Call and Start Live Interactive Streaming for details.
Follow these steps to implement the raw audio data functions in your project:
1. Call RegisterAudioRawDataObserver to register an audio observer.
2. Call SetOnRecordAudioFrameCallback to listen for the OnRecordAudioFrameHandler callback, which returns an audio frame captured by the recording device.
3. Call SetOnPlaybackAudioFrameCallback to listen for the OnPlaybackAudioFrameHandler callback, which returns a mixed audio frame of all remote users.
4. Call SetOnMixedAudioFrameCallback to listen for the OnMixedAudioFrameHandler callback, which returns a mixed audio frame of the local user, all remote users, and their music files.
5. Call SetOnPlaybackAudioFrameBeforeMixingCallback to listen for the OnPlaybackAudioFrameBeforeMixingHandler callback, which returns an audio frame of a specified remote user.
6. To play the captured audio data with the AudioSource object of Unity, use the following workflow:
   - Create a queue.
   - In the callbacks above, get the buffer data of each audio frame and insert it at the rear of the queue.
   - Call the SetData method of the AudioClip object to take the buffer data out of the queue and store it in the AudioClip object.
   - Use the AudioSource object to play back the raw audio data in the AudioClip object.
7. After leaving the channel, call UnRegisterAudioRawDataObserver to remove the audio observer.

The following diagram shows how to implement the raw audio data functions in your project:
See the following sample code to implement the raw audio data functions in your project:
// Declared as class fields; fill in your App ID.
private string mVendorKey;
private IRtcEngine mRtcEngine;
private AudioRawDataManager audioRawDataManager;

void Start()
{
    // Initializes IRtcEngine.
    mRtcEngine = IRtcEngine.GetEngine(mVendorKey);
    // Gets the AudioRawDataManager object.
    audioRawDataManager = AudioRawDataManager.GetInstance(mRtcEngine);
    // Registers the audio observer.
    mRtcEngine.RegisterAudioRawDataObserver();
    // Listens for the OnRecordAudioFrameHandler delegate.
    audioRawDataManager.SetOnRecordAudioFrameCallback(OnRecordAudioFrameHandler);
    // Listens for the OnPlaybackAudioFrameHandler delegate.
    audioRawDataManager.SetOnPlaybackAudioFrameCallback(OnPlaybackAudioFrameHandler);
    // Listens for the OnMixedAudioFrameHandler delegate.
    audioRawDataManager.SetOnMixedAudioFrameCallback(OnMixedAudioFrameHandler);
    // Listens for the OnPlaybackAudioFrameBeforeMixingHandler delegate.
    audioRawDataManager.SetOnPlaybackAudioFrameBeforeMixingCallback(OnPlaybackAudioFrameBeforeMixingHandler);
}

// Gets an audio frame captured by the recording device.
void OnRecordAudioFrameHandler(AudioFrame audioFrame)
{
    Debug.Log("OnRecordAudioFrameHandler");
}

// Gets a mixed audio frame of all remote users.
void OnPlaybackAudioFrameHandler(AudioFrame audioFrame)
{
    Debug.Log("OnPlaybackAudioFrameHandler");
}

// Gets a mixed audio frame of the local user, all remote users, and their music files.
void OnMixedAudioFrameHandler(AudioFrame audioFrame)
{
    Debug.Log("OnMixedAudioFrameHandler");
}

// Gets an audio frame of a specified remote user.
void OnPlaybackAudioFrameBeforeMixingHandler(uint uid, AudioFrame audioFrame)
{
    Debug.Log("OnPlaybackAudioFrameBeforeMixingHandler");
}
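The sample code above registers the callbacks but does not show the queue-based playback workflow. The following is a minimal sketch of that workflow, assuming 16-bit PCM mono frames. Note that it uses the streaming variant of AudioClip.Create with a PCM reader callback rather than calling SetData directly; the class name AudioFramePlayer, the field names, and the SAMPLE_RATE and CHANNELS constants are illustrative and must match the frame format your callbacks actually receive:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class AudioFramePlayer : MonoBehaviour
{
    // Illustrative constants: must match the sample rate and channel
    // count of the frames delivered by the SDK callbacks.
    const int SAMPLE_RATE = 44100;
    const int CHANNELS = 1;

    // Queue holding the converted samples; guarded by a lock because
    // the SDK callback may run on a non-Unity thread.
    readonly Queue<float> _audioBuffer = new Queue<float>();
    readonly object _lock = new object();

    AudioSource _audioSource;

    void Awake()
    {
        _audioSource = gameObject.AddComponent<AudioSource>();
        // A streaming AudioClip that pulls data through OnAudioRead.
        var clip = AudioClip.Create("RawAudio", SAMPLE_RATE, CHANNELS,
                                    SAMPLE_RATE, true, OnAudioRead);
        _audioSource.clip = clip;
        _audioSource.loop = true;
        _audioSource.Play();
    }

    // Call this from a raw-data callback, e.g. OnPlaybackAudioFrameHandler,
    // passing audioFrame.buffer.
    public void PushFrame(byte[] buffer)
    {
        lock (_lock)
        {
            // Convert 16-bit little-endian PCM to the [-1, 1] float range
            // Unity expects, then insert at the rear of the queue.
            for (int i = 0; i + 1 < buffer.Length; i += 2)
            {
                short sample = (short)(buffer[i] | (buffer[i + 1] << 8));
                _audioBuffer.Enqueue(sample / 32768f);
            }
        }
    }

    // Unity calls this whenever the streaming clip needs more data;
    // it takes samples from the front of the queue (silence if empty).
    void OnAudioRead(float[] data)
    {
        lock (_lock)
        {
            for (int i = 0; i < data.Length; i++)
                data[i] = _audioBuffer.Count > 0 ? _audioBuffer.Dequeue() : 0f;
        }
    }
}
```

If the frames are stereo, the interleaved samples can be queued as-is with CHANNELS set to 2, since Unity also expects interleaved data in the reader callback.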
public enum AUDIO_FRAME_TYPE
{
// 0: PCM16
FRAME_TYPE_PCM16 = 0,
};
public struct AudioFrame
{
// The type of the audio frame. See #AUDIO_FRAME_TYPE.
public AUDIO_FRAME_TYPE type;
// The number of samples per channel in the audio frame.
public int samples;
// The number of bytes per audio sample: 2 for 16-bit PCM, the usual case.
public int bytesPerSample;
// The number of audio channels.
// - 1: Mono
// - 2: Stereo (the data is interleaved)
public int channels;
// The sample rate.
public int samplesPerSec;
// The data buffer of the audio frame. When the audio frame uses a stereo channel, the data buffer is interleaved.
// The size of the data buffer in bytes is: samples × channels × bytesPerSample.
public byte[] buffer;
// The timestamp of the external audio frame. You can use this parameter for the following purposes:
// - Restore the order of the captured audio frame.
// - Synchronize audio and video frames in video-related scenarios, including where external video sources are used.
public long renderTimeMs;
// Reserved for future use.
public int avsync_type;
};
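As a worked example of the buffer-size formula above (the 10 ms frame length and 48 kHz rate are illustrative; the actual values depend on your sample-rate settings):

```csharp
// A 10 ms frame at a 48 kHz sample rate, stereo, 16-bit PCM:
int samples = 48000 / 100;     // 480 samples per channel per 10 ms
int channels = 2;              // stereo, interleaved
int bytesPerSample = 2;        // PCM16
int bufferSize = samples * channels * bytesPerSample;  // 480 * 2 * 2 = 1920 bytes
```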
Use the following methods to modify the audio sample rate in the above callbacks:
- SetRecordingAudioFrameParameters
- SetPlaybackAudioFrameParameters
- SetMixedAudioFrameParameters

Call RegisterAudioRawDataObserver before joining a channel, and call UnRegisterAudioRawDataObserver after leaving the channel.
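For example, to request 16 kHz mono recorded frames, a call like the following can be made before joining a channel. The signature shown follows the v3 Agora Unity SDK API, and the read-only mode and samples-per-call values are illustrative; verify both against the API reference of your SDK version:

```csharp
// Request 16 kHz, mono, read-only recorded frames,
// with 1024 samples returned per callback.
mRtcEngine.SetRecordingAudioFrameParameters(16000, 1,
    RAW_AUDIO_FRAME_OP_MODE_TYPE.RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024);
```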