# Using AudioRenderer for Audio Playback

The AudioRenderer is used to play Pulse Code Modulation (PCM) audio data. Unlike the [AVPlayer](../media/using-avplayer-for-playback.md), the AudioRenderer allows you to preprocess audio data before input. It is therefore more suitable if you have extensive audio development experience and want to implement more flexible playback features.

## Development Guidelines

The full rendering process involves creating an **AudioRenderer** instance, configuring audio rendering parameters, starting and stopping rendering, and releasing the instance. In this topic, you will learn how to use the AudioRenderer to render audio data. Before starting development, you are advised to read [AudioRenderer](../../reference/apis-audio-kit/js-apis-audio.md#audiorenderer8) for the API reference.

The figure below shows the state changes of the AudioRenderer. After an **AudioRenderer** instance is created, different APIs can be called to switch the AudioRenderer to different states and trigger the required behavior. If an API is called when the AudioRenderer is not in the required state, the system may throw an exception or cause other undefined behavior. Therefore, you are advised to check the AudioRenderer state before triggering a state transition.

To prevent the UI thread from being blocked, most **AudioRenderer** calls are asynchronous. Each API is available in both callback and promise form. The following examples use the callback form.
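
If you prefer promise-style calls, the same APIs can be awaited. A minimal sketch, assuming an **AudioRenderer** instance created as shown in the steps below:

```ts
import { audio } from '@kit.AudioKit';

// Promise-style equivalents of the callback APIs used in this topic (sketch).
async function renderWithPromises(audioRenderer: audio.AudioRenderer): Promise<void> {
  await audioRenderer.start();   // Start rendering.
  // ... audio data is supplied through the 'writeData' callback ...
  await audioRenderer.stop();    // Stop rendering.
  await audioRenderer.release(); // Release the instance.
}
```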

**Figure 1** AudioRenderer state transition

![AudioRenderer state transition](figures/audiorenderer-status-change.png)

During application development, you are advised to use [on('stateChange')](../../reference/apis-audio-kit/js-apis-audio.md#onstatechange-8) to subscribe to state changes of the AudioRenderer (a minimal sketch follows the state list below). This is because some operations can be performed only when the AudioRenderer is in a given state. If the application performs an operation when the AudioRenderer is not in the given state, the system may throw an exception or cause other undefined behavior.

- **prepared**: The AudioRenderer enters this state by calling [createAudioRenderer()](../../reference/apis-audio-kit/js-apis-audio.md#audiocreateaudiorenderer8).

- **running**: The AudioRenderer enters this state by calling [start()](../../reference/apis-audio-kit/js-apis-audio.md#start8) when it is in the **prepared**, **paused**, or **stopped** state.

- **paused**: The AudioRenderer enters this state by calling [pause()](../../reference/apis-audio-kit/js-apis-audio.md#pause8) when it is in the **running** state. When the audio playback is paused, the application can call [start()](../../reference/apis-audio-kit/js-apis-audio.md#start8) to resume the playback.

- **stopped**: The AudioRenderer enters this state by calling [stop()](../../reference/apis-audio-kit/js-apis-audio.md#stop8) when it is in the **paused** or **running** state.

- **released**: The AudioRenderer enters this state by calling [release()](../../reference/apis-audio-kit/js-apis-audio.md#release8) when it is in the **prepared**, **paused**, or **stopped** state. In this state, the AudioRenderer releases all occupied hardware and software resources and will not transition to any other state.
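
For example, a minimal sketch of the [on('stateChange')](../../reference/apis-audio-kit/js-apis-audio.md#onstatechange-8) subscription recommended above, assuming an **AudioRenderer** instance named **audioRenderer**:

```ts
import { audio } from '@kit.AudioKit';

// Track renderer state changes so that operations are only issued in valid states.
audioRenderer.on('stateChange', (state: audio.AudioState) => {
  if (state === audio.AudioState.STATE_RUNNING) {
    console.info('AudioRenderer is running.');
  } else if (state === audio.AudioState.STATE_PAUSED) {
    console.info('AudioRenderer is paused.');
  }
});
```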

### How to Develop

1. Set audio rendering parameters and create an **AudioRenderer** instance. For details about the parameters, see [AudioRendererOptions](../../reference/apis-audio-kit/js-apis-audio.md#audiorendereroptions8).

    ```ts
    import { audio } from '@kit.AudioKit';

    let audioStreamInfo: audio.AudioStreamInfo = {
      samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
      channels: audio.AudioChannel.CHANNEL_2, // Channel.
      sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
      encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
    };

    let audioRendererInfo: audio.AudioRendererInfo = {
      usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
      rendererFlags: 0
    };

    let audioRendererOptions: audio.AudioRendererOptions = {
      streamInfo: audioStreamInfo,
      rendererInfo: audioRendererInfo
    };

    audio.createAudioRenderer(audioRendererOptions, (err, data) => {
      if (err) {
        console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
        return;
      } else {
        console.info('Invoke createAudioRenderer succeeded.');
        let audioRenderer = data;
      }
    });
    ```
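
    **createAudioRenderer()** also provides a promise-based overload. A minimal sketch of the equivalent promise-style creation, reusing the **audioRendererOptions** defined above:

    ```ts
    import { audio } from '@kit.AudioKit';
    import { BusinessError } from '@kit.BasicServicesKit';

    // Promise-based creation, equivalent to the callback form above.
    audio.createAudioRenderer(audioRendererOptions).then((renderer: audio.AudioRenderer) => {
      console.info('Invoke createAudioRenderer succeeded.');
      let audioRenderer = renderer;
    }).catch((err: BusinessError) => {
      console.error(`Invoke createAudioRenderer failed, code is ${err.code}, message is ${err.message}`);
    });
    ```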

2. Call **on('writeData')** to subscribe to the audio data write callback.

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';
    import { fileIo } from '@kit.CoreFileKit';

    let bufferSize: number = 0;
    class Options {
      offset?: number;
      length?: number;
    }

    let path = getContext().cacheDir;
    // Ensure that the resource exists in the path.
    let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
    let file: fileIo.File = fileIo.openSync(filePath, fileIo.OpenMode.READ_ONLY);

    let writeDataCallback = (buffer: ArrayBuffer) => {
      let options: Options = {
        offset: bufferSize,
        length: buffer.byteLength
      };
      fileIo.readSync(file.fd, buffer, options);
      bufferSize += buffer.byteLength;
    };

    audioRenderer.on('writeData', writeDataCallback);
    ```
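
    When the file is exhausted, **readSync** fills only part of the buffer. One possible way to handle this, reusing the variables declared above, is to zero-fill the remainder so that stale data is not rendered (a sketch; it assumes **readSync** returns the number of bytes actually read):

    ```ts
    let writeDataCallbackWithEof = (buffer: ArrayBuffer) => {
      let options: Options = {
        offset: bufferSize,
        length: buffer.byteLength
      };
      // readSync returns the number of bytes actually read from the file.
      let bytesRead = fileIo.readSync(file.fd, buffer, options);
      if (bytesRead < buffer.byteLength) {
        // End of file: zero-fill the rest of the buffer to avoid rendering stale data.
        new Uint8Array(buffer, bytesRead).fill(0);
      }
      bufferSize += bytesRead;
    };
    ```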

3. Call **start()** to switch the AudioRenderer to the **running** state and start rendering.

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';

    audioRenderer.start((err: BusinessError) => {
      if (err) {
        console.error(`Renderer start failed, code is ${err.code}, message is ${err.message}`);
      } else {
        console.info('Renderer start success.');
      }
    });
    ```
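
    If you need to pause and later resume the playback, call **pause()** while the AudioRenderer is in the **running** state and **start()** again to resume. A minimal callback-style sketch:

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';

    audioRenderer.pause((err: BusinessError) => {
      if (err) {
        console.error(`Renderer pause failed, code is ${err.code}, message is ${err.message}`);
      } else {
        console.info('Renderer paused.');
      }
    });
    ```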

4. Call **stop()** to stop rendering.

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';

    audioRenderer.stop((err: BusinessError) => {
      if (err) {
        console.error(`Renderer stop failed, code is ${err.code}, message is ${err.message}`);
      } else {
        console.info('Renderer stopped.');
      }
    });
    ```

5. Call **release()** to release the instance.

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';

    audioRenderer.release((err: BusinessError) => {
      if (err) {
        console.error(`Renderer release failed, code is ${err.code}, message is ${err.message}`);
      } else {
        console.info('Renderer released.');
      }
    });
    ```
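
    Once rendering is finished, also close the audio file opened in step 2 (the complete sample below closes it when rendering stops):

    ```ts
    // Close the audio file opened in step 2.
    fileIo.close(file);
    ```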

### Selecting the Correct Stream Usage

When developing a media player, it is important to set the stream usage type correctly according to the intended use case. This ensures that the player behaves as expected in different scenarios.

The recommended use cases are described in [StreamUsage](../../reference/apis-audio-kit/js-apis-audio.md#streamusage). For example, **STREAM_USAGE_MUSIC** is recommended for music scenarios, **STREAM_USAGE_MOVIE** for movie or video scenarios, and **STREAM_USAGE_GAME** for gaming scenarios.

An incorrect **StreamUsage** configuration may cause unexpected behavior. Example scenarios are as follows:

- When **STREAM_USAGE_MUSIC** is incorrectly used in a game scenario, the game audio cannot play at the same time as music applications, even though games are usually expected to coexist with background music. An example configuration for the game scenario is shown after this list.
- When **STREAM_USAGE_MUSIC** is incorrectly used in a navigation scenario, any playing music is interrupted when the navigation application provides audio guidance, whereas users generally expect the music to keep playing at a lower volume while navigation is active.
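
For example, a renderer intended for in-game sound effects could declare its usage as follows (a minimal sketch; the rest of the renderer configuration is unchanged from the steps above):

```ts
import { audio } from '@kit.AudioKit';

let gameRendererInfo: audio.AudioRendererInfo = {
  usage: audio.StreamUsage.STREAM_USAGE_GAME, // Game scenario, so the stream can mix with background music.
  rendererFlags: 0 // AudioRenderer flag.
};
```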

### Sample Code

Refer to the sample code below to render an audio file using the AudioRenderer.

```ts
import { audio } from '@kit.AudioKit';
import { BusinessError } from '@kit.BasicServicesKit';
import { fileIo } from '@kit.CoreFileKit';

const TAG = 'AudioRendererDemo';

class Options {
  offset?: number;
  length?: number;
}

let context = getContext(this);
let bufferSize: number = 0;
let renderModel: audio.AudioRenderer | undefined = undefined;
let audioStreamInfo: audio.AudioStreamInfo = {
  samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000, // Sampling rate.
  channels: audio.AudioChannel.CHANNEL_2, // Channel.
  sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE, // Sampling format.
  encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW // Encoding format.
};
let audioRendererInfo: audio.AudioRendererInfo = {
  usage: audio.StreamUsage.STREAM_USAGE_MUSIC, // Audio stream usage type.
  rendererFlags: 0 // AudioRenderer flag.
};
let audioRendererOptions: audio.AudioRendererOptions = {
  streamInfo: audioStreamInfo,
  rendererInfo: audioRendererInfo
};
let path = context.cacheDir;
// Ensure that the resource exists in the path.
let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file: fileIo.File = fileIo.openSync(filePath, fileIo.OpenMode.READ_ONLY);

let writeDataCallback = (buffer: ArrayBuffer) => {
  let options: Options = {
    offset: bufferSize,
    length: buffer.byteLength
  };
  fileIo.readSync(file.fd, buffer, options);
  bufferSize += buffer.byteLength;
};

// Create an AudioRenderer instance, and set the events to listen for.
function init() {
  audio.createAudioRenderer(audioRendererOptions, (err, renderer) => { // Create an AudioRenderer instance.
    if (!err) {
      console.info(`${TAG}: creating AudioRenderer success`);
      renderModel = renderer;
      if (renderModel !== undefined) {
        (renderModel as audio.AudioRenderer).on('writeData', writeDataCallback);
      }
    } else {
      console.error(`${TAG}: creating AudioRenderer failed, error: ${err.message}`);
    }
  });
}

// Start audio rendering.
function start() {
  if (renderModel !== undefined) {
    let stateGroup = [audio.AudioState.STATE_PREPARED, audio.AudioState.STATE_PAUSED, audio.AudioState.STATE_STOPPED];
    if (stateGroup.indexOf((renderModel as audio.AudioRenderer).state.valueOf()) === -1) { // Rendering can be started only when the AudioRenderer is in the prepared, paused, or stopped state.
      console.error(`${TAG}: start failed`);
      return;
    }
    // Start rendering.
    (renderModel as audio.AudioRenderer).start((err: BusinessError) => {
      if (err) {
        console.error('Renderer start failed.');
      } else {
        console.info('Renderer start success.');
      }
    });
  }
}

// Pause the rendering.
function pause() {
  if (renderModel !== undefined) {
    // Rendering can be paused only when the AudioRenderer is in the running state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING) {
      console.info('Renderer is not running');
      return;
    }
    // Pause the rendering.
    (renderModel as audio.AudioRenderer).pause((err: BusinessError) => {
      if (err) {
        console.error('Renderer pause failed.');
      } else {
        console.info('Renderer pause success.');
      }
    });
  }
}

// Stop rendering.
async function stop() {
  if (renderModel !== undefined) {
    // Rendering can be stopped only when the AudioRenderer is in the running or paused state.
    if ((renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_RUNNING && (renderModel as audio.AudioRenderer).state.valueOf() !== audio.AudioState.STATE_PAUSED) {
      console.info('Renderer is not running or paused.');
      return;
    }
    // Stop rendering.
    (renderModel as audio.AudioRenderer).stop((err: BusinessError) => {
      if (err) {
        console.error('Renderer stop failed.');
      } else {
        fileIo.close(file);
        console.info('Renderer stop success.');
      }
    });
  }
}

// Release the instance.
async function release() {
  if (renderModel !== undefined) {
    // The AudioRenderer can be released only when it is not in the released state.
    if (renderModel.state.valueOf() === audio.AudioState.STATE_RELEASED) {
      console.info('Renderer already released');
      return;
    }
    // Release the resources.
    (renderModel as audio.AudioRenderer).release((err: BusinessError) => {
      if (err) {
        console.error('Renderer release failed.');
      } else {
        console.info('Renderer release success.');
      }
    });
  }
}
```

When an audio stream with the same or a higher priority needs to use the output device, the current audio playback is interrupted. The application can respond to and handle the interruption event. For details, see [Processing Audio Interruption Events](audio-playback-concurrency.md).
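
For reference, a minimal sketch of subscribing to interruption events on an existing **AudioRenderer** instance (see the linked topic for the complete handling logic):

```ts
import { audio } from '@kit.AudioKit';

// Listen for audio interruption events and react according to the hint.
audioRenderer.on('audioInterrupt', (interruptEvent: audio.InterruptEvent) => {
  if (interruptEvent.hintType === audio.InterruptHint.INTERRUPT_HINT_PAUSE) {
    // The stream has been or should be paused; update the application state accordingly.
    console.info('Audio playback interrupted: pause hint received.');
  } else if (interruptEvent.hintType === audio.InterruptHint.INTERRUPT_HINT_RESUME) {
    console.info('Audio playback can be resumed.');
  }
});
```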