# Introduction to AVCodec Kit

Audio and Video Codec (AVCodec) Kit provides capabilities such as audio and video encoding and decoding, media file muxing and demuxing, and media data input.

## Capability Scope

- Media data input: Media applications can pass in the file descriptor (FD) of a file or the URL of a stream for subsequent processing, such as media information parsing.
- Media foundation: provides common basic types for media data processing, including [AVBuffer](../../reference/apis-avcodec-kit/native__avbuffer_8h.md) and [AVFormat](../../reference/apis-avcodec-kit/native__avformat_8h.md). A usage sketch of these two types follows this list.
- Audio encoding: Audio applications (such as audio calling and audio recording applications) can send uncompressed audio data to the audio encoder for encoding. By setting parameters such as the encoding format, bit rate, and sampling rate, they obtain compressed audio files in the desired formats.
- Video encoding: Video applications (such as video calling and video recording applications) can send uncompressed video data to the video encoder for encoding. By setting parameters such as the encoding format, bit rate, and frame rate, they obtain compressed video files in the desired formats.
- Audio decoding: Audio applications (such as audio calling applications and audio players) can send audio streams to the audio decoder for decoding. The decoded data can be sent to audio devices for playback.
- Video decoding: Video applications (such as video calling applications and video players) can send video streams to the video decoder for decoding. The decoded image data can be sent to display devices for display.
- Media file demuxing: Media applications (such as audio and video players) can parse media files stored locally or received over the network to obtain the audio and video streams, presentation time, encoding formats, and basic file attributes.
- Media file muxing: Media applications (such as audio and video recording applications) can mux stream data output by the encoder into media files (in MP4 or M4A format), writing the audio and video streams, presentation time, encoding formats, and basic file attributes into the specified file in the given container format.
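As a quick illustration of the media foundation types referenced above, the following minimal sketch creates an AVFormat, sets a couple of description keys, and attaches time attributes to an AVBuffer. It is written against the C declarations in native_avformat.h, native_avbuffer.h, and native_avcodec_base.h; the include paths, buffer capacity, and 1080p values are assumptions for illustration, and error handling is omitted.

```c
// Minimal sketch of the media foundation types (assumed include paths; error
// handling omitted). The capacity and resolution values are placeholders.
#include <multimedia/player_framework/native_avbuffer.h>
#include <multimedia/player_framework/native_avcodec_base.h>
#include <multimedia/player_framework/native_avformat.h>

void MediaFoundationSketch(void)
{
    // OH_AVFormat carries key-value description data, for example a video resolution.
    OH_AVFormat *format = OH_AVFormat_Create();
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_WIDTH, 1920);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_HEIGHT, 1080);

    // OH_AVBuffer carries sample data plus its attributes (pts, size, offset, flags).
    OH_AVBuffer *buffer = OH_AVBuffer_Create(1024 * 1024); // capacity in bytes (assumption)
    OH_AVCodecBufferAttr attr = {0};
    attr.pts = 0;  // presentation timestamp of the sample, in microseconds
    attr.size = 0; // number of valid bytes the application wrote into the buffer
    OH_AVBuffer_SetBufferAttr(buffer, &attr);

    OH_AVBuffer_Destroy(buffer);
    OH_AVFormat_Destroy(format);
}
```
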
## Highlights

- Zero copy of internal data: During video decoding, the AVCodec module provides an AVBuffer through a callback function. The application writes the sample data to be decoded directly into this AVBuffer, so the data reaches the decoder without an extra copy in memory.

- Hardware acceleration for video codecs: H.264, H.265, and H.265 10-bit hardware codecs are supported.

## Basic Concepts

- Media file: a file that carries media data such as audio, video, and subtitles, for example, an .mp4 or .m4a file.
- Streaming media: a media transmission mode that supports downloading and playing at the same time. The supported download protocols include HTTP/HTTPS and HLS.
- Audio and video encoding: the process of compressing uncompressed audio or video data into another format, such as H.264 or AAC.
- Audio and video decoding: the process of converting compressed data back into an uncompressed sequence of original audio or video data, such as YUV or PCM.
- Media file muxing: the process of writing media data (such as audio, video, and subtitles) and its description information into a file in a given container format, for example, .mp4.
- Media file demuxing: the process of reading media data (such as audio, video, and subtitles) from a file and parsing the description information.
- Sample: a group of data with the same timing attributes.

  In the case of audio and video, a sample typically means compressed data that shares the same decoding timestamp.

  In the case of subtitles, it generally includes the content that is meant to be displayed at a certain time point.

  At the end of all tracks, the sample is considered to be empty.

## Usage

- Video codec

  The input of video encoding and the output of video decoding are in surface mode.

  During encoding and decoding, an application is notified of the data processing status through callback functions. For example, during encoding, the application is notified once a frame is encoded and an AVBuffer is output; during decoding, the application is notified once a frame of the stream arrives at the decoder. When decoding is complete, the application is also notified and can perform subsequent processing on the data.

  The following figure shows the video encoding and decoding logic.

  *(Figure: video encoding and decoding logic)*

  For details about the development guide, see [Video Decoding in Surface Output](video-decoding.md#surface-output) and [Video Encoding in Surface Input](video-encoding.md#surface-input).
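  The following minimal sketch shows one way to set up an H.264 decoder in surface mode with the callbacks registered, written against the C declarations in native_avcodec_videodecoder.h and native_avcodec_base.h. The include paths, resolution values, and the origin of the OHNativeWindow are assumptions; callback bodies and error handling are trimmed, so treat it as a sketch rather than the full flow described in the development guide.

  ```c
  // Minimal sketch of an H.264 decoder in surface mode (assumed include paths
  // and resolution; callback bodies and error handling are trimmed).
  #include <multimedia/player_framework/native_avcodec_base.h>
  #include <multimedia/player_framework/native_avcodec_videodecoder.h>
  #include <multimedia/player_framework/native_averrors.h>

  static void OnError(OH_AVCodec *codec, int32_t errorCode, void *userData) {}
  static void OnStreamChanged(OH_AVCodec *codec, OH_AVFormat *format, void *userData) {}
  static void OnNeedInputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
  {
      // The application writes one sample of compressed data into buffer here
      // (zero copy: the buffer is provided by the codec), then returns it.
      OH_VideoDecoder_PushInputBuffer(codec, index);
  }
  static void OnNewOutputBuffer(OH_AVCodec *codec, uint32_t index, OH_AVBuffer *buffer, void *userData)
  {
      // In surface mode, the decoded frame is sent to the bound surface for display.
      OH_VideoDecoder_RenderOutputBuffer(codec, index);
  }

  // window is assumed to come from the application's display component.
  int StartSurfaceDecoder(OHNativeWindow *window)
  {
      OH_AVCodec *decoder = OH_VideoDecoder_CreateByMime(OH_AVCODEC_MIMETYPE_VIDEO_AVC);

      OH_AVCodecCallback cb = {
          .onError = OnError,
          .onStreamChanged = OnStreamChanged,
          .onNeedInputBuffer = OnNeedInputBuffer,
          .onNewOutputBuffer = OnNewOutputBuffer,
      };
      OH_VideoDecoder_RegisterCallback(decoder, cb, NULL);

      OH_AVFormat *format = OH_AVFormat_Create();
      OH_AVFormat_SetIntValue(format, OH_MD_KEY_WIDTH, 1920);  // stream resolution (assumed)
      OH_AVFormat_SetIntValue(format, OH_MD_KEY_HEIGHT, 1080);
      OH_VideoDecoder_Configure(decoder, format);
      OH_AVFormat_Destroy(format);

      OH_VideoDecoder_SetSurface(decoder, window); // surface (output) mode
      OH_VideoDecoder_Prepare(decoder);
      return OH_VideoDecoder_Start(decoder) == AV_ERR_OK ? 0 : -1;
  }
  ```

  Because the decoder renders directly to the bound surface in this mode, the application never copies the decoded image itself, which is how the zero-copy highlight above applies to surface-mode output.
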
- Audio codec

  The input of audio encoding and the output of audio decoding are in PCM format.

  During encoding and decoding, an application is notified of the data processing status through callback functions. For example, during encoding, the application is notified once a frame is encoded and an AVBuffer is output; during decoding, the application is notified once a frame of the stream arrives at the decoder. When decoding is complete, the application is also notified and can perform subsequent processing on the data.

  The following figure shows the audio encoding and decoding logic.

  *(Figure: audio encoding and decoding logic)*

  For details about the development guide, see [Audio Decoding](audio-decoding.md) and [Audio Encoding](audio-encoding.md).

- File muxing and demuxing

  During file muxing, an application sends an AVBuffer to the corresponding muxer interface. The AVBuffer can be one output by the preceding encoding process or one created by the application itself, and it must carry valid stream data and the related time description. During file demuxing, the application obtains an AVBuffer that carries stream data from the corresponding demuxer interface; this AVBuffer can then be sent to the decoder for decoding.

  The following figure shows the file muxing and demuxing logic.

  *(Figure: file muxing and demuxing logic)*

  For details about the development guide, see [Audio and Video Demuxing](audio-video-demuxer.md) and [Audio and Video Muxing](audio-video-muxer.md).
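  The following minimal sketch shows the demuxing direction described above: it opens a local file, creates a source and a demuxer, and reads samples into an AVBuffer until the empty end-of-track sample arrives. It is written against the C declarations in native_avsource.h, native_avdemuxer.h, and native_avbuffer.h; the include paths, the buffer capacity, and the choice of track 0 are assumptions, and error handling is omitted.

  ```c
  // Minimal sketch of demuxing the first track of a local file (assumed include
  // paths and buffer capacity; error handling and track selection logic trimmed).
  #include <fcntl.h>
  #include <sys/stat.h>
  #include <unistd.h>
  #include <multimedia/player_framework/native_avbuffer.h>
  #include <multimedia/player_framework/native_avdemuxer.h>
  #include <multimedia/player_framework/native_avsource.h>
  #include <multimedia/player_framework/native_averrors.h>

  void DemuxFirstTrack(const char *path)
  {
      int fd = open(path, O_RDONLY);
      struct stat st;
      fstat(fd, &st);

      OH_AVSource *source = OH_AVSource_CreateWithFD(fd, 0, st.st_size);
      OH_AVDemuxer *demuxer = OH_AVDemuxer_CreateWithSource(source);
      OH_AVDemuxer_SelectTrackByID(demuxer, 0); // demux only the first track (assumption)

      // Each successful read fills one sample: compressed data plus its time
      // description, ready to be sent to the decoder.
      OH_AVBuffer *sample = OH_AVBuffer_Create(4 * 1024 * 1024); // capacity is an assumption
      while (OH_AVDemuxer_ReadSampleBuffer(demuxer, 0, sample) == AV_ERR_OK) {
          OH_AVCodecBufferAttr attr;
          OH_AVBuffer_GetBufferAttr(sample, &attr);
          if (attr.flags & AVCODEC_BUFFER_FLAGS_EOS) { // empty sample at the end of the track
              break;
          }
          // ... hand the sample to the decoder (or to the muxer) here ...
      }

      OH_AVBuffer_Destroy(sample);
      OH_AVDemuxer_Destroy(demuxer);
      OH_AVSource_Destroy(source);
      close(fd);
  }
  ```

  The muxing direction is symmetrical: the application writes AVBuffers that already carry valid stream data and time attributes to the muxer interface, which assembles them into the target media file.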