# Using MindSpore Lite for Image Classification (C/C++)

## When to Use

You can use [MindSpore](../../reference/apis-mindspore-lite-kit/_mind_spore.md) to quickly deploy AI algorithms into your application to perform AI model inference for image classification.

Image classification can be used to recognize objects in images and is widely used in medical image analysis, autonomous driving, e-commerce, and facial recognition.

## Basic Concepts

- N-API: a set of native APIs used to build ArkTS components. N-APIs can be used to encapsulate C/C++ libraries into ArkTS modules.

## Development Process

1. Select an image classification model.
2. Use MindSpore Lite to run inference on the device and classify the selected image.

## Environment Preparation

Install DevEco Studio 4.1 or later, and update the SDK to API version 11 or later.

## How to Develop

The following uses inference on an image in the album as an example to describe how to use MindSpore Lite to implement image classification.

### Selecting a Model

This sample application uses [mobilenetv2.ms](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/1.5/mobilenetv2.ms) as the image classification model. The model file is available in the **entry/src/main/resources/rawfile** project directory.

If you have other pre-trained models for image classification, convert the original model into the .ms format by referring to [Using MindSpore Lite for Model Conversion](mindspore-lite-converter-guidelines.md).

### Writing Code

#### Image Input and Preprocessing

1. Call [@ohos.file.picker](../../reference/apis-core-file-kit/js-apis-file-picker.md) to pick the desired image from the album.

   ```ts
   import { photoAccessHelper } from '@kit.MediaLibraryKit';
   import { BusinessError } from '@kit.BasicServicesKit';

   let uris: Array<string> = [];

   // Create an image picker instance.
   let photoSelectOptions = new photoAccessHelper.PhotoSelectOptions();

   // Set the media file type to IMAGE and set the maximum number of media files that can be selected.
   photoSelectOptions.MIMEType = photoAccessHelper.PhotoViewMIMETypes.IMAGE_TYPE;
   photoSelectOptions.maxSelectNumber = 1;

   // Create an album picker instance and call select() to open the album page for file selection. After file selection is done, the result set is returned through photoSelectResult.
   let photoPicker = new photoAccessHelper.PhotoViewPicker();
   photoPicker.select(photoSelectOptions, async (
     err: BusinessError, photoSelectResult: photoAccessHelper.PhotoSelectResult) => {
     if (err) {
       console.error('MS_LITE_ERR: PhotoViewPicker.select failed with err: ' + JSON.stringify(err));
       return;
     }
     console.info('MS_LITE_LOG: PhotoViewPicker.select successfully, ' +
       'photoSelectResult uri: ' + JSON.stringify(photoSelectResult));
     uris = photoSelectResult.photoUris;
     console.info('MS_LITE_LOG: uri: ' + uris);
   })
   ```

2. Based on the input image size, call [@ohos.multimedia.image](../../reference/apis-image-kit/js-apis-image.md) and [@ohos.file.fs](../../reference/apis-core-file-kit/js-apis-file-fs.md) to perform operations such as cropping the image, obtaining the image buffer, and standardizing the image.

   ```ts
   import { image } from '@kit.ImageKit';
   import { fileIo } from '@kit.CoreFileKit';

   let modelInputHeight: number = 224;
   let modelInputWidth: number = 224;

   // Based on the specified URI, call fileIo.openSync to open the file and obtain the FD.
   let file = fileIo.openSync(uris[0], fileIo.OpenMode.READ_ONLY);
   console.info('MS_LITE_LOG: file fd: ' + file.fd);

   // Based on the FD, call fileIo.readSync to read the data in the file.
   let inputBuffer = new ArrayBuffer(4096000);
   let readLen = fileIo.readSync(file.fd, inputBuffer);
   console.info('MS_LITE_LOG: readSync data to file succeed and inputBuffer size is:' + readLen);

   // Perform image preprocessing through PixelMap.
   let imageSource = image.createImageSource(file.fd);
   imageSource.createPixelMap().then((pixelMap) => {
     pixelMap.getImageInfo().then((info) => {
       console.info('MS_LITE_LOG: info.width = ' + info.size.width);
       console.info('MS_LITE_LOG: info.height = ' + info.size.height);
       // Crop the image based on the input image size and obtain the image buffer readBuffer.
       pixelMap.scale(256.0 / info.size.width, 256.0 / info.size.height).then(() => {
         pixelMap.crop(
           { x: 16, y: 16, size: { height: modelInputHeight, width: modelInputWidth } }
         ).then(async () => {
           let info = await pixelMap.getImageInfo();
           console.info('MS_LITE_LOG: crop info.width = ' + info.size.width);
           console.info('MS_LITE_LOG: crop info.height = ' + info.size.height);
           // Set the size of readBuffer.
           let readBuffer = new ArrayBuffer(modelInputHeight * modelInputWidth * 4);
           await pixelMap.readPixelsToBuffer(readBuffer);
           console.info('MS_LITE_LOG: Succeeded in reading image pixel data, buffer: ' +
             readBuffer.byteLength);
           // Convert readBuffer to the float32 format, and standardize the image.
           const imageArr = new Uint8Array(
             readBuffer.slice(0, modelInputHeight * modelInputWidth * 4));
           console.info('MS_LITE_LOG: imageArr length: ' + imageArr.length);
           let means = [0.485, 0.456, 0.406];
           let stds = [0.229, 0.224, 0.225];
           let float32View = new Float32Array(modelInputHeight * modelInputWidth * 3);
           let index = 0;
           for (let i = 0; i < imageArr.length; i++) {
             if ((i + 1) % 4 == 0) {
               float32View[index] = (imageArr[i - 3] / 255.0 - means[0]) / stds[0]; // B
               float32View[index + 1] = (imageArr[i - 2] / 255.0 - means[1]) / stds[1]; // G
               float32View[index + 2] = (imageArr[i - 1] / 255.0 - means[2]) / stds[2]; // R
               index += 3;
             }
           }
           console.info('MS_LITE_LOG: float32View length: ' + float32View.length);
           let printStr = 'float32View data:';
           for (let i = 0; i < 20; i++) {
             printStr += ' ' + float32View[i];
           }
           console.info('MS_LITE_LOG: ' + printStr);
         })
       })
     });
   });
   ```
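
The per-channel standardization above can also be isolated into a small, self-contained function that is easy to test outside the UI flow. The following is an illustrative sketch (the `standardize` function and the white-pixel example are not part of the sample; it assumes 4 bytes per pixel with the alpha byte last and the same ImageNet means and standard deviations):

```typescript
// Standardize raw pixel bytes (4 bytes per pixel, alpha last) into a
// float32 array with 3 channels per pixel, using ImageNet statistics.
function standardize(pixels: Uint8Array, width: number, height: number): Float32Array {
  const means = [0.485, 0.456, 0.406];
  const stds = [0.229, 0.224, 0.225];
  const out = new Float32Array(width * height * 3);
  let index = 0;
  for (let i = 0; i < pixels.length; i += 4) {
    // Drop the alpha byte; scale each channel to [0, 1], then standardize.
    for (let c = 0; c < 3; c++) {
      out[index + c] = (pixels[i + c] / 255.0 - means[c]) / stds[c];
    }
    index += 3;
  }
  return out;
}

// A single white pixel maps each channel to (1 - mean) / std.
const v = standardize(new Uint8Array([255, 255, 255, 255]), 1, 1);
console.log(v[0].toFixed(3)); // 2.249
```
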

#### Writing Inference Code

Call [MindSpore](../../reference/apis-mindspore-lite-kit/_mind_spore.md) to implement inference on the device. The operation process is as follows:

1. Include the corresponding header file.

   ```c++
   #include <iostream>
   #include <sstream>
   #include <stdlib.h>
   #include <hilog/log.h>
   #include <rawfile/raw_file_manager.h>
   #include <mindspore/types.h>
   #include <mindspore/model.h>
   #include <mindspore/context.h>
   #include <mindspore/status.h>
   #include <mindspore/tensor.h>
   #include "napi/native_api.h"
   ```

2. Read the model file.

   ```c++
   #define LOGI(...) ((void)OH_LOG_Print(LOG_APP, LOG_INFO, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGD(...) ((void)OH_LOG_Print(LOG_APP, LOG_DEBUG, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGW(...) ((void)OH_LOG_Print(LOG_APP, LOG_WARN, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))
   #define LOGE(...) ((void)OH_LOG_Print(LOG_APP, LOG_ERROR, LOG_DOMAIN, "[MSLiteNapi]", __VA_ARGS__))

   void *ReadModelFile(NativeResourceManager *nativeResourceManager, const std::string &modelName, size_t *modelSize) {
       auto rawFile = OH_ResourceManager_OpenRawFile(nativeResourceManager, modelName.c_str());
       if (rawFile == nullptr) {
           LOGE("MS_LITE_ERR: Open model file failed");
           return nullptr;
       }
       long fileSize = OH_ResourceManager_GetRawFileSize(rawFile);
       void *modelBuffer = malloc(fileSize);
       if (modelBuffer == nullptr) {
           LOGE("MS_LITE_ERR: Allocate model buffer failed");
           OH_ResourceManager_CloseRawFile(rawFile);
           return nullptr;
       }
       int ret = OH_ResourceManager_ReadRawFile(rawFile, modelBuffer, fileSize);
       if (ret == 0) {
           LOGE("MS_LITE_ERR: OH_ResourceManager_ReadRawFile failed");
           free(modelBuffer);
           OH_ResourceManager_CloseRawFile(rawFile);
           return nullptr;
       }
       OH_ResourceManager_CloseRawFile(rawFile);
       *modelSize = fileSize;
       return modelBuffer;
   }
   ```

3. Create a context, set parameters such as the number of threads and device type, and load the model.

   ```c++
   void DestroyModelBuffer(void **buffer) {
       if (buffer == nullptr) {
           return;
       }
       free(*buffer);
       *buffer = nullptr;
   }

   OH_AI_ModelHandle CreateMSLiteModel(void *modelBuffer, size_t modelSize) {
       // Set the execution context for the model.
       auto context = OH_AI_ContextCreate();
       if (context == nullptr) {
           DestroyModelBuffer(&modelBuffer);
           LOGE("MS_LITE_ERR: Create MSLite context failed.\n");
           return nullptr;
       }
       auto cpu_device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);

       OH_AI_DeviceInfoSetEnableFP16(cpu_device_info, true);
       OH_AI_ContextAddDeviceInfo(context, cpu_device_info);

       // Create the model.
       auto model = OH_AI_ModelCreate();
       if (model == nullptr) {
           DestroyModelBuffer(&modelBuffer);
           LOGE("MS_LITE_ERR: Allocate MSLite Model failed.\n");
           return nullptr;
       }

       // Build the model object.
       auto build_ret = OH_AI_ModelBuild(model, modelBuffer, modelSize, OH_AI_MODELTYPE_MINDIR, context);
       DestroyModelBuffer(&modelBuffer);
       if (build_ret != OH_AI_STATUS_SUCCESS) {
           OH_AI_ModelDestroy(&model);
           LOGE("MS_LITE_ERR: Build MSLite model failed.\n");
           return nullptr;
       }
       LOGI("MS_LITE_LOG: Build MSLite model success.\n");
       return model;
   }
   ```

4. Set the model input data and perform model inference.

   ```c++
   constexpr int K_NUM_PRINT_OF_OUT_DATA = 20;

   // Set the model input data.
   int FillInputTensor(OH_AI_TensorHandle input, const std::vector<float> &input_data) {
       if (OH_AI_TensorGetDataType(input) == OH_AI_DATATYPE_NUMBERTYPE_FLOAT32) {
           float *data = (float *)OH_AI_TensorGetMutableData(input);
           for (size_t i = 0; i < OH_AI_TensorGetElementNum(input); i++) {
               data[i] = input_data[i];
           }
           return OH_AI_STATUS_SUCCESS;
       } else {
           return OH_AI_STATUS_LITE_ERROR;
       }
   }

   // Execute model inference. The caller owns the model handle and is
   // responsible for destroying it on failure.
   int RunMSLiteModel(OH_AI_ModelHandle model, const std::vector<float> &input_data) {
       // Set the input data for the model.
       auto inputs = OH_AI_ModelGetInputs(model);

       auto ret = FillInputTensor(inputs.handle_list[0], input_data);
       if (ret != OH_AI_STATUS_SUCCESS) {
           LOGE("MS_LITE_ERR: RunMSLiteModel set input error.\n");
           return OH_AI_STATUS_LITE_ERROR;
       }
       // Get the model output.
       auto outputs = OH_AI_ModelGetOutputs(model);
       // Run inference.
       auto predict_ret = OH_AI_ModelPredict(model, inputs, &outputs, nullptr, nullptr);
       if (predict_ret != OH_AI_STATUS_SUCCESS) {
           LOGE("MS_LITE_ERR: MSLite Predict error.\n");
           return OH_AI_STATUS_LITE_ERROR;
       }
       LOGI("MS_LITE_LOG: Run MSLite model Predict success.\n");
       // Print the output tensor data.
       LOGI("MS_LITE_LOG: Get model outputs:\n");
       for (size_t i = 0; i < outputs.handle_num; i++) {
           auto tensor = outputs.handle_list[i];
           LOGI("MS_LITE_LOG: - Tensor %{public}d name is: %{public}s.\n", static_cast<int>(i),
                OH_AI_TensorGetName(tensor));
           LOGI("MS_LITE_LOG: - Tensor %{public}d size is: %{public}d.\n", static_cast<int>(i),
                (int)OH_AI_TensorGetDataSize(tensor));
           LOGI("MS_LITE_LOG: - Tensor data is:\n");
           auto out_data = reinterpret_cast<const float *>(OH_AI_TensorGetData(tensor));
           std::stringstream outStr;
           for (int j = 0; (j < OH_AI_TensorGetElementNum(tensor)) && (j < K_NUM_PRINT_OF_OUT_DATA); j++) {
               outStr << out_data[j] << " ";
           }
           LOGI("MS_LITE_LOG: %{public}s", outStr.str().c_str());
       }
       return OH_AI_STATUS_SUCCESS;
   }
   ```

5. Implement a complete model inference process.

   ```c++
   static napi_value RunDemo(napi_env env, napi_callback_info info) {
       LOGI("MS_LITE_LOG: Enter runDemo()");
       napi_value error_ret;
       napi_create_int32(env, -1, &error_ret);
       // Process the input data.
       size_t argc = 2;
       napi_value argv[2] = {nullptr};
       napi_get_cb_info(env, info, &argc, argv, nullptr, nullptr);
       bool isArray = false;
       napi_is_array(env, argv[0], &isArray);
       uint32_t length = 0;
       // Obtain the length of the array.
       napi_get_array_length(env, argv[0], &length);
       LOGI("MS_LITE_LOG: argv array length = %{public}d", length);
       std::vector<float> input_data;
       double param = 0;
       for (uint32_t i = 0; i < length; i++) {
           napi_value value;
           napi_get_element(env, argv[0], i, &value);
           napi_get_value_double(env, value, &param);
           input_data.push_back(static_cast<float>(param));
       }
       std::stringstream outstr;
       for (int i = 0; i < K_NUM_PRINT_OF_OUT_DATA; i++) {
           outstr << input_data[i] << " ";
       }
       LOGI("MS_LITE_LOG: input_data = %{public}s", outstr.str().c_str());
       // Read the model file.
       const std::string modelName = "mobilenetv2.ms";
       LOGI("MS_LITE_LOG: Run model: %{public}s", modelName.c_str());
       size_t modelSize;
       auto resourcesManager = OH_ResourceManager_InitNativeResourceManager(env, argv[1]);
       auto modelBuffer = ReadModelFile(resourcesManager, modelName, &modelSize);
       if (modelBuffer == nullptr) {
           LOGE("MS_LITE_ERR: Read model failed");
           return error_ret;
       }
       LOGI("MS_LITE_LOG: Read model file success");
       auto model = CreateMSLiteModel(modelBuffer, modelSize);
       if (model == nullptr) {
           LOGE("MS_LITE_ERR: MSLiteFwk Build model failed.\n");
           return error_ret;
       }
       int ret = RunMSLiteModel(model, input_data);
       if (ret != OH_AI_STATUS_SUCCESS) {
           OH_AI_ModelDestroy(&model);
           LOGE("MS_LITE_ERR: RunMSLiteModel failed.\n");
           return error_ret;
       }
       napi_value out_data;
       napi_create_array(env, &out_data);
       auto outputs = OH_AI_ModelGetOutputs(model);
       OH_AI_TensorHandle output_0 = outputs.handle_list[0];
       float *output0Data = reinterpret_cast<float *>(OH_AI_TensorGetMutableData(output_0));
       for (size_t i = 0; i < OH_AI_TensorGetElementNum(output_0); i++) {
           napi_value element;
           napi_create_double(env, static_cast<double>(output0Data[i]), &element);
           napi_set_element(env, out_data, i, element);
       }
       OH_AI_ModelDestroy(&model);
       LOGI("MS_LITE_LOG: Exit runDemo()");
       return out_data;
   }
   ```

6. Write the **CMake** script to link the MindSpore Lite dynamic library.

   ```cmake
   # Minimum version of CMake.
   cmake_minimum_required(VERSION 3.4.1)
   project(MindSporeLiteCDemo)

   set(NATIVERENDER_ROOT_PATH ${CMAKE_CURRENT_SOURCE_DIR})

   if(DEFINED PACKAGE_FIND_FILE)
       include(${PACKAGE_FIND_FILE})
   endif()

   include_directories(${NATIVERENDER_ROOT_PATH}
                       ${NATIVERENDER_ROOT_PATH}/include)

   add_library(entry SHARED mslite_napi.cpp)
   target_link_libraries(entry PUBLIC mindspore_lite_ndk)
   target_link_libraries(entry PUBLIC hilog_ndk.z)
   target_link_libraries(entry PUBLIC rawfile.z)
   target_link_libraries(entry PUBLIC ace_napi.z)
   ```

#### Use N-APIs to encapsulate the C++ dynamic library into an ArkTS module.

1. In **entry/src/main/cpp/types/libentry/Index.d.ts**, define the ArkTS API **runDemo()**. The content is as follows:

   ```ts
   export const runDemo: (a: number[], b: Object) => Array<number>;
   ```

2. In the **oh-package.json5** file, associate the API with the .so file to form a complete ArkTS module.

   ```json
   {
     "name": "libentry.so",
     "types": "./Index.d.ts",
     "version": "1.0.0",
     "description": "MindSpore Lite inference module"
   }
   ```

#### Invoke the encapsulated ArkTS module for inference and output the result.

In **entry/src/main/ets/pages/Index.ets**, call the encapsulated ArkTS module to process the inference result.

```ts
import msliteNapi from 'libentry.so';
import { resourceManager } from '@kit.LocalizationKit';

let resMgr: resourceManager.ResourceManager = getContext().getApplicationContext().resourceManager;
let max: number = 0;
let maxIndex: number = 0;
let maxArray: Array<number> = [];
let maxIndexArray: Array<number> = [];

// Call the runDemo function of C++. The buffer data of the input image is stored in float32View after preprocessing. For details, see Image Input and Preprocessing.
console.info('MS_LITE_LOG: *** Start MSLite Demo ***');
let output: Array<number> = msliteNapi.runDemo(Array.from(float32View), resMgr);
// Obtain the top 5 categories with the highest scores.
max = 0;
maxIndex = 0;
maxArray = [];
maxIndexArray = [];
let newArray = output.filter(value => value !== max);
for (let n = 0; n < 5; n++) {
  max = newArray[0];
  maxIndex = 0;
  for (let m = 0; m < newArray.length; m++) {
    if (newArray[m] > max) {
      max = newArray[m];
      maxIndex = m;
    }
  }
  maxArray.push(Math.round(max * 10000));
  maxIndexArray.push(maxIndex);
  // Call the array filter function to remove the selected maximum value.
  newArray = newArray.filter(value => value !== max);
}
console.info('MS_LITE_LOG: max:' + maxArray);
console.info('MS_LITE_LOG: maxIndex:' + maxIndexArray);
console.info('MS_LITE_LOG: *** Finished MSLite Demo ***');
```
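
Note that the filter-and-rescan loop above removes every element equal to the current maximum, so duplicate scores can be skipped, and the reported indices refer to the filtered array rather than the original output. A more direct top-k sketch (illustrative only; the `topK` function is not part of the sample) avoids both issues by sorting index-value pairs:

```typescript
// Rank all scores with their original indices and keep the top k.
// Scores are scaled by 10000 and rounded, as in the sample above.
function topK(scores: number[], k: number): { maxArray: number[]; maxIndexArray: number[] } {
  const ranked = scores
    .map((value, index) => ({ value, index }))
    .sort((a, b) => b.value - a.value)
    .slice(0, k);
  return {
    maxArray: ranked.map(r => Math.round(r.value * 10000)),
    maxIndexArray: ranked.map(r => r.index),
  };
}

const demo = topK([0.0001, 0.9497, 0.7756, 0.197, 0.0435, 0.0046], 5);
console.log(demo.maxArray);      // [9497, 7756, 1970, 435, 46]
console.log(demo.maxIndexArray); // [1, 2, 3, 4, 5]
```
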

### Debugging and Verification

1. On DevEco Studio, connect to the device, click **Run entry**, and build your own HAP.

   ```shell
   Launching com.samples.mindsporelitecdemo
   $ hdc shell aa force-stop com.samples.mindsporelitecdemo
   $ hdc shell mkdir data/local/tmp/xxx
   $ hdc file send C:\Users\xxx\MindSporeLiteCDemo\entry\build\default\outputs\default\entry-default-signed.hap "data/local/tmp/xxx"
   $ hdc shell bm install -p data/local/tmp/xxx
   $ hdc shell rm -rf data/local/tmp/xxx
   $ hdc shell aa start -a EntryAbility -b com.samples.mindsporelitecdemo
   ```

2. Touch the **photo** button on the device screen, select an image, and touch **OK**. The classification result of the selected image is displayed on the device screen. In the log output, filter by the keyword **MS_LITE**. The following information is displayed:

   ```verilog
   08-05 17:15:52.001   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: PhotoViewPicker.select successfully, photoSelectResult uri: {"photoUris":["file://media/Photo/13/IMG_1501955351_012/plant.jpg"]}
   ...
   08-05 17:15:52.627   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: crop info.width = 224
   08-05 17:15:52.627   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: crop info.height = 224
   08-05 17:15:52.628   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: Succeeded in reading image pixel data, buffer: 200704
   08-05 17:15:52.971   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: float32View data: 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143 1.4722440242767334 1.2385478019714355 1.308123230934143
   08-05 17:15:52.971   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: *** Start MSLite Demo ***
   08-05 17:15:53.454   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Build MSLite model success.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Run MSLite model Predict success.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: Get model outputs:
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: - Tensor 0 name is: Default/head-MobileNetV2Head/Sigmoid-op466.
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: - Tensor data is:
   08-05 17:15:53.753   4684-4684    A00000/[MSLiteNapi]            pid-4684              I     MS_LITE_LOG: 3.43385e-06 1.40285e-05 9.11969e-07 4.91007e-05 9.50266e-07 3.94537e-07 0.0434676 3.97196e-05 0.00054832 0.000246202 1.576e-05 3.6494e-06 1.23553e-05 0.196977 5.3028e-05 3.29346e-05 4.90475e-07 1.66109e-06 7.03273e-06 8.83677e-07 3.1365e-06
   08-05 17:15:53.781   4684-4684    A03d00/JSAPP                   pid-4684              W     MS_LITE_WARN: output length =  500 ;value =  0.0000034338463592575863,0.000014028532859811094,9.119685273617506e-7,0.000049100715841632336,9.502661555416125e-7,3.945370394831116e-7,0.04346757382154465,0.00003971960904891603,0.0005483203567564487,0.00024620210751891136,0.000015759984307806008,0.0000036493988773145247,0.00001235533181898063,0.1969769448041916,0.000053027983085485175,0.000032934600312728435,4.904751449430478e-7,0.0000016610861166554969,0.000007032729172351537,8.836767619868624e-7
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: max:9497,7756,1970,435,46
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: maxIndex:323,46,13,6,349
   08-05 17:15:53.831   4684-4684    A03d00/JSAPP                   pid-4684              I     MS_LITE_LOG: *** Finished MSLite Demo ***
   ```


### Effects

Touch the **photo** button on the device screen, select an image, and touch **OK**. The top 4 categories of the image are displayed below the image.

<img src="figures/stepc1.png"  width="20%"/>     <img src="figures/step2.png" width="20%"/>     <img src="figures/step3.png" width="20%"/>     <img src="figures/stepc4.png" width="20%"/>
