# Using MindSpore Lite for Image Classification (ArkTS)

## When to Use

You can use [@ohos.ai.mindSporeLite](../../reference/apis-mindspore-lite-kit/js-apis-mindSporeLite.md) to quickly deploy AI algorithms in your application and perform model inference for image classification.

Image classification recognizes the objects in an image and is widely used in medical image analysis, autonomous driving, e-commerce, and facial recognition.

## Basic Concepts

Before getting started, you need to understand the following basic concepts:

**Tensor**: a special data structure similar to an array or matrix. It is the basic data structure used in MindSpore Lite network operations.

**Float16 inference mode**: an inference mode in half-precision format, where a number is represented with 16 bits.

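For example, Float16 inference can be requested through the **precisionMode** option of the CPU context. The following is a minimal sketch; the inference code later in this topic keeps the default **enforce_fp32** instead:

```ts
import { mindSporeLite } from '@kit.MindSporeLiteKit';

// A minimal sketch: ask the CPU backend to prefer float16 kernels.
// The sample code later in this topic uses the default 'enforce_fp32'.
let context: mindSporeLite.Context = {};
context.target = ['cpu'];
context.cpu = {};
context.cpu.precisionMode = 'preferred_fp16';
```
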
## Available APIs

APIs involved in MindSpore Lite model inference are categorized into context APIs, model APIs, and tensor APIs. For details, see [@ohos.ai.mindSporeLite](../../reference/apis-mindspore-lite-kit/js-apis-mindSporeLite.md).

| API                                                          | Description                |
| ------------------------------------------------------------ | -------------------------- |
| loadModelFromFile(model: string, context?: Context): Promise&lt;Model&gt; | Loads a model from a file. |
| getInputs(): MSTensor[]                                      | Obtains the model input tensors. |
| predict(inputs: MSTensor[]): Promise&lt;MSTensor[]&gt;      | Performs model inference.  |
| getData(): ArrayBuffer                                       | Obtains tensor data.       |
| setData(inputArray: ArrayBuffer): void                       | Sets tensor data.          |

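The following sketch shows how these APIs fit together; the model path and input buffer are assumptions (an .ms file path and preprocessed float32 data), and the rest of this topic builds the same flow step by step:

```ts
import { mindSporeLite } from '@kit.MindSporeLiteKit';

// A minimal sketch of the API flow. modelPath and inputBuffer are
// assumptions: a path to an .ms file and preprocessed float32 input data.
async function quickPredict(modelPath: string, inputBuffer: ArrayBuffer): Promise<ArrayBuffer> {
  // Load the model from a file.
  let model: mindSporeLite.Model = await mindSporeLite.loadModelFromFile(modelPath);
  // Obtain the input tensors and fill the first one with data.
  let inputs: mindSporeLite.MSTensor[] = model.getInputs();
  inputs[0].setData(inputBuffer);
  // Run inference and return the raw data of the first output tensor.
  let outputs: mindSporeLite.MSTensor[] = await model.predict(inputs);
  return outputs[0].getData();
}
```
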
## Development Process

1. Select an image classification model.
2. Use MindSpore Lite to run inference on the device and classify the selected image.

## Environment Setup

Install DevEco Studio 4.1 or later, and update the SDK to API version 11 or later.

## How to Develop

The following uses inference on an image in the album as an example to describe how to use MindSpore Lite to implement image classification.

### Selecting a Model

This sample application uses [mobilenetv2.ms](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/1.5/mobilenetv2.ms) as the image classification model. The model file is available in the **entry/src/main/resources/rawfile** directory of the project.

If you have other pre-trained models for image classification, convert the original model into the .ms format by referring to [Using MindSpore Lite for Model Conversion](mindspore-lite-converter-guidelines.md).

### Writing Code

#### Image Input and Preprocessing

1. Call [@ohos.file.picker](../../reference/apis-core-file-kit/js-apis-file-picker.md) to select the desired image from the album.

   ```ts
   import { photoAccessHelper } from '@kit.MediaLibraryKit';
   import { BusinessError } from '@kit.BasicServicesKit';

   let uris: Array<string> = [];

   // Create an image picker instance.
   let photoSelectOptions = new photoAccessHelper.PhotoSelectOptions();

   // Set the media file type to IMAGE and the maximum number of media files that can be selected.
   photoSelectOptions.MIMEType = photoAccessHelper.PhotoViewMIMETypes.IMAGE_TYPE;
   photoSelectOptions.maxSelectNumber = 1;

   // Create an album picker instance and call select() to open the album page for file selection. After file selection is done, the result set is returned through photoSelectResult.
   let photoPicker = new photoAccessHelper.PhotoViewPicker();
   photoPicker.select(photoSelectOptions, async (
     err: BusinessError, photoSelectResult: photoAccessHelper.PhotoSelectResult) => {
     if (err) {
       console.error('MS_LITE_ERR: PhotoViewPicker.select failed with err: ' + JSON.stringify(err));
       return;
     }
     console.info('MS_LITE_LOG: PhotoViewPicker.select successfully, ' +
       'photoSelectResult uri: ' + JSON.stringify(photoSelectResult));
     uris = photoSelectResult.photoUris;
     console.info('MS_LITE_LOG: uri: ' + uris);
   })
   ```

2. Based on the model input size, call [@ohos.multimedia.image](../../reference/apis-image-kit/js-apis-image.md) and [@ohos.file.fs](../../reference/apis-core-file-kit/js-apis-file-fs.md) to crop the image, obtain the image buffer, and standardize the image.

   ```ts
   import { image } from '@kit.ImageKit';
   import { fileIo } from '@kit.CoreFileKit';

   let modelInputHeight: number = 224;
   let modelInputWidth: number = 224;

   // Based on the specified URI, call fileIo.openSync to open the file and obtain the FD.
   let file = fileIo.openSync(uris[0], fileIo.OpenMode.READ_ONLY);
   console.info('MS_LITE_LOG: file fd: ' + file.fd);

   // Based on the FD, call fileIo.readSync to read the data in the file.
   let inputBuffer = new ArrayBuffer(4096000);
   let readLen = fileIo.readSync(file.fd, inputBuffer);
   console.info('MS_LITE_LOG: readSync data to file succeed and inputBuffer size is:' + readLen);

   // Perform image preprocessing through PixelMap.
   let imageSource = image.createImageSource(file.fd);
   imageSource.createPixelMap().then((pixelMap) => {
     pixelMap.getImageInfo().then((info) => {
       console.info('MS_LITE_LOG: info.width = ' + info.size.width);
       console.info('MS_LITE_LOG: info.height = ' + info.size.height);
       // Scale the image to 256 x 256, center-crop it to the model input size, and obtain the image buffer readBuffer.
       pixelMap.scale(256.0 / info.size.width, 256.0 / info.size.height).then(() => {
         pixelMap.crop(
           { x: 16, y: 16, size: { height: modelInputHeight, width: modelInputWidth } }
         ).then(async () => {
           let info = await pixelMap.getImageInfo();
           console.info('MS_LITE_LOG: crop info.width = ' + info.size.width);
           console.info('MS_LITE_LOG: crop info.height = ' + info.size.height);
           // Set the size of readBuffer (RGBA: 4 bytes per pixel).
           let readBuffer = new ArrayBuffer(modelInputHeight * modelInputWidth * 4);
           await pixelMap.readPixelsToBuffer(readBuffer);
           console.info('MS_LITE_LOG: Succeeded in reading image pixel data, buffer: ' +
           readBuffer.byteLength);
           // Convert readBuffer to the float32 format, and standardize the image.
           const imageArr = new Uint8Array(
             readBuffer.slice(0, modelInputHeight * modelInputWidth * 4));
           console.info('MS_LITE_LOG: imageArr length: ' + imageArr.length);
           let means = [0.485, 0.456, 0.406];
           let stds = [0.229, 0.224, 0.225];
           let float32View = new Float32Array(modelInputHeight * modelInputWidth * 3);
           let index = 0;
           // Drop the alpha channel and standardize each RGBA pixel into three float32 values.
           for (let i = 0; i < imageArr.length; i++) {
             if ((i + 1) % 4 == 0) {
               float32View[index] = (imageArr[i - 3] / 255.0 - means[0]) / stds[0]; // R
               float32View[index + 1] = (imageArr[i - 2] / 255.0 - means[1]) / stds[1]; // G
               float32View[index + 2] = (imageArr[i - 1] / 255.0 - means[2]) / stds[2]; // B
               index += 3;
             }
           }
           console.info('MS_LITE_LOG: float32View length: ' + float32View.length);
           let printStr = 'float32View data:';
           for (let i = 0; i < 20; i++) {
             printStr += ' ' + float32View[i];
           }
           console.info('MS_LITE_LOG: ' + printStr);
         })
       })
     });
   });
   ```

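The packed **float32View** must match the model input exactly: 1 x 224 x 224 x 3 float32 values for **mobilenetv2.ms**. A small sanity check along the following lines can catch size mismatches early. This is a sketch that assumes the model has already been loaded as **msLiteModel**, as shown in the inference code below:

```ts
// A sketch: verify that the preprocessed buffer fits the model input.
// Assumes msLiteModel has been loaded as shown in the inference code below,
// and float32View holds the standardized image data from the preprocessing step.
let inputTensor = msLiteModel.getInputs()[0];
console.info('MS_LITE_LOG: input shape: ' + inputTensor.shape +
  ', elementNum: ' + inputTensor.elementNum);
if (float32View.buffer.byteLength !== inputTensor.dataSize) {
  console.error('MS_LITE_ERR: buffer size ' + float32View.buffer.byteLength +
    ' does not match model input dataSize ' + inputTensor.dataSize);
}
```
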
#### Writing Inference Code

1. If the capability set defined for the project does not contain MindSpore Lite, create a **syscap.json** file in the **entry/src/main** directory of the DevEco Studio project. The file content is as follows:

   ```json
   {
     "devices": {
       "general": [
         // The value must be the same as the value of deviceTypes in the module.json5 file.
         "default"
       ]
     },
     "development": {
       "addedSysCaps": [
         "SystemCapability.AI.MindSporeLite"
       ]
     }
   }
   ```

2. Call [@ohos.ai.mindSporeLite](../../reference/apis-mindspore-lite-kit/js-apis-mindSporeLite.md) to implement inference on the device. The operation process is as follows:

   1. Create a context, and set parameters such as the number of runtime threads and the device type.
   2. Load the model. In this example, the model is loaded from memory.
   3. Load data. Before executing the model, obtain the model inputs and fill the input tensors with data.
   4. Perform model inference through the **predict** API.

   ```ts
   // model.ets
   import { mindSporeLite } from '@kit.MindSporeLiteKit';

   export default async function modelPredict(
     modelBuffer: ArrayBuffer, inputsBuffer: ArrayBuffer[]): Promise<mindSporeLite.MSTensor[]> {

     // 1. Create a context, and set parameters such as the number of runtime threads and the device type.
     let context: mindSporeLite.Context = {};
     context.target = ['cpu'];
     context.cpu = {};
     context.cpu.threadNum = 2;
     context.cpu.threadAffinityMode = 1;
     context.cpu.precisionMode = 'enforce_fp32';

     // 2. Load the model from memory.
     let msLiteModel: mindSporeLite.Model = await mindSporeLite.loadModelFromBuffer(modelBuffer, context);

     // 3. Set the input data.
     let modelInputs: mindSporeLite.MSTensor[] = msLiteModel.getInputs();
     for (let i = 0; i < inputsBuffer.length; i++) {
       let inputBuffer = inputsBuffer[i];
       if (inputBuffer != null) {
         modelInputs[i].setData(inputBuffer as ArrayBuffer);
       }
     }

     // 4. Perform inference.
     console.info('=========MS_LITE_LOG: MS_LITE predict start=====');
     let modelOutputs: mindSporeLite.MSTensor[] = await msLiteModel.predict(modelInputs);
     return modelOutputs;
   }
   ```

#### Executing Inference

Load the model file, call the inference function on the preprocessed image data, and process the inference result.

```ts
import modelPredict from './model';
import { resourceManager } from '@kit.LocalizationKit';

let modelName: string = 'mobilenetv2.ms';
let max: number = 0;
let maxIndex: number = 0;
let maxArray: Array<number> = [];
let maxIndexArray: Array<number> = [];

// The buffer data of the input image is stored in float32View after preprocessing. For details, see Image Input and Preprocessing.
let inputs: ArrayBuffer[] = [float32View.buffer];
let resMgr: resourceManager.ResourceManager = getContext().getApplicationContext().resourceManager;
resMgr.getRawFileContent(modelName).then(modelBuffer => {
  // predict
  modelPredict(modelBuffer.buffer.slice(0), inputs).then(outputs => {
    console.info('=========MS_LITE_LOG: MS_LITE predict success=====');
    // Print the result.
    for (let i = 0; i < outputs.length; i++) {
      let out = new Float32Array(outputs[i].getData());
      let printStr = outputs[i].name + ':';
      for (let j = 0; j < out.length; j++) {
        printStr += out[j].toString() + ',';
      }
      console.info('MS_LITE_LOG: ' + printStr);
      // Obtain the top 5 categories by repeatedly taking the maximum and filtering it out.
      max = 0;
      maxIndex = 0;
      maxArray = [];
      maxIndexArray = [];
      let newArray = out.filter(value => value !== max);
      for (let n = 0; n < 5; n++) {
        max = newArray[0];
        maxIndex = 0;
        for (let m = 0; m < newArray.length; m++) {
          if (newArray[m] > max) {
            max = newArray[m];
            maxIndex = m;
          }
        }
        maxArray.push(Math.round(max * 10000));
        maxIndexArray.push(maxIndex);
        // Remove the current maximum so that the next iteration finds the next largest value.
        newArray = newArray.filter(value => value !== max);
      }
      console.info('MS_LITE_LOG: max:' + maxArray);
      console.info('MS_LITE_LOG: maxIndex:' + maxIndexArray);
    }
    console.info('=========MS_LITE_LOG END=========');
  })
})
```

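To show human-readable results, the indices collected in **maxIndexArray** can be mapped to class names. The following is a sketch that assumes a hypothetical **labels.txt** raw file (one class name per line, in model output order) added to **entry/src/main/resources/rawfile**; it is not part of this sample:

```ts
import { resourceManager } from '@kit.LocalizationKit';
import { util } from '@kit.ArkTS';

// A sketch: map top-k class indices to names. labels.txt is a hypothetical
// raw file (one class name per line) bundled with the application.
async function mapLabels(resMgr: resourceManager.ResourceManager,
  indices: Array<number>): Promise<Array<string>> {
  let raw: Uint8Array = await resMgr.getRawFileContent('labels.txt');
  let text: string = util.TextDecoder.create('utf-8').decodeWithStream(raw);
  let labels: string[] = text.split('\n');
  return indices.map((index: number) => labels[index]);
}
```
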
### Debugging and Verification

1. In DevEco Studio, connect to the device, and click **Run entry** to build and install your HAP.

   ```shell
   Launching com.samples.mindsporelitearktsdemo
   $ hdc shell aa force-stop com.samples.mindsporelitearktsdemo
   $ hdc shell mkdir data/local/tmp/xxx
   $ hdc file send C:\Users\xxx\MindSporeLiteArkTSDemo\entry\build\default\outputs\default\entry-default-signed.hap "data/local/tmp/xxx"
   $ hdc shell bm install -p data/local/tmp/xxx
   $ hdc shell rm -rf data/local/tmp/xxx
   $ hdc shell aa start -a EntryAbility -b com.samples.mindsporelitearktsdemo
   ```

2. Touch the **photo** button on the device screen, select an image, and touch **OK**. The classification result of the selected image is displayed on the device screen. Filter the logs by the keyword **MS_LITE**. The following information is displayed:

   ```verilog
   08-06 03:24:33.743   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: PhotoViewPicker.select successfully, photoSelectResult uri: {"photoUris":["file://media/Photo/13/IMG_1501955351_012/plant.jpg"]}
   08-06 03:24:33.795   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: readSync data to file succeed and inputBuffer size is:32824
   08-06 03:24:34.147   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: crop info.width = 224
   08-06 03:24:34.147   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: crop info.height = 224
   08-06 03:24:34.160   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: Succeeded in reading image pixel data, buffer: 200704
   08-06 03:24:34.970   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     =========MS_LITE_LOG: MS_LITE predict start=====
   08-06 03:24:35.432   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     =========MS_LITE_LOG: MS_LITE predict success=====
   08-06 03:24:35.447   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: Default/head-MobileNetV2Head/Sigmoid-op466:0.0000034338463592575863,0.000014028532859811094,9.119685273617506e-7,0.000049100715841632336,9.502661555416125e-7,3.945370394831116e-7,0.04346757382154465,0.00003971960904891603...
   08-06 03:24:35.499   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: max:9497,7756,1970,435,46
   08-06 03:24:35.499   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     MS_LITE_LOG: maxIndex:323,46,13,6,349
   08-06 03:24:35.499   22547-22547  A03d00/JSAPP                   com.sampl...liteark+  I     =========MS_LITE_LOG END=========
   ```

### Effects

Touch the **photo** button on the device screen, select an image, and touch **OK**. The top 4 categories of the image are displayed below the image.

<img src="figures/step1.png" width="20%"/>     <img src="figures/step2.png" width="20%"/>     <img src="figures/step3.png" width="20%"/>     <img src="figures/step4.png" width="20%"/>