# Using the MindSpore Lite Engine for On-Device Training (C/C++)

## When to Use

MindSpore Lite is an AI engine that implements AI model inference for different hardware devices. It has been used in a wide range of fields, such as image classification, target recognition, facial recognition, and character recognition. In addition, MindSpore Lite supports on-device model training, making it possible to adapt a model to user behavior in actual service scenarios.

This topic describes the general development process for using MindSpore Lite to train models on devices.


## Available APIs
The following table lists some APIs for using MindSpore Lite for model training.

| API       | Description       |
| ------------------ | ----------------- |
|OH_AI_ContextHandle OH_AI_ContextCreate()|Creates a context object.|
|OH_AI_DeviceInfoHandle OH_AI_DeviceInfoCreate(OH_AI_DeviceType device_type)|Creates a runtime device information object.|
|void OH_AI_ContextDestroy(OH_AI_ContextHandle *context)|Destroys a context object.|
|void OH_AI_ContextAddDeviceInfo(OH_AI_ContextHandle context, OH_AI_DeviceInfoHandle device_info)|Adds a runtime device information object.|
|OH_AI_TrainCfgHandle OH_AI_TrainCfgCreate()|Creates the pointer to a training configuration object.|
|void OH_AI_TrainCfgDestroy(OH_AI_TrainCfgHandle *train_cfg)|Destroys the pointer to a training configuration object.|
|OH_AI_ModelHandle OH_AI_ModelCreate()|Creates a model object.|
|OH_AI_Status OH_AI_TrainModelBuildFromFile(OH_AI_ModelHandle model, const char *model_path, OH_AI_ModelType model_type, const OH_AI_ContextHandle model_context, const OH_AI_TrainCfgHandle train_cfg)|Loads and builds a MindSpore training model from a model file.|
|OH_AI_Status OH_AI_RunStep(OH_AI_ModelHandle model, const OH_AI_KernelCallBack before, const OH_AI_KernelCallBack after)|Runs a single training step.|
|OH_AI_Status OH_AI_ModelSetTrainMode(OH_AI_ModelHandle model, bool train)|Sets the training mode.|
|OH_AI_Status OH_AI_ExportModel(OH_AI_ModelHandle model, OH_AI_ModelType model_type, const char *model_file, OH_AI_QuantizationType quantization_type, bool export_inference_only, char **output_tensor_name, size_t num)|Exports a trained MS model.|
|void OH_AI_ModelDestroy(OH_AI_ModelHandle *model)|Destroys a model object.|


## How to Develop
The following figure shows the development process for MindSpore Lite model training.

**Figure 1** Development process for MindSpore Lite model training
![how-to-use-train](figures/train_sequence_unify_api.png)

Before starting development, include the required header files and implement a function that generates random input data. The sample code is as follows:

```c
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include "mindspore/model.h"

int GenerateInputDataWithRandom(OH_AI_TensorHandleArray inputs) {
  for (size_t i = 0; i < inputs.handle_num; ++i) {
    float *input_data = (float *)OH_AI_TensorGetMutableData(inputs.handle_list[i]);
    if (input_data == NULL) {
      printf("OH_AI_TensorGetMutableData failed.\n");
      return OH_AI_STATUS_LITE_ERROR;
    }
    int64_t num = OH_AI_TensorGetElementNum(inputs.handle_list[i]);
    const int divisor = 10;
    for (int64_t j = 0; j < num; j++) {
      input_data[j] = (float)(rand() % divisor) / divisor;  // random values in [0.0, 0.9]
    }
  }
  return OH_AI_STATUS_SUCCESS;
}
```

The development process consists of the following main steps:

1. Prepare the required model.

    The prepared model is in `.ms` format. This topic uses [lenet_train.ms](https://gitee.com/openharmony-sig/compatibility/blob/master/test_suite/resource/master/standard%20system/acts/resource/ai/mindspore/lenet_train/lenet_train.ms) as an example. To use a custom model, perform the following steps:

    - Use Python to create a network model based on the MindSpore architecture and export the model as a `.mindir` file. For details, see [Quick Start](https://www.mindspore.cn/tutorials/en/r2.1/beginner/quick_start.html).
    - Convert the `.mindir` model file into an `.ms` file. For details about the conversion procedure, see [Converting MindSpore Lite Models](https://www.mindspore.cn/lite/docs/en/r2.1/use/converter_train.html). The `.ms` file can be imported to the device to implement training based on the MindSpore device framework.

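    The conversion in the second bullet uses the `converter_lite` tool from the MindSpore Lite release package; a command-line sketch (the input and output file names are assumptions):

    ```shell
    # --trainModel=true marks the output as a trainable model
    ./converter_lite --fmk=MINDIR --trainModel=true --modelFile=lenet_train.mindir --outputFile=lenet_train
    ```

    This produces `lenet_train.ms`, which can then be deployed to the device.
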
2. Create a context and set parameters such as the device type and training configuration.

    ```c
    // Create and init context, add CPU device info
    OH_AI_ContextHandle context = OH_AI_ContextCreate();
    if (context == NULL) {
        printf("OH_AI_ContextCreate failed.\n");
        return OH_AI_STATUS_LITE_ERROR;
    }

    OH_AI_DeviceInfoHandle cpu_device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
    if (cpu_device_info == NULL) {
        printf("OH_AI_DeviceInfoCreate failed.\n");
        OH_AI_ContextDestroy(&context);
        return OH_AI_STATUS_LITE_ERROR;
    }
    OH_AI_ContextAddDeviceInfo(context, cpu_device_info);

    // Create trainCfg
    OH_AI_TrainCfgHandle trainCfg = OH_AI_TrainCfgCreate();
    if (trainCfg == NULL) {
        printf("OH_AI_TrainCfgCreate failed.\n");
        OH_AI_ContextDestroy(&context);
        return OH_AI_STATUS_LITE_ERROR;
    }
    ```

3. Create, load, and build the model.

    Call **OH_AI_TrainModelBuildFromFile** to load and build the model.

    ```c
    // Create model
    OH_AI_ModelHandle model = OH_AI_ModelCreate();
    if (model == NULL) {
        printf("OH_AI_ModelCreate failed.\n");
        OH_AI_TrainCfgDestroy(&trainCfg);
        OH_AI_ContextDestroy(&context);
        return OH_AI_STATUS_LITE_ERROR;
    }

    // Build model
    int ret = OH_AI_TrainModelBuildFromFile(model, model_file, OH_AI_MODELTYPE_MINDIR, context, trainCfg);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_TrainModelBuildFromFile failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }
    ```

4. Input data.

    Before running model training, you need to fill the input tensors with data. In this example, the model inputs are populated with random data.

    ```c
    // Get Inputs
    OH_AI_TensorHandleArray inputs = OH_AI_ModelGetInputs(model);
    if (inputs.handle_list == NULL) {
        printf("OH_AI_ModelGetInputs failed.\n");
        OH_AI_ModelDestroy(&model);
        return OH_AI_STATUS_LITE_ERROR;
    }

    // Generate random data as input data.
    ret = GenerateInputDataWithRandom(inputs);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("GenerateInputDataWithRandom failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }
    ```

5. Execute model training.

    Use **OH_AI_ModelSetTrainMode** to set the training mode and use **OH_AI_RunStep** to run model training.

    ```c
    // Set Train Mode
    ret = OH_AI_ModelSetTrainMode(model, true);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_ModelSetTrainMode failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }

    // Model Train Step
    ret = OH_AI_RunStep(model, NULL, NULL);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_RunStep failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }
    printf("Train Step Success.\n");
    ```

6. Export the trained model.

    Use **OH_AI_ExportModel** to export the trained model.

    ```c
    // Export Train Model
    ret = OH_AI_ExportModel(model, OH_AI_MODELTYPE_MINDIR, export_train_model, OH_AI_NO_QUANT, false, NULL, 0);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_ExportModel train failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }
    printf("Export Train Model Success.\n");

    // Export Inference Model
    ret = OH_AI_ExportModel(model, OH_AI_MODELTYPE_MINDIR, export_infer_model, OH_AI_NO_QUANT, true, NULL, 0);
    if (ret != OH_AI_STATUS_SUCCESS) {
        printf("OH_AI_ExportModel inference failed, ret: %d.\n", ret);
        OH_AI_ModelDestroy(&model);
        return ret;
    }
    printf("Export Inference Model Success.\n");
    ```

7. Destroy the model.

    If MindSpore Lite is no longer needed, destroy the created model to release its resources.

    ```c
    // Delete model.
    OH_AI_ModelDestroy(&model);
    ```

## Verification

1. Write **CMakeLists.txt**.
    ```cmake
    cmake_minimum_required(VERSION 3.14)
    project(TrainDemo)

    add_executable(train_demo main.c)

    target_link_libraries(
            train_demo
            mindspore_lite_ndk
    )
    ```

   - To use ohos-sdk for cross compilation, you need to set the native toolchain path for the CMake tool as follows: `-DCMAKE_TOOLCHAIN_FILE="/xxx/native/build/cmake/ohos.toolchain.cmake"`.

   - Start cross compilation. When running the compilation command, set **OHOS_NDK** to the native toolchain path.
      ```shell
        mkdir -p build

        cd ./build || exit
        OHOS_NDK=""
        cmake -G "Unix Makefiles" \
              -S ../ \
              -DCMAKE_TOOLCHAIN_FILE="$OHOS_NDK/build/cmake/ohos.toolchain.cmake" \
              -DOHOS_ARCH=arm64-v8a \
              -DCMAKE_BUILD_TYPE=Release

        make
      ```

2. Run the compiled executable program.

    - Use hdc to connect to the device and push **train_demo** and **lenet_train.ms** to the same directory on the device.
    - Use hdc shell to access the device, go to the directory where **train_demo** is located, and run the following command:

    ```shell
    ./train_demo ./lenet_train.ms export_train_model export_infer_model
    ```
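
    The two steps above can be sketched end to end with hdc commands (the device directory `/data/local/tmp` is an assumption):

    ```shell
    # Push the executable and the model to the device
    hdc file send ./train_demo /data/local/tmp/
    hdc file send ./lenet_train.ms /data/local/tmp/
    # Enter the device shell and run the demo
    hdc shell
    cd /data/local/tmp
    chmod +x train_demo
    ./train_demo ./lenet_train.ms export_train_model export_infer_model
    ```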

    The operation is successful if the output is similar to the following:

    ```shell
    Train Step Success.
    Export Train Model Success.
    Export Inference Model Success.
    Tensor name: Default/network-WithLossCell/_backbone-LeNet5/fc3-Dense/BiasAdd-op121, tensor size is 80, elements num: 20.
    output data is:
    0.000265 0.000231 0.000254 0.000269 0.000238 0.000228
    ```

    In the directory where **train_demo** is located, you can view the exported model files **export_train_model.ms** and **export_infer_model.ms**.


## Sample

```c
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include "mindspore/model.h"

int GenerateInputDataWithRandom(OH_AI_TensorHandleArray inputs) {
  for (size_t i = 0; i < inputs.handle_num; ++i) {
    float *input_data = (float *)OH_AI_TensorGetMutableData(inputs.handle_list[i]);
    if (input_data == NULL) {
      printf("OH_AI_TensorGetMutableData failed.\n");
      return OH_AI_STATUS_LITE_ERROR;
    }
    int64_t num = OH_AI_TensorGetElementNum(inputs.handle_list[i]);
    const int divisor = 10;
    for (int64_t j = 0; j < num; j++) {
      input_data[j] = (float)(rand() % divisor) / divisor;  // random values in [0.0, 0.9]
    }
  }
  return OH_AI_STATUS_SUCCESS;
}

int ModelPredict(const char *model_file) {
  // Create and init context, add CPU device info
  OH_AI_ContextHandle context = OH_AI_ContextCreate();
  if (context == NULL) {
    printf("OH_AI_ContextCreate failed.\n");
    return OH_AI_STATUS_LITE_ERROR;
  }

  OH_AI_DeviceInfoHandle cpu_device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
  if (cpu_device_info == NULL) {
    printf("OH_AI_DeviceInfoCreate failed.\n");
    OH_AI_ContextDestroy(&context);
    return OH_AI_STATUS_LITE_ERROR;
  }
  OH_AI_ContextAddDeviceInfo(context, cpu_device_info);

  // Create model
  OH_AI_ModelHandle model = OH_AI_ModelCreate();
  if (model == NULL) {
    printf("OH_AI_ModelCreate failed.\n");
    OH_AI_ContextDestroy(&context);
    return OH_AI_STATUS_LITE_ERROR;
  }

  // Build model
  int ret = OH_AI_ModelBuildFromFile(model, model_file, OH_AI_MODELTYPE_MINDIR, context);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_ModelBuildFromFile failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Get Inputs
  OH_AI_TensorHandleArray inputs = OH_AI_ModelGetInputs(model);
  if (inputs.handle_list == NULL) {
    printf("OH_AI_ModelGetInputs failed.\n");
    OH_AI_ModelDestroy(&model);
    return OH_AI_STATUS_LITE_ERROR;
  }

  // Generate random data as input data.
  ret = GenerateInputDataWithRandom(inputs);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("GenerateInputDataWithRandom failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Model Predict
  OH_AI_TensorHandleArray outputs;
  ret = OH_AI_ModelPredict(model, inputs, &outputs, NULL, NULL);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_ModelPredict failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Print Output Tensor Data.
  for (size_t i = 0; i < outputs.handle_num; ++i) {
    OH_AI_TensorHandle tensor = outputs.handle_list[i];
    int64_t element_num = OH_AI_TensorGetElementNum(tensor);
    printf("Tensor name: %s, tensor size is %zu, elements num: %lld.\n", OH_AI_TensorGetName(tensor),
           OH_AI_TensorGetDataSize(tensor), (long long)element_num);
    const float *data = (const float *)OH_AI_TensorGetData(tensor);
    printf("output data is:\n");
    const int max_print_num = 50;
    for (int64_t j = 0; j < element_num && j <= max_print_num; ++j) {
      printf("%f ", data[j]);
    }
    printf("\n");
  }

  OH_AI_ModelDestroy(&model);
  return OH_AI_STATUS_SUCCESS;
}

int TrainDemo(int argc, const char **argv) {
  if (argc < 4) {
    printf("Model file must be provided.\n");
    printf("Export Train Model path must be provided.\n");
    printf("Export Inference Model path must be provided.\n");
    return OH_AI_STATUS_LITE_ERROR;
  }
  const char *model_file = argv[1];
  const char *export_train_model = argv[2];
  const char *export_infer_model = argv[3];

  // Create and init context, add CPU device info
  OH_AI_ContextHandle context = OH_AI_ContextCreate();
  if (context == NULL) {
    printf("OH_AI_ContextCreate failed.\n");
    return OH_AI_STATUS_LITE_ERROR;
  }

  OH_AI_DeviceInfoHandle cpu_device_info = OH_AI_DeviceInfoCreate(OH_AI_DEVICETYPE_CPU);
  if (cpu_device_info == NULL) {
    printf("OH_AI_DeviceInfoCreate failed.\n");
    OH_AI_ContextDestroy(&context);
    return OH_AI_STATUS_LITE_ERROR;
  }
  OH_AI_ContextAddDeviceInfo(context, cpu_device_info);

  // Create trainCfg
  OH_AI_TrainCfgHandle trainCfg = OH_AI_TrainCfgCreate();
  if (trainCfg == NULL) {
    printf("OH_AI_TrainCfgCreate failed.\n");
    OH_AI_ContextDestroy(&context);
    return OH_AI_STATUS_LITE_ERROR;
  }

  // Create model
  OH_AI_ModelHandle model = OH_AI_ModelCreate();
  if (model == NULL) {
    printf("OH_AI_ModelCreate failed.\n");
    OH_AI_TrainCfgDestroy(&trainCfg);
    OH_AI_ContextDestroy(&context);
    return OH_AI_STATUS_LITE_ERROR;
  }

  // Build model
  int ret = OH_AI_TrainModelBuildFromFile(model, model_file, OH_AI_MODELTYPE_MINDIR, context, trainCfg);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_TrainModelBuildFromFile failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Get Inputs
  OH_AI_TensorHandleArray inputs = OH_AI_ModelGetInputs(model);
  if (inputs.handle_list == NULL) {
    printf("OH_AI_ModelGetInputs failed.\n");
    OH_AI_ModelDestroy(&model);
    return OH_AI_STATUS_LITE_ERROR;
  }

  // Generate random data as input data.
  ret = GenerateInputDataWithRandom(inputs);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("GenerateInputDataWithRandom failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Set Train Mode
  ret = OH_AI_ModelSetTrainMode(model, true);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_ModelSetTrainMode failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }

  // Model Train Step
  ret = OH_AI_RunStep(model, NULL, NULL);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_RunStep failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }
  printf("Train Step Success.\n");

  // Export Train Model
  ret = OH_AI_ExportModel(model, OH_AI_MODELTYPE_MINDIR, export_train_model, OH_AI_NO_QUANT, false, NULL, 0);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_ExportModel train failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }
  printf("Export Train Model Success.\n");

  // Export Inference Model
  ret = OH_AI_ExportModel(model, OH_AI_MODELTYPE_MINDIR, export_infer_model, OH_AI_NO_QUANT, true, NULL, 0);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("OH_AI_ExportModel inference failed, ret: %d.\n", ret);
    OH_AI_ModelDestroy(&model);
    return ret;
  }
  printf("Export Inference Model Success.\n");

  // Delete model.
  OH_AI_ModelDestroy(&model);

  // Use the exported model to predict. The inference model is written to
  // "<export_infer_model>.ms", so append the ".ms" suffix before loading.
  // (Do not strcat onto argv[] storage; build the path in a local buffer.)
  char infer_model_path[512];
  snprintf(infer_model_path, sizeof(infer_model_path), "%s.ms", export_infer_model);
  ret = ModelPredict(infer_model_path);
  if (ret != OH_AI_STATUS_SUCCESS) {
    printf("Exported Model to predict failed, ret: %d.\n", ret);
    return ret;
  }
  return OH_AI_STATUS_SUCCESS;
}

int main(int argc, const char **argv) { return TrainDemo(argc, argv); }

```
