.. _building_first_pipeline_with_palette:

==================
Build with Palette
==================

This step-by-step guide walks you through building the ResNet-50 pipeline using :ref:`Palette CLI`.

.. note::

   This article is mainly for the `standalone mode <../target_architectures.html#sima-devkit-as-a-standalone-device>`_, but it is also useful to go through for the `PCIe mode <../target_architectures.html#sima-mlsoc-as-pcie-card>`_, as the core principles of pipeline development remain the same. For PCIe mode, you will need to build additional software on the host side, using the PCIe driver and :ref:`APIs ` to interact with the board. Once you have completed this guide, you can refer to this :ref:`page ` to learn how to adapt this pipeline to run on a PCIe system.

This guide is compatible with both :ref:`MLSoc_Devkit` and :ref:`Modalix ` platforms.

.. dropdown:: Step 1. Install and Setup Palette
   :animate: fade-in
   :color: secondary
   :open:

   .. note::

      .. button-link:: https://docs.sima.ai/pkg_downloads/SDK1.7.0/sdk/1.7.0_Palette_SDK_master_B219.zip
         :color: primary
         :shadow:

         Download Palette

   You will need a modern Ubuntu 22.04+ or Windows 11 Pro machine to run Palette. For more information on system requirements and the installation procedure, refer to :ref:`Palette installation`.

   Make sure your DevKit runs v1.7 firmware or above; click :ref:`here ` to find out how to update the DevKit.

   .. include:: ../blocks/palette_volume_mapping.rst

.. dropdown:: Step 2. Create a New Project
   :animate: fade-in
   :color: secondary
   :open:

   First, download the tutorial package, which contains an optimized ResNet-50 model and an input source image, inside the Palette SDK container by following the instructions below. To learn how to optimize a standard ResNet-50 ONNX model, see :ref:`this guide `.

   **Download And Prepare Project For Modalix**

   .. code-block:: console

      sima-user@docker-image-id:~$ cd /home/docker/sima-cli/ && mkdir tutorial && cd tutorial
      sima-user@docker-image-id:~$ sima-cli install -v 1.7.0 samples/palette-tutorial

   **Download And Prepare Project For MLSoC**

   .. code-block:: console

      sima-user@docker-image-id:~$ cd /home/docker/sima-cli/ && mkdir tutorial && cd tutorial
      sima-user@docker-image-id:~$ sima-cli install -v 1.7.0 samples/palette-tutorial -t mlsoc

   .. note::

      This procedure only supports the BitMap image format. If your dataset is in JPEG or another format, convert it to BitMap before proceeding.

   The installation procedure above runs ``mpk project create --model-path resnet_50.tar.gz --input-resource golden_retriever_207_720p.rgb`` to prepare the project skeleton under the ``resnet_50_simaaisrc`` folder. For more information regarding the ``mpk project create`` command, refer to this `link <../../palette/mpk_tools.html#mpk-project-create>`_.

.. dropdown:: Step 3. Understand the Project Structure
   :animate: fade-in
   :color: secondary
   :open:

   The skeleton created in the previous step includes an ``application.json`` pipeline definition file and a number of plugin files. Understanding the structure of this file is crucial, as we need to modify it to meet our requirements. The ``application.json`` file defines the GStreamer inferencing pipeline and provides a gst-launch command. When the app is packaged and deployed, the SiMa MLA interprets this file to start the appropriate components accordingly.

   .. code:: console

      user@palette-container-id:/home/docker/sima-cli/1.7/pipeline/resnet_50_simaaisrc$ tree -L 2
      .
      ├── application.json
      ├── core
      │   ├── allocator
      │   ├── buffer-pool
      │   ├── caps
      │   ├── CMakeLists.txt
      │   ├── dispatcher-lite
      │   ├── metadata
      │   ├── simamm
      │   └── utils
      ├── dependencies
      │   └── gst_app
      └── plugins
          ├── processcvu
          ├── processmla
          └── simaaisrc

      14 directories, 2 files

.. dropdown:: Step 4. About the Data Input
   :animate: fade-in
   :color: secondary
   :open:

   To facilitate file operations, SiMa provides the :ref:`simaaisrc plugin `. This plugin operates similarly to the standard filesrc and multifilesrc plugins, but is optimized for the SiMa MLA. It serves as a source element that reads data from files and feeds it into a GStreamer pipeline. This functionality is particularly useful for testing and debugging, as it allows developers to simulate various input scenarios by sourcing data from files, thereby facilitating isolated testing of individual components within a pipeline.

   This plugin only supports RGB-format binary files, so if you want to pass a JPEG image to this plugin you must convert it to RGB format first. For your convenience, we provide example code for image conversion.

   .. dropdown:: View Sample Code for Resizing and Reformatting Images
      :animate: fade-in
      :color: info

      .. include:: ../blocks/prepare_images.rst

.. dropdown:: Step 5. Modify the Preprocessing Plugin
   :animate: fade-in
   :color: secondary
   :open:

   The ``preproc`` plugin is a preprocessing plugin that prepares the input data for the ResNet-50 model. It performs the following operations:

   - Resizes the input image from 1280x720 input resolution to 224x224 output resolution.
   - Converts the image from RGB to BGR format.
   - Normalizes the pixel values to a set mean and standard deviation.

   However, the default configuration normalizes the input with unit statistics (i.e., mean=0, std=1). For ResNet-50, we need to use the ImageNet statistics, which are:

   - Mean: [0.485, 0.456, 0.406]
   - Std: [0.229, 0.224, 0.225]

   To modify the preprocessing plugin, we need to edit the ``plugins/processcvu/cfg/0_preproc.json`` file and change the normalization parameters and input type.

   .. code:: javascript

      {
        ...
        "channel_mean": [
          0.485,
          0.456,
          0.406
        ],
        "channel_stddev": [
          0.229,
          0.224,
          0.225
        ],
        ...
      }
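
   To make the effect of these values concrete, the sketch below shows how ImageNet mean/std normalization is conventionally applied to a single RGB pixel. This is an illustration only, not the ``preproc`` plugin's actual implementation, and it assumes pixel values are first scaled to the [0, 1] range; the pixel value used here is hypothetical.

   .. code:: c++

      #include <array>
      #include <cstddef>
      #include <cstdint>
      #include <iostream>

      int main() {
        // ImageNet statistics, matching channel_mean / channel_stddev above
        constexpr std::array<float, 3> mean{0.485f, 0.456f, 0.406f};
        constexpr std::array<float, 3> stddev{0.229f, 0.224f, 0.225f};

        // Hypothetical RGB pixel value from the input image
        const std::array<std::uint8_t, 3> pixel{186, 140, 97};

        for (std::size_t c = 0; c < 3; ++c) {
          float scaled = pixel[c] / 255.0f;                   // [0, 255] -> [0, 1]
          float normalized = (scaled - mean[c]) / stddev[c];  // subtract mean, divide by std
          std::cout << "channel " << c << ": " << normalized << std::endl;
        }
        return 0;
      }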

.. dropdown:: Step 6. Add a Custom Postprocessing Plugin
   :animate: fade-in
   :color: secondary
   :open:

   By default, when the project skeleton was created, the pipeline terminates at ``fakesink``, which means the output goes nowhere. To check the inference output, we need to write a postprocessing plugin and place it into the pipeline. Let's take a shortcut and bring an example plugin from the SDK, ``/usr/local/simaai/app_zoo/GStreamer/GenericAggregatorDemo/plugins/simple_overlay``, into the project. We also need to copy the template headers from the SDK into the project. The template header files allow users to create custom plugins and add their own configuration JSON files that can be read by the plugin.

   .. code:: console

      user@palette-container-id:/home/docker/sima-cli/tutorial/resnet_50_simaaisrc$ cp -R /usr/local/simaai/app_zoo/Gstreamer/GenericAggregatorDemo/plugins/simple_overlay/ plugins
      user@palette-container-id:/home/docker/sima-cli/tutorial/resnet_50_simaaisrc$ cp -R /usr/local/simaai/plugin_zoo/gst-simaai-plugins-base/gst/templates/aggregator plugins/simple_overlay/
      user@palette-container-id:/home/docker/sima-cli/tutorial/resnet_50_simaaisrc$ sed -i 's|\.\./\.\./\.\./\.\./core|\.\./\.\./core|g' plugins/simple_overlay/CMakeLists.txt

   Then, let's add ``simple_overlay`` to the hidden file ``.project/pluginsInfo.json`` so it will be included in the package compilation.

   .. code:: console

      user@palette-container-id:/home/docker/sima-cli/tutorial/resnet_50_simaaisrc$ jq '.pluginsInfo += [{"gid":"simple_overlay","path":"plugins/simple_overlay"}]' .project/pluginsInfo.json > tmp.json && mv tmp.json .project/pluginsInfo.json

   We also need to modify the plugin configuration file ``plugins/simple_overlay/cfg/overlay.json`` to include the proper caps negotiation and buffer definitions. Use the configuration below for the overlay plugin.

   .. code:: javascript

      {
        "version": 0.1,
        "node_name": "overlay",
        "memory": {
          "cpu": 0,
          "next_cpu": 0
        },
        "system": {
          "out_buf_queue": 1,
          "debug": 0,
          "dump_data": 0
        },
        "buffers": {
          "input": [
            {
              "name": "simaai_detesdequant_1",
              "size": 4000
            }
          ],
          "output": {
            "size": 211527
          }
        },
        "caps": {
          "sink_pads": [
            {
              "media_type": "application/vnd.simaai.tensor",
              "params": [
                {
                  "name": "format",
                  "type": "string",
                  "values": "DETESSDEQUANT",
                  "json_field": null
                }
              ]
            }
          ],
          "src_pads": [
            {
              "media_type": "image/png",
              "params": [
                {
                  "name": "width",
                  "type": "int",
                  "values": "1 - 4096",
                  "json_field": null
                },
                {
                  "name": "height",
                  "type": "int",
                  "values": "1 - 4096",
                  "json_field": null
                }
              ]
            }
          ]
        }
      }

.. dropdown:: Step 7. Modify the ``gst`` String
   :animate: fade-in
   :color: secondary
   :open:

   The ``gst`` string in the ``application.json`` file defines the GStreamer launch command. We need to modify this string to include the new postprocessing plugin.

   Original string:

   .. code:: console

      "gst": "simaaisrc location=/data/simaai/applications/resnet_50_simaaisrc/etc/golden_retriever_207_720p.rgb node-name=decoder delay=1000 mem-target=0 index=0 loop=false ! 'video/x-raw, format=(string)RGB, width=(int)1280, height=(int)720' ! simaaiprocesscvu name=simaaiprocesspreproc_1 ! simaaiprocessmla name=simaaiprocessmla_1 ! simaaiprocesscvu name=simaaiprocessdetess_dequant_1 ! fakesink"

   New string:

   .. code:: console

      "gst": "simaaisrc location=/data/simaai/applications/resnet_50_simaaisrc/etc/golden_retriever_207_720p.rgb node-name=decoder delay=1000 mem-target=0 index=0 loop=false ! 'video/x-raw, format=(string)RGB, width=(int)1280, height=(int)720' ! simaaiprocesscvu name=simaaiprocesspreproc_1 ! simaaiprocessmla name=simaaiprocessmla_1 ! simaaiprocesscvu name=simaaiprocessdetess_dequant_1 ! 'application/vnd.simaai.tensor, format=(string)DETESSDEQUANT' ! simple_overlay config=/data/simaai/applications/resnet_50_simaaisrc/etc/overlay.json"

   The changes in the new string are:

   - The ``fakesink`` terminator is replaced with a caps filter, ``'application/vnd.simaai.tensor, format=(string)DETESSDEQUANT'``, matching the sink caps declared in ``overlay.json``.
   - The ``simple_overlay`` plugin is added at the end of the pipeline, with its ``config`` parameter pointing to the deployed ``overlay.json``.

.. dropdown:: Step 8. Add Custom Plugin And Configuration To application.json
   :animate: fade-in
   :color: secondary
   :open:

   In order for the compilation process to include the configuration file for the custom plugin, it must also be declared explicitly in the ``application.json`` file.
   Add the ``simple_overlay`` block in the ``plugins`` section of ``application.json`` as below.

   .. code:: javascript

      {
        "name": "simple_overlay_1",
        "pluginName": "simple_overlay",
        "gstreamerPluginName": "simple_overlay",
        "pluginGid": "simple_overlay",
        "pluginId": "simple_overlay_1",
        "sequence": 5,
        "resources": {
          "configs": [
            "cfg/overlay.json"
          ]
        }
      }

.. dropdown:: Step 9. Understand ResNet-50 Output
   :animate: fade-in
   :color: secondary
   :open:

   ``ResNet-50`` is typically trained on ImageNet (1000 classes), and its output is a 1D tensor (vector) of size 1000, representing the classification probabilities for each class.

   **Output Shape**

   .. code:: console

      (1, 1000)  # (Batch Size, Number of Classes)

   Each value in this vector represents the probability score for a class. For example:

   .. code:: console

      [0.001, 0.003, ..., 0.92, ..., 0.0005]  # 1000 values

   - The index with the highest value is the predicted class.
   - Example: If index 340 = 0.92, then the image is classified as zebra.

   The plugin ``detesdequant`` is configured to output ``NHWC`` format, which stands for:

   - N → Batch size (number of images processed together)
   - H → Height (number of rows in the image)
   - W → Width (number of columns in the image)
   - C → Channels (number of color channels, e.g., 3 for RGB or 1 for grayscale)

.. dropdown:: Step 10. Add Custom Logic to Process ResNet-50 Output
   :animate: fade-in
   :color: secondary
   :open:

   Once you understand the output of the ``detesdequant`` plugin, you can write a custom plugin to process the output. Use the following code to replace the ``UserContext::run`` function in ``simple_overlay/payload.cpp``.

   .. code:: c++

      int argmax(float *probabilities, std::size_t size) {
        if (size == 0)
          return -1;

        int max_idx = 0;
        float max_value = probabilities[0];
        for (std::size_t i = 1; i < size; i++) {
          if (probabilities[i] > max_value) {
            max_value = probabilities[i];
            max_idx = i;
          }
        }
        return max_idx;
      }

      void UserContext::run(std::vector<Input> &input, std::span<uint8_t> output) {
        // Extract the input probabilities
        float *probabilities = reinterpret_cast<float *>(input[0].getData().data());
        std::size_t probabilities_size = input[0].getDataSize();

        // Ensure valid input data
        if (probabilities == nullptr || probabilities_size == 0) {
          std::cerr << "Error: Invalid input data." << std::endl;
          return;
        }

        // Convert byte size to element count
        std::size_t num_elements = probabilities_size / sizeof(float);
        if (num_elements == 0) {
          std::cerr << "Error: No valid data in input." << std::endl;
          return;
        }

        // Perform the argmax operation
        int max_idx = argmax(probabilities, num_elements);

        // Print the class index and its probability
        std::cout << "Predicted Class Index: " << max_idx << std::endl;
        std::cout << "Probability (%): " << 100 * probabilities[max_idx] << std::endl;
      }

   The C++ code above processes the output of the ``detesdequant`` plugin and determines the most probable class based on its probability distribution.

   **argmax Function**

   - Iterates through an array of floating-point probabilities.
   - Finds the index of the highest probability value (i.e., the most confident prediction).
   - Returns the index of the class with the highest probability.

   **UserContext::run Function**

   - Extracts the probability tensor from the input.
   - Validates the input to ensure it is not empty.
   - Computes the most probable class using ``argmax()``.
   - Prints the predicted class index and its probability.
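
   If you want to sanity-check the ``argmax`` logic before rebuilding the whole pipeline, you can compile a small standalone program on your host machine. This is an optional illustration, not part of the project skeleton; it reuses the same function with a toy probability vector.

   .. code:: c++

      #include <cstddef>
      #include <iostream>

      // Same argmax logic as in payload.cpp above
      int argmax(float *probabilities, std::size_t size) {
        if (size == 0)
          return -1;
        int max_idx = 0;
        float max_value = probabilities[0];
        for (std::size_t i = 1; i < size; i++) {
          if (probabilities[i] > max_value) {
            max_value = probabilities[i];
            max_idx = i;
          }
        }
        return max_idx;
      }

      int main() {
        // Toy probability vector: index 2 holds the highest value
        float probs[] = {0.05f, 0.10f, 0.80f, 0.05f};
        std::cout << "Predicted index: " << argmax(probs, 4) << std::endl;  // prints 2
        return 0;
      }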

.. dropdown:: Step 11. Compile and Deploy the Project
   :animate: fade-in
   :color: secondary
   :open:

   .. include:: ../blocks/palette_create_connect_deploy.rst

   To connect to a PCIe device, use ``mpk device connect -s 0``, where ``0`` stands for the slot number where the PCIe card is installed.

.. dropdown:: Step 12. Check the Runtime Output
   :animate: fade-in
   :color: secondary
   :open:

   .. note::

      Connect to the DevKit console using SSH. For PCIe mode, connect through the virtual ethernet interface as described `here <../setup_pcie_mode.html#virtual-network>`_.

   When the pipeline is running, you can use the following command to check the running gst process's full command line.

   .. code:: console

      davinci:/var/log$ cat /proc/$(pgrep -f gst_app)/cmdline | tr '\0' ' '

   If you want to check the process console log, you can tail ``/tmp/simaai/resnet_50_simaaisrc_Pipeline/resnet_50_simaaisrc_Pipeline.1/gst_app.log``:

   .. code:: console

      davinci:/var/log$ tail -f /tmp/simaai/resnet_50_simaaisrc_Pipeline/resnet_50_simaaisrc_Pipeline.1/gst_app.log
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815
      Predicted Class Index: 207
      Probability (%): 98.815

   Here, the predicted class is 207, which corresponds to the golden retriever class in the ImageNet dataset. The probability of this prediction is 98.815%, indicating a high confidence level in the classification.

   To stop the running pipeline, you can either kill the ``gst_app`` process from the DevKit shell, or run `mpk list <../../palette/mpk_tools.html#mpk-list>`_ to find the running process and then `mpk kill --pid {pid} <../../palette/mpk_tools.html#mpk-kill>`_ to kill it from within Palette.
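
   If you would rather see a class name than a numeric index, you can map the index against an ImageNet label list on your host machine. The sketch below is an illustration only; ``imagenet_classes.txt`` is a hypothetical file (one label per line, in class-index order) and is not shipped with the tutorial package.

   .. code:: c++

      #include <cstddef>
      #include <fstream>
      #include <iostream>
      #include <string>
      #include <vector>

      int main() {
        // Hypothetical label file: one ImageNet class name per line, in index order
        std::ifstream labels_file("imagenet_classes.txt");
        if (!labels_file) {
          std::cerr << "Could not open label file." << std::endl;
          return 1;
        }

        std::vector<std::string> labels;
        std::string line;
        while (std::getline(labels_file, line)) {
          labels.push_back(line);
        }

        // Index printed by the overlay plugin in the log above
        const std::size_t predicted_index = 207;
        if (predicted_index < labels.size()) {
          std::cout << "Class " << predicted_index << " = " << labels[predicted_index] << std::endl;
        }
        return 0;
      }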