.. _hello_sima_run_demos:

Run Demos
---------

Before building the first pipeline, try one of the preconfigured demos. SiMa provides three distinct demo experiences:

- :ref:`Edgematic` is a state-of-the-art, web-based development platform for Edge AI applications. The developer can quickly explore the demo by dragging and dropping a prebuilt pipeline with just a few clicks.
- Run a multi-channel Vision AI demo pipeline on the DevKit using :ref:`OptiView `.
- Run a Large Language Model demo on the Modalix platform using :ref:`sima_cli` and experience cutting-edge multimodal AI on SiMa's high-performance, power-efficient hardware.

Follow the instructions below to explore the demos. This is also a great way to verify that the developer's environment is set up correctly.

.. tabs::

   .. tab:: Edgematic Demo

      .. include:: ../blocks/edgematic_beta_registration_notice.rst

      Log in to Edgematic and create a new project called ``demo``. In the ``Catalog`` tab on the right-hand side, find the ``yolo_v5_ethernet`` application under ``SiMa`` and drag it onto the canvas. Then hit the ``play`` button in the top-right corner of the page.

   .. tab:: Multichannel Vision AI Demo

      The Multichannel Vision AI demo highlights the SiMa platform's real-time processing of 16 video streams, running multiple models simultaneously, such as YOLOv9s, YOLOv7, YOLOv8, and pose estimation. This prebuilt demo uses :ref:`OptiView `, a tool that streamlines Vision AI ML pipeline development. On the host side, an installable package integrates `MediaMTX `_, an open-source media streaming server, to deliver multiple RTSP channels to the DevKit.

      Developers who prefer to build the demo from source can download the :ref:`Palette SDK ` and use the :ref:`MPK tools ` to compile and deploy the pipeline. See Step 6 for detailed instructions. This demo is compatible with both the :ref:`MLSoc_Devkit` and :ref:`Modalix ` platforms.

      .. note::

         To install the demo, ensure the DevKit is connected to the Internet. Once installed, it can run entirely offline.

         The Modalix DevKit ships with this demo preloaded. If the developer is unboxing a new Modalix DevKit, open ``http://devkit-ip:8800`` in a browser to try the Vision AI demo and skip the rest of the installation instructions.

      .. dropdown:: Application Architecture
         :animate: fade-in
         :color: secondary
         :open:

         .. figure:: ../../media/pipeline-demo-architecture.jpg
            :width: 800px
            :align: center

            **Multi-channel Vision AI Demo Architecture**

      .. dropdown:: Step 1. Install Pre-built Pipeline On The DevKit
         :animate: fade-in
         :color: secondary
         :open:

         **Install on Modalix**

         .. code-block:: console

            modalix:~$ cd /tmp
            modalix:~$ sima-cli install -v 1.7.0 samples/multichannel -t modalix

         **Install on MLSoC**

         .. code-block:: console

            davinci:~$ cd /tmp
            davinci:~$ sima-cli install -v 1.7.0 samples/multichannel -t mlsoc

      .. dropdown:: Step 2. Install And Run OptiView On The DevKit
         :animate: fade-in
         :color: secondary
         :open:

         The Modalix DevKit comes with ``OptiView`` preloaded from the factory. Before attempting to install ``OptiView``, type ``optiview`` in the shell to check whether it is already installed.

         .. code-block:: console

            modalix:~$ sima-cli install optiview
            modalix:~$ source ~/.bashrc
            modalix:~$ optiview

         When prompted for a password, use the default password ``edgeai``.
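         If the setup is being scripted, the same check can gate the install so it only runs when OptiView is missing. This is a minimal sketch that uses only standard shell built-ins and the install command shown above:

         .. code-block:: console

            modalix:~$ command -v optiview >/dev/null 2>&1 || { sima-cli install optiview && source ~/.bashrc; }
            modalix:~$ optiview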
      .. dropdown:: Step 3. Install And Run RTSP Media Source On The Host
         :animate: fade-in
         :color: secondary
         :open:

         **The following instructions apply to macOS and Ubuntu Linux**

         .. code-block:: console

            user@host-machine:~$ mkdir multisrc && cd multisrc && sima-cli install assets/multi-video-sources
            user@host-machine:~$ cd multi-video-sources-scripts/
            user@host-machine:~$ open preview.html
            user@host-machine:~$ ./mediasrc.sh ../videos-720p16

         This script automatically installs FFmpeg and MediaMTX on the host if they are not already present. The ``open preview.html`` command lets the developer view the RTSP streams in a browser to confirm they are active.

         **The following instructions apply to Windows**

         On Windows, open PowerShell and run ``mediasrc.bat`` instead of ``mediasrc.sh``.

         .. code-block:: console

            user@host-machine:C:\Users\sima\> mkdir multisrc && cd multisrc && sima-cli install assets/multi-video-sources
            user@host-machine:C:\Users\sima\> cd multi-video-sources-scripts/
            user@host-machine:C:\Users\sima\> ii preview.html
            user@host-machine:C:\Users\sima\> .\mediasrc.bat ..\videos-720p16\
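         On either host platform, the streams can also be checked from the command line with ``ffprobe``, which is typically installed alongside FFmpeg. This is a minimal sketch; the stream name at the end of the URL is a placeholder, so substitute one of the stream paths listed in ``preview.html``:

         .. code-block:: console

            user@host-machine:~$ ffprobe -rtsp_transport tcp rtsp://127.0.0.1:8554/stream-name

         If the stream is live, ``ffprobe`` prints its codec and resolution details instead of a connection error.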
         .. important::

            On the MLSoC, the pipeline processes sixteen 480p30 input streams; run the ``mediasrc`` script with the ``videos-480p30`` assets instead.

      .. dropdown:: Step 4. Modify The Pipeline To Receive RTSP Streams
         :animate: fade-in
         :color: secondary
         :open:

         .. note::

            By default, the pipeline reads the RTSP streams from the local IP address ``127.0.0.1``. Modify the pipeline configuration to read the streams from the host machine instead.

         With OptiView running from Step 2, open ``https://devkit-ip:9900`` from the host machine to access the OptiView web interface. Select the multichannel project from the project dropdown list, then click the ``gstEdit`` icon on the left-hand side of the dropdown list to edit the pipeline.

         Copy the pipeline content into a text editor and replace all instances of ``127.0.0.1:8554`` with the host machine's IP address and port 8554. For example, if the host IP is ``192.168.1.100``, replace ``127.0.0.1:8554`` with ``192.168.1.100:8554``. Paste the modified content back into the OptiView ``gstEdit`` window, then click ``Save``.

      .. dropdown:: Step 5. Run The Pipeline
         :animate: fade-in
         :color: secondary
         :open:

         Click the ``Rocket`` icon in the toolbar. If the pipeline starts successfully, the rocket displays a launch animation, and a log window appears so developers can monitor the pipeline status. Next, click the ``TV`` icon in the bottom-left corner to view the results. Note that this multichannel pipeline may take up to 30 seconds to fully start.

      .. dropdown:: Step 6. Compile The Pipeline
         :animate: fade-in
         :color: secondary
         :open:

         Once the developer has verified that the prebuilt pipeline runs on the DevKit and can receive and process multiple RTSP streams, the pipeline can be customized and recompiled by following these steps. Before proceeding, ensure that the :ref:`Palette CLI` SDK is installed on the host machine. The source code of this demo resides on `GitHub `_.

         **Download Pipeline Source Code & Models**

         .. code-block:: console

            sima-user@docker-image-id:~$ sima-cli install gh:sima-ai/pipeline-multichannel

         At the end of the installation, enter the IP address of the RTSP source host when prompted.

         **Build Pipeline**

         Run the ``mpk create`` command that matches the target board.

         .. code-block:: console

            sima-user@docker-image-id:~$ cd pipeline-multichannel
            sima-user@docker-image-id:~$ mpk create -s . -d . --clean --board-type modalix
            sima-user@docker-image-id:~$ mpk create -s . -d . --clean --board-type davinci

         **Deploy Pipeline**

         .. code-block:: console

            sima-user@docker-image-id:~$ mpk device connect -t sima@devkit-ip
            sima-user@docker-image-id:~$ mpk deploy -f project.mpk

         After the pipeline is deployed, the developer can still use OptiView to control and run the pipeline.

   .. tab:: VLM Demo

      This demo, nicknamed ``llima``, integrates live image capture, voice input, and text input to enable seamless multimodal interaction powered by the `LLaVA model `_. It also supports Retrieval-Augmented Generation (RAG).

      .. note::

         These instructions apply only to the Modalix DevKit. To install this demo on the Modalix Early Access Kit, refer to this :ref:`page `.

         To install the demo, ensure the DevKit is connected to the Internet. Once installed, it can run entirely offline.

         The Modalix DevKit ships with this demo preloaded. If the developer is unboxing a new Modalix DevKit, open ``http://devkit-ip:8800`` in a browser to try the LLiMa demo and skip the rest of the installation instructions.

      .. include:: ../blocks/install_sima_cli.rst

      .. dropdown:: Application Architecture
         :animate: fade-in
         :color: secondary
         :open:

         .. figure:: ../../media/llima-architecture.png
            :width: 800px
            :align: center

            **LLiMa Software Architecture**

      .. dropdown:: Install The Application On Modalix DevKit
         :animate: fade-in
         :color: secondary
         :open:

         Before proceeding, make sure the NVMe drive is mounted. The Modalix DevKit comes with a pre-formatted 500GB NVMe drive. If the developer is using a larger or new NVMe drive, refer to this :ref:`guide `.

         .. code-block:: console

            modalix:~$ sima-cli nvme remount
            modalix:~$ cd /media/nvme/ && mkdir -p llima && cd llima
            modalix:~$ sima-cli install -v 1.7.0 samples/llima

         This command downloads the necessary models and application components and organizes them under the ``/media/nvme/llima`` directory. Upon successful installation, the developer will see instructions on how to launch the demo.

         .. code-block:: console

            modalix:~$ cd simaai-genai-demo && ./run.sh

         To access the demo, open Chrome on the desktop and navigate to ``https://devkit-ip:5000``.

         CLI mode allows users to test the model locally without relying on another machine; however, it supports only text and image interactions. To run the demo in CLI mode, use the following command:

         .. code-block:: console

            modalix:~$ cd simaai-genai-demo && ./run.sh -cli

      .. dropdown:: Install the RAG File Processing Server on Ubuntu
         :animate: fade-in
         :color: secondary
         :open:

         Optionally, the developer can install a RAG File Processing Server on an Ubuntu machine that the Modalix DevKit can access over the network. This allows users to upload ``.txt``, ``.pdf``, or ``.md`` files to the server, where text embeddings are generated and transferred to the Modalix DevKit for runtime RAG-based search.

         .. code-block:: console

            user@ubuntu-host-machine:~$ mkdir ragfps && cd ragfps
            user@ubuntu-host-machine:~$ sima-cli install assets/llm-demo-assets/rfps

         Upon successful installation, the CLI displays instructions for running the RAG File Processing Server:

         .. code-block:: console

            user@ubuntu-host-machine:~$ cd ragfps && source .venv/bin/activate
            user@ubuntu-host-machine:~$ python3 main.py

         To run the demo with the RAG File Processing Server enabled:

         .. code-block:: console

            modalix:~$ cd simaai-genai-demo && ./run.sh --ragfps
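         If the demo cannot reach the RAG File Processing Server, a quick connectivity check from the DevKit helps rule out basic network problems. This is a minimal sketch; replace ``ubuntu-host-ip`` with the Ubuntu machine's actual IP address:

         .. code-block:: console

            modalix:~$ ping -c 3 ubuntu-host-ip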