.. _sample_collections:

Sample Collections
==================

SiMa provides multiple sample sources to help developers explore, evaluate, and deploy models, pipelines, and tools across different stages of development. Each source serves a distinct purpose and is accessed in a different way.

This document explains what each collection is, when to use it, and how to access the full list of available samples.

.. tabs::

   .. tab:: Model Zoo

      The **Model Zoo** is a curated collection of **precompiled and quantized models** that are ready to run on SiMa devices.

      **Purpose**

      - Evaluate model accuracy and performance
      - Avoid manual compilation and quantization
      - Use validated, production-ready model artifacts
      - Select models optimized for specific hardware targets

      **Access**

      The Model Zoo is accessed via the CLI.

      .. code-block:: console

         user@host-machine:~$ sima-cli modelzoo list
         user@host-machine:~$ sima-cli modelzoo get yolov5s
         user@host-machine:~$ sima-cli modelzoo describe yolov5

   .. tab:: App Zoo

      The **App Zoo** provides a collection of **prebuilt, end-to-end pipelines** that can be **deployed directly to a DevKit and executed**.

      **Purpose**

      - Rapid functional validation on hardware
      - End-to-end pipeline evaluation
      - Demonstration of complete applications
      - Deployment of runnable examples with minimal setup

      App Zoo entries represent complete pipelines, including models, preprocessing, and postprocessing logic.

      **Access**

      The App Zoo is accessed via the CLI.

      .. code-block:: console

         user@host-machine:~$ sima-cli appzoo list

      Applications from the App Zoo can typically be deployed and executed directly on a DevKit.

   .. tab:: Hugging Face (LLM / VLM Models)

      SiMa supports **Large Language Models (LLMs)** and **Vision-Language Models (VLMs)** sourced from `Hugging Face <https://huggingface.co>`_, primarily for interactive evaluation and demonstration.
      **Purpose**

      - Evaluate LLM and VLM capabilities on SiMa hardware
      - Run interactive demos and inference workloads
      - Experiment with generative AI use cases
      - Validate end-to-end LLM pipelines

      These examples focus on **runtime evaluation and interaction** rather than precompiled Model Zoo artifacts.

      **Access**

      Hugging Face–based LLM/VLM samples are provided through the **LLiMA demo**. To install the LLiMA sample package:

      .. code-block:: console

         sima@modalix:~$ sima-cli install samples/llima -t select

      During installation, you are prompted to select one or more models to download. If multiple models are available on the system, you are prompted to select a model when launching the LLiMA application.

      For more information, refer to the LLiMA example article.

   .. tab:: GitHub Open Source Samples

      SiMa maintains a collection of **open source samples, tools, and reference implementations** on GitHub under the ``sima-ai`` organization.

      **Purpose**

      - Reference applications and developer tools
      - Automation and integration examples
      - Community collaboration and customization

      These repositories provide fully transparent source code and may evolve independently of SDK releases.

      **Access**

      Browse the available repositories on `GitHub <https://github.com/sima-ai>`_.

   .. tab:: Model SDK Examples

      **Model SDK examples** are reference examples that are **downloaded locally when the SDK is installed**.

      **Purpose**

      - Learn the Model SDK APIs
      - Understand pipeline construction and configuration
      - Explore common workflows such as preprocessing, inference, and postprocessing
      - Modify and experiment with code locally

      These are source-level examples intended for learning, debugging, and customization.

      **Access**

      The examples are installed automatically as part of the SDK installation and are available directly in the SDK directory on your system. No CLI command is required to list them; simply browse the installed SDK files.
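      Because the examples are plain files on disk, standard shell tools are enough to explore them. A minimal sketch, assuming the SDK was installed under ``~/sima-sdk`` (a placeholder path; the actual location depends on your installation):

      .. code-block:: console

         user@host-machine:~$ SDK_DIR="$HOME/sima-sdk"   # assumed path; adjust to your install
         user@host-machine:~$ find "$SDK_DIR" -name '*.py' 2>/dev/null | head -20

      The same approach works for locating notebooks, configuration files, or any other example assets shipped with the SDK.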