.. |br| raw:: html

   <br />
.. _tx8-bsp-coral:

Coral Edge TPU
==============

Introduction
------------

.. figure:: images/coral_overview.jpg
   :scale: 20 %
   :align: right
   :figwidth: 40%

*Coral helps you bring on-device AI application ideas from prototype to
production. They offer a platform of hardware components, software tools, and
pre-compiled models for building devices with local AI.* - `coral.ai `_

|br|

The Coral Mini PCIe accelerator can be attached to our **QSBASE3** modules,
i.e. **QSXM** and **QSXP**.

- https://coral.ai/products/pcie-accelerator

You have two options for using it: |br|

- Download the *karo-image-ml* matching your module from our `Download Area `_.
- To compile the image yourself, use :ref:`nxp-yocto-guide-index` and build
  ``karo-image-ml``.

Setup
-----

1. Mount the Coral PCIe module onto our QSBASE3 baseboard:

.. figure:: images/coral_installed.jpg
   :scale: 20 %
   :align: right
   :figwidth: 40%

Getting Started [1]_
--------------------

karo-image-ml
~~~~~~~~~~~~~

For the following steps you will need the **karo-image-ml** rootfs on your
board. You can compile it with :ref:`nxp-yocto-guide-index` or get it from
our `Download Area `_.

.. note:: **PCIe has to be enabled in the devicetree.**

   * QSXM:

     .. code-block:: text

        &pcie0 {
            status = "okay";
        };

   * QSXP:

     .. code-block:: text

        &pcie {
            status = "okay";
        };

        &pcie_phy {
            status = "okay";
        };

PyCoral Library
~~~~~~~~~~~~~~~

PyCoral is a Python library built on top of the TensorFlow Lite library to
speed up your development and provide extra functionality for the Edge TPU.

It's recommended to start with the PyCoral API, because it reduces the amount
of code you must write to run an inference. But you can also build your own
projects using TensorFlow Lite directly, in either Python or C++.

To install the PyCoral library into our *karo-image-ml* rootfs, use the
following command:

.. prompt::
   :prompts: #

   pip3 install --extra-index-url https://google-coral.github.io/py-repo/ pycoral

.. tip:: If you receive any SSL/certificate errors, make sure your module has
   the correct time set. |br|

   .. prompt::
      :prompts: #

      ntpdate -s -u

Run a Model on TPU
~~~~~~~~~~~~~~~~~~

Follow these steps to perform image classification with the example code and
MobileNet v2:

1. Download the example code from GitHub:

   .. prompt::
      :prompts: #

      mkdir coral && cd coral
      git clone https://github.com/google-coral/pycoral.git
      cd pycoral

2. Download the model, labels, and bird photo:

   .. prompt::
      :prompts: #

      bash examples/install_requirements.sh classify_image.py

3. Run the image classifier with the example bird photo:

   .. prompt::
      :prompts: #

      python3 examples/classify_image.py \
      --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
      --labels test_data/inat_bird_labels.txt \
      --input test_data/parrot.jpg

You should see a result like this:

.. code-block:: text

   W :131] Could not set performance expectation : 4 (Inappropriate ioctl for device)
   ----INFERENCE TIME----
   Note: The first inference on Edge TPU is slow because it includes loading
   the model into Edge TPU memory.
   25.9ms
   7.9ms
   7.9ms
   7.9ms
   7.9ms
   -------RESULTS--------
   Ara macao (Scarlet Macaw): 0.75781

.. hint:: The "Could not set performance expectation" warning comes from the
   linux-imx kernel apex module. See
   https://github.com/google-coral/libedgetpu/issues/11

Next Steps
----------

You can continue with the official Coral documentation:
https://coral.ai/docs/m2/get-started#next-steps

.. rubric:: References

.. [1] Partially extracted from https://coral.ai/docs/m2/get-started
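To make the result of the ``classify_image.py`` example above less opaque: Edge TPU models are quantized, so the script's post-processing essentially dequantizes the raw ``uint8`` output tensor (``score = scale * (raw - zero_point)``) and returns the highest-scoring labels. The following is a minimal sketch of that step in plain Python; the helper name ``top_classes`` and all sample values are illustrative and not part of the PyCoral API:

.. code-block:: python

   def top_classes(raw_scores, scale, zero_point, labels, k=3):
       """Dequantize a uint8 output tensor, return the top-k (label, score) pairs."""
       # Map each quantized byte back to a real-valued score.
       scores = [scale * (raw - zero_point) for raw in raw_scores]
       # Rank class indices by score, highest first.
       ranked = sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True)
       return [(labels[i], score) for i, score in ranked[:k]]


   # Illustrative values only -- a real run reads the raw scores from the
   # interpreter's output tensor and the labels from inat_bird_labels.txt.
   labels = ["background", "Ara macao (Scarlet Macaw)", "Platycercus elegans"]
   raw = [3, 194, 61]  # quantized uint8 scores
   print(top_classes(raw, scale=1 / 256, zero_point=0, labels=labels, k=2))

With these sample values the macaw comes out on top with a score of 194/256 = 0.7578, which is how a quantized byte ends up printed as the ``0.75781`` seen in the example output.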