2024年1月31日 星期三

Raspberry Pi 4 + Coral USB Accelerator Various Applications

Purpose:
Use a Raspberry Pi 4 with the Coral USB Accelerator to work through various Edge TPU application examples, and then develop your own project programs.

Architectures:

Fundamental:
Before using the Google Coral USB Accelerator, you must first install the Edge TPU runtime and the PyCoral library on the Raspberry Pi. The "Get started with the USB Accelerator" guide on the official Coral website walks through the introduction and setup of the Google Coral USB Accelerator.
Model introduction and installation
The most commonly used TensorFlow Lite models are for object detection and image classification. The TensorFlow website provides an object_detection introduction; by combining a video stream with an object detection model, the program can identify the objects in front of the camera and label them with their names.

Open a terminal on the Raspberry Pi and download the Coral Edge TPU example package. In /home/pi/google-coral/tflite/python/examples/detection/models you will find "ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite", the model used for object detection, together with "coco_labels.txt", a text file containing the names of 89 object classes.
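Each line of coco_labels.txt pairs a numeric class id with a name (ids are not contiguous, since some COCO ids are unused). A minimal sketch of parsing that file into a dictionary, assuming the "id name" line format used by the Coral examples:

```python
def read_label_file(path):
    """Parse a Coral-style labels file ("0  person" per line) into {id: name}."""
    labels = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # skip blank lines
            idx, name = line.split(maxsplit=1)
            labels[int(idx)] = name
    return labels
```

The returned dictionary lets detection code map a model's class index straight to a display name.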

Image classification
1. Download the example code from GitHub:
mkdir coral && cd coral
git clone https://github.com/google-coral/pycoral.git
cd pycoral
2. Download the model, labels, and bird photo:
bash examples/install_requirements.sh classify_image.py
3. Run the image classifier with the bird photo (shown in figure 1):
python3 examples/classify_image.py \
--model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
--labels test_data/inat_bird_labels.txt \
--input test_data/parrot.jpg
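Under the hood, classify_image.py reads the model's quantized output tensor, dequantizes the scores, and reports the top matches. A stripped-down sketch of that post-processing step (the helper name and the default quantization parameters here are illustrative, not PyCoral's actual API):

```python
def top_k_classes(quantized_scores, labels, k=3, scale=1/255.0, zero_point=0):
    """Dequantize uint8 scores and return the k best (label, score) pairs.

    scale and zero_point are the output tensor's quantization parameters;
    1/255 with zero point 0 is typical for a softmax output (assumption).
    """
    scores = [(i, scale * (q - zero_point)) for i, q in enumerate(quantized_scores)]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return [(labels.get(i, str(i)), s) for i, s in scores[:k]]
```

With the bird model, the top entry would be the macaw class with a score close to 1.0.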

To run the object detection example on a still image:
python3 detect_image.py \
  --model all_models/ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite \
  --labels all_models/coco_labels.txt \
  --input all_models/grace_hopper.bmp \
  --output all_models/grace_hopper_processed.bmp
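The SSD postprocess op emits parallel tensors of normalized boxes, class ids, and scores; the script scales the boxes to pixel coordinates and drops low-confidence hits before drawing. A minimal sketch of that step (helper name is illustrative; the [ymin, xmin, ymax, xmax] box layout matches the SSD postprocess outputs):

```python
def filter_detections(boxes, class_ids, scores, width, height, threshold=0.5):
    """Scale normalized [ymin, xmin, ymax, xmax] boxes to pixel coordinates,
    keeping only detections whose score meets the threshold."""
    results = []
    for box, cls, score in zip(boxes, class_ids, scores):
        if score < threshold:
            continue  # discard low-confidence detections
        ymin, xmin, ymax, xmax = box
        results.append({
            'bbox': (int(xmin * width), int(ymin * height),
                     int(xmax * width), int(ymax * height)),
            'class_id': int(cls),
            'score': float(score),
        })
    return results
```

Each resulting bbox is in (left, top, right, bottom) pixel order, ready to hand to a drawing routine.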


Edge TPU Object Tracker Example
Installation
1. First, be sure you have completed the setup instructions for your Coral device. If it's been a while, repeat to be sure you have the latest software.
Importantly, you should have the latest TensorFlow Lite runtime installed (as per the Python quickstart).
2. Clone this Git repo onto your computer:
mkdir google-coral && cd google-coral
git clone https://github.com/google-coral/example-object-tracker.git
cd example-object-tracker/
3. Download the models:
sh download_models.sh
These models will be downloaded to a new folder models.
Models
For the demos in this repository you can change the model and the labels file using the --model and --labels flags. Be sure to use the models whose names end in _edgetpu, as those are compiled for the accelerator; otherwise the model will run on the CPU and be much slower.
For detection you need to select one of the SSD detection models and its corresponding labels file:
mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite, coco_labels.txt
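The tracker's core job is associating each frame's detections with existing tracks. A stripped-down sketch of the greedy IoU matching such trackers typically use (this is illustrative, not the repo's actual implementation):

```python
def iou(a, b):
    """Intersection-over-union of two (xmin, ymin, xmax, ymax) boxes."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def match_tracks(tracks, detections, min_iou=0.3):
    """Greedily pair each track box with the unused detection of highest IoU."""
    matches, used = [], set()
    for ti, t in enumerate(tracks):
        best, best_iou = None, min_iou
        for di, d in enumerate(detections):
            if di in used:
                continue
            score = iou(t, d)
            if score > best_iou:
                best, best_iou = di, score
        if best is not None:
            used.add(best)
            matches.append((ti, best))
    return matches
```

Unmatched detections would start new tracks, and tracks unmatched for several frames would be dropped.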

Edge TPU simple camera examples
1. First, be sure you have completed the setup instructions for your Coral device. If it's been a while, repeat to be sure you have the latest software.
Importantly, you should have the latest TensorFlow Lite runtime installed (as per the Python quickstart).

2. Clone this Git repo onto your computer:
mkdir google-coral && cd google-coral
git clone https://github.com/google-coral/examples-camera.git --depth 1
3. Download the models:
cd examples-camera
sh download_models.sh
These canned models will be downloaded and extracted to a new folder all_models.

PoseNet pose estimation
Installing PoseNet
Coral PoseNet
Pose estimation refers to computer vision techniques that detect human figures in images and video, so that one could determine, for example, where someone’s elbow, shoulder or foot show up in an image. PoseNet does not recognize who is in an image, it is simply estimating where key body joints are.

This repo contains a set of PoseNet models that are quantized and optimized for use on Coral's Edge TPU, together with example code that shows how to run them on a camera stream.


Reference: https://github.com/google-coral/project-posenet
https://hackmd.io/@0p3Xnj8xQ66lEl0EHA_2RQ/H1dgMOx1L
git clone https://github.com/google-coral/project-posenet
sh install_requirements.sh
python3 simple_pose.py
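PoseNet reports per-joint keypoints as (x, y) positions with confidence scores. Once you have them, downstream logic is plain geometry; for example, a hypothetical helper (not part of the repo) that computes the angle at a joint such as the elbow from three keypoints:

```python
import math

def joint_angle(a, b, c):
    """Angle ABC in degrees, where a, b, c are (x, y) keypoints,
    e.g. shoulder, elbow, wrist."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from joint to first keypoint
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from joint to second keypoint
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 0.0  # degenerate: coincident keypoints
    cos = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))
```

Such per-joint angles are the usual building block for rep counting or posture checks on top of the raw keypoints.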

TensorFlow-Lite-Object-Detection-on-Raspberry-Pi
Reference https://github.com/JerryKurata/TFlite-object-detection/tree/main

TensorFlow Lite Python image classification example with Raspberry Pi.
Plug your Coral USB Accelerator into one of the USB ports on the Raspberry Pi. If you're using a Pi 4, make sure to plug it into one of the blue USB 3.0 ports.
Speed up TensorFlow Lite Inferencing with Coral USB Accelerator
https://gpiocc.github.io/learn/raspberrypi/ml/2020/06/27/martin-ku-speed-up-tensorflow-lite-inferencing-with-coral-usb-accelerator.html
python3 classify.py \
  --model ~/efficientnet_lite0_edgetpu.tflite \
  --labels ~/ai/labels.txt

Reference: wget www.dropbox.com/s/i8mdgys2wav7ooa/coral-live-object-detector-v2.py

YouTube Demo:


