{
"cells": [
{
"cell_type": "markdown",
"id": "ffa8fbc6",
"metadata": {},
"source": [
"# Esp32-Cam Image Object Recognition in 30 minutes\n",
"\n",
"\n",
"![Esp32-cam image recognition](assets/esp32-cam-image-recognition-cover.png)\n",
"\n",
"\n",
"Have you ever wanted to perform **object recognition** on your cheap Esp32-cam in a matter of minutes?\n",
"\n",
"Do you want it to be easy and fast?\n",
"\n",
"This project is for you!\n",
"\n",
"Learn how to quickly implement your own **object recognition system on the Esp32-cam** by: \n",
"\n",
" 1. collect images from Esp32-cam to create a dataset\n",
" 2. train a Machine Learning classifier on your PC to recognize objects in the images\n",
" 3. deploy that classifier to your Esp32-cam for real-time object recognition"
]
},
{
"cell_type": "markdown",
"id": "8b98bc2b",
"metadata": {},
"source": [
"## Image Recognition that is *Fast*\n",
"\n",
"Image and object recognition is not something entirely new on the Esp32-cam and other microcontrollers, thanks to [TensorFlow for Microcontrollers](https://www.tensorflow.org/lite/microcontrollers) and no-code platforms like [Edge Impulse](https://edgeimpulse.com).\n",
"\n",
"They come with pre-trained Neural networks of varying size and complexity that you can leverage to implement your own image recognition system.\n",
"\n",
"*But...*\n",
"\n",
"**Neural Networks for image recognition are heavyweight: they can take anywhere from 50 Kb to 500 Kb of RAM.**\n",
"\n",
"Since your cheap Esp32-cam usually comes with limited RAM, you will often be forced to opt for a low complexity, low accuracy network.\n",
"\n",
"Even more, with weight it comes **time complexity**: classifying an image on the Esp32-cam usually takes about 500 ms (*source: [Edge Impulse blog](https://www.edgeimpulse.com/blog/add-sight-to-your-esp32)*).\n",
"\n",
"Can we do better?\n",
"\n",
"Can we do *faster*?\n",
"\n",
"Yes, we can!\n",
"\n",
"**Image and object recognition on Esp32-cam can be implemented in 30 minutes, with minimal code configuration, thanks to the Eloquent Arduino ecosystem of libraries: once deployed, it takes 1 kb of RAM and runs at 60 FPS.**\n",
"\n",
"Follow the next steps to get up and running!"
]
},
{
"cell_type": "markdown",
"id": "d21cdacb",
"metadata": {},
"source": [
"## Hardware Requirements\n",
"\n",
"To follow this project the only requirement is an Esp32 camera.\n",
"\n",
"You can find many models on the market:\n",
"\n",
" - [from Ai Thinker](http://www.ai-thinker.com/pro_view-24.html) (the most widely used)\n",
" - [from Espressif](https://www.espressif.com/en/products/devkits/esp-eye/overview)\n",
" - [from M5Stack](https://shop.m5stack.com/products/esp32-camera)\n",
"\n",
"I can't recommend enough the cameras from M5Stack because they come with 4 Mb external PSRAM, but any from the above list should work.\n",
"\n",
"<img src=\"assets/Esp32 cam devices.png\" alt=\"Esp32 cam devices\" />"
]
},
{
"cell_type": "markdown",
"id": "e985c552",
"metadata": {},
"source": [
"## Software requirements\n",
"\n",
"To capture the images from the Esp32-cam with ease, I suggest you to install the **[Eloquent Arduino library version 2.1.2](https://github.com/eloquentarduino/EloquentArduino)**. It is available on the Arduino IDE Library Manager.\n",
"\n",
"![Eloquent Arduino library](assets/eloquent-arduino-2.x.y.png)\n",
"\n",
"To collect the images on your PC and train the Machine Learning model, you have to install the **[everywhereml Python package](https://github.com/eloquentarduino/everywhereml)**.\n",
"\n",
"Create a new Python project and run\n",
"\n",
"```bash\n",
"pip install everywhereml==0.0.5\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "32a97c62",
"metadata": {},
"source": [
"## Step 1 of 5: Load the CameraWebServer sketch\n",
"\n",
"First step to create a Machine Learning model is to collect data.\n",
"\n",
"Since the Esp32-cam quality is pretty low, I recommend you to:\n",
"\n",
" 1. fix the camera in position with tape and don't let it move\n",
" 2. use artificial illumination if possible (image quality degrades in low light conditions)\n",
" \n",
"<img src=\"assets/esp32-cam image recognition setup.jpeg\" alt=\"esp32-cam image recognition setup\" />\n",
"<p class=\"caption\">Something as simple as a plain background will work best</p>\n",
" \n",
"To keep acquisition speed fast, we will capture at QQVGA resolution (160 x 120). If your project requires you to capture at higher resolutions, change the sketch accordingly.\n",
"\n",
"<x-alerts.info>Image recognition often happens at even lower resolutions anyway, so if you're not using the large version of the image for other purposes, QQVGA is the best choice</x-alerts.info>\n",
" \n",
"Once your setup is ready, load the CameraWebServer sketch below on your board.\n",
"\n",
"Once loaded, the Esp32-cam will connect your WiFi network and start an HTTP video streaming server you can access from any web broswer.\n",
"\n",
"```cpp\n",
"#include \"eloquent.h\"\n",
"#include \"eloquent/networking/wifi.h\"\n",
"#include \"eloquent/vision/camera/esp32/webserver.h\"\n",
"\n",
"// replace 'm5wide' with your own model\n",
"// possible values are 'aithinker', 'eye', 'm5stack', 'm5wide', 'wrover'\n",
"#include \"eloquent/vision/camera/m5wide.h\"\n",
"\n",
"\n",
"void setup() {\n",
" Serial.begin(115200);\n",
"\n",
" // configure camera\n",
" camera.jpeg();\n",
" camera.qqvga();\n",
"\n",
" // replace with your WiFi credentials\n",
" while (!wifi.connectTo(\"Abc\", \"12345678\"))\n",
" Serial.println(\"Cannot connect to WiFi\");\n",
"\n",
" while (!camera.begin())\n",
" Serial.println(\"Cannot connect to camera\");\n",
"\n",
" webServer.start();\n",
" Serial.print(\"Camera web server started at http://\");\n",
" Serial.println(WiFi.localIP());\n",
"}\n",
"\n",
"void loop() {\n",
" // do nothing\n",
"}\n",
"```\n",
"\n",
"Now connect your PC to the same WiFi network of your Esp32-cam and open the browser at the address that you read on the Serial Monitor. You should be able to see the live stream from the board."
]
},
{
"cell_type": "markdown",
"id": "27b9e424",
"metadata": {},
"source": [
"## Step 2 of 5: Collect images from Esp32-cam over HTTP\n",
"\n",
"Now that the Esp32-cam video stream is available over the WiFi network, we can run a program that collects the frames over HTTP.\n",
"\n",
"We will make use of the `MjpegCollector` class, that needs the URL of the Esp32-cam web server (the one you can read on the Serial Monitor)."
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "a8fcde70",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ImageDataset[Boards](num_images=1978, num_labels=4, labels=['empty', 'pi', 'portenta', 'wio'])\n"
]
}
],
"source": [
"\"\"\"\n",
"Collect images from Esp32-cam web server\n",
"\"\"\"\n",
"from everywhereml.data import ImageDataset\n",
"from everywhereml.data.collect import MjpegCollector\n",
"\n",
"base_folder = 'dataset_folder'\n",
"\n",
"try:\n",
" # if our dataset folder already exists, load it\n",
" image_dataset = ImageDataset.from_nested_folders(\n",
" name='Boards', \n",
" base_folder=base_folder\n",
" )\n",
"except FileNotFoundError:\n",
" # if the dataset folder does not exists, collect the samples\n",
" # from the Esp32-cam web server\n",
" # duration is how long (in seconds) the program will collect \n",
" # the images for each class\n",
" mjpeg_collector = MjpegCollector(address='http://192.168.105.76')\n",
" image_dataset = mjpeg_collector.collect_many_classes(\n",
" dataset_name='Boards', \n",
" base_folder=base_folder,\n",
" duration=40\n",
" )\n",
" \n",
"print(image_dataset)"
]
},
{
"cell_type": "markdown",
"id": "907794dc",
"metadata": {},
"source": [
"The above snippet will start an interactive data collection procedure: it will ask for a class name and collect the frames for the given amount of time, until you decide to exit.\n",
"\n",
"**Put the objects in front of the camera, enter the object name in the input field and press \\[Enter\\]. The frame collection will start immediately.**\n",
"\n",
"\n",
"<img src=\"assets/esp32-cam image recognition data collection.jpeg\" alt=\"esp32-cam image recognition data collection\" />\n",
"<p class=\"caption\">Put the objects in front of the camera before starting the collection process</p>\n",
"\n",
"**Move the object a little in front of the camera to capture slight variations and make the model more robust**.\n",
"\n",
"Once you're done collecting frames, you can get a preview of them and check the quality of your work."
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "36dd7f29",
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABBAAAAJCCAYAAAB58zQ2AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOz9XaxlS3IeBn6Ra+9TVfd2t5rs7nu72U1KMtAvgp8sQ3o1YAxNGQZowIZgGfBYtiUOMOKMLckzIuZFmhk9+NUGDBl8IEhhAFF6GdgzpCQbFGw+CZDhB/8BHBOCBVGQTFIku/tWnbPXyoyYh8jIjMy11j7nVJ3aZ9e9+RVW7b+11177rP1lRHwZEUkigoGBgYGBgYGBgYGBgYGBgYFzCM99AgMDAwMDAwMDAwMDAwMDA9ePISAMDAwMDAwMDAwMDAwMDAzciyEgDAwMDAwMDAwMDAwMDAwM3IshIAwMDAwMDAwMDAwMDAwMDNyLISAMDAwMDAwMDAwMDAwMDAzciyEgDAwMDAwMDAwMDAwMDAwM3Iv3IiAQ0U8Q0a8R0a8T0c+8j88YGBh4PAY3BwauE4ObAwPXicHNgYHrxODm84FE5GkPSDQB+P8B+N8B+A0Afw/AnxCR//lJP2hgYOBRGNwcGLhODG4ODFwnBjcHBq4Tg5vPi/eRgfBHAPy6iPx9EZkB/CKAn3wPnzMwMPA4DG4ODFwnBjcHBq4Tg5sDA9eJwc1nxOE9HPPbAP6he/wbAP5ovxMR/RSAnwKAF8fwhz/9+iuQvgIBoP/l+/mVdbKEQPJruo/4l8oj6V/zRxDaOK59luT7YueM480NqHymAEjNm8R9DgmDhOtx2v8g7UN9b/sVVmctAggLvv96wd2cCAMDD8ejuXlzDH/406+/zL/5unnOtPwRd48Aoea19vdM9R0dLyCAEHUccccWgCjzQQQhEI7Hm3JYEoDEM0jyidqm3CxHlL3Pac+4ntv6uwsEzIIfvI6DmwOPxVvZzW9+/RUKJ+F/vvvcyUfSH3E2jT0zq+1F5QbqEyKOu+Xpbi+RzM2A481N836Am/eVzxcBQRCEN+2hP9f+s8Xv1+0gImAZ3Bx4K7yV3fzm11/C20wIrX6j9bfqf80EEe9lnuPm+lVG/XlLx30pNNPRIYQJx+NRbWbeVX3W+j6xkUQAIGVu1l0qdwGQ+e/mO/fft+Nufr8w8IPh0w48Hm8Zb36kPi05H0/MO5zK+3obZPuR51j70q5N1Ndo9SbP4LK7SIk3AbhPq/GkGUTJvi1BHHc3fFe553laB8IiAmbsxpvvQ0B4EETkZwH8LAB8+8f+gPy7f/7/hpQiWBJSjBAwmBMoMaYEixQgAAJRCRzsHwWCsBSnxTb7e9tjdvtABBJnwD3Hws37E0UAASLAYTrgW9/6NkII0MwZMw5ADh/AnCDCeh8HMN3kY3E+dt1S3oQZLIIUI5gZzKwDceLmYjMzlmXB6XSHv/NL/++LXq+BLw48N3/kx35M/vd/7s/rb5gFMSXljQDEgnCKhZsAQFSTmgTqtBuYGSkm5SsIBIZwylzEintI+pnMnLmrQYXtTwQwC1JKePHyBb71zW+BiEBEAE2IOOrnSwKIQSSZmwmMAMGUPyufW0rl+CIJIqnwUYTB5VwYcOciECwpYU4Rd3e3+NVf/uXLXrCBLwxau/lj8if/3J9D4gTmhMQRCWp7KBHCcii/bcDsptpQYnVlKD+XUgInF9SDM8+V28ZB40uICcT1cUqp8hYCJkLihMQJL1+9wiff/CYQjJsBSNlJIxUTCufMbuJYbHxKepxio5G3zM3CURawJBBOEKRi95kZ8zLj7vYOv/o3BzcH3g9abv6o/Kk/+x+Ak/42o+eHEJhD+X3nyNuOAnAEJIJAEOjvX7gVGMBUfvMW5KhfCYjtb9x03FE7nvcTxsuXr/DNb30TgYKOBSBETOV0CAAoc00YLJTtZrbviZvjkwgI1VbGGMvYAAgYdxAs1e6mhCVGnE4n/J1f/puXulQDXzC03Pz98u/+h/9XMKfGp1M+MQJ7W9YhApIIgZQgKSWkHKeFQJiYESQ1tsv8WggQOx/XeOH9UH0f48WLF/jWt76FkO2mIIDlkElpQoLo9xD1aTkcymezcBMTMzNS8WcFMcY8Zljs+wZ+QjylhBgjTqc7/Mr/d5ub70NA+EcAftQ9/k5+bh+ccJhfYxL9QpFnCC9gSRAhgI4ABEJZv6FQZxdFEKCOEEK+oKJBDomASBDye0UECCh/YAEwB0ISskMhMSCc9wGQRB0cogA5EOQFQSjkWU5BSBFVdxKEoAfQC5MAWcqrXAZwddqiLGCOOjAzIyY32BIQKYFDdepimpF4RuITgPhEl2vgC4RHczMI46PlToW9lHBIEQDnwByQ6VAGv9YR0nmSSfxMY4JQVEeDCCRHAFMVAU0Rzv/SQcU4kgBhBrHN5NTBnUA4IOD48gY4EiioIwQwJnmDKYt3dn72bxICiYkXgiSs4w1QOcoElgBmICUTHwnMhAUMzoEPIztvKUGYu9mkgYEH4dHcFAZwJwgimEQwsQBIgCg/53gCAAQBKBAmmpwd5CIgQATCEaDqPJAcEHCotstlNAiAOAE85Z+6SLlvGwsjBYEgQF5OkBsCQoAEQmDgINVuEhgIxtMstvNS5ioT1KZb0FLsd3aSqt3UwGUJgmRCIwScEiQmvV1NJw0M3IvHc1OAmACRoL9ySsX2AIwgcw26QaBQZyaFAPBUxD1oSF6OHSRkf5U0SMgfKDnDIU4CTFX8o1TtJrKAoZ8WML08AAfSzyQASJhwl8+FQBRAFMCkYr4wAI5l1rPYzczJGBIiEpgYQgLOfjwLq30XUTtu40ZiSOJOvBwYeDAezU2ShJv4WQmqo03c5oCbIaD8ey42srw5QKag01/CyutsNwWEOE0Qmgq3cQhu0ksgMTUBpjhhHqjP0RQwvToAN1lwJwIJMKVF98vCu35qAgmrT4uljjJiUrs+jkhIJBBS35UQq7gBwQwBu64GSRISA5H3k4Leh4Dw9wB8l4j+IPRC/hsA/s1zbxASnOgOibPzAIFQyH444WizmmIuhT6vGQYEkqmkMwfhnJWpf2S9uFwEBUH+2+uUJ26Ii5iggkGb+inTAZoQQHh18xIvoQFKoEkvQTgWR4cIAFUnSCSBZdHfChM4kP528rkmSUikAQlTAGPK++TBNDFCCmWwpoUQYgDFPjV8YOBBeDQ3WYBZGMw685BAZfwDsjhnQ5SlRGcQB5DUdLAgBwgOUK4ESADyQ93fOJ0JeIgWxFuYUUuZfIxORHh5POAYCCGQDrYIEAuAbGzmGnwkAInyoA7JckAWBEQg4Q6cnTxmhhzyzE0eo6YlIrC+L0GQ4gxKCxDvcmQ3MPAovIXdZCzHO7UXrM4CbKZS1EYBUD6QBhz6GgA5ZL6qMQoSoA6JcpgCQ8WILIyX4F0DiUOyLMCcwVACe2T/JqdsE+HjwxEvshNGAoAmcLAMPnNv1FYr/1Q4L7MyIUAmcjMpmZMMMJFmEwXoGMWMCTOmXALBw
ojxDkgLAp/yxMPAwKPwaG4CDOHbnEFngTvqhuCLjFp/TszaKSeDT1sGECYBDuo3qpggxc6JKDdtQkxEj2BcEtEswZQSAMKrmyMOhGI3BQGRbvJYQQACQFOWPTJHw6LcFgEToJp5Fun5gEkCiFXMO1imYRZAEhYIWfYekJgRObkshYGBR+HxdhOEEybNeGdA6KCCV47hpiz6TYVzPtaiItSBtQx2ojwBFkL2kFU8E2TziqwZkNpDSJ00U2Z7GwgwCUCEj6cDXiIg5KzahID5cKjfggCQgCQVvjGzDiWSfVqbtBZBIkYUVtvKjISpyVDActLyJAt+I0MimxK6+bd8cgFBRCIR/TSAvw1gAvBzIvI/nXsPAZgYQE6voByMSB4cLa2iXEYLOHLADtF0RSrPrY7uzy/PJLL+aKYACfWPGJyjopkAKQ+4jI9fvcJhCpimA0IIYNFaTX/skgadTyhQUHUrpOw8CZj0ex6YEUjAlHWiAB2QSYOdNE3goKnVMSUsacGSZixxGYPtwKPxdtwUUGJ
"text/plain": [
"<Figure size 1440x720 with 40 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"\"\"\"\n",
"Display a preview of the captured images\n",
"\"\"\"\n",
"image_dataset.preview(\n",
" samples_per_class=10, \n",
" rows_per_class=2, \n",
" figsize=(20, 10)\n",
")"
]
},
{
"cell_type": "markdown",
"id": "682e5531",
"metadata": {},
"source": [
"If you find that some images are bad or totally wrong, take some time to delete them.\n",
"\n",
"If you feel that you may need to capture more images, do so.\n",
"\n",
"**Take all the time it takes to collect an high quality dataset, because in Machine Learning \"garbage in, garbage out\"!**"
]
},
{
"cell_type": "markdown",
"id": "6a305ea3",
"metadata": {},
"source": [
"## Step 3 of 5: Create an Object Detection pipeline\n",
"\n",
"Having our very own dataset of images, we need a way to transform each image into something a Machine Learning model can classify.\n",
"\n",
"With Neural Networks, you usually feed the raw image as input and the network learns by itself how to extract meaningful features from it.\n",
"\n",
"With traditional Machine Learning it's different: we have to extract the features by ourself.\n",
"\n",
"But don't worry, you don't have to do this on your own.\n",
"\n",
"The `everywhereml` package has all the tools you need.\n",
"\n",
"First of all, our feature extractor will work with grayscale images, so let's convert the dataset from RGB to Gray."
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "54670106",
"metadata": {},
"outputs": [],
"source": [
"\"\"\"\n",
"Image classification with HOG works on grayscale images at the moment\n",
"So convert images to grayscale in the range 0-255\n",
"\"\"\"\n",
"image_dataset = image_dataset.gray().uint8()"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "2a5f156b",
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABBAAAAJCCAYAAAB58zQ2AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOz9W4hke3YeiH877nnPyqqs+6mq0+fWffr0Od3qVgshGwzD3yMbgwZmELZhsI2HfpF8ebJkMHjwk189/AcP/SBaerHsB2Mb1GANDUaDbGPZxkjTct9O97lW1al7Vl7iHnsesr5ffnvF2hGRWVWRkd3rgyAjduzYsSP3/n5rrW+t9ftleZ4jEAgEAoFAIBAIBAKBQGASKqd9AoFAIBAIBAKBQCAQCAQWHyEgBAKBQCAQCAQCgUAgEJiKEBACgUAgEAgEAoFAIBAITEUICIFAIBAIBAKBQCAQCASmIgSEQCAQCAQCgUAgEAgEAlMRAkIgEAgEAoFAIBAIBAKBqXgpAkKWZb+cZdn3syz7UZZlv/kyviMQCBwfwc1AYDER3AwEFhPBzUBgMRHcPD1keZ6/2ANmWRXADwD8/wB8AuCPAPyVPM//9IV+USAQOBaCm4HAYiK4GQgsJoKbgcBiIrh5ungZFQhfB/CjPM9/nOd5D8DvAviVl/A9gUDgeAhuBgKLieBmILCYCG4GAouJ4OYpovYSjnkNwMfy+hMAv2B3yrLsGwC+AQD1ev2r29vb6T1WRZRVR0yrmvDen2Vb2Wv+zbIMzWYzPZ92jDzPx17P8t3Tzms0GuHg4AC9Xi8b+1GBQDmem5vESbg5Ky+99yZ9Ns9zVCoVNBqN9F6WZVP5fFw+TjoHft9oNML+/n5wM3BcnIib58+fHzvQ89pPb5/jvh6NRsiyDKPRCJVKBa1Wi+c/Zhft54/LvVn3GY1G6HQ66Pf7wc3AcXBibtJPBGa7X+2+s7x30te0m/Rpy75rFh92Vs56+/LRbreDm4Hj4rm5OYs/WLatDMe1Yd6+yk0v3pzETe9971ymjTPT7ObLEBBmQp7n3wTwTQC4ceNG/vf+3t9DnucYDofodDrp+bN90w9xjlPYPhwOMRqNCgMT9xkOh6hUKoVj9nq9wv7ed/KzS0tLuHnzJiqVSsEwWPAzQNFh4nG84/MxGAzSPtym+/V6PXQ6HXznO9953ksQCLhQbt68eTP/jd/4jXT/dbvdMV7xXs/zvMAL3rsaXOv9LN+Xtlmu8K8OePrgPktLS3jllVcSNz1+8vjkV6VSKZyPdw583ev1Ejc5hujv5v+m3W7j93//91/CVQkEitx85ZVX8r/9t/92wXboPdzv9/kZ/fyYLSL4mtxRu2S50e/3x7iifweDASqVCnq9HlZXV3Hr1i1Uq9VSblq+UXwos5vK/cFggMFgUOCjjjHD4RDdbhcHBwf4wz/8wxd5OQKBBMvNv/N3/s7YPaxcVf7QpgBI7xP8HKE+JXmgr/v9/hgXrd3k51ZWVgp2s1KpFM7F2nqen27judpt/A7+VvUDuO9gMECv18P+/j7+/b//9y/1+gR+dmG5+Xf/7t9NXCRf9N61sab1CSuVSsEG8f4GUOA8X5fx3otVlZs3b95MvCz5XWNCgdo9HTvUn+ZfPld7qz5yp9OZaDdfhoDwKYBX5PX1Z9smot/vFxwCAIUfw+fWAfFUFn2fz/mXQQMHs3q9XriYtVqtcFF4LqPRCK1WC/V6vSAg2BttOBymC24DKm7X77cBlwZk3o1YJqQEAjPgRNxUEc0OdAobGCgHlL98z97H+nkOmHRoeO/zfMgDDnyNRgP1ej2ND9xO0Enj4K/fx/0tH/Wz3I/c1+PxuzQACgSOiRNzU+2m5aYG5HrvAhizPcope8+rgwQAtVptLGBXh4hiQbVaxdraGhqNRuH7y7jP71GHSfmpQgJBAUFFRk+MtIJJIDAjjs1NBvJlDrzei9bmTAoYdB8elz6rcmMwGKBarRY4obzg55rNZoGbui9wZMNV1LOwtpy81+DKBik2yNEkQSBwDJyIm0wGWQHB2g7ey2r/yBUVtumnWu4Q6peWxXTqK49GIywvL0+NN/l7rBhZxlOeg/oK9ljcRrva6/VKufkyBIQ/AvBGlmWv4vBC/mUAf3XSB1TJsQMenZEyx9wqphyMbECgQT3/wQwm+M/2/vEcnFkiXa1WCxfCihl20NTfQqiDpudQqVRQrVbH/jdZliUl1zpLgcAx8FzcVBGhbCDzXnv3apZlqNVqY/sCR5lQzVx4pc/KvWazmcYCHehtsKTcHI1GBSfLZkj0e3SM0HPTLE9wM/AcODY36fzofaiBBFDkwDTxj84SX9tqPXWwyjIpand5jFarlQQFK+LpeVLA53ij9tgT7XiMer2e/h+0lfwLHHK13+8HNwMnxbG5CRSTVpqR5zbdR+9LT8j2hG3aLnKEQTt9VC9Qsd/barVQq9XGBASFHkPPUW2zrTIkP2nj9bPK7X6/n4SWQOAEOBE3tUrHCuaWVxrH2ftc7RjvdcarChUr7Hd6dhM49Gmr1WqynWW2S8U9m6gDUIh3VdzwxHj93Cw284ULCHmeD7Is+3UA/xZAFcBv5Xn+3SmfKaiy1smxQcC0CgQN8FWVIWwQwW08D3WK9J/PTIonHugxCL3Z1KGy2UqrWOk+er6DwQDdbjccocCJ8Dzc9Mq6LDynx8s0em0GOuhp0KMc0fFAHRWWe1GtLQMDCz0HFRL02DoWMbDRsjVmYBmcqMAX3AwcFyfhJjBetQb4GUH73H7GCnrWYVHbSA5ZB8QKDXl+2PrD6iB1hLxj63dr9RIdOhUy+Tna6Hq9nnirPB0OhylImZRJCQTKcFJuqs3UAN5W8wDjHFU+eEkr6zsCSOKBivxMSqlvSg72ej0sLS0lbsrvLTzXpJyKijxncs4GWty/0Wgk28jfwQwwM5zBzcBJcFKfVkVve7+X+bmWoyqGKzdsrKm8U/vkxa79fj8dlz4t7SaPp+eqSTb9bWo7tfJJwWOTm3oM7k//toybL2UOhDzPvw3g28f8zMT3PQXWe08dj2nqru5j/+pgyAGYFQieeMDP6ndbhUuzKfYcykq6NfNrhYdA4Lh4GdychEnc9DhEJ4u8s46JCoIqzHHfWq3mqrUq0nlBlydWlp2jPRcO5Dp/SSBwXJyEmyeFvbftfW6FOHJNgwDLTX2uf+moTHKE9BxUZPCcPK0k9Bw+/Tw5aSscA4Hj4KTcPOn95vFRBXYrxOlzfU/Fdu986vV6oQLBs5sqWngChrXF+vDETA1QbGVjIHBczNNuAn6rrXLT3u9qdxqNRsGuebYLOLSP5KaKe544ManKkAIHeey11ut3W6HeEx4UpzaJ4otCmZJb5vzrYErFFjgqB+NfPlRdbTQaKUApq0DQ71DxgBdNFSg7WU0ZVAWmKhQInAWUBSm23QAo9mvrwGeDCZvpYJbD9osp9Lieo2MFBZ6vZly8Y/K8wxEKnDXYHk2FViMASDZH5z5g5Y4GL/oe+dNsNqdmOWknaWt1QkfPgZkkvPOYV
nwPuxmYF9RenQReJZ0mr5R33nwKaifVbtoEWrPZnElAYMmz8tJ+h3KyLEjhMa1PG3YzcFZgg3hgnJsEK3cAvz3C2jbyqExAUI7Y5DQ5annv8bLMbvI7ZhXez7yAAIwHKaoGle2vwQEwPqM6n3NfTjijAoIFj6eig+0Z16oEDU7sBBw2sNLKg8ikBM4C7PwiQDGz7/GzLFC3QYbdj+KeOkLKIRUQGEzY9xVWiNSqIoXlddmgHAi8DNgs5KyYlOEkNzUYmXVuBCuOq92c5AjxLwUDcpX72gDFq2zygh8rIoTdDMwL3j1JeL6nble76Qne+h3KFS8bSQ7ofD/AITc5PwkFQ3vOtmrWE9w5PtgViuwcJgr1acNuBuYNtVsW07ip97IKB9antce3lUNqJ1WIIJ9sUsweT+f6Uw7pvrZaT2NUa+89X3ZaxfvCCAg2g6EoUzIV2n/iDbb8J9hJCr3yaH1o+0Kr1UKz2Sw4VvY32BJp7T/RZbaoRvF8VYG1ypGe32g0SstzBQLzgFduXCbO0XH
"text/plain": [
"<Figure size 1440x720 with 40 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"\"\"\"\n",
"Preview grayscale images\n",
"\"\"\"\n",
"image_dataset.preview(\n",
" samples_per_class=10, \n",
" rows_per_class=2, \n",
" figsize=(20, 10), \n",
" cmap='gray'\n",
")"
]
},
{
"cell_type": "markdown",
"id": "32a5f281",
"metadata": {},
"source": [
"Now it's time to actually convert the images to feature vectors.\n",
"\n",
"There exist many feature extractor for images: in this project we will make use of [Histogram. of Oriented Gradients](https://en.wikipedia.org/wiki/Histogram_of_oriented_gradients).\n",
"\n",
"It is lightweight and pretty fast, so it's a good fit for embedded environments like the Esp32-cam.\n",
"\n",
"To speed the processing up, we will rescale our source image to a lower resolution (40 x 30).\n",
"\n",
"If you later find your classifier achieves low accuracy, you may want to tweak this resolution and see how it impacts both accuracy and execution time."
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "eac31b84",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"HOG: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1978/1978 [00:03<00:00, 593.51it/s]\n"
]
},
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>hog0</th>\n",
" <th>hog1</th>\n",
" <th>hog2</th>\n",
" <th>hog3</th>\n",
" <th>hog4</th>\n",
" <th>hog5</th>\n",
" <th>hog6</th>\n",
" <th>hog7</th>\n",
" <th>hog8</th>\n",
" <th>hog9</th>\n",
" <th>...</th>\n",
" <th>hog126</th>\n",
" <th>hog127</th>\n",
" <th>hog128</th>\n",
" <th>hog129</th>\n",
" <th>hog130</th>\n",
" <th>hog131</th>\n",
" <th>hog132</th>\n",
" <th>hog133</th>\n",
" <th>hog134</th>\n",
" <th>target</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>count</th>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>...</td>\n",
" <td>1978.0</td>\n",
" <td>1978.0</td>\n",
" <td>1978.0</td>\n",
" <td>1978.0</td>\n",
" <td>1978.0</td>\n",
" <td>1978.0</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" <td>1978.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>mean</th>\n",
" <td>0.000961</td>\n",
" <td>0.004506</td>\n",
" <td>0.052086</td>\n",
" <td>0.041644</td>\n",
" <td>0.031181</td>\n",
" <td>0.020685</td>\n",
" <td>0.016865</td>\n",
" <td>0.109291</td>\n",
" <td>0.118905</td>\n",
" <td>0.004492</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000036</td>\n",
" <td>0.647984</td>\n",
" <td>0.138959</td>\n",
" <td>1.498483</td>\n",
" </tr>\n",
" <tr>\n",
" <th>std</th>\n",
" <td>0.003348</td>\n",
" <td>0.007184</td>\n",
" <td>0.031365</td>\n",
" <td>0.025717</td>\n",
" <td>0.028305</td>\n",
" <td>0.037702</td>\n",
" <td>0.028823</td>\n",
" <td>0.058904</td>\n",
" <td>0.037382</td>\n",
" <td>0.018272</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000612</td>\n",
" <td>0.187835</td>\n",
" <td>0.052473</td>\n",
" <td>1.116958</td>\n",
" </tr>\n",
" <tr>\n",
" <th>min</th>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.013274</td>\n",
" <td>0.023559</td>\n",
" <td>0.000000</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000000</td>\n",
" <td>0.280464</td>\n",
" <td>0.023285</td>\n",
" <td>0.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>25%</th>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.027400</td>\n",
" <td>0.022143</td>\n",
" <td>0.014863</td>\n",
" <td>0.006805</td>\n",
" <td>0.003621</td>\n",
" <td>0.077048</td>\n",
" <td>0.092610</td>\n",
" <td>0.000000</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000000</td>\n",
" <td>0.471291</td>\n",
" <td>0.108158</td>\n",
" <td>1.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>50%</th>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.052826</td>\n",
" <td>0.038823</td>\n",
" <td>0.025297</td>\n",
" <td>0.013129</td>\n",
" <td>0.008883</td>\n",
" <td>0.100355</td>\n",
" <td>0.119885</td>\n",
" <td>0.000000</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000000</td>\n",
" <td>0.722219</td>\n",
" <td>0.140522</td>\n",
" <td>1.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>75%</th>\n",
" <td>0.000000</td>\n",
" <td>0.007689</td>\n",
" <td>0.076274</td>\n",
" <td>0.058344</td>\n",
" <td>0.037194</td>\n",
" <td>0.021973</td>\n",
" <td>0.016421</td>\n",
" <td>0.122752</td>\n",
" <td>0.143970</td>\n",
" <td>0.000000</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.000000</td>\n",
" <td>0.774513</td>\n",
" <td>0.167540</td>\n",
" <td>2.000000</td>\n",
" </tr>\n",
" <tr>\n",
" <th>max</th>\n",
" <td>0.034322</td>\n",
" <td>0.038407</td>\n",
" <td>0.141044</td>\n",
" <td>0.157112</td>\n",
" <td>0.277901</td>\n",
" <td>0.395260</td>\n",
" <td>0.199861</td>\n",
" <td>0.426787</td>\n",
" <td>0.271960</td>\n",
" <td>0.155243</td>\n",
" <td>...</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.0</td>\n",
" <td>0.015333</td>\n",
" <td>1.000000</td>\n",
" <td>0.436249</td>\n",
" <td>3.000000</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>8 rows × 136 columns</p>\n",
"</div>"
],
"text/plain": [
" hog0 hog1 hog2 hog3 hog4 \\\n",
"count 1978.000000 1978.000000 1978.000000 1978.000000 1978.000000 \n",
"mean 0.000961 0.004506 0.052086 0.041644 0.031181 \n",
"std 0.003348 0.007184 0.031365 0.025717 0.028305 \n",
"min 0.000000 0.000000 0.000000 0.000000 0.000000 \n",
"25% 0.000000 0.000000 0.027400 0.022143 0.014863 \n",
"50% 0.000000 0.000000 0.052826 0.038823 0.025297 \n",
"75% 0.000000 0.007689 0.076274 0.058344 0.037194 \n",
"max 0.034322 0.038407 0.141044 0.157112 0.277901 \n",
"\n",
" hog5 hog6 hog7 hog8 hog9 ... \\\n",
"count 1978.000000 1978.000000 1978.000000 1978.000000 1978.000000 ... \n",
"mean 0.020685 0.016865 0.109291 0.118905 0.004492 ... \n",
"std 0.037702 0.028823 0.058904 0.037382 0.018272 ... \n",
"min 0.000000 0.000000 0.013274 0.023559 0.000000 ... \n",
"25% 0.006805 0.003621 0.077048 0.092610 0.000000 ... \n",
"50% 0.013129 0.008883 0.100355 0.119885 0.000000 ... \n",
"75% 0.021973 0.016421 0.122752 0.143970 0.000000 ... \n",
"max 0.395260 0.199861 0.426787 0.271960 0.155243 ... \n",
"\n",
" hog126 hog127 hog128 hog129 hog130 hog131 hog132 \\\n",
"count 1978.0 1978.0 1978.0 1978.0 1978.0 1978.0 1978.000000 \n",
"mean 0.0 0.0 0.0 0.0 0.0 0.0 0.000036 \n",
"std 0.0 0.0 0.0 0.0 0.0 0.0 0.000612 \n",
"min 0.0 0.0 0.0 0.0 0.0 0.0 0.000000 \n",
"25% 0.0 0.0 0.0 0.0 0.0 0.0 0.000000 \n",
"50% 0.0 0.0 0.0 0.0 0.0 0.0 0.000000 \n",
"75% 0.0 0.0 0.0 0.0 0.0 0.0 0.000000 \n",
"max 0.0 0.0 0.0 0.0 0.0 0.0 0.015333 \n",
"\n",
" hog133 hog134 target \n",
"count 1978.000000 1978.000000 1978.000000 \n",
"mean 0.647984 0.138959 1.498483 \n",
"std 0.187835 0.052473 1.116958 \n",
"min 0.280464 0.023285 0.000000 \n",
"25% 0.471291 0.108158 1.000000 \n",
"50% 0.722219 0.140522 1.000000 \n",
"75% 0.774513 0.167540 2.000000 \n",
"max 1.000000 0.436249 3.000000 \n",
"\n",
"[8 rows x 136 columns]"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"\"\"\"\n",
"Create an object detection pipeline with HOG feature extractor\n",
"\"\"\"\n",
"from everywhereml.preprocessing.image.object_detection import HogPipeline\n",
"from everywhereml.preprocessing.image.transform import Resize\n",
"\n",
"pipeline = HogPipeline(\n",
" transforms=[\n",
" Resize(width=40, height=30)\n",
" ]\n",
")\n",
"\n",
"# Convert images to feature vectors\n",
"feature_dataset = pipeline.fit_transform(image_dataset)\n",
"feature_dataset.describe()"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "c3fc5a19",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ImagePipeline: HogPipeline\n",
"---------\n",
" - Resize(from=((160, 120)), to=(40, 30), pixformat=gray)\n",
" > HOG(block_size=8, bins=9, cell_size=3)\n"
]
}
],
"source": [
"\"\"\"\n",
"Print pipeline description\n",
"\"\"\"\n",
"print(pipeline)"
]
},
{
"cell_type": "markdown",
"id": "9b7537ff",
"metadata": {},
"source": [
"The output of the above code is a dataset made of feature vectors, instead of images. These feature vectors are now suitable for Machine Learning models.\n",
"\n",
"To get a *visual* idea of how informative the extracted features are, we can plot a *pairplot* of them.\n",
"\n",
"A [pairplot](https://seaborn.pydata.org/generated/seaborn.pairplot.html) compares each feature against the others in a grid format. By highlighting each class with a different color, you can quickly get if the features are able to \"isolate\" a class (if you can do this by eye, a Machine Learning classifier will be able too!)."
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "13b73980",
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Users/simone/PycharmProjects/ebooks/Getting started with TinyML on Arduino/venv/lib/python3.8/site-packages/sklearn/feature_selection/_univariate_selection.py:112: UserWarning: Features [ 27 28 30 36 37 38 39 40 81 82 83 84 85 86 87 126 127 128\n",
" 129 130 131 132] are constant.\n",
" warnings.warn(\"Features %s are constant.\" % constant_features_idx, UserWarning)\n",
"/Users/simone/PycharmProjects/ebooks/Getting started with TinyML on Arduino/venv/lib/python3.8/site-packages/sklearn/feature_selection/_univariate_selection.py:113: RuntimeWarning: invalid value encountered in true_divide\n",
" f = msb / msw\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAABeMAAAWHCAYAAADHhG3lAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOyddZRUR9qHn9tu093j7o67QwgOcQ9JNu72ZeMbt93sbnRjG9m4uxsBkhAILoOPu7u0Tdv3Rw0jzGAJMAPc55w+h66+fbt6qL636lfv+3slv9+PjIyMjIyMjIyMjIyMjIyMjIyMjIyMjMyhQ9HfHZCRkZGRkZGRkZGRkZGRkZGRkZGRkZE52pHFeBkZGRkZGRkZGRkZGRkZGRkZGRkZGZlDjCzGy8jIyMjIyMjIyMjIyMjIyMjIyMjIyBxiZDFeRkZGRkZGRkZGRkZGRkZGRkZGRkZG5hAji/EyMjIyMjIyMjIyMjIyMjIyMjIyMjIyhxhZjJeRkZGRkZGRkZGRkZGRkZGRkZGRkZE5xBw1Yvy8efP8gPyQH/3xOGDk8So/+ulxwMhjVX700+OAkceq/OinxwEjj1X50U+PA0Yeq/KjHx8HhDxW5Uc/Pg4IeazKj358yAwgjhoxvq6urr+7ICOz38jjVeZIQR6rMkcK8liVOVKQx6rMkYI8VmWOFOSxKnOkII9VGRkZOIrEeBkZGRkZGRkZGRkZGRkZGRkZGRkZGZmBSr+I8ZIkzZMkKVuSpDxJku7s4/U4SZJ+liRpoyRJmyVJWtAf/ZSRkZGRkZGRkZGRkZGRkZGRkZGRkZE5GKgO9wdKkqQEngdmA2XAWkmSvvL7/du7HXYP8JHf7/+vJEmDgO+AhMPdVxmZY5mSBjvLcmpZU1jPlNRQpqQEE2U19He3ZGRkgNzqVhbvqCG7qoU5gyOYkBREkFHb392SOQZwub1sKGni680VmLQqFgyNYHiMFUmS+rtrMjLHLDurWli0rZqiehvzh0QwLjEIi17T393qhdfnZ1NpI99ursTj83PisChGxFrRqORkbRkZmT9OYZ2NpTuqySprYkZGOJNSggkL0PV3t44KWh1u1hY18N2WSqIDDcwbEkFmpLm/uyUjc8Rz2MV4YByQ5/f7CwAkSfoAOAXoLsb7gV2/cAtQcVh7KCNzjNNga+e2j7NYXdgAwFdZlZw8IopHTxuKUdsflw0ZGZldlDTYufC1NVQ2OwH4YlMFd8xL5+rjkmVBVOaQs7qwgQtfW9P5/I0VRXx09QRGxAb2Y69kZI5dCmrbOO+V1TTY2gH4bEM5fz91COdPiO/nnvVmY2kj5760Co9P1JF7e1Ux710+nonJIf3cMxkZmSOVqmYHV729jpzqNkCsW6+YmsjtczNQyxt9f5pF26u55eOszudv/F7EJ1dPJDU8oB97JSNz5NMfqlo0UNrteRkwfrdjHgAWSZJ0A2AEZvV1IkmSrgSuBIiLizvoHZWROZgcSeM1v7aNtUUNnDIiivhgAxISy3JqKaqzMTja0t/dkznEHElj9VhkR2VLpxC/i2eX5nHS8ChiAkX2SqvTTVGdDYVCIiHEiFFzdG6iyWP18NLu8fLybwU927w+luyo6Tcx/kgZ6/JYlTlUbKtoIdSk5YIJcfj9UN3i4tmlucwaFE64+cAjQw/lWP1yYzlTUkM6smlgU0kTb68sZkJSsLyZLHPAyNfVo4+yRjs1LS6CjRriQ4z79Z6c6rZOIX4Xr68o4tyxcSSHmQ5FNw+YI2msujxeCmttOD1eQoxanvopu8frzQ43W8qbZTFeRuZPMjBXLLAQeMPv9z8hSdJE4G1Jkob4/X5f94P8fv/LwMsAY8aM8fdDP2X6CbfXxws/5zEhKZjxScH93Z394kgar16fn7/OTuOHrVV8uakChQSnjYzG6/Pt+80yRzxH0lg9FvH5ev+XeLz+zvaSejv3f7WVn7NrATh5RBR3zssgyqo/rP08HMhj9fDjcnt7t3n6595QXG/j/i+38UuOGOunjIjijgE61uWxKnOo0KuVjE8K4oWf8/H4/CSGGLlqWjI+/x8bZodyrGZGmimqt/OfJbkAHJcWyqj4QPx+kLV4mQNFvq4eXSzPreP69zfQZHdj1Ch57KzhzB0cgVKx94uDt495sdfv/8PXwEPBkTJWG2wuXvy1gP/9VoDPD4OizFw0OZG/f7ujx3F9/c1lZGQOjP4Q48uB2G7PYzraunMZMA/A7/evlCRJB4QANYelhzIDnjd/L+KzjeW8+XsRy++cgWGARsIdKVS3ONlQ3Eizw01auImkED3vrW5lW0ULAD4/fLqhnDmDIxjWz32VkTnWSY8IwGpQ02R3d7ZdOiWB6I6o+O+2VvJzdi3zhkQwOMqMy+NjY0kTAToVDbZ2vD4/MYF6NCplf30FmSMUjUrJ5VOTWFu0vrNNIcH8IRGsL24gt7oNi0HNyNhAIiyH3qv1uy1VnUI8wJebKpicHMLZY2P38i4ZmaMLhSQ2aW+YmYrH60OnVrKlvJlzxg2830G7x8eY+EBGxQUiSeK5AlDsQ2yTkZE5uilrtHcK8QC2di83fbCJb2+c0isCu7DOxvaKZrw+P5mRZhKC9USadVS2dGWNnjEqhrggudbZgbKptImXl3VlQG6vaGFCYiAPnDSIels7KqWCRpuLwVGyZ7yMzJ+lPxTMtUCqJEmJCBH+XOC83Y4pAWYCb0iSlAnogFpkZAC/3887q4q5dHIiX22q4Kft1ZwyIrq/u3XEUt5g56aPNrG2qBEQkUnPLRzZKcR3J7uqhbmDIw53F2VkZLqRFGri3cvH88GaErZVtHDG6BhmZYahVEh4vT4Wbatieloofj88sSin8313zEtHpQCH20+jrZ3LpyYRHTjwIohlBjZTUkL434VjeOP3QkxaFZdNSaKy2cF1721kVxDamPhAnjl3BFGBh24h7O4Y67uzLLdWFuNljin8wMbSps55m0KCuxZk0mhrH3DBKgatisd+zMbWLjJswgK0/N+s1H7ulYyMTH9T0+LqEWQCwgavstnZQ4zPrmrh/P+tpq5N1MgI0Kr45xlDuX1+OsX1Nn7JruOk4VHMGxyBVi0HnRwo2VU97X50agWxQUYe/nZHZzT88BgLV05N7o/uycgcVRz2ihZ+v98DXA/8COwAPvL7/dskSXpIkqSTOw67BbhCkqQs4H3gYr9/AOUZyfQrhXU22lweUsNMjIizsmSHnDDxZ9hY2tQpxAP4/fDItzv4Sx+Fv1LCZG84GZmBwOAoCw+dMoQPr5rI+ePjCTcLUV2pVDA5JYSR8YH8uJtQ+fTiXBrtHl5ZVkCYWct3Wyr7o+syRzhGrYpZg8J545JxPH/+aCKtOh75ZgfdZ2nriht73FcOBWqlgimpvYs+jksIOqSfKyMz0Gi0u3sEUPj88M6qYgbawsnj9fFrdk2nEA9Q0+qiercaKDIyMscewUYNRk1P8VypkAgN0PZoW7y9ulOIB2h1efhuSxUv/lKASqEgM8LECcMi5GCTP0hiSM8girmDI3hnVXEPW5qss
mY2lzcd5p7JyBx99Et5ab/f/53f70/z+/3Jfr//7x1t9/n9/q86/r3d7/dP9vv9w/1+/wi/37+oP/opMzD5Pb+eIdEWJEkiM8LMuqKG/u7SEU2Dvb1XW3WLk2ExFqK62QzMHxLB6DjrYeyZjMw+aLdBYxHYj81rgCRJqJW9b+OnjohG3UfKv8vjQ6mQaHV5KG1wsCynFnu753B09djCVgeNxeA+ugUmlVKBUiHRbHdT1dL7u1Y2Ow+5p+ipI6PJiOgqzjYmPpDUcBOrCuqpb3Md0s+W+QP4vNBYAi3yRuDBxO7qfR2vbHbS6hw41/cGWzuFtTaKG+y9Xtu9IHkP2mrF9dQj/55ljlL8fmguE4+jlMI6Gz/vrGFdcQMtjt7rToD4ECOPnTUcTce8VqmQePiUwSSH9izAml9r6/XeymYHoQFaXl1eiE6joqzBcfC/xFGOy+1lW3kzQUYN09NDO9vjgw1UNPW+Rrc43L3ajjoczWKd6Wrt757IHKUMrNxFGZn9YE1hA6kdEdqRVh3NDjc
"text/plain": [
"<Figure size 1518.75x1440 with 72 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"\"\"\"\n",
"Plot pairplot of features.\n",
"Feel free to open the image in a new window to see it at full scale.\n",
"In the next line:\n",
" - n is the number of points to plot (the greater the value, the longer it takes)\n",
" - k is the number of features (values greater than 10 become messy)\n",
"\"\"\"\n",
"feature_dataset.plot.features_pairplot(n=200, k=8)"
]
},
{
"cell_type": "markdown",
"id": "7d876a8b",
"metadata": {},
"source": [
"In this case, we can clearly see that while the *wio* class (red) and *empty* class (blue) are well clustered, the *pi* (orange) and *portenta* (green) are always mixed to some degree.\n",
"\n",
"This tells us that the classifier will mis-label them sometimes.\n",
"\n",
"Another kind of visualization is UMAP.\n",
"\n",
"[UMAP (Uniform Manifold Approximation and Projection)](https://umap-learn.readthedocs.io/en/latest/) is a dimensionality reduction algorithm.\n",
"\n",
"It takes a feature vector of length N and \"compresses\" it to, in our case, length 2, while trying to preserve the topology structure of the original vector.\n",
"\n",
"By collapsing the feature vectors to (x, y) pairs, we can plot them on a scatter plot."
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "713c6d7b",
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"OMP: Info #270: omp_set_nested routine deprecated, please use omp_set_max_active_levels instead.\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYcAAAEWCAYAAACNJFuYAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA7LElEQVR4nO3deXhU1fnA8e97sycEQtiCgKyyCAgCoigiiCjFHVxQW6UutAjWarVW6U+ttWpdirai1Fr3BVfqjqKIoCAIAoJhkz2sYSeQde77+2MmGDJZJslMZjJ5P8+Th8y5557zDpB5c8899xxRVYwxxpiSnHAHYIwxJvJYcjDGGOPHkoMxxhg/lhyMMcb4seRgjDHGjyUHY4wxfiw5GBMi4vW8iOwVkQXhjqc0ERkjIl+HOw4TmSw5mDpBRFREOpUqu1dEXvF9P9hXZ1qpOr185bNKlYuIrBORzDL6miUieSKSIyK7RORdEWlZjbAHAsOA1qrav4x+xoiIx9dPji+ecdXox5igs+Rgokk2MEBEmpQouwZYXUbdQUBzoIOInFTG8Qmq2gDoDKQBk6oRT1tgg6oeqqDOPFVt4OtrFPCwiJxYjb4qJCKxwW7TRDdLDiaaFAD/A0YDiEgMcDnwahl1rwHeAz72fV8mVd0DvAP0KOu4iBwjIu+LyB4R+UlEbvCVXwc8izdZ5YjIXyoLXlUXAyuAbiXav0BEfhSRfb4rmpLH/iQia0XkoIhkisjFJY6NEZFvRGSSiOwG7hWRJr5YD/iGuTqWqC++ujt9x5eJSJnv2dQP9tuEiTYv4f0tfzJwDrAc2FqygogkA5fgTSJJwL9F5FZVLSjdmIg0xfsb/eJy+pvq6+MYoCswQ0TWqup/RcQDXK+qAwMJ3HcF0xlY6HvdGXgduAiYBdwCfCAix/tiXQucDmwHLgVeEZFOqrrN1+TJvvhaAHHA80Ae0BJoD3wKrPfVPRvv1VRnYL/vvewLJG4TnezKwUQVVZ0LpItIF+BqvMmitJFAPvAZ8BHeD85zS9X5p4jsA5YC24BbSzciIm2A04A7VDVPVZfgvVq4ugohn+K7KjgILABeBtb4jl0OfKSqM1S1EHgUbzI71fde31LVrarqquobvvNK3tvYqqr/UtUivFdVo4C7VfWQqi4HXixRtxBIxZsURFVXlEgyph6y5GDqCg/eD/GS4vB+qJX2MjABGAJMK+P4NcCbqlqkqnl4h41KDy39TlXTVLWVql6lqtlltHMMsEdVD5Yo2wi0qvztHPGtr59UIAPoDjxQov2NxRVV1QU2F7cvIleLyBJfctmHd+iraYm2N5f4vhnekYKSZSXbngk8ifeKa6eIPCMiDavwPkyUseRg6opNQLtSZe0p8QFXwsvAjcDHqnq45AERaQ2cCfxSRLaLyHa8Q0wjfENIVbEV71VKaomyY4EtVWwHAFXdgTdRnV+i/bYlYhegDbBFRNoC/8GbBJuoahre4S0p2WSJ77OBIt/5JWMt2f8/VbUvcDze4aXbq/M+THSw5GDqijeAP4tIaxFxROQsvB+ib5euqKrrgTOAiWW08yu8s5e6AL19X52BLOCKqgSkqpuBucCDIpIoIicA1wGvVKWdYr5ZVhcDP/qK3gTOFZGhIhIH/AHvcNhcIAXvh3+279xfU85Nc1+sHuBdvDemk0XkeEpcLYnISSJysq+fQ3jvTbjVeR8mOlhyMHXFfXg/FL8G9gIPA1f5xs79qOrXqrq1jEPXAE+p6vaSX8AUKpi1VIEr8F7RbMU7hHWPqn5ehfOLZzPl4J2plA3c5HsPq4BfAv8CduFNhueraoGqZgKPAfOAHUBP4JtK+poANMB7A/sFvDeoizXEeyWyF+/V2G7gkSq8DxNlxDb7McYYU5pdORhjjPFjycEYY4wfSw7GGGP8WHIwxhjjJyqWz2jatKm2a9cu3GEYY0ydsmjRol2q2qysY1GRHNq1a8fChQvDHYYxxtQpIlLWQ6SADSsZY4wpgyUHY4wxfiw5GGOM8RMV9xyMMaa2FBYWkpWVRV5eXrhDCVhiYiKtW7cmLq70wsbls+Rgjigs8vCfj+eRkhDPr4adhONI5ScZU89kZWWRmppKu3bt8C6UG9lUld27d5OVlUX79u0DPs+SgwHgsr++yE9b9xx5/cT/vqFX+wye/2OVFio1Jurl5eXVmcQAICI0adKE7OyytiQpn91zMLz2xaKjEkOxpeu388WiVWGIyJjIVlcSQ7HqxGvJwfDo27PLPfbHZz+uxUiMMZHCkoOpkC3obkxgtm/fzujRo+nYsSN9+/ZlxIgRrF69mh49yt2DKaLZPQeDI+CWkwXsnrQxlVNVLr74Yq655hqmTp0KwNKlS9mxY0eYI6s+u3IwTLl5VLnH/n7DubUYiTF105dffklcXBy//e1vj5T16tWLNm1+3rJ7w4YNnH766fTp04c+ffowd+5cALZt28agQYPo3bs3PXr0YM6cOXg8HsaMGUOPHj3o2bMnkyZNAmDt2rUMHz6cvn37cvrpp7Ny5UoA3nrrLXr06EGvXr0YNGhQUN6TXTkY+nU5lgsHHM978zKPKj+zdweGntg5TFGZ2pC5L4sf9m2kTUpTTm3auc7daI0Uy5cvp2/fvhXWad68OTNmzCAxMZE1a9ZwxRVXsHDhQl577TXOOeccJk6ciMfj4fDhwyxZsoQtW7awfLl3F9x9+/YBMHbsWKZMmcJxxx3H/PnzufHGG5k5cyb33Xcfn376Ka1atTpSt6YsORgA7rn6HO4YPZT5KzZSUOThlOPbkpqUEO6wTIjkFhUwes7jbMvfd1T55W0G8Ifu54cnqChXWFjIhAkTWLJkCTExMaxevRqAk046iWuvvZbCwkIuuugievfuTYcOHVi3bh033XQT5557LmeffTY5OTnMnTuXSy+99Eib+fn5AJx22mmMGTOGyy67jJEjRwYlXhtWMkfszTnM/JUbeefrH/hq6dpwh2NC6O6lb/glBoA3Ns/j3Y3zaz+gOq579+4sWrSowjqTJk2iRYsWLF26lIULF1JQUADAoEGDmD17Nq1atWLMmDG89NJLNG7cmKVLlzJ48GCmTJnC9ddfj+u6pKWlsWTJkiNfK1asAGDKlCncf//9bN68mb59+7J79+4avydLDgaA12d+z7kT/8vUWUtZsHIzd7/4KX3GTWL1pp3hDs2EwFfZK8o99tCK99h8qOYfLvXJmWeeSX5+Ps8888yRsh9++IHNmzcfeb1//35atmyJ4zi8/PLLeDweADZu3EiLFi244YYbuP766/n+++/ZtWsXrusyatQo7r//fr7//nsaNmxI+/bteeuttwDvTfClS5cC3nsRJ598Mvfddx/NmjU7qt/qsmElQ/b+HB5566syj41+8FW+fHQcjVISazkqEyrbc/dVWufSOY+RJHEc1kJaJqTxRL8xtE1tHvrg6igRYdq0afz+97/n73//O4mJibRr147HH3/8SJ0bb7yRUaNG8dJLLzF8+HBSUlIAmDVrFo888ghxcXE0aNCAl156iS1btvDrX/8a1
3UBePDBBwF49dVXGTduHPfffz+FhYWMHj2aXr16cfvtt7NmzRpUlaFDh9KrV6+avyfVuj+TvV+/fmqb/VTfS58t5PFpc8o9nhjn8M0Tv7OblVHiurmTWXZgS5XP65nWhv+eMi4EEdUtK1asoFu3buEOo8rKiltEFqlqv7Lq27CSYeaSNRUezyt0ueD/nqulaEwoFbmeaiUGgGX7NnP7opeDHJGJVJYcTEBPQW/ZfYADh+rOEsWmbM+v/bJG53+VvYK3Ns4LUjQmkllyMFx4yvEB1Vu6dmuIIzGh9sq68ocPA/XYig9Ze3B7EKIxkcySgyEhPj6geht27g1xJCaUilwPuVpY43ZclDHznmZn3oEgRGUilc1WMmRuDmz9l8Q4++9Sl72/OXiTNvLdQkbNeoQujVqxLmcn8TExDGnRnXHHnU3D+OSg9WPCJ6xXDiLynIjsFJHlJcruFZEtIrLE9zUinDHWB22bpwVUb2DPwHeRMpHn6TWfBbW9fDz8sH8TOZ489hQc4p3NCzhr5v18unV
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"\"\"\"\n",
"Plot UMAP of features\n",
"If features are discriminative, we should see well defined clusters of points\n",
"\"\"\"\n",
"feature_dataset.plot.umap()"
]
},
{
"cell_type": "markdown",
"id": "017bb7b6",
"metadata": {},
"source": [
"If we see well defined cluster of points (as in the above image), it means that our features do a great job at describing each class.\n",
"\n",
"<x-alerts.info>If you feel like the pairplot and the UMAP \"disagree\", it is true only to some extent. UMAP applies heavy lifting to data to preserve the cluster isolation; most Machine Learning models won't do so. You should give more importance to the pairplot in the context of TinyML</x-alerts.info>"
]
},
{
"cell_type": "markdown",
"id": "c4c9ea72",
"metadata": {},
"source": [
"## Step 4 of 5: Train a Machine Learning classifier\n",
"\n",
"From the above graphics we can say that our features are pretty good at characterizing our data, so it is time to train a classifier.\n",
"\n",
"There are many available, but one of the most effective is [Random Forest](https://en.wikipedia.org/wiki/Random_forest). You can tweak its configuration as you prefer, but the values below should work fine in most cases."
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "500dde6f",
"metadata": {
"scrolled": true
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Score on test set: 1.00\n"
]
},
{
"data": {
"text/plain": [
"RandomForestClassifier(base_estimator=DecisionTreeClassifier(), bootstrap=True, ccp_alpha=0.0, class_name=RandomForestClassifier, class_weight=None, criterion=gini, estimator_params=('criterion', 'max_depth', 'min_samples_split', 'min_samples_leaf', 'min_weight_fraction_leaf', 'max_features', 'max_leaf_nodes', 'min_impurity_decrease', 'random_state', 'ccp_alpha'), max_depth=40, max_features=auto, max_leaf_nodes=None, max_samples=None, min_impurity_decrease=0.0, min_samples_leaf=1, min_samples_split=2, min_weight_fraction_leaf=0.0, n_estimators=10, n_jobs=None, num_outputs=4, oob_score=False, package_name=everywhereml.sklearn.ensemble, random_state=None, template_folder=everywhereml/sklearn/ensemble, verbose=0, warm_start=False)"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"\"\"\"\n",
"Create and fit RandomForest classifier\n",
"\"\"\"\n",
"from everywhereml.sklearn.ensemble import RandomForestClassifier\n",
"\n",
"clf = RandomForestClassifier(n_estimators=10, max_depth=40)\n",
"\n",
"# fit on train split and get accuracy on the test split\n",
"train, test = feature_dataset.split(test_size=0.3)\n",
"clf.fit(train)\n",
"\n",
"print('Score on test set: %.2f' % clf.score(test))\n",
"\n",
"# now fit on the whole dataset\n",
"clf.fit(feature_dataset)"
]
},
{
"cell_type": "markdown",
"id": "64446bf5",
"metadata": {},
"source": [
"Depending on your dataset, you can expect your accuracy to range from 0.7 to 1.\n",
"\n",
"If it is lower (or too low for your use case), you can:\n",
"\n",
" 1. improve your dataset (collect more images, fix your setup)\n",
" 2. tweak the `resize` parameter of the HogPipeline to an higher resolution\n",
" 3. tweak the `RandomForestClassifier` parameters ([see documentation](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html))\n",
" \n",
"If you're satisfied, it's time to port the whole system to your Esp32-cam."
]
},
{
"cell_type": "markdown",
"id": "6d1ec4bb",
"metadata": {},
"source": [
"## Step 5 of 5: Port to Esp32\n",
"\n",
"Last step is to convert the `HogPipeline` and `RandomForestClassifier` to C++ code that can run on your Esp32-cam.\n",
"\n",
"This process is very straightforward, since you only need a line of code.\n",
"\n",
"Create a new project in the Arduino IDE to hold all the following files."
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "22b5edba",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"#ifndef UUID5853197456\n",
"#define UUID5853197456\n",
"\n",
"\n",
" #ifndef UUID5853199664\n",
"#define UUID5853199664\n",
"\n",
"/**\n",
" * HOG(block_size=8, bins=9, cell_size=3)\n",
" */\n",
"class HOG {\n",
" public:\n",
"\n",
" /**\n",
" * Transform input image\n",
" */\n",
" template<typename T, typename U>\n",
" bool transform(T *input, U *output) {\n",
" \n",
" uint16_t f = 0;\n",
" uint16_t block = 0;\n",
" float hog[135] = {0};\n",
"\n",
" // compute gradients\n",
" for (uint16_t blockY = 0; blockY < 3; blockY++) {\n",
" const uint16_t blockOffsetY = blockY * 320;\n",
"\n",
" for (uint16_t blockX = 0; blockX < 5; blockX++) {\n",
" const uint16_t blockOffsetX = blockX * 8;\n",
" float hist[9] = {0};\n",
"\n",
" for (uint16_t _y = 1; _y < 7; _y += 1) {\n",
" const uint16_t rowOffset = blockOffsetY + _y * 40 + blockOffsetX;\n",
" const uint16_t rowOffsetBefore = rowOffset - 40;\n",
" const uint16_t rowOffsetAfter = rowOffset + 40;\n",
"\n",
" for (uint16_t _x = 1; _x < 7; _x += 1) {\n",
" const uint16_t offset = rowOffset + _x;\n",
" const uint16_t offsetBefore = rowOffsetBefore + _x;\n",
" const uint16_t offsetAfter = rowOffsetAfter + _x;\n",
" const float gy = input[offsetAfter] - input[offsetBefore];\n",
" const float gx = input[offset + 1] - input[offset - 1];\n",
" const float g = sqrt(gy * gy + gx * gx);\n",
" uint8_t angle = abs(this->arctan(gy, gx) * 180 / 3.141592653589793f / 20);\n",
"\n",
" if (angle >= 8) angle = 8;\n",
" hist[angle] += g;\n",
" }\n",
" }\n",
"\n",
" for (uint16_t i = 0; i < 9; i++)\n",
" hog[f++] = hist[i];\n",
"\n",
" block += 1;\n",
"\n",
" // end of cell, normalize\n",
" if ((block % 3) == 0) {\n",
" const uint16_t offset = (block - 3) * 9;\n",
" float maxGradient = 0.0001;\n",
"\n",
" for (uint16_t i = 0; i < 27; i++) {\n",
" const float h = hog[offset + i];\n",
"\n",
" if (h > maxGradient)\n",
" maxGradient = h;\n",
" }\n",
"\n",
" for (uint16_t i = 0; i < 27; i++) {\n",
" hog[offset + i] /= maxGradient;\n",
" }\n",
"\n",
" maxGradient = 0.0001;\n",
" }\n",
" }\n",
" }\n",
"\n",
"\n",
" // copy over\n",
" for (uint16_t i = 0; i < 135; i++)\n",
" output[i] = hog[i];\n",
"\n",
"\n",
" return true;\n",
" }\n",
"\n",
" protected:\n",
" \n",
"\n",
" /**\n",
" * optional atan2 approximation for faster calculation\n",
" */\n",
" float arctan(float y, float x) {\n",
" \n",
" float r = 0;\n",
"\n",
" if (abs(y) < 0.00000001)\n",
" return 0;\n",
" else if (abs(x) < 0.00000001)\n",
" return 3.14159274 * (y > 0 ? 1 : -1);\n",
" else {\n",
" float a = min(abs(x), abs(y)) / max(abs(x), abs(y));\n",
" float s = a * a;\n",
" r = ((-0.0464964749 * s + 0.15931422) * s - 0.327622764) * s * a + a;\n",
"\n",
" if (abs(y) > abs(x))\n",
" r = 1.57079637 - r;\n",
" }\n",
"\n",
" if (x < 0)\n",
" r = 3.14159274 - r;\n",
" if (y < 0)\n",
" r = -r;\n",
"\n",
" return r;\n",
" \n",
" }\n",
"\n",
"\n",
"};\n",
"\n",
"\n",
"\n",
"#endif\n",
"\n",
"\n",
"/**\n",
" * ImagePipeline: HogPipeline\n",
" * ---------\n",
" * - Resize(from=((160, 120)), to=(40, 30), pixformat=gray)\n",
" * > HOG(block_size=8, bins=9, cell_size=3)\n",
" */\n",
"class HogPipeline {\n",
" public:\n",
" static const size_t NUM_INPUTS = 1200;\n",
" static const size_t NUM_OUTPUTS = 135;\n",
" static const size_t WORKING_SIZE = 135;\n",
" float features[135];\n",
"\n",
" /**\n",
" * Extract features from input image\n",
" */\n",
" template<typename T>\n",
" bool transform(T *input) {\n",
" time_t start = micros();\n",
" ok = true;\n",
"\n",
" preprocess(input);\n",
"\n",
" \n",
" \n",
" ok = ok && hog.transform(input, features);\n",
" \n",
" \n",
"\n",
" latency = micros() - start;\n",
"\n",
" return ok;\n",
" }\n",
"\n",
" /**\n",
" * Debug output feature vector\n",
" */\n",
" template<typename PrinterInterface>\n",
" void debugTo(PrinterInterface &printer, uint8_t precision=5) {\n",
" printer.print(features[0], precision);\n",
"\n",
" for (uint16_t i = 1; i < 135; i++) {\n",
" printer.print(\", \");\n",
" printer.print(features[i], precision);\n",
" }\n",
"\n",
" printer.print('\\n');\n",
" }\n",
"\n",
" /**\n",
" * Get latency in micros\n",
" */\n",
"uint32_t latencyInMicros() {\n",
" return latency;\n",
"}\n",
"\n",
"/**\n",
" * Get latency in millis\n",
" */\n",
"uint16_t latencyInMillis() {\n",
" return latency / 1000;\n",
"}\n",
"\n",
" protected:\n",
" bool ok;\n",
" time_t latency;\n",
" \n",
" HOG hog;\n",
" \n",
"\n",
" template<typename T>\n",
" void preprocess(T *input) {\n",
" \n",
" \n",
" // grayscale rescaling\n",
" const float dy = 4.0f;\n",
" const float dx = 4.0f;\n",
"\n",
" for (uint16_t y = 0; y < 30; y++) {\n",
" const size_t sourceOffset = round(y * dy) * 160;\n",
" const size_t destOffset = y * 40;\n",
"\n",
" for (uint16_t x = 0; x < 40; x++)\n",
" input[destOffset + x] = input[sourceOffset + ((uint16_t) (x * dx))];\n",
" }\n",
"\n",
" \n",
" }\n",
"};\n",
"\n",
"\n",
"static HogPipeline hog;\n",
"\n",
"\n",
"#endif\n"
]
}
],
"source": [
"\"\"\"\n",
"Export pipeline to C++\n",
"Replace the path to your actual sketch path\n",
"\"\"\"\n",
"print(pipeline.to_arduino_file(\n",
" filename='path-to-sketch/HogPipeline.h',\n",
" instance_name='hog'\n",
"))"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "993ecf96",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"#ifndef UUID5828661360\n",
"#define UUID5828661360\n",
"\n",
"/**\n",
" * RandomForestClassifier(base_estimator=DecisionTreeClassifier(), bootstrap=True, ccp_alpha=0.0, class_name=RandomForestClassifier, class_weight=None, criterion=gini, estimator_params=('criterion', 'max_depth', 'min_samples_split', 'min_samples_leaf', 'min_weight_fraction_leaf', 'max_features', 'max_leaf_nodes', 'min_impurity_decrease', 'random_state', 'ccp_alpha'), max_depth=40, max_features=auto, max_leaf_nodes=None, max_samples=None, min_impurity_decrease=0.0, min_samples_leaf=1, min_samples_split=2, min_weight_fraction_leaf=0.0, n_estimators=10, n_jobs=None, num_outputs=4, oob_score=False, package_name=everywhereml.sklearn.ensemble, random_state=None, template_folder=everywhereml/sklearn/ensemble, verbose=0, warm_start=False)\n",
" */\n",
"class RandomForestClassifier {\n",
" public:\n",
"\n",
" /**\n",
" * Predict class from features\n",
" */\n",
" int predict(float *x) {\n",
" int predictedValue = 0;\n",
" size_t startedAt = micros();\n",
"\n",
" \n",
" uint16_t votes[4] = { 0 };\n",
" uint8_t classIdx = 0;\n",
" float classScore = 0;\n",
"\n",
" \n",
" tree0(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree1(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree2(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree3(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree4(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree5(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree6(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree7(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree8(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
" tree9(x, &classIdx, &classScore);\n",
" votes[classIdx] += classScore;\n",
" \n",
"\n",
" // return argmax of votes\n",
"uint8_t maxClassIdx = 0;\n",
"float maxVote = votes[0];\n",
"\n",
"for (uint8_t i = 1; i < 4; i++) {\n",
" if (votes[i] > maxVote) {\n",
" maxClassIdx = i;\n",
" maxVote = votes[i];\n",
" }\n",
"}\n",
"\n",
"predictedValue = maxClassIdx;\n",
"\n",
"\n",
" latency = micros() - startedAt;\n",
"\n",
" return (lastPrediction = predictedValue);\n",
" }\n",
"\n",
"\n",
" \n",
"\n",
"/**\n",
" * Predict class label\n",
" */\n",
"String predictLabel(float *x) {\n",
" return getLabelOf(predict(x));\n",
"}\n",
"\n",
"/**\n",
" * Get label of last prediction\n",
" */\n",
"String getLabel() {\n",
" return getLabelOf(lastPrediction);\n",
"}\n",
"\n",
"/**\n",
" * Get label of given class\n",
" */\n",
"String getLabelOf(int8_t idx) {\n",
" switch (idx) {\n",
" case -1:\n",
" return \"ERROR\";\n",
" \n",
" case 0:\n",
" return \"empty\";\n",
" \n",
" case 1:\n",
" return \"pi\";\n",
" \n",
" case 2:\n",
" return \"portenta\";\n",
" \n",
" case 3:\n",
" return \"wio\";\n",
" \n",
" default:\n",
" return \"UNKNOWN\";\n",
" }\n",
"}\n",
"\n",
"\n",
" /**\n",
" * Get latency in micros\n",
" */\n",
"uint32_t latencyInMicros() {\n",
" return latency;\n",
"}\n",
"\n",
"/**\n",
" * Get latency in millis\n",
" */\n",
"uint16_t latencyInMillis() {\n",
" return latency / 1000;\n",
"}\n",
"\n",
" protected:\n",
" float latency = 0;\n",
" int lastPrediction = 0;\n",
"\n",
" \n",
"\n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #0\n",
" */\n",
" void tree0(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[18] <= 0.0369415208697319) {\n",
" \n",
" \n",
" if (x[78] <= 0.1300949603319168) {\n",
" \n",
" \n",
" if (x[90] <= 0.02731012087315321) {\n",
" \n",
" \n",
" if (x[63] <= 0.8730830252170563) {\n",
" \n",
" \n",
" if (x[113] <= 0.26372766494750977) {\n",
" \n",
" \n",
" if (x[98] <= 0.022605867125093937) {\n",
" \n",
" \n",
" if (x[34] <= 0.2811869978904724) {\n",
" \n",
" \n",
" if (x[55] <= 0.07885900512337685) {\n",
" \n",
" \n",
" if (x[92] <= 0.3012963682413101) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[134] <= 0.28155651688575745) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[78] <= 0.02770218253135681) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[116] <= 0.00469584483653307) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[106] <= 0.11038459464907646) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 523.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[114] <= 0.0355784073472023) {\n",
" \n",
" \n",
" if (x[99] <= 0.0847080098465085) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 523.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[63] <= 0.5556537806987762) {\n",
" \n",
" \n",
" if (x[112] <= 0.3595939725637436) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[48] <= 0.05262908060103655) {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[112] <= 0.38147011026740074) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 523.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #1\n",
" */\n",
" void tree1(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[66] <= 0.03470461815595627) {\n",
" \n",
" \n",
" if (x[92] <= 0.4885159134864807) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[54] <= 0.2060672491788864) {\n",
" \n",
" \n",
" if (x[92] <= 0.600633054971695) {\n",
" \n",
" \n",
" if (x[79] <= 0.23793259263038635) {\n",
" \n",
" \n",
" if (x[56] <= 0.20621951669454575) {\n",
" \n",
" \n",
" if (x[59] <= 0.07552803680300713) {\n",
" \n",
" \n",
" if (x[79] <= 0.157677061855793) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[113] <= 0.07983606681227684) {\n",
" \n",
" \n",
" if (x[53] <= 0.543332826346159) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[63] <= 0.2800687700510025) {\n",
" \n",
" \n",
" if (x[93] <= 0.05621062591671944) {\n",
" \n",
" \n",
" if (x[42] <= 0.08418368548154831) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[123] <= 0.09437189996242523) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.005484993336722255) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[2] <= 0.08067275956273079) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[80] <= 0.3647053986787796) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[20] <= 0.5538146793842316) {\n",
" \n",
" \n",
" if (x[49] <= 0.05982496030628681) {\n",
" \n",
" \n",
" if (x[56] <= 0.2696080878376961) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[70] <= 0.11429519951343536) {\n",
" \n",
" \n",
" if (x[114] <= 0.16653749346733093) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[123] <= 0.032165540382266045) {\n",
" \n",
" \n",
" if (x[103] <= 0.4637199342250824) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[2] <= 0.04892366752028465) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[6] <= 0.018453402444720268) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[116] <= 0.09328529983758926) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[59] <= 0.09137589856982231) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[61] <= 0.19741814583539963) {\n",
" \n",
" \n",
" if (x[79] <= 0.38941070437431335) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[65] <= 0.36613909900188446) {\n",
" \n",
" \n",
" if (x[67] <= 0.7522072196006775) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 518.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 466.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 488.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #2\n",
" */\n",
" void tree2(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[133] <= 0.45472584664821625) {\n",
" \n",
" \n",
" if (x[57] <= 0.30642061680555344) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 459.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[94] <= 0.013848140370100737) {\n",
" \n",
" \n",
" if (x[80] <= 0.9212180376052856) {\n",
" \n",
" \n",
" if (x[116] <= 0.09623592719435692) {\n",
" \n",
" \n",
" if (x[55] <= 0.05950098857283592) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 448.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[103] <= 0.10040346160531044) {\n",
" \n",
" \n",
" if (x[122] <= 0.02093493938446045) {\n",
" \n",
" \n",
" if (x[67] <= 0.5244172215461731) {\n",
" \n",
" \n",
" if (x[23] <= 0.3314560502767563) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 459.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[45] <= 0.210725799202919) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 448.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[116] <= 0.09688782319426537) {\n",
" \n",
" \n",
" if (x[99] <= 0.15070531517267227) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[104] <= 0.004858243744820356) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[107] <= 0.043371833860874176) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[90] <= 0.02459509950131178) {\n",
" \n",
" \n",
" if (x[133] <= 0.8430708050727844) {\n",
" \n",
" \n",
" if (x[35] <= 0.0420481413602829) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[109] <= 0.18476517498493195) {\n",
" \n",
" \n",
" if (x[56] <= 0.2247144728899002) {\n",
" \n",
" \n",
" if (x[116] <= 0.0034325255546718836) {\n",
" \n",
" \n",
" if (x[79] <= 0.15541844815015793) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.000795243657194078) {\n",
" \n",
" \n",
" if (x[33] <= 0.5699660181999207) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[88] <= 0.20550961047410965) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[74] <= 0.0708680022507906) {\n",
" \n",
" \n",
" if (x[116] <= 0.11245472729206085) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[55] <= 0.056950220838189125) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 511.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 560.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #3\n",
" */\n",
" void tree3(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[80] <= 0.9138846397399902) {\n",
" \n",
" \n",
" if (x[102] <= 0.10483590140938759) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 487.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[97] <= 0.01703598815947771) {\n",
" \n",
" \n",
" if (x[75] <= 0.023341485299170017) {\n",
" \n",
" \n",
" if (x[70] <= 0.1384853944182396) {\n",
" \n",
" \n",
" if (x[101] <= 0.2657548785209656) {\n",
" \n",
" \n",
" if (x[78] <= 0.00982948113232851) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[58] <= 0.9875586926937103) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[22] <= 0.18773172795772552) {\n",
" \n",
" \n",
" if (x[95] <= 0.019351176917552948) {\n",
" \n",
" \n",
" if (x[11] <= 0.01873837038874626) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[110] <= 0.0485067144036293) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[54] <= 0.033164238557219505) {\n",
" \n",
" \n",
" if (x[80] <= 0.3718045800924301) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[124] <= 0.9936124086380005) {\n",
" \n",
" \n",
" if (x[73] <= 0.0703774020075798) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[47] <= 0.12305829674005508) {\n",
" \n",
" \n",
" if (x[90] <= 0.008263786789029837) {\n",
" \n",
" \n",
" if (x[98] <= 0.009134478168562055) {\n",
" \n",
" \n",
" if (x[96] <= 0.03772225510329008) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[64] <= 0.18797582015395164) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 516.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #4\n",
" */\n",
" void tree4(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[65] <= 0.4568137377500534) {\n",
" \n",
" \n",
" if (x[111] <= 0.04640892706811428) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 483.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[92] <= 0.5991251170635223) {\n",
" \n",
" \n",
" if (x[116] <= 0.08080089464783669) {\n",
" \n",
" \n",
" if (x[46] <= 0.1407422199845314) {\n",
" \n",
" \n",
" if (x[107] <= 0.10579070821404457) {\n",
" \n",
" \n",
" if (x[79] <= 0.19204548746347427) {\n",
" \n",
" \n",
" if (x[97] <= 0.03240184811875224) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[54] <= 0.015389987733215094) {\n",
" \n",
" \n",
" if (x[90] <= 0.02292781602591276) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[55] <= 0.10951834544539452) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[90] <= 0.007291872752830386) {\n",
" \n",
" \n",
" if (x[114] <= 0.05278712324798107) {\n",
" \n",
" \n",
" if (x[5] <= 0.019473688676953316) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[61] <= 0.06436269357800484) {\n",
" \n",
" \n",
" if (x[91] <= 0.004494044464081526) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[55] <= 0.08292442187666893) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[67] <= 0.795905202627182) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[80] <= 0.5234813690185547) {\n",
" \n",
" \n",
" if (x[21] <= 0.049780791625380516) {\n",
" \n",
" \n",
" if (x[46] <= 0.15785471722483635) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[116] <= 0.06707945093512535) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[117] <= 0.07842132449150085) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 528.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 489.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #5\n",
" */\n",
" void tree5(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[101] <= 0.052058856934309006) {\n",
" \n",
" \n",
" if (x[108] <= 0.5930112600326538) {\n",
" \n",
" \n",
" if (x[80] <= 0.48694971203804016) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 478.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[78] <= 0.12165448069572449) {\n",
" \n",
" \n",
" if (x[64] <= 0.13614977896213531) {\n",
" \n",
" \n",
" if (x[116] <= 0.0720963291823864) {\n",
" \n",
" \n",
" if (x[125] <= 0.006470485590398312) {\n",
" \n",
" \n",
" if (x[102] <= 0.5091452896595001) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[133] <= 0.8621841073036194) {\n",
" \n",
" \n",
" if (x[107] <= 0.10208133608102798) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[65] <= 0.16045310348272324) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[108] <= 0.20820708572864532) {\n",
" \n",
" \n",
" if (x[10] <= 0.01630954770371318) {\n",
" \n",
" \n",
" if (x[47] <= 0.1604420617222786) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.01953808404505253) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[55] <= 0.06958340853452682) {\n",
" \n",
" \n",
" if (x[111] <= 0.8284404277801514) {\n",
" \n",
" \n",
" if (x[95] <= 0.3986862450838089) {\n",
" \n",
" \n",
" if (x[94] <= 0.01923585683107376) {\n",
" \n",
" \n",
" if (x[102] <= 0.967304915189743) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[54] <= 0.4388955533504486) {\n",
" \n",
" \n",
" if (x[75] <= 0.01729480642825365) {\n",
" \n",
" \n",
" if (x[116] <= 0.13019520416855812) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[67] <= 0.38635003566741943) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[68] <= 0.3347695767879486) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[44] <= 0.9923345446586609) {\n",
" \n",
" \n",
" if (x[103] <= 0.1337764412164688) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[96] <= 0.0030480041168630123) {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 486.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[103] <= 0.19916509091854095) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 506.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 508.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #6\n",
" */\n",
" void tree6(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[102] <= 0.11687179654836655) {\n",
" \n",
" \n",
" if (x[64] <= 0.17842888087034225) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 471.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[61] <= 0.037651170045137405) {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 456.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[70] <= 0.06084801256656647) {\n",
" \n",
" \n",
" if (x[116] <= 0.07822248339653015) {\n",
" \n",
" \n",
" if (x[49] <= 0.38092613220214844) {\n",
" \n",
" \n",
" if (x[120] <= 0.013017720077186823) {\n",
" \n",
" \n",
" if (x[46] <= 0.15880535542964935) {\n",
" \n",
" \n",
" if (x[103] <= 0.6758613288402557) {\n",
" \n",
" \n",
" if (x[97] <= 0.028888202272355556) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[59] <= 0.06674952432513237) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[101] <= 0.16203821450471878) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[117] <= 0.29609745740890503) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[63] <= 0.011694431537762284) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.008444164413958788) {\n",
" \n",
" \n",
" if (x[55] <= 0.08139839395880699) {\n",
" \n",
" \n",
" if (x[46] <= 0.06833056174218655) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[71] <= 0.01715219486504793) {\n",
" \n",
" \n",
" if (x[71] <= 0.007186424918472767) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[54] <= 0.023847054690122604) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[102] <= 0.18921969085931778) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 555.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #7\n",
" */\n",
" void tree7(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[114] <= 0.03222952038049698) {\n",
" \n",
" \n",
" if (x[64] <= 0.17848213016986847) {\n",
" \n",
" \n",
" if (x[103] <= 0.20130645111203194) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 504.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.011821059044450521) {\n",
" \n",
" \n",
" if (x[80] <= 0.35860520601272583) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[67] <= 0.3444618433713913) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[101] <= 0.4114457815885544) {\n",
" \n",
" \n",
" if (x[43] <= 0.07745068892836571) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[59] <= 0.19998691976070404) {\n",
" \n",
" \n",
" if (x[42] <= 0.09131728485226631) {\n",
" \n",
" \n",
" if (x[7] <= 0.07340359315276146) {\n",
" \n",
" \n",
" if (x[104] <= 0.05308664217591286) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[56] <= 0.41088466346263885) {\n",
" \n",
" \n",
" if (x[47] <= 0.14968156814575195) {\n",
" \n",
" \n",
" if (x[7] <= 0.2603675201535225) {\n",
" \n",
" \n",
" if (x[99] <= 0.004318844527006149) {\n",
" \n",
" \n",
" if (x[41] <= 0.019454671069979668) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[21] <= 0.1030389778316021) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[111] <= 0.551390528678894) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 479.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[125] <= 0.11612779647111893) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 481.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 514.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #8\n",
" */\n",
" void tree8(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[101] <= 0.05232098326086998) {\n",
" \n",
" \n",
" if (x[108] <= 0.5930112600326538) {\n",
" \n",
" \n",
" if (x[42] <= 0.05685785599052906) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 493.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[49] <= 0.008799195755273104) {\n",
" \n",
" \n",
" if (x[67] <= 0.7265111804008484) {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 484.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[78] <= 0.018843566067516804) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[92] <= 0.5997764766216278) {\n",
" \n",
" \n",
" if (x[80] <= 0.37235601246356964) {\n",
" \n",
" \n",
" if (x[57] <= 0.014879327500239015) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 493.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[107] <= 0.004320872016251087) {\n",
" \n",
" \n",
" if (x[114] <= 0.0745684988796711) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.022634425200521946) {\n",
" \n",
" \n",
" if (x[68] <= 0.10072548314929008) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[49] <= 0.24995308369398117) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[43] <= 0.25109250098466873) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[111] <= 0.30603039264678955) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[94] <= 0.30125169456005096) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[90] <= 0.011206488590687513) {\n",
" \n",
" \n",
" if (x[88] <= 0.07520439848303795) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[45] <= 0.09221789240837097) {\n",
" \n",
" \n",
" if (x[60] <= 0.40736718475818634) {\n",
" \n",
" \n",
" if (x[61] <= 0.2641521543264389) {\n",
" \n",
" \n",
" if (x[80] <= 0.5720258057117462) {\n",
" \n",
" \n",
" if (x[47] <= 0.21448105573654175) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[65] <= 0.2080097794532776) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 496.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
" \n",
" /**\n",
" * Random forest's tree #9\n",
" */\n",
" void tree9(float *x, uint8_t *classIdx, float *classScore) {\n",
" \n",
" if (x[112] <= 0.0743429847061634) {\n",
" \n",
" \n",
" *classIdx = 3;\n",
" *classScore = 467.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[110] <= 0.28466828167438507) {\n",
" \n",
" \n",
" if (x[116] <= 0.08032038062810898) {\n",
" \n",
" \n",
" if (x[93] <= 0.11881325766444206) {\n",
" \n",
" \n",
" if (x[92] <= 0.9340429902076721) {\n",
" \n",
" \n",
" if (x[110] <= 0.00503910006955266) {\n",
" \n",
" \n",
" if (x[15] <= 0.025954012759029865) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[98] <= 0.017721661366522312) {\n",
" \n",
" \n",
" if (x[56] <= 0.1771697998046875) {\n",
" \n",
" \n",
" if (x[72] <= 0.01185616571456194) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[58] <= 0.15649032592773438) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[58] <= 0.34245556592941284) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[56] <= 0.2746366709470749) {\n",
" \n",
" \n",
" if (x[114] <= 0.2599753737449646) {\n",
" \n",
" \n",
" if (x[23] <= 0.7012089490890503) {\n",
" \n",
" \n",
" if (x[94] <= 0.012615592684596777) {\n",
" \n",
" \n",
" if (x[59] <= 0.19462940841913223) {\n",
" \n",
" \n",
" if (x[43] <= 0.1349761039018631) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[64] <= 0.36634698510169983) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[103] <= 0.06138048693537712) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[124] <= 0.9182375073432922) {\n",
" \n",
" \n",
" *classIdx = 2;\n",
" *classScore = 513.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" if (x[21] <= 0.03671480901539326) {\n",
" \n",
" \n",
" *classIdx = 1;\n",
" *classScore = 505.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
" else {\n",
" \n",
" \n",
" *classIdx = 0;\n",
" *classScore = 493.0;\n",
" return;\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" \n",
" }\n",
"\n",
" }\n",
" \n",
" \n",
"\n",
"\n",
"};\n",
"\n",
"\n",
"\n",
"static RandomForestClassifier classifier;\n",
"\n",
"\n",
"#endif\n"
]
}
],
"source": [
"\"\"\"\n",
"Export classifier to C++\n",
"Replace the path to your actual sketch path\n",
"\n",
"The class_map parameters convert numeric classes to human-readable strings\n",
"\"\"\"\n",
"print(clf.to_arduino_file(\n",
" filename='path-to-sketch/HogClassifier.h',\n",
" instance_name='classifier', \n",
" class_map=feature_dataset.class_map\n",
"))"
]
},
{
"cell_type": "markdown",
"id": "7ac71919",
"metadata": {},
"source": [
"And this is the main code to put in the `.ino` file.\n",
"\n",
"\n",
"```cpp\n",
"#include \"eloquent.h\"\n",
"#include \"eloquent/print.h\"\n",
"#include \"eloquent/tinyml/voting/quorum.h\"\n",
"\n",
"// replace 'm5wide' with your own model\n",
"// possible values are 'aithinker', 'eye', 'm5stack', 'm5wide', 'wrover'\n",
"#include \"eloquent/vision/camera/m5wide.h\"\n",
"\n",
"#include \"HogPipeline.h\"\n",
"#include \"HogClassifier.h\"\n",
"\n",
"Eloquent::TinyML::Voting::Quorum<7> quorum;\n",
"\n",
"\n",
"void setup() {\n",
" Serial.begin(115200);\n",
" delay(3000);\n",
" Serial.println(\"Begin\");\n",
"\n",
" camera.qqvga();\n",
" camera.grayscale();\n",
"\n",
" while (!camera.begin())\n",
" Serial.println(\"Cannot init camera\");\n",
"}\n",
"\n",
"void loop() {\n",
" if (!camera.capture()) {\n",
" Serial.println(camera.getErrorMessage());\n",
" delay(1000);\n",
" return;\n",
" }\n",
" \n",
" // apply HOG pipeline to camera frame\n",
" hog.transform(camera.buffer);\n",
"\n",
" // get a stable prediction\n",
" // this is optional, but will improve the stability of predictions\n",
" uint8_t prediction = classifier.predict(hog.features);\n",
" int8_t stablePrediction = quorum.vote(prediction);\n",
"\n",
" if (quorum.isStable()) {\n",
" eloquent::print::printf(\n",
" Serial, \n",
" \"Stable prediction: %s \\t(DSP: %d ms, Classifier: %d us)\\n\", \n",
" classifier.getLabelOf(stablePrediction),\n",
" hog.latencyInMillis(),\n",
" classifier.latencyInMicros()\n",
" );\n",
" }\n",
"\n",
" camera.free();\n",
"}\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "007f9205",
"metadata": {},
"source": [
"Hit upload and put your objects in front of the camera: you will see the predicted label."
]
},
{
"cell_type": "markdown",
"id": "2ec9b6b6",
"metadata": {},
"source": [
"## Demo video\n",
"\n",
"If you follow the above steps, you will end with the following result.\n",
"\n",
"You can see that the *portenta* and *pi* are mis-labelled quite often: this is expected result as we saw from the features pairplot."
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "38207261",
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/html": [
"<video src=\"assets/esp32 image object classification live demo.mp4\" controls width=\"728\" >\n",
" Your browser does not support the <code>video</code> element.\n",
" </video>"
],
"text/plain": [
"<IPython.core.display.Video object>"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"\"\"\"\n",
"Play demo video\n",
"\"\"\"\n",
"from IPython.display import Video\n",
"\n",
"Video(\"assets/esp32 image object classification live demo.mp4\", width=728)"
]
},
{
"cell_type": "markdown",
"id": "4405e8d7",
"metadata": {},
"source": [
"Processing time is 12 milliseconds, while classification time is < 20 microseconds (1 / 1000 th of DSP!). If you do the math, this translates to **~80 FPS**, which is greater than your Esp32-cam frame rate.\n",
"\n",
"<x-alerts.info>In the next release of <code>everywhereml</code>, thanks to some approximated math, DSP will lower to 6 milliseconds (a.k.a. <b>160 FPS!</b>)</x-alerts.info>"
]
},
{
"cell_type": "markdown",
"id": "18d15f63",
"metadata": {},
"source": [
"## Conclusion\n",
"\n",
"When it comes to **image recognition on Esp32-cam**, you have two options:\n",
"\n",
" 1. If you're looking for the best accuracy possibile, you should stick to Neural Networks: they achieve state-of-the-art results. Platforms like Edge Impulse will speed up your development time.\n",
" 2. If you're goal is to implement something that works *good* and **really fast**, you now have an alternative option to choose thanks to the Eloquent Arduino libraries."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1c7fe7b2",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "getting-started-with-tinyml",
"language": "python",
"name": "getting-started-with-tinyml"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}