
HowTo: Load Tensorflow Lite Tinyml model from internet on Arduino

If you have an internet-connected board, you can now load Tensorflow Lite Tinyml models on-demand directly from the internet! This way you can repurpose your board for different applications without flashing new firmware. Let's see how in this tutorial.

Tensorflow Tinyml from internet


A few days ago I showed you how to load Tensorflow Lite Tinyml models from an SD card in Arduino. This time I'll show you how to download models from the internet.

Why?

If your board has internet connectivity (either Ethernet or WiFi), you may want to load different models based on user needs, or host your own models and keep them updated, improving the end-user experience without requiring a firmware update.

Whatever your use-case, downloading a model from the internet is really easy, much like we did for the SD card. It is a 3 step process:

  1. connect to internet (either Ethernet or WiFi)
  2. download model from URL
  3. initialize Tensorflow from the downloaded model

The whole sketch is quite short and mostly contains boilerplate code (connect to wifi, make HTTP request, run Tensorflow tinyml inference). I will make use of the EloquentTinyML library because it makes using Tf painless.

The sketch should work on many different boards without any (significant) modification: I tested it on an ESP32, but you could use the new Arduino RP2040 Connect for example.

As always, we'll load the sine model from an HTTP server (in a future post I will show the HTTPS version).

#include <SPI.h>
#include <WiFi.h>
// include WiFiNINA instead of WiFi for Arduino boards
// #include <WiFiNINA.h>
#include <HttpClient.h>
#include <EloquentTinyML.h>
#include <eloquent_tinyml/tensorflow.h>

#define N_INPUTS 1
#define N_OUTPUTS 1
#define TENSOR_ARENA_SIZE 2*1024

char SSID[] = "NetworkSSID";
char PASS[] = "Password";

// this is a server I own that doesn't require HTTPS; you can replace it with
// whatever server you have at hand that supports plain HTTP
const char server[] = "152.228.173.213";
const char path[] = "/sine.bin";

WiFiClient client;
HttpClient http(client);

uint8_t *model;
Eloquent::TinyML::TensorFlow::TensorFlow<N_INPUTS, N_OUTPUTS, TENSOR_ARENA_SIZE> tf;

void setup() {
    Serial.begin(115200);
    delay(2000);

    wifi_connect();
    download_model();

    // init Tf from loaded model
    if (!tf.begin(model)) {
        Serial.println("Cannot initialize model");
        Serial.println(tf.errorMessage());
        delay(60000);
    }
    else {
        Serial.println("Model loaded, starting inference");
    }
}

void loop() {
    // pick up a random x and predict its sine
    float x = 3.14 * random(100) / 100;
    float y = sin(x);
    float input[1] = { x };
    float predicted = tf.predict(input);

    Serial.print("sin(");
    Serial.print(x);
    Serial.print(") = ");
    Serial.print(y);
    Serial.print("\t predicted: ");
    Serial.println(predicted);
    delay(1000);
}

/**
 * Connect to wifi
 */
void wifi_connect() {
    int status = WL_IDLE_STATUS;

    while (status != WL_CONNECTED) {
        Serial.print("Attempting to connect to SSID: ");
        Serial.println(SSID);
        status = WiFi.begin(SSID, PASS);

        delay(1000);
    }

    Serial.println("Connected to wifi");
}

/**
 * Download model from URL
 */
void download_model() {
    http.get(server, path);

    int statusCode = http.responseStatusCode();

    if (statusCode != 200) {
        Serial.print("HTTP request failed with status code ");
        Serial.println(statusCode);
        return;
    }

    http.skipResponseHeaders();

    int modelSize = http.contentLength();

    Serial.print("Model size is: ");
    Serial.println(modelSize);
    Serial.println();

    // allocate memory for the model and check the allocation succeeded
    model = (uint8_t*) malloc(modelSize);

    if (model == NULL) {
        Serial.println("Cannot allocate memory for the model");
        return;
    }

    http.read(model, modelSize);
}

Check the full project code on Github and remember to star!
