Training a model with TensorFlow

Making my own models and importing them onto a microcontroller

My TinyML learning journey:

Now it's time to run models that have been trained on datasets (from the internet, and some recorded by myself). I struggled a lot with this phase; the project was full of incompatibilities. For example, I could not run TensorFlow on my microcontroller without reverting to an older version of TensorFlow Lite.

Also, the training section of the documentation did not work properly on Google Colab. I think the main reason is that the project is in alpha and the code (on the TensorFlow GitHub) gets bug fixes daily. One day the training process would only partially complete, the next day it would not work at all, and after some time it completed without errors. Keep in mind that I did not change the code. I had been stuck on this step for three days, so I needed other alternatives that implement TinyML.

Edge Impulse alternative

A more stable alternative is Edge Impulse, an embedded machine learning platform. It has a lot of advantages:

  • Clean, understandable code
  • Optimised/quantised models
  • Datasets can be made with your own voice
  • Clean UI
  • A content management system for training models
  • Good UX
  • Many options for exporting the model
  • Connecting a phone is also possible

Edge Impulse was the technology I was looking for. It made training models, and machine learning in general, a lot easier. I followed the steps in this documentation on how to set up a project and export your model to an Arduino sketch.

Documentation of the process:

I connected my device to Edge Impulse using the terminal.

Connection of the BLE sense
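For reference, the terminal setup roughly follows the Edge Impulse CLI flow below. This is a sketch of the documented commands, not a transcript of my exact session; prompts and versions may differ.

```shell
# Install the Edge Impulse CLI (requires Node.js to be installed first)
npm install -g edge-impulse-cli

# Plug the Nano 33 BLE Sense in over USB, then start the daemon;
# it asks for your Edge Impulse login and which project to connect to
edge-impulse-daemon
```

Once the daemon is running, the board shows up under "Devices" in the Edge Impulse web studio.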

When this was done, I could collect data samples with labels that represent classes. In this case, I made a model with the classes "yes", "no", and "noise".

Data collection

Then I could configure the samples and tell the platform what to train on, how to train, which aspects to focus on, etc.

Configuration

When this was done, the fun part could begin:

  • Training the model (neural network)
  • Live classification testing
  • Model testing
  • Deploying the model to microcontrollers

Training

Live classification

Model testing

Deployment

I needed to export my model as an Arduino library. When I imported this library, I ran into some minor incompatibility issues caused by the version of the BLE Sense board firmware installed through the Arduino IDE's board manager. I needed to revert it to version 1.4. I uploaded the example sketch that came with the library, opened up my serial monitor, and it finally gave me some positive output!

Example

Arduino sketch

When I configured it with my impulse and loaded an example into it, it gave me a confidence rate per class. In other words: task successful.

I changed the code up, and now I can make a servo move using my I2C configuration with Arduino (the Arduino BLE Sense acts as a brain that sends commands to the body: the regular Arduino board). The command that I gave it:

When it hears "yes", the servo moves continuously; when I say "no", the servo stops spinning. This all runs locally, which is an impressive accomplishment.

An output example that you can play around with:

Output example

This was my journey of training machine learning models and running them locally on an Arduino microcontroller.

My personal opinion:

I struggled a lot with this step; the journey was NOT flawless and came with a lot of setbacks. Luckily I kept my head up and searched for solutions! This week I will make the first version of my robot that can respond to the commands left, right, stop, and go!

All my prototypes will be linked together, and I will finally see some of the progress that I have been working towards!