Interact With the Web in Real-time Using Arduino, Firebase and Angular.js

This simple project is meant to be a hybrid introduction to connecting and manipulating data over the internet with Arduino, Node, Angular and Firebase.

UPDATE: Check out @Moycs777's terrific web app based on this example, hosted on Firebase.

COMMENTS: Please do contribute your comments. I’d like to know how you like (or don’t like) it.

The Internet of Things is nothing new. You may have been using it all along. In the broadest sense, your laptop and smartphone are IoT objects. What’s actually new is the “T” part. We have been using computers and smartphones so often that we hardly recognize them as “things.” However, “things” are more synonymous with everyday objects such as clothes, furniture, fridges, clocks, books, lamps, skateboards and bicycles. IoT is a coffee machine that brews you a cup of java when the weather gets too cold, a pair of shoes that lights up 10 minutes before your train arrives, or a door knob that alerts your phone when your parents try to trespass into your room.

To be able to connect to the internet, make sense of its data and interact with users, these things need tiny computers, aka microcontrollers, to make them conscious.

What we are building

We are going to connect an Arduino board to the internet and change the RGB color property on a web page in real-time by rotating a potentiometer.

What we need:

  • Arduino Uno
  • A-B USB cable
  • Potentiometer (aka pot)
  • RGB LED
  • 330 ohm resistors x 3
  • 10 kOhm resistor x 1
  • Male-to-male jumper wires x 9

Everything is included in SparkFun’s Inventor’s Kit, which is well worth getting your hands on. No WiFi or ethernet shield is needed, since we’re persisting the data to an online database and communicating with the web app via the REST API provided by Firebase.

If you have not installed Node, head to the Node.js home page and follow the instructions to install Node and npm.

All the code can be downloaded from my repo or cloned using git:

$ git clone https://github.com/jochasinga/firepot

Once downloaded, cd firepot to get into the directory, and you should see two subdirectories—pot and app. cd into each one and install the dependencies:

$ cd pot && npm install

All dependencies are listed in package.json, and npm installs them automatically from that information. They are collected in a new subdirectory, node_modules, and can be required by any code within the project scope.
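
For reference, a minimal package.json for the pot directory would look roughly like the sketch below. The name and version ranges here are placeholders (check the actual file in the repo); note that the new Firebase(url) API used later implies the pre-3.0 firebase client.

{
  "name": "firepot-pot",
  "version": "0.0.1",
  "dependencies": {
    "firebase": "2.x",
    "johnny-five": "0.x"
  }
}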

Connecting the circuit

Connect your circuit according to the diagram. A typical potentiometer has three leads. Connect one outer lead to +5V power via the 10 kOhm pull-up resistor and the other outer lead to ground (0V). The pot then provides a variable voltage between the two, and the middle lead (the wiper) should be connected to Arduino’s analog input pin A0 to feed that voltage signal into the Arduino.

An RGB LED is essentially three LEDs in one, one each for red, green and blue, together producing 16,777,216 possible colors. We are going to use the pot to traverse this color range from pure red (#FF0000) to blue (#0000FF) to green (#00FF00) and back to red again. The longest lead of the LED is the common, and should be connected to ground. The rest are connected to Arduino’s PWM outputs (those with ~ preceding the number). The code connects the red, green and blue leads to ~9, ~10 and ~11 respectively.

Connect the Arduino to your laptop via the USB cable, and you’re good to go.

Signing up with Firebase

Firebase is a JSON-style database service for real-time applications (you’ll need a free account to use it). Firebase offers a clever way of manipulating JSON data through RESTful APIs. In this project, we will CREATE, READ and UPDATE a chunk of data that looks like this:

"colors" : {
  "r" : 255,
  "g" : 0,
  "b" : 0
}

Firebase gives you easy access to your data via its REST API, i.e. you can get your JSON data at https://burning-limbo-6666.firebaseio.com/colors.json, where https://burning-limbo-6666.firebaseio.com is the domain address Firebase generates for you when you create a new app, and /colors is the parent node of your data. Firebase also serves a data dashboard at that very URL, so once you’ve updated the data in the next section you can paste the address into the browser and watch your data change in real-time as you turn the pot.
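
As a quick sanity check, you can also read that node from any HTTP client. Here is a minimal Node sketch, using the built-in https module and the fictional URL above (swap in your own app’s address):

var https = require("https");

// Read the current /colors node as JSON via Firebase's REST API
https.get("https://burning-limbo-6666.firebaseio.com/colors.json", function(res) {
  var body = "";
  res.on("data", function(chunk) { body += chunk; });
  res.on("end", function() {
    console.log(JSON.parse(body)); // e.g. { r: 255, g: 0, b: 0 }
  });
});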

pot.js

Johnny-Five is a JavaScript library, created by the awesome Rick Waldron, that lets you control an Arduino from Node by talking to the Firmata firmware on the board, and it is synonymous with the NodeBots movement. In order to make it work, you must open the Arduino IDE and flash the StandardFirmata sketch onto the board. In the IDE, go to File > Examples > Firmata > StandardFirmata and upload the code (don’t forget to select the right board and serial port in the Tools menu). Once the upload has finished, you can close the IDE. Now, let’s have a look at our pot.js code.

var Firebase = require("firebase");
var five = require("johnny-five");

// Create a new reference of Firebase db
var firebaseRef = new Firebase(
  // fictional URL, replace it with your own from Firebase
  "https://burning-limbo-6666.firebaseio.com/colors"
);

five.Board().on("ready", function() {
  var maxValue = 511;
  var colRange = 6;
  var offset   = maxValue / colRange;

  // Create a new pot instance
  var pot = new five.Sensor({
    pin: "A0",
    freq: 250
  });

  // Create a new led array based on pin number
  var ledArray = new five.Led.Array([9, 10, 11]);

  // Listen on data change
  pot.on("data", function() {

    var self = this.value;
    // Print pot value 
    console.log(self);

    // Map dynamic color brightness to pot value
    // RED - MAGENTA - BLUE
    var redDec   = Math.round(five.Fn.map(self, offset, offset*2, 255, 0));
    var blueInc  = Math.round(five.Fn.map(self, 0, offset, 0, 255));
    // BLUE - CYAN - GREEN
    var blueDec  = Math.round(five.Fn.map(self, offset*3, offset*4, 255, 0));
    var greenInc = Math.round(five.Fn.map(self, offset*2, offset*3, 0, 255));
    // GREEN - YELLOW - RED
    var greenDec = Math.round(five.Fn.map(self, offset*5, offset*6, 255, 0));
    var redInc   = Math.round(five.Fn.map(self, offset*4, offset*5, 0, 255));

    // Adjusting color brightness conditionally based on 
    // the location of the pot output value.
    switch (true) {
      case (self > 0 && self <= offset):
        console.log("1st loop");
        ledArray[0].brightness(255);
        ledArray[2].brightness(blueInc);
        ledArray[1].brightness(0);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": 255, "b": blueInc, "g": 0});
        break;
      case (self > offset && self <= offset*2):
        console.log("2nd loop");
        ledArray[0].brightness(redDec);
        ledArray[2].brightness(255);
        ledArray[1].brightness(0);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": redDec, "b": 255, "g": 0});
        break;
      case (self > offset*2 && self <= offset*3):
        console.log("3rd loop");
        ledArray[0].brightness(0);
        ledArray[2].brightness(255);
        ledArray[1].brightness(greenInc);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": 0, "b": 255, "g": greenInc});
        break;
      case (self > offset*3 && self <= offset*4):
        console.log("4th loop");
        ledArray[0].brightness(0);
        ledArray[2].brightness(blueDec);
        ledArray[1].brightness(255);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": 0, "b": blueDec, "g": 255});
        break;
      case (self > offset*4 && self <= offset*5):
        console.log("5th loop");
        ledArray[0].brightness(redInc);
        ledArray[2].brightness(0);
        ledArray[1].brightness(255);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": redInc, "b": 0, "g": 255});
        break;
      case (self > offset*5 && self <= offset*6):
        console.log("6th loop");
        ledArray[0].brightness(255);
        ledArray[2].brightness(0);
        ledArray[1].brightness(greenDec);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": 255, "b": 0, "g": greenDec});
        break;
      default:
        console.log("Out of range");
        ledArray[0].brightness(255);
        ledArray[2].brightness(0);
        ledArray[1].brightness(0);
        // update firebase colors' child node r, g, b
        firebaseRef.set({"r": 255, "b": 0, "g": 0});
    }
  });
});

First, (1–3) we require our installed dependencies for the app, firebase and johnny-five, then (5–8) we create a new Firebase reference, firebaseRef, pointing to where your data is stored. After that, (10) we create a new johnny-five instance of the Arduino board and hook it to a callback function which will execute the rest of the code once the board is ready. (11–13) I assign the maximum value we expect from the pot to a variable and divide it by the number of RGB color subranges to obtain a standard offset “distance”, used as a step variable to calculate where the pot’s output value falls on the RGB strip, i.e. offset is MAGENTA, offset * 2 is BLUE, offset * 3 is CYAN and so on. You can see how I divided the RGB color strip into 6 subranges, as shown graphically below.

Normally on 5V power, a pot converts the analog signal (0–5V) to digital and gives out an integer in the range 0–1023. In my case, my little pot maxes out at about half of that, so my maxValue lies somewhere around 511 (check this value by logging the output with console.log()). Then (16–19) we create a new instance of the pot sensor, set its analog pin to A0 and its frequency to 250. (22) Assign each LED’s pin to an array variable. Now, (25 onwards) set our pot instance to listen for the “data” event; within the callback function is the rest of the code, which (27–40) calculates and maps the pot’s output range (0–maxValue) to a range of 0–255 (the LED’s brightness) using our step variable offset. (44–102) I use a switch-case block to conditionally adjust each LED’s brightness with the Led.brightness method and save those values to Firebase with the set method, according to where the pot value falls.
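
To make the mapping concrete, here is roughly what one of those calls does for a reading inside the first subrange (assuming johnny-five is already required as five, and my maxValue of 511):

// five.Fn.map(value, fromLow, fromHigh, toLow, toHigh) rescales a value
// linearly, much like Arduino's map() function.
var maxValue = 511;
var offset   = maxValue / 6;   // ≈ 85.2
var self     = 42;             // a pot reading inside the 1st subrange

// blueInc climbs from 0 to 255 as the reading moves from 0 to offset
var blueInc = Math.round(five.Fn.map(self, 0, offset, 0, 255));
console.log(blueInc);          // ≈ Math.round((42 / 85.2) * 255) = 126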

After that, run pot.js with the node command line:

$ node pot.js

Your LED should light up and your terminal should be continuously printing the value from the pot (self) and which loop (or subrange) that value currently falls in. Try spinning the pot and watch the printed data change as the LED’s color gradually shifts. Then, browse to your Firebase dashboard using your app’s URL (i.e. https://burning-limbo-6666.firebaseio.com/colors). You should see your data change as you rotate the pot. We have successfully CREATEd and UPDATEd data on a database just as you would have done with web forms, sliders or buttons. That concludes the hardware side.

app.js

Now we are going to work on our client, a simple web app. The structure of the app directory is as follows:

app
├── app.js
├── node_modules
|   ├── express
|   ├── firebase
|   └── socket.io
├── package.json
└── public
    ├── index.html    
    └── index.js

If you have not installed the dependencies, you will probably not see the node_modules subdirectory in there. Do so now using npm install:

$ cd app && npm install

Take note of the public directory. app.js is the server code, which serves static content from the public directory—in this case, index.html and index.js. Let’s hop into app.js:

var Firebase = require("firebase");
var express = require("express");

// Create HTTP Server
var app = express();
var server = require("http").createServer(app);
// Attach Socket.io server 
var io = require("socket.io")(server);
// Use the port from the environment, or default to 3000
var port = process.env.PORT || 3000;

// Create a new firebase reference
var firebaseRef = new Firebase(
  "https://burning-limbo-6666.firebaseio.com/colors"
);

// Make the server listen on the chosen port
server.listen(port, function() {
  console.log("Server listening on port %d", port);
});

// Routing to static files
app.use(express.static(__dirname + "/public"));

// Socket server listens on connection event
io.on("connection", function(socket) {
  console.log("Connected and ready!");
  
  // firebase reference listens on value change, 
  // and return the data snapshot as an object
  firebaseRef.on("value", function(snapshot) {
    var colorChange = snapshot.val();
    
    // Print the data object's values
    console.log("snapshot R: " + colorChange.r);
    console.log("snapshot B: " + colorChange.b);
    console.log("snapshot G: " + colorChange.g);
  });
});


What this code does is (5–18) create a Node web server that listens for requests on port 3000 and (21) serves the front-end content from inside the public directory. Then (24–25) the server waits for a socket connection (i.e. a GET request), prints a “Connected and ready!” message, and (29 onwards) starts tracking data from Firebase and printing out changes. Firebase is not strictly necessary here, since we will be using the AngularFire library in public/index.js to sync data from Firebase directly, but it is intentionally included to exhibit the basic Firebase way of detecting data changes and retrieving a snapshot of them. The most important part here is serving the public/index.html page and running the script in public/index.js.
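
The Socket.IO server isn’t used beyond logging in this project; if you wanted app.js itself to push each color snapshot down to connected browsers (instead of letting AngularFire sync them, as we do next), a minimal sketch inside the connection handler might look like this:

// Inside io.on("connection", function(socket) { ... }):
// forward every Firebase snapshot to this connected client
firebaseRef.on("value", function(snapshot) {
  socket.emit("colorChange", snapshot.val());
  // the browser would listen with socket.on("colorChange", ...)
});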

index.html

Our web page will display the R : G : B values dynamically from Firebase and change the background color of the div according to how you rotate your pot. We’re going to use AngularFire, the Firebase client library that supports Angular.js.

<!DOCTYPE html>
<!--Directive-->
<html ng-app="app">
  <head>
  </head>
  <!--Controller-->
  <body ng-controller="Ctrl">
    <div class="header">
      <!--Style-->
      <h1 ng-style="{'background-color': 'rgb(' + data.r + ',' + data.g + ',' + data.b +')'}">                      
        Afro Circus! {{ data.r }} : {{ data.g }} : {{ data.b }}                                                     
      </h1>
    </div>
    <!--Scripts-->
    <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.3.0-beta.19/angular.min.js"></script>
    <script src="https://cdn.firebase.com/libs/angularfire/0.8.2/angularfire.min.js"></script>
    <script src="/index.js"></script>
  </body>
</html>

https://gist.github.com/jochasinga/c720677640e026381366

This HTML view (V) binds parts of itself to a data model (M) that syncs data from your Firebase storage, and as the data changes, only those parts are re-rendered. Angular operates on what are called “directives.” Directives add new functionality to HTML elements instead of manipulating the DOM as in jQuery. (3) The ng-app directive starts the application and defines the scope of the binding, (7) ng-controller defines the application controller (C) and the scope that particular controller affects, and (10) ng-style allows dynamic styling of the document (like you would have done with jQuery’s .css() or .addClass()). To display data from the model, double curly braces ({{ }}) are used to wrap the variable, which is a common way to do it in other web frameworks’ template languages. Never mind the data object for now, you’ll see it in public/index.js. Ensure that you have included the scripts before the closing </body> tag.

index.js

This is the engine room of our front end. In this file, we attach the firebase module to the app, define the controller method and sync the data from Firebase to a local model object used by the HTML binding.

// Register firebase module                                                                                         
var app = angular.module("app", ["firebase"]);

// Set up controller function                                                                                       
app.controller("Ctrl", function($scope, $firebase) {
    var firebaseRef = new Firebase(
      // Replace this fictional URL with your own
      "https://burning-limbo-6666.firebaseio.com/colors"                                                          
    );
    // create an AngularFire ref to the data                                                                        
    var sync = $firebase(firebaseRef);

    // pull the data into a local model                                                                             
    var syncObject = sync.$asObject();

    // sync the object with three-way data binding                                                                  
    syncObject.$bindTo($scope, "data");
});

(2) Register the firebase service with the Angular app. After that, (5) you’ll have the $firebase service available for injection into the controller. (6–9) Setting up the first part of the controller is familiar—we create a Firebase reference to our data. Now, (11) pass the reference to $firebase() to create an AngularFire reference to the data. (14) We translate the data into a JavaScript object, and (17) bind it to the variable “data” that will be used in the index.html template.
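
Because $bindTo sets up a three-way binding, writes to the local model propagate back up to Firebase as well. As a hypothetical illustration (the $bindTo call returns a promise that resolves once the initial data has loaded):

// Once bound, changing the local model also updates /colors in Firebase
syncObject.$bindTo($scope, "data").then(function() {
  $scope.data.r = 128;   // pushed back up to the "r" child node
});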

Whew! That was some work, right? Now comes the exciting part. Go back to the pot directory and run your pot code again with:

$ node pot.js

Once you’ve got your LED turned on and the pot values start printing to the console, run app.js from another terminal window, inside the app directory:

$ node app.js

The console should start by printing “Server listening on port 3000” and then gush out RGB values from Firebase. Go to your web browser and browse to http://localhost:3000; hopefully, you’ll get something like the video below.

If you like this article, please recommend and retweet it. Feel free to shoot me an email at [email protected]. I’m up for talking, exchanging ideas, collaborations or consulting. Comments are welcome.


DIY Sleep Apnea Screening with Arduino Pro Mini

I recently asked my doctor about testing for sleep apnea. There were a few reasons I suspected I might have sleep apnea: I’d sometimes wake up with shortness of breath, morning tiredness, occasional snoring, and had a family history of it. He suggested a home sleep apnea test (HSAT), which basically involves wearing a bunch of sensors at night to record breathing, blood oxygen levels, electrical activity, and other things. Then a specialist analyzes the data and makes a diagnosis. Unfortunately, I have a high deductible health insurance plan and the test is expensive, around $500. There was also a month long wait. Since I had some time, I decided to try taking some measurements on my own and then revisit whether I wanted to take a $500 test. My goal was not to try and diagnose sleep apnea myself, but just to use some simple rough measurements to decide whether the more expensive and accurate test was worth taking.

Disclaimer: if you think you have sleep apnea, talk to your doctor.

Quick Intro to Sleep Apnea

There are two major types of sleep apnea: central and obstructive. In central sleep apnea you stop making the effort to breathe (the diaphragm stops moving), and in obstructive sleep apnea you keep trying to breathe, but the airflow is blocked. You can also have a mixed combination of the two. In all cases, your blood oxygen level dips when the airflow stops, which is not great for your health.

Taking Measurements

I used 3 sensors to measure breathing effort, airflow, and blood oxygen. They were, respectively, my cell phone, thermistors, and a pulse oximeter. A real HSAT has more sensors than this, but again my goal was to do a cheap, rough screening that was just accurate enough to make an informed decision on taking an HSAT.

For breathing effort I used the accelerometer in my phone with some medical tape to attach it to my stomach. My phone is fairly light and I was able to sleep comfortably on both my back and side. Sleeping on your stomach would probably not work as well, especially if your mattress is firm, since the phone would move less in that position.

Initially, I thought I would need to integrate the acceleration twice to get position. But the small errors accumulate and average velocity ends up being non-zero, so the position is constantly moving off in one direction. Simply plotting acceleration worked well to track the motion of my stomach. I was also able to use the gyroscope sensor to track whether I was sleeping on my back or left/right side.
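
To see why the double integration drifts, here is a tiny illustrative simulation (not from the project — it just integrates a noisy, roughly periodic acceleration signal twice with Euler steps):

// Integrate a noisy acceleration signal twice. Small errors accumulate in
// velocity, so the computed position wanders off even though the true
// motion is bounded and periodic.
var dt = 0.2, velocity = 0, position = 0;
for (var i = 0; i < 1000; i++) {
  var accel = Math.cos(i * dt) + (Math.random() - 0.5) * 0.1; // signal + noise
  velocity += accel * dt;
  position += velocity * dt;
}
console.log(position); // typically drifts far outside the true range of the motion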

To measure airflow, I soldered 3 precision thermistors together and taped them below the nose, one under each nostril and one just over the mouth, with medical tape (I actually initially used the band part of a band-aid, which worked well but looked super dorky). Thermistors are resistors whose resistance changes with temperature. The idea is that when you breathe in and out, the thermistors are alternately hit by warm and cool air. To detect the change in resistance I used a simple voltage divider circuit and measured the voltage with an Arduino Pro Mini. This worked OK, but the measurement from the Arduino was noisy and I had to run it through a low-pass filter before plotting. If I were to redo this part, I’d try to find a more sensitive device to measure voltage. My multimeter was more accurate, but it couldn’t communicate with my phone.
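
The low-pass filter can be as simple as an exponential moving average; here is a rough sketch of the idea (not the actual plotting script):

// First-order low-pass filter (exponential moving average) over noisy samples.
// Smaller alpha means heavier smoothing.
function lowPass(samples, alpha) {
  var filtered = [samples[0]];
  for (var i = 1; i < samples.length; i++) {
    filtered.push(alpha * samples[i] + (1 - alpha) * filtered[i - 1]);
  }
  return filtered;
}

// e.g. lowPass(adcReadings, 0.1) smooths out ADC noise while keeping the
// slow warm/cool breathing signal visible.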

10k precision NTC thermistors, 5 for $3.96 on ebay


Arduino Pro Mini and voltage divider circuit, about $7 in total

I connected the Arduino to my phone with a USB cable and FTDI USB to TTL serial adapter board. I wrapped the whole Arduino circuit in a small bag to keep the jumper wires from pulling out. Super hacky, but it worked.

To measure blood oxygen, I used a cool little Bluetooth pulse oximeter. I tried out no fewer than 4 different pulse oximeters for this project and settled on this one because 1. It was Bluetooth capable so it could talk to my phone and 2. it was comfortable to wear on my finger for long time periods. The pulse oximeter measures blood oxygen (SpO2), pulse rate, and perfusion index.

Bluetooth pulse oximeter, $17.99 on ebay

An App to Bring it all Together

I created a simple Android app which records data from all of the sensors and writes it to a file. The pulse oximeter came with its own app, Oxycare, that displayed the readings but lacked any feature to record data. Even if it did record data, I’d still prefer a single app to record all of the sensor readings, since the timing of events needs to be accurate. I used JADX to decompile the APK into Java code and located the classes relevant to the BLE connection and data parsing.

After recording a night’s worth of data, I used a Python script to generate plots. Data was recorded every 200 ms and I plotted 200 seconds at a time, which meant I had to look at around 150 plots (roughly eight hours of sleep is about 28,800 seconds, or 144 windows of 200 seconds). In the end, I could see good breathing effort, airflow, and blood oxygen levels, so I opted not to take the $500 test.

I’ve placed the code for the Android app, Arduino, and python plotting script on github.

Android Things and Arduino

The Android Things platform has some big advantages compared to Arduino or Raspbian. We Android developers can use the same tools we use for building applications for our projects, including all our favorite libraries, Google Play Services, and of course Kotlin.

On the other hand, there’s a big disadvantage. Compared to Arduino, the Raspberry Pi can’t handle the low level custom protocols of some sensors. And compared to Raspbian, many of the available HATs are currently not supported.

How can we get the best of both worlds together? By using the right tool for the job. Let’s use the Arduino controller to read and write to our cheap and widely available sensors and use the Raspberry Pi to implement all the complex logic the same way we build Android applications.

Then, we will use the UART (universal asynchronous receiver/transmitter) on both platforms to exchange data between them. On the Arduino this is also known as Serial.

Wiring

Connecting the two devices is very simple:

  • Connect GPIO pin 14 of the Raspberry Pi to the RX pin of the Arduino (aka GPIO zero)
  • Connect GPIO pin 15 of the Raspberry Pi to the TX pin of the Arduino (aka GPIO 1)
  • Connect a ground pin from the Raspberry Pi to the Arduino.

However there’s still one thing to do:

Take a look at “Configuring the UART mode” in the official documentation.

Since the UART Bluetooth mode is enabled by default, you can’t start using the UART to talk with the Arduino out-of-the-box. You need to go through that configuration step!

Code

I’d like to give all the credit for this idea to Marcos Placona, who wrote an article about the same topic months ago and helped me get started.

The library he started didn’t fit my needs, so I went with a simpler, one-class solution that you can copy and modify for your own projects:

import android.util.Log
import com.google.android.things.pio.PeripheralManagerService
import com.google.android.things.pio.UartDevice

class Arduino(uartDevice: String = "UART0"): AutoCloseable {
    private val TAG = "Arduino"
    private val uart: UartDevice by lazy {
        PeripheralManagerService().openUartDevice(uartDevice).apply {
            setBaudrate(115200)
            setDataSize(8)
            setParity(UartDevice.PARITY_NONE)
            setStopBits(1)
        }
    }

    fun read(): String {
        val maxCount = 8
        val buffer = ByteArray(maxCount)
        var output = ""
        do {
            // Only append the bytes actually read in this pass; otherwise stale
            // buffer contents would be re-appended when the UART runs dry
            val count = uart.read(buffer, buffer.size)
            if (count == 0) break
            output += buffer.copyOf(count).toReadableString()
            Log.d(TAG, "Read ${buffer.copyOf(count).toReadableString()} $count bytes from peripheral")
        } while (true)
        return output
    }

    private fun ByteArray.toReadableString() = filter { it > 0.toByte() }
            .joinToString(separator = "") { it.toChar().toString() }

    fun write(value: String) {
        val count = uart.write(value.toByteArray(), value.length)
        Log.d(TAG, "Wrote $value $count bytes to peripheral")
    }

    override fun close() {
        uart.close()
    }
}

Arduino.kt

This class provides three simple methods:

  • read() — reads whatever bytes are available on the UART and returns them as a String
  • write(value) — writes a String out to the UART
  • close() — closes the UART device (the class implements AutoCloseable)

On the Arduino side, you need to handle the received commands and write the responses back. This is an example of how to do it:

Arduino.c

if (Serial.available() > 0) {
  char command = (char) Serial.read();
  switch (command) {
     case 'H':
       Serial.write("Hello World");
       break;
  }
  Serial.flush();
}

You can write an “H” to the UART, and read back “Hello World” from it:

Example.kt

val arduino = Arduino()
arduino.write("H")
Thread.sleep(100) // let Arduino reply
val hello = arduino.read() // hello = "Hello World"

Now your Android Things applications can happily communicate with the Arduino, and you can use the whole arsenal of sweet, cheap Arduino sensors in your Android Things projects.

Get Started with Machine Learning on Arduino

Arduino is on a mission to make Machine Learning simple enough for anyone to use. We’ve been working with the TensorFlow Lite team over the past few months and are excited to show you what we’ve been up to together: bringing TensorFlow Lite Micro to the Arduino Nano 33 BLE Sense. In this article, we’ll show you how to install and run several new TensorFlow Lite Micro examples that are now available in the Arduino Library Manager.

The first tutorial below shows you how to install a neural network on your Arduino board to recognize simple voice commands.

Running the pre-trained micro_speech inference example.

Next, we’ll introduce a more in-depth tutorial you can use to train your own custom gesture recognition model for Arduino using TensorFlow in Colab. This material is based on a practical workshop held by Sandeep Mistry and Don Coleman, an updated version of which is now online.

If you have previous experience with Arduino, you may be able to get these tutorials working within a couple of hours. If you’re entirely new to microcontrollers, it may take a bit longer.

We’re excited to share some of the first examples and tutorials, and to see what you will build from here. Let’s get started!

Note: the following projects are based on TensorFlow Lite for Microcontrollers which is currently experimental within the TensorFlow repo. This is still a new and emerging field!

Microcontrollers and TinyML

Microcontrollers, such as those used on Arduino boards, are low-cost, single-chip, self-contained computer systems. They’re the invisible computers embedded inside billions of everyday gadgets like wearables, drones, 3D printers, toys, rice cookers, smart plugs, e-scooters and washing machines. The trend of connecting these devices is part of what is referred to as the Internet of Things.

Arduino is an open-source platform and community focused on making microcontroller application development accessible to everyone. The board we’re using here has an Arm Cortex-M4 microcontroller running at 64 MHz with 1MB Flash memory and 256 KB of RAM. This is tiny in comparison to Cloud, PC, or Mobile but reasonable by microcontroller standards.

Arduino Nano 33 BLE Sense board is smaller than a stick of gum

There are practical reasons you might want to squeeze ML on microcontrollers, including:

  • Function — wanting a smart device to act quickly and locally (independent of the Internet).
  • Cost — accomplishing this with simple, lower cost hardware.
  • Privacy — not wanting to share all sensor data externally.
  • Efficiency — smaller device form-factor, energy-harvesting or longer battery life.

There’s a final goal which we’re building towards that is very important:

  • Machine learning can make microcontrollers accessible to developers who don’t have a background in embedded development

On the machine learning side, there are techniques you can use to fit neural network models into memory constrained devices like microcontrollers. One of the key steps is the quantization of the weights from floating point to 8-bit integers. This also has the effect of making inference quicker to calculate and more applicable to lower clock-rate devices.
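
As a rough illustration of that step (a toy sketch only — not the actual TensorFlow Lite converter), affine quantization maps floating-point weights onto 8-bit integers like this:

// Toy affine quantization: map floats in [min, max] onto the integers 0..255.
function quantize(weights) {
  var min = Math.min.apply(null, weights);
  var max = Math.max.apply(null, weights);
  var scale = (max - min) / 255;
  return weights.map(function(w) {
    return Math.round((w - min) / scale); // stored as an 8-bit value
  });
}

// Dequantizing with (q * scale + min) recovers an approximation of each weight;
// the stored model is ~4x smaller and the math becomes integer-friendly.
console.log(quantize([-0.7, 0.0, 0.31, 1.2])); // [0, 94, 136, 255]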

TinyML is an emerging field and there is still work to do — but what’s exciting is that there’s a vast unexplored application space out there. Billions of microcontrollers, combined with all sorts of sensors in all sorts of places, can lead to some seriously creative and valuable TinyML applications in the future.

What you need to get started
  • An Arduino Nano 33 BLE Sense board
  • A Micro USB cable to connect the Arduino board to your desktop machine
  • To program your board, you can use the Arduino Web Editor or install the Arduino IDE. We’ll give you more details on how to set these up in the following section

The Arduino Nano 33 BLE Sense has a variety of onboard sensors meaning potential for some cool Tiny ML applications:

  • Voice — digital microphone
  • Motion — 9-axis IMU (accelerometer, gyroscope, magnetometer)
  • Environmental — temperature, humidity and pressure
  • Light — brightness, color and object proximity

Unlike classic Arduino Uno, the board combines a microcontroller with onboard sensors which means you can address many use cases without additional hardware or wiring. The board is also small enough to be used in end applications like wearables. As the name suggests, it has Bluetooth LE connectivity so you can send data (or inference results) to a laptop, mobile app or other BLE boards and peripherals.

Tip: Sensors on a USB stick

Connecting the BLE Sense board over USB is an easy way to capture data and add multiple sensors to single board computers without the need for additional wiring or hardware — a nice addition to a Raspberry Pi, for example.

TensorFlow Lite for Microcontrollers examples

The inference examples for TensorFlow Lite for Microcontrollers are now packaged and available through the Arduino Library manager making it possible to include and run them on Arduino in a few clicks. In this section we’ll show you how to run them. The examples are:

  • micro_speech — speech recognition using the onboard microphone
  • magic_wand — gesture recognition using the onboard IMU
  • person_detection — person detection using an external ArduCam camera

For more background on the examples you can take a look at the source in the TensorFlow repository. The models in these examples were previously trained. The tutorials below show you how to deploy and run them on an Arduino. In the next section, we’ll discuss training.

How to run the examples using Arduino Create web editor

Once you connect your Arduino Nano 33 BLE Sense to your desktop machine with a USB cable you will be able to compile and run the following TensorFlow examples on the board by using the Arduino Create web editor:


Compiling an example from the Arduino_TensorFlowLite library

Focus on the speech recognition example: micro_speech

One of the first steps with an Arduino board is getting the LED to flash. Here, we’ll do it with a twist by using TensorFlow Lite Micro to recognise voice keywords. It has a simple vocabulary of “yes” and “no”. Remember this model is running locally on a microcontroller with only 256KB of RAM, so don’t expect commercial ‘voice assistant’ level accuracy — it has no Internet connection and on the order of 2000x less local RAM available.

Note the board can be battery powered as well. As the Arduino can be connected to motors, actuators and more, this offers the potential for voice controlled projects.

How to run the examples using the Arduino IDE

Alternatively, you can try the same inference examples using the Arduino IDE application.

First, follow the instructions in the next section Setting up the Arduino IDE.

In the Arduino IDE, you will see the examples available via the File > Examples > Arduino_TensorFlowLite menu.

Select an example and the sketch will open. To compile, upload, and run the example on the board, click the arrow icon:

For advanced users who prefer a command line, there is also the arduino-cli.

Training a TensorFlow Lite Micro model for Arduino

Gesture classification on the Arduino Nano 33 BLE Sense, output as emojis

Next, we will use ML to enable the Arduino board to recognise gestures. We’ll capture motion data from the Arduino Nano 33 BLE Sense board, import it into TensorFlow to train a model, and deploy the resulting classifier onto the board.

The idea for this tutorial was based on Charlie Gerard’s awesome Play Street Fighter with body movements using Arduino and Tensorflow.js. In Charlie’s example, the board is streaming all sensor data from the Arduino to another machine which performs the gesture classification in Tensorflow.js. We take this further and “TinyML-ify” it by performing gesture classification on the Arduino board itself. This is made easier in our case as the Arduino Nano 33 BLE Sense board we’re using has a more powerful Arm Cortex-M4 processor, and an on-board IMU.

We’ve adapted the tutorial below, so no additional hardware is needed — the sampling starts on detecting movement of the board. The original version of the tutorial adds a breadboard and a hardware button to press to trigger sampling. If you want to get into a little hardware, you can follow that version instead.

Setting up the Arduino IDE

Following the steps below sets up the Arduino IDE application used to both upload inference models to your board and download training data from it in the next section. There are a few more steps involved than using Arduino Create web editor because we will need to download and install the specific board and libraries in the Arduino IDE.

  • Download and install the Arduino IDE from https://arduino.cc/downloads
  • Open the Arduino application you just installed
  • In the Arduino IDE menu select Tools > Board > Boards Manager…
  • Search for “Nano BLE” and press install on the board
  • It will take several minutes to install
  • When it’s done close the Boards Manager window

  • Now go to the Library Manager Tools > Manage Libraries…
  • Search for and install the Arduino_TensorFlowLite library
  • Next search for and install the Arduino_LSM9DS1 library:

  • Finally, plug the micro USB cable into the board and your computer
  • Choose the board Tools > Board > Arduino Nano 33 BLE
  • Choose the port Tools > Port > COM5 (Arduino Nano 33 BLE)
  • Note that the actual port name may be different on your computer

There are more detailed Getting Started and Troubleshooting guides on the Arduino site if you need help.

Streaming sensor data from the Arduino board

First, we need to capture some training data. You can capture sensor data logs from the Arduino board over the same USB cable you use to program the board with your laptop or PC.

Arduino boards run small applications (also called sketches) which are compiled from .ino format Arduino source code, and programmed onto the board using the Arduino IDE or Arduino Create.

We’ll be using a pre-made sketch IMU_Capture.ino which does the following:

  • Monitor the board’s accelerometer and gyroscope
  • Trigger a sample window on detecting significant linear acceleration of the board
  • Sample for one second at 119Hz, outputting CSV format data over USB
  • Loop back and monitor for the next gesture

The sensors we choose to read from the board, the sample rate, the trigger threshold, and whether we stream data output as CSV, JSON, binary or some other format are all customizable in the sketch running on the Arduino. There is also scope to perform signal preprocessing and filtering on the device before the data is output to the log — this we can cover in another blog. For now, you can just upload the sketch and get to sampling.

To program the board with this sketch in the Arduino IDE:

  • Download IMU_Capture.ino and open it in the Arduino IDE
  • Compile and upload it to the board with Sketch > Upload

Visualizing live sensor data log from the Arduino board

With that done, we can now visualize the data coming off the board. We’re not capturing data yet — this is just to give you a feel for how the sensor data capture is triggered and how long a sample window is. This will help when it comes to collecting training samples.

  • In the Arduino IDE, open the Serial Plotter Tools > Serial Plotter
  • If you get an error that the board is not available, reselect the port:
  • Tools > Port > portname (Arduino Nano 33 BLE)
  • Pick up the board and practice your punch and flex gestures
  • You’ll see it only samples for a one-second window, then waits for the next gesture
  • You should see a live graph of the sensor data capture (see GIF below)

When you’re done be sure to close the Serial Plotter window — this is important as the next step won’t work otherwise.

Capturing gesture training data

To capture data as a CSV log to upload to TensorFlow, you can use Arduino IDE > Tools > Serial Monitor to view the data and export it to your desktop machine:

  • Reset the board by pressing the small white button on the top
  • Pick up the board in one hand (picking it up later will trigger sampling)
  • In the Arduino IDE, open the Serial Monitor Tools > Serial Monitor
  • If you get an error that the board is not available, reselect the port:
  • Tools > Port > portname (Arduino Nano 33 BLE)
  • Make a punch gesture with the board in your hand (Be careful whilst doing this!)
  • Make the outward punch quickly enough to trigger the capture
  • Return to a neutral position slowly so as not to trigger the capture again
  • Repeat the gesture capture step 10 or more times to gather more data
  • Copy and paste the data from the Serial Monitor into a new text file called punch.csv
  • Clear the console window output and repeat all the steps above, this time with a flex gesture, saved in a file called flex.csv
  • Make the inward flex fast enough to trigger the capture, returning slowly each time

Note: the first line of your two csv files should contain the fields aX,aY,aZ,gX,gY,gZ.

Linux tip: if you prefer you can redirect the sensor log output from the Arduino straight to a .csv file on the command line. With the Serial Plotter / Serial Monitor windows closed use:

$ cat /dev/cu.usbmodem[nnnnn] > sensorlog.csv

Training in TensorFlow

We’re going to use Google Colab to train our machine learning model using the data we collected from the Arduino board in the previous section. Colab provides a Jupyter notebook that allows us to run our TensorFlow training in a web browser.

The Colab will step you through the following:

  • Setup Python Environment
  • Upload the punch.csv and flex.csv data
  • Parse and prepare the data
  • Build & Train the Model
  • Convert the Trained Model to TensorFlow Lite
  • Encode the Model in an Arduino Header File

The final step of the Colab generates the model.h file for you to download and include in our Arduino IDE gesture classifier project in the next section:

Let’s open the notebook in Colab and run through the steps in the cells — arduino_tinyml_workshop.ipynb

Classifying IMU Data

Next, we will use the model.h file we just trained and downloaded from Colab in the previous section in our Arduino IDE project:

  1. Open IMU_Classifier.ino in the Arduino IDE.
  2. Create a new tab in the IDE. When asked, name it model.h
  3. Open the model.h tab and paste in the version you downloaded from Colab
  4. Upload the sketch: Sketch > Upload
  5. Open the Serial Monitor: Tools > Serial Monitor
  6. Perform some gestures
  7. The confidence of each gesture will be printed to the Serial Monitor (0 = low confidence, 1 = high confidence)

For added fun, the Emoji_Button.ino example shows how to create a USB keyboard that prints an emoji character in Linux and macOS. Try combining the Emoji_Button.ino example with the IMU_Classifier.ino sketch to create a gesture controlled emoji keyboard 👊.

Conclusion

It’s an exciting time with a lot to learn and explore in TinyML. We hope this blog has given you some idea of the potential and a starting point for applying it in your own projects. Be sure to let us know what you build and share it with the Arduino community.

For a comprehensive background on TinyML and the example applications in this article, we recommend Pete Warden and Daniel Situnayake’s new O’Reilly book “TinyML: Machine Learning with TensorFlow on Arduino and Ultra-Low Power Microcontrollers”