According to Statista, 25.6 billion microcontroller units were shipped in 2019. There are over 250 billion microcontrollers in the world, and this number is projected to keep growing over the coming years. As a result, deep learning on embedded devices is one of the fastest-growing areas of machine learning. It is popularly known as tiny machine learning (TinyML).

That said, embedded devices pose a couple of challenges, key among them low processing power and limited memory. Machine learning models must therefore fit in just a few kilobytes of memory and perform inference with the limited compute available on embedded systems.

In this piece, we’ll look at TensorFlow Lite Micro (TF Micro), whose aim is to run deep learning models on embedded systems. TF Micro is an open-source ML inference framework developed by researchers from Google and Harvard University. It addresses the resource constraints of running deep learning models on embedded systems.
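
To make this concrete, here is a minimal sketch of what inference with TF Micro looks like in C++. It assumes a model that has already been converted to a .tflite flatbuffer and embedded as a C array named g_model_data (for example, via xxd); the op list, arena size, and int8 input are placeholder assumptions that depend on the actual model, and the exact MicroInterpreter constructor varies slightly across TFLM versions.

```cpp
// Minimal TF Micro inference sketch. g_model_data is an assumed
// placeholder for a .tflite model embedded as a C array.
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];  // assumed embedded model

// Static working memory for all tensors; sized in KB to suit an MCU.
constexpr int kTensorArenaSize = 10 * 1024;
alignas(16) static uint8_t tensor_arena[kTensorArenaSize];

int RunInference() {
  const tflite::Model* model = tflite::GetModel(g_model_data);
  if (model->version() != TFLITE_SCHEMA_VERSION) return -1;

  // Register only the ops this model needs, keeping the binary small.
  static tflite::MicroMutableOpResolver<2> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kTensorArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

  // Fill the input tensor, run the model, and read the result.
  TfLiteTensor* input = interpreter.input(0);
  input->data.int8[0] = 0;  // example input value
  if (interpreter.Invoke() != kTfLiteOk) return -1;

  TfLiteTensor* output = interpreter.output(0);
  return output->data.int8[0];
}
```

Note the design: everything is statically allocated up front (the tensor arena, the interpreter) because many embedded targets have no heap, which is one of the constraints TF Micro is built around.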

Applications of TinyML

TensorFlow Lite Micro: Embedded Machine Learning on TinyML Systems (David et al., arXiv:2010.08678)

The authors of the paper above start by highlighting some applications of TinyML technology, key among them:

  • Wake-word detection — waking up a device with a specific phrase, e.g., ‘OK Google’ (see the detection-loop sketch after this list).
  • Predictive maintenance — analyzing and modeling signals from microphones, accelerometers, and other sensors.
  • Acoustic-anomaly detection.
  • Visual object detection.
  • Human-activity recognition.
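
To illustrate the wake-word case, here is a hedged sketch of a continuous detection loop around an already-initialized TF Micro interpreter (as set up above). CaptureAudioFeatures, OnWakeWord, kWakeWordIndex, and kDetectionThreshold are hypothetical placeholders, not part of the TF Micro API; real projects compute spectrogram features from a microphone and pick a threshold empirically.

```cpp
// Sketch of a continuous wake-word loop. The feature capture and
// wake handler below are hypothetical, platform-specific stubs.
#include <cstdint>
#include <cstring>

#include "tensorflow/lite/micro/micro_interpreter.h"

constexpr int kWakeWordIndex = 1;           // assumed wake-word class index
constexpr int8_t kDetectionThreshold = 90;  // assumed quantized score cutoff

void CaptureAudioFeatures(int8_t* buffer, size_t bytes) {
  // Placeholder: copy the latest audio spectrogram features into buffer.
  std::memset(buffer, 0, bytes);
}

void OnWakeWord() {
  // Placeholder: wake the device, light an LED, etc.
}

void RunWakeWordLoop(tflite::MicroInterpreter& interpreter) {
  TfLiteTensor* input = interpreter.input(0);
  TfLiteTensor* output = interpreter.output(0);
  while (true) {
    // Feed the newest features in, run the model, and check the score.
    CaptureAudioFeatures(input->data.int8, input->bytes);
    if (interpreter.Invoke() != kTfLiteOk) continue;
    if (output->data.int8[kWakeWordIndex] > kDetectionThreshold) {
      OnWakeWord();
    }
  }
}
```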
