Deploying a machine learning model to a mobile device is challenging: storage and processing power are limited, and models often suffer from large file sizes and high inference latency on such hardware.

However, there are techniques to reduce model size and improve performance so that models fit and run well on mobile (see the links below for more on these techniques). Despite these challenges, ML models are already being deployed to mobile devices.

In this article, we're going to discuss how to implement a housing price prediction machine learning model for mobile using TensorFlow Lite. We'll learn how to train a neural network for regression, one that predicts a continuous value (here, a housing price), and run it with TensorFlow Lite.
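As a starting point, a regression network for housing prices can be a small stack of dense layers ending in a single linear output. The sketch below is illustrative, not the article's exact architecture: the feature count (8) and layer sizes are assumptions.

```python
import tensorflow as tf

def build_model(num_features: int = 8) -> tf.keras.Model:
    """Sketch of a small regression network for tabular housing data.

    The feature count and layer widths are illustrative assumptions.
    """
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(num_features,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        # Single linear unit: outputs one continuous value (the price).
        tf.keras.layers.Dense(1),
    ])
    # Mean squared error is the usual loss for regression.
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

model = build_model()
```

Because the last layer has no activation, the network can output any real number, which is what a continuous price prediction requires.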

TensorFlow Lite is an open source deep learning framework for mobile device inference. It is essentially a set of tools to help us run TensorFlow models on mobile, embedded, and IoT devices. TensorFlow Lite enables on-device machine learning inference with low latency and a small binary size.

There are two main components of TensorFlow Lite:

  • TensorFlow Lite interpreter: The interpreter runs optimized Lite models on many different hardware types, including mobile phones, embedded devices, and microcontrollers.
  • TensorFlow Lite converter: The converter transforms TensorFlow models into an efficient form for use by the interpreter, and can apply optimizations to improve binary size and performance.
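The two components above fit together in a short pipeline: the converter turns a trained TensorFlow model into a FlatBuffer, and the interpreter runs it the same way it would on-device. A minimal sketch, assuming a small untrained Keras model as a stand-in (the layer sizes and 8-feature input are assumptions for illustration):

```python
import numpy as np
import tensorflow as tf

# Stand-in Keras model; in practice this would be the trained housing model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Converter: produce an efficient FlatBuffer for the interpreter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Optional: default optimizations (e.g. dynamic-range quantization)
# to shrink the binary and speed up inference.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Interpreter: load the converted model and run inference on it.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.random.rand(1, 8).astype(np.float32)  # one fake feature row
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
```

On Android or iOS the same `.tflite` bytes would be loaded by the platform's TensorFlow Lite runtime instead of the Python `Interpreter`, but the invoke/get-tensor flow is the same.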


TensorFlow Lite Model for On-Device Housing Price Predictions