My Journey in Converting PyTorch to TensorFlow Lite
Sometimes an MLOps engineer's gotta do what an MLOps engineer's gotta do
I recently had to convert a deep learning model (a MobileNetV2 variant) from PyTorch to TensorFlow Lite. It was a long, complicated journey that involved jumping through a lot of hoops to make it work. I found myself collecting pieces of information from Stack Overflow posts and GitHub issues. My goal is to share my experience in the hope of helping someone else who is as lost as I was.
DISCLAIMER: This is not a guide on how to properly do this conversion; I only wish to share my experience. I might have done it wrong (especially because I have no experience with TensorFlow). If you notice something that I could have done better or differently, please comment and I'll update the post accordingly.
The task: convert a deep learning model (a MobileNetV2 variant) from PyTorch to TensorFlow Lite. The conversion process should be:
PyTorch → ONNX → TensorFlow → TFLite
To test the converted models, a set of roughly 1,000 input tensors was generated, and the PyTorch model's output was calculated for each. That set was later used to test each of the converted models by comparing their outputs against the original outputs, using a mean error metric over the entire set. The mean error reflects how different a converted model's outputs are from the original PyTorch model's outputs for the same inputs.
I decided to treat a model with a mean error smaller than 1e-6 as a successfully converted model.
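The comparison can be sketched with NumPy (the 1,000-tensor set and the models themselves are omitted; `mean_error` here is a simple mean absolute error, which is one reasonable reading of the metric):

```python
import numpy as np

def mean_error(original_outputs, converted_outputs):
    """Mean absolute difference between two lists of output arrays."""
    diffs = [np.abs(a - b).mean() for a, b in zip(original_outputs, converted_outputs)]
    return float(np.mean(diffs))

# Toy stand-ins for the PyTorch outputs and a converted model's outputs.
rng = np.random.default_rng(0)
pytorch_out = [rng.normal(size=(1, 10)) for _ in range(5)]
converted_out = [o + 1e-8 for o in pytorch_out]  # near-identical outputs

err = mean_error(pytorch_out, converted_out)
print(err < 1e-6)  # True: a "successful" conversion under the threshold above
```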
It might also be worth noting that I added the batch dimension to the tensor, even though it was 1. I had no reason to do so other than a hunch from my previous experience converting PyTorch models to DLC.
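Adding that batch dimension is a one-liner; with a NumPy array it looks like the following (for a `torch.Tensor`, `tensor.unsqueeze(0)` does the same thing):

```python
import numpy as np

# A single CHW-style input without a batch dimension.
x = np.zeros((3, 224, 224), dtype=np.float32)

# Prepend a batch dimension of 1, giving shape (1, 3, 224, 224).
x_batched = np.expand_dims(x, axis=0)
print(x_batched.shape)  # (1, 3, 224, 224)
```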