Mckenzie  Osiki

Use pre-trained Hugging Face models in TensorFlow Serving

Scale thousands of community NLP models to production

Hugging Face simplifies NLP to the point that, with a few lines of code, you have a complete pipeline capable of performing tasks from sentiment analysis to text generation. As a hub for pre-trained models, and with its open-source **Transformers** framework, much of the hard work we used to do is now abstracted away. This lets us write applications that solve complex NLP tasks, with the downside that we don't always know what is happening behind the curtain. Despite this remarkable simplification that **Hugging Face** and **Transformers** provide, we might want to skip all that code and simply use one of the many available pre-trained models. In this post, we will learn how to serve one of those pre-trained models with TensorFlow Serving, a popular service for putting machine learning models into production.
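To illustrate the "few lines of code" claim, here is a minimal sketch of a Transformers sentiment-analysis pipeline (the default checkpoint it downloads is chosen by the library, not by us):

```python
from transformers import pipeline

# The pipeline API hides tokenization, model inference, and post-processing.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts, one per input, e.g. {"label": "POSITIVE", "score": 0.99...}
result = classifier("I really enjoyed this post!")
print(result)
```

This convenience is exactly what we lose when we move to TensorFlow Serving, which is why a small export workaround is needed.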

I will use a pre-trained DistilBERT model for sentiment analysis, which predicts whether a given text is positive or negative. Unfortunately, Transformers doesn't provide a direct export path to TensorFlow Serving, so we will have to do a bit of a workaround to achieve our goal. First, we need to install the TensorFlow, Transformers, and NumPy libraries.

#nlp #bert #tensorflow-serving #hugging-face #sentiment-analysis
