This article explains how to manage multiple models, and multiple versions of the same model, in TensorFlow Serving using configuration files, and gives a brief overview of batching.


Suppose you have TensorFlow deep learning models with different architectures, or models trained with different hyperparameters, and you would like to test them locally or in production. The easiest way to serve them all is with a Model Server config file.

A Model Server configuration file is written in protocol buffer (protobuf) text format. Protocol buffers are a language-neutral, platform-neutral, extensible way to serialize structured data.
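As a minimal sketch, a config file that serves two models, pinning one of them to two specific versions, might look like the following (the model names and base paths are placeholders you would replace with your own):

```
model_config_list {
  config {
    name: "model_a"                 # hypothetical model name
    base_path: "/models/model_a"    # directory containing versioned subfolders (1/, 2/, ...)
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1                 # serve version 1 and version 2 simultaneously
        versions: 2
      }
    }
  }
  config {
    name: "model_b"                 # hypothetical second model
    base_path: "/models/model_b"
    model_platform: "tensorflow"    # latest version is served by default
  }
}
```

You would then point the model server at this file with the `--model_config_file` flag, for example `tensorflow_model_server --rest_api_port=8501 --model_config_file=/models/models.config`.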


How to Serve Different Model Versions using TensorFlow Serving