Deep learning and machine learning models trained by many data professionals end up either in an inference.ipynb notebook or an app.py file 😅. Meticulously designed model architectures capable of creating awe in the real world never see the light of day. They just sit in the background, silently processing requests behind an API gateway and quietly making the system more intelligent.
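
To make the app.py pattern concrete, here is a minimal sketch of what such a file often looks like with Streamlit. It assumes a PyTorch model saved to "model.pt" and a single numeric input; the framework, file name, and input shape are all illustrative assumptions, not details from this post.

```python
# Minimal app.py sketch: serve a saved PyTorch model behind a Streamlit UI.
# "model.pt" and the scalar input are placeholder assumptions.
import streamlit as st
import torch

@st.cache_resource  # load the model once; Streamlit reruns this script on every interaction
def load_model():
    model = torch.load("model.pt", map_location="cpu", weights_only=False)
    model.eval()
    return model

st.title("Model Inference Demo")
value = st.number_input("Input feature value", value=0.0)
if st.button("Predict"):
    model = load_model()
    with torch.no_grad():
        # Real preprocessing depends on your model; a single scalar is a placeholder.
        prediction = model(torch.tensor([[value]], dtype=torch.float32))
    st.write(f"Prediction: {prediction.squeeze().item():.4f}")
```

Running `streamlit run app.py` serves this as an interactive web page, which is the deployment path the title of this article points at: Streamlit for the interface, Heroku for hosting.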

People using those intelligent systems don’t always credit the data professionals who spent hours, weeks, or months collecting data, cleaning it, formatting it so it could be used correctly, writing the model architecture, training it, and validating it, and who, whenever the validation metrics weren’t good enough, went back to square one and repeated the whole cycle. Those people don’t see the struggle of not being able to feed a 4-channel image into a pretrained ResNet to extract image features (a sketch of one fix follows below). They don’t appreciate that it took days and nights of tuning training parameters to make the model converge to a better accuracy or a lower loss. They never witness the toil of choosing the best combination of layers to predict the final output, or the agonizing feeling when the validation scores come out poor simply because the model weights weren’t loaded correctly or a different seed value was used at the start.
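
Taking just one of those struggles, the 4-channel image on a pretrained ResNet: torchvision’s ResNets expect 3 input channels, so a common fix is to replace the first convolution and reuse the pretrained RGB filters. A minimal sketch, assuming torchvision; the choice to initialize the extra channel from the mean of the RGB weights is one common convention, not something prescribed in this post.

```python
# Adapt a pretrained ResNet to 4-channel input by replacing its first conv.
# The fourth channel's filters are initialized from the mean of the
# pretrained RGB filters (an assumption; other init schemes work too).
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

old_conv = model.conv1  # Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
new_conv = nn.Conv2d(4, old_conv.out_channels,
                     kernel_size=old_conv.kernel_size,
                     stride=old_conv.stride,
                     padding=old_conv.padding,
                     bias=False)
with torch.no_grad():
    new_conv.weight[:, :3] = old_conv.weight                             # reuse RGB filters
    new_conv.weight[:, 3:] = old_conv.weight.mean(dim=1, keepdim=True)   # init 4th channel
model.conv1 = new_conv

features = model(torch.randn(1, 4, 224, 224))  # now accepts 4-channel images
```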

#streamlit #deep-learning

Deploy Deep Learning Models Using Streamlit and Heroku