Reuben Deckow

2021-04-18

How to Deploy A Semantic Search Engine with Streamlit and Docker on AWS Elastic Beanstalk

Streamlit is an open-source Python library that makes it easy to create applications for machine learning and data science. With Streamlit, you don’t need to learn Flask or any frontend development; you can focus solely on your application.

Our app will help users search for academic articles. Users will type text queries in a search box and retrieve the most relevant publications and their metadata. They will also be able to choose the number of returned results and filter them by the number of paper citations and the publication year.

Behind the scenes, we will vectorise the search query with a sentence-DistilBERT model and pass it to a pre-built Faiss index for similarity matching. Faiss will measure the L2 distance between the query vector and the indexed paper vectors and return a list of the paper IDs closest to the query.
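At its core, this retrieval step is a nearest-neighbour search under L2 distance. Here is a toy NumPy sketch of that idea — plain vectors and made-up paper IDs standing in for the real sentence-DistilBERT embeddings, and a brute-force distance computation standing in for the Faiss index:

```python
import numpy as np

def top_k_ids(query_vec, index_vecs, paper_ids, k=3):
    # L2 distance between the query and every indexed paper vector
    # (conceptually what a flat L2 Faiss index computes)
    dists = np.linalg.norm(index_vecs - query_vec, axis=1)
    # Indices of the k smallest distances, closest first
    order = np.argsort(dists)[:k]
    return [paper_ids[i] for i in order]

# Toy 2-D "embeddings" standing in for real sentence vectors
papers = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
ids = ["paper_a", "paper_b", "paper_c"]
query = np.array([0.9, 1.1])

print(top_k_ids(query, papers, ids, k=2))  # → ['paper_b', 'paper_a']
```

In the actual app, the embeddings have hundreds of dimensions and Faiss handles the search far more efficiently, but the ranking logic is the same.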

Let’s see what the app will look like and then dive into the code.

#docker

How to Use AWS Elastic Beanstalk to Reduce Risk of Deployment Downtime

You can use AWS Elastic Beanstalk to create and deploy an updated or upgraded application version with Blue/Green deployment using cloned configs.

In this piece, I’ll demonstrate how AWS Elastic Beanstalk can simplify deployments by doing all the hard work for you, with no risk of downtime, by employing a Blue/Green deployment strategy.

Using AWS means combining a large number of tools to complete projects. Personally, I choose to streamline this process by using Elastic Beanstalk, as it enables me and the rest of the dev team to control the AWS resources which power the applications we support and gives us full access to the underlying resources at any time.
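With the EB CLI, a Blue/Green rollout can be sketched in three commands. The environment names below are hypothetical, and you should verify the green environment works before swapping:

```shell
# Clone the live "blue" environment, reusing its saved configuration
eb clone my-app-blue --clone_name my-app-green

# Deploy the new application version to the idle "green" environment
eb deploy my-app-green

# Once green is verified, swap CNAMEs so it receives live traffic
eb swap my-app-blue --destination_name my-app-green
```

If anything goes wrong after the swap, running `eb swap` again points traffic back at the original environment.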

#cloud #aws #elastic beanstalk #aws tools #aws elastic beanstalk

Lindsey Koepp

2020-10-25

Deploying Secure and Scalable Streamlit Apps on AWS with Docker Swarm

Background

Streamlit is a popular open-source framework for creating machine learning and visualisation apps in Python. Although it is fun making your own Streamlit apps, deploying a production-grade app can be quite painful. If you are a data scientist who just wants to get the work done but doesn’t necessarily want to go down the DevOps rabbit hole, this tutorial offers a relatively straightforward deployment solution leveraging Docker Swarm and Traefik, with the option of adding user authentication with Keycloak. The first part of the tutorial is meant to be a gentle introduction to Docker, in which we deploy a dockerised Streamlit demo app locally. The second part is about deploying the app in the cloud. The two parts are relatively standalone, so you can skip the first one if you are already familiar with Docker.

Deploying apps locally with Docker

Step 1: Installing Docker

Simply follow the instructions on this page to download the latest Docker Desktop for Windows from Docker (Mac users please see here; for Linux users, this installation script, which we will use when deploying the apps on AWS instances, is very handy). Docker is a really powerful tool, but I will only cover the basic commands and tools in this post. If you are interested, please check the tutorials online.

Please note the following system requirements:

  • Windows 10 64-bit: Pro, Enterprise, or Education (Build 15063 or later).
  • Hyper-V and Containers Windows features must be enabled.
  • The following hardware prerequisites are required to successfully run Client Hyper-V on Windows 10:
    • 64-bit processor with Second Level Address Translation (SLAT)
    • 4GB system RAM
    • BIOS-level hardware virtualization support enabled in the BIOS settings (for more information, see Virtualization).

If you are using Windows 7 or an older version of macOS, you can try Docker Toolbox instead. It creates a small Linux VM (in VirtualBox) that hosts the Docker Engine for you on your Windows system. If you have a newer OS, chances are you can use its native virtualization and don’t need VirtualBox to run Docker.

Step 2: Building and running the Streamlit GAN demo app

Since this post is about deployment, we won’t be creating Streamlit apps from scratch but using the GAN Face Generator demo app instead. The app calls on TensorFlow to generate photorealistic faces, using Nvidia’s Progressive Growing of GANs and Shaobo Guan’s Transparent Latent-space GAN method for tuning the output face’s characteristics.

First, let’s clone the demo repo.

git clone https://github.com/streamlit/demo-face-gan.git

The demo-face-gan folder contains the data and the trained GAN models (pg_gan and tl_gan), the app script app.py and the requirements.txt. Normally, we would create a virtual environment and install the modules specified in requirements.txt. But let’s do it the Docker way! Docker provides the ability to package and run an application in a loosely isolated environment called a container. A Docker container is nothing but an environment virtualised at run-time, allowing users to isolate applications from the system underpinning them. To spin up containers, we need Docker images, which are immutable files containing the libraries, source code, tools and other files needed to run applications. Let’s start by creating a Docker image for the demo app.

We will need to create a file called Dockerfile. You can think of it as a set of instructions, or a blueprint, for building the image. Copy the Dockerfile below to the demo-face-gan folder. Note that I have included comments in the file to explain each part. This is probably one of the simplest Dockerfiles you will come across. For details of the other options available in Dockerfiles, please see the official documentation.

## Dockerfile to create a Docker image for the demo-face-gan Streamlit app

## Creates a layer from the python:3.7 Docker image
FROM python:3.7

## Copy all the files from the folder containing the Dockerfile to the container root folder
COPY . .

## Install the modules specified in the requirements.txt
RUN pip3 install -r requirements.txt

## The port on which a container listens for connections
EXPOSE 8501

## The command that runs the app
CMD streamlit run app.py
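With the Dockerfile in place, building the image and starting a container is a two-command affair. The image tag below is arbitrary; run the commands from inside the demo-face-gan folder:

```shell
# Build an image from the Dockerfile in the current directory
docker build -t face-gan-demo .

# Start a container, mapping the exposed port 8501 to the host
docker run -p 8501:8501 face-gan-demo
```

Once the container is up, the app should be reachable in your browser at http://localhost:8501.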

#2020 oct tutorials # overviews #aws #deployment #docker #scalability #security #streamlit

Iliana Welch

2020-07-20

Docker Explained: Docker Architecture | Docker Registries

Following the second video about Docker basics, in this video I explain the Docker architecture and the different building blocks of the Docker Engine: the Docker client, the API and the Docker daemon. I also explain what a Docker registry is, and I finish the video with a demo illustrating how to use Docker Hub.

In this video lesson you will learn:

  • What a Docker host is
  • What Docker Engine is
  • Learn about the Docker architecture
  • Learn about the Docker client and the Docker daemon
  • Docker Hub and registries
  • A simple demo of using images from registries
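As a taste of the registry demo, pulling an image from Docker Hub and re-publishing it under your own account takes only a few commands. The username below is hypothetical, and `docker push` requires a prior `docker login`:

```shell
# Download the hello-world image from Docker Hub (the default registry)
docker pull hello-world

# Run a container from the image
docker run hello-world

# Re-tag the image under your own account and push it to your registry
docker tag hello-world myuser/hello-world:v1
docker push myuser/hello-world:v1
```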

#docker #docker hub #docker host #docker engine #docker architecture #api

Rory West

2021-06-04

How to Create an AWS Continuous Deployment Pipeline

Curious how to set up a continuous deployment pipeline with AWS services? Read this blog and get acquainted with AWS CodeBuild, AWS CodePipeline and Elastic Beanstalk.

Creating a continuous deployment pipeline will bring us a step closer to an automated build, test and deploy strategy. In order to create such a pipeline, we need access to several tools. Instead of installing these on on-premise servers, we can make use of the AWS cloud offering. Let’s see how this can be accomplished!

1. Introduction

We want to create an automated pipeline in order to ensure that no manual, error-prone steps are required for building, testing and deploying the application. When a failure occurs during one of these steps, we will be notified automatically and can take the necessary actions to resolve the issue.

What will we be doing in this post?

  • Create a sample Spring Boot application.
  • Push the source code to GitHub.
  • Build the application with AWS CodeBuild.
  • Create an AWS Elastic Beanstalk environment.
  • Create an AWS CodePipeline to glue everything together.
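For the CodeBuild step, a minimal buildspec.yml for a Maven-based Spring Boot application could look like the sketch below. The Java runtime version and artifact path are assumptions and should match your project:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      java: corretto11
  build:
    commands:
      # Build the Spring Boot jar non-interactively
      - mvn -B clean package

artifacts:
  files:
    # The jar that Elastic Beanstalk will deploy
    - target/*.jar
```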

You will need an account at GitHub (which is fairly easy to create) and an account at AWS for this. A step-by-step guide on how to create an AWS account can be found in a previous post, where we explored AWS Elastic Beanstalk.

The sources which we will be using can be found at GitHub.

#aws #continuous deployment #aws codebuild #aws codepipeline #elastic beanstalk