Cayla Erdman

Dockerized 🐳 Flask-Celery-RabbitMQ-Redis Application

This article explains how to configure Flask, Celery, RabbitMQ, and Redis, together with Docker, to build a web service that generates content in the background and loads it once it is ready to be displayed. We'll focus mainly on Celery and the services that surround it; Docker is a bit more straightforward.

Project Structure

The finished project structure will be as follows:

├── Dockerfile
├── docker-compose.yml
├── README.md
├── app
│   ├── app.py
│   ├── tasks.py
│   └── templates
│       ├── download.html
│       └── index.html
├── scripts
│   ├── run_celery.sh
│   └── run_web.sh
└── requirements.txt

Creating the Flask application 🌶

First, we create a folder for our app. For this example, our folder is called app. Within this folder, create an app.py file and an empty folder named templates where our HTML templates will be stored.

For our app, we first include some basic Flask libraries and create an instance of the app:

from io import BytesIO

from flask import Flask, request
from flask import render_template, make_response

import tasks  # the Celery tasks module defined in app/tasks.py

APP = Flask(__name__)

We define three routes for Flask to implement: a landing page, a secondary page that embeds an image, and a route for the image itself. Our image route generates its image dynamically: for this example, it crops an image using Pillow, and some delays are included so that the time taken to create the image is more apparent.

@APP.route('/')
def index():
   return render_template('index.html')


@APP.route('/image_page')
def image_page():
   # Queue the long-running image task, then render the page that waits for it.
   job = tasks.get_data_from_strava.delay()
   return render_template('download.html', jobid=job.id)


@APP.route('/result.png')
def result():
   '''
   Pull our generated .png binary from Redis and return it.
   '''
   jobid = request.values.get('jobid')
   if jobid:
      job = tasks.get_job(jobid)
      png_output = job.get()
      response = make_response(png_output)
      response.headers['Content-Type'] = 'image/png'
      return response
   else:
      return 'No job id given', 404
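
The routes above rely on a tasks module (app/tasks.py in the project structure) that defines the Celery app, the get_data_from_strava task, and a get_job helper. That module is not shown at this point, but a minimal sketch could look like the following; the broker and backend URLs, the pickle result settings, and the placeholder Pillow logic are assumptions for illustration, not the article's actual implementation:

# app/tasks.py -- a hedged sketch; the 'rabbit' and 'redis' hostnames assume
# docker-compose service names, and the task body is only a placeholder.
import time
from io import BytesIO

from celery import Celery
from celery.result import AsyncResult
from PIL import Image

CELERY = Celery('tasks',
                broker='amqp://guest:guest@rabbit:5672//',
                backend='redis://redis:6379/0')

# Allow pickled results so the raw PNG bytes can be stored in Redis.
CELERY.conf.update(result_serializer='pickle',
                   accept_content=['json', 'pickle'])


@CELERY.task
def get_data_from_strava():
    '''Simulate a slow job that produces a cropped PNG image.'''
    time.sleep(5)                                  # artificial delay
    image = Image.new('RGB', (800, 600), 'white')  # stand-in source image
    cropped = image.crop((0, 0, 400, 300))
    buffer = BytesIO()
    cropped.save(buffer, format='PNG')
    return buffer.getvalue()                       # PNG bytes kept in the backend


def get_job(jobid):
    '''Return a handle to a previously queued task by its id.'''
    return AsyncResult(jobid, app=CELERY)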

#rabbitmq #celery #flask #docker #redis

Misael Stark

FastAPI-PostgreSQL-Celery-RabbitMQ-Redis backend with Docker containerization

FastAPI - PostgreSQL - Celery - RabbitMQ backend

This source code implements the following architecture:

[Architecture diagram]

All the required database endpoints are implemented and tested. These include CRUD operations for the dog and user PostgreSQL relations. The asynchronous tasks are queued via one endpoint, and the upload of files to the guane internal test server (an external API) is implemented as another endpoint.

This app also executes HTTP requests to another external endpoint located at https://dog.ceo/api/breeds/image/random, which returns a message with a URL to a random dog picture. The URL is stored in the picture field of the dog relation.
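
As a rough illustration of that call (assuming the httpx client; the helper name and return handling are made up for the example):

import httpx

DOG_API_URL = "https://dog.ceo/api/breeds/image/random"


async def fetch_random_dog_picture() -> str:
    """Call the external dog.ceo endpoint and return the picture URL."""
    async with httpx.AsyncClient() as client:
        response = await client.get(DOG_API_URL)
        response.raise_for_status()
        payload = response.json()  # {"message": "<picture url>", "status": "success"}
        return payload["message"]  # later stored in the dog relation's picture field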

#docker #postgresql #fastapi #rabbitmq #celery #redis #backend

Loma Baumbach

Redis Transactions & Long-Running Lua Scripts

Redis offers two mechanisms for handling transactions: MULTI/EXEC-based transactions and Lua script evaluation. Redis Lua scripting is the recommended approach and is fairly popular in usage.
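
As a rough illustration of the two mechanisms with the redis-py client (the key names and the script itself are made up for the example):

import redis

r = redis.Redis()

# 1. MULTI/EXEC: queued commands are applied as a single atomic batch.
pipe = r.pipeline(transaction=True)
pipe.set("counter", 0)
pipe.incr("counter")
pipe.execute()

# 2. Lua script evaluation: the whole script runs atomically on the server.
INCR_BY = """
local current = redis.call('INCRBY', KEYS[1], ARGV[1])
return current
"""
new_value = r.eval(INCR_BY, 1, "counter", 5)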

Our Redis™ customers who have Lua scripts deployed often report this error – “BUSY Redis is busy running a script. You can only call SCRIPT KILL or SHUTDOWN NOSAVE”. In this post, we will explain the Redis transactional property of scripts, what this error is about, and why we must be extra careful about it on Sentinel-managed systems that can failover.

[Figure: Redis Lua Scripts Diagram - ScaleGrid Blog]

Transactional Nature of Redis Lua Scripts

Redis “transactions” aren’t really transactions as understood conventionally – in case of errors, there is no rollback of writes made by the script.

The "atomicity" of Redis scripts is guaranteed in the following manner:

  • Once a script begins executing, all other commands/scripts are blocked until the script completes. So, other clients either see the changes made by the script or they don’t. This is because they can only execute either before the script or after the script.
  • However, Redis doesn’t do rollbacks, so on an error within a script, any changes already made by the script will be retained and future commands/scripts will see those partial changes.
  • Since all other clients are blocked while the script executes, it is critical that the script is well-behaved and finishes in time.

The ‘lua-time-limit’ Value

It is highly recommended that scripts complete within a time limit. Redis enforces this in a weak manner with the ‘lua-time-limit’ value, which is the maximum time (in milliseconds) that a script is allowed to run. The default value is 5 seconds. This is a really long time for CPU-bound activity (scripts have limited access and can’t run commands that access the disk).
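
The limit can be adjusted at runtime, for example with redis-py (5000 ms here is just the default shown explicitly):

import redis

r = redis.Redis()
# Milliseconds before Redis starts answering other clients with BUSY.
r.config_set("lua-time-limit", 5000)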

However, the script is not killed when it executes beyond this time. Redis starts accepting client commands again, but responds to them with a BUSY error.

If you must kill the script at this point, there are two options available:

  • SCRIPT KILL command can be used to stop a script that hasn’t yet done any writes.
  • If the script has already performed writes to the server and must still be killed, use the SHUTDOWN NOSAVE command to shut down the server completely.

It is usually better to just wait for the script to complete its operation. The complete information on methods to kill the script execution and related behavior are available in the documentation.
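
A hedged sketch of reacting to that situation with redis-py might look like this (in practice, as noted above, waiting for the script to finish is usually the better choice):

import redis

r = redis.Redis()

try:
    r.ping()
except redis.exceptions.ResponseError as exc:
    if "BUSY" in str(exc):
        try:
            # Succeeds only if the running script has not performed any writes.
            r.script_kill()
        except redis.exceptions.ResponseError:
            # The script has already written data; the only remaining option is
            # SHUTDOWN NOSAVE, which stops the server and discards unsaved data.
            pass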

#cloud #database #developer #high availability #howto #redis #scalegrid #lua-time-limit #redis diagram #redis master #redis scripts #redis sentinel #redis servers #redis transactions #sentinel-managed #server failures

Lindsey Koepp

Asynchronous Tasks with Flask and Celery

If a long-running process is part of your application’s workflow, rather than blocking the response, you should handle it in the background, outside the normal request/response flow.

Perhaps your web application requires users to submit a thumbnail (which will probably need to be resized) and confirm their email when they register. If your application processed the image and sent the confirmation email directly in the request handler, then the end user would have to wait unnecessarily for them both to finish before the page loads or updates. Instead, you’ll want to pass these processes off to a task queue and let a separate worker process deal with them, so you can immediately send a response back to the client. The end user can then do other things on the client side while the processing takes place. Your application is also free to respond to requests from other users and clients.
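
A minimal sketch of this hand-off (assuming a Celery app with a Redis broker; the task body and the route are illustrative only, not the tutorial's code):

from celery import Celery
from flask import Flask

app = Flask(__name__)
celery = Celery(__name__, broker="redis://localhost:6379/0")


@celery.task
def send_confirmation_email(address):
    # Imagine the slow SMTP call happening here, in the worker process.
    print(f"Sending confirmation email to {address}")


@app.route("/register/<address>")
def register(address):
    # Enqueue the work and respond immediately instead of blocking the request.
    send_confirmation_email.delay(address)
    return "Registration received", 202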

To achieve this, we’ll walk you through the process of setting up and configuring Celery and Redis for handling long-running processes in a Flask app. We’ll also use Docker and Docker Compose to tie everything together. Finally, we’ll look at how to test the Celery tasks with unit and integration tests.

Objectives

By the end of this tutorial, you will be able to:

  1. Integrate Celery into a Flask app and create tasks.
  2. Containerize Flask, Celery, and Redis with Docker.
  3. Run processes in the background with a separate worker process.
  4. Save Celery logs to a file.
  5. Set up Flower to monitor and administer Celery jobs and workers.
  6. Test a Celery task with both unit and integration tests.

Background Tasks

Again, to improve user experience, long-running processes should be run outside the normal HTTP request/response flow, in a background process.

Examples:

  1. Sending confirmation emails
  2. Scraping and crawling
  3. Analyzing data
  4. Processing images
  5. Generating reports

As you’re building out an app, try to distinguish tasks that should run during the request/response lifecycle, like CRUD operations, from those that should run in the background.

#flask #python #redis #celery #docker

Iliana Welch

Docker Explained: Docker Architecture | Docker Registries

Following the second video about Docker basics, in this video I explain Docker architecture and the different building blocks of the Docker engine: the Docker client, the API, and the Docker daemon. I also explain what a Docker registry is, and I finish the video with a demo explaining and illustrating how to use Docker Hub.

In this video lesson you will learn:

  • What is Docker Host
  • What is Docker Engine
  • Learn about Docker Architecture
  • Learn about Docker client and Docker Daemon
  • Docker Hub and Registries
  • Simple demo to understand using images from registries

#docker #docker hub #docker host #docker engine #docker architecture #api