Assumed background knowledge

This article assumes the reader has familiarity with Python, Flask, Celery, and AWS SQS.

Introduction

The fundamental thing to grasp when building a Flask app that uses Celery for asynchronous task management is that there are really three parts to consider, outside of the queue and result backends. These are (1) the Flask instance, which is your web or micro-service frontend, (2) the Celery instance, which feeds tasks to the queue, and (3) the Celery worker, which pulls tasks off the queue and completes the work. The Flask and Celery instances are deployed together and work in tandem at the interface of the application. The Celery worker is deployed separately and works effectively independently of the instances.
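To make the three parts concrete, here is a minimal sketch of how they relate. The module name, queue transport, and route are illustrative assumptions, not taken from the article; in particular, the `sqs://` broker URL assumes AWS credentials are supplied via the environment.

```python
# tasks.py — a minimal sketch of the three parts (names are hypothetical)
from celery import Celery
from flask import Flask

# (1) The Flask instance: the web front end that accepts requests.
flask_app = Flask(__name__)

# (2) The Celery instance: used by the Flask routes to enqueue tasks onto SQS.
celery_app = Celery("tasks", broker="sqs://")  # credentials come from the environment

@celery_app.task
def add(x, y):
    # (3) The Celery worker executes this function after pulling the task off the queue.
    return x + y

@flask_app.route("/add/<int:x>/<int:y>")
def enqueue_add(x, y):
    # The route only enqueues the work and returns immediately.
    result = add.delay(x, y)
    return {"task_id": result.id}
```

The worker runs as a separate process, for example with `celery -A tasks.celery_app worker --loglevel=info`, which is what makes it effectively independent of the Flask and Celery instances serving requests.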

At first glance, setting up an application with these three components appears simple. The complication arises when you implement the Flask and Celery instances using the Flask application factory pattern, because that approach can cause a circular import issue: the app factory needs the Celery instance, and the task modules need both, so importing them from one another creates a cycle.
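One common workaround, sketched below under assumed module names (`extensions.py` and `app.py` are illustrative, not from the article), is to define the Celery instance in its own module and let the app factory configure it. That way task modules and the factory both import from `extensions.py` instead of from each other.

```python
# extensions.py — the Celery instance lives in its own module so the app
# factory and the task modules can import it without importing each other.
from celery import Celery

celery = Celery(__name__)
```

```python
# app.py — the application factory configures the shared Celery instance
# and wraps task execution in the Flask application context.
from flask import Flask
from extensions import celery

def create_app():
    app = Flask(__name__)
    app.config["CELERY_BROKER_URL"] = "sqs://"  # assumed SQS transport URL

    # Push the Flask configuration into the already-created Celery instance.
    celery.conf.broker_url = app.config["CELERY_BROKER_URL"]

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # Run every task inside the Flask application context so tasks
            # can use extensions that depend on it (e.g. a database session).
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return app
```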

The objectives of this article are to:

  1. clarify how to initialize these three parts of the Flask+Celery service
  2. explain how to containerize the pieces of these services for deployment to a production environment
  3. point out some notes on best practices

#flask #task-management #python #celery #sqs
