In this Python tutorial, you'll learn how to use asynchronous programming in Python to get more done in less time, without waiting.
Asynchronous programming, or async for short, is a feature of many modern languages that allows a program to juggle multiple operations without waiting or getting hung up on any one of them. It’s a smart way to efficiently handle tasks like network or file I/O, where most of the program’s time is spent waiting for a task to finish.
Consider a web scraping application that opens 100 network connections. You could open one connection, wait for the results, then open the next and wait for the results, and so on. Most of the time the program runs is spent waiting on a network response, not doing actual work.
Async gives you a more efficient method: Open all 100 connections at once, then switch among each active connection as they return results. If one connection isn’t returning results, switch to the next one, and so on, until all connections have returned their data.
Async syntax is now a standard feature in Python, but longtime Pythonistas who are used to doing one thing at a time may have trouble wrapping their heads around it. In this article we’ll explore how asynchronous programming works in Python, and how to put it to use.
Note that if you want to use async in Python, it’s best to use Python 3.7 or later (Python 3.8 is the latest version as of this writing). We’ll be using Python’s async syntax and helper functions as defined in those versions of the language.
In general, the best candidates for async are workloads that take a long time to complete and spend most of that time waiting on something external, such as network or file I/O, rather than on the CPU.
Async lets you set up multiple tasks in parallel and iterate through them efficiently, without blocking the rest of your application.
Some examples of tasks that work well with async: web scraping, talking to network services, and reading and writing files, where the program mostly waits on external resources rather than doing computation.
It’s important to note that asynchronous programming is different from multithreading or multiprocessing. Async operations all run in the same thread, but they yield to one another as needed, making async more efficient than threading or multiprocessing for many kinds of tasks. (More on this below.)
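To see this cooperative, single-threaded yielding in action, here is a minimal sketch (the worker coroutine and its names are our own, for illustration) where two tasks run in one thread and hand control back and forth at each await:

```python
import asyncio

log = []

async def worker(name, delay):
    # Each await hands control back to the event loop,
    # which lets the other worker run in the same thread.
    for i in range(3):
        log.append((name, i))
        await asyncio.sleep(delay)

async def main():
    # Both workers run concurrently in a single thread.
    await asyncio.gather(worker("a", 0.01), worker("b", 0.01))

asyncio.run(main())
print(log)  # steps from "a" and "b" interleave rather than running back to back
```

Neither worker blocks the other: every `await` is a point where the event loop may switch tasks, which is exactly the yielding described above.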
Python recently added two keywords, async and await, for creating async operations. Consider this script:
```python
def get_server_status(server_addr):
    # A potentially long-running operation ...
    return server_status

def server_ops():
    results = []
    results.append(get_server_status('addr1.server'))
    results.append(get_server_status('addr2.server'))
    return results
```
An async version of the same script—not functional, just enough to give us an idea of how the syntax works—might look like this.
```python
async def get_server_status(server_addr):
    # A potentially long-running operation ...
    return server_status

async def server_ops():
    results = []
    results.append(await get_server_status('addr1.server'))
    results.append(await get_server_status('addr2.server'))
    return results
```
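The pseudocode above omits the body of the long-running operation. A runnable sketch, with asyncio.sleep() standing in for the slow server call (the returned status string is invented for illustration), might look like this:

```python
import asyncio

async def get_server_status(server_addr):
    # Stand-in for a potentially long-running network operation.
    await asyncio.sleep(0.01)
    return f"{server_addr}: OK"

async def server_ops():
    results = []
    # Note: awaiting each call in turn still runs them one after another;
    # asyncio.gather() (shown later) is how you run them concurrently.
    results.append(await get_server_status('addr1.server'))
    results.append(await get_server_status('addr2.server'))
    return results

statuses = asyncio.run(server_ops())
print(statuses)
```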
Functions prefixed with the async keyword become asynchronous functions, also known as coroutines. Coroutines behave differently from regular functions: they can use the await keyword, calling one does not run its body but instead returns a coroutine object, and that object does no work until it is scheduled on an event loop.
So if we can’t call async functions from non-asynchronous functions, and we can’t run async functions directly, how do we use them? Answer: By using the asyncio library, which bridges async and the rest of Python.
Here is an example (again, not functional but illustrative) of how one might write a web scraping application using async and asyncio. This script takes a list of URLs and uses multiple instances of an async function from an external library (read_from_site_async()) to download them and aggregate the results.
```python
import asyncio
from web_scraping_library import read_from_site_async

async def main(url_list):
    return await asyncio.gather(*[read_from_site_async(_) for _ in url_list])

urls = ['http://site1.com', 'http://othersite.com', 'http://newsite.com']
results = asyncio.run(main(urls))
print(results)
```
In the above example, we use two common asyncio functions:

asyncio.run() is used to launch an async function from the non-asynchronous part of our code, and thus kick off all of the program’s async activities. (This is how we run main().)

asyncio.gather() takes one or more coroutines (in this case, several instances of read_from_site_async() from our hypothetical web-scraping library), runs them all, and waits for all of the results to come in.
The idea here is, we start the read operation for all of the sites at once, then gather the results as they arrive (hence asyncio.gather()). We don’t wait for any one operation to complete before moving on to the next one.
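A self-contained sketch (with asyncio.sleep() standing in for the network read, since read_from_site_async() comes from a hypothetical library) shows that gathered tasks overlap in time: three 0.1-second waits complete in roughly 0.1 seconds total, not 0.3:

```python
import asyncio
import time

async def read_from_site_async(url):
    # Stand-in for a network read that takes about 0.1 seconds.
    await asyncio.sleep(0.1)
    return f"data from {url}"

async def main(url_list):
    # Start all reads at once and wait for every result.
    return await asyncio.gather(*[read_from_site_async(u) for u in url_list])

urls = ['http://site1.com', 'http://othersite.com', 'http://newsite.com']
start = time.monotonic()
results = asyncio.run(main(urls))
elapsed = time.monotonic() - start
print(results)
print(f"elapsed: {elapsed:.2f}s")  # close to 0.1s, because the waits overlap
```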
We’ve already mentioned how Python async apps use coroutines as their main ingredient, drawing on the asyncio library to run them. A few other elements are also key to asynchronous applications in Python:

The event loop: the asyncio library creates and manages event loops, the mechanisms that run coroutines until they complete. Only one event loop should be running at a time in a Python process, if only to make it easier for the programmer to keep track of what goes into it.
Tasks: when you submit a coroutine to an event loop for processing, you can get back a Task object, which provides a way to control the behavior of the coroutine from outside the event loop. If you need to cancel the running task, for instance, you can do that by calling the task’s cancel() method.
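Here is a minimal sketch of cancelling a task from outside; the long_job() coroutine is invented for illustration, but create_task() and cancel() are standard asyncio API:

```python
import asyncio

async def long_job():
    # Pretend this is a job that would otherwise run for a long time.
    await asyncio.sleep(10)
    return "finished"

async def main():
    task = asyncio.create_task(long_job())
    await asyncio.sleep(0.01)   # give the task a chance to start
    task.cancel()               # request cancellation from outside the task
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled"

result = asyncio.run(main())
print(result)  # prints "cancelled"
```

Awaiting a cancelled task raises asyncio.CancelledError, which is how the rest of the program learns the job did not finish.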
Here is a slightly different version of the site-scraper script that shows the event loop and tasks at work:
```python
import asyncio
from web_scraping_library import read_from_site_async

tasks = []

async def main(url_list):
    for n in url_list:
        tasks.append(asyncio.create_task(read_from_site_async(n)))
    print(tasks)
    return await asyncio.gather(*tasks)

urls = ['http://site1.com', 'http://othersite.com', 'http://newsite.com']
loop = asyncio.get_event_loop()
results = loop.run_until_complete(main(urls))
print(results)
```
This script uses the event loop and task objects more explicitly.
How much control you need over the event loop and its tasks will depend on how complex the application is that you’re building. If you just want to submit a set of fixed jobs to run concurrently, as with our web scraper, you won’t need a whole lot of control—just enough to launch jobs and gather the results.
By contrast, if you’re creating a full-blown web framework, you’ll want far more control over the behavior of the coroutines and the event loop. For instance, you may need to shut down the event loop gracefully in the event of an application crash, or run tasks in a threadsafe manner if you’re calling the event loop from another thread.
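For the thread-safety case, asyncio provides run_coroutine_threadsafe() to submit a coroutine to a running event loop from another thread. A minimal sketch (the compute() coroutine is invented for illustration):

```python
import asyncio
import threading

async def compute(x):
    # Stand-in for some async work done on the event loop.
    await asyncio.sleep(0.01)
    return x * 2

def run_loop(loop):
    # Host the event loop in a background thread.
    asyncio.set_event_loop(loop)
    loop.run_forever()

loop = asyncio.new_event_loop()
t = threading.Thread(target=run_loop, args=(loop,), daemon=True)
t.start()

# Submit a coroutine to that loop from the main thread, thread-safely.
future = asyncio.run_coroutine_threadsafe(compute(21), loop)
answer = future.result(timeout=5)
print(answer)

loop.call_soon_threadsafe(loop.stop)  # shut the loop down cleanly
```

run_coroutine_threadsafe() returns a concurrent.futures.Future, so the calling thread can block on .result() without touching the loop directly.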
At this point you may be wondering: why use async instead of threads or multiprocessing, both of which have long been available in Python?
First, there is a key difference between async and threads or multiprocessing, even apart from how those things are implemented in Python. Async is about concurrency, while threads and multiprocessing are about parallelism. Concurrency involves dividing time efficiently among multiple tasks at once—e.g., checking your email while waiting in line at a grocery store register. Parallelism involves multiple agents processing multiple tasks side by side—e.g., having five separate registers open at the grocery store.
Most of the time, async is a good substitute for threading as threading is implemented in Python. This is because Python threads, although they are real OS threads, are constrained by the Global Interpreter Lock (GIL): only one thread can execute Python bytecode at a time in the interpreter. In comparison to threads, async provides some key advantages: tasks switch only at well-defined await points rather than at arbitrary moments, and there is no per-thread overhead or lock contention to manage.
Multiprocessing in Python, on the other hand, is best for jobs that are heavily CPU-bound rather than I/O-bound. Async actually works hand-in-hand with multiprocessing, as you can use the event loop’s run_in_executor() method to delegate CPU-intensive jobs to a process pool from a central process, without blocking that central process.