An Introduction to Python AsyncIO

Write code that looks and feels synchronous, but runs asynchronously.

What is AsyncIO?

AsyncIO is Python's standard library for writing concurrent network IO code using async/await syntax. It lets you write code that looks and feels synchronous, but runs asynchronously. Simply put: no callbacks! It was first introduced in Python 3.4 and has received performance and API improvements in every Python release since.

Bonus: TypeScript and ES2017 provide async/await syntax. If you’re familiar with using async/await in the JavaScript world, you’ll feel right at home with Python’s library.

Python's AsyncIO is implemented with an event loop, very similar to how Node.js runs. Additionally, the event loop is pluggable. For example, you can drop in uvloop support: https://github.com/MagicStack/uvloop.
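To see what "pluggable" means in practice, here's a small sketch using only the standard library. It reports which event loop implementation is actually running; with uvloop installed, calling `uvloop.install()` before `asyncio.run()` would swap in uvloop's loop while the rest of your code stays unchanged:

```python
import asyncio


async def which_loop():
    # Report the class name of the event loop implementation in use.
    return type(asyncio.get_running_loop()).__name__


# Under the default policy this is CPython's built-in loop
# (e.g. _UnixSelectorEventLoop on Unix). After uvloop.install(),
# it would report uvloop's Loop class instead.
loop_name = asyncio.run(which_loop())
print(loop_name)
```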

Why should you use AsyncIO?

Most web servers are network IO heavy. They connect to databases (PostgreSQL, MySQL, MongoDB), caching servers (Redis, memcached), OAuth servers, and cloud file storage, all of which is network IO.

Moreover, the web server itself is network IO. In fact, features like websockets are only practical in Python with web servers that do non-blocking network IO.

Traditional web application servers (e.g. Pyramid, Django, Plone) use the threaded model. Each user request gets a Python thread associated with it in order to service the request. Once the request is finished, the thread is free to service other requests. This means you can only service as many simultaneous requests as you have available threads. Typically, a Python web server process is deployed with two threads. In other words, the rate of scaling threaded-model Python web applications is: [number of Python processes] X 2 = total number of simultaneous requests.

To make matters worse, if you write network-bound code with threaded web application servers, you are potentially blocking the servicing of requests while waiting on network traffic. So imagine that OAuth service stops responding to requests: now your web application is effectively in a denial of service.

Primer

(All examples require Python 3.7 or newer.)

Okay, let’s get started with a simple example:


import asyncio




async def hello1():

  await asyncio.sleep(0.5)

  print('hi 1')




async def hello2():

  print('hi 2')




async def main():

  task1 = asyncio.create_task(hello1())

  task2 = asyncio.create_task(hello2())


  await task1

  await task2




if __name__ == '__main__':

  asyncio.run(main())

 

In this example, the output will be:

hi 2

hi 1


Even though the “hello1” task is created first, the “await asyncio.sleep(0.5)” causes it to finish execution after the “hello2” task. It’s a simple way to demonstrate how AsyncIO tasks run “simultaneously” on the event loop while the code looks sequential.
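The concurrency from `asyncio.create_task` becomes concrete if you time it. Here's a small sketch (names like `work` are illustrative, not from the article) comparing awaiting coroutines directly, which runs them one after another, against wrapping them in tasks, which lets both sleeps overlap:

```python
import asyncio
import time


async def work(name, delay):
    await asyncio.sleep(delay)
    return name


async def sequential():
    # Awaiting coroutines directly runs them back to back: ~0.4s total.
    await work('a', 0.2)
    await work('b', 0.2)


async def concurrent():
    # Tasks start running immediately, so both sleeps overlap: ~0.2s total.
    t1 = asyncio.create_task(work('a', 0.2))
    t2 = asyncio.create_task(work('b', 0.2))
    await t1
    await t2


start = time.perf_counter()
asyncio.run(sequential())
seq = time.perf_counter() - start

start = time.perf_counter()
asyncio.run(concurrent())
con = time.perf_counter() - start

print(f'sequential: {seq:.2f}s, concurrent: {con:.2f}s')
```

The total for the task-based version is roughly the longest delay, not the sum of the delays.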

“Multi” Processing

Another simple use-case for AsyncIO is scraping content from the web. With typical sequential programming, you might have to do something tricky with threads in order to retrieve content from multiple HTTP resources at the same time.

This example will use the “aiohttp” Python library. Make sure to install it with pip before you try it out:

pip install aiohttp


And finally, our example:


import aiohttp
import asyncio


async def download_url(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            text = await resp.text()
            print(f'Downloaded {url}, size {len(text)}')


async def main():
    await asyncio.gather(
        download_url('https://www.facebook.com'),
        download_url('https://www.twitter.com'),
        download_url('https://www.stackoverflow.com'),
        download_url('https://www.google.com')
    )


if __name__ == '__main__':
    asyncio.run(main())


In this example, we are using “asyncio.gather” to run multiple coroutines at the same time and wait for them all to finish. You’ll notice as it runs that the output prints in the order the resources finish loading:

Downloaded https://www.twitter.com, size 190854

Downloaded https://www.stackoverflow.com, size 259072

Downloaded https://www.facebook.com, size 659243

Downloaded https://www.google.com, size 11278
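One detail worth knowing about “asyncio.gather”: it returns the coroutines’ results as a list in the order they were passed in, not the order they finished. A minimal sketch (the `delayed` helper is made up for illustration):

```python
import asyncio


async def delayed(value, delay):
    await asyncio.sleep(delay)
    return value


async def main():
    # 'slow' finishes last, but gather preserves argument order.
    return await asyncio.gather(
        delayed('slow', 0.2),
        delayed('fast', 0.0),
    )


results = asyncio.run(main())
print(results)
```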

Subprocess

Another great feature of the AsyncIO library is its subprocess functions, which allow you to run subprocess commands asynchronously.


import asyncio


async def run_cmd(cmd):
    print(f'Executing: {" ".join(cmd)}')
    process = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE)
    out, error = await process.communicate()
    print(f'Done with {" ".join(cmd)}: {out.decode("utf8")}')


async def main():
    await asyncio.gather(
        run_cmd(['sleep', '1']),
        run_cmd(['echo', 'hello'])
    )


if __name__ == '__main__':
    asyncio.run(main())


This example uses the “asyncio.gather” function again to show that both subprocess commands run at the same time, waiting until both are finished:

Executing: sleep 1

Executing: echo hello

Done with echo hello: hello

Done with sleep 1:
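In real scripts you will usually also want the exit status and stderr of the command. Here's a sketch extending the example above (`run_and_check` is a made-up helper name), using the same `asyncio.create_subprocess_exec` API:

```python
import asyncio


async def run_and_check(cmd):
    # Capture both stdout and stderr, then inspect the exit status.
    process = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    out, err = await process.communicate()
    return process.returncode, out.decode()


code, out = asyncio.run(run_and_check(['echo', 'ok']))
print(code, out.strip())
```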

AsyncIO Features

The AsyncIO library provides many more features that give you tools to solve network IO-bound problems in easier ways.

Additional AsyncIO features to check out:

  • Async generators
  • Executors: run CPU-bound code in an AsyncIO-managed thread
  • Queues
  • Task management
  • Scheduling
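As a taste of the executors bullet above, here's a small sketch (the `cpu_bound` function is illustrative): a plain synchronous function would block the event loop if called directly, so you hand it to an executor via `loop.run_in_executor` and await the result:

```python
import asyncio
import math


def cpu_bound(n):
    # Plain synchronous, CPU-heavy function; running it directly in a
    # coroutine would block the event loop for its whole duration.
    return sum(math.sqrt(i) for i in range(n))


async def main():
    loop = asyncio.get_running_loop()
    # None means "use the default ThreadPoolExecutor"; the event loop
    # stays free to run other tasks while the work happens in a thread.
    return await loop.run_in_executor(None, cpu_bound, 10_000)


result = asyncio.run(main())
print(f'{result:.1f}')
```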

Helpful AsyncIO packages:

  • aiohttp: robust HTTP client and server
  • uvloop: event loop implementation built on libuv
  • asyncpg: high-performance PostgreSQL adapter
  • aioredis: Redis client library
  • aiobotocore: asyncio-powered boto (AWS) library
  • aioconsole: an asyncio-aware interactive Python console
  • aiomonitor: monitor running AsyncIO tasks

 

To learn more about our web development services, 

email [email protected], or call (715) 869-3440.