Singleton patterns in Python’s aiohttp

aiohttp is a web server framework built on Python 3’s async capabilities. The performance benefits of using async are widely documented online, but suffice it to say that you can get performance from Python on par with Node.js!

If you’re coming from a Java (or similar) background, one of the questions you’ll probably come up against is how to create a “singleton” object to, for example, share a database or Redis connection pool among requests. Creating connections to data stores is usually an expensive operation, so ideally a connection pool is created once and shared among requests.

Creating a Redis connection pool singleton

It’s important to note when building your async applications that (to paraphrase) “it’s async all the way down”. In other words, if you use a run-of-the-mill synchronous version of the Python Redis library, you’ll lose the async benefits, because every blocking call stalls the event loop. So you need to ensure that you use async versions of Redis libraries. Luckily, there are some excellent Python async libraries for popular datastores in the aio-libs organisation, including one for Redis: aioredis.
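To see why this matters, here is a self-contained toy sketch. It uses time.sleep and asyncio.sleep as stand-ins for a blocking vs. an awaited Redis call (the handler names are illustrative, not part of any library):

```python
import asyncio
import time

async def blocking_handler():
    time.sleep(0.05)  # a sync Redis call behaves like this: the whole event loop stalls
    return 'done'

async def async_handler():
    await asyncio.sleep(0.05)  # an awaited call yields control; other handlers keep running
    return 'done'

async def main():
    # ten concurrent "requests" each way: awaited I/O overlaps, blocking I/O queues up
    start = time.monotonic()
    await asyncio.gather(*(async_handler() for _ in range(10)))
    overlapped = time.monotonic() - start  # roughly one sleep's worth of time

    start = time.monotonic()
    await asyncio.gather(*(blocking_handler() for _ in range(10)))
    serialized = time.monotonic() - start  # roughly ten sleeps' worth of time
    return overlapped, serialized

overlapped, serialized = asyncio.run(main())
print(f'awaited: {overlapped:.2f}s, blocking: {serialized:.2f}s')
```

One blocking call per handler is enough to serialize the whole batch, which is exactly what a synchronous Redis client would do to your server.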

Creating a Redis pool is easy:

redis = await aioredis.create_redis_pool('redis://localhost')

Now we’re faced with two questions:

  1. The “await” keyword can only be used within an “async” function, so where exactly do we place the command we just wrote?
  2. Where do we store the “redis” variable to make it globally accessible as a singleton throughout our application?

Singleton Redis pool on startup

Let’s address (1). aiohttp usually handles creating an event loop for us, as soon as we run aiohttp.web.run_app(…). As we just pointed out, we need all “await” expressions to be contained within an “async” function. To tell aiohttp to run our async function on startup, we leverage aiohttp web signals. Web signals allow us to define async functions that run on startup, on shutdown, or even for every request. So our async function to create the Redis pool looks like this:

import aioredis

async def createRedisPool():
    redis = await aioredis.create_redis_pool('redis://localhost')

Next, we need to store the Redis pool at an “application” level, rather than at a request level. To accomplish this, we leverage the “Application” object provided by aiohttp. As the documentation states:

Application is a dict-like object, so you can use it for sharing data globally by storing arbitrary properties for later access from a handler via the Request.app property

https://docs.aiohttp.org/en/stable/web_reference.html#application

Conveniently, the application object is passed to any function that acts as a signal handler. So we can modify the function above to read:

import aioredis

async def createRedisPool(app):
    app['redis'] = await aioredis.create_redis_pool('redis://localhost')

Wiring it up and accessing the Redis singleton

Our signal handler is ready. We just need to wire it up to aiohttp via an “on startup” signal:

import asyncio

from aiohttp import web
import uvloop

# use uvloop's faster event loop (optional)
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())

app = web.Application()
app.on_startup.append(createRedisPool)
web.run_app(app)

At this point, whenever you start up your web application, the Redis pool is stored under the “redis” key of the Application object – a true singleton.

How do we access this application object when a new request comes in? aiohttp makes the application object available as a property of the “Request” object, so accessing our newly minted singleton is as easy as:

routes = web.RouteTableDef()

@routes.get('/getRedisKeys')
async def handler(request):
    keys = await request.app['redis'].keys('*')  # or whatever...
    return web.json_response([k.decode() for k in keys])
