WebSockets are awesome.
In the past, using them in a Django app hasn’t been ideal: a separate server performed the legwork, and everyone reinvented the wheel to communicate with that server about what to send out and to whom.
And then Django Channels came along and changed all that.
In this walkthrough, I’ll show how we’re using Channels on the backend and where we plug it into Aurelia to give our users timely updates to the application state.
If you’re not familiar with Channels, you may benefit from reading Channels Concepts first.
We use Redis to handle the messaging between the interface and worker servers, so CHANNEL_LAYERS looks like this:
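Something along these lines, assuming the asgi_redis backend from the Channels 1.x era; the project module name and the REDIS_URL environment variable are placeholders:

```python
import os

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgi_redis.RedisChannelLayer",
        "CONFIG": {
            # REDIS_URL is an assumed environment variable pointing at the
            # Redis instance shared by the interface and worker servers.
            "hosts": [os.environ.get("REDIS_URL", "redis://localhost:6379")],
        },
        # "myproject" stands in for the real project module.
        "ROUTING": "myproject.routing.channel_routing",
    },
}
```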
In Aurelia, we have a WebSocketConnection class that manages the connection with Django.
This ties a connection’s lifespan to logging in and out.
We’re using JWT tokens to authenticate with the backend. This poses a slight problem in Django Channels land: since we’re not using the Django sessions layer, there’s no built-in way of knowing which user a connection belongs to.
The connect() method passes the JWT token as part of the WebSocket path so that the backend can figure out which user the connection belongs to.
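A minimal sketch of that class, with the socket factory injected so the example stays self-contained; the URL and the names are illustrative rather than our exact implementation:

```javascript
class WebSocketConnection {
  // `makeSocket` would be `url => new WebSocket(url)` in the app; injecting
  // it keeps this sketch testable without a network.
  constructor(makeSocket) {
    this.makeSocket = makeSocket;
    this.socket = null;
  }

  // Called on login: pass the JWT in the path/query string so the backend
  // can work out which user the connection belongs to.
  connect(jwtToken) {
    this.socket = this.makeSocket(`wss://example.com/ws/?token=${jwtToken}`);
  }

  // Called on logout: the connection's lifespan matches the session's.
  disconnect() {
    if (this.socket) {
      this.socket.close();
      this.socket = null;
    }
  }
}
```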
The user_from_jwt_token decorator does the legwork of getting the JWT token out of the query_string and associating the user with the channel_session.
We can then use this decorator around the websocket.connect linked method:
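A self-contained sketch of both pieces in the Channels 1.x style. Here decode_jwt is a stand-in for the real JWT library call (e.g. PyJWT’s jwt.decode), and the real consumer would also add the reply channel to a per-user Group:

```python
from functools import wraps
from urllib.parse import parse_qs


def decode_jwt(token):
    """Stand-in for the real call, e.g.
    jwt.decode(token, settings.SECRET_KEY, algorithms=["HS256"]).
    For illustration it just treats the token as the user id."""
    return {"user_id": token}


def user_from_jwt_token(consumer):
    """Pull the JWT out of the connection's query string and record the
    matching user on the channel session before running the consumer."""
    @wraps(consumer)
    def inner(message, *args, **kwargs):
        query = message.content.get("query_string", "")
        token = parse_qs(query).get("token", [None])[0]
        if token is None:
            # No token: refuse the connection.
            message.reply_channel.send({"close": True})
            return
        payload = decode_jwt(token)
        message.channel_session["user_id"] = payload["user_id"]
        return consumer(message, *args, **kwargs)
    return inner


@user_from_jwt_token
def ws_connect(message):
    # The real consumer would add message.reply_channel to a per-user
    # Group here; this sketch just accepts the connection.
    message.reply_channel.send({"accept": True})
```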
The front end handles disconnects by retrying the WebSocket connection at increasing intervals. And to show the user a message about connectivity status, we publish events on both open and close of a WebSocket.
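The retry-and-notify behaviour can be sketched like this; `publish` stands in for Aurelia’s EventAggregator.publish, the event names are illustrative, and the socket factory is injected to keep the example self-contained:

```javascript
class ReconnectingSocket {
  // `makeSocket` would be `url => new WebSocket(url)` in the app.
  constructor(url, publish, makeSocket) {
    this.url = url;
    this.publish = publish;
    this.makeSocket = makeSocket;
    this.delay = 1000;  // first retry after one second
  }

  open() {
    this.socket = this.makeSocket(this.url);
    this.socket.onopen = () => {
      this.delay = 1000;          // reset the back-off on success
      this.publish('ws:open');    // let the UI show "connected"
    };
    this.socket.onclose = () => {
      this.publish('ws:closed');  // let the UI show "reconnecting"
      // Retry at increasing intervals, capped at 30 seconds.
      setTimeout(() => this.open(), this.delay);
      this.delay = Math.min(this.delay * 2, 30000);
    };
  }
}
```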
Extra: Production Configuration
We’re running on Kubernetes, so we have a set of ASGI interface containers behind nginx-ssl-proxy that put requests into Redis.
We have a set of worker containers consuming from Redis and forming responses.
We’ve chosen to use the same docker image for both, but we run with different commands in Kubernetes for each.
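Concretely, the two container commands might look like this (a sketch using the Channels 1.x-era tooling; "myproject" is a placeholder):

```shell
# Interface containers: speak HTTP/WebSockets via ASGI, enqueue into Redis.
daphne myproject.asgi:channel_layer --bind 0.0.0.0 --port 8000

# Worker containers: consume from Redis and run the consumers.
python manage.py runworker
```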