I have a Django application that uses Celery and RabbitMQ, served with Apache mod_wsgi. Currently it all runs on one server. Each client has their own URL mount under the main domain (a path prefix per client).
Each client has their own database and project directory with local_setting.py for their Django settings.
I’m using supervisord to manage Celery Worker + Celery Beat for each client.
As I get more clients, maintenance becomes more time consuming.
I’ve started playing with Docker to try and simplify deployments, and probably scale across multiple hosts.
Whilst it’s quite easy to set up Docker Compose to run a group of services for one client, I’m trying to figure out the best approach for multiple clients that is easy to manage, e.g. quickly setting up a new client mounted under the main URL.
I’m thinking that the Postgres database instance should be shared to hold each client’s database, much as it is now, and that a shared NGINX instance should handle the HTTP side. For each client, use a Kubernetes Pod consisting of:
- Gunicorn to handle Django
- Celery Beat
- Celery Worker
- A lightweight HTTP server for static files.
So the question is: is this a good approach, or is there a better way of dealing with this?
I’m also wondering if I should go down the route of building an image per client, as that might be easier to manage.
Any advice welcome.
2 Answers
There are three common ways of handling clients (tenants):
1. Isolated: a separate database for each client.
2. Semi-isolated: a shared database with a separate schema for each client.
3. Shared: a single database and schema, with a tenant foreign key on every model.
The django-tenants package uses the 2nd method and gives each client a sub-domain, such as client1.example.com, client2.example.com, etc.
I have used django-tenants, and also the 3rd method, adding a Company model foreign key to every model that I created. django-tenants helps, but it keeps a separate schema per client in Postgres; it has less integration overhead.
Adding the Company foreign key must be implemented in every model and should be handled with middleware, or with mixins if you’re using class-based views.

My suggestion would be to keep one codebase and one server running (or multiple servers of the same Django application without any per-client customization). The main reason is easier maintenance: you do not want to make changes multiple times to provide a feature to multiple clients.
As you already have a Django application, I think it is best to utilize that code to accommodate the approach given above with minimal code changes. Meaning, you need some way to handle multiple clients connecting to multiple databases. What I would suggest is to use a middleware and a database router, like this (code based on this snippet):
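A minimal sketch of what that middleware and router could look like (the thread-local approach, the path-prefix convention, and all names are assumptions, not the exact linked snippet):

```python
import threading

from django.conf import settings

THREAD_LOCAL = threading.local()


class ClientDatabaseMiddleware:
    """Pick the current client's database alias from the URL path prefix."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Assumption: each client is mounted under /<alias>/... and has a
        # matching entry in settings.DATABASES.
        alias = request.path_info.lstrip("/").split("/", 1)[0]
        THREAD_LOCAL.db_alias = alias if alias in settings.DATABASES else "default"
        return self.get_response(request)


class ClientDatabaseRouter:
    """Send every read and write to the database picked by the middleware."""

    def db_for_read(self, model, **hints):
        return getattr(THREAD_LOCAL, "db_alias", "default")

    def db_for_write(self, model, **hints):
        return getattr(THREAD_LOCAL, "db_alias", "default")

    def allow_relation(self, obj1, obj2, **hints):
        return True

    def allow_migrate(self, db, app_label, **hints):
        return True
```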
Then add those to settings.py:
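A sketch of the settings changes, assuming the middleware and router above live in a hypothetical myproject/db_routing.py and each client gets its own database alias:

```python
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "main",
    },
    "client1": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "client1",
    },
    # add one entry per client
}

DATABASE_ROUTERS = ["myproject.db_routing.ClientDatabaseRouter"]

MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # ...the rest of your middleware...
    "myproject.db_routing.ClientDatabaseMiddleware",
]
```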
Finally, update urls.py:
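A sketch of urls.py, assuming every client shares the same app URLs and is mounted under a prefix equal to its database alias ("myapp.urls" is a placeholder for your project's URL module):

```python
from django.conf import settings
from django.urls import include, path

# Build one URL prefix per client database alias, so adding a new client
# only requires adding its database entry in settings.
urlpatterns = [
    path(f"{alias}/", include("myapp.urls"))
    for alias in settings.DATABASES
    if alias != "default"
]
```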
The advantage of this approach is that you do not have to configure anything for new clients: all you need to do is add a new database entry in settings and run the migrations for it as described in the documentation (e.g. python manage.py migrate --database=<alias>). There is no need to configure the reverse proxy server or anything else.
Now, when it comes to handling tasks in Celery, you can specify which database to run the queries against (reference to docs). Here is an example:
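A sketch of a task that is told which client database to use (the task name and model are placeholders):

```python
from celery import shared_task

from myapp.models import Report  # placeholder model


@shared_task
def generate_reports(client_db):
    # .using() pins the queryset to the given database alias,
    # so the same task code works for any client.
    for report in Report.objects.using(client_db).filter(sent=False):
        ...  # do the per-client work here


# Schedule or call it once per client, e.g.:
# generate_reports.delay("client1")
```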