
We are using Django with its ORM in connection with an underlying PostgreSQL database and want to extend the data model and technology stack to store massive amounts of time series data (~5 million entries per day onwards).

The closest questions I found were this and this, which propose combining Django with databases such as TimescaleDB or InfluxDB. But this creates parallel structures to Django’s builtin ORM and thus does not seem to be straightforward.

How can we handle large amounts of time series data while preserving or staying really close to Django’s ORM?

Any hints on proven technology stacks and implementation patterns are welcome!

2 Answers


  1. Your best option is to keep your relational data in Postgres and your time series data in a separate database, and combine them when needed in your code.

    With InfluxDB you can do this join with a Flux script by passing it the SQL that Django’s ORM would execute, along with your database connection info. This will return your data in InfluxDB’s format though, not Django models.
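    As a rough sketch of that approach: the Python snippet below composes such a Flux script, using Flux's `sql.from()` to pull the relational rows out of Postgres and joining them with the series points inside InfluxDB. The bucket `telemetry`, measurement `reading`, table `sensors_sensor`, and join key `sensor_id` are illustrative assumptions, not names from the question.

    ```python
    # Sketch: build a Flux script that joins Postgres metadata (via sql.from)
    # with time series data stored in InfluxDB. All identifiers below
    # (bucket, measurement, table, column names) are assumed for illustration.

    def build_flux_join(pg_dsn: str, bucket: str) -> str:
        """Compose a Flux script joining relational rows with series points."""
        sql = "SELECT id AS sensor_id, name FROM sensors_sensor"
        return f'''
    import "sql"

    meta = sql.from(
        driverName: "postgres",
        dataSourceName: "{pg_dsn}",
        query: "{sql}",
    )

    readings = from(bucket: "{bucket}")
        |> range(start: -1h)
        |> filter(fn: (r) => r._measurement == "reading")

    join(tables: {{meta: meta, readings: readings}}, on: ["sensor_id"])
    '''

    flux = build_flux_join("postgresql://user:pass@pg-host:5432/appdb", "telemetry")
    print(flux)
    ```

    The resulting script would then be submitted with an InfluxDB client (e.g. the `influxdb-client` package's query API); as the answer notes, the rows come back in InfluxDB's table format, not as Django model instances.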

  2. Why not use, in parallel to your existing Postgres, a TimescaleDB for the time series data, and use this Django integration for the latter: https://pypi.org/project/django-timescaledb/.

    Using multiple databases in Django is possible, although I have not done it myself so far. Have a look here for a convenient way to do it (rerouting certain models to another database instead of the default Postgres one):
    Using Multiple Databases with Django
