
We currently store data in Redis for temporary aggregation and have a worker that bulk-inserts it into Postgres. Is there a way to bulk insert across multiple schemas in a single insert transaction? This would remove the need to aggregate in Redis. Or is there a better way to aggregate the requests?

Thanks for the help in advance.

4 Answers


  1. If you execute a single INSERT statement, it runs as one transaction and can only insert into a single table, so one INSERT statement cannot span multiple schemas. (A transaction, however, can contain several INSERTs into tables in different schemas.)

  2. It really depends on what you mean by "single insert transaction".

    A single INSERT statement can only affect one table. However, you can still BEGIN a transaction, perform all of your INSERTs inside it, and then COMMIT.
    This is still more efficient than running each INSERT in its own transaction, since it avoids redundant round trips and per-transaction overhead.

    https://www.postgresql.org/docs/current/sql-begin.html
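    A minimal sketch of this approach (schema, table, and column names are hypothetical):

    ```sql
    BEGIN;

    -- The two INSERTs target tables in different schemas,
    -- but commit (or roll back) atomically as one transaction.
    INSERT INTO schema_one.table_1 (c1, c2)
    VALUES (1, 2), (10, 20), (30, 40);

    INSERT INTO schema_two.table_2 (col1, col2)
    VALUES (1, 2), (10, 20), (30, 40);

    COMMIT;  -- or ROLLBACK; to undo both inserts together
    ```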

  3. Have you tried creating an updatable view that references the two tables and then bulk inserting into this view?
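    Note that a plain view over two tables is not automatically updatable in Postgres; one possible sketch (all names hypothetical) routes inserts through an INSTEAD OF trigger:

    ```sql
    -- Hypothetical view joining tables from two schemas on a shared id.
    CREATE VIEW combined AS
      SELECT t1.id, t1.c1, t2.c2
      FROM schema_one.table_1 t1
      JOIN schema_two.table_2 t2 ON t1.id = t2.id;

    -- Trigger function that splits each inserted row across both tables.
    CREATE FUNCTION route_insert() RETURNS trigger AS $$
    BEGIN
      INSERT INTO schema_one.table_1 (id, c1) VALUES (NEW.id, NEW.c1);
      INSERT INTO schema_two.table_2 (id, c2) VALUES (NEW.id, NEW.c2);
      RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER combined_insert
      INSTEAD OF INSERT ON combined
      FOR EACH ROW EXECUTE FUNCTION route_insert();
    ```

    A bulk INSERT into `combined` then populates both tables within the one statement.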

  4. Are you looking for something like this?

    with data (c1, c2) as (
      values (1,2),(10,20),(30,40)
    ), s1_insert as (
      insert into schema_one.table_1(c1, c2)
      select c1, c2
      from data
    )
    insert into schema_two.table_2(col1, col2)
    select c1, c2
    from data;
    