
I am new to knex.js for MySQL queries. I have the following three queries, which take a long time to resolve. Is there a better way to do the following?

await db.transaction((trx) => {
    // One UPDATE per existing product (this is the slow part: thousands of statements)
    const updateQueries = existingProductsArr.map((tuple) => {
        const id = tuple.id;
        delete tuple.id;

        return trx('product')
            .update(tuple)
            .where({ id });
    });

    // Single batched upsert of all findings
    const issueQuery = trx('product_issue')
        .insert(findings)
        .onConflict('id')
        .merge();

    // Single batched update of resolved findings
    const resolvedQuery = trx('product_issue')
        .whereIn(
            ['asin', 'market', 'created_by'],
            resolvedFindings.map((rf) => [rf.asin, rf.market, rf.created_by])
        )
        .update({ is_resolved: 'yes' });

    return Promise.all([...updateQueries, issueQuery, resolvedQuery])
        .then(trx.commit)
        .catch(trx.rollback);
});

All of the arrays in this question can be thousands of elements long.

Note: I’m open to options like in-memory caches such as Redis, or anything else.

2 Answers


  1. (There are times when the platform gets in the way of making efficient use of the underlying database; this might be such a case.)

    First, let me comment on what seems to be multiple threads using a single connection to the database: that will not work. If it is actually multiple connections, each performing one ‘transaction’, then I should point out that making a connection may take longer than the query being performed.

    Bottom line: Have a single connection and do the operations one at a time.

    Please provide a sample of the generated SQL that you are trying to batch.

    Now, let me comment on what SQL can do to "batch" statements.

    It is certainly possible in SQL to insert multiple rows:

    INSERT INTO tbl (col1, col2) VALUES (1,2), (33,44), (9,8);
    

    Inserting 100 rows this way is 10 times as fast as 100 single-row INSERTs. So it is worth doing.

    It is trickier to do a batch UPDATE. One way is to generate a table with the update info, then do a multi-table UPDATE using that table.
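    To make the temp-table approach concrete, here is a sketch. The table and column names are illustrative (borrowed from the question, with an assumed `price` column), not the asker’s actual schema:

    ```sql
    -- Stage the new values in a temporary table (illustrative schema):
    CREATE TEMPORARY TABLE product_updates (
        id INT PRIMARY KEY,
        price DECIMAL(10,2)
    );
    INSERT INTO product_updates (id, price)
    VALUES (1, 9.99), (2, 19.99), (3, 4.50);

    -- One multi-table UPDATE then applies all the changes at once:
    UPDATE product AS p
    JOIN product_updates AS u ON u.id = p.id
    SET p.price = u.price;
    ```

    Two round trips (one multi-row INSERT, one UPDATE) replace thousands of single-row UPDATEs.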

    Another way is to build an IODKU (upsert) that is "batched". (It’s a bit tricky.)
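    A minimal sketch of a batched IODKU, again with illustrative column values; it assumes `id` is the PRIMARY KEY (or a UNIQUE key) of `product_issue`:

    ```sql
    -- Rows whose id already exists are updated; the rest are inserted.
    INSERT INTO product_issue (id, asin, market, created_by)
    VALUES
        (1, 'B000X', 'US', 'importer'),
        (2, 'B000Y', 'DE', 'importer')
    ON DUPLICATE KEY UPDATE
        asin = VALUES(asin),
        market = VALUES(market),
        created_by = VALUES(created_by);
    ```

    (Note: `VALUES()` inside the `UPDATE` clause is deprecated as of MySQL 8.0.20 in favor of row aliases, but still widely used.)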

    Sorry, I cannot translate any of this into Knex. If you need the raw SQL for the batched IODKU, let me know. Please provide SHOW CREATE TABLE for the table in question; I will need some details from it.

  2. I would suggest using a Node.js connector such as MariaDB’s node.js connector, which supports batch operations. For MySQL databases the driver rewrites/optimizes your insert statements; for MariaDB databases it will use the bulk feature, which allows sending inserts/updates/deletes with only one round trip.

    The documentation for MariaDB’s node.js batch also contains a performance comparison between batch and single execution.
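    Roughly, the rewrite turns a batch of single-row statements into one multi-row statement. A sketch of the idea, using a table from the question with made-up values:

    ```sql
    -- What a naive loop sends (one round trip per row):
    INSERT INTO product_issue (asin, market, created_by) VALUES ('B000X', 'US', 'importer');
    INSERT INTO product_issue (asin, market, created_by) VALUES ('B000Y', 'DE', 'importer');

    -- What the connector sends after rewriting (one round trip total):
    INSERT INTO product_issue (asin, market, created_by) VALUES
        ('B000X', 'US', 'importer'),
        ('B000Y', 'DE', 'importer');
    ```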
