Let's say I have a model Document.
Document
document_id -> int
client_id -> int
I want document_id to be auto-incremented per client_id while avoiding a race condition. I have a unique constraint on (document_id, client_id).
client_id  document_id
1          1
1          2
2          1
1          3
Approach 1: Right now I'm setting the document_id in a before_create callback, but it leads to a race condition and assigns the same value when multiple documents are received for the same client at the same time.
before_create :set_identifier

def set_identifier
  self.document_id = client.documents.maximum(:document_id).next
end
This raises the following error when two documents are created at the same time:
ActiveRecord::RecordNotUnique
PG::UniqueViolation: ERROR: duplicate key value violates unique constraint
Approach 2: I tried using a DB trigger on PostgreSQL. It works fine, but the problem is that the generated value isn't reflected on the returned object; I need to force a reload to see it.
DB Function:
CREATE OR REPLACE FUNCTION set_document_identifier()
RETURNS TRIGGER AS $$
DECLARE
  max_identifier INTEGER;
BEGIN
  SELECT MAX(document_id) INTO max_identifier
  FROM documents
  WHERE client_id = NEW.client_id;

  IF max_identifier IS NULL THEN
    max_identifier := 1;
  END IF;

  NEW.document_id := max_identifier;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;
DB Trigger:
CREATE TRIGGER before_insert_trigger_documents
BEFORE INSERT ON documents
FOR EACH ROW
EXECUTE FUNCTION set_document_identifier();
Rails:
doc = Document.create!(client_id: 1)
doc.document_id        # => nil
doc.reload.document_id # => 1
Is there any other out-of-the-box solution that Rails provides, or a workaround on the trigger side, so that the value is set on the object before the transaction commits?
Answers
You obviously can't have unique incrementing values per client_id and also have multiple users generating their own next value concurrently. That's just how the universe works.

The simplest method to deal with this is (as you discovered) to have a single point where this is done: in the database.
You should also either (a) lock appropriately to prevent concurrent inserts or (b) expect duplicate key errors and handle them.
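For illustration only, here are minimal Rails-side sketches of both options. The Document/client names come from the question; the method names, Client model, and retry count are assumptions:

# Option (a), sketched: serialize inserts per client by locking the parent Client row,
# so the max(document_id) + 1 computed in before_create can't race for that client.
def create_document_with_lock(client_id, attrs)
  Client.transaction do
    client = Client.lock.find(client_id)   # SELECT ... FOR UPDATE
    client.documents.create!(attrs)
  end
end

# Option (b), sketched: keep the optimistic insert and retry when the unique index rejects it.
def create_document_with_retry(client, attrs, attempts: 3)
  client.documents.create!(attrs)
rescue ActiveRecord::RecordNotUnique
  attempts -= 1
  attempts.positive? ? retry : raise
end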
If what you showed is really how your trigger works, you forgot to increment. This would set the first document_id to 1. The next insert for the same client_id finds only that 1, which makes it max(), and that insert will also use that same value of 1. You'd keep getting 1's all the time: demo. If you add the +1, it'll kind of work: demo2.
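For reference, a sketch of the function with that +1 applied (COALESCE also covers a client's first document); as explained next, this alone is still not safe under concurrent inserts:

CREATE OR REPLACE FUNCTION set_document_identifier()
RETURNS TRIGGER AS $$
BEGIN
  -- next value per client; COALESCE handles the first document, +1 does the increment
  SELECT COALESCE(MAX(document_id), 0) + 1
    INTO NEW.document_id
    FROM documents
   WHERE client_id = NEW.client_id;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;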
Even if you fix that, when you insert multiple documents for a single client_id concurrently, they will all find and use the same max(), resulting in two different documents with the same id.

The value is set before the transaction commits, but if, in your app, the returning clause on the insert issued by Rails doesn't include the value you were hoping to generate, you need to tell it to go find it afterwards based on the record's unique id that it does always have.
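On the Rails side, "go find it afterwards" can be the reload you already use, or a targeted query by the primary key; a small sketch (pick needs Rails 6+):

doc = Document.create!(client_id: 1)
doc.reload.document_id                          # re-selects the row, now shows the trigger-assigned value
Document.where(id: doc.id).pick(:document_id)   # or fetch just that one column by primary key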
If you instead use generated, seed-based sequences like in the similar thread I mentioned, it'll work with concurrent inserts too: demo3
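One way to read that idea (my interpretation, not necessarily what demo3 does) is a Postgres sequence per client, created on demand and consumed by the trigger. The sequence name is made up, creating objects from a trigger has its own locking caveats, and sequences can leave gaps on rollback:

CREATE OR REPLACE FUNCTION set_document_identifier()
RETURNS TRIGGER AS $$
BEGIN
  -- create this client's sequence on first use, then draw the next value from it
  EXECUTE format('CREATE SEQUENCE IF NOT EXISTS documents_seq_client_%s', NEW.client_id);
  NEW.document_id := nextval(format('documents_seq_client_%s', NEW.client_id)::regclass);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;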