
Context: I have an application in Azure that receives information from multiple clients. Sometimes, however, a client sends multiple messages at once, which seems to cause duplicate entries in my database.

The message has an externalId (the id on the client side), a customerId (the id of the client), and a subobject with some values (e.g. height, price, etc.).

Each time the user makes a change to this object or its subobject, a message is sent to our server. Those changes can be: creating, editing, and deleting. But sometimes the user resends the message, which on some occasions seems to create a second record on my side.
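The message shape described above could be modeled roughly like this (property and class names are assumptions based on the description, not the real contract):

```csharp
// Hypothetical shape of the incoming message; names are illustrative only.
public class OrderMessage
{
    public string ExternalId { get; set; }    // id on the client side
    public string CustomerId { get; set; }    // id of the client
    public OrderDetails Details { get; set; } // subobject with some values
}

public class OrderDetails
{
    public decimal Height { get; set; }
    public decimal Price { get; set; }
}
```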

Now, I do perform validations to avoid this:

// Look up the customer that sent the message
var foundCustomer = _dataContext.Customer
                                .SingleOrDefault(x => x.CustomerId == message.customerId);

if (foundCustomer != null)
{
    // Check whether an order with this external id already exists for this customer
    var order = _dataContext.Order
                            .SingleOrDefault(x => x.Customer == foundCustomer
                                               && x.ExternalId == message.externalId);

    if (order == null)
    {
        var orderNew = new Order
        {
            ExternalId = message.externalId,
            Customer = foundCustomer
        };

        await _dataContext.Order.AddAsync(orderNew);
        await _dataContext.SaveChangesAsync();
    }
}

But, somehow, the duplicates still happen, as if the existing order couldn't be found.

Am I missing something?

2 Answers


  1. Chosen as BEST ANSWER

I discovered that my problem is simply the classic race condition: each message triggered the function, creating a separate concurrent instance, which caused the issue.

The solution was to run this section of the code on a dedicated thread, which I did in another class named DataContextHelper, and to take a lock inside that thread, so a second instance of the function cannot execute this section until the lock is released.

    private void orderConsumer(ConsumeContext<Order> context)
    {
        // Some code

        var message = context.Message;

        // Run the duplicate check and insert on a dedicated thread
        Thread orderThread = new Thread(() =>
                        DataContextHelper.includeOrderThread(_dataContext, message));
        orderThread.Start(); // Start thread
        orderThread.Join();  // Wait for the thread to finish

        // Some code
    }
    

    The function in DataContextHelper:

    // Shared object used to serialize access to the section below
    private static readonly object SyncLock = new object();

    public static void includeOrderThread(DataContext _dataContext, Message message)
    {
        lock (SyncLock)
        {
            var foundCustomer = _dataContext.Customer
                                    .SingleOrDefault(x => x.CustomerId == message.customerId);

            if (foundCustomer != null)
            {
                var order = _dataContext.Order
                                .SingleOrDefault(x => x.Customer == foundCustomer
                                                   && x.ExternalId == message.externalId);

                if (order == null)
                {
                    var orderNew = new Order
                    {
                        ExternalId = message.externalId,
                        Customer = foundCustomer
                    };

                    _dataContext.Order.Add(orderNew);
                    _dataContext.SaveChanges();
                }
            }
        }
    }
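
    The race and the effect of the lock can be shown with a minimal, self-contained sketch (no database, just a shared list standing in for the Order table; all names here are illustrative, not the real DataContext):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Many concurrent workers run the same check-then-insert. Without the lock,
// two of them can both observe "not found" and both insert; with the lock,
// the check and the insert execute as one atomic unit.
public static class RaceDemo
{
    private static readonly object SyncLock = new object();
    private static readonly List<string> Orders = new List<string>();

    public static void IncludeOrder(string externalId)
    {
        lock (SyncLock)
        {
            if (!Orders.Contains(externalId)) // check...
                Orders.Add(externalId);       // ...then insert, atomically
        }
    }

    public static int Run()
    {
        var tasks = new Task[20];
        for (int i = 0; i < tasks.Length; i++)
            tasks[i] = Task.Run(() => IncludeOrder("order-42")); // same message resent
        Task.WaitAll(tasks);
        return Orders.Count; // always 1 while the lock is in place
    }
}
```

    Note that a lock like this only serializes threads within a single process; if the service runs as multiple instances, duplicates can still slip through between them.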
    

  2. You should write the incoming orders into a separate table and process them with an additional job. This way, you can also check whether the externalId has already been processed and discard the message as a duplicate.
    This also helps when you want to shut down the order context, because the service can still accept orders from the outside.
    It is indeed a bit more effort, but from my experience in a large e-commerce company, I can say that this gives you the best control.
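
    A minimal sketch of that staging-table idea in EF Core might look like this (entity, context, and property names are assumptions; the key point is the unique index, which makes the database itself reject a resent message even across multiple app instances):

```csharp
using Microsoft.EntityFrameworkCore;

// Staging table for raw incoming messages; a separate job reads unprocessed
// rows and creates the real orders.
public class IncomingOrder
{
    public int Id { get; set; }
    public string CustomerId { get; set; }
    public string ExternalId { get; set; }
    public bool Processed { get; set; }
}

public class StagingContext : DbContext
{
    public DbSet<IncomingOrder> IncomingOrders { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // The unique index lets the database enforce "one row per
        // (customer, external id)" regardless of how many consumers run.
        modelBuilder.Entity<IncomingOrder>()
            .HasIndex(o => new { o.CustomerId, o.ExternalId })
            .IsUnique();
    }
}
```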
