
My script sometimes receives two identical requests at practically the same time (milliseconds apart) from an external system.

On each incoming request, the script queries the external system, checks whether an entry already exists there, and creates one if it does not.

The problem is that with simultaneous requests the uniqueness check fails, and as a result two records are created.

I tried adding a random sleep, but it didn't help:

$sleep = rand(1, 5); sleep($sleep);


Answers


  1. Chosen as BEST ANSWER

    The solution was to write a lock file named after the ID:

    $tmp_file = __DIR__ . '/tmp/' . $origDealId . '.lock';
    if (file_exists($tmp_file)) {
        // duplicate request
        return null;
    }
    // mark this ID as in progress before doing the work
    touch($tmp_file);
    // do something

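    Note that a `file_exists()` check followed by creating the file is itself not atomic: two requests milliseconds apart can both see "no file" before either writes it. A sketch of an atomic variant using `flock()`, which is released automatically when the handle is closed, so a crashed request cannot leave a stale lock behind. The helper name `withDealLock` and the use of the system temp dir are illustrative assumptions, not part of the original answer:

    ```php
    <?php
    // Hypothetical helper: run $work only if no other request
    // currently holds the lock for this deal ID.
    function withDealLock(string $dealId, callable $work)
    {
        // Assumption: $dealId is safe to use as a filename component.
        $path = sys_get_temp_dir() . '/' . $dealId . '.lock';
        $handle = fopen($path, 'c'); // create if missing, keep contents

        // LOCK_NB makes this non-blocking: a second concurrent
        // request fails immediately instead of waiting.
        if (!flock($handle, LOCK_EX | LOCK_NB)) {
            fclose($handle);
            return null; // duplicate request
        }

        try {
            return $work();
        } finally {
            flock($handle, LOCK_UN);
            fclose($handle);
        }
    }
    ```

    Usage would look like `withDealLock($origDealId, function () { /* do something */ });`.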

  2. I would suggest using a fast caching system, like Memcached or Redis.

    Check whether the system is busy.

    If it is not busy, mark it busy by setting a flag in the cache.

    Process the request.

    Clear the busy flag.

    If another request arrives while processing, it checks the busy flag in Memcached/Redis; if the system is busy, it simply does nothing.

    I'm going to try some pseudo code here:

    function processData($requestData)
    {
        $isSystemBusy = Cache::get('systemBusy');

        if ($isSystemBusy === true) {
            // another request is already being processed
            exit();
        }

        Cache::set('systemBusy', true);

        // do your logic here

        Cache::set('systemBusy', false);
    }
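
    One caveat: the get-then-set above has the same race it tries to prevent — two requests can both read "not busy" before either sets the flag. Cache stores solve this with an atomic add operation that succeeds only for the first caller (`Memcached::add()`, or Redis `SET key value NX`). A sketch of that pattern; the `Cache` class here is a stand-in in-memory implementation, not a real library API:

    ```php
    <?php
    // Stand-in cache with an atomic add(), mimicking the semantics of
    // Memcached::add() or Redis "SET key value NX": only the first
    // caller for a given key succeeds.
    class Cache
    {
        private static array $store = [];

        public static function add(string $key, $value): bool
        {
            if (array_key_exists($key, self::$store)) {
                return false; // someone else already holds the flag
            }
            self::$store[$key] = $value;
            return true;
        }

        public static function delete(string $key): void
        {
            unset(self::$store[$key]);
        }
    }

    function processData($requestData)
    {
        // Atomic test-and-set: succeeds only for the first request.
        if (!Cache::add('systemBusy', true)) {
            return null; // duplicate request, drop it
        }

        try {
            // do your logic here
            return 'processed';
        } finally {
            // always clear the flag, even if the logic throws
            Cache::delete('systemBusy');
        }
    }
    ```

    With a real cache backend, setting an expiry on the flag (e.g. Redis `SET ... NX EX 30`) also guards against a crashed request leaving the flag stuck forever.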
    