I have more than 2000 users in my database. When I try to broadcast a message to all of them, it only sends about 200 requests before my server stops and I get an error like the one below:
{ Error: connect ETIMEDOUT 31.13.88.4:443
    at Object.exports._errnoException (util.js:1026:11)
    at exports._exceptionWithHostPort (util.js:1049:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1090:14)
  code: 'ETIMEDOUT',
  errno: 'ETIMEDOUT',
  syscall: 'connect',
  address: '31.13.88.4',
  port: 443 }
Sometimes I get another error that says:
Error!: Error: socket hang up
This is my request:
var request = require('request');

function callSendAPI(messageData) {
  request({
    uri: 'https://graph.facebook.com/v2.6/me/messages',
    qs: { access_token: '#####' },
    method: 'POST',
    json: messageData
  }, function (error, response, body) {
    if (!error && response.statusCode == 200) {
      var recipientId = body.recipient_id;
      var messageId = body.message_id;

      if (messageId) {
        console.log("Successfully sent message with id %s to recipient %s",
          messageId, recipientId);
      } else {
        console.log("Successfully called Send API for recipient %s",
          recipientId);
      }
    } else {
      console.error("Failed calling Send API");
      console.log(error);
    }
  });
}
I have tried setTimeout to make the API call wait for a while:

setTimeout(function () { callSendAPI(data) }, 200);
Has anyone faced a similar error and can help?
EDITED
I’m using the Messenger Platform, which supports a high rate of calls to the Send API and is not limited to 200 calls.
2 Answers
It sounds like you are hitting a rate limit.
From the Facebook documentation:
You can check the dashboard to see if you are hitting rate limiting in these cases.
You may be hitting Facebook API limits. To throttle the requests you should send every request some interval after the previous one. You didn't include where you iterate over all the users, but I suspect you do it in a loop, and if you use setTimeout to delay every request by a flat 200ms then all the requests still start at the same time as before, just 200ms later. What you can do instead is (rough sketches of options 2 and 3 follow below):

1. setTimeout with a different, increasing delay for every request
2. series or parallelLimit (using callbacks)
3. Promise.mapSeries or Promise.map with a concurrency limit (using promises)

Option 1 is not recommended because it is still fire-and-forget (unless you add more complexity to it) and you still risk having too much concurrency and going over the limit, because you only control when the requests start, not how many outstanding requests there are.
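A minimal sketch of option 2 with callbacks, assuming the async npm module and a version of callSendAPI changed to take its own callback; users and messageDataFor(user) are hypothetical placeholders, not something from your code:

// Option 2 (callbacks), assuming the 'async' npm module.
// 'users' and 'messageDataFor(user)' are hypothetical placeholders.
var async = require('async');
var request = require('request');

function callSendAPI(messageData, callback) {
  request({
    uri: 'https://graph.facebook.com/v2.6/me/messages',
    qs: { access_token: '#####' },
    method: 'POST',
    json: messageData
  }, function (error, response, body) {
    if (error || response.statusCode !== 200) {
      return callback(error || new Error('Send API returned ' + response.statusCode));
    }
    callback(null, body);
  });
}

// One task per user; parallelLimit keeps at most 5 requests in flight at a time.
var tasks = users.map(function (user) {
  return function (callback) {
    callSendAPI(messageDataFor(user), callback);
  };
});

async.parallelLimit(tasks, 5, function (err, results) {
  if (err) {
    console.error('Broadcast failed:', err);
  } else {
    console.log('Sent %d messages', results.length);
  }
});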
Options 2 and 3 are mostly the same but differ in using callbacks or promises. In your example you're using callbacks, but your callSendAPI doesn't take its own callback, which it should if you want option 2 to work (as in the sketch above); alternatively, it should return a promise if you want option 3 to work. For more info see the docs for those functions.
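A similar sketch of option 3 with promises, assuming the Bluebird module for Promise.map with a concurrency option, and callSendAPI rewritten to return a promise; again, users and messageDataFor are placeholders:

// Option 3 (promises), assuming the 'bluebird' npm module.
var Promise = require('bluebird');
var request = require('request');

// callSendAPI rewritten to return a promise instead of logging in the callback.
function callSendAPI(messageData) {
  return new Promise(function (resolve, reject) {
    request({
      uri: 'https://graph.facebook.com/v2.6/me/messages',
      qs: { access_token: '#####' },
      method: 'POST',
      json: messageData
    }, function (error, response, body) {
      if (error || response.statusCode !== 200) {
        return reject(error || new Error('Send API returned ' + response.statusCode));
      }
      resolve(body);
    });
  });
}

// At most 5 requests in flight at once; Promise.mapSeries would run them one by one.
Promise.map(users, function (user) {
  return callSendAPI(messageDataFor(user));
}, { concurrency: 5 })
  .then(function (results) {
    console.log('Sent %d messages', results.length);
  })
  .catch(function (err) {
    console.error('Broadcast failed:', err);
  });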
Of course there are more ways to do it but those are the most straightforward.
Ideally, if you want to fully utilize the limit of 200 requests per hour, you should queue the requests yourself and send them at intervals that correspond to that limit. Sometimes, if you haven't made many requests in the last hour, you won't need any delays; sometimes you will. What you should really do here is queue all requests centrally and empty the queue at intervals that correspond to the portion of the limit you have already used up, which you have to track yourself, but that can be tricky.
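A very rough sketch of such a central queue, drained at a fixed interval; INTERVAL_MS is an assumed spacing (not an official Facebook limit) and it reuses the promise-returning callSendAPI sketched above:

// Rough sketch of a central queue drained at a fixed interval.
// INTERVAL_MS is an assumption, not a documented Facebook rate.
var INTERVAL_MS = 250;
var queue = [];

function enqueueSend(messageData) {
  queue.push(messageData);
}

// Every INTERVAL_MS, take one queued message and send it.
setInterval(function () {
  if (queue.length === 0) return;
  var messageData = queue.shift();
  callSendAPI(messageData)          // the promise-returning version from above
    .catch(function (err) {
      console.error('Send failed, re-queueing:', err);
      queue.push(messageData);      // naive retry: put it back at the end
    });
}, INTERVAL_MS);

// Usage: call enqueueSend(messageDataFor(user)) for each of the 2000+ users.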