Specifically, I’m trying to do something very similar to this question (with the same problem): FB Ads API (#17) User request limit reached
However, I’m trying to do this in Python (and the API has changed quite a bit since ’15). Here’s my code, which still hits the limit even with the sleep time. I’m wondering if anyone can help me pull the same info in a single batched call, to reduce my total number of calls.
import time
from facebookads.api import FacebookAdsApi
from facebookads.adobjects.adaccount import AdAccount
from facebookads.adobjects.campaign import Campaign
from facebookads.adobjects.adset import AdSet

FacebookAdsApi.init(access_token=my_access_token)  # assumes a valid token

my_account = AdAccount(ad_account_id)  # e.g. 'act_<ACCOUNT_ID>'
camps = my_account.get_campaigns(fields=[Campaign.Field.name])
for campaign in camps[0:100]:
    time.sleep(5)  # throttle between campaigns
    print campaign[Campaign.Field.name]
    adsets = campaign.get_ad_sets(fields=[AdSet.Field.name, AdSet.Field.status])
    for adset in adsets:
        print '\t', adset[AdSet.Field.name]
        for stat in adset.get_insights(fields=[
            'impressions',
            'clicks',
            'spend',
            'unique_clicks',
        ]):
            for statfield in stat:
                print "\t\t%s:\t%s" % (statfield, stat[statfield])
More generally, how am I meant to code for my needs (mass alterations) within this limitation? In reality, I want to write code that goes through and changes a few options in each of my company’s ad sets (e.g. “Expand Interests when…” from off to on). We have hundreds of ad sets, and the API docs say that alterations consume 10-100 times more calls than creations (and I’m getting stuck on neither, just reads!). Is this simply a matter of, say, sleeping for 60 seconds between each change? The docs aren’t very clear on how many calls you get, or how long the window is over which those calls are counted. If it’s a daily limit, for example, then sleeping won’t help me change 1,200 ad sets’ options.
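Something like the following is the write pattern I’m after, using the same SDK. To be clear about what’s guesswork: status is only a stand-in field (I don’t know the exact field behind “Expand Interests”), adset_ids is assumed to be a list of IDs collected earlier, and the 60-second sleep is a guess rather than a documented interval.

import time
from facebookads.adobjects.adset import AdSet

for adset_id in adset_ids:  # IDs gathered earlier, e.g. via get_ad_sets()
    adset = AdSet(adset_id)
    # Stand-in: the real "Expand Interests" field would be set here.
    adset[AdSet.Field.status] = AdSet.Status.paused
    adset.remote_update()
    time.sleep(60)  # crude throttle; the real limit window is unclear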
I do see documentation on upgrading (https://developers.facebook.com/docs/marketing-api/access), but when going through the review process, everything is based on a public (customer-facing, multiuser) app. All I want to do is be able to make calls from a desktop dev-only, internal script to make bulk changes. Am I looking in the wrong place?
5 Answers
If you’re just reading the data for now, why not make a batch request? I was doing the same as you, but ended up just requesting more data per call (I had to fiddle with it, since there is such a thing as too much data; FB won’t allow that either) and then looping through the results.
For my purposes, I did batched async requests, plus a 10-second sleep whenever I hit my limit. Works well for me.
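A minimal sketch of that batching, assuming the generated facebookads SDK (where edge methods accept batch, success, and failure arguments) and an adsets list collected earlier; the field list and callbacks are just illustrative:

from facebookads.api import FacebookAdsApi

api = FacebookAdsApi.get_default_api()
batch = api.new_batch()
queued = 0

def on_success(response):
    print response.json()

def on_failure(response):
    print response.error()

for adset in adsets:  # collected earlier, e.g. via get_ad_sets()
    # Queue the read instead of firing it immediately.
    adset.get_insights(
        fields=['impressions', 'clicks', 'spend', 'unique_clicks'],
        batch=batch,
        success=on_success,
        failure=on_failure,
    )
    queued += 1
    if queued == 50:  # the Graph API caps a batch at 50 requests
        batch.execute()
        batch = api.new_batch()
        queued = 0

batch.execute()  # flush the remainder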
Add this to your code and you’ll never have to worry about FB’s rate limiting.
Your script will automatically sleep as soon as you approach the limit, and then pick up where it left off after the cool-down. Enjoy 🙂
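The snippet itself didn’t survive here, but a hedged reconstruction of the idea might look like the following, built on the documented x-app-usage response header; the 90% threshold and 300-second cool-down are arbitrary choices, not values from the answer:

import json
import time
from facebookads.api import FacebookAdsApi

def wait_if_near_limit(response, threshold=90, cooldown=300):
    # x-app-usage arrives as a JSON string of usage percentages.
    usage_header = response.headers().get('x-app-usage')
    if not usage_header:
        return
    usage = json.loads(usage_header)
    if max(usage.get('call_count', 0),
           usage.get('total_time', 0),
           usage.get('total_cputime', 0)) >= threshold:
        time.sleep(cooldown)  # cool down, then the loop resumes naturally

# Example with a low-level call that exposes the response object:
api = FacebookAdsApi.get_default_api()
response = api.call('GET', (ad_account_id, 'adsets'), params={'fields': 'name'})
wait_if_near_limit(response)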
To whom it may be useful: I’ve solved this problem by catching the headers returned after each API call. Some header values are returned as plain text, so they need to go through json.loads() to become subscriptable objects, as sketched below.
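A minimal illustration, assuming response is a FacebookResponse from an earlier SDK call (the header names are the documented x-app-usage and x-ad-account-usage; the printed values are made up):

import json

headers = response.headers()

# The usage values arrive as JSON text, so json.loads() makes them
# subscriptable dicts.
app_usage = json.loads(headers.get('x-app-usage', '{}'))
account_usage = json.loads(headers.get('x-ad-account-usage', '{}'))

print app_usage['call_count']           # e.g. 28
print account_usage['acc_id_util_pct']  # e.g. 9.67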
In addition to Smith Orlando’s answer, there are other parameters: total_time and total_cputime are both limited to 100.
The max value of acc_id_util_pct is also 100, since it is a percentage, but you can keep making requests after reaching that limit as long as you haven’t yet hit the total_time and total_cputime limits.
And the max call_count is calculated differently for every type of request. You can learn more about this from the link.
The issue with the solutions mentioned above is that they require making a separate request to the Facebook API every other call just to check the limit, which cuts the effective rate in half. To address this problem, I have proposed two solutions.