I’ve hit the hourly request limit on my current external API, so I’ve created several additional API accounts. Now I’m looking for a strategy to manage these APIs sequentially: how can I switch between them one by one, staying within each API’s request limit, so that service is never disrupted?
Please give some Node.js code so that I can use API_1, API_2, and API_3 for the same purpose, multiple times, in the same program.
3 Answers
Make a queue (or round-robin list) of your APIs and rotate through them, tracking how many requests each one has used within the current hour. When one API hits its limit, move on to the next; when all are exhausted, delay or queue the request until a window resets.
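A minimal sketch of that idea. The key names, the hourly limit, and the `ApiRotator` class are all placeholders, not a real library; substitute your actual endpoints and quotas:

```javascript
// Rotate between several APIs, each with its own hourly quota.
class ApiRotator {
  constructor(keys, limitPerHour) {
    this.keys = keys.map((key) => ({ key, used: 0, windowStart: Date.now() }));
    this.limit = limitPerHour;
    this.index = 0;
  }

  // Return the next API that still has quota, or null if all are exhausted.
  next() {
    const now = Date.now();
    for (let i = 0; i < this.keys.length; i++) {
      const entry = this.keys[(this.index + i) % this.keys.length];
      // Reset the counter once the hourly window has elapsed.
      if (now - entry.windowStart >= 60 * 60 * 1000) {
        entry.used = 0;
        entry.windowStart = now;
      }
      if (entry.used < this.limit) {
        entry.used++;
        this.index = (this.index + i) % this.keys.length;
        return entry.key;
      }
    }
    return null; // every API is at its limit — queue or delay the request
  }
}

// Usage: pick an API before each request (limit of 2/hour shown for brevity).
const rotator = new ApiRotator(['API_1', 'API_2', 'API_3'], 2);
for (let i = 0; i < 7; i++) {
  console.log(rotator.next()); // API_1, API_1, API_2, API_2, API_3, API_3, null
}
```

A `null` return means every API is saturated; at that point you would push the request onto a queue and retry after the oldest window resets.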
Instead of finding a way around the limit, I suggest you first review the API calls you’re making. Here are some general tips.
Are you making any 1+N calls?
A 1+N call is when you request a collection, then for each result in the collection fire another request. Let me use SQL as an example to show you what I mean.
Say I request some blog posts with
SELECT * FROM posts LIMIT 20 OFFSET 40
(20 per page, 3rd page). I can then iterate over the posts and request the comments for each post using
SELECT * FROM comments WHERE post_id = ?
where ? is set to the id of the post. This scenario yields the correct results, but fires 1 + 20 requests to get them. This can be reduced by requesting the comments for all the relevant posts in one request, then combining the results programmatically. So instead of running
SELECT * FROM comments WHERE post_id = ?
for each post, we run a single
SELECT * FROM comments WHERE post_id IN (?, ?, ...)
request. This gives us all the comments we need in a total of 2 requests.
See: What is the "N+1 selects problem" in ORM (Object-Relational Mapping)?
The same applies to API calls. If you’re making any 1+N requests, check whether you can reduce the number of requests by fetching multiple resources in a single API call.
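The same 2-request pattern in API terms might look like the sketch below. The `/posts` and `/comments` routes, and an API that accepts a comma-separated list of ids, are assumptions; many APIs offer something equivalent:

```javascript
// Collapse 1+N API calls into 2: one for the page of posts,
// one for the comments of *all* those posts at once.
async function postsWithComments(fetchJson) {
  // Request 1: the page of posts.
  const posts = await fetchJson('/posts?limit=20&offset=40');

  // Request 2: comments for every post on the page, in one call.
  const ids = posts.map((p) => p.id).join(',');
  const comments = await fetchJson(`/comments?post_id=${ids}`);

  // Combine the two result sets programmatically.
  const byPost = new Map();
  for (const c of comments) {
    if (!byPost.has(c.post_id)) byPost.set(c.post_id, []);
    byPost.get(c.post_id).push(c);
  }
  return posts.map((p) => ({ ...p, comments: byPost.get(p.id) || [] }));
}

// Usage with a stubbed fetcher (replace with a real HTTP client):
const stub = async (url) =>
  url.startsWith('/posts')
    ? [{ id: 1, title: 'First' }, { id: 2, title: 'Second' }]
    : [{ post_id: 1, text: 'Nice post' }];
postsWithComments(stub).then((result) => console.log(result.length)); // logs 2
```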
Does the API have an endpoint for batch actions/operations?
In some scenarios you might want to update/invoke an action upon multiple resources, but the API doesn’t allow you to pass a collection of updates to the relevant endpoint. In these scenarios an API batch call comes in handy. Implementation and usage depend entirely on the API, but some APIs allow you to batch multiple requests together and send them as a single API call.
An example of this can be seen in the Mailchimp API.
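As a rough illustration of the client side, a batch call usually means collecting individual operations into one payload. The `/batch` endpoint and the payload shape below are hypothetical; check your API’s documentation for its actual batch format:

```javascript
// Hypothetical: bundle several operations into one batch payload.
function buildBatchPayload(operations) {
  return {
    operations: operations.map((op) => ({
      method: op.method,
      path: op.path,
      body: op.body ?? null,
    })),
  };
}

const payload = buildBatchPayload([
  { method: 'PATCH', path: '/users/1', body: { name: 'Ann' } },
  { method: 'PATCH', path: '/users/2', body: { name: 'Ben' } },
]);
console.log(payload.operations.length); // 2 operations, but only 1 API call:
// await fetch('https://api.example.com/batch', {
//   method: 'POST',
//   body: JSON.stringify(payload),
// });
```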
Can you reduce requests by caching data?
In some scenarios you might be requesting the same data over and over again. In these situations you can cache the data locally, reducing the load on the API and the number of requests that count against your limit.
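A minimal in-memory cache with a time-to-live is often enough; the `TtlCache` class here is a sketch (for anything serious you might prefer an external store such as Redis):

```javascript
// Minimal in-memory cache with a time-to-live (TTL).
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
  }

  // Return the cached value if it is still fresh; otherwise call the
  // fetcher (the real API request), cache its result, and return it.
  async getOrFetch(key, fetcher) {
    const hit = this.store.get(key);
    if (hit && Date.now() - hit.at < this.ttlMs) return hit.value; // cache hit
    const value = await fetcher(key); // cache miss: one real API call
    this.store.set(key, { value, at: Date.now() });
    return value;
  }
}

// Usage: repeated lookups within the TTL cost zero API requests.
const cache = new TtlCache(60 * 1000); // 1-minute TTL
async function getUser(id) {
  return cache.getOrFetch(`user:${id}`, async () => {
    // replace with a real API call, e.g. fetch(`https://api.example.com/users/${id}`)
    return { id, name: 'example' };
  });
}
```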