To ensure the stability of the platform, API calls are rate limited. We use the leaky bucket algorithm to enforce rate limits. Developers should limit how often they call the API, cache results where possible, and retry requests based on the rate limit headers described below.
Rate limits are enforced per IP address and per API host.
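For example, caching responses that change infrequently reduces the number of calls that count against the limit. The snippet below is a minimal sketch in Python; the `requests` library, the 60-second cache lifetime, and the cache structure are illustrative assumptions, not part of the platform's API.

```python
import time
import requests

CACHE_TTL_SECONDS = 60   # assumed cache lifetime; tune to how often the data changes
_cache = {}              # maps url -> (time fetched, response body)

def cached_get(url):
    """Return a cached body if it is still fresh; otherwise spend one API call."""
    now = time.time()
    if url in _cache:
        fetched_at, body = _cache[url]
        if now - fetched_at < CACHE_TTL_SECONDS:
            return body  # served from cache, no request counted against the limit
    response = requests.get(url)
    response.raise_for_status()
    _cache[url] = (now, response.text)
    return response.text
```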
What is the Leaky Bucket Algorithm?
Imagine a bucket with a small hole at the bottom. Water (representing API requests) can be poured into the bucket, but it can only leave through the hole at a steady, constant rate. If you pour water in too quickly, the bucket will overflow. Similarly, if too many API requests are sent at once, some of them will be rejected because the system can't process them all immediately.
How Does It Work?
- Incoming Requests: Each request to the API is like adding a drop of water to the bucket.
- Fixed Rate Leak: Water (requests) leaks out of the bucket at a fixed rate, representing how many requests the system can handle per second.
- Overflow Prevention: If the bucket is full (too many requests have come in too quickly), any additional requests are denied until there's room in the bucket again.
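As a rough illustration of the idea (not the platform's actual implementation), the Python sketch below models a leaky bucket with a fixed capacity and a constant leak rate; the default numbers mirror the limits described further down.

```python
import time

class LeakyBucket:
    """Conceptual leaky bucket: requests fill the bucket, which drains at a fixed rate."""

    def __init__(self, capacity=30, leak_rate_per_minute=15):
        self.capacity = capacity                        # maximum requests the bucket can hold
        self.leak_per_second = leak_rate_per_minute / 60.0
        self.level = 0.0                                # current "water" level
        self.last_checked = time.monotonic()

    def _leak(self):
        """Drain the bucket according to how much time has passed."""
        now = time.monotonic()
        elapsed = now - self.last_checked
        self.level = max(0.0, self.level - elapsed * self.leak_per_second)
        self.last_checked = now

    def allow(self):
        """Return True if a new request fits in the bucket, False if it would overflow."""
        self._leak()
        if self.level + 1 <= self.capacity:
            self.level += 1
            return True
        return False

bucket = LeakyBucket()
print(bucket.allow())   # True while there is room; False once the bucket is full
```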
Example Scenario
Imagine an API whose bucket holds 30 requests and leaks at 15 requests per minute. If you send 30 requests all at once, the bucket fills up, and any further requests are rejected because they can only be processed at the leak rate. The system will then tell you, in effect, "You've reached your limit. Try again in a few seconds." This response tells you how long you need to wait before your next request can be processed.
Current Limit
The current rate limit is a bucket size of 30 requests with a leak rate of 15 requests per minute. In other words, one request slot frees up roughly every 4 seconds, and a full bucket of 30 drains completely in about 2 minutes.
Each API response includes the X-RateLimit-Remaining header, set to the number of requests you can still make. Developers should manage their API call frequency based on this value, for example as in the sketch below.
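A client can slow down proactively when the remaining allowance runs low. This is only a sketch: the threshold, the pause length, and the use of the `requests` library are illustrative assumptions; only the X-RateLimit-Remaining header comes from the API described here.

```python
import time
import requests

def get_with_pacing(url, low_water_mark=5, pause_seconds=4):
    """Issue a GET and pause briefly when few requests remain in the bucket."""
    response = requests.get(url)
    remaining = response.headers.get("X-RateLimit-Remaining")
    if remaining is not None and int(remaining) <= low_water_mark:
        # Few requests left: pause so the bucket can drain before the next call.
        time.sleep(pause_seconds)
    return response
```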
If the limit is reached, a 429 status code is returned, and the Retry-After header contains the approximate number of seconds you should wait before making the call again.
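A client can handle this by waiting the advertised number of seconds and then retrying. The sketch below assumes the `requests` library; the retry cap and the fallback wait are arbitrary illustrative choices, while the 429 status and Retry-After header come from the behaviour described above.

```python
import time
import requests

def get_with_retry(url, max_attempts=3):
    """GET a URL, waiting and retrying when the API answers 429 Too Many Requests."""
    for attempt in range(max_attempts):
        response = requests.get(url)
        if response.status_code != 429:
            return response
        # Honour the Retry-After header; fall back to a short default if it is missing.
        wait_seconds = int(response.headers.get("Retry-After", 5))
        time.sleep(wait_seconds)
    raise RuntimeError(f"Rate limit still exceeded after {max_attempts} attempts")
```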