# Rate Limiting

The Batchmates API implements rate limiting to ensure service stability and fair usage. Rate limits are applied per IP address and per authenticated user.
## Rate Limit Tiers

| Name | Limit | Description |
| --- | --- | --- |
| Login Endpoints | 10 requests/minute | `/v1/mobile/auth/login`, `/v1/web/auth/login` |
| General API | 120 requests/minute | All other authenticated endpoints |
## Response Headers

Rate limit information is included in response headers:

```http
X-RateLimit-Limit: 120
X-RateLimit-Remaining: 119
X-RateLimit-Reset: 1643587200
```

| Header | Type | Description |
| --- | --- | --- |
| `X-RateLimit-Limit` | integer | Maximum requests allowed per window per user |
| `X-RateLimit-Remaining` | integer | Requests remaining in the current window |
| `X-RateLimit-Reset` | integer | Unix timestamp when the limit resets |
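For instance, the reset timestamp can be converted into a wait time in seconds (a minimal sketch; `secondsUntilReset` is an illustrative helper, not part of the API):

```javascript
// Seconds until the current rate limit window resets.
// Assumes `response` is a fetch() Response carrying the headers above.
function secondsUntilReset(response) {
  const reset = Number(response.headers.get('X-RateLimit-Reset'));
  return Math.max(0, reset - Math.floor(Date.now() / 1000));
}
```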
## Rate Limit Exceeded

When you exceed the rate limit, the API returns a `429 Too Many Requests` status:

```json
{
  "success": false,
  "message": "Too many requests. Please try again later."
}
```
### Retry After

The response also includes a `Retry-After` header indicating how many seconds to wait before retrying:

```http
HTTP/1.1 429 Too Many Requests
Retry-After: 60
```
## Handling Rate Limits

### Check Remaining Requests

Monitor the rate limit headers to avoid hitting limits:

```javascript
const response = await fetch(url, options);

// Header values are strings; parse them before doing arithmetic.
const remaining = Number(response.headers.get('X-RateLimit-Remaining'));
const reset = Number(response.headers.get('X-RateLimit-Reset'));

if (remaining < 10) {
  console.warn(`Only ${remaining} requests remaining`);
  console.warn(`Resets at ${new Date(reset * 1000)}`);
}
```
### Implement Exponential Backoff

Automatically retry with increasing delays:

```javascript
async function fetchWithRetry(url, options, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    const response = await fetch(url, options);

    if (response.status !== 429) {
      return response;
    }

    // Honor Retry-After if present; otherwise back off exponentially.
    const retryAfter = Number(response.headers.get('Retry-After')) || 2 ** i;
    await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
  }

  throw new Error('Max retries exceeded');
}
```
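A fixed doubling schedule can synchronize many clients retrying at the same moment; adding jitter spreads retries out. A sketch of a "full jitter" delay (the base and cap values here are illustrative assumptions, not API-mandated):

```javascript
// Exponential backoff with full jitter: a random delay uniform in
// [0, min(cap, base * 2^attempt)), in milliseconds.
function backoffDelay(attempt, base = 1000, cap = 30000) {
  const exp = Math.min(cap, base * 2 ** attempt);
  return Math.random() * exp;
}
```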
### Queue Requests

Use a queue to space out API calls and respect limits:

```javascript
class RateLimitedQueue {
  constructor(requestsPerMinute = 120) {
    this.queue = [];
    this.processing = false;
    // Minimum spacing between requests, in milliseconds.
    this.interval = 60000 / requestsPerMinute;
  }

  add(fn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ fn, resolve, reject });
      this.process();
    });
  }

  async process() {
    if (this.processing || this.queue.length === 0) return;
    this.processing = true;

    const { fn, resolve, reject } = this.queue.shift();
    try {
      resolve(await fn());
    } catch (error) {
      reject(error);
    }

    // Wait one interval before processing the next queued request.
    setTimeout(() => {
      this.processing = false;
      this.process();
    }, this.interval);
  }
}

const queue = new RateLimitedQueue(120);

await queue.add(() =>
  fetch('https://api.batchmates.com/v1/campaigns')
);
```
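An alternative to fixed-interval spacing is a token bucket, which permits short bursts while keeping the average rate at the limit. A self-contained sketch (the burst capacity of 10 is an assumption; 120/minute matches the General API tier above):

```javascript
// Token bucket: refills at `ratePerMinute` tokens per minute, up to `capacity`.
class TokenBucket {
  constructor(ratePerMinute = 120, capacity = 10) {
    this.rate = ratePerMinute / 60000; // tokens per millisecond
    this.capacity = capacity;
    this.tokens = capacity;
    this.last = Date.now();
  }

  // Try to consume one token; returns true if a request may proceed now.
  tryRemove() {
    const now = Date.now();
    this.tokens = Math.min(
      this.capacity,
      this.tokens + (now - this.last) * this.rate
    );
    this.last = now;

    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```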
## Best Practices

### 1. Cache Responses

Cache responses that don't change often to avoid redundant requests:

```javascript
const cache = new Map();
const CACHE_TTL = 5 * 60 * 1000; // 5 minutes

async function getCachedData(key, fetchFn) {
  const cached = cache.get(key);

  if (cached && Date.now() - cached.timestamp < CACHE_TTL) {
    return cached.data;
  }

  const data = await fetchFn();
  cache.set(key, { data, timestamp: Date.now() });
  return data;
}
```
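A variant worth considering caches the in-flight promise rather than the resolved data, so concurrent callers for the same key share one request instead of each triggering a fetch (a sketch under the same 5-minute TTL assumption; `getCachedDataDeduped` is an illustrative name):

```javascript
const promiseCache = new Map();
const TTL = 5 * 60 * 1000; // 5 minutes

async function getCachedDataDeduped(key, fetchFn) {
  const cached = promiseCache.get(key);
  if (cached && Date.now() - cached.timestamp < TTL) {
    return cached.promise; // reuse a completed or still in-flight request
  }

  const promise = fetchFn().catch(err => {
    promiseCache.delete(key); // don't cache failures
    throw err;
  });
  promiseCache.set(key, { promise, timestamp: Date.now() });
  return promise;
}
```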
### 2. Batch Requests

Where the API supports it, combine multiple lookups into a single request:

```javascript
// Instead of one request per resource:
await Promise.all([
  fetch('/v1/campaigns/1'),
  fetch('/v1/campaigns/2'),
  fetch('/v1/campaigns/3')
]);

// Fetch them all in a single request:
await fetch('/v1/campaigns?ids=1,2,3');
```
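If you have more IDs than fit comfortably in one URL, the same idea extends to chunked batches (a sketch: the `ids` query parameter follows the example above, while `fetchInBatches` and the chunk size of 50 are illustrative assumptions):

```javascript
// Split a list of IDs into chunks and fetch each chunk with one request.
async function fetchInBatches(ids, batchSize = 50) {
  const results = [];
  for (let i = 0; i < ids.length; i += batchSize) {
    const chunk = ids.slice(i, i + batchSize);
    const res = await fetch(`/v1/campaigns?ids=${chunk.join(',')}`);
    results.push(...await res.json());
  }
  return results;
}
```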
### 3. Use Webhooks

Prefer webhooks over polling for status changes:

```javascript
// Instead of polling every few seconds:
setInterval(() => {
  fetch('/v1/donations?status=pending');
}, 5000);

// Receive updates via a webhook endpoint:
app.post('/webhook/donations', (req, res) => {
  const { donation } = req.body;
  updateDonationStatus(donation);
  res.sendStatus(200); // acknowledge receipt
});
```
### 4. Monitor Usage

Track your request volume alongside the server-reported limits:

```javascript
let requestCount = 0;

function trackRequest(response) {
  requestCount++;

  const remaining = Number(response.headers.get('X-RateLimit-Remaining'));
  const reset = Number(response.headers.get('X-RateLimit-Reset'));

  console.log({
    count: requestCount,
    remaining,
    resetAt: new Date(reset * 1000)
  });
}
```