In this tutorial, we'll learn how to build a rate limiter with Node.js and TypeScript.
Rate limiting is essential for preventing abuse, protecting your APIs, and managing resources. We will build a production-grade rate limiter using Node.js and TypeScript, with Redis for efficient state management, and integrate it into a sample Express API.
Key Concepts in Rate Limiting
Rate Limiting: Restricts the number of requests a user or client can make in a specific time window.
Token Bucket Algorithm: A popular rate-limiting algorithm that grants tokens at a steady rate and allows a request only if a token is available.
Fixed Window Counter: A simpler algorithm that counts requests per key within a fixed time window; this is the approach our middleware implements.
Redis: A high-performance, in-memory data store used to manage state for distributed systems like APIs.
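To make the token bucket idea concrete, here is a minimal, self-contained TypeScript sketch. The class name and the refill parameters are illustrative only; the middleware we build below uses the simpler fixed-window counter instead, with Redis holding the counts.

```typescript
// Minimal token bucket: holds up to `capacity` tokens, refilled at
// `refillRate` tokens per second. A request consumes one token.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,
    private refillRate: number, // tokens added per second
    now: number = Date.now()
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true if a request may proceed, consuming one token.
  tryConsume(now: number = Date.now()): boolean {
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillRate);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// A 3-token bucket refilling 1 token/second, driven by a fake clock:
const bucket = new TokenBucket(3, 1, 0);
console.log(bucket.tryConsume(0));    // true
console.log(bucket.tryConsume(0));    // true
console.log(bucket.tryConsume(0));    // true
console.log(bucket.tryConsume(0));    // false (bucket empty)
console.log(bucket.tryConsume(2000)); // true (2 tokens refilled after 2s)
```

Passing the clock in explicitly keeps the algorithm deterministic and easy to test; a real limiter would simply call `tryConsume()` with the current time.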
Prerequisites
- A Node.js development environment installed on your machine.
- A Redis instance. For production, consider a managed Redis service or a containerized Redis instance.
Setting Up the Application
Step 1: Create a Node.js Application
Initialize the Project
mkdir rate-limiter-app
cd rate-limiter-app
npm init -y
Install Dependencies
npm install express redis dotenv
Install Development Tools
npm install --save-dev typescript @types/node ts-node @types/express
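The TypeScript compiler also needs a tsconfig.json. One way to create it (the exact compiler options are up to you; these suggestions match the run commands used later in this tutorial):

```shell
npx tsc --init
```

In the generated tsconfig.json, set "outDir" to "./dist" so compiled JavaScript lands in a dist/ folder, and make sure "esModuleInterop" is true, since the code below uses default-style imports such as `import dotenv from 'dotenv'`.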
Step 2: Set Up Redis for Rate Limiting
Create a redisClient.ts file to handle Redis interactions:
nano redisClient.ts
Add the following code:
import { createClient, RedisClientType } from 'redis';
import dotenv from 'dotenv';

// Load environment variables from .env file
dotenv.config();

class RedisClient {
  private static instance: RedisClientType | null = null;

  /**
   * Initialize the Redis client or return the existing instance.
   */
  public static getInstance(): RedisClientType {
    if (!RedisClient.instance) {
      const redisUrl = process.env.REDIS_URL || 'redis://127.0.0.1:6379';
      try {
        // Create the Redis client
        RedisClient.instance = createClient({
          url: redisUrl,
        });

        // Log client-level errors (e.g. dropped connections)
        RedisClient.instance.on('error', (err) => {
          console.error(`Redis Client Error: ${err.message}`);
        });

        // Connect to Redis; log success only once the connection is established
        RedisClient.instance
          .connect()
          .then(() => console.info(`Connected to Redis at ${redisUrl}`))
          .catch((err) => {
            console.error(`Failed to connect to Redis: ${err.message}`);
          });
      } catch (err) {
        console.error('Error initializing Redis client:', err);
        throw err;
      }
    }
    return RedisClient.instance;
  }

  /**
   * Disconnect the Redis client (useful during app shutdown).
   */
  public static async disconnect(): Promise<void> {
    if (RedisClient.instance) {
      await RedisClient.instance.quit();
      RedisClient.instance = null;
      console.info('Disconnected from Redis.');
    }
  }
}

export default RedisClient;
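The disconnect() helper is meant to be wired into process shutdown. A minimal sketch of that wiring, using a stand-in cleanup function so it runs without a live Redis (in the real app, the cleanup callback would be RedisClient.disconnect):

```typescript
// Generic graceful-shutdown wiring: run async cleanup once, then exit.
// `cleanup` stands in for RedisClient.disconnect in the real application.
function onShutdown(cleanup: () => Promise<void>): void {
  let shuttingDown = false;
  const handler = async (signal: string) => {
    if (shuttingDown) return; // ignore repeated signals
    shuttingDown = true;
    console.info(`Received ${signal}, cleaning up...`);
    await cleanup();
    process.exit(0);
  };
  process.on('SIGINT', () => handler('SIGINT'));
  process.on('SIGTERM', () => handler('SIGTERM'));
}
```

Guarding with a flag matters because a user pressing Ctrl+C twice would otherwise start two concurrent quit() calls.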
This setup connects to a Redis instance and ensures the client is available throughout your app.
Step 3: Implement the Rate Limiter Middleware
Create a rateLimiter.ts file:
nano rateLimiter.ts
Add the following code:
import { Request, Response, NextFunction } from 'express';
import RedisClient from './redisClient';

// Reuse the shared Redis client from Step 2
const redisClient = RedisClient.getInstance();

interface RateLimiterOptions {
  windowMs: number;    // Time window in milliseconds
  maxRequests: number; // Maximum allowed requests in the time window
}

export const rateLimiter = (options: RateLimiterOptions) => {
  const { windowMs, maxRequests } = options;

  return async (req: Request, res: Response, next: NextFunction) => {
    try {
      const key = `rate-limit:${req.ip}`; // Unique key per user/IP
      const currentCount = await redisClient.incr(key);

      if (currentCount === 1) {
        // First request in this window: set the expiry for the key
        await redisClient.expire(key, Math.ceil(windowMs / 1000));
      }

      if (currentCount > maxRequests) {
        const ttl = await redisClient.ttl(key); // Seconds until the window resets
        return res.status(429).json({
          message: 'Too many requests. Please try again later.',
          retryAfter: ttl,
        });
      }

      next(); // Allow the request to proceed
    } catch (error) {
      console.error('Rate limiter error:', error);
      res.status(500).json({ message: 'Internal Server Error' });
    }
  };
};
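The middleware above is a fixed-window counter: INCR counts requests under a per-IP key, and EXPIRE resets the count when the window ends. The same logic can be sketched without Redis, using an in-memory Map and a fake clock (the names here are illustrative, not part of the application code):

```typescript
interface WindowState {
  count: number;   // requests seen in the current window
  resetAt: number; // timestamp (ms) when the window expires
}

// Fixed-window counter: allow up to maxRequests per windowMs for each key.
function makeLimiter(windowMs: number, maxRequests: number) {
  const windows = new Map<string, WindowState>();
  return (key: string, now: number): boolean => {
    const w = windows.get(key);
    if (!w || now >= w.resetAt) {
      // First request in a new window: start counting and set the expiry,
      // mirroring INCR followed by EXPIRE in the Redis version.
      windows.set(key, { count: 1, resetAt: now + windowMs });
      return true;
    }
    w.count += 1;
    return w.count <= maxRequests; // over the limit -> would answer 429
  };
}

// Simulate 12 requests from one IP inside a 60-second window:
const allow = makeLimiter(60_000, 10);
const results = Array.from({ length: 12 }, () => allow('203.0.113.7', 0));
console.log(results.filter(Boolean).length); // 10 allowed, 2 rejected
```

The Redis version behaves the same way, but because the counts live in Redis rather than in process memory, every instance of the API enforces one shared limit.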
Step 4: Create the API with Rate Limiting
Create an app.ts file:
nano app.ts
Add the following code:
import express, { Request, Response } from 'express';
import { rateLimiter } from './rateLimiter';

const app = express();
const PORT = process.env.PORT || 3000;

// Apply the rate limiter middleware to all routes
app.use(
  rateLimiter({
    windowMs: 60 * 1000, // 1-minute window
    maxRequests: 10,     // Limit each IP to 10 requests per window
  })
);

// Example route
app.get('/', (req: Request, res: Response) => {
  res.send('Welcome! You are not rate-limited yet.');
});

// Start the server
app.listen(PORT, () => {
  console.log(`Server running on http://localhost:${PORT}`);
});
Testing the Application
Compile TypeScript (this emits JavaScript into dist/, per your tsconfig.json):
npx tsc
Run the Server:
node dist/app.js
Test Rate Limiting
Use tools like Postman or curl:
curl http://localhost:3000
- Make more than 10 requests in 60 seconds to trigger the rate limiter.
- Observe the 429 Too Many Requests response.
Production-Grade Considerations
- Use Secure Redis Configuration: Ensure authentication is enabled, and the instance is not publicly exposed.
- Distributed Rate Limiting: Use a centralized Redis instance if the application scales horizontally. If the app sits behind a load balancer or reverse proxy, configure Express's trust proxy setting so req.ip reflects the real client IP rather than the proxy's.
- Logging and Monitoring: Integrate tools like Winston for logging and Prometheus for monitoring.
Conclusion
This tutorial demonstrated how to build a robust, production-grade rate limiter with Node.js, TypeScript, and Redis. By following these practices, you can efficiently manage API usage and protect your resources from abuse.