SEC-15: Rate Limiting for tRPC and Pay API Endpoints
Version: 1.0.55
Category: Infrastructure Security
Overview
This release closes two rate limiting gaps identified in security review SEC-15:
- The /api/trpc endpoint had no rate limiting of any kind.
- The /v1/* pay API had an in-memory rate limiter that was ineffective in Vercel's serverless multi-instance deployment model.
tRPC Endpoint Rate Limiting
Problem
All tRPC mutations and queries are routed through a single Next.js route handler at /api/trpc/[trpc]. Because no rate limiting was applied, any user with a valid session could issue an unbounded number of tRPC requests. This created risk of:
- Data enumeration via repeated queries
- Abuse of expensive backend operations (e.g. third-party API calls, database writes)
- Denial-of-service against downstream services
Fix
Per-user and per-IP rate limiting is now enforced in the tRPC route handler (src/app/api/trpc/[trpc]/route.ts). Requests that exceed the configured threshold receive a 429 Too Many Requests response before reaching any tRPC procedure.
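The shape of that check can be sketched as follows. This is an illustrative, self-contained stand-in, not the actual implementation: the function names, the 10-second window, and the threshold of 50 are all hypothetical (the real limit is configurable, and the real store is shared via Redis rather than an in-process Map).

```typescript
// Sketch of the per-user / per-IP gate applied before any tRPC procedure runs.
// `checkLimit` is a hypothetical stand-in for the shared (Redis-backed) limiter.

type LimitResult = { success: boolean };

const WINDOW_MS = 10_000; // 10-second fixed window (illustrative)
const MAX_REQUESTS = 50;  // illustrative per-identifier threshold

const windows = new Map<string, { count: number; resetAt: number }>();

function checkLimit(identifier: string, now = Date.now()): LimitResult {
  const w = windows.get(identifier);
  if (!w || now >= w.resetAt) {
    windows.set(identifier, { count: 1, resetAt: now + WINDOW_MS });
    return { success: true };
  }
  w.count += 1;
  return { success: w.count <= MAX_REQUESTS };
}

// Prefer the authenticated user id; fall back to client IP for anonymous requests.
function rateLimitIdentifier(userId: string | null, ip: string): string {
  return userId ? `user:${userId}` : `ip:${ip}`;
}

function guardedHandler(userId: string | null, ip: string): number {
  const { success } = checkLimit(rateLimitIdentifier(userId, ip));
  if (!success) return 429; // rejected before reaching any tRPC procedure
  return 200;               // would forward to the tRPC request handler here
}
```

Keying on the user id when a session exists (and only falling back to IP for anonymous traffic) means one abusive user cannot exhaust the budget of others behind the same NAT.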
Pay API Rate Limiter — In-Memory to Redis (Upstash)
Problem
The /v1/* pay API endpoints enforced a limit of 100 requests per second per API key. However, the limiter used an in-memory counter store. In a serverless environment (Vercel), multiple function instances run concurrently and do not share memory. This meant:
- Each cold-started instance had its own independent counter.
- A client could issue far more than 100 req/s in practice by hitting different instances.
- The documented limit provided a false sense of security.
Fix
The rate limiter has been migrated from in-memory state to a Redis-backed store using Upstash. All instances now read from and write to the same Redis counter, ensuring the 100 req/s per-key limit is consistently enforced regardless of the number of active serverless instances.
The known in-memory limitation has also been documented inline in src/lib/rate-limit.ts for future reference.
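The shared-counter idea behind the fix can be sketched as a fixed-window counter over a Redis-like store. Note this is a simplified illustration, not the shipped code: `RedisLike` and `FakeRedis` are hypothetical stand-ins for the Upstash REST client, and the real implementation delegates the algorithm to the Upstash-backed limiter in src/lib/rate-limit.ts.

```typescript
// Minimal sketch of a shared fixed-window counter: INCR a per-key,
// per-second counter; the first writer in a window sets the TTL.

interface RedisLike {
  incr(key: string): Promise<number>;
  expire(key: string, seconds: number): Promise<void>;
}

async function allowRequest(
  redis: RedisLike,
  apiKey: string,
  limit = 100,
  nowMs = Date.now(),
): Promise<boolean> {
  const windowKey = `ratelimit:${apiKey}:${Math.floor(nowMs / 1000)}`;
  const count = await redis.incr(windowKey);
  if (count === 1) await redis.expire(windowKey, 2); // expire shortly after the window closes
  return count <= limit;
}

// Map-backed fake so the sketch is runnable without a Redis instance.
class FakeRedis implements RedisLike {
  private store = new Map<string, number>();
  async incr(key: string): Promise<number> {
    const next = (this.store.get(key) ?? 0) + 1;
    this.store.set(key, next);
    return next;
  }
  async expire(_key: string, _seconds: number): Promise<void> {}
}
```

Because every serverless instance increments the same remote counter, the limit holds globally no matter how many instances are running.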
Configuration
This feature requires an Upstash Redis instance. The following environment variables must be set:
| Variable | Description |
|---|---|
| UPSTASH_REDIS_REST_URL | REST URL for your Upstash Redis database |
| UPSTASH_REDIS_REST_TOKEN | Auth token for your Upstash Redis database |
You can create a free Upstash Redis database at https://upstash.com.
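For local development, the two variables can be set in an env file. The values below are placeholders; copy the real ones from your Upstash console.

```shell
# .env.local — placeholder values, not real credentials
UPSTASH_REDIS_REST_URL="https://<your-database>.upstash.io"
UPSTASH_REDIS_REST_TOKEN="<your-rest-token>"
```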
Rate Limit Behaviour
| Endpoint | Limit | Scope | Backing Store |
|---|---|---|---|
| /api/trpc/* | Configurable | Per user / per IP | Redis (Upstash) |
| /v1/* Pay API | 100 req/s | Per API key | Redis (Upstash) |
When a rate limit is exceeded, the API returns:
```
HTTP/1.1 429 Too Many Requests
Content-Type: application/json

{
  "error": "Too Many Requests"
}
```
Affected Files
- src/app/api/trpc/[trpc]/route.ts — tRPC handler with new rate limiting middleware
- src/lib/rate-limit.ts — Migrated from in-memory to Upstash Redis; known limitation documented inline