Running Node.js + TypeScript in production is more than just tsc && node dist/index.js. Over three years of shipping backend services, I've settled on a set of practices that consistently produce stable, maintainable, and observable services. This is the guide I wish existed when I started.
tsconfig.json — Get it Right First
Most tutorials show a bare-minimum tsconfig. In production you want stricter settings:
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "CommonJS",
    "lib": ["ES2022"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "noUncheckedIndexedAccess": true,
    "exactOptionalPropertyTypes": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true,
    "forceConsistentCasingInFileNames": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist", "**/*.test.ts"]
}
The two settings that catch the most bugs: noUncheckedIndexedAccess (array and object access returns T | undefined, not just T) and exactOptionalPropertyTypes (prevents accidentally assigning undefined to optional properties).
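A quick sketch of what noUncheckedIndexedAccess changes in practice (the roles array here is just an illustration):

```typescript
// With noUncheckedIndexedAccess enabled, indexing returns string | undefined,
// so TypeScript forces a guard before the value can be used.
const roles: string[] = ['admin', 'editor'];

const first = roles[0]; // type: string | undefined, not string
// first.toUpperCase();  // compile error: 'first' is possibly 'undefined'
const upper = first !== undefined ? first.toUpperCase() : 'NONE';
console.log(upper);
```

Without the flag, roles[99] silently types as string and the undefined only surfaces at runtime.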
Project Structure
I use a feature-based structure, similar to what I described in the Flutter clean architecture post. The key is separating routing, business logic, and data access from day one:
src/
├── config/
│ ├── env.ts # Validated env vars (zod)
│ └── database.ts
├── middleware/
│ ├── auth.ts # JWT verification
│ ├── errorHandler.ts # Global error handler
│ ├── validate.ts # express-validator wrapper
│ └── rateLimiter.ts
├── features/
│ └── users/
│ ├── users.router.ts
│ ├── users.controller.ts
│ ├── users.service.ts
│ ├── users.repository.ts
│ └── users.schema.ts # Zod/validation schemas
├── shared/
│ ├── errors.ts # Custom error classes
│ ├── logger.ts # Pino logger
│ └── httpClient.ts # Axios wrapper
├── types/
│ └── express.d.ts # Augmented Express types
└── app.ts
Environment Variable Validation
Never use process.env.FOO directly throughout your codebase. Validate all env vars at startup using Zod so your service fails fast with a clear error message, not a cryptic runtime crash three hours after deployment.
// src/config/env.ts
import { z } from 'zod';

const envSchema = z.object({
  NODE_ENV: z.enum(['development', 'production', 'test']),
  PORT: z.coerce.number().default(3000),
  DATABASE_URL: z.string().url(),
  JWT_SECRET: z.string().min(32),
  JWT_EXPIRES_IN: z.string().default('7d'),
  REDIS_URL: z.string().url().optional(),
});

const result = envSchema.safeParse(process.env);
if (!result.success) {
  console.error('Invalid environment variables:');
  console.error(result.error.flatten().fieldErrors);
  process.exit(1);
}

export const env = result.data;
Global Error Handling
All errors flow to a single Express error-handling middleware. Custom error classes carry an HTTP status code and an isOperational flag: operational errors (bad input, not found) return JSON, while programming errors crash the process so PM2 or Docker restarts it cleanly.
// src/shared/errors.ts
export class AppError extends Error {
  constructor(
    public readonly statusCode: number,
    message: string,
    public readonly isOperational = true
  ) {
    super(message);
    // new.target keeps the prototype chain intact for subclasses,
    // so err instanceof NotFoundError still works
    Object.setPrototypeOf(this, new.target.prototype);
  }
}

export class NotFoundError extends AppError {
  constructor(resource = 'Resource') {
    super(404, `${resource} not found`);
  }
}

export class UnauthorizedError extends AppError {
  constructor(message = 'Unauthorized') {
    super(401, message);
  }
}

export class ValidationError extends AppError {
  constructor(public readonly errors: object) {
    super(422, 'Validation failed');
  }
}
// src/middleware/errorHandler.ts
import { Request, Response, NextFunction } from 'express';
import { AppError, ValidationError } from '../shared/errors';
import { logger } from '../shared/logger';

export function errorHandler(
  err: Error,
  _req: Request,
  res: Response,
  _next: NextFunction
) {
  if (err instanceof AppError) {
    if (!err.isOperational) {
      logger.fatal({ err }, 'Programming error — restarting');
      process.exit(1);
    }
    return res.status(err.statusCode).json({
      status: 'error',
      message: err.message,
      ...(err instanceof ValidationError ? { errors: err.errors } : {}),
    });
  }
  logger.error({ err }, 'Unhandled error');
  res.status(500).json({ status: 'error', message: 'Internal server error' });
}
JWT Authentication Pattern
I keep JWT logic in a single module and use a typed augmented Express Request so TypeScript knows about req.user everywhere.
// src/types/express.d.ts
declare namespace Express {
  interface Request {
    user?: { id: string; email: string; role: string };
  }
}
// src/middleware/auth.ts
import { Request, Response, NextFunction } from 'express';
import jwt from 'jsonwebtoken';
import { env } from '../config/env';
import { UnauthorizedError } from '../shared/errors';

export function authenticate(req: Request, _res: Response, next: NextFunction) {
  const header = req.headers.authorization;
  if (!header?.startsWith('Bearer ')) throw new UnauthorizedError();
  const token = header.slice(7);
  let payload: Express.Request['user'];
  try {
    payload = jwt.verify(token, env.JWT_SECRET) as Express.Request['user'];
  } catch {
    throw new UnauthorizedError('Token invalid or expired');
  }
  // Call next() outside the try block, so errors thrown by downstream
  // middleware aren't swallowed and re-labelled as auth failures
  req.user = payload;
  next();
}
Request Validation with express-validator
I contributed to express-validator, so I know it well. Always centralize your validation chains and run them before the controller:
// src/features/users/users.schema.ts
import { body } from 'express-validator';

export const registerSchema = [
  body('email').isEmail().normalizeEmail(),
  body('password').isLength({ min: 8 }).withMessage('Password must be at least 8 characters'),
  body('name').trim().notEmpty().isLength({ max: 100 }),
];
// src/middleware/validate.ts
import { validationResult } from 'express-validator';
import { Request, Response, NextFunction } from 'express';
import { ValidationError } from '../shared/errors';

export function validate(req: Request, _res: Response, next: NextFunction) {
  const errors = validationResult(req);
  if (!errors.isEmpty()) {
    throw new ValidationError(errors.array());
  }
  next();
}
// src/features/users/users.router.ts
router.post('/register', registerSchema, validate, usersController.register);
Structured JSON Logging with Pino
console.log is for scripts. Production services need structured logs that a log aggregator (Loki, CloudWatch, Datadog) can query. Pino is one of the fastest Node.js loggers and produces newline-delimited JSON:
// src/shared/logger.ts
import pino from 'pino';
import { env } from '../config/env';

export const logger = pino({
  level: env.NODE_ENV === 'production' ? 'info' : 'debug',
  transport: env.NODE_ENV !== 'production'
    ? { target: 'pino-pretty', options: { colorize: true } }
    : undefined,
  base: { service: 'api' },
  timestamp: pino.stdTimeFunctions.isoTime,
  redact: ['req.headers.authorization', 'body.password'],
});
Docker for Production
Use a multi-stage Dockerfile to keep the production image lean. The builder stage compiles TypeScript; the runner stage copies only the compiled output and production node_modules.
# Dockerfile
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json tsconfig.json ./
RUN npm ci
COPY src/ ./src/
RUN npm run build
FROM node:20-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
EXPOSE 3000
USER node
CMD ["node", "dist/app.js"]
CI/CD with GitHub Actions
A minimal pipeline: lint, type-check, test, build Docker image, push to registry, deploy. The key is running type-check (tsc --noEmit) separately from the build so errors surface with readable output.
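For reference, the npm scripts this pipeline assumes might look like the following (the lint and test tooling named here is illustrative; swap in whatever your project uses):

```json
{
  "scripts": {
    "build": "tsc -p tsconfig.json",
    "type-check": "tsc --noEmit",
    "lint": "eslint src --ext .ts",
    "test": "vitest run"
  }
}
```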
# .github/workflows/deploy.yml
name: Deploy
on:
  push:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '20', cache: 'npm' }
      - run: npm ci
      - run: npm run lint
      - run: npm run type-check
      - run: npm test
  deploy:
    needs: test
    runs-on: ubuntu-latest
    permissions:
      packages: write
    steps:
      - uses: actions/checkout@v4
      - name: Log in to GHCR
        run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
      - name: Build and push Docker image
        run: |
          docker build -t ghcr.io/${{ github.repository }}:${{ github.sha }} .
          docker push ghcr.io/${{ github.repository }}:${{ github.sha }}
Prisma vs Mongoose — When to Use Which
This comes up constantly. My rule: use Prisma with PostgreSQL for anything that has relational data or requires transactions. Use Mongoose with MongoDB when you need flexible schemas, full-text search, or are already on a Mongo stack. Never mix ORMs in the same service — pick one and commit.
TypeScript's value multiplies with a typed ORM. A Prisma schema gives you end-to-end type safety from the database to the API response with almost no extra effort.
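As a sketch, here is a minimal Prisma model (the User model and its fields are hypothetical, for illustration only):

```prisma
// prisma/schema.prisma — hypothetical model for illustration
model User {
  id        String   @id @default(uuid())
  email     String   @unique
  name      String
  createdAt DateTime @default(now())
}
```

With this in place, a query like prisma.user.findUnique({ where: { email } }) is typed as returning User | null, so a typo in a field name fails at compile time instead of at runtime.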
Production Readiness Checklist
- All env vars validated at startup with Zod
- Global error handler catches all unhandled errors
- All routes have authentication + rate limiting middleware
- All user inputs validated with express-validator
- Structured JSON logs with correlation IDs per request
- Graceful shutdown: drain connections before exiting on SIGTERM
- Health check endpoint at GET /health for load balancers
- Dockerfile uses non-root user (USER node)
- Secrets stored in environment variables, never in code
- Dependencies pinned in package-lock.json; npm ci in CI
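The graceful-shutdown item deserves a sketch. A minimal version for a plain Node HTTP server (Express's app.listen() returns the same http.Server) might look like this:

```typescript
import http from 'node:http';

const server = http.createServer((_req, res) => res.end('ok'));
server.listen(0); // port 0 just for illustration; use your validated env.PORT

// On SIGTERM (what Docker and Kubernetes send before a hard kill), stop
// accepting new connections, let in-flight requests drain, then exit.
function shutdown() {
  server.close((err) => process.exit(err ? 1 : 0));
  // Safety net: force-exit if draining outlives the orchestrator's grace period
  setTimeout(() => process.exit(1), 10_000).unref();
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);
```

server.close() stops the listener but waits for open connections to finish, which is exactly the draining behaviour load balancers expect during a rolling deploy.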