Tutorial

Building a Complete REST API with Node.js & TypeScript

Mahmoud Hamdy
March 8, 2026

Node.js combined with TypeScript has become one of the most popular backend stacks for modern web APIs. The developer experience is excellent: fast iteration, a massive npm ecosystem, full-stack JavaScript, and TypeScript's type safety catching entire categories of bugs before they reach production. In this guide I will walk through building a production-quality REST API from an empty folder to a deployed, tested, documented service, with code for every section.

Why Node.js + TypeScript?

Node.js excels at I/O-bound workloads — exactly what an API server does. Its non-blocking event loop handles thousands of concurrent connections on modest hardware. TypeScript adds static typing on top, which pays dividends when a codebase grows beyond a few hundred lines. Autocomplete, refactoring safety, and explicit interfaces make large APIs manageable. Combined with tools like ts-node-dev for hot reload and tsc for production builds, the workflow is fast and reliable.

Express + TypeScript Setup

Start from scratch with a clean project structure:

mkdir my-api && cd my-api
npm init -y
npm install express cors helmet morgan dotenv
npm install -D typescript ts-node-dev @types/express @types/node @types/cors @types/morgan

# Also used later in this guide (install now or as each section needs them):
# npm install zod mongoose jsonwebtoken bcryptjs express-validator express-rate-limit swagger-jsdoc swagger-ui-express pino
# npm install -D @types/jsonwebtoken @types/swagger-jsdoc @types/swagger-ui-express pino-pretty jest ts-jest @types/jest supertest @types/supertest mongodb-memory-server

# Init TypeScript config
npx tsc --init
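It is also worth adding the usual scripts to package.json now. The dev script uses ts-node-dev for hot reload; test and lint assume Jest (set up later in this guide) and an ESLint config you add separately:

```json
{
  "scripts": {
    "dev": "ts-node-dev --respawn --transpile-only src/server.ts",
    "build": "tsc",
    "start": "node dist/server.js",
    "test": "jest",
    "lint": "eslint ."
  }
}
```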

Configure tsconfig.json with strict mode and sensible output settings:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "commonjs",
    "lib": ["ES2022"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}

Scalable Folder Structure

Feature-based organization scales better than role-based (controllers/, models/, services/ at the root). Group everything for a feature in one folder:

src/
├── config/
│   ├── database.ts      # DB connection
│   ├── env.ts           # Validated env vars
│   └── swagger.ts       # Swagger setup
├── features/
│   ├── auth/
│   │   ├── auth.controller.ts
│   │   ├── auth.service.ts
│   │   ├── auth.routes.ts
│   │   ├── auth.middleware.ts
│   │   └── auth.schema.ts   # Validation schemas
│   └── users/
│       ├── user.controller.ts
│       ├── user.service.ts
│       ├── user.model.ts
│       ├── user.routes.ts
│       └── user.schema.ts
├── shared/
│   ├── middleware/
│   │   ├── error.middleware.ts
│   │   ├── rateLimiter.ts
│   │   └── upload.ts
│   ├── types/
│   │   └── express.d.ts     # Augment Request type
│   └── utils/
│       ├── apiResponse.ts
│       └── logger.ts
├── app.ts
└── server.ts
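The two entry files at the bottom of the tree tie everything together. Here is a sketch of how app.ts and server.ts might wire things up; the module paths follow the layout above and are assumptions, not code shown elsewhere in this guide:

```typescript
// src/app.ts (sketch; module paths follow the folder layout above)
import express from 'express';
import cors from 'cors';
import helmet from 'helmet';
import morgan from 'morgan';
import authRoutes from './features/auth/auth.routes';
import userRoutes from './features/users/user.routes';
import { errorHandler } from './shared/middleware/error.middleware';

export const app = express();

app.use(helmet());
app.use(cors());
app.use(morgan('combined'));
app.use(express.json()); // parse JSON request bodies

app.use('/api/auth', authRoutes);
app.use('/api/users', userRoutes);

app.use(errorHandler); // error handler must be registered last

// src/server.ts -- start listening only after the DB is up
import { app } from './app';
import { connectDB } from './config/database';
import { env } from './config/env';
import { logger } from './shared/utils/logger';

connectDB().then(() => {
  app.listen(env.PORT, () => logger.info(`Listening on :${env.PORT}`));
});
```

Keeping app creation separate from app.listen matters for testing: supertest (used below) imports app directly and never binds a port.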

Environment Configuration

Never trust process.env directly. Validate and type environment variables at startup with a schema library such as Zod. If a required variable is missing, the process should crash immediately with a clear error rather than silently misbehaving later:

// src/config/env.ts
import 'dotenv/config'; // load .env before reading process.env
import { z } from 'zod';

const envSchema = z.object({
  NODE_ENV: z.enum(['development', 'test', 'production']).default('development'),
  PORT: z.coerce.number().default(3000),
  DATABASE_URL: z.string().url(),
  JWT_SECRET: z.string().min(32),
  JWT_REFRESH_SECRET: z.string().min(32),
  JWT_EXPIRES_IN: z.string().default('15m'),
  JWT_REFRESH_EXPIRES_IN: z.string().default('7d'),
});

export const env = envSchema.parse(process.env);
export type Env = z.infer<typeof envSchema>;
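For local development, pair this with a .env file that dotenv loads at startup. The values below are placeholders; generate real secrets with something like openssl rand -hex 32:

```
# .env (never commit this file)
NODE_ENV=development
PORT=3000
DATABASE_URL=mongodb://localhost:27017/my-api
JWT_SECRET=<32+ char random string>
JWT_REFRESH_SECRET=<32+ char random string>
```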

Database Setup — MongoDB with Mongoose

Connect to MongoDB with proper error handling and connection event listeners. For PostgreSQL, swap Mongoose for Prisma or TypeORM — the structure is identical, only the client changes.

// src/config/database.ts
import mongoose from 'mongoose';
import { env } from './env';
import { logger } from '../shared/utils/logger';

export async function connectDB(): Promise<void> {
  try {
    await mongoose.connect(env.DATABASE_URL);
    logger.info('MongoDB connected');
  } catch (err) {
    logger.error('MongoDB connection error', err);
    process.exit(1);
  }
}

mongoose.connection.on('disconnected', () => {
  logger.warn('MongoDB disconnected. Retrying...');
});

// src/features/users/user.model.ts
import { Schema, model, Document } from 'mongoose';

export interface IUser extends Document {
  email: string;
  passwordHash: string;
  role: 'user' | 'admin';
  refreshTokens: string[];
  createdAt: Date;
}

const UserSchema = new Schema<IUser>({
  email: { type: String, required: true, unique: true, lowercase: true },
  passwordHash: { type: String, required: true },
  role: { type: String, enum: ['user', 'admin'], default: 'user' },
  refreshTokens: [String],
}, { timestamps: true });

export const User = model<IUser>('User', UserSchema);

JWT Auth with Refresh Tokens

Pairing short-lived access tokens (15 minutes) with long-lived refresh tokens (7 days) is a widely used pattern. Store refresh tokens in the database so you can revoke them server-side:

// src/features/auth/auth.service.ts
import jwt from 'jsonwebtoken';
import bcrypt from 'bcryptjs';
import { env } from '../../config/env';
import { User, IUser } from '../users/user.model';

export async function register(email: string, password: string) {
  const passwordHash = await bcrypt.hash(password, 12);
  const user = await User.create({ email, passwordHash });
  return generateTokenPair(user);
}

export async function login(email: string, password: string) {
  const user = await User.findOne({ email });
  if (!user || !await bcrypt.compare(password, user.passwordHash)) {
    throw new Error('Invalid credentials');
  }
  return generateTokenPair(user);
}

async function generateTokenPair(user: IUser) {
  const payload = { sub: user._id.toString(), role: user.role };

  const accessToken = jwt.sign(payload, env.JWT_SECRET, {
    expiresIn: env.JWT_EXPIRES_IN,
  });
  const refreshToken = jwt.sign(payload, env.JWT_REFRESH_SECRET, {
    expiresIn: env.JWT_REFRESH_EXPIRES_IN,
  });

  // Persist the refresh token so it can be revoked later; await it so
  // the token is stored before the response goes out
  await User.findByIdAndUpdate(user._id, {
    $push: { refreshTokens: refreshToken },
  });

  return { accessToken, refreshToken };
}

export async function refresh(token: string) {
  const payload = jwt.verify(token, env.JWT_REFRESH_SECRET) as jwt.JwtPayload;
  const user = await User.findById(payload.sub);
  if (!user || !user.refreshTokens.includes(token)) {
    throw new Error('Invalid refresh token');
  }
  // Rotate: remove old, issue new
  await User.findByIdAndUpdate(user._id, {
    $pull: { refreshTokens: token },
  });
  return generateTokenPair(user);
}
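It helps to know what jwt.sign actually produces. A JWT is three base64url segments, header.payload.signature, where (for the default HS256 algorithm) the signature is an HMAC-SHA256 over the first two segments. A minimal sketch using Node's built-in crypto, for illustration only; keep using the jsonwebtoken library in real code:

```typescript
// Illustrative sketch of HS256 JWT signing; the jsonwebtoken library
// also handles exp claims, verification, and algorithm pinning.
import { createHmac } from 'node:crypto';

function base64url(s: string): string {
  return Buffer.from(s).toString('base64url');
}

function signHS256(payload: object, secret: string): string {
  const header = base64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = base64url(JSON.stringify(payload));
  const signature = createHmac('sha256', secret)
    .update(`${header}.${body}`)
    .digest('base64url');
  return `${header}.${body}.${signature}`;
}

const token = signHS256({ sub: 'user-1', role: 'user' }, 'demo-secret');
console.log(token.split('.').length); // 3
```

Note that the payload is only encoded, not encrypted; never put secrets in it. Verification is the reverse: recompute the HMAC over the first two segments and compare it to the third.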

Role-Based Access Control (RBAC)

Implement RBAC as Express middleware. Decorate routes with authorize('admin') to restrict access:

// src/features/auth/auth.middleware.ts
import { Request, Response, NextFunction } from 'express';
import jwt from 'jsonwebtoken';
import { env } from '../../config/env';

export function authenticate(req: Request, res: Response, next: NextFunction) {
  const token = req.headers.authorization?.split(' ')[1];
  if (!token) return res.status(401).json({ message: 'No token provided' });

  try {
    const payload = jwt.verify(token, env.JWT_SECRET) as jwt.JwtPayload;
    req.user = { id: payload.sub!, role: payload.role };
    next();
  } catch {
    res.status(401).json({ message: 'Invalid token' });
  }
}

export function authorize(...roles: string[]) {
  return (req: Request, res: Response, next: NextFunction) => {
    if (!req.user || !roles.includes(req.user.role)) {
      return res.status(403).json({ message: 'Forbidden' });
    }
    next();
  };
}

// Usage in routes:
// router.delete('/:id', authenticate, authorize('admin'), deleteUser);
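The req.user assignment above compiles because of the src/shared/types/express.d.ts file from the folder structure, which augments Express's Request type via declaration merging. A minimal version might look like this:

```typescript
// src/shared/types/express.d.ts
// Augments Express's Request so req.user is typed everywhere.
declare global {
  namespace Express {
    interface Request {
      user?: { id: string; role: string };
    }
  }
}

export {}; // makes this file a module so `declare global` takes effect
```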

Input Validation with express-validator

Validate all incoming data at the route level before it reaches the controller. Use express-validator or Zod middleware:

// src/features/auth/auth.schema.ts
import { body } from 'express-validator';

export const registerSchema = [
  body('email').isEmail().normalizeEmail(),
  body('password').isLength({ min: 8 }).withMessage('Min 8 characters'),
];

// src/shared/middleware/validate.ts
import { Request, Response, NextFunction } from 'express';
import { validationResult } from 'express-validator';

export function validate(req: Request, res: Response, next: NextFunction) {
  const errors = validationResult(req);
  if (!errors.isEmpty()) {
    return res.status(422).json({ errors: errors.array() });
  }
  next();
}

Centralized Error Handling

All errors should flow to a single Express error handler. Create a custom AppError class to distinguish operational errors (400/404/422) from programming bugs (500):

// src/shared/middleware/error.middleware.ts
import { Request, Response, NextFunction } from 'express';
import { logger } from '../utils/logger';

export class AppError extends Error {
  constructor(
    public message: string,
    public statusCode: number = 500,
    public isOperational = true,
  ) {
    super(message);
    Object.setPrototypeOf(this, new.target.prototype);
  }
}

export function errorHandler(
  err: Error,
  req: Request,
  res: Response,
  _next: NextFunction,
) {
  if (err instanceof AppError) {
    return res.status(err.statusCode).json({
      status: 'error',
      message: err.message,
    });
  }
  // Unknown error — log and return 500
  logger.error(err);
  res.status(500).json({ status: 'error', message: 'Internal server error' });
}
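One wiring detail the error handler depends on: Express 4 does not forward rejected promises from async route handlers to the error middleware, so an async controller that throws never reaches errorHandler and the request times out. A tiny wrapper closes that gap (catchAsync is a hypothetical helper name, not an Express API; Express 5 forwards async errors automatically):

```typescript
// Hypothetical helper: routes errors from async handlers into next(),
// where the centralized errorHandler picks them up.
type Handler = (req: unknown, res: unknown, next: (err?: unknown) => void) => unknown;

export const catchAsync = (fn: Handler): Handler =>
  (req, res, next) =>
    Promise.resolve(fn(req, res, next)).catch(next);

// Usage: router.get('/users', catchAsync(async (req, res) => { ... }));
```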

Rate Limiting and Helmet.js

Apply helmet for security headers and express-rate-limit to prevent abuse. Put stricter limits on auth endpoints:

import rateLimit from 'express-rate-limit';
import helmet from 'helmet';

app.use(helmet());

const globalLimiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 100 });
const authLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 10,
  message: 'Too many auth attempts, try again in 15 minutes',
});

app.use('/api', globalLimiter);
app.use('/api/auth', authLimiter);
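express-rate-limit's default behaviour is a fixed-window counter per client key (usually the IP). Stripped of Express, the core algorithm is tiny; this is a sketch of the idea, not the library's actual code:

```typescript
// Minimal fixed-window rate limiter sketch (not express-rate-limit's code).
class FixedWindowLimiter {
  private hits = new Map<string, { count: number; resetAt: number }>();

  constructor(private windowMs: number, private max: number) {}

  // Returns true if the request is allowed, false once the key is over the limit.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now >= entry.resetAt) {
      // First hit in a fresh window: reset the counter
      this.hits.set(key, { count: 1, resetAt: now + this.windowMs });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}

const limiter = new FixedWindowLimiter(15 * 60 * 1000, 100);
console.log(limiter.allow('203.0.113.7')); // true: first hit in the window
```

In production the library also sets the RateLimit-* response headers for you, and once you run more than one instance you would back the counter with a shared store such as Redis (rate-limit-redis) instead of an in-memory Map.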

Swagger API Documentation

Auto-generate documentation from JSDoc comments using swagger-jsdoc and serve it with swagger-ui-express. Document every endpoint and model — this becomes your API contract with frontend developers:

/**
 * @swagger
 * /api/auth/login:
 *   post:
 *     summary: Login with email and password
 *     tags: [Auth]
 *     requestBody:
 *       required: true
 *       content:
 *         application/json:
 *           schema:
 *             type: object
 *             required: [email, password]
 *             properties:
 *               email:
 *                 type: string
 *                 format: email
 *               password:
 *                 type: string
 *                 minLength: 8
 *     responses:
 *       200:
 *         description: Returns access and refresh tokens
 *       401:
 *         description: Invalid credentials
 */
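The swagger.ts file from the folder structure is what turns these comments into a served UI, using swagger-jsdoc and swagger-ui-express. A sketch, where the title, docs route, and glob pattern are placeholders to adjust for your project:

```typescript
// src/config/swagger.ts (sketch; adjust metadata and globs to your project)
import swaggerJsdoc from 'swagger-jsdoc';
import swaggerUi from 'swagger-ui-express';
import { Express } from 'express';

const spec = swaggerJsdoc({
  definition: {
    openapi: '3.0.0',
    info: { title: 'My API', version: '1.0.0' },
  },
  // Files containing @swagger JSDoc comments
  apis: ['./src/features/**/*.routes.ts'],
});

export function setupSwagger(app: Express): void {
  app.use('/api/docs', swaggerUi.serve, swaggerUi.setup(spec));
}
```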

Testing with Jest

Use Jest with supertest for integration tests and mongodb-memory-server to spin up an in-memory MongoDB for each test suite — no external dependencies, fast and deterministic:

// src/features/auth/__tests__/auth.test.ts
import request from 'supertest';
import { app } from '../../../app';
import { connectTestDB, closeTestDB } from '../../../test/db';

beforeAll(async () => { await connectTestDB(); });
afterAll(async () => { await closeTestDB(); });

describe('POST /api/auth/register', () => {
  it('should register a new user and return tokens', async () => {
    const res = await request(app)
      .post('/api/auth/register')
      .send({ email: 'test@example.com', password: 'Password1!' });

    expect(res.status).toBe(201);
    expect(res.body).toHaveProperty('accessToken');
    expect(res.body).toHaveProperty('refreshToken');
  });

  it('should reject duplicate email', async () => {
    await request(app).post('/api/auth/register')
      .send({ email: 'dup@example.com', password: 'Password1!' });
    const res = await request(app).post('/api/auth/register')
      .send({ email: 'dup@example.com', password: 'Password1!' });
    expect(res.status).toBe(409);
  });
});
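The connectTestDB and closeTestDB helpers imported above are not shown elsewhere in this guide; a sketch of src/test/db.ts using mongodb-memory-server might look like this:

```typescript
// src/test/db.ts (sketch) -- in-memory MongoDB for tests,
// via: npm install -D mongodb-memory-server
import mongoose from 'mongoose';
import { MongoMemoryServer } from 'mongodb-memory-server';

let mongod: MongoMemoryServer;

export async function connectTestDB(): Promise<void> {
  mongod = await MongoMemoryServer.create();
  await mongoose.connect(mongod.getUri());
}

export async function closeTestDB(): Promise<void> {
  await mongoose.connection.dropDatabase();
  await mongoose.disconnect();
  await mongod.stop();
}
```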

Structured Logging with Pino

Use Pino for structured JSON logging: it is among the fastest Node.js loggers and integrates with log aggregation services like Datadog and Loki. In production, write raw JSON to stdout or a file for your aggregator, and reserve pino-pretty for development:

// src/shared/utils/logger.ts
import pino from 'pino';
import { env } from '../../config/env';

export const logger = pino({
  level: env.NODE_ENV === 'production' ? 'info' : 'debug',
  transport: env.NODE_ENV !== 'production'
    ? { target: 'pino-pretty', options: { colorize: true } }
    : undefined,
});

Dockerizing the API

A multi-stage Dockerfile keeps the production image small by separating the build from the runtime:

# Dockerfile
FROM node:22-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM node:22-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=builder /app/dist ./dist
EXPOSE 3000
CMD ["node", "dist/server.js"]
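One easy mistake here: COPY . . in the builder stage will also copy your host's node_modules and build output into the image unless you exclude them, and keeping .env out of the image avoids baking secrets in. Add a minimal .dockerignore next to the Dockerfile:

```
node_modules
dist
.git
.env
npm-debug.log
```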

CI/CD with GitHub Actions

Automate testing and deployment on every push to main. The pipeline runs lint, tests, builds the Docker image, and pushes to Docker Hub:

# .github/workflows/ci.yml
name: CI/CD
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '22', cache: 'npm' }
      - run: npm ci
      - run: npm run lint
      - run: npm test -- --coverage

  deploy:
    needs: test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          push: true
          tags: ${{ secrets.DOCKERHUB_USERNAME }}/my-api:latest

With this pipeline in place, every merged PR automatically ships a tested, containerized build. Pair this with a DigitalOcean droplet or a Railway deployment that pulls the latest image on push, and you have a fully automated delivery pipeline.