Advanced & Expert TypeScript — Mastering the Type System and Beyond

Prerequisites: Solid understanding of TypeScript fundamentals: basic types (string, number, boolean, arrays, tuples), interfaces and type aliases, classes and inheritance, functions with typed parameters/return types, enums, modules (import/export), and basic generics like Array<T>. Familiarity with ES6+ JavaScript features including destructuring, spread/rest, Promises, and async/await.

Advanced Generics: Constraints, Defaults & Patterns

If you've written <T> a few times and feel comfortable with basic generic functions, it's time to go deeper. Advanced generics are where TypeScript's type system starts behaving like a real programming language — you can express relationships between types, propagate literal types through call chains, and build APIs that guide users with precise autocompletion.

This section assumes you already understand basic generic syntax. We'll focus on the patterns and subtleties that separate competent TypeScript from expert TypeScript.

Generic Constraints with extends

An unconstrained generic <T> accepts anything — which means inside the function body, TypeScript knows nothing about T. Constraints let you narrow what T can be while still preserving the specific type the caller provides.

typescript
// Without constraint: T is unknown inside the body
function getId<T>(item: T): string {
  return item.id; // ❌ Error: Property 'id' does not exist on type 'T'
}

// With constraint: T must have at least an `id` property
function getId<T extends { id: string }>(item: T): string {
  return item.id; // ✅ Works — TypeScript knows T has `id`
}

// The caller's specific type is preserved
const user = { id: "u-1", name: "Alice", role: "admin" };
getId(user); // T inferred as { id: string; name: string; role: string }

The constraint T extends { id: string } does two things: it guarantees you can access .id inside the function, and it still infers the caller's full type for T. This is the fundamental mechanism that makes generic constraints so powerful.

Multiple Type Parameters & Inter-dependent Constraints

Real-world generics often need multiple type parameters that relate to each other. The classic pattern is K extends keyof T, which ensures one parameter is a valid key of another.

typescript
function getProperty<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key];
}

const config = { host: "localhost", port: 3000, debug: true };

getProperty(config, "port");   // return type: number
getProperty(config, "debug");  // return type: boolean
getProperty(config, "foo");    // ❌ Error: "foo" is not assignable to "host" | "port" | "debug"

Notice the return type T[K] — it's an indexed access type that changes based on which key you pass. This is not just type safety; it's type precision. The caller gets back the exact type of that property, not some watered-down union.

You can chain constraints across three or more parameters:

typescript
function setNestedProperty<
  T,
  K1 extends keyof T,
  K2 extends keyof T[K1]
>(obj: T, key1: K1, key2: K2, value: T[K1][K2]): void {
  obj[key1][key2] = value;
}

const state = { user: { name: "Alice", age: 30 } };
setNestedProperty(state, "user", "age", 31);   // ✅
setNestedProperty(state, "user", "age", "31");  // ❌ string not assignable to number

Default Type Parameters

Just like function parameters can have defaults, generic type parameters can too. This is especially useful for library APIs where you want a sensible default but allow power users to override it.

typescript
interface ApiResponse<TData = unknown, TError = Error> {
  data: TData | null;
  error: TError | null;
  status: number;
}

// Use with defaults — TData is unknown, TError is Error
const raw: ApiResponse = { data: null, error: null, status: 200 };

// Override just the first parameter
const typed: ApiResponse<User[]> = { data: [], error: null, status: 200 };

// Override both
const custom: ApiResponse<User[], ApiError> = { data: [], error: null, status: 200 };

Defaults follow the same rule as function parameters: all required type parameters must come before optional ones. You can also combine defaults with constraints:

typescript
// TId must be string or number, defaults to string
interface Entity<TId extends string | number = string> {
  id: TId;
  createdAt: Date;
}

const a: Entity = { id: "abc", createdAt: new Date() };           // TId = string
const b: Entity<number> = { id: 42, createdAt: new Date() };     // TId = number
const c: Entity<boolean> = { id: true, createdAt: new Date() };  // ❌ Error
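
The ordering rule is enforced at declaration time. A minimal sketch (Result is an illustrative name):

```typescript
// ❌ A required type parameter may not follow one with a default:
// interface Bad<TData = unknown, TError> { ... }
//    Error: required type parameters may not follow optional type parameters

// ✅ Required first, defaulted last
interface Result<TData, TError = Error> {
  ok: boolean;
  data: TData | null;
  error: TError | null;
}

const r: Result<number> = { ok: true, data: 42, error: null };
```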

Constraining vs. Narrowing: A Critical Distinction

This is one of the most misunderstood areas in TypeScript generics. A constraint tells TypeScript the minimum shape that T must satisfy. It does not narrow T down to that shape — T could be anything that extends the constraint.

Key Insight

A constraint is a lower bound, not an exact type. T extends { id: string } means "T has at least an id: string" — it might have 50 other properties. This is why you can't return a plain { id: "123" } where a T is expected.

typescript
// ❌ Common mistake: trying to create a T from its constraint
function makeEntity<T extends { id: string }>(id: string): T {
  return { id }; // Error! { id: string } is not assignable to T
  // T might be { id: string; name: string; email: string }
  // You can't manufacture a full T from just the constraint
}

// ✅ Correct: accept a T, return a T
function stampEntity<T extends { id: string }>(entity: T): T & { updatedAt: Date } {
  return { ...entity, updatedAt: new Date() };
}

The error in the first function is a soundness check. If someone calls makeEntity<User>("u-1"), they expect a full User back — but the function only constructs a bare { id } object. TypeScript rightly rejects this.

Literal Type Inference: <T extends string> vs. Parameters

One of the most useful advanced generic patterns is capturing literal types. The way you position the generic determines whether TypeScript infers a literal or a widened type.

typescript
// Pattern 1: T is constrained to string but used in a parameter
function tag<T extends string>(value: T): { tag: T } {
  return { tag: value };
}
const a = tag("admin");  // { tag: "admin" } — literal "admin" preserved!

// Pattern 2: No generic, just accepts string
function tagWide(value: string): { tag: string } {
  return { tag: value };
}
const b = tagWide("admin"); // { tag: string } — literal lost

// Pattern 3: Generic without constraint
function tagAny<T>(value: T): { tag: T } {
  return { tag: value };
}
const c = tagAny("admin"); // { tag: "admin" } — also preserved!

When a generic type parameter appears in a function's parameter position, TypeScript infers the narrowest possible type — which is why both generic versions above preserve the literal "admin". The extends string constraint doesn't change the inference here; its job is to reject non-string arguments while still admitting literal string types like "admin" or "readonly". This is the basis for type-safe event systems, builder patterns, and DSLs.

Capturing Literal Object Types

typescript
// Use `const` type parameter (TS 5.0+) to infer deep literals
function defineRoute<const T extends { path: string; method: string }>(
  route: T
): T {
  return route;
}

// Without `const`: T = { path: string; method: string }
// With `const`:    T = { readonly path: "/users"; readonly method: "GET" }  ← exact (readonly) literals!
const route = defineRoute({ path: "/users", method: "GET" });

The Generic Factory Pattern

A generic factory takes a constructor (or a description) and returns a typed instance. This pattern shows up in dependency injection containers, ORM model definitions, and component registries.

typescript
// A constructor signature as a type
type Constructor<T> = new (...args: any[]) => T;

function createInstance<T>(Ctor: Constructor<T>, ...args: any[]): T {
  return new Ctor(...args);
}

class UserService {
  constructor(public baseUrl: string) {}
}

// TypeScript infers T = UserService from the constructor
const service = createInstance(UserService, "https://api.example.com");
service.baseUrl; // ✅ string — fully typed

You can combine this with constraints to ensure the constructed class implements a specific interface:

typescript
interface Disposable {
  dispose(): void;
}

function createManaged<T extends Disposable>(
  Ctor: Constructor<T>,
  ...args: any[]
): T {
  const instance = new Ctor(...args);
  // register instance for cleanup...
  return instance;
}

// Only classes implementing Disposable can be passed here
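
A usage sketch, repeating the definitions so the snippet is self-contained (FileHandle is an illustrative class):

```typescript
// Repeated from above:
interface Disposable {
  dispose(): void;
}
type Constructor<T> = new (...args: any[]) => T;

function createManaged<T extends Disposable>(Ctor: Constructor<T>, ...args: any[]): T {
  const instance = new Ctor(...args);
  // (registration for cleanup omitted)
  return instance;
}

// Structurally satisfies Disposable, so it passes the constraint
class FileHandle {
  constructor(public path: string) {}
  dispose(): void { /* close the underlying resource */ }
}

const handle = createManaged(FileHandle, "/tmp/data.txt");
handle.path;      // ✅ string, T inferred as FileHandle
handle.dispose();

// createManaged(class {});  // ❌ rejected: no `dispose` method
```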

Generic Higher-Order Functions

The real power of generics shines in higher-order functions — functions that accept or return other functions. The goal is to preserve type information through the call chain so the end consumer gets precise types, not any or a vague union.

typescript
// A pipe that chains two functions with type propagation
function pipe<A, B, C>(
  fn1: (a: A) => B,
  fn2: (b: B) => C
): (a: A) => C {
  return (a) => fn2(fn1(a));
}

const parseAndDouble = pipe(
  (s: string) => parseInt(s, 10),  // string → number
  (n: number) => n * 2              // number → number
);

parseAndDouble("21"); // 42 — return type is number
parseAndDouble(21);   // ❌ Error: number is not assignable to string

Each generic parameter (A, B, C) acts as a "slot" that TypeScript fills in from the provided functions. The type flows from the first function's input (A), through its output, which must match the second function's input (B), and on to the final result (C).

Preserving Types in Wrapper Functions

A common need is to wrap a function while preserving its exact signature. This requires capturing all the type information:

typescript
function withLogging<TArgs extends unknown[], TReturn>(
  fn: (...args: TArgs) => TReturn
): (...args: TArgs) => TReturn {
  return (...args) => {
    console.log("Calling with:", args);
    const result = fn(...args);
    console.log("Returned:", result);
    return result;
  };
}

function add(a: number, b: number): number {
  return a + b;
}

const loggedAdd = withLogging(add);
loggedAdd(2, 3);      // ✅ (a: number, b: number) => number
loggedAdd(2, "3");    // ❌ Error — type safety preserved

The TArgs extends unknown[] pattern captures the function's entire parameter list as a tuple type, and TReturn captures the return type. The wrapped function has the exact same signature as the original.

Common Pitfall: Over-Constraining & Losing Inference

One of the most frequent mistakes is being too explicit with generics. When you manually specify type parameters, you override TypeScript's inference — and you often get less precise types as a result.

Don't Fight the Inference

If you find yourself manually writing type arguments at call sites like fn<SomeType>(arg), it usually means the generic signature could be improved. Let the arguments drive inference whenever possible.

typescript
function pick<T, K extends keyof T>(obj: T, keys: K[]): Pick<T, K> {
  const result = {} as Pick<T, K>;
  keys.forEach(k => result[k] = obj[k]);
  return result;
}

const user = { id: 1, name: "Alice", email: "a@b.com", role: "admin" };

// ✅ Let inference work — K is inferred as "name" | "email"
const subset = pick(user, ["name", "email"]);
// Type: Pick<{id: number; name: string; email: string; role: string}, "name" | "email">

// ❌ Over-specifying — inference already produces this type; the manual
// arguments are noise that must be kept in sync with the values by hand
const manual = pick<typeof user, "name" | "email">(user, ["name", "email"]);

The "Unnecessary Generic" Smell

Another common mistake is using a generic when a simple concrete type would do. If a generic type parameter appears only once in a function signature, it's almost certainly unnecessary.

typescript
// ❌ T is used only once — it doesn't relate input to output
function printLength<T extends { length: number }>(item: T): void {
  console.log(item.length);
}

// ✅ Just use the constraint directly
function printLength(item: { length: number }): void {
  console.log(item.length);
}

// ✅ T used twice — it DOES relate input to output. This is a real generic.
function first<T>(arr: T[]): T | undefined {
  return arr[0];
}

The rule of thumb: a generic exists to create a relationship. If T connects a parameter type to the return type, or one parameter to another, it earns its place. If it appears once and connects nothing, replace it with the constraint.

Putting It All Together: A Type-Safe Event Emitter

Let's combine constraints, literal inference, inter-dependent parameters, and higher-order functions into a single real-world pattern:

typescript
interface EventMap {
  login:  { userId: string; timestamp: number };
  logout: { userId: string };
  error:  { code: number; message: string };
}

class TypedEmitter<TEvents extends Record<string, unknown>> {
  private handlers = new Map<string, Set<Function>>();

  on<K extends keyof TEvents & string>(
    event: K,
    handler: (payload: TEvents[K]) => void
  ): void {
    if (!this.handlers.has(event)) this.handlers.set(event, new Set());
    this.handlers.get(event)!.add(handler);
  }

  emit<K extends keyof TEvents & string>(
    event: K,
    payload: TEvents[K]
  ): void {
    this.handlers.get(event)?.forEach(fn => fn(payload));
  }
}

const emitter = new TypedEmitter<EventMap>();

emitter.on("login", (payload) => {
  // payload is { userId: string; timestamp: number } — fully typed!
  console.log(payload.userId, payload.timestamp);
});

emitter.emit("login", { userId: "u-1", timestamp: Date.now() }); // ✅
emitter.emit("login", { userId: "u-1" }); // ❌ missing timestamp
emitter.emit("typo", {});                 // ❌ "typo" not in EventMap

This pattern ties everything together: TEvents constrains the entire event map, K extends keyof TEvents ensures only valid event names are used, and TEvents[K] looks up the exact payload type for each event. The result is an emitter where every .on() and .emit() call is fully type-checked — including the payload shape.

Design Principle

When designing generic APIs, start with the call site. Write out how you want the function to be called, then work backward to the signature. If inference doesn't give you what you need, add constraints. If constraints aren't enough, add type parameters. Resist the urge to over-engineer — the simplest generic that gives correct types at the call site wins.
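
As a sketch of that workflow: write the calls you want as comments first, then derive the signature that makes them typecheck (createStore is a hypothetical API):

```typescript
// Step 1: the call sites we want
//   const store = createStore({ count: 0, label: "hi" });
//   store.get("count");      // should be number, and only existing keys allowed
//   store.set("count", 5);   // value must match the key's type

// Step 2: work backward to a signature that makes those calls typecheck
function createStore<T extends object>(initial: T) {
  const state = { ...initial };
  return {
    get<K extends keyof T>(key: K): T[K] {
      return state[key];
    },
    set<K extends keyof T>(key: K, value: T[K]): void {
      state[key] = value;
    },
  };
}

const store = createStore({ count: 0, label: "hi" });
store.get("count");          // number
// store.get("missing");     // ❌ not a key of the initial state
```

No explicit type annotations were needed at the call site: the initial object drives all the inference.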

Conditional Types, Distributive Behavior & the infer Keyword

Conditional types are the if/else of TypeScript's type system. They let you express type-level branching with the syntax T extends U ? X : Y — "if T is assignable to U, resolve to X; otherwise, resolve to Y." Combined with the infer keyword and distributive behavior over unions, conditional types unlock pattern-matching capabilities that rival those of full programming languages.

The Basic Shape

A conditional type checks an extends constraint at the type level. The left side is your "input," the right side of extends is the pattern you're testing against, and the two branches are the possible results.

typescript
type IsString<T> = T extends string ? true : false;

type A = IsString<string>;   // true
type B = IsString<number>;   // false
type C = IsString<"hello">;  // true  — "hello" extends string

The extends here is not the same as class inheritance. It means "is assignable to" — the same relationship TypeScript checks when you assign a value to a variable. "hello" extends string is true because every string literal is assignable to string.

Distributive Conditional Types

Here's where conditional types become surprisingly powerful — and surprisingly confusing. When you pass a union type to a conditional type where the checked type is a naked type parameter (meaning T on its own, not wrapped in anything), TypeScript distributes the conditional over each member of the union individually, then re-unions the results.

typescript
type ToArray<T> = T extends any ? T[] : never;

// Distribution in action:
// ToArray<string | number>
//   = (string extends any ? string[] : never) | (number extends any ? number[] : never)
//   = string[] | number[]

type Result = ToArray<string | number>;  // string[] | number[]

Notice the result is string[] | number[], not (string | number)[]. Each union member was processed independently, then the results were combined. This is distribution.

mermaid
flowchart LR
    Input["Input: string | number"]

    subgraph Distributive ["Distributive (naked T)"]
        direction TB
        D1["string extends any?"] -->|Yes| R1["string[]"]
        D2["number extends any?"] -->|Yes| R2["number[]"]
    end

    subgraph NonDist ["Non-Distributive (wrapped [T])"]
        direction TB
        D3["[string | number] extends [any]?"] -->|Yes| R3["(string | number)[]"]
    end

    Input --> Distributive
    Input --> NonDist

    R1 --> Union["Result: string[] | number[]"]
    R2 --> Union
    R3 --> Wrapped["Result: (string | number)[]"]

Preventing Distribution with Tuple Wrapping

Sometimes distribution is not what you want. You can disable it by wrapping both sides of the extends check in a tuple ([T] and [U]). This makes T no longer a "naked" type parameter, so the union is checked as a whole.

typescript
// Distributive — each member tested individually
type ToArray<T> = T extends any ? T[] : never;
type D = ToArray<string | number>;  // string[] | number[]

// Non-distributive — union tested as a whole
type ToArrayND<T> = [T] extends [any] ? T[] : never;
type ND = ToArrayND<string | number>;  // (string | number)[]

Why "naked" matters

Distribution only triggers when T appears directly to the left of extends with no wrapper. T extends U distributes. [T] extends [U], T[] extends U[], Promise<T> extends Promise<U> — none of these distribute. The tuple trick [T] extends [U] is the conventional idiom because it has minimal semantic impact.

The infer Keyword — Type-Level Pattern Matching

The infer keyword lets you capture a piece of a type inside a conditional check. Think of it as a "wildcard with a name" — you declare a type variable in the extends clause, and if the match succeeds, that variable holds the captured type in the true branch.

Unwrapping Promises

typescript
type UnwrapPromise<T> = T extends Promise<infer U> ? U : T;

type X = UnwrapPromise<Promise<string>>;  // string
type Y = UnwrapPromise<Promise<number[]>>; // number[]
type Z = UnwrapPromise<boolean>;            // boolean — not a Promise, return T

infer U says: "If T matches the shape Promise<something>, bind that something to U." You can only use infer in the extends clause of a conditional type — never in a standalone position.

Decomposing Function Types

One of the most practical uses of infer is pulling apart function signatures. You can extract parameters, return types, or both at the same time using multiple infer positions.

typescript
type Params<T> = T extends (...args: infer P) => any ? P : never;
type Return<T> = T extends (...args: any[]) => infer R ? R : never;

type Fn = (name: string, age: number) => boolean;

type FnParams = Params<Fn>;   // [name: string, age: number]
type FnReturn = Return<Fn>;   // boolean

TypeScript's built-in Parameters<T> and ReturnType<T> utilities are implemented exactly this way. Understanding infer means you can build your own specialized versions.
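
As a sketch of a specialized version, here is a type that extracts only the first parameter, plus a runtime helper built on the same tuple inference (FirstParam and partialFirst are illustrative names, not built-ins):

```typescript
// Extract just the first parameter's type
type FirstParam<T> = T extends (first: infer F, ...rest: any[]) => any ? F : never;

type Greet = (name: string, excited: boolean) => string;
type FirstOfGreet = FirstParam<Greet>;  // string

// The same inference at work in a runtime helper: pin the first
// argument, keep the remaining parameters as a typed tuple
function partialFirst<F, R extends any[], Out>(
  fn: (first: F, ...rest: R) => Out,
  first: F
): (...rest: R) => Out {
  return (...rest) => fn(first, ...rest);
}

const greet = (name: string, excited: boolean) => (excited ? `Hi, ${name}!` : `Hi, ${name}`);
const greetAda = partialFirst(greet, "Ada");
greetAda(true);   // "Hi, Ada!" (remaining parameters stay fully typed)
```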

Nested and Multi-Position infer

You can use infer at multiple positions in a single conditional, and you can nest conditional types to peel back layers of wrapping progressively.

typescript
// Extract the value type from a Map
type MapValue<T> = T extends Map<infer K, infer V> ? V : never;
type Val = MapValue<Map<string, Date>>;  // Date

// Deeply unwrap nested Promises recursively
type DeepUnwrap<T> = T extends Promise<infer U> ? DeepUnwrap<U> : T;

type Deep = DeepUnwrap<Promise<Promise<Promise<number>>>>;  // number

// Extract first element of a tuple
type Head<T extends any[]> = T extends [infer First, ...any[]] ? First : never;
type H = Head<[string, number, boolean]>;  // string

never — The Empty Union

The never type is the bottom type in TypeScript — it represents something that can never happen. Crucially, never is the empty union. A union of zero members. This fact has a direct and often surprising interaction with distributive conditional types.

When you pass never into a distributive conditional type, TypeScript distributes over its members — but there are zero members. The result of distributing over nothing is never itself. The conditional body never executes at all.

typescript
type IsString<T> = T extends string ? "yes" : "no";

type R1 = IsString<never>;  // never  — NOT "yes" or "no"!

// To actually check for never, prevent distribution:
type IsNever<T> = [T] extends [never] ? true : false;

type R2 = IsNever<never>;   // true
type R3 = IsNever<string>;  // false

Type Filtering with Conditional Types

Distribution makes conditional types a natural tool for filtering unions. By resolving unwanted members to never, they vanish from the resulting union (since X | never simplifies to X).

typescript
// Keep only string members of a union
type OnlyStrings<T> = T extends string ? T : never;

type Mixed = "hello" | 42 | "world" | true | "ts";
type Strings = OnlyStrings<Mixed>;  // "hello" | "world" | "ts"

// This is exactly how the built-in Extract and Exclude work:
type Extract<T, U> = T extends U ? T : never;
type Exclude<T, U> = T extends U ? never : T;

type NoStrings = Exclude<Mixed, string>;  // 42 | true

Conditional Types with Mapped Types

Conditional types compose powerfully with mapped types. You can iterate over an object's keys and conditionally transform each property based on its type — a pattern used extensively in real-world library types.

typescript
// Extract only the keys whose values are functions
type FunctionKeys<T> = {
  [K in keyof T]: T[K] extends (...args: any[]) => any ? K : never;
}[keyof T];

interface API {
  baseUrl: string;
  timeout: number;
  fetchUser: (id: string) => Promise<User>;
  deleteUser: (id: string) => Promise<void>;
}

type APIMethods = FunctionKeys<API>;  // "fetchUser" | "deleteUser"

The pattern here is a two-step trick: the mapped type produces an object where each value is either the key name (if it passes the test) or never. Indexing with [keyof T] then collects all those values into a union, and the never entries disappear.
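
The same trick pairs naturally with a runtime filter. A sketch, with FunctionKeys repeated so the snippet is self-contained (dataOnly is an illustrative helper):

```typescript
type FunctionKeys<T> = {
  [K in keyof T]: T[K] extends (...args: any[]) => any ? K : never;
}[keyof T];

// Strip function-valued properties at runtime; the return type mirrors
// the runtime behavior by omitting the function keys
function dataOnly<T extends object>(obj: T): Omit<T, FunctionKeys<T>> {
  const out: any = {};
  for (const [k, v] of Object.entries(obj)) {
    if (typeof v !== "function") out[k] = v;
  }
  return out;
}

const api = {
  baseUrl: "https://example.com",
  timeout: 5000,
  fetchUser: (id: string) => Promise.resolve(id),
};

const data = dataOnly(api);
// data: { baseUrl: string; timeout: number } — fetchUser removed at both
// the type level and the value level
```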

Chaining Conditionals

You can chain conditional types just like nested ternaries in JavaScript. This lets you handle multiple cases in sequence — a type-level switch statement.

typescript
type TypeName<T> =
  T extends string  ? "string" :
  T extends number  ? "number" :
  T extends boolean ? "boolean" :
  T extends null    ? "null" :
  T extends undefined ? "undefined" :
  T extends (...args: any[]) => any ? "function" :
  "object";

type T1 = TypeName<string>;       // "string"
type T2 = TypeName<() => void>;   // "function"
type T3 = TypeName<Date>;         // "object"
type T4 = TypeName<string | number>;  // "string" | "number" (distributes!)

Common Gotchas

T extends string vs. string extends T

The order matters. T extends string asks "is T assignable to string?" — it checks if T is a subtype. string extends T flips the direction and asks "is string assignable to T?" — it checks if T is a supertype. These are fundamentally different questions.

typescript
type Check1<T> = T extends string ? "yes" : "no";
type Check2<T> = string extends T ? "yes" : "no";

// "hello" is assignable to string → yes
type A1 = Check1<"hello">;       // "yes"
// string is NOT assignable to "hello" → no
type A2 = Check2<"hello">;       // "no"

// string is assignable to string → yes (both directions)
type B1 = Check1<string>;        // "yes"
type B2 = Check2<string>;        // "yes"

// string is assignable to unknown → yes
type C = Check2<unknown>;        // "yes"

The never distribution trap

Passing never to a distributive conditional type always produces never, regardless of the branches. This isn't a bug — never is an empty union, so there are zero members to distribute over, and the union of zero results is never. If you need to detect never, wrap both sides: [T] extends [never] ? true : false.

Deferred Evaluation

When TypeScript can't resolve a conditional type immediately (because T is still a generic parameter), it defers evaluation. This means the type remains in its conditional form until T is instantiated with a concrete type. You'll see this in function signatures — it's not an error, just TypeScript waiting for more information.

typescript
function process<T>(value: T): T extends string ? number : boolean {
  // Inside here, TypeScript can't resolve the conditional —
  // T is still generic. You'll likely need a type assertion.
  if (typeof value === "string") {
    return 42 as any;  // necessary escape hatch
  }
  return true as any;
}

Prefer overloads over conditional return types

If you find yourself using conditional types in a function's return position and needing as any inside the body, function overloads are usually a cleaner pattern. They give you proper type narrowing inside each implementation branch without escape hatches.
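
Here is a sketch of the same function rewritten with overloads (renamed processValue to avoid shadowing Node's global process):

```typescript
// Overload signatures: what callers see
function processValue(value: string): number;
function processValue(value: unknown): boolean;
// Implementation signature: wide enough to cover both overloads
function processValue(value: unknown): number | boolean {
  if (typeof value === "string") {
    return 42;   // ordinary narrowing, no `as any` required
  }
  return true;
}

const n = processValue("hello");  // number
const b = processValue(123);      // boolean
```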

Putting It All Together — A Real-World Example

Here's a type that combines conditionals, infer, distribution, and recursion to deeply extract the "inner type" from any level of wrapping — whether it's a Promise, an Array, or both.

typescript
type Unwrap<T> =
  T extends Promise<infer U> ? Unwrap<U> :
  T extends Array<infer U>   ? Unwrap<U> :
  T extends Map<any, infer V> ? Unwrap<V> :
  T extends Set<infer U>     ? Unwrap<U> :
  T;

type T1 = Unwrap<Promise<string[]>>;            // string
type T2 = Unwrap<Map<string, Promise<number>>>; // number
type T3 = Unwrap<Set<boolean[]>>;               // boolean
type T4 = Unwrap<Promise<Set<Date>>>;           // Date

Each layer of wrapping is peeled off recursively by the chained conditionals. The recursion terminates when T doesn't match any wrapper pattern, returning the innermost type. This is the kind of expressive type-level logic that makes TypeScript's type system Turing-complete — and makes conditional types one of the most important tools in your advanced TypeScript toolkit.

Mapped Types & Key Remapping

Mapped types let you create new types by iterating over the keys of an existing type and transforming each property. They are the for...in loop of the type system — and once you understand the pattern, you'll see that most of TypeScript's built-in utility types are just mapped types under the hood.

The Basic Pattern

The core syntax is { [K in keyof T]: NewValueType }. TypeScript iterates over every key K in T, and for each key, produces a property with the type you specify on the right side of the colon.

typescript
type Stringify<T> = {
  [K in keyof T]: string;
};

interface User {
  id: number;
  name: string;
  active: boolean;
}

type StringifiedUser = Stringify<User>;
// { id: string; name: string; active: string; }

The key variable K is available on the right side too, so you can build value types that depend on the key. You can also reference the original value type with T[K] — this is how you preserve or transform existing property types.

typescript
type Nullable<T> = {
  [K in keyof T]: T[K] | null;
};

type NullableUser = Nullable<User>;
// { id: number | null; name: string | null; active: boolean | null; }

Mapping Modifiers: readonly and ?

Mapped types can add or remove the readonly and optional (?) modifiers on properties. Prefix with + to add a modifier or - to remove it. The + is implicit if you write the modifier without a prefix.

typescript
// Add readonly to every property
type Freeze<T> = {
  +readonly [K in keyof T]: T[K];
};

// Remove readonly from every property
type Mutable<T> = {
  -readonly [K in keyof T]: T[K];
};

// Make every property optional
type Optionalize<T> = {
  [K in keyof T]+?: T[K];
};

// Make every property required (remove ?)
type Concrete<T> = {
  [K in keyof T]-?: T[K];
};

You can combine both modifiers in a single mapped type. For example, -readonly and -? together strip both markers at once:

typescript
type FullyMutable<T> = {
  -readonly [K in keyof T]-?: T[K];
};

interface Config {
  readonly host?: string;
  readonly port?: number;
}

type WritableConfig = FullyMutable<Config>;
// { host: string; port: number; }  — no readonly, no optional

Built-in Utility Types Are Mapped Types

Once you know the syntax, you can see how TypeScript's standard utility types are implemented. There is no magic — each is a short mapped type definition.

typescript
// Partial<T> — make all properties optional
type Partial<T> = {
  [K in keyof T]?: T[K];
};

// Required<T> — make all properties required
type Required<T> = {
  [K in keyof T]-?: T[K];
};

// Readonly<T> — make all properties readonly
type Readonly<T> = {
  readonly [K in keyof T]: T[K];
};

// Pick<T, Keys> — select a subset of properties
type Pick<T, Keys extends keyof T> = {
  [K in Keys]: T[K];
};

// Record<Keys, Value> — create a type with specific keys and uniform value type
type Record<Keys extends keyof any, Value> = {
  [K in Keys]: Value;
};

Homomorphic vs. Non-Homomorphic Mapped Types

Partial, Required, Readonly, and Pick are homomorphic — they iterate over keyof T (or a subset), so TypeScript preserves the original modifiers (readonly, ?) and only applies your explicit changes. Record is non-homomorphic because its keys come from an independent type parameter, not from keyof of the value type. Non-homomorphic mapped types don't carry over any modifiers from the source — they produce plain, required, writable properties by default.
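
A small sketch of the difference (Source is an illustrative type):

```typescript
interface Source {
  readonly a?: string;
  b: number;
}

// Homomorphic: Partial maps over keyof Source, so the `readonly` on `a` survives
type P = Partial<Source>;
// { readonly a?: string; b?: number }

// Non-homomorphic: Record's keys come from an independent parameter,
// so every property comes out plain, required, and writable
type R = Record<keyof Source, string>;
// { a: string; b: string }

const r: R = { a: "x", b: "y" };  // both keys now required
r.a = "z";                        // ✅ writable: no `readonly` carried over
```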

Key Remapping with as

TypeScript 4.1 introduced the as clause in mapped types, which lets you transform the key itself — not just the value. The syntax is { [K in keyof T as NewKey]: T[K] }. This is enormously powerful for renaming, filtering, and restructuring keys.

Renaming Keys

Use template literal types inside the as clause to systematically rename every key:

typescript
type Getters<T> = {
  [K in keyof T as `get${Capitalize<string & K>}`]: () => T[K];
};

interface Person {
  name: string;
  age: number;
}

type PersonGetters = Getters<Person>;
// { getName: () => string; getAge: () => number; }

The string & K intersection is needed because keyof T can include symbol and number keys, and Capitalize only works on strings. The intersection filters to just the string keys.

Filtering Keys (Remapping to never)

When the as clause resolves to never, that key is excluded from the resulting type. This gives you a clean way to filter properties:

typescript
// Keep only properties whose values are strings
type OnlyStrings<T> = {
  [K in keyof T as T[K] extends string ? K : never]: T[K];
};

interface Mixed {
  id: number;
  name: string;
  email: string;
  active: boolean;
}

type StringFields = OnlyStrings<Mixed>;
// { name: string; email: string; }

// Remove specific keys by name
type OmitById<T> = {
  [K in keyof T as K extends "id" ? never : K]: T[K];
};

type WithoutId = OmitById<Mixed>;
// { name: string; email: string; active: boolean; }

In fact, this is exactly how you can implement Omit from scratch — by remapping excluded keys to never.
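
Here is that from-scratch implementation, named MyOmit so it doesn't clash with the built-in:

```typescript
type MyOmit<T, K extends keyof any> = {
  [P in keyof T as P extends K ? never : P]: T[P];
};

interface Account {
  id: number;
  name: string;
  email: string;
}

type PublicAccount = MyOmit<Account, "id" | "email">;
// { name: string }

const acct: PublicAccount = { name: "Alice" };
```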

Mapping Over Union Types

You don't have to iterate over keyof T. The in clause accepts any union of string, number, or symbol literals. This lets you build types from arbitrary unions.

typescript
type EventName = "click" | "scroll" | "mouseover";

// Build an event handler map from a union
type EventHandlers = {
  [E in EventName]: (event: Event) => void;
};
// { click: (event: Event) => void; scroll: ...; mouseover: ...; }

// Combine with template literals and key remapping
type PrefixedHandlers = {
  [E in EventName as `on${Capitalize<E>}`]: (event: Event) => void;
};
// { onClick: (event: Event) => void; onScroll: ...; onMouseover: ...; }

Combining Mapped Types with Template Literal Types

Template literal types and mapped types are a potent combination. You can generate entire API surfaces from a single list of entity names.

typescript
type Entity = "user" | "post" | "comment";

type CrudApi = {
  [E in Entity as `get${Capitalize<E>}`]: (id: string) => Promise<unknown>;
} & {
  [E in Entity as `create${Capitalize<E>}`]: (data: unknown) => Promise<unknown>;
} & {
  [E in Entity as `delete${Capitalize<E>}`]: (id: string) => Promise<void>;
};

// Result includes: getUser, getPost, getComment,
//                  createUser, createPost, createComment,
//                  deleteUser, deletePost, deleteComment

Conditional Mapped Types

You can use conditional types on the value side to produce different property types based on the key or its original type. This lets a single mapped type apply distinct transformations per property.

typescript
// Wrap function properties in a "spy" type, leave others unchanged
type Spied<T> = {
  [K in keyof T]: T[K] extends (...args: infer A) => infer R
    ? ((...args: A) => R) & { calls: A[] } // parens needed — otherwise & binds to R
    : T[K];
};

// Convert string properties to number, leave everything else as-is
type StringsToNumbers<T> = {
  [K in keyof T]: T[K] extends string ? number : T[K];
};

interface Form {
  name: string;
  age: number;
  email: string;
}

type NumericForm = StringsToNumbers<Form>;
// { name: number; age: number; email: number; }

Recursive Mapped Types for Deep Transformations

Standard mapped types are shallow — they only affect the top-level properties. For nested objects, you need recursion. The trick is to check whether each property is an object and, if so, apply the mapped type to it recursively.

typescript
type DeepPartial<T> = {
  [K in keyof T]?: T[K] extends object ? DeepPartial<T[K]> : T[K];
};

type DeepReadonly<T> = {
  readonly [K in keyof T]: T[K] extends object ? DeepReadonly<T[K]> : T[K];
};

interface Company {
  name: string;
  address: {
    street: string;
    city: string;
    geo: { lat: number; lng: number };
  };
}

type PartialCompany = DeepPartial<Company>;
// address?.street?, address?.city?, address?.geo?.lat?, etc.

type FrozenCompany = DeepReadonly<Company>;
// All properties readonly at every nesting level
Watch Out for Arrays in Recursive Types

The naive T[K] extends object check also matches arrays, functions, Date, and other built-ins. In production code, handle those cases first — for example, leave functions untouched and recurse into array elements — before applying the mapped type to plain objects.
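As a sketch, a stricter variant (DeepReadonlyStrict is a hypothetical name) that leaves functions alone and recurses into array elements — Date, Map, and Set would still need cases of their own:

```typescript
type DeepReadonlyStrict<T> = T extends (...args: any[]) => any
  ? T // functions pass through untouched
  : T extends readonly (infer E)[]
    ? readonly DeepReadonlyStrict<E>[] // arrays become readonly, element-wise
    : T extends object
      ? { readonly [K in keyof T]: DeepReadonlyStrict<T[K]> }
      : T;

interface Team {
  name: string;
  members: { id: number }[];
}

const team: DeepReadonlyStrict<Team> = { name: "core", members: [{ id: 1 }] };
// team.members.push(...) and team.members[0].id = 2 are now compile errors
console.log(team.members[0].id);
```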

Putting It All Together

Here's a real-world example that combines key remapping, template literals, conditional types, and filtering in a single mapped type — generating a type-safe event emitter interface from a map of event payloads:

typescript
interface AppEvents {
  userLogin: { userId: string; timestamp: number };
  pageView: { url: string; referrer: string };
  error: { code: number; message: string };
}

// Generate "onUserLogin", "onPageView", "onError" listener methods
type EventListeners<T> = {
  [K in keyof T as `on${Capitalize<string & K>}`]: (
    handler: (payload: T[K]) => void
  ) => void;
};

// Generate "emitUserLogin", "emitPageView", "emitError" emitter methods
type EventEmitters<T> = {
  [K in keyof T as `emit${Capitalize<string & K>}`]: (
    payload: T[K]
  ) => void;
};

type TypedEmitter = EventListeners<AppEvents> & EventEmitters<AppEvents>;

// TypedEmitter has:
//   onUserLogin(handler: (payload: { userId: string; timestamp: number }) => void): void
//   emitUserLogin(payload: { userId: string; timestamp: number }): void
//   onPageView(...), emitPageView(...), onError(...), emitError(...)
Debugging Mapped Types

When a complex mapped type isn't producing what you expect, assign it to a concrete type alias and hover over it in your editor. If the expansion is too deep, break it into smaller mapped types and compose them. You can also use the Expand helper — type Expand<T> = T extends infer O ? { [K in keyof O]: O[K] } : never — to force TypeScript to show the fully resolved shape in tooltips.
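A minimal sketch of the Expand helper at work, redeclaring the Getters helper locally (the Hovered alias is illustrative):

```typescript
type Getters<T> = {
  [K in keyof T & string as `get${Capitalize<K>}`]: () => T[K];
};

// Forces the compiler to resolve the mapped type eagerly
type Expand<T> = T extends infer O ? { [K in keyof O]: O[K] } : never;

// Hovering Hovered shows { getId: () => string }
// instead of the unexpanded Getters<{ id: string }>
type Hovered = Expand<Getters<{ id: string }>>;

const impl: Hovered = { getId: () => "u-1" };
console.log(impl.getId());
```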

Template Literal Types & String Manipulation

Template literal types let you build new string types by interpolating other types inside backtick-delimited templates — the same syntax you already use at the value level. Combined with union distribution and conditional inference, they unlock an entire category of string-level type safety that was previously impossible.

The Basics: String Type Algebra

A template literal type uses the ${...} syntax inside a type-level backtick string. Each interpolation slot accepts string, number, bigint, boolean, null, or undefined — plus any union of those.

typescript
type Greeting = `Hello, ${string}!`;

const a: Greeting = "Hello, world!";   // ✅
const b: Greeting = "Hello, TypeScript!"; // ✅
const c: Greeting = "Hi, world!";      // ❌ Type '"Hi, world!"' is not assignable

type HttpUrl = `${"http" | "https"}://${string}`;
const url: HttpUrl = "https://example.com"; // ✅

The real power emerges when you interpolate union types. TypeScript distributes the template across every member of each union, producing the full cartesian product.

typescript
type Color = "red" | "blue" | "green";
type Size  = "sm" | "md" | "lg";

type CSSClass = `${Color}-${Size}`;
// Result: "red-sm" | "red-md" | "red-lg"
//       | "blue-sm" | "blue-md" | "blue-lg"
//       | "green-sm" | "green-md" | "green-lg"

Intrinsic String Manipulation Types

TypeScript provides four built-in utility types that transform the casing of string literal types. These are implemented inside the compiler itself (not in userland), which is why they're called "intrinsic."

| Utility Type | Effect | Example |
| --- | --- | --- |
| Uppercase<S> | All characters to upper case | "hello" → "HELLO" |
| Lowercase<S> | All characters to lower case | "HELLO" → "hello" |
| Capitalize<S> | First character to upper case | "hello" → "Hello" |
| Uncapitalize<S> | First character to lower case | "Hello" → "hello" |

These compose naturally with template literals. Here's a common pattern for creating type-safe event handler names:

typescript
type EventName = "click" | "focus" | "blur";

type EventHandler = `on${Capitalize<EventName>}`;
// Result: "onClick" | "onFocus" | "onBlur"

type UpperSnake<S extends string> = Uppercase<S>;
type ApiHeader = UpperSnake<"content-type" | "authorization">;
// Result: "CONTENT-TYPE" | "AUTHORIZATION"

Type-Safe Event Emitters

One of the most popular real-world applications is building event emitters where the method names and callback signatures are fully type-checked. The pattern maps each event name to an on-prefixed handler.

typescript
interface Events {
  click: { x: number; y: number };
  focus: { target: string };
  blur:  { target: string };
}

type OnHandlers<T> = {
  [K in keyof T & string as `on${Capitalize<K>}`]: (payload: T[K]) => void;
};

type AppHandlers = OnHandlers<Events>;
// {
//   onClick: (payload: { x: number; y: number }) => void;
//   onFocus: (payload: { target: string }) => void;
//   onBlur:  (payload: { target: string }) => void;
// }

const handlers: AppHandlers = {
  onClick: ({ x, y }) => console.log(x, y),
  onFocus: ({ target }) => console.log(target),
  onBlur:  ({ target }) => console.log(target),
};
Key Remapping with as

The as `on${Capitalize<K>}` clause inside the mapped type is called key remapping (introduced in TypeScript 4.1). It transforms each key during mapping. Without it, you'd need an extra layer of indirection. You can also remap to never to filter keys out entirely.

Pattern Matching with infer

Template literal types become even more powerful when combined with conditional types and the infer keyword. You can decompose string types by matching them against a template pattern — effectively performing regex-like extraction at the type level.

typescript
// Extract protocol and host from a URL type
type ParseUrl<T extends string> =
  T extends `${infer Protocol}://${infer Host}/${infer Path}`
    ? { protocol: Protocol; host: Host; path: Path }
    : T extends `${infer Protocol}://${infer Host}`
      ? { protocol: Protocol; host: Host; path: "" }
      : never;

type Result = ParseUrl<"https://api.example.com/users">;
// { protocol: "https"; host: "api.example.com"; path: "users" }

type Result2 = ParseUrl<"ftp://files.local">;
// { protocol: "ftp"; host: "files.local"; path: "" }

Inference inside a template literal is non-greedy: each infer slot before a literal delimiter matches the shortest string — up to the first occurrence of that delimiter — and the final slot captures whatever remains. Two adjacent infer slots with no delimiter between them split after a single character.
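A quick sketch of that matching order with a repeated delimiter (SplitOnce is an illustrative name):

```typescript
// The first slot stops at the FIRST "://"; the last slot takes the remainder
type SplitOnce<S extends string> =
  S extends `${infer Head}://${infer Tail}` ? [Head, Tail] : never;

type Parts = SplitOnce<"a://b://c">; // ["a", "b://c"], not ["a://b", "c"]

// This assignment compiles only because the type resolved as shown above
const parts: Parts = ["a", "b://c"];
console.log(parts);
```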

Extracting Route Parameters

A classic real-world pattern is extracting dynamic path parameters from URL route strings. Frameworks like Express use :param syntax, and you can teach TypeScript to parse it.

typescript
// Recursively extract :param segments from a route string
type ExtractParams<T extends string> =
  T extends `${string}:${infer Param}/${infer Rest}`
    ? Param | ExtractParams<Rest>
    : T extends `${string}:${infer Param}`
      ? Param
      : never;

type Params = ExtractParams<"/api/users/:userId/posts/:postId">;
// Result: "userId" | "postId"

// Build a typed params object from a route pattern
type RouteParams<T extends string> = {
  [K in ExtractParams<T>]: string;
};

type UserPostParams = RouteParams<"/api/users/:userId/posts/:postId">;
// { userId: string; postId: string }

function get<T extends string>(
  path: T,
  handler: (params: RouteParams<T>) => void
): void { /* ... */ }

get("/api/users/:userId/posts/:postId", (params) => {
  console.log(params.userId);  // ✅ Autocomplete works
  console.log(params.postId);  // ✅
  // console.log(params.foo); // ❌ Property 'foo' does not exist
});

Typed HTTP Methods & API Paths

You can combine template literal types to create strongly-typed API clients where both the method and path are validated at compile time. This prevents typos in endpoint strings that would otherwise only surface at runtime.

typescript
type HttpMethod = "GET" | "POST" | "PUT" | "DELETE" | "PATCH";

// Minimal User/CreateUser shapes so the routes below type-check
interface User { id: string; name: string }
interface CreateUser { name: string }

interface ApiRoutes {
  "GET /users":           { response: User[] };
  "GET /users/:id":       { response: User };
  "POST /users":          { response: User; body: CreateUser };
  "DELETE /users/:id":    { response: void };
}

type ApiKey = keyof ApiRoutes;

// Parse method and path from an API key
// (the "infer M extends HttpMethod" constraint requires TypeScript 4.7+)
type ParseApi<T extends string> =
  T extends `${infer M extends HttpMethod} ${infer Path}`
    ? { method: M; path: Path }
    : never;

type Info = ParseApi<"GET /users/:id">;
// { method: "GET"; path: "/users/:id" }

Strongly-Typed i18n Key Lookups

Internationalization keys are typically nested dot-separated strings like "nav.header.title". Template literal types let you flatten a nested object type into a union of valid dot-paths, catching misspelled keys at compile time rather than showing a blank label in production.

typescript
interface Translations {
  nav: { home: string; about: string };
  auth: { login: string; logout: string };
  errors: { notFound: string; server: string };
}

// Recursively build dot-separated key paths
type DotPaths<T, Prefix extends string = ""> = {
  [K in keyof T & string]: T[K] extends object
    ? DotPaths<T[K], `${Prefix}${K}.`>
    : `${Prefix}${K}`;
}[keyof T & string];

type I18nKey = DotPaths<Translations>;
// "nav.home" | "nav.about" | "auth.login" | "auth.logout"
// | "errors.notFound" | "errors.server"

declare function t(key: I18nKey): string;

t("nav.home");       // ✅
t("auth.login");     // ✅
t("nav.headr");      // ❌ Typo caught at compile time!

CSS Property Types with Template Literals

Template literals are a natural fit for CSS value types where the structure is predictable but the combinations are numerous. Instead of accepting a loose string, you can constrain CSS values to valid patterns.

typescript
type CSSUnit = "px" | "rem" | "em" | "%" | "vh" | "vw";
type CSSLength = `${number}${CSSUnit}`;

type CSSColor = `#${string}` | `rgb(${number}, ${number}, ${number})`;

function setWidth(value: CSSLength): void { /* ... */ }

setWidth("100px");   // ✅
setWidth("2.5rem");  // ✅
setWidth("100");     // ❌ Missing unit
setWidth("wide");    // ❌ Not a valid pattern

Combining Template Literals with Mapped Types

The real synthesis happens when you combine template literals, mapped types, and key remapping. You can transform entire object shapes — renaming keys, changing prefixes, generating getter/setter pairs — all while preserving the connection between keys and their value types.

typescript
// Generate getter/setter pairs from an interface
type Getters<T> = {
  [K in keyof T & string as `get${Capitalize<K>}`]: () => T[K];
};

type Setters<T> = {
  [K in keyof T & string as `set${Capitalize<K>}`]: (value: T[K]) => void;
};

interface Config {
  host: string;
  port: number;
  debug: boolean;
}

type ConfigAccessors = Getters<Config> & Setters<Config>;
// {
//   getHost: () => string;       setHost: (value: string) => void;
//   getPort: () => number;       setPort: (value: number) => void;
//   getDebug: () => boolean;     setDebug: (value: boolean) => void;
// }
Tip

Use keyof T & string to filter out symbol and number keys. Template literal types only work with string types, so without this intersection you'll get a compiler error when TypeScript tries to interpolate a non-string key.

Limitations & Gotchas

Template literal types are powerful, but they have hard boundaries you should understand before reaching for them in production code.

Combinatorial Explosion

When you cross multiple large unions, TypeScript generates the cartesian product. A type like `${A}-${B}-${C}` where each union has 20 members produces 8,000 types. TypeScript refuses to represent unions beyond a hard limit (100,000 members in recent versions) and emits an error if a template literal expansion would exceed it.

typescript
// ⚠️ This creates 26 × 26 × 26 = 17,576 types — compiles, but slow!
type Alpha = "a"|"b"|"c"|"d"|"e"|"f"|"g"|"h"|"i"|"j"|"k"|"l"|"m"
           | "n"|"o"|"p"|"q"|"r"|"s"|"t"|"u"|"v"|"w"|"x"|"y"|"z";
type ThreeLetterCode = `${Alpha}${Alpha}${Alpha}`;

// ❌ This would exceed the limit and error:
// type FourLetterCode = `${Alpha}${Alpha}${Alpha}${Alpha}`; // 456,976 types!

Not Regular Expressions

Template literal types can match fixed patterns and infer segments, but they can't express repetition, optionality, or character classes the way regex can. `${string}` matches any string — you can't say "one or more digits" or "alphanumeric only" at the type level.

Recursion Depth Limits

Recursive template literal types (like the ExtractParams example above) are subject to TypeScript's type instantiation depth limit — roughly 50 levels of non-tail recursion, though tail-recursive conditional types can go much deeper since TypeScript 4.5. For deeply nested or very long strings, you may hit the "Type instantiation is excessively deep and possibly infinite" error, so keep recursive patterns shallow.

Watch the Compile Cost

Template literal types that generate large unions slow down the compiler and IDE. If your autocomplete starts lagging, that's often a sign you're generating too many union members. Profile with tsc --generateTrace and consider narrowing the input unions or using branded types as a simpler alternative.
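For comparison, a branded-type sketch that validates once at a boundary function instead of encoding the pattern in the type system — the brand name and regex here are assumptions:

```typescript
// The brand exists only at the type level; at runtime this is a plain string
type CssLength = string & { readonly __brand: "CssLength" };

const CSS_LENGTH = /^\d+(\.\d+)?(px|rem|em|%|vh|vw)$/;

function cssLength(value: string): CssLength {
  if (!CSS_LENGTH.test(value)) {
    throw new Error(`Invalid CSS length: ${value}`);
  }
  return value as CssLength;
}

function setWidth(value: CssLength): string {
  return `width: ${value}`;
}

console.log(setWidth(cssLength("100px"))); // validated once at the boundary
// setWidth("100px"); // ❌ a plain string is not a CssLength
```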

Summary: When to Use Template Literal Types

| Use Case | Pattern | Complexity |
| --- | --- | --- |
| Event handler names | `on${Capitalize<K>}` | Low |
| CSS value constraints | `${number}${CSSUnit}` | Low |
| Key remapping (getters/setters) | Mapped type + as clause | Medium |
| Route parameter extraction | Recursive infer matching | Medium |
| i18n dot-path keys | Recursive DotPaths | Medium |
| Full URL/SQL parsers | Deep recursive inference | High — beware limits |

Template literal types bridge the gap between TypeScript's structural type system and the stringly-typed APIs that dominate JavaScript. Use them to push validation from runtime to compile time — but respect the compiler's limits and keep your generated unions manageable.

keyof, typeof & Indexed Access Types

These three type operators — keyof, typeof, and indexed access (T[K]) — form the backbone of TypeScript's type-level programming. Individually they're straightforward, but their real power emerges when you combine them to extract, transform, and constrain types from existing values and structures.

keyof Deep Dive

The keyof operator takes an object type and produces a union of its known, public key names as string (or number/symbol) literal types. It's the type-level equivalent of Object.keys(), but it operates on types, not runtime values.

typescript
interface User {
  id: number;
  name: string;
  email: string;
}

type UserKeys = keyof User; // "id" | "name" | "email"

function getProperty<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key];
}

const user: User = { id: 1, name: "Ada", email: "ada@example.com" };
const name = getProperty(user, "name"); // type: string
const id = getProperty(user, "id");     // type: number

keyof with Index Signatures

When a type has an index signature, keyof returns the index signature's key type rather than individual literal keys. This catches people off guard — a [key: string] signature means keyof yields string | number (because JavaScript coerces numeric keys to strings, so numeric keys are always valid string-indexed lookups).

typescript
interface StringMap {
  [key: string]: unknown;
}
type A = keyof StringMap; // string | number

interface NumberMap {
  [key: number]: unknown;
}
type B = keyof NumberMap; // number

// Mix of index signature and explicit keys
interface Config {
  [key: string]: string;
  host: string;
  port: string; // must be compatible with index signature
}
type C = keyof Config; // string | number (index signature dominates)

keyof with Unions and Intersections — The Counterintuitive Rule

This is one of the most unintuitive aspects of keyof: it distributes inversely over unions and intersections. The rule follows set theory, but it trips up even experienced TypeScript developers.

- keyof (A & B) resolves to keyof A | keyof B — an intersection has all keys from both types
- keyof (A | B) resolves to keyof A & keyof B — a union only guarantees the shared keys

Think of it this way: if you have a value that is A & B, it definitely has every key from A and every key from B. But if you have a value that is A | B, you can only safely access the keys that exist on both types.

typescript
interface Dog { name: string; breed: string; }
interface Cat { name: string; color: string; }

type IntersectionKeys = keyof (Dog & Cat); // "name" | "breed" | "color"
type UnionKeys = keyof (Dog | Cat);         // "name" (only shared key)

// Proof — you can only access .name safely on a Dog | Cat
function describeAnimal(animal: Dog | Cat) {
  console.log(animal.name);  // ✅ OK — "name" exists on both
  // console.log(animal.breed); // ❌ Error — not on Cat
}
The mnemonic

keyof flips the set operation: intersection of types → union of keys, union of types → intersection of keys. This is known as contravariance of keyof with respect to set operations.

typeof in Type Position

JavaScript already has a typeof operator that returns a runtime string like "string" or "object". TypeScript adds a second meaning: when you write typeof in a type position (after a : or in a type alias), it extracts the compile-time type of a value. This lets you derive types from existing runtime objects without manually duplicating their shape.

typescript
const defaultSettings = {
  theme: "dark" as const,
  fontSize: 14,
  showLineNumbers: true,
};

// Extract the type — no need to write an interface by hand
type Settings = typeof defaultSettings;
// { readonly theme: "dark"; fontSize: number; showLineNumbers: boolean; }

function applySettings(settings: Settings) { /* ... */ }

typeof on Modules and Functions

You can use typeof on imported modules to capture their entire shape — useful when you need to mock or wrap a module. You can also use it on functions to extract their full signature type, which pairs well with utility types like ReturnType and Parameters.

typescript
import * as mathUtils from "./mathUtils";

// Capture the full module shape
type MathUtilsModule = typeof mathUtils;
// { add: (a: number, b: number) => number; subtract: ... }

// typeof on functions — extract signature
function createUser(name: string, age: number) {
  return { name, age, createdAt: new Date() };
}

type CreateUserFn = typeof createUser;
// (name: string, age: number) => { name: string; age: number; createdAt: Date }

type NewUser = ReturnType<typeof createUser>;
// { name: string; age: number; createdAt: Date }

typeof on Classes: Constructor vs. Instance

This distinction is critical. When you write typeof MyClass, you get the type of the class constructor (the thing you call new on) — not the instance type. The class name MyClass used in type position already refers to the instance type.

typescript
class Logger {
  static defaultLevel = "info";
  constructor(public prefix: string) {}
  log(msg: string) { console.log(`[${this.prefix}] ${msg}`); }
}

// Instance type — has prefix, log()
type LoggerInstance = Logger;

// Constructor type — has defaultLevel, new()
type LoggerConstructor = typeof Logger;

// Practical use: factory that accepts any class constructor
function create<T>(Ctor: new (...args: any[]) => T, ...args: any[]): T {
  return new Ctor(...args);
}

const logger = create(Logger, "app"); // type: Logger

Indexed Access Types

Indexed access types use the bracket syntax T[K] to look up the type of a specific property on another type. Think of it as "type-level property access" — the same way obj["key"] gets a value at runtime, Type["key"] gets a type at compile time.

typescript
interface ApiResponse {
  status: number;
  data: {
    users: { id: number; name: string }[];
    total: number;
  };
  error: string | null;
}

type Status = ApiResponse["status"];           // number
type Data = ApiResponse["data"];               // { users: ...; total: number }
type Users = ApiResponse["data"]["users"];     // { id: number; name: string }[]
type SingleUser = ApiResponse["data"]["users"][number]; // { id: number; name: string }

T[keyof T] — Union of All Value Types

Passing keyof T as the index gives you a union of every possible value type in T. This is one of the most common indexed access patterns — it answers the question "what types can the values of this object be?"

typescript
interface Theme {
  primary: "#0066ff";
  secondary: "#ff6600";
  background: "#ffffff";
  text: "#1a1a1a";
}

type ThemeColor = Theme[keyof Theme];
// "#0066ff" | "#ff6600" | "#ffffff" | "#1a1a1a"

Tuple Element Access with T[number]

When you index a tuple or array type with number, you get a union of all its element types. This is the type-level equivalent of "any element from this array." For tuples you can also use specific numeric literal indices.

typescript
const roles = ["admin", "editor", "viewer"] as const;

type Roles = typeof roles;          // readonly ["admin", "editor", "viewer"]
type Role = Roles[number];          // "admin" | "editor" | "viewer"

// Specific tuple indices
type First = Roles[0];             // "admin"
type Last = Roles[2];              // "viewer"

// Real-world: constrain a function parameter to valid roles
function assignRole(userId: string, role: Role) { /* ... */ }

Advanced Combinations

The real power of these operators shows up when you chain them. These patterns appear constantly in library code, configuration systems, and type-safe APIs.

typeof config[keyof typeof config] — Config Value Types

This is the quintessential "extract all value types from a runtime config object" pattern. You read it inside-out: typeof config gets the type of the value, keyof typeof config gets the keys, and the outer indexed access gives you the union of all value types.

typescript
const endpoints = {
  users: "/api/users",
  posts: "/api/posts",
  comments: "/api/comments",
} as const;

type EndpointKey = keyof typeof endpoints;
// "users" | "posts" | "comments"

type EndpointPath = (typeof endpoints)[keyof typeof endpoints];
// "/api/users" | "/api/posts" | "/api/comments"

function fetchFrom(path: EndpointPath) { /* ... */ }
fetchFrom("/api/users");    // ✅
fetchFrom("/api/unknown");  // ❌ Error

T[K & keyof T] — Safe Indexed Access

Sometimes you have a generic key K that might be a key of T, but TypeScript can't prove it. The pattern T[K & keyof T] narrows K to only the keys that actually exist on T, avoiding an unsafe indexed access error. This appears frequently when building mapped or conditional types.

typescript
// Without the intersection — error in generic contexts
type UnsafeLookup<T, K> = T[K]; // ❌ Type 'K' cannot be used to index type 'T'

// With K & keyof T — always valid; yields never for keys not on T
type SafeLookup<T, K> = T[K & keyof T];

// Alternative: a conditional type with the same "never for unknown keys" behavior
type ConditionalLookup<T, K> = K extends keyof T ? T[K] : never;

// The intersection trick for mapped types
type PickByValueType<T, ValueType> = {
  [K in keyof T as T[K] extends ValueType ? K : never]: T[K]
};

interface Mixed {
  name: string;
  age: number;
  active: boolean;
  email: string;
}

type StringProps = PickByValueType<Mixed, string>;
// { name: string; email: string }

Indexed Access with Mapped Types

You can combine indexed access with mapped types to build powerful type transformers. A common pattern is creating event handler types from a map of event names to payload types.

typescript
interface EventMap {
  click: { x: number; y: number };
  focus: { target: string };
  submit: { data: Record<string, string> };
}

// Build a union of handler signatures from the map
type EventHandler = {
  [K in keyof EventMap]: (event: K, payload: EventMap[K]) => void
}[keyof EventMap];
// ((event: "click", payload: { x: number; y: number }) => void)
// | ((event: "focus", payload: { target: string }) => void)
// | ((event: "submit", payload: { data: Record<string, string> }) => void)

// Type-safe event emitter
function on<K extends keyof EventMap>(
  event: K,
  handler: (payload: EventMap[K]) => void
) { /* ... */ }

on("click", (payload) => {
  console.log(payload.x, payload.y); // ✅ Fully typed
});

PropertyKey and keyof any

keyof any evaluates to string | number | symbol — the set of all types that can be used as an object property key in JavaScript. TypeScript aliases this as the built-in type PropertyKey. You'll encounter it when building generic utilities that need to accept any possible key.

typescript
type PK = keyof any; // string | number | symbol
// Identical to the built-in PropertyKey type

// Practical use: a generic Record-like type that allows any key
type FlexibleRecord<K extends PropertyKey, V> = {
  [P in K]: V;
};

// Used in TypeScript's own lib: Record<K extends keyof any, T>
// This is why Record accepts string, number, or symbol keys
Common mistake: typeof only works on values

You cannot write typeof MyInterface — interfaces don't exist at runtime, so there's no value to extract a type from. Use typeof only on variables, imports, and class names (classes are both types and values). If you need the type, just reference the interface name directly.
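A minimal sketch of the value-vs-type distinction (alias names here are illustrative):

```typescript
interface Settings { theme: string }
// type Bad = typeof Settings; // ❌ 'Settings' only refers to a type
type FromInterface = Settings;     // ✅ reference the interface name directly

const defaults = { theme: "dark" };
type FromValue = typeof defaults;  // ✅ defaults is a value, so typeof works

const custom: FromInterface & FromValue = { theme: "light" };

class Store {
  items: string[] = [];
}
type Instance = Store;       // instance type — what `new Store()` gives you
type Ctor = typeof Store;    // constructor type — classes are both value and type

const make = (C: Ctor): Instance => new C();
console.log(custom.theme, make(Store).items.length);
```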

Quick Reference: Cheat Sheet

| Pattern | Result | Use Case |
| --- | --- | --- |
| keyof T | Union of key literals | Constraining function params to valid keys |
| T[K] | Type of property K | Looking up a specific property type |
| T[keyof T] | Union of all value types | Getting all possible value types |
| T[number] | Union of array element types | Extracting element type from tuple/array |
| typeof value | Type of a runtime value | Deriving types from config objects |
| (typeof x)[keyof typeof x] | Value types from a value | Config value unions |
| keyof any | string, number, or symbol | Same as PropertyKey |

Type Guards, Control Flow Narrowing & Discriminated Unions

TypeScript's type system doesn't just check types at declaration — it tracks how types change as your code executes. Every if, switch, and conditional expression teaches the compiler something new about the types in scope. This mechanism, called control flow narrowing, is what makes union types practical instead of painful.

This section covers every narrowing mechanism TypeScript offers, from built-in type guards to custom predicates, discriminated unions, and exhaustiveness checking — plus the edge cases where narrowing breaks down and how to work around them.

How Control Flow Narrowing Works

When TypeScript encounters a conditional check, it narrows the type in each branch. Here's a visualization of how a string | number | null type gets progressively narrowed through a chain of type guards:

stateDiagram-v2
    [*] --> Wide: value: string | number | null
    Wide --> NullCheck: if (value === null)

    state NullCheck {
        direction LR
        [*] --> TrueBranch1: true
        [*] --> FalseBranch1: false
        TrueBranch1: value: null
        FalseBranch1: value: string | number
    }

    NullCheck --> TypeofCheck: else branch continues

    state TypeofCheck {
        direction LR
        [*] --> TrueBranch2: typeof value === "string"
        [*] --> FalseBranch2: else
        TrueBranch2: value: string
        FalseBranch2: value: number
    }

    TypeofCheck --> [*]: All cases handled
    

Each branch eliminates possibilities. After checking for null, TypeScript knows the remaining type is string | number. After a typeof check, it narrows further to either string or number. This is the core mechanic behind everything in this section.
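The chain from the diagram, written out as code:

```typescript
function describe(value: string | number | null): string {
  if (value === null) {
    return "null"; // value: null
  }
  // value: string | number from here on
  if (typeof value === "string") {
    return value.toUpperCase(); // value: string
  }
  return value.toFixed(1); // value: number — the only case left
}

console.log(describe(null)); // "null"
console.log(describe("hi")); // "HI"
console.log(describe(2));    // "2.0"
```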

Built-in Type Guards

TypeScript recognizes several JavaScript expressions as type guards natively. You don't need any special syntax — the compiler understands these patterns out of the box.

typeof Guards

The typeof operator narrows to JavaScript's primitive types: "string", "number", "bigint", "boolean", "symbol", "undefined", "object", and "function". TypeScript handles both the positive and negative cases.

typescript
function format(value: string | number | boolean): string {
  if (typeof value === "string") {
    return value.toUpperCase();       // value: string
  }
  if (typeof value === "number") {
    return value.toFixed(2);          // value: number
  }
  return value ? "yes" : "no";       // value: boolean
}

instanceof Guards

For class-based types, instanceof narrows to the class type. This works with any constructor function, including built-in ones like Date, RegExp, and Error.

typescript
function getMessage(err: Error | string): string {
  if (err instanceof Error) {
    return err.message;   // err: Error — has .message, .stack, etc.
  }
  return err;             // err: string
}

Truthiness Narrowing

Checking a value in a boolean context removes null and undefined (and literal falsy types like 0, "", and false) from the type. This is the most common way to strip null | undefined from a union.

typescript
function printName(name: string | null | undefined) {
  if (name) {
    console.log(name.toUpperCase());  // name: string (null & undefined removed)
  }
}
Warning

Truthiness narrowing is too aggressive for string | number types. It will eliminate the empty string "" and 0 — both of which are valid values. Use value != null (loose equality) instead, which only strips null and undefined.
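A small sketch of why != null is the safer check here:

```typescript
function label(count: number | null | undefined): string {
  if (count != null) {
    return `count: ${count}`; // keeps 0 — only null/undefined are stripped
  }
  return "no count";
}

console.log(label(0));         // "count: 0" — a truthiness check would miss this
console.log(label(undefined)); // "no count"
```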

Equality Narrowing & the in Operator

Strict equality (===) narrows both sides to their intersection. The in operator checks for property existence and narrows to the union member(s) that have that property.

typescript
type Fish = { swim: () => void };
type Bird = { fly: () => void };

function move(animal: Fish | Bird) {
  if ("swim" in animal) {
    animal.swim();   // animal: Fish
  } else {
    animal.fly();    // animal: Bird
  }
}

One useful pattern: loose equality with null (!= null) narrows out both null and undefined at once, because null == undefined is true in JavaScript.

| Guard | Narrows to | Best for |
| --- | --- | --- |
| typeof x === "string" | Primitive types | Primitives in unions |
| x instanceof Cls | Class instance types | Class hierarchies, Errors |
| if (x) | Removes falsy types | Stripping null and undefined (not number!) |
| x === value | Literal type / intersection | Discriminant values, enums |
| "prop" in x | Members with that property | Object unions without discriminant |
| x != null | Removes null & undefined | Nullable types (safe for 0 and "") |
| Array.isArray(x) | any[] or narrowed array | Array vs. non-array |

Custom Type Guards with is

Built-in guards cover primitives and classes, but what about interfaces and complex shapes? TypeScript can't infer structural narrowing from a regular function. That's where user-defined type guard functions come in — functions whose return type is a type predicate.

A type predicate has the form paramName is Type. When the function returns true, TypeScript narrows the argument to that type in the calling scope.

typescript
interface Cat { meow(): void; purr(): void }
interface Dog { bark(): void; fetch(): void }

// User-defined type guard
function isCat(animal: Cat | Dog): animal is Cat {
  return "meow" in animal;
}

function interact(animal: Cat | Dog) {
  if (isCat(animal)) {
    animal.purr();    // animal: Cat ✓
  } else {
    animal.fetch();   // animal: Dog ✓ (narrowed by elimination)
  }
}

The key insight: the compiler trusts you. If your type guard lies (returns true when the value isn't actually that type), the compiler won't warn you; you'll hit runtime errors instead. TypeScript does not verify that the body of a type guard function is consistent with its declared predicate.

Type guards with filter

Type guard functions are the primary way to narrow types inside array methods. items.filter(isCat) returns Cat[], not (Cat | Dog)[] — but only if isCat uses an is predicate. A plain boolean-returning function won't narrow the array's element type.
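A minimal, self-contained sketch of the difference (simplified Cat and Dog shapes):

```typescript
interface Cat { meow(): void }
interface Dog { bark(): void }

const isCat = (a: Cat | Dog): a is Cat => "meow" in a;

const animals: (Cat | Dog)[] = [
  { meow() {} },
  { bark() {} },
  { meow() {} },
];

// `is` predicate: the element type narrows to Cat
const cats: Cat[] = animals.filter(isCat);

// Plain boolean predicate: the element type stays (Cat | Dog)
const mixed = animals.filter(a => "meow" in a);
```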

Assertion Functions with asserts

Type guards narrow inside if branches. Assertion functions take a different approach: they narrow everything after the call — or throw if the assertion fails. Think of them as runtime preconditions that the compiler also understands.

typescript
function assertIsString(val: unknown): asserts val is string {
  if (typeof val !== "string") {
    throw new Error(`Expected string, got ${typeof val}`);
  }
}

function process(input: unknown) {
  assertIsString(input);
  // After the assertion, TypeScript knows input is string
  console.log(input.toUpperCase());   // input: string ✓
}

There are two forms of assertion signatures:

| Signature | Meaning | Use case |
| --- | --- | --- |
| asserts x is Type | If the function returns, x is Type | Validating unknown data into a known shape |
| asserts x | If the function returns, x is truthy | Non-null assertions, precondition checks |
typescript
// The "asserts val is T" form again — a generic non-null assertion
function assertDefined<T>(val: T | null | undefined): asserts val is T {
  if (val == null) {
    throw new Error("Value must be defined");
  }
}

const el = document.getElementById("app");  // el: HTMLElement | null
assertDefined(el);
el.classList.add("loaded");                  // el: HTMLElement ✓

Control Flow Analysis in Depth

TypeScript doesn't just understand if/else. Its control flow analysis tracks type narrowing through every branching construct in the language. Understanding exactly what TypeScript can follow — and what it can't — is critical for writing code that compiles without unnecessary type assertions.

if/else, Ternary & Switch

These are the straightforward cases. TypeScript narrows the type in each branch and restores the original type (or the remaining type from elimination) after the block.

typescript
function example(x: string | number | null) {
  // Ternary narrows identically to if/else
  const label = x === null
    ? "nothing"           // x: null
    : typeof x === "string"
      ? x.toUpperCase()   // x: string
      : x.toFixed(2);     // x: number

  // Early return narrows for the rest of the function
  if (x === null) return;
  // x: string | number (null eliminated from here on)
}

Logical Operators: &&, ||, ??

TypeScript understands short-circuit evaluation. The right-hand side of && only runs when the left is truthy, so the type is narrowed accordingly. The nullish coalescing operator ?? specifically narrows out null | undefined.

typescript
function demo(val: string | null) {
  // && — right side only runs if val is truthy (string)
  val && val.toUpperCase();     // val: string on the right side

  // ?? — right side runs only if left is null/undefined
  const result = val ?? "default"; // result: string (null eliminated)
}

Assignment-Based Narrowing

TypeScript also narrows when you assign to a variable. After an assignment, the variable's type is narrowed to the type of the assigned value — as long as it's assignable to the declared type.

typescript
let x: string | number;
x = "hello";
console.log(x.toUpperCase());  // x: string (narrowed by assignment)

x = 42;
console.log(x.toFixed(2));     // x: number (re-narrowed)

Discriminated Unions

Discriminated unions are TypeScript's most powerful pattern for modeling state machines, API responses, and any data that comes in distinct "shapes." The idea: every member of the union shares a common property (the discriminant) with a unique literal type. TypeScript uses that property to narrow the entire union.

typescript
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rectangle"; width: number; height: number }
  | { kind: "triangle"; base: number; height: number };

function area(shape: Shape): number {
  switch (shape.kind) {
    case "circle":
      return Math.PI * shape.radius ** 2;       // shape: { kind: "circle"; ... }
    case "rectangle":
      return shape.width * shape.height;         // shape: { kind: "rectangle"; ... }
    case "triangle":
      return (shape.base * shape.height) / 2;    // shape: { kind: "triangle"; ... }
  }
}

The discriminant property must be a literal type — string literals, number literals, or boolean literals. It can be named anything (kind, type, tag, _tag), but it must appear in every member of the union with a different literal value.

Exhaustiveness Checking with never

The real power of discriminated unions comes from exhaustiveness checking. If you handle every member of the union in a switch, TypeScript knows the default branch is unreachable — and the variable's type is never. You can exploit this to get a compile-time error whenever a new member is added to the union but not handled.

typescript
// Helper that produces a compile error if called with a non-never type
function assertNever(x: never): never {
  throw new Error(`Unexpected value: ${x}`);
}

function area(shape: Shape): number {
  switch (shape.kind) {
    case "circle":
      return Math.PI * shape.radius ** 2;
    case "rectangle":
      return shape.width * shape.height;
    case "triangle":
      return (shape.base * shape.height) / 2;
    default:
      return assertNever(shape);  // ✓ compiles — shape is never
  }
}

// Now add a new member to Shape:
// | { kind: "pentagon"; sideLength: number }
// ❌ Compile error at assertNever(shape) — shape is { kind: "pentagon"; ... }, not never

This is arguably the single most important defensive pattern for any codebase that uses union types. When you add a new variant, the compiler tells you every place that needs updating.
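If you'd rather not define a helper, a common lighter-weight variant assigns the discriminated value to a never-typed local in the default branch. A sketch:

```typescript
type Light = "red" | "yellow" | "green";

function next(light: Light): Light {
  switch (light) {
    case "red": return "green";
    case "green": return "yellow";
    case "yellow": return "red";
    default: {
      // If a new member is added to Light, this assignment fails to compile
      const unreachable: never = light;
      throw new Error(`Unhandled light: ${unreachable}`);
    }
  }
}
```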

Advanced Narrowing Scenarios

Narrowing with Array.isArray

TypeScript recognizes Array.isArray() as a built-in type guard. This is the idiomatic way to distinguish arrays from other objects in a union.

typescript
function normalize(input: string | string[]): string[] {
  if (Array.isArray(input)) {
    return input;            // input: string[]
  }
  return [input];            // input: string
}

Narrowing Generic Types — The Limitations

TypeScript's narrowing does not apply to generic type parameters themselves. A typeof check on a value of generic type T narrows the local variable's apparent type, but never T itself. This is a fundamental limitation of the type system.

typescript
function broken<T extends string | number>(val: T): T {
  if (typeof val === "string") {
    // You'd expect val to be T & string here, but...
    return val.toUpperCase() as T;  // ❌ Need assertion — TS can't narrow T
  }
  return val;
}

// Workaround: use overloads or conditional types instead of narrowing T
function fixed(val: string): string;
function fixed(val: number): number;
function fixed(val: string | number): string | number {
  return typeof val === "string" ? val.toUpperCase() : val;
}

Narrowing Resets in Callbacks

This trips up nearly every TypeScript developer. If you narrow a variable and then use it in a callback, TypeScript may widen the type back because the callback could run at a later time, after the variable has been reassigned.

typescript
let value: string | number = "hello";

if (typeof value === "string") {
  // value: string ✓

  setTimeout(() => {
    // value: string | number ❌ — narrowing lost!
    // TypeScript assumes value could have been reassigned before this runs
    console.log(value.toUpperCase());  // Error
  }, 100);
}

// Fix: capture in a const
if (typeof value === "string") {
  const captured = value;  // captured: string (const can't be reassigned)
  setTimeout(() => {
    console.log(captured.toUpperCase());  // captured: string ✓
  }, 100);
}
Tip

Variables declared with const never have this problem — they can't be reassigned, so TypeScript preserves narrowing inside callbacks. Prefer const everywhere you can.

The satisfies Operator for Validation-Style Narrowing

Introduced in TypeScript 4.9, satisfies validates that an expression matches a type without widening it. This is useful when you want type-checking on an object literal but want to keep the narrowest possible inferred type.

typescript
type ColorMap = Record<string, string | number[]>;

// With type annotation — widens values to string | number[]
const colorsAnnotated: ColorMap = {
  red: "#ff0000",
  green: [0, 255, 0],
};
colorsAnnotated.red.toUpperCase();  // ❌ Error — could be number[]

// With satisfies — validates AND preserves narrow types
const colors = {
  red: "#ff0000",
  green: [0, 255, 0],
} satisfies ColorMap;

colors.red.toUpperCase();      // ✓ — TypeScript knows red is string
colors.green.map(c => c * 2); // ✓ — TypeScript knows green is number[]

When Narrowing Breaks: Workaround Patterns

TypeScript's control flow analysis is impressive, but it has limits. It doesn't follow narrowing across function boundaries (besides type guards and assertion functions), through property access chains on mutable objects, or in complex conditional logic. Here are proven workarounds for the most common cases.

Pattern 1: Extract to a const

When narrowing doesn't survive through a property access (TypeScript discards it after any intervening function call or assignment that might mutate the object), extract the value to a local const first.

typescript
// ❌ Narrowing is discarded — a call in between may mutate obj
if (typeof obj.value === "string") {
  refresh();               // TypeScript assumes this could reassign obj.value
  doSomething(obj.value);  // obj.value: string | number again
}

// ✓ Narrowing sticks to a local const
const { value } = obj;
if (typeof value === "string") {
  refresh();
  doSomething(value);      // value: string — the const can't be reassigned
}

Pattern 2: Assertion Functions for Complex Validation

When your validation logic is too complex for TypeScript to follow (e.g., calling a schema validator), wrap it in an assertion function to bridge the gap between runtime checks and compile-time types.

typescript
interface ApiResponse {
  status: number;
  data: { id: string; name: string };
}

function assertValidResponse(res: unknown): asserts res is ApiResponse {
  const r = res as any;
  if (typeof r?.status !== "number" || typeof r?.data?.id !== "string") {
    throw new Error("Invalid API response");
  }
}

const raw: unknown = await fetch("/api").then(r => r.json());
assertValidResponse(raw);
console.log(raw.data.name);  // raw: ApiResponse ✓

Pattern 3: Type Assertion as Last Resort

When you are certain of the type but TypeScript can't prove it, a type assertion (as Type) is acceptable — but isolate it. Wrap it in a well-named function so the assertion is documented and centralized, not scattered across your codebase.

typescript
// Centralized, documented escape hatch
function unsafeNarrow<T>(value: unknown): T {
  return value as T;
}

// Use sparingly, with a comment explaining why
const config = unsafeNarrow<AppConfig>(
  JSON.parse(rawConfigString) // We trust the config file schema
);

Advanced Utility Types & Building Custom Type-Level Tools

You already know Partial, Required, Pick, and Omit. They're the everyday workhorses. But TypeScript ships a much deeper toolbox of built-in utility types, and the real power comes from combining them — and building your own.

This section cracks open the lesser-known built-ins, shows you how each one is implemented internally, then graduates to custom type-level tools you'll actually reach for in production codebases.

mindmap
  root((Utility Types))
    Structural
      Partial
      Required
      Pick
      Omit
      Readonly
      Record
    Filtering
      Extract
      Exclude
      NonNullable
      Awaited
      NoInfer
    Function
      ReturnType
      Parameters
      ConstructorParameters
      InstanceType
      ThisParameterType
      OmitThisParameter
    Custom
      DeepPartial
      DeepReadonly
      Prettify
      Mutable
      StrictOmit
      UnionToIntersection
      UnionToTuple
      Entries
    

Lesser-Known Built-in Utility Types

Extract<T, U> — Pull Members from a Union

Extract narrows a union type down to only members that are assignable to U. Think of it as a filter that keeps matches. Here's the internal implementation:

typescript
// Built-in implementation
type Extract<T, U> = T extends U ? T : never;

// Usage: pull only string types from a union
type Events = "click" | "scroll" | "mousemove" | 42 | true;
type StringEvents = Extract<Events, string>;
// ^? "click" | "scroll" | "mousemove"

// Extract object types matching a shape
type Shape = { kind: "circle"; radius: number }
           | { kind: "square"; side: number }
           | { kind: "triangle"; base: number; height: number };

type CircleOrSquare = Extract<Shape, { kind: "circle" } | { kind: "square" }>;
// ^? { kind: "circle"; radius: number } | { kind: "square"; side: number }

The magic is distributive conditional types. When T is a union, TypeScript applies the condition to each member individually. Each member that extends U survives; the rest become never and vanish from the resulting union.
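A short sketch of distribution in action, including the standard trick of wrapping both sides in a one-element tuple to turn it off:

```typescript
// Distributive: the condition is applied to each union member separately
type ToArray<T> = T extends any ? T[] : never;
type A = ToArray<string | number>; // string[] | number[]

// Non-distributive: the tuple wrapper makes TS treat the union as a whole
type ToArrayAll<T> = [T] extends [any] ? T[] : never;
type B = ToArrayAll<string | number>; // (string | number)[]

// Both shapes are usable at runtime
const a: A = [1, 2];      // ok: a number[] satisfies the union of array types
const b: B = ["x", 3];    // ok: a mixed array of the whole union
```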

Exclude<T, U> — Remove Members from a Union

Exclude is the inverse of Extract. It removes union members assignable to U.

typescript
// Built-in implementation
type Exclude<T, U> = T extends U ? never : T;

// Remove null and undefined explicitly
type MaybeUser = string | number | null | undefined;
type DefiniteUser = Exclude<MaybeUser, null | undefined>;
// ^? string | number

// Remove specific event types
type AllEvents = "click" | "scroll" | "resize" | "mousemove";
type NonMouseEvents = Exclude<AllEvents, "click" | "mousemove">;
// ^? "scroll" | "resize"

NonNullable<T> — Strip null and undefined

A specialized version of Exclude that strips out null and undefined — the two types that cause the most runtime crashes.

typescript
// Built-in implementation
type NonNullable<T> = T & {};

// Usage
type MaybeString = string | null | undefined;
type DefiniteString = NonNullable<MaybeString>;
// ^? string
Implementation Changed in TS 4.8

Before TypeScript 4.8, NonNullable was implemented as T extends null | undefined ? never : T (a distributive conditional). The current T & {} implementation is simpler and leverages the fact that intersecting with {} filters out null and undefined since they aren't assignable to {}.

Awaited<T> — Unwrap Promise Types Recursively

Awaited recursively peels off Promise wrappers, giving you the type you'd get after await. It handles nested promises and thenables — not just Promise<T>.

typescript
// Simplified built-in implementation
type Awaited<T> = T extends null | undefined
  ? T
  : T extends object & { then(onfulfilled: infer F, ...args: infer _): any }
    ? F extends (value: infer V, ...args: infer _) => any
      ? Awaited<V>  // recursive unwrap
      : never
    : T;

// Usage
type A = Awaited<Promise<string>>;                    // string
type B = Awaited<Promise<Promise<number>>>;            // number (recursive!)
type C = Awaited<string | Promise<boolean>>;            // string | boolean

// Practical: typing the result of Promise.all
async function fetchData() {
  const [users, posts] = await Promise.all([
    fetch("/api/users").then(r => r.json() as Promise<User[]>),
    fetch("/api/posts").then(r => r.json() as Promise<Post[]>),
  ]);
  // users: User[], posts: Post[] — Awaited unwraps each promise
}

NoInfer<T> — Block Inference at Specific Positions

NoInfer (added in TypeScript 5.4) tells the compiler "do not use this position to infer the type parameter." This is invaluable when you have multiple parameters sharing the same generic, but you only want one of them to drive inference.

typescript
// Without NoInfer — both arguments contribute to inferring S
function createFSM<S extends string>(
  states: S[],
  initialState: S,  // this also contributes to inference of S
) { /* ... */ }

createFSM(["idle", "running", "stopped"], "invalid"); // No error — S infers as "idle" | "running" | "stopped" | "invalid"

// With NoInfer — only `states` drives inference
function createFSM<S extends string>(
  states: S[],
  initialState: NoInfer<S>,  // blocked from inference
) { /* ... */ }

createFSM(["idle", "running", "stopped"], "invalid");
//                                         ^^^^^^^^^ Error!
// Argument of type '"invalid"' is not assignable to
// parameter of type '"idle" | "running" | "stopped"'

ThisParameterType<T> & OmitThisParameter<T>

These two handle the this parameter in function types — a TypeScript-specific feature where you can declare what this must be inside a function.

typescript
// Built-in implementations
type ThisParameterType<T> = T extends (this: infer U, ...args: never) => any ? U : unknown;
type OmitThisParameter<T> = unknown extends ThisParameterType<T>
  ? T
  : T extends (...args: infer A) => infer R
    ? (...args: A) => R
    : T;

// Usage
function formatDate(this: { locale: string }, date: Date): string {
  return date.toLocaleDateString(this.locale);
}

type Ctx = ThisParameterType<typeof formatDate>;
// ^? { locale: string }

type Standalone = OmitThisParameter<typeof formatDate>;
// ^? (date: Date) => string

// Practical: safely binding methods
const bound: Standalone = formatDate.bind({ locale: "en-US" });

ConstructorParameters<T> & InstanceType<T>

These extract information from class constructor signatures — the arguments it accepts and the type of instance it produces.

typescript
// Built-in implementations
type ConstructorParameters<T extends abstract new (...args: any) => any> =
  T extends abstract new (...args: infer P) => any ? P : never;

type InstanceType<T extends abstract new (...args: any) => any> =
  T extends abstract new (...args: any) => infer R ? R : any;

// Usage with a class
class HttpClient {
  constructor(public baseUrl: string, public timeout: number) {}
}

type HttpArgs = ConstructorParameters<typeof HttpClient>;
// ^? [baseUrl: string, timeout: number]

type HttpInstance = InstanceType<typeof HttpClient>;
// ^? HttpClient

// Practical: a generic factory function
function create<T extends new (...args: any[]) => any>(
  Ctor: T,
  ...args: ConstructorParameters<T>
): InstanceType<T> {
  return new Ctor(...args);
}

const client = create(HttpClient, "https://api.example.com", 5000);
// ^? HttpClient — fully type-safe, args checked

Building Custom Utility Types

The built-in types are building blocks. Real-world codebases need custom utilities that address gaps the standard library doesn't cover. Let's build the ones you'll actually use.

DeepPartial<T> — Recursive Optional Properties

Partial<T> only makes top-level properties optional. When you're dealing with deeply nested configuration objects, you need to go deeper.

typescript
type DeepPartial<T> = T extends object
  ? { [K in keyof T]?: DeepPartial<T[K]> }
  : T;

// Why it works:
// 1. Base case: if T isn't an object (string, number, etc.), return T as-is
// 2. Recursive case: map over every key, make it optional (?),
//    and recursively apply DeepPartial to the value type

interface AppConfig {
  database: {
    host: string;
    port: number;
    credentials: { user: string; password: string };
  };
  logging: { level: "debug" | "info" | "error"; file: string };
}

// Now you can pass partial overrides at any depth
function configure(overrides: DeepPartial<AppConfig>) { /* merge with defaults */ }

configure({
  database: { credentials: { password: "new-secret" } },
  // ✅ No need to supply host, port, user, logging, etc.
});

DeepReadonly<T> — Immutable All the Way Down

Readonly<T> only freezes top-level properties. Nested objects remain mutable. DeepReadonly fixes this for configuration objects, state snapshots, and anywhere you need true immutability at the type level.

typescript
type DeepReadonly<T> = T extends (...args: any[]) => any
  ? T  // don't freeze functions
  : T extends object
    ? { readonly [K in keyof T]: DeepReadonly<T[K]> }
    : T;

const config: DeepReadonly<AppConfig> = {
  database: {
    host: "localhost",
    port: 5432,
    credentials: { user: "admin", password: "s3cret" },
  },
  logging: { level: "info", file: "/var/log/app.log" },
};

config.database.port = 3306;
//     ^^^^ Error: Cannot assign to 'port' because it is a read-only property
config.database.credentials.password = "hacked";
//                          ^^^^^^^^ Error: read-only

Prettify<T> — Flatten Intersection Types for Readability

When you hover over an intersection like A & B & C, TypeScript shows the raw intersection — not the resolved shape. Prettify forces TypeScript to compute and display the flattened type. It changes nothing at the type-checking level; it's purely for developer experience.

typescript
type Prettify<T> = {
  [K in keyof T]: T[K];
} & {};

// Why it works:
// The mapped type forces TS to iterate through all keys and
// rebuild the object. The `& {}` prevents TS from "simplifying"
// it back to the original intersection.

type UserBase = { id: string; name: string };
type WithEmail = { email: string };
type WithRole = { role: "admin" | "user" };

// Without Prettify — hover shows: UserBase & WithEmail & WithRole
type User = UserBase & WithEmail & WithRole;

// With Prettify — hover shows the full resolved shape:
type PrettyUser = Prettify<UserBase & WithEmail & WithRole>;
// ^? { id: string; name: string; email: string; role: "admin" | "user" }

UnionToIntersection<U> — Convert Unions to Intersections

This is one of the most clever type-level tricks in TypeScript. It exploits the contravariant behavior of function parameter types to convert a union into an intersection.

typescript
type UnionToIntersection<U> =
  (U extends any ? (arg: U) => void : never) extends
    (arg: infer I) => void
    ? I
    : never;

// Step-by-step for U = A | B | C:
// 1. Distribute: (arg: A) => void | (arg: B) => void | (arg: C) => void
// 2. Infer the parameter: for a union of functions to all accept the
//    same argument, that argument must be A & B & C (contravariance)
// 3. Result: A & B & C

type Combined = UnionToIntersection<
  { name: string } | { age: number } | { email: string }
>;
// ^? { name: string } & { age: number } & { email: string }

// Practical: merging event handler maps
type EventMaps = { onClick: () => void } | { onHover: () => void };
type AllHandlers = Prettify<UnionToIntersection<EventMaps>>;
// ^? { onClick: () => void; onHover: () => void }

UnionToTuple<T> — Convert Unions to Tuples

This is the boss fight of type-level programming. It combines UnionToIntersection, function overloads, and recursive types. Union member order isn't guaranteed by TypeScript, so the tuple order may vary — but the members are correct.

typescript
type UnionToIntersectionFn<U> =
  (U extends any ? (k: () => U) => void : never) extends
    (k: infer I) => void ? I : never;

type LastOfUnion<U> =
  UnionToIntersectionFn<U> extends () => infer Last ? Last : never;

type UnionToTuple<U, Last = LastOfUnion<U>> =
  [U] extends [never]
    ? []
    : [...UnionToTuple<Exclude<U, Last>>, Last];

type Result = UnionToTuple<"a" | "b" | "c">;
// ^? ["a", "b", "c"]
Warning — UnionToTuple is Fragile

UnionToTuple relies on the internal ordering of union members, which TypeScript does not guarantee across versions. It's fine for compile-time assertions and tests, but don't depend on the element order at runtime. Also, unions with boolean (which is true | false) or large unions can cause unexpected results or recursion depth errors.

Entries<T> — Typed Object.entries()

The built-in Object.entries() returns [string, T][], losing key information. This utility type preserves the exact key-value pairs.

typescript
type Entries<T> = {
  [K in keyof T]-?: [K, T[K]];
}[keyof T][];

// Usage
interface ColorMap {
  red: "#ff0000";
  green: "#00ff00";
  blue: "#0000ff";
}

type ColorEntries = Entries<ColorMap>;
// ^? (["red", "#ff0000"] | ["green", "#00ff00"] | ["blue", "#0000ff"])[]

// Use it with a cast helper
function typedEntries<T extends Record<string, unknown>>(obj: T): Entries<T> {
  return Object.entries(obj) as Entries<T>;
}

const colors: ColorMap = { red: "#ff0000", green: "#00ff00", blue: "#0000ff" };
for (const [key, hex] of typedEntries(colors)) {
  console.log(key, hex);
  // key: "red"|"green"|"blue", hex: "#ff0000"|"#00ff00"|"#0000ff"
}

Mutable<T> — Remove readonly from All Properties

The inverse of Readonly<T>. Useful when you receive a frozen type from a library but need a mutable working copy internally — for example, building up a response object before sending.

typescript
type Mutable<T> = {
  -readonly [K in keyof T]: T[K];
};

// The `-readonly` modifier removes the readonly flag,
// just like `-?` removes the optional flag.

interface FrozenUser {
  readonly id: string;
  readonly name: string;
  readonly email: string;
}

type EditableUser = Mutable<FrozenUser>;
// ^? { id: string; name: string; email: string }  — all writable

function toEditable(user: FrozenUser): EditableUser {
  return { ...user }; // now the returned copy is mutable
}

StrictOmit<T, K> — Omit that Errors on Invalid Keys

The built-in Omit silently accepts keys that don't exist on T. This is dangerous during refactoring — you rename a property, but Omit continues to compile with the old key, silently doing nothing. StrictOmit catches this.

typescript
type StrictOmit<T, K extends keyof T> = Omit<T, K>;

// The difference is `K extends keyof T` instead of
// `K extends string | number | symbol` which is what the built-in Omit uses.

interface User {
  id: string;
  name: string;
  email: string;
  passwordHash: string;
}

// Built-in Omit: silently accepts typos
type PublicUser1 = Omit<User, "passwrdHash">;
// ← Typo. No error. passwordHash leaks!

// StrictOmit: catches the error immediately
type PublicUser2 = StrictOmit<User, "passwrdHash">;
//                                   ^^^^^^^^^^^^
// Error: Type '"passwrdHash"' does not satisfy the constraint
//        '"id" | "name" | "email" | "passwordHash"'

type PublicUser3 = StrictOmit<User, "passwordHash">;  // ✅ Correct

Composing Utility Types

The real power isn't any single utility — it's composing them together. Utility types are functions at the type level, and like regular functions, they compose. Nest them to express exactly what you need.

typescript
interface FullConfig {
  database: { host: string; port: number; ssl: boolean };
  auth: { provider: string; secret: string; ttl: number };
  logging: { level: string; output: string };
  features: { darkMode: boolean; beta: boolean };
}

// "Give me only database and auth, deeply frozen"
type SecurityConfig = DeepReadonly<Pick<FullConfig, "database" | "auth">>;
// SecurityConfig.database.port is readonly
// SecurityConfig.auth.secret is readonly

// "Let me override any nested value in the logging section"
type LoggingOverrides = DeepPartial<Pick<FullConfig, "logging">>;

// "All required, but the display shape is clean"
type ResolvedConfig = Prettify<Required<FullConfig>>;

// "A public-safe user type with readable hover info"
type SafeUser = Prettify<StrictOmit<User, "passwordHash">>;
// ^? { id: string; name: string; email: string }
Wrap Prettify on the Outside

When composing multiple utility types, wrap the entire result in Prettify<...> as the outermost layer. This gives you a clean, flattened hover tooltip in your editor — instead of a nested mess of DeepReadonly<Pick<Omit<...>>>.

Quick Reference: Choosing the Right Utility

| You want to... | Use this | Built-in? |
| --- | --- | --- |
| Make all props optional (shallow) | Partial<T> | ✅ |
| Make all props optional (deep) | DeepPartial<T> | Custom |
| Freeze all props (shallow) | Readonly<T> | ✅ |
| Freeze all props (deep) | DeepReadonly<T> | Custom |
| Remove readonly | Mutable<T> | Custom |
| Omit keys with typo safety | StrictOmit<T, K> | Custom |
| Filter a union (keep matches) | Extract<T, U> | ✅ |
| Filter a union (remove matches) | Exclude<T, U> | ✅ |
| Strip null and undefined | NonNullable<T> | ✅ |
| Unwrap Promise<T> | Awaited<T> | ✅ |
| Block type inference at a position | NoInfer<T> | ✅ (5.4+) |
| Flatten intersection for hover | Prettify<T> | Custom |
| Convert union → intersection | UnionToIntersection<U> | Custom |
| Get typed Object.entries | Entries<T> | Custom |

Recursive Types & Deep Type Manipulation

Recursive types are types that reference themselves in their own definition. They are essential for modeling inherently recursive data — trees, linked lists, JSON, nested configurations — and for building deep utility types that operate on arbitrarily nested structures. TypeScript's support for recursive types has matured significantly, especially since version 4.5 introduced tail-recursive conditional type optimizations.

Self-Referential Type Aliases

A type alias can reference itself directly in its definition, as long as the self-reference is deferred — wrapped inside an object type, array, tuple, or function. This is how you model data that can be nested to any depth.

The classic example is a JSON value type. JSON is inherently recursive: an array can contain more JSON values, and an object's properties can be JSON values.

typescript
type JSONValue =
  | string
  | number
  | boolean
  | null
  | JSONValue[]
  | { [key: string]: JSONValue };

// All of these are valid JSONValue assignments:
const name: JSONValue = "Alice";
const nested: JSONValue = {
  users: [{ id: 1, tags: ["admin", "active"], meta: null }],
  count: 42,
};

This works because every self-reference to JSONValue is inside a container — an array type (JSONValue[]) or an object index signature ({ [key: string]: JSONValue }). The self-references are never "bare" or directly circular.

Tree and Linked List Structures

Recursive types are the natural way to express tree nodes. Here, both interface and type alias approaches work. Interfaces can reference themselves through property types with no restrictions.

typescript
// Binary tree using an interface
interface TreeNode<T> {
  value: T;
  left: TreeNode<T> | null;
  right: TreeNode<T> | null;
}

// Linked list using a type alias
type LinkedList<T> = {
  value: T;
  next: LinkedList<T> | null;
};

// Nested menu — common in UI frameworks
interface MenuItem {
  label: string;
  href?: string;
  children?: MenuItem[];
}

const menu: MenuItem[] = [
  { label: "Home", href: "/" },
  {
    label: "Products",
    children: [
      { label: "Software", href: "/software" },
      {
        label: "Hardware",
        children: [
          { label: "Keyboards", href: "/hardware/keyboards" },
          { label: "Monitors", href: "/hardware/monitors" },
        ],
      },
    ],
  },
];
Circular Alias Restriction

You cannot create a directly circular type alias without indirection. type Bad = Bad | string is an error — the self-reference is not deferred through an object, array, or function. Wrapping in an array (type Ok = Ok[] | string) or object property fixes it. Interfaces don't have this limitation because their properties are always deferred.

Deep Utility Types

Standard utility types like Partial<T> and Readonly<T> only operate on the first level of properties. For deeply nested objects, you need recursive mapped types. These types map over each property and, when the property is an object, recursively apply the same transformation.

DeepPartial<T>

typescript
type DeepPartial<T> = T extends object
  ? { [K in keyof T]?: DeepPartial<T[K]> }
  : T;

interface Config {
  db: { host: string; port: number; ssl: { enabled: boolean; cert: string } };
  logging: { level: string };
}

// Every nested property becomes optional
type PartialConfig = DeepPartial<Config>;

const override: PartialConfig = {
  db: { ssl: { enabled: true } },  // no need to specify host, port, or cert
};

DeepReadonly<T> and DeepRequired<T>

The same recursive pattern works for Readonly and Required. The key difference is the modifier you apply in the mapped type.

typescript
type DeepReadonly<T> = T extends object
  ? { readonly [K in keyof T]: DeepReadonly<T[K]> }
  : T;

type DeepRequired<T> = T extends object
  ? { [K in keyof T]-?: DeepRequired<T[K]> }
  : T;

// Freeze an entire config tree at the type level
const frozen: DeepReadonly<Config> = {
  db: { host: "localhost", port: 5432, ssl: { enabled: true, cert: "..." } },
  logging: { level: "info" },
};

// frozen.db.ssl.enabled = false;  // Error: Cannot assign to 'enabled'
Handling Arrays in Deep Utilities

The simple T extends object check also matches arrays and functions, which can produce unexpected results. For production code, add explicit checks: T extends Function ? T : T extends (infer U)[] ? DeepPartial<U>[] : ... to handle arrays and functions correctly.
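A sketch of those guards applied to DeepPartial — the name `DeepPartialSafe` and the `Settings` interface are ours, not standard utilities:

```typescript
// Short-circuit functions, recurse into array elements, then handle objects
type DeepPartialSafe<T> = T extends (...args: any[]) => any
  ? T                                        // leave function signatures intact
  : T extends (infer U)[]
    ? DeepPartialSafe<U>[]                   // recurse into the element type
    : T extends object
      ? { [K in keyof T]?: DeepPartialSafe<T[K]> }
      : T;                                   // primitives pass through

interface Settings {
  onSave: () => void;
  tags: { name: string; weight: number }[];
}

// The callback keeps its full signature; array elements become partial
const patch: DeepPartialSafe<Settings> = {
  tags: [{ name: "a" }], // weight omitted — allowed
};
```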

Recursion Depth Limits

TypeScript imposes recursion limits to prevent infinite type expansion from crashing the compiler. Understanding these limits is critical when designing recursive types.

| Scenario | Typical Limit | Since |
| --- | --- | --- |
| Recursive type instantiation (generic expansion) | ~50 levels | All versions |
| Tail-recursive conditional types | ~1000 levels | TypeScript 4.5+ |
| Template literal type recursion | ~1000 levels (with tail recursion) | TypeScript 4.5+ |

When you exceed these limits, TypeScript emits Type instantiation is excessively deep and possibly infinite. This is a hard stop — you cannot configure a higher limit.

Tail-Recursive Conditional Types

TypeScript 4.5 introduced an optimization for recursive conditional types that are in tail position — meaning the recursive call is the final operation, with no further type-level wrapping around it. When TypeScript detects tail recursion, it unrolls the recursion iteratively, raising the effective depth limit from ~50 to ~1000.

The trick is to restructure your type so the recursive call is the last thing that happens, typically by accumulating results in a type parameter rather than building up after the recursive return.

typescript
// ✅ Tail-recursive despite the accumulator growing inside the call — both
// conditional branches return directly, so nothing wraps the recursive call
type TupleOf<T, N extends number, R extends T[] = []> =
  R["length"] extends N ? R : TupleOf<T, N, [T, ...R]>;

// ✅ Tail-recursive string reversal — accumulator pattern
type Reverse<S extends string, Acc extends string = ""> =
  S extends `${infer Head}${infer Tail}`
    ? Reverse<Tail, `${Head}${Acc}`>  // recursive call is in tail position
    : Acc;

type Reversed = Reverse<"hello">;  // "olleh"

// ❌ NON-tail-recursive — wraps result after recursion
type ReverseNonTail<S extends string> =
  S extends `${infer Head}${infer Tail}`
    ? `${ReverseNonTail<Tail>}${Head}`  // builds string AROUND the recursive call
    : S;

The key difference: in the tail-recursive version, the recursive call Reverse<Tail, ...> is the direct return of the conditional branch. In the non-tail version, ReverseNonTail<Tail> is embedded inside a template literal — TypeScript must resolve the inner recursion before composing the outer string, preventing tail-call optimization.

Advanced Patterns

Recursive String Parsing

You can parse structured strings entirely at the type level by recursively matching patterns with template literal types. This is the foundation for type-safe route parsers, SQL query types, and similar utilities.

typescript
// Extract all :param segments from a route string
type ExtractParams<
  Path extends string,
  Acc extends string = never
> = Path extends `${string}:${infer Param}/${infer Rest}`
  ? ExtractParams<Rest, Acc | Param>
  : Path extends `${string}:${infer Param}`
    ? Acc | Param
    : Acc;

type Params = ExtractParams<"/users/:userId/posts/:postId">;
//   ^? "userId" | "postId"

// Build a typed params object from the route
type RouteParams<Path extends string> = {
  [K in ExtractParams<Path>]: string;
};

type UserPostParams = RouteParams<"/users/:userId/posts/:postId">;
//   ^? { userId: string; postId: string }
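The type-level parse pairs naturally with a runtime matcher. `matchRoute` below is a hypothetical helper (not from any library): it mirrors the type-level extraction with a runtime regex, so the returned object is typed according to the pattern. `ExtractParams` and `RouteParams` are repeated so the snippet stands alone.

```typescript
type ExtractParams<
  Path extends string,
  Acc extends string = never
> = Path extends `${string}:${infer Param}/${infer Rest}`
  ? ExtractParams<Rest, Acc | Param>
  : Path extends `${string}:${infer Param}`
    ? Acc | Param
    : Acc;

type RouteParams<Path extends string> = {
  [K in ExtractParams<Path>]: string;
};

// Hypothetical matcher: runtime regex parse, compile-time typed result
function matchRoute<P extends string>(
  pattern: P,
  url: string
): RouteParams<P> | null {
  const names = Array.from(pattern.matchAll(/:([^/]+)/g), (m) => m[1]);
  const regex = new RegExp("^" + pattern.replace(/:[^/]+/g, "([^/]+)") + "$");
  const match = url.match(regex);
  if (!match) return null;
  const result: Record<string, string> = {};
  names.forEach((name, i) => {
    result[name] = match[i + 1];
  });
  return result as unknown as RouteParams<P>;
}

const params = matchRoute("/users/:userId/posts/:postId", "/users/u1/posts/p9");
// params: { userId: string; postId: string } | null
```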

Flattening Deeply Nested Types

Recursive types can flatten nested arrays or object hierarchies. The Flatten type below recursively unwraps nested arrays to a single level.

typescript
// Deep flatten: recursively unwrap nested arrays
type DeepFlatten<T> = T extends readonly (infer U)[]
  ? DeepFlatten<U>
  : T;

type Nested = number[][][];
type Flat = DeepFlatten<Nested>;  // number

// Controlled-depth flatten with a counter tuple
type FlattenDepth<
  T,
  Depth extends number = 1,
  Counter extends unknown[] = []
> = Counter["length"] extends Depth
  ? T
  : T extends readonly (infer U)[]
    ? FlattenDepth<U, Depth, [...Counter, unknown]>
    : T;

type TwoLevels = FlattenDepth<string[][][], 2>;  // string[]

Recursive Discriminated Unions

Recursive discriminated unions let you model expression trees, ASTs, and other recursive structures where each node has a kind discriminant. TypeScript's narrowing works through each level of recursion when you switch on the discriminant.

typescript
type Expr =
  | { kind: "number"; value: number }
  | { kind: "string"; value: string }
  | { kind: "add"; left: Expr; right: Expr }
  | { kind: "concat"; left: Expr; right: Expr }
  | { kind: "if"; condition: Expr; then: Expr; else: Expr };

function evaluate(expr: Expr): number | string {
  switch (expr.kind) {
    case "number":  return expr.value;
    case "string":  return expr.value;
    case "add":     return (evaluate(expr.left) as number) + (evaluate(expr.right) as number);
    case "concat":  return `${evaluate(expr.left)}${evaluate(expr.right)}`;
    case "if":      return evaluate(expr.condition) ? evaluate(expr.then) : evaluate(expr.else);
  }
}

// Type-safe AST construction
const ast: Expr = {
  kind: "if",
  condition: { kind: "number", value: 1 },
  then: { kind: "add", left: { kind: "number", value: 2 }, right: { kind: "number", value: 3 } },
  else: { kind: "string", value: "fallback" },
};
When You Hit Depth Limits

If your recursive type exceeds TypeScript's depth limit, try three things: (1) restructure to tail-recursive form with an accumulator parameter, (2) limit recursion depth explicitly using a counter tuple, or (3) break the recursion at known boundary types (arrays, primitives, functions) to prevent unnecessary expansion into non-object leaves.

Type-Level Programming & Arithmetic

TypeScript's type system is more than a validator — it's a full programming language that runs at compile time. You can encode numbers, perform arithmetic, parse strings, and build data structures entirely within the type system. The "runtime" is tsc itself, and the "output" is a type that either compiles or produces an error.

This section treats the type system as what it is: a recursive, pattern-matching functional language embedded inside TypeScript. We'll build real utilities from first principles and then draw a clear line between powerful type-level computation and pointless "type golf."

Tuple-Based Arithmetic

TypeScript has no built-in Add or Subtract for number literal types. But it can measure tuple lengths. The fundamental insight: represent a number N as a tuple of length N, manipulate tuples with spreads and recursive conditionals, then read the resulting ['length'] back out as a number literal.

graph LR
    A["Add<3, 2>"] --> B["BuildTuple<3> → [?, ?, ?]"]
    A --> C["BuildTuple<2> → [?, ?]"]
    B --> D["Spread: [...T1, ...T2] → [?, ?, ?, ?, ?]"]
    C --> D
    D --> E["Combined['length'] → 5"]

The core building block is BuildTuple — a recursive type that constructs a tuple of any given length. Once you have tuples, arithmetic is just tuple manipulation.

typescript
// Build a tuple of length N filled with unknown elements
type BuildTuple<
  N extends number,
  T extends unknown[] = []
> = T['length'] extends N ? T : BuildTuple<N, [...T, unknown]>;

// Addition: spread two tuples into one, read the combined length
type Add<A extends number, B extends number> =
  [...BuildTuple<A>, ...BuildTuple<B>]['length'] & number;

// Subtraction: peel elements off the front to find the difference
type Subtract<A extends number, B extends number> =
  BuildTuple<A> extends [...BuildTuple<B>, ...infer Rest]
    ? Rest['length'] & number
    : never; // A < B → no valid natural number result

type Five = Add<3, 2>;        // 5
type Three = Subtract<7, 4>;   // 3
type Nope = Subtract<2, 5>;    // never

Comparison works by the same principle. Build a tuple of length A, then try to match it as a tuple of length B with leftover elements. If there's a non-empty rest, A is greater.

typescript
type IsGreaterThan<A extends number, B extends number> =
  BuildTuple<A> extends [...BuildTuple<B>, infer _First, ...infer _Rest]
    ? true
    : false;

type Yes = IsGreaterThan<5, 3>;  // true
type No  = IsGreaterThan<2, 7>;  // false
type Eq  = IsGreaterThan<4, 4>;  // false
Recursion Depth Limits

TypeScript caps tail-recursive conditional types at roughly 1,000 instantiation levels, and other recursive types at around 50 (exact numbers vary by version). BuildTuple is tail-recursive, so tuple-based arithmetic works well for numbers up to ~999. For larger values, you'd need digit-by-digit string-based approaches — but at that point, ask yourself if you really need compile-time math that large.

String Parsing at the Type Level

Template literal types let you destructure strings with infer the same way you destructure tuples. Combined with recursion, this turns TypeScript into a full string-processing engine at compile time.

Split and Join

typescript
// Split a string by a delimiter into a tuple of substrings
type Split<
  S extends string,
  D extends string
> = S extends `${infer Head}${D}${infer Tail}`
  ? [Head, ...Split<Tail, D>]
  : [S];

type Parts = Split<"a.b.c", ".">;
// → ["a", "b", "c"]

// Join a tuple of strings with a delimiter
type Join<
  T extends string[],
  D extends string
> = T extends [infer First extends string]
  ? First
  : T extends [infer First extends string, ...infer Rest extends string[]]
    ? `${First}${D}${Join<Rest, D>}`
    : "";

type Rejoined = Join<["a", "b", "c"], "-">;
// → "a-b-c"

Parsing Numeric Strings

A particularly useful trick is converting string literal types to number literal types. This comes up when you read numbers from template literals, config strings, or route parameters.

typescript
// Convert a string literal "42" → number literal 42
type ParseInt<S extends string> =
  S extends `${infer N extends number}` ? N : never;

type Forty2 = ParseInt<"42">;      // 42 (number literal)
type Nah    = ParseInt<"hello">;    // never

// Parse a semver-style version string into its components
type ParseVersion<S extends string> =
  S extends `${infer Major extends number}.${infer Minor extends number}.${infer Patch extends number}`
    ? { major: Major; minor: Minor; patch: Patch }
    : never;

type V = ParseVersion<"3.11.2">;
// → { major: 3; minor: 11; patch: 2 }

Pattern Matching with Conditional Types

Conditional types are the if/else and switch of type-level programming. Every recursive type utility you've seen so far depends on them. The pattern is always the same: test a shape with extends, extract pieces with infer, and recurse or return.

You can chain conditional types for multi-branch logic, creating exhaustive pattern matches that would feel at home in Haskell or Rust:

typescript
// Type-level pattern match on a value's "shape"
type Classify<T> =
  T extends null | undefined  ? "nullish"  :
  T extends boolean           ? "boolean"  :
  T extends number            ? "number"   :
  T extends string            ? "string"   :
  T extends unknown[]         ? "array"    :
  T extends (...args: any[]) => any ? "function" :
  T extends object            ? "object"   :
  "unknown";

type A = Classify<42>;          // "number"
type B = Classify<[1, 2, 3]>;   // "array"
type C = Classify<() => void>;  // "function"

The key insight: order matters. Just like a switch statement, TypeScript evaluates each extends branch top-down and takes the first match. Since arrays are objects and functions are objects, you must test more specific types before broader ones.
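To see why, here's a deliberately mis-ordered variant (the name `ClassifyWrong` is ours) where the broad `object` branch swallows arrays before the array test can run:

```typescript
// Mis-ordered: `object` is tested before `unknown[]`
type ClassifyWrong<T> =
  T extends object    ? "object" :
  T extends unknown[] ? "array"  :  // unreachable for arrays
  "other";

type Oops = ClassifyWrong<number[]>;  // "object", not "array"

// This assignment compiles only because the array was misclassified
const proof: Oops = "object";
```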

Type-Level Data Structures

Once you can recurse and pattern-match, you can build data structures that exist purely at the type level. These aren't runtime values — they're shapes the compiler reasons about.

Type-Level Linked List

typescript
// A linked list encoded as nested object types
type Nil = { tag: "nil" };
type Cons<Head, Tail> = { tag: "cons"; head: Head; tail: Tail };

// Build a list: 1 → 2 → 3 → nil
type MyList = Cons<1, Cons<2, Cons<3, Nil>>>;

// Get the head of a list
type Head<L> = L extends Cons<infer H, any> ? H : never;

// Get the length
type Length<L, Acc extends unknown[] = []> =
  L extends Nil
    ? Acc['length']
    : L extends Cons<any, infer Tail>
      ? Length<Tail, [...Acc, unknown]>
      : never;

type First = Head<MyList>;     // 1
type Len   = Length<MyList>;    // 3

Type-Level Maps (Object Types as Dictionaries)

Object types already behave like compile-time dictionaries. You can use mapped types and key lookup to implement Get, Set, and Delete operations:

typescript
// Get a value from a type-level map
type Get<Map, Key extends keyof Map> = Map[Key];

// Set (upsert) a key in a type-level map
type Set<Map, Key extends string, Value> =
  Omit<Map, Key> & Record<Key, Value>;

// Delete a key from a type-level map
type Delete<Map, Key extends string> = Omit<Map, Key>;

type Config = { host: "localhost"; port: 3000; debug: true };

type Updated = Set<Config, "port", 8080>;
// → { host: "localhost"; debug: true; port: 8080 }

type Minimal = Delete<Config, "debug">;
// → { host: "localhost"; port: 3000 }

Practical Applications

Type-level programming isn't just an academic exercise. Here are three patterns that ship in real production code.

Tuple Length Constraints

Enforce that a function receives exactly the right number of arguments at compile time, not runtime:

typescript
type ExactLength<T extends unknown[], N extends number> =
  T['length'] extends N ? T : never;

function createPoint<T extends number[]>(
  coords: [...T] & ExactLength<T, 2>
): { x: number; y: number } {
  return { x: coords[0], y: coords[1] };
}

createPoint([10, 20]);      // ✓ compiles
// createPoint([10, 20, 30]); // ✗ rejected — ExactLength<T, 2> is never

Compile-Time API Version Validation

typescript
type ParseVersion<S extends string> =
  S extends `${infer Maj extends number}.${infer Min extends number}.${infer Pat extends number}`
    ? { major: Maj; minor: Min; patch: Pat }
    : never;

type IsCompatible<
  Current extends string,
  Required extends string
> = IsGreaterThan<
  ParseVersion<Current>['major'],
  ParseVersion<Required>['major']
> extends true
  ? false  // Major version bump = breaking change
  : ParseVersion<Current>['major'] extends ParseVersion<Required>['major']
    ? true // Same major = compatible (simplified)
    : false;

type Ok   = IsCompatible<"3.2.1", "3.0.0">;  // true
type Nope2 = IsCompatible<"4.0.0", "3.0.0">;  // false

Type-Safe printf-Style Format Strings

This is the crown jewel of type-level string parsing. Given a format string like "%s has %d items", we can extract the specifiers and produce a correctly-typed function signature — all at compile time.

typescript
// Map format specifiers to their expected types
type SpecifierToType<S extends string> =
  S extends "s" ? string :
  S extends "d" ? number :
  S extends "b" ? boolean :
  never;

// Parse a format string into a tuple of expected argument types
type ParseFormatString<S extends string> =
  S extends `${string}%${infer Spec}${infer Rest}`
    ? [SpecifierToType<Spec>, ...ParseFormatString<Rest>]
    : [];

// Test it: "%s has %d items" → [string, number]
type Args = ParseFormatString<"%s has %d items">;
// → [string, number]

// A type-safe printf wrapper
declare function printf<F extends string>(
  format: F,
  ...args: ParseFormatString<F>
): string;

printf("%s has %d items", "Alice", 42);      // ✓
// printf("%s has %d items", "Alice", "oops"); // ✗ 'string' not 'number'
Real-World Precedent

This isn't theoretical. Libraries like ts-pattern, Drizzle ORM, and tRPC all use recursive template literal parsing to deliver fully type-safe APIs.

Useful Type Computation vs. "Type Golf"

TypeScript's type system is Turing complete. You can implement Fibonacci sequences, Brainfuck interpreters, and even ray tracers entirely in types. But should you?

| Category | Ship It ✓ | Type Golf ✗ |
| --- | --- | --- |
| String parsing | Route params, format strings, template DSLs | Parsing JSON/HTML at the type level |
| Arithmetic | Tuple length checks, index bounds | Computing prime numbers in types |
| Data structures | Mapped object transforms, recursive DeepPartial | Type-level linked list sorting algorithms |
| Pattern matching | Discriminated unions, exhaustive checking | Full regex engine in the type system |

Here are the rules of thumb for deciding when type-level computation is worth it:

  • Error messages: If your clever type produces Type 'X' is not assignable to type 'Y extends Z ? Foo<Bar<Baz<...>>> : never', it's harming DX more than it helps. Aim for errors a teammate can read in 5 seconds.
  • Compile time: Deep recursion slows tsc. If your types add noticeable delay to the IDE, the productivity cost outweighs the safety benefit.
  • Maintainability: If the type requires a paragraph-length comment to explain, consider runtime validation (like Zod) instead. Types should make code clearer, not obscure it.
Turing Completeness ≠ Permission

TypeScript's Turing completeness at the type level is an emergent property, not a design goal. The TypeScript team doesn't optimise for it and may break advanced type-level patterns between versions. Treat deep type-level computation as a power tool: reach for it deliberately, not by default.

Branded Types, Nominal Typing & Phantom Types

TypeScript's type system is structural: two types are compatible if their shapes match, regardless of what you named them. Most of the time this is exactly what you want. But when you have two semantically distinct values that share the same underlying shape — like a user ID and a post ID, both string — structural typing lets you silently swap one for the other. Branded types, nominal patterns, and phantom types exist to close this gap.

The Problem: Structural Equivalence Hides Bugs

Consider a function that fetches a user and another that fetches a post. Both accept a string ID. Nothing stops you from passing a post ID where a user ID is expected.

typescript
type UserId = string;
type PostId = string;

function getUser(id: UserId) { /* ... */ }
function getPost(id: PostId) { /* ... */ }

const postId: PostId = "post_abc123";
getUser(postId); // ✅ No error — but this is a bug!

TypeScript sees UserId and PostId as identical — both resolve to string. The compiler can't help you here because it only checks structure. This class of bug is silent, compiles cleanly, and typically surfaces as a confusing 404 or corrupt data in production.

The Branding Pattern

The core idea is to intersect a primitive type with a phantom property that exists only at the type level. This "brands" the type so that one branded string is not assignable to another, even though at runtime they're both plain strings.

typescript
// Approach 1: unique symbol per brand
declare const __brand: unique symbol;
type Brand<T, B> = T & { readonly [__brand]: B };

type UserId = Brand<string, "UserId">;
type PostId = Brand<string, "PostId">;

function getUser(id: UserId) { /* ... */ }
function getPost(id: PostId) { /* ... */ }

const postId = "post_abc123" as PostId;
getUser(postId); // ❌ Error: PostId is not assignable to UserId

The Brand utility type intersects T with an object containing a symbol-keyed property. Because each brand string literal ("UserId", "PostId") is different, the resulting types are structurally incompatible. At runtime, the __brand property doesn't exist — it's purely a compile-time guard.

Why unique symbol?

Using a unique symbol as the property key prevents accidental collisions. A plain string key like __brand could theoretically match a real property on an object. The unique symbol approach is the most airtight, though the simpler string-key version ({ readonly __brand: "UserId" }) works fine in most codebases.
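For reference, a sketch of the string-key variant (the names `Branded`, `AccountId`, and `InvoiceId` are illustrative):

```typescript
// String-key variant — simpler, adequate when no real property
// named `__brand` can plausibly occur on the wrapped values
type Branded<T, B extends string> = T & { readonly __brand: B };

type AccountId = Branded<string, "AccountId">;
type InvoiceId = Branded<string, "InvoiceId">;

const acc = "acc_1" as AccountId;
// const mixed: InvoiceId = acc; // ❌ "AccountId" is not assignable to "InvoiceId"
```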

Validation at the Boundary: Type Guard Constructors

Casting with as defeats the purpose if you scatter it everywhere. The correct pattern is to validate data at system boundaries — API responses, form inputs, database reads — and return branded types from those validation points. The rest of your code then benefits from compile-time safety with zero additional casts.

typescript
function parseUserId(input: string): UserId {
  if (!/^usr_[a-z0-9]{8,}$/.test(input)) {
    throw new Error(`Invalid user ID format: ${input}`);
  }
  return input as UserId; // single cast, guarded by validation
}

function parsePostId(input: string): PostId {
  if (!/^post_[a-z0-9]{8,}$/.test(input)) {
    throw new Error(`Invalid post ID format: ${input}`);
  }
  return input as PostId;
}

// Usage at API boundary
const userId = parseUserId(req.params.userId); // UserId
const postId = parsePostId(req.params.postId); // PostId
getUser(userId);  // ✅
getUser(postId);  // ❌ Compile error

The as cast is confined to the constructor function — a single, auditable location. Every consumer of UserId downstream is guaranteed to have passed through validation.
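Alongside throwing constructors, a type-guard predicate lets callers branch on validity instead of catching. A sketch, with the brand machinery repeated so the snippet stands alone (`isUserId` and `firstValidId` are illustrative names):

```typescript
declare const __brand: unique symbol;
type Brand<T, B> = T & { readonly [__brand]: B };
type UserId = Brand<string, "UserId">;

// Predicate form: narrows `input` to UserId when it returns true
function isUserId(input: string): input is UserId {
  return /^usr_[a-z0-9]{8,}$/.test(input);
}

function firstValidId(candidates: string[]): UserId | null {
  for (const c of candidates) {
    if (isUserId(c)) return c; // `c` is narrowed to UserId here
  }
  return null;
}
```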

Branding Objects, Not Just Primitives

The same technique works on object types. This is useful when two interfaces have identical shapes but represent different domain concepts.

typescript
type USD = Brand<number, "USD">;
type EUR = Brand<number, "EUR">;

function chargeUSD(amount: USD) { /* ... */ }

const euros = 49.99 as EUR;
chargeUSD(euros); // ❌ Error: EUR is not assignable to USD

// Explicit conversion required
function eurToUsd(eur: EUR, rate: number): USD {
  return (eur * rate) as USD;
}

This pattern is powerful for financial applications where mixing currencies is a critical correctness bug, not just a style issue. The type system forces explicit conversions through well-defined functions.

The Opaque Type Pattern

Some codebases use an Opaque type that's conceptually identical to Brand but names the intent more clearly. The idea: the internal representation is hidden ("opaque") from consumers.

typescript
// Opaque: same mechanism, different naming convention
type Opaque<T, Token> = T & { readonly __opaque__: Token };

type Email = Opaque<string, "Email">;
type URL   = Opaque<string, "URL">;

function sendEmail(to: Email, link: URL) { /* ... */ }

const email = validateEmail("alice@example.com"); // returns Email
const url   = validateURL("https://example.com"); // returns URL

sendEmail(email, url);  // ✅
sendEmail(url, email);  // ❌ Arguments swapped — caught at compile time

Libraries like ts-brand and type-fest (which includes Opaque) provide battle-tested versions of these utilities so you don't have to maintain your own.

Phantom Types: Encoding State Without Runtime Cost

A phantom type is a type parameter that appears in the type signature but has no runtime representation. It exists solely to carry compile-time information — like "this string has been sanitized" or "this form has been submitted." The key difference from branding: phantom types often use a generic parameter to track state transitions.

typescript
// Phantom type for validated/unvalidated distinction
type Validated<T> = T & { readonly __validated: true };

type RawHTML = string;
type SafeHTML = Validated<string> & { readonly __kind: "SafeHTML" };

function sanitize(raw: RawHTML): SafeHTML {
  const clean = raw.replace(/<script\b[^<]*(?:(?!<\/script>)<[^<]*)*<\/script>/gi, "");
  return clean as SafeHTML;
}

function renderToDOM(html: SafeHTML) {
  document.body.innerHTML = html; // Safe — we know it's sanitized
}

renderToDOM("<script>alert('xss')</script>"); // ❌ string is not SafeHTML
renderToDOM(sanitize(userInput));               // ✅

The __validated property never exists at runtime. It's a compile-time contract: "this value has passed through a sanitizer." Any function that requires SafeHTML can only receive values that went through sanitize().

Phantom Types for State Machines

Where phantom types really shine is encoding state transitions in the type system. You can make it impossible to call operations on an entity in the wrong state, with the compiler enforcing the valid transitions.

typescript
type FormState = "draft" | "submitted" | "approved" | "rejected";

type Form<S extends FormState> = {
  readonly _state: S;   // phantom — set via cast, not at runtime
  title: string;
  body: string;
};

function createDraft(title: string, body: string): Form<"draft"> {
  return { title, body } as Form<"draft">;
}

function submit(form: Form<"draft">): Form<"submitted"> {
  // ...send to server
  return form as unknown as Form<"submitted">;
}

function approve(form: Form<"submitted">): Form<"approved"> {
  return form as unknown as Form<"approved">;
}

// Enforced transitions
const draft = createDraft("My Post", "Hello world");
const submitted = submit(draft);     // ✅ draft → submitted
const approved = approve(submitted); // ✅ submitted → approved
approve(draft); // ❌ Error: Form<"draft"> is not Form<"submitted">

Each state transition function accepts only the correct input state and returns the next state. You cannot approve a draft or submit an already-approved form — the compiler rejects it. This moves business rules from runtime checks into the type system.

When to reach for these patterns

Branded types shine at system boundaries — parsing API responses, reading from databases, handling user input. Phantom state types are most valuable in workflow-heavy domains — document approval pipelines, payment processing, multi-step wizards. Don't brand every string in your codebase; apply these patterns where a mix-up would cause real damage.

Real-World Example: Validated Database IDs

Pulling the patterns together, here's how you might handle database IDs in a backend application. Branded types ensure you never query the wrong table, and validation constructors guarantee format correctness.

typescript
declare const __brand: unique symbol;
type Brand<T, B> = T & { readonly [__brand]: B };

// Domain IDs
type UserId    = Brand<string, "UserId">;
type OrderId   = Brand<string, "OrderId">;
type ProductId = Brand<string, "ProductId">;

// Constructors with UUID validation
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

function toUserId(raw: string): UserId {
  if (!UUID_RE.test(raw)) throw new Error(`Invalid UserId: ${raw}`);
  return raw as UserId;
}

function toOrderId(raw: string): OrderId {
  if (!UUID_RE.test(raw)) throw new Error(`Invalid OrderId: ${raw}`);
  return raw as OrderId;
}

// Repository functions accept only the correct branded type
declare function findUser(id: UserId): Promise<User>;
declare function findOrder(id: OrderId): Promise<Order>;

// At the API boundary
const userId  = toUserId(req.params.userId);
const orderId = toOrderId(req.params.orderId);

findUser(userId);   // ✅
findUser(orderId);  // ❌ Compile error — OrderId ≠ UserId
| Pattern | Use Case | Runtime Cost | Complexity |
| --- | --- | --- | --- |
| Type alias | Documentation only | None | Trivial |
| Branded type | Prevent ID/value mix-ups | None | Low |
| Opaque type | Module-boundary encapsulation | None | Low |
| Phantom state | State machine transitions | None | Medium |
| Runtime class | True nominal typing via instanceof | Object allocation | Medium |
Brands are erased at runtime

Branded types provide zero runtime safety. If you bypass TypeScript (e.g., any casts, untyped JSON parsing, JavaScript interop), the brand disappears. Always pair branding with a runtime validation function at the entry point. The brand is a proof certificate that validation happened — not the validation itself.
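The erasure is easy to demonstrate. A sketch (brand machinery repeated so the snippet stands alone; `toUserId` is an illustrative validator):

```typescript
declare const __brand: unique symbol;
type Brand<T, B> = T & { readonly [__brand]: B };
type UserId = Brand<string, "UserId">;

// At runtime a branded string is just a string — nothing to inspect
const id = "usr_abc12345" as UserId;

// Unchecked data can be cast straight past the brand; TypeScript cannot stop this
const smuggled = JSON.parse('"not-an-id"') as UserId;

// The real guarantee comes from funneling every value through a validator
function toUserId(raw: string): UserId {
  if (!raw.startsWith("usr_")) throw new Error(`Invalid UserId: ${raw}`);
  return raw as UserId;
}
```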

Higher-Kinded Type Patterns & Workarounds

In languages like Haskell or Scala, you can write functions that abstract over type constructors — not just Array<number>, but Array itself as a thing that takes a type parameter. TypeScript's type system has no syntax for this. You can't write type Functor<F<_>> because generic type parameters can't themselves be generic.

This limitation matters when you want to write a single map interface that works uniformly over Arrays, Promises, Options, and Eithers — without repeating yourself for each container type. Higher-kinded types (HKTs) are the theoretical tool for this. What follows are the practical encoding patterns the TypeScript community has developed to work around the gap.

What Are Higher-Kinded Types?

A kind describes the "type of a type." Concrete types like string or number have kind * (pronounced "type"). A type constructor like Array has kind * → * — it takes one type and produces another. A higher-kinded type is a type variable that ranges over type constructors rather than concrete types.

Consider this pseudocode that TypeScript cannot express:

typescript
// ❌ This is NOT valid TypeScript
interface Functor<F<_>> {
  map<A, B>(fa: F<A>, f: (a: A) => B): F<B>;
}

// We wish we could then write:
const arrayFunctor: Functor<Array> = { map: (fa, f) => fa.map(f) };
const promiseFunctor: Functor<Promise> = { map: (fa, f) => fa.then(f) };

Without HKTs, you'd have to define ArrayFunctor, PromiseFunctor, OptionFunctor as completely separate, unrelated interfaces. The URI-based encoding pattern solves this by turning the type constructor into a string literal lookup.

The URI-Based Encoding (fp-ts Pattern)

The core insight is simple: represent each type constructor with a unique string literal (its URI), then use an interface as a lookup table that maps each URI to its concrete applied type. Module augmentation lets anyone register new type constructors into this table.

graph LR
    A["Kind<'Array', number>"] -->|"Looks up URI in"| B["URItoKind<number>"]
    B -->|"Finds 'Array' key"| C["{ Array: Array<number> }"]
    C -->|"Resolves to"| D["Array<number>"]

    E["Kind<'Option', string>"] -->|"Looks up URI in"| F["URItoKind<string>"]
    F -->|"Finds 'Option' key"| G["{ Option: Option<string> }"]
    G -->|"Resolves to"| H["Option<string>"]

    style A fill:#2d3748,stroke:#63b3ed,color:#e2e8f0
    style D fill:#2d3748,stroke:#68d391,color:#e2e8f0
    style E fill:#2d3748,stroke:#63b3ed,color:#e2e8f0
    style H fill:#2d3748,stroke:#68d391,color:#e2e8f0
    

Here's the complete encoding, built step by step:

typescript
// 1. The lookup table — starts empty, extended via module augmentation
interface URItoKind<A> {}

// 2. A union of all registered URIs
type URIS = keyof URItoKind<any>;

// 3. The "Kind" type: looks up the URI in the table
type Kind<URI extends URIS, A> = URItoKind<A>[URI];

// 4. Register Array as a type constructor — done from any file,
//    assuming the declarations above live in './hkt'
declare module './hkt' {
  interface URItoKind<A> {
    Array: Array<A>;
  }
}

// 5. Now Kind<'Array', number> === Array<number> ✅
type Test = Kind<'Array', number>; // Array<number>
How Module Augmentation Makes This Work

TypeScript's declare module lets you add properties to an existing interface from a different file. Each library or module that defines a new container type (like Option, Either, or Task) augments URItoKind to register itself. The Kind type then automatically resolves to the correct concrete type — no central registry file needed.
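The same merging works within a single file, since interface declarations with the same name merge automatically. As a sketch (the `Option` encoding here is ours), this exercises the mechanism without a second file:

```typescript
// Empty registry plus the Kind machinery
interface URItoKind<A> {}
type URIS = keyof URItoKind<any>;
type Kind<URI extends URIS, A> = URItoKind<A>[URI];

// A minimal Option container
type Option<A> = { _tag: "none" } | { _tag: "some"; value: A };

// Same-file declaration merging registers Option — the in-file
// equivalent of cross-file `declare module` augmentation
interface URItoKind<A> {
  Option: Option<A>;
}

type Registered = Kind<"Option", number>; // resolves to Option<number>
const someValue: Registered = { _tag: "some", value: 42 };
```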

Implementing Functor, Monad, and Applicative

With the Kind encoding in place, you can define type class interfaces that abstract over any registered type constructor. These interfaces constrain the URI to be a valid key in the registry, and use Kind<URI, A> wherever Haskell would write f a.

Functor

typescript
interface Functor<F extends URIS> {
  readonly URI: F;
  readonly map: <A, B>(fa: Kind<F, A>, f: (a: A) => B) => Kind<F, B>;
}

// Concrete instance for Array
const arrayFunctor: Functor<'Array'> = {
  URI: 'Array',
  map: (fa, f) => fa.map(f),
};

Applicative and Monad

Applicative adds the ability to lift a value into the container (of) and apply a wrapped function to a wrapped value (ap). Monad adds chain (also called flatMap or bind), which sequences dependent computations.

typescript
interface Applicative<F extends URIS> extends Functor<F> {
  readonly of: <A>(a: A) => Kind<F, A>;
  readonly ap: <A, B>(fab: Kind<F, (a: A) => B>, fa: Kind<F, A>) => Kind<F, B>;
}

interface Monad<F extends URIS> extends Applicative<F> {
  readonly chain: <A, B>(fa: Kind<F, A>, f: (a: A) => Kind<F, B>) => Kind<F, B>;
}

// Full Monad instance for Array
const arrayMonad: Monad<'Array'> = {
  URI: 'Array',
  map: (fa, f) => fa.map(f),
  of: (a) => [a],
  ap: (fab, fa) => fab.flatMap(f => fa.map(f)),
  chain: (fa, f) => fa.flatMap(f),
};

The power comes from writing generic functions that accept any Monad<F>:

typescript
// Works with Array, Option, Either, Task — anything with a Monad instance
function flatMapTwice<F extends URIS, A, B, C>(
  M: Monad<F>,
  fa: Kind<F, A>,
  f: (a: A) => Kind<F, B>,
  g: (b: B) => Kind<F, C>,
): Kind<F, C> {
  return M.chain(M.chain(fa, f), g);
}

// Use with arrays
const result = flatMapTwice(
  arrayMonad,
  [1, 2, 3],
  (n) => [n, n * 10],
  (n) => [String(n)],
); // ['1', '10', '2', '20', '3', '30']

Alternative Approaches

The URI-based pattern from fp-ts is the most complete solution, but it's heavy machinery. Depending on your needs, lighter alternatives exist.

Mapped Types as Lightweight HKTs

If you only need to abstract over a known, fixed set of type constructors, a mapped type with a discriminating key avoids the module augmentation dance entirely:

typescript
// Closed registry — no module augmentation needed
interface TypeMap<A> {
  array: Array<A>;
  promise: Promise<A>;
  set: Set<A>;
}

type ApplyKind<Tag extends keyof TypeMap<any>, A> = TypeMap<A>[Tag];

// ApplyKind<'array', number> === Array<number> ✅
// ApplyKind<'promise', string> === Promise<string> ✅
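
One way to make the closed registry useful at runtime is to pair it with a dictionary of lifting functions keyed by the same tags. Here's a sketch; the `liftInto` dictionary is an illustrative helper, not a standard API:

```typescript
interface TypeMap<A> {
  array: Array<A>;
  promise: Promise<A>;
  set: Set<A>;
}

type ApplyKind<Tag extends keyof TypeMap<any>, A> = TypeMap<A>[Tag];

// Runtime dictionary mirroring the type-level registry
const liftInto = {
  array: <A>(a: A): ApplyKind<'array', A> => [a],
  promise: <A>(a: A): ApplyKind<'promise', A> => Promise.resolve(a),
  set: <A>(a: A): ApplyKind<'set', A> => new Set([a]),
};

const xs = liftInto.array(42);  // Array<number>
const s = liftInto.set("hi");   // Set<string>
```

Because the tag set is closed, autocomplete on `liftInto.` shows exactly the registered containers, and each call resolves to a precise concrete type.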

Defunctionalization

Defunctionalization replaces function application at the type level with explicit "apply" types. Instead of passing a type constructor directly, you pass a symbol representing it and an interpreter that "applies" it:

typescript
// A "type-level function" is an interface with a hidden placeholder
interface HKT {
  readonly _A: unknown;
  readonly type: unknown;
}

// Each type constructor becomes a concrete HKT implementation
interface ArrayHKT extends HKT {
  readonly type: Array<this['_A']>;
}

interface PromiseHKT extends HKT {
  readonly type: Promise<this['_A']>;
}

// "Apply" the HKT by injecting A into the _A slot
type Apply<F extends HKT, A> = (F & { readonly _A: A })['type'];

// Apply<ArrayHKT, number> === Array<number> ✅
type Mapped = Apply<PromiseHKT, string>; // Promise<string>

This approach uses TypeScript's this type in interfaces to achieve type-level function application. It's more self-contained than URI-based encoding — no module augmentation or string literal registry — but it's harder to read and the this-type trick can confuse tooling.
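
To see that this encoding supports the same type-class style as the URI approach, here's a sketch of a Functor over defunctionalized constructors (`FunctorHKT` and `arrayFunctorHKT` are illustrative names, not a library API):

```typescript
interface HKT {
  readonly _A: unknown;
  readonly type: unknown;
}

interface ArrayHKT extends HKT {
  readonly type: Array<this['_A']>;
}

// "Apply" the HKT by injecting A into the _A slot
type Apply<F extends HKT, A> = (F & { readonly _A: A })['type'];

// Functor abstracted over a defunctionalized type constructor
interface FunctorHKT<F extends HKT> {
  readonly map: <A, B>(fa: Apply<F, A>, f: (a: A) => B) => Apply<F, B>;
}

const arrayFunctorHKT: FunctorHKT<ArrayHKT> = {
  map: (fa, f) => fa.map(f),
};

const doubled = arrayFunctorHKT.map([1, 2, 3], (n) => n * 2); // [2, 4, 6]
```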

Generic Classes as Type-Level Functions

A class or interface with a generic parameter can serve as a crude type-level function. You pass the class itself and use InstanceType or conditional types to extract the applied form:

typescript
class Box<A> {
  constructor(readonly value: A) {}
  map<B>(f: (a: A) => B): Box<B> {
    return new Box(f(this.value));
  }
}

// Use a generic function constrained by a "Mappable" shape
function doubleMap<C extends { map(f: (a: any) => any): any }>(
  container: C,
  f: (a: any) => any,
  g: (a: any) => any,
): any {
  return container.map(f).map(g);
}

This is simple but sacrifices type safety — the any types in the constraint lose track of the inner type parameter. It's fine for quick structural polymorphism but not for building rigorous abstractions.

The Type Classes Pattern (Ad-Hoc Polymorphism)

You don't always need full HKT encoding. If your goal is ad-hoc polymorphism — different behavior for different types under a common interface — a dictionary-passing style works well in TypeScript without any HKT machinery:

typescript
// Type class: anything that can be compared for equality
interface Eq<A> {
  equals(a: A, b: A): boolean;
}

// Type class: anything that can be combined
interface Semigroup<A> {
  concat(a: A, b: A): A;
}

// Instances — explicit dictionaries, not implicit resolution
const eqNumber: Eq<number> = { equals: (a, b) => a === b };
const eqString: Eq<string> = { equals: (a, b) => a === b };
const semigroupSum: Semigroup<number> = { concat: (a, b) => a + b };

// Generic function parameterized by the type class instance
function deduplicate<A>(E: Eq<A>, items: A[]): A[] {
  return items.reduce<A[]>(
    (acc, item) => acc.some(x => E.equals(x, item)) ? acc : [...acc, item],
    [],
  );
}

deduplicate(eqNumber, [1, 2, 2, 3]); // [1, 2, 3]
deduplicate(eqString, ['a', 'b', 'a']); // ['a', 'b']

This pattern gives you the core benefit of type classes — swappable implementations under a shared interface — while keeping the types concrete. It's the right choice when you don't need to abstract over the container shape itself.
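
The `Semigroup` dictionary composes the same way. Here's a sketch of a fold parameterized by the instance (`concatAll` and `semigroupString` are illustrative additions):

```typescript
interface Semigroup<A> {
  concat(a: A, b: A): A;
}

const semigroupSum: Semigroup<number> = { concat: (a, b) => a + b };
const semigroupString: Semigroup<string> = { concat: (a, b) => a + b };

// Fold a list using whichever Semigroup instance is passed in
function concatAll<A>(S: Semigroup<A>, initial: A, items: A[]): A {
  return items.reduce((acc, a) => S.concat(acc, a), initial);
}

const total = concatAll(semigroupSum, 0, [1, 2, 3]);       // 6
const joined = concatAll(semigroupString, "", ["a", "b"]); // "ab"
```

Swapping the instance swaps the behavior; the types stay concrete throughout.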

Practical Application: A Unified Pipeline

Here's a realistic example that ties the HKT encoding together. We define Option, register it, create instances, and write a generic validation pipeline that works with both Array and Option:

typescript
// --- Option type definition ---
type Option<A> = { _tag: 'None' } | { _tag: 'Some'; value: A };
const none: Option<never> = { _tag: 'None' };
const some = <A>(a: A): Option<A> => ({ _tag: 'Some', value: a });

// Register Option in the HKT system
declare module './hkt' {
  interface URItoKind<A> {
    Option: Option<A>;
  }
}

// Monad instance for Option
const optionMonad: Monad<'Option'> = {
  URI: 'Option',
  map: (fa, f) => (fa._tag === 'None' ? none : some(f(fa.value))),
  of: some,
  ap: (fab, fa) =>
    fab._tag === 'None' || fa._tag === 'None'
      ? none
      : some(fab.value(fa.value)),
  chain: (fa, f) => (fa._tag === 'None' ? none : f(fa.value)),
};

// --- Generic pipeline that works with ANY Monad ---
function pipeline<F extends URIS, A, B, C>(
  M: Monad<F>,
  input: Kind<F, A>,
  step1: (a: A) => Kind<F, B>,
  step2: (b: B) => Kind<F, C>,
): Kind<F, C> {
  return M.chain(M.chain(input, step1), step2);
}

// Use with Option
const parseAge = (s: string): Option<number> => {
  const n = parseInt(s, 10);
  return isNaN(n) ? none : some(n);
};
const validateAge = (n: number): Option<number> =>
  n >= 0 && n <= 150 ? some(n) : none;

pipeline(optionMonad, some('42'), parseAge, validateAge); // { _tag: 'Some', value: 42 }
pipeline(optionMonad, some('abc'), parseAge, validateAge); // { _tag: 'None' }

Trade-Offs and When to Use What

| Approach | Type Safety | Complexity | Extensible | Best For |
| --- | --- | --- | --- | --- |
| URI-based (fp-ts) | Full | High | Yes (module augmentation) | FP libraries, large abstraction layers |
| Defunctionalization | Full | Medium-High | Yes (new HKT interfaces) | Self-contained HKT usage |
| Mapped type registry | Full | Low | No (closed set) | Known, fixed set of containers |
| Structural (class-based) | Partial | Low | N/A | Quick polymorphism, prototyping |
| Dictionary passing (no HKTs) | Full | Low | Yes | Ad-hoc polymorphism on concrete types |

Don't Reach for HKTs by Default

HKT encodings add real complexity — module augmentation, string literal URIs, indirect type resolution, and error messages that mention Kind<'Option', A> instead of Option<A>. If you're writing application code (not a library), simple generics and dictionary-passing cover 95% of use cases. Reserve HKT patterns for situations where you genuinely need to write functions polymorphic over the container type itself.

Tip

If you want to use HKT-based abstractions without building the encoding yourself, the Effect library (successor to fp-ts) provides a mature, well-tested HKT system with excellent type inference. Start there rather than rolling your own.

Advanced Function Types: Overloads, Call & Construct Signatures

TypeScript's function type system goes far beyond (x: string) => number. Overloads, call signatures, construct signatures, this parameters, and variadic tuple types give you precise control over how functions are typed, called, and composed. This section explores every advanced tool TypeScript provides for modeling function behavior at the type level.

Function Overloads

Function overloads let you describe a function that accepts different argument shapes and returns different types based on what's passed. You write one or more overload signatures (visible to callers) followed by a single implementation signature (visible only inside the function body).

typescript
// Overload signatures (what callers see)
function parseInput(input: string): string[];
function parseInput(input: number): number[];

// Implementation signature (must be compatible with ALL overloads)
function parseInput(input: string | number): string[] | number[] {
  if (typeof input === "string") {
    return input.split(",");
  }
  return Array.from({ length: input }, (_, i) => i);
}

Callers see only the overload signatures. Passing "a,b,c" returns string[]; passing 5 returns number[]. The implementation signature is not directly callable — parseInput(true) is an error even though the implementation accepts string | number.

The "Last Overload Wins for Inference" Rule

When TypeScript infers the type of an overloaded function (e.g., when you pass it as a callback), it uses the last overload signature. This catches many people off guard.

typescript
function toArray(value: string): string[];
function toArray(value: number): number[];
function toArray(value: string | number) {
  return [value];
}

// When inferring, TS picks the LAST overload:
type Result = ReturnType<typeof toArray>;
//   ^? number[]   <-- not string[], not string[] | number[]

Place the most general overload last if you want ReturnType and similar utilities to infer a useful type.

Overloads vs. Conditional Types vs. Union Parameters

You have three strategies for functions that behave differently based on input. Each has trade-offs:

| Approach | Best When | Drawback |
| --- | --- | --- |
| Overloads | Distinct input/output pairs; callers need per-case return types | Implementation body loses narrowing; ReturnType only sees last overload |
| Conditional return type | Return type is a direct function of the input type parameter | Harder to read; TS can struggle to narrow inside the body |
| Union parameters | All cases return the same type, or you only need a union return | Callers don't get narrowed return types per call |

typescript
// Conditional return type approach — single signature, computed return
function parse<T extends string | number>(
  input: T
): T extends string ? string[] : number[] {
  if (typeof input === "string") {
    return input.split(",") as any; // cast needed inside body
  }
  return [input] as any;
}

const a = parse("hello"); // string[]
const b = parse(42);      // number[]

Tip

Prefer conditional return types when you have 2 cases and a clean generic constraint. Use overloads when you have 3+ distinct call signatures or when the parameter shapes differ (not just types).

Call Signatures in Interfaces and Type Aliases

Beyond arrow-style (x: string) => number, TypeScript supports call signatures inside object types. This lets you define callable objects that also carry properties — something arrow syntax can't express.

typescript
// Overloaded call signatures inside an interface
interface StringParser {
  (input: string): string[];
  (input: string, delimiter: RegExp): string[];
  readonly maxLength: number;
}

// Usage — the object is callable AND has properties
declare const parser: StringParser;
const parts = parser("a-b-c", /-/);  // string[]
console.log(parser.maxLength);         // number

Multiple call signatures in the same type are equivalent to function overloads. Resolution follows the same top-to-bottom order.
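
One common way to actually construct a value of such a hybrid type is `Object.assign`, which merges a function with a property bag. A sketch (the default delimiter and the `maxLength` value are arbitrary choices):

```typescript
interface StringParser {
  (input: string): string[];
  (input: string, delimiter: RegExp): string[];
  readonly maxLength: number;
}

// The optional `delimiter` parameter satisfies both call signatures at once
const parser: StringParser = Object.assign(
  (input: string, delimiter: RegExp = /,/) => input.split(delimiter),
  { maxLength: 1024 }
);

const parts = parser("a-b-c", /-/); // ["a", "b", "c"]
const limit = parser.maxLength;     // 1024
```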

Construct Signatures

A construct signature describes something you call with new. You prefix the call signature with the new keyword.

typescript
interface UserConstructor {
  new (name: string, age: number): User;
}

interface User {
  name: string;
  age: number;
}

function createUser(Ctor: UserConstructor, name: string, age: number): User {
  return new Ctor(name, age);
}
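
Restating the types for a self-contained sketch: any class whose constructor matches the construct signature can be passed as `Ctor` (`BasicUser` is a hypothetical implementation):

```typescript
interface User {
  name: string;
  age: number;
}

interface UserConstructor {
  new (name: string, age: number): User;
}

function createUser(Ctor: UserConstructor, name: string, age: number): User {
  return new Ctor(name, age);
}

// Parameter properties make the constructor match UserConstructor exactly
class BasicUser implements User {
  constructor(public name: string, public age: number) {}
}

const alice = createUser(BasicUser, "Alice", 30); // { name: "Alice", age: 30 }
```

This is the standard way to write factories that are generic over "some class producing a `User`" rather than one specific class.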

Hybrid Types: Callable and Constructable

Some JavaScript values work both with and without new; Date is the classic example. You model this by combining call signatures and construct signatures in the same type.

typescript
interface DateLike {
  (): string;                        // callable — returns a string
  new (ms: number): Date;            // constructable — returns a Date
  new (dateStr: string): Date;
  parse(s: string): number;          // static method
}

// Models something like JavaScript's built-in Date

The this Parameter Type

TypeScript lets you declare a fake first parameter named this to enforce the calling context of a function. It's erased at runtime — it exists purely for type checking. This prevents a common class of bugs where methods are detached from their objects.

typescript
interface DOMHandler {
  element: HTMLElement;
  onClick(this: DOMHandler, event: MouseEvent): void;
}

const handler: DOMHandler = {
  element: document.body,
  onClick(event) {
    // `this` is typed as DOMHandler here — safe to access
    console.log(this.element.tagName);
  },
};

const detached = handler.onClick;
detached(new MouseEvent("click"));
// ❌ Error: The 'this' context of type 'void' is not assignable
//    to method's 'this' of type 'DOMHandler'

ThisParameterType<T> and OmitThisParameter<T>

Two built-in utility types let you extract or strip the this parameter from function types. These are essential when wrapping or rebinding methods.

typescript
type Fn = (this: { name: string }, age: number) => string;

type Context = ThisParameterType<Fn>;
//   ^? { name: string }

type Unbound = OmitThisParameter<Fn>;
//   ^? (age: number) => string

// Practical use: creating a bound version
function safeBind<T, A extends any[], R>(
  fn: (this: T, ...args: A) => R,
  thisArg: T
): (...args: A) => R {
  return fn.bind(thisArg);
}
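
Restating safeBind for a self-contained sketch, here's what callers get back: a function with the this parameter stripped (the `greet` function is illustrative):

```typescript
function safeBind<T, A extends any[], R>(
  fn: (this: T, ...args: A) => R,
  thisArg: T
): (...args: A) => R {
  return fn.bind(thisArg);
}

const greet = function (this: { name: string }, punctuation: string): string {
  return `Hi ${this.name}${punctuation}`;
};

// The returned function needs no `this` context; it's permanently bound
const boundGreet = safeBind(greet, { name: "Ada" });
const message = boundGreet("!"); // "Hi Ada!"
```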

Generic Function Types

There's a critical distinction between a generic type alias that describes a function and a type alias that describes a generic function. The placement of the type parameter changes everything.

typescript
// Generic is on the TYPE — T is fixed when you reference the alias
type FixedMapper<T> = (arr: T[]) => T[];
const numMapper: FixedMapper<number> = (arr) => arr.map((n) => n * 2);

// Generic is on the CALL — T is inferred fresh at each call site
type FlexMapper = <T>(arr: T[]) => T[];
const identity: FlexMapper = (arr) => arr;

identity([1, 2, 3]);      // T inferred as number
identity(["a", "b"]);     // T inferred as string

Key Distinction

With type Fn<T> = (x: T) => T, T is bound when you write Fn<number>. With type Fn = <T>(x: T) => T, T is bound at each call. Use call-site generics when a single function variable must work with multiple types across different invocations.

Extracting and Manipulating Function Types

TypeScript ships several utility types for deconstructing function types. These are invaluable when wrapping, decorating, or proxying functions.

typescript
function createUser(name: string, age: number, admin: boolean) {
  return { name, age, admin };
}

// Extract parameter types as a tuple
type Params = Parameters<typeof createUser>;
//   ^? [name: string, age: number, admin: boolean]

// Extract the return type
type UserObj = ReturnType<typeof createUser>;
//   ^? { name: string; age: number; admin: boolean }

// For classes — extract constructor parameter types
class Database {
  constructor(public host: string, public port: number) {}
}

type DBArgs = ConstructorParameters<typeof Database>;
//   ^? [host: string, port: number]

type DBInstance = InstanceType<typeof Database>;
//   ^? Database

These utility types use infer under the hood. Parameters<T> is defined roughly as T extends (...args: infer P) => any ? P : never. You can write your own variants for specialized extraction.
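
As a sketch of rolling your own: `MyParameters` below mirrors the built-in definition, and a forwarding wrapper shows the extracted tuple in use (`logEvent` and `forward` are illustrative names):

```typescript
// Same shape as the built-in Parameters<T>
type MyParameters<T> = T extends (...args: infer P) => any ? P : never;

function logEvent(name: string, count: number): string {
  return `${name}:${count}`;
}

// The extracted tuple types the forwarding wrapper precisely,
// including parameter names in editor tooltips
function forward(
  fn: typeof logEvent,
  ...args: MyParameters<typeof logEvent>
): ReturnType<typeof logEvent> {
  return fn(...args);
}

const line = forward(logEvent, "click", 3); // "click:3"
```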

Variadic Tuple Types with Rest Parameters

TypeScript 4.0 introduced variadic tuple types — the ability to spread generic tuple types inside other tuples. This unlocks precise typing for concat, curry, pipe, and compose patterns that were previously impossible without complex recursive types.

typescript
// Typed concat — preserves all element types
function concat<T extends readonly unknown[], U extends readonly unknown[]>(
  a: T,
  b: U
): [...T, ...U] {
  return [...a, ...b];
}

// The `readonly` constraints let `as const` tuples flow through unchanged
const result = concat([1, "hello"] as const, [true, 42] as const);
//    ^? [1, "hello", true, 42]

Typed pipe and compose

Variadic tuples make it possible to type a basic pipe that chains unary functions. Each function's input must match the previous function's output.

typescript
// Simple two-function pipe with full type safety
function pipe<A, B, C>(
  fn1: (a: A) => B,
  fn2: (b: B) => C
): (a: A) => C {
  return (a) => fn2(fn1(a));
}

const parseAndDouble = pipe(
  (s: string) => parseInt(s, 10),  // string => number
  (n: number) => n * 2              // number => number
);

parseAndDouble("21"); // 42

// Partial application using variadic tuples
function partial<T extends unknown[], R extends unknown[], Out>(
  fn: (...args: [...T, ...R]) => Out,
  ...headArgs: T
): (...tailArgs: R) => Out {
  return (...tailArgs) => fn(...headArgs, ...tailArgs);
}

const add = (a: number, b: number, c: number) => a + b + c;
const add10 = partial(add, 10);     // (b: number, c: number) => number
const add10and20 = partial(add, 10, 20); // (c: number) => number
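
Generalizing pipe beyond two functions is usually done with a stack of overload signatures over a loosely typed implementation. A sketch with three arities (real libraries generate a dozen or more):

```typescript
function pipeAll<A, B>(f1: (a: A) => B): (a: A) => B;
function pipeAll<A, B, C>(f1: (a: A) => B, f2: (b: B) => C): (a: A) => C;
function pipeAll<A, B, C, D>(
  f1: (a: A) => B,
  f2: (b: B) => C,
  f3: (c: C) => D
): (a: A) => D;
// Implementation uses `any` internally; the overloads keep callers type-safe
function pipeAll(...fns: Array<(x: any) => any>) {
  return (a: any) => fns.reduce((acc, fn) => fn(acc), a);
}

const describeLength = pipeAll(
  (s: string) => s.length,
  (n: number) => n * 2,
  (n: number) => `len:${n}`
);

const described = describeLength("abcd"); // "len:8"
```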

The NoInfer<T> Utility Type

NoInfer<T> (added in TypeScript 5.4) tells the compiler not to use a particular position for type inference. This is critical in generic functions where you want inference to flow in one direction — typically from one parameter to constrain another, rather than having the second parameter widen the first's inferred type.

typescript
// Without NoInfer — `defaultValue` widens the inferred type of T
function getOrDefault<T>(values: T[], defaultValue: T): T {
  return values.length > 0 ? values[0] : defaultValue;
}
// T inferred as string | number — probably not what you want
getOrDefault(["a", "b"], 42);

// With NoInfer — inference comes ONLY from `values`
function getOrDefaultSafe<T>(values: T[], defaultValue: NoInfer<T>): T {
  return values.length > 0 ? values[0] : defaultValue;
}
// Error: Argument of type 'number' is not assignable to 'string'
getOrDefaultSafe(["a", "b"], 42);

// Correct usage — defaultValue must match the inferred T
getOrDefaultSafe(["a", "b"], "fallback"); // OK, T is string

Watch Out

NoInfer<T> does not constrain T — it merely removes that position from the inference candidate set. If T can't be inferred from any other position, it falls back to its constraint (usually unknown). Always ensure at least one parameter position still participates in inference.

Putting It All Together

These features compose. Here's a realistic example: a type-safe event emitter that uses overloaded call signatures, generic function types, and Parameters extraction.

typescript
type EventMap = {
  login: (user: string, timestamp: number) => void;
  logout: (user: string) => void;
  error: (code: number, message: string) => void;
};

class TypedEmitter<Events extends Record<string, (...args: any[]) => void>> {
  private listeners = new Map<string, Function[]>();

  on<K extends keyof Events>(
    event: K,
    handler: Events[K]
  ): void {
    const handlers = this.listeners.get(event as string) ?? [];
    handlers.push(handler);
    this.listeners.set(event as string, handlers);
  }

  emit<K extends keyof Events>(
    event: K,
    ...args: Parameters<Events[K]>
  ): void {
    const handlers = this.listeners.get(event as string) ?? [];
    handlers.forEach((fn) => fn(...args));
  }
}

const emitter = new TypedEmitter<EventMap>();

emitter.on("login", (user, ts) => {
  // user: string, ts: number — fully inferred
  console.log(`${user} logged in at ${ts}`);
});

emitter.emit("login", "alice", Date.now());     // OK
emitter.emit("login", "alice");                  // Error: expected 3 args
emitter.emit("logout", "alice", Date.now());     // Error: expected 2 args

The Parameters<Events[K]> extraction ensures emit requires exactly the right arguments for each event. The generic K extends keyof Events connects the event name to its handler signature. No overloads needed — the mapped type does the heavy lifting.

Variance Annotations & Covariance/Contravariance

Variance describes how subtyping between complex types relates to subtyping between their type parameters. If Cat extends Animal, can you use a Producer<Cat> wherever a Producer<Animal> is expected? The answer depends on how the type parameter is used — and getting this wrong is a common source of subtle bugs in generic code.

TypeScript tracks four variance positions. Understanding them unlocks precise control over type safety in generics, callbacks, event systems, and state management.

The Four Variance Positions

| Variance | Keyword | Position | Assignability Direction |
| --- | --- | --- | --- |
| Covariant | out | Return / output only | Same as the type hierarchy (Cat → Animal) |
| Contravariant | in | Parameter / input only | Reversed from the type hierarchy (Animal → Cat) |
| Invariant | in out | Both input and output | No direction — must match exactly |
| Bivariant | (none) | Legacy function params | Both directions (unsound) |

Let's build intuition for each one using a concrete type hierarchy: Animal → Cat → Kitten.

Covariance — Output Positions

A type is covariant in T when T only appears in output (return) positions. Think of a producer — something that gives you values of type T but never accepts them. Since the container only emits values, it's safe to treat a producer of a more specific type as a producer of a more general type.

typescript
interface Animal { name: string }
interface Cat extends Animal { purrs: boolean }

// ReadonlyArray<T> is covariant in T — it only outputs T values
const cats: ReadonlyArray<Cat> = [{ name: "Whiskers", purrs: true }];
const animals: ReadonlyArray<Animal> = cats; // ✅ OK — Cat[] → Animal[]

// Why is this safe? Every Cat IS an Animal.
// When you read from `animals`, you get an Animal — which a Cat certainly is.
animals[0].name; // "Whiskers" — works perfectly

The key insight: if you can only read from a container, narrowing the element type is always safe. A basket of cats is a perfectly valid basket of animals — as long as no one tries to put a dog in it.

Contravariance — Input Positions

A type is contravariant in T when T only appears in input (parameter) positions. Think of a consumer — something that accepts values of type T but never returns them. Here the assignability direction reverses: a consumer of a broader type can safely be used where a consumer of a narrower type is expected.

typescript
type Handler<T> = (value: T) => void;

const handleAnimal: Handler<Animal> = (a) => console.log(a.name);
const handleCat: Handler<Cat> = handleAnimal; // ✅ OK — contravariant!

// Why? If a callback site will call handleCat(someCat),
// and handleAnimal can handle ANY Animal, it can certainly handle a Cat.
handleCat({ name: "Mittens", purrs: true }); // works — a Cat is an Animal

// The reverse is NOT safe:
const catOnlyHandler: Handler<Cat> = (c) => console.log(c.purrs);
// const animalHandler: Handler<Animal> = catOnlyHandler; // ❌ Error!
// animalHandler({ name: "Rex" }) would try to access .purrs on a Dog

This is the part that trips most people up. The reversal makes sense when you think about who calls the function. A function that can handle any animal is a more capable handler — it can be safely used in a context that only ever passes cats.

strictFunctionTypes matters

Contravariant function parameters are only enforced when strictFunctionTypes is enabled (part of strict mode). Without it, TypeScript uses bivariant checking for function parameters — allowing assignment in both directions. This is unsound but was TypeScript's original behavior for pragmatic reasons (DOM event handler compatibility). Always enable strict mode for correct variance checking.

Invariance — Both Positions

When T appears in both input and output positions, the type is invariant. No widening, no narrowing — T must match exactly. Mutable arrays are the classic example.

typescript
const cats: Array<Cat> = [{ name: "Whiskers", purrs: true }];

// Conceptually, Array<T> is invariant — T is used for both push() and indexing
const animals: Array<Animal> = cats; // TypeScript allows this, unsoundly!

// The hole it opens:
animals.push({ name: "Rex" }); // a plain Animal pushed into a Cat array
// cats[1].purrs               // 💥 undefined at runtime — Rex doesn't purr

In a fully sound type system that assignment would be rejected. TypeScript deliberately permits it: method parameters (like the one on push) are checked bivariantly even under strictFunctionTypes, so mutable arrays behave covariantly in practice. The invariance lesson still matters, because the unsoundness is real. Compare this directly with the ReadonlyArray example above. The only difference is mutability. ReadonlyArray<T> removes the write methods (push, splice, etc.), which moves T out of input positions, making it covariant — and genuinely safe this time. This is why preferring readonly types improves both safety and flexibility.
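
This mutability distinction translates directly into API design: take readonly parameters when you only read, and callers can pass arrays of any element subtype soundly. A minimal sketch (`listNames` is a hypothetical helper):

```typescript
interface Animal { name: string }
interface Cat extends Animal { purrs: boolean }

// A readonly parameter keeps Animal out of input positions: the function
// can't push into the array, so accepting a Cat[] here is fully sound
function listNames(animals: readonly Animal[]): string[] {
  return animals.map((a) => a.name);
}

const cats: Cat[] = [{ name: "Whiskers", purrs: true }];
const catNames = listNames(cats); // ["Whiskers"]
```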

Visualizing Variance

The diagram below shows a type hierarchy (Animal → Cat → Kitten) and how assignability flows for covariant producers versus contravariant consumers. Notice how the arrows reverse for consumers.

graph LR
    subgraph Type Hierarchy
        A["Animal (broad)"] --- B["Cat"]
        B --- C["Kitten (narrow)"]
    end

    subgraph "Producer<T> — Covariant (out)"
        PA["Producer<Animal>"]
        PB["Producer<Cat>"]
        PC["Producer<Kitten>"]
        PC -->|"assignable to"| PB
        PB -->|"assignable to"| PA
    end

    subgraph "Consumer<T> — Contravariant (in)"
        CA["Consumer<Animal>"]
        CB["Consumer<Cat>"]
        CC["Consumer<Kitten>"]
        CA -->|"assignable to"| CB
        CB -->|"assignable to"| CC
    end
    

For producers, assignability flows in the same direction as the type hierarchy: narrow → broad. For consumers, it reverses: broad → narrow. This is the core of variance.

Explicit Variance Annotations (TypeScript 4.7+)

TypeScript can infer variance by structurally inspecting how a type parameter is used. But starting with TypeScript 4.7, you can declare variance explicitly using in and out keywords on type parameters. This serves three purposes: performance, documentation, and error prevention.

typescript
// Covariant — T only flows out
interface Producer<out T> {
  get(): T;
}

// Contravariant — T only flows in
interface Consumer<in T> {
  accept(value: T): void;
}

// Invariant — T flows both directions
interface Transform<in out T> {
  process(value: T): T;
}

// TypeScript will ERROR if your implementation contradicts the annotation:
// interface Broken<out T> {
//   accept(value: T): void; // ❌ Error: T is used in an 'in' position
// }

Why bother with explicit annotations?

Performance. For complex generic types with deep nesting, TypeScript must recursively inspect the structure to infer variance. Explicit annotations let the compiler skip this analysis entirely. In large codebases with types like Observable<T> used thousands of times, this measurably speeds up type checking.

Documentation. The in/out keywords communicate intent at a glance. When you see interface EventStream<out T>, you immediately know the type only produces T values — no method on it will accept a T parameter.

Error catching. If you annotate out T but later add a method that takes T as an input, TypeScript raises an error. This prevents accidental variance changes that could silently break downstream consumers.

Real-World Variance in Practice

Variance isn't academic — it directly shapes the types you use every day in production code.

Event Emitters

An event emitter's on() method takes a callback, and callback parameters are checked contravariantly: a handler for UIEvent (broad) can be used where a handler for MouseEvent (narrow) is expected, because the handler only receives events. Note the double flip this causes: T sits in the handler's parameter position, and the handler is itself a parameter of on(), so the emitter as a whole is covariant in its event type.

typescript
interface Emitter<out T> {
  on(handler: (event: T) => void): void;
}

// A MouseEvent emitter can accept a UIEvent handler
// because UIEvent is a supertype — handler contravariance at work
declare const clickEmitter: Emitter<MouseEvent>;
const uiHandler = (e: UIEvent) => console.log(e.type);
clickEmitter.on(uiHandler); // ✅ Safe — a UIEvent handler can handle any MouseEvent

Redux Reducers

A Redux reducer is (state: S, action: A) => S. The action parameter is in an input position (contravariant) while the return type is in an output position (covariant). Since S appears in both positions, the reducer is invariant in S. This is why you can't assign a Reducer<DerivedState> to a Reducer<BaseState> — even if DerivedState extends BaseState.
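
A sketch of that invariance with hypothetical state types:

```typescript
type Reducer<S> = (state: S, action: { type: string }) => S;

interface BaseState { count: number }
interface DerivedState extends BaseState { label: string }

const derivedReducer: Reducer<DerivedState> = (s) => ({ ...s, count: s.count + 1 });

// Neither direction compiles; S sits in both an input and an output position:
// const asBase: Reducer<BaseState> = derivedReducer;
//   ❌ its parameter demands the extra `label` field
// (a Reducer<BaseState> fails the other way: its return type lacks `label`)

const next = derivedReducer({ count: 1, label: "x" }, { type: "increment" });
// { count: 2, label: "x" }
```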

React Component Props

React's FC<Props> takes props as a function parameter, making it contravariant in Props. A component accepting { name: string } can be used where a component accepting { name: string; age: number } is expected — the broader input type handles the narrower case.

Rule of Thumb

When designing generic interfaces, ask: "Does T flow out (return values, readonly properties), in (parameters, writable properties), or both?" If your answer is "out only" or "in only," add the explicit variance annotation. It costs nothing and protects your API from accidental changes.

Advanced Class Patterns: Mixins, Abstract Classes & Decorators

TypeScript's class system goes far beyond simple inheritance. Once you understand mixins, abstract classes, and the new TC39 Stage 3 decorators, you can compose behavior, enforce contracts, and weave cross-cutting concerns into your code — all with full type safety. This section covers each pattern in depth, including the trade-offs that matter in real projects.

Mixins: Composing Behavior Without Deep Inheritance

JavaScript only supports single inheritance, but mixins let you compose multiple "slices" of behavior onto a class. The key insight is that a mixin is a function that takes a base class and returns a new class extending it. TypeScript makes this pattern fully type-safe with a specific constructor type.

The Mixin Constructor Pattern

Everything starts with this type alias. It describes "any constructor that produces an instance of T":

typescript
// The universal mixin constructor type
type Constructor<T = {}> = new (...args: any[]) => T;

// A mixin that adds timestamping behavior
function Timestamped<TBase extends Constructor>(Base: TBase) {
  return class extends Base {
    createdAt = new Date();
    updatedAt = new Date();

    touch() {
      this.updatedAt = new Date();
    }
  };
}

// A mixin that adds serialization
function Serializable<TBase extends Constructor>(Base: TBase) {
  return class extends Base {
    serialize(): string {
      return JSON.stringify(this);
    }
  };
}

Each mixin function receives a base class, returns a new anonymous class that extends it, and adds its own properties or methods. The generic TBase extends Constructor ensures TypeScript carries the base class type through the chain.

Composing Multiple Mixins

The real power shows when you stack mixins. Order matters — each mixin wraps the previous one, building an inheritance chain at runtime:

typescript
class User {
  constructor(public name: string, public email: string) {}
}

// Compose: User → Timestamped → Serializable
const EnhancedUser = Serializable(Timestamped(User));

const user = new EnhancedUser("Alice", "alice@example.com");
user.touch();                  // ✅ from Timestamped
const json = user.serialize(); // ✅ from Serializable
console.log(user.name);        // ✅ from User
console.log(user.createdAt);   // ✅ from Timestamped

Constrained Mixins

Sometimes a mixin only makes sense on classes that already have certain properties. You constrain the base class by narrowing the Constructor type parameter:

typescript
// This mixin REQUIRES the base class to have a `name` property
interface Nameable {
  name: string;
}

function Greetable<TBase extends Constructor<Nameable>>(Base: TBase) {
  return class extends Base {
    greet() {
      return `Hello, I'm ${this.name}`;
    }
  };
}

// ✅ Works — User has `name`
const GreetableUser = Greetable(User);

// ❌ Compile error — { id: number } has no `name`
class Widget { constructor(public id: number) {} }
const GreetableWidget = Greetable(Widget); // Error!

Why not just use multiple inheritance?

Mixins avoid the "diamond problem" because each mixin creates a linear chain — there's always a single, unambiguous prototype path. This is why they're the idiomatic pattern for behavior composition in TypeScript and JavaScript.
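The linearity is easy to verify at runtime by walking the prototype chain of a composed class. A minimal, self-contained sketch (re-declaring the mixins from above in compressed form):

```typescript
// Each mixin application adds exactly one link to the prototype chain,
// so method lookup always follows a single, unambiguous path.
type Constructor<T = {}> = new (...args: any[]) => T;

function Timestamped<TBase extends Constructor>(Base: TBase) {
  return class extends Base { createdAt = new Date(); };
}

function Serializable<TBase extends Constructor>(Base: TBase) {
  return class extends Base {
    serialize(): string { return JSON.stringify(this); }
  };
}

class User {
  constructor(public name: string) {}
}

const EnhancedUser = Serializable(Timestamped(User));

// Walk the chain from an instance up to the root
const chainLength = (() => {
  let depth = 0;
  let proto = Object.getPrototypeOf(new EnhancedUser("Alice"));
  while (proto !== null) {
    depth++;
    proto = Object.getPrototypeOf(proto);
  }
  return depth;
})();

console.log(chainLength); // 4: Serializable layer, Timestamped layer, User, Object.prototype
```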

Abstract Classes: Contracts with Implementation

Abstract classes sit between interfaces and concrete classes. They can define contracts that subclasses must implement (abstract members) while also providing shared implementation that subclasses inherit for free. This dual nature is their superpower — and the reason you'd choose them over interfaces.

Abstract Methods and Properties

typescript
abstract class Shape {
  abstract readonly sides: number;       // abstract property
  abstract area(): number;               // abstract method

  // Concrete method — shared by all subclasses
  describe(): string {
    return `A ${this.sides}-sided shape with area ${this.area().toFixed(2)}`;
  }
}

class Circle extends Shape {
  readonly sides = 0; // technically infinite, but we model it as 0
  constructor(private radius: number) { super(); }
  area() { return Math.PI * this.radius ** 2; }
}

class Rectangle extends Shape {
  readonly sides = 4;
  constructor(private w: number, private h: number) { super(); }
  area() { return this.w * this.h; }
}

// ❌ Cannot instantiate abstract class
const s = new Shape(); // Error!

Abstract Classes vs. Interfaces

Both define contracts, but they serve different purposes. The right choice depends on whether you need shared runtime behavior. Here's a concrete comparison:

| Feature | Interface | Abstract Class |
|---|---|---|
| Runtime presence | Erased at compile time — zero JS output | Emits a real JS class |
| Implementation code | ❌ No method bodies | ✅ Can include concrete methods |
| Multiple inheritance | ✅ A class can implement many | ❌ A class can extend only one |
| Constructor signatures | Cannot enforce (no new in interfaces directly) | ✅ Can define protected / public constructors |
| instanceof checks | ❌ Not possible (erased) | ✅ Works at runtime |
| Access modifiers | Only public and readonly | public, protected, private |

Use an interface when you only need a shape contract and want maximum flexibility (multiple implements, no runtime cost). Use an abstract class when subclasses need to share real implementation — template method pattern, common utilities, or when you need instanceof checks at runtime.

Decorators: TC39 Stage 3 (TypeScript 5.0+)

TypeScript 5.0 shipped support for the TC39 Stage 3 decorator proposal — a complete redesign from the older "experimental" decorators. The new decorators are standardized, don't require the experimentalDecorators flag, and work through a well-defined context object instead of property descriptors. They're the future of decorators in JavaScript and TypeScript alike.

Decorator Types at a Glance

The new proposal defines five decorator targets. Each receives the decorated value (or undefined for fields) plus a context object describing what's being decorated:

| Decorator Target | Receives | Can Return |
|---|---|---|
| Class | The class itself | A replacement class (or void) |
| Method | The method function | A replacement function (or void) |
| Getter / Setter | The getter or setter function | A replacement function (or void) |
| Field | undefined | An initializer function that transforms the initial value |
| Auto-accessor | A { get, set } object | A replacement { get, set, init } object |

Class Decorators

A class decorator wraps or replaces an entire class. It receives the class constructor and a context object. This is ideal for registering classes, adding metadata, or sealing the prototype:

typescript
type ClassDecorator = (
  target: Function,
  context: ClassDecoratorContext
) => Function | void;

function sealed(target: Function, context: ClassDecoratorContext) {
  Object.seal(target);
  Object.seal(target.prototype);
}

@sealed
class BankAccount {
  constructor(public balance: number) {}
  deposit(amount: number) { this.balance += amount; }
}

Method Decorators

Method decorators receive the original method and return a replacement. The context object gives you the method name, whether it's static, the access object for private members, and an addInitializer hook:

typescript
function logged(
  target: Function,
  context: ClassMethodDecoratorContext
) {
  const methodName = String(context.name);
  return function (this: any, ...args: any[]) {
    console.log(`→ ${methodName}(${args.map(a => JSON.stringify(a)).join(", ")})`);
    const result = target.call(this, ...args);
    console.log(`← ${methodName} returned ${JSON.stringify(result)}`);
    return result;
  };
}

class Calculator {
  @logged
  add(a: number, b: number): number {
    return a + b;
  }
}

new Calculator().add(2, 3);
// → add(2, 3)
// ← add returned 5

Field Decorators and the accessor Keyword

Field decorators can't directly replace a field value — they receive undefined as the first argument and return an initializer function that transforms the initial value. For more power, TypeScript 5.0 introduces the accessor keyword for auto-accessor fields, which gives decorators get/set hooks:

typescript
// Auto-accessor decorator for validation
function range(min: number, max: number) {
  return function (
    target: ClassAccessorDecoratorTarget<any, number>,
    context: ClassAccessorDecoratorContext<any, number>
  ): ClassAccessorDecoratorResult<any, number> {
    return {
      get(this: any) {
        return target.get.call(this);
      },
      set(this: any, value: number) {
        if (value < min || value > max) {
          throw new RangeError(`${String(context.name)} must be between ${min} and ${max}`);
        }
        target.set.call(this, value);
      },
      init(value: number) {
        if (value < min || value > max) {
          throw new RangeError(`Initial ${String(context.name)} must be between ${min} and ${max}`);
        }
        return value;
      }
    };
  };
}

class Player {
  @range(0, 100)
  accessor health: number = 100;  // the `accessor` keyword creates a backing store

  @range(1, 50)
  accessor level: number = 1;
}

const p = new Player();
p.health = 80;   // ✅ OK
p.health = 150;  // ❌ RangeError: health must be between 0 and 100
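For contrast, a plain field decorator (no accessor keyword) can only hook the initial value, via the initializer-transform mechanism described above. A minimal sketch using a hypothetical trimmed decorator:

```typescript
// Field decorators receive `undefined` and may return a transform
// (initialValue) => newValue that runs when each instance is constructed.
function trimmed<This>(
  _value: undefined,
  context: ClassFieldDecoratorContext<This, string>
) {
  return function (this: This, initialValue: string): string {
    return initialValue.trim();
  };
}

class SignupForm {
  @trimmed
  username = "  alice  ";
}

console.log(new SignupForm().username); // "alice"
```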

Decorator Factories (Parameterized Decorators)

The @range(0, 100) syntax above is a decorator factory — a function that returns a decorator. This pattern lets you configure decorator behavior per use-site:

typescript
// Factory pattern: outer function takes config, returns the decorator
function memoize(maxSize: number = 100) {
  return function (
    target: Function,
    context: ClassMethodDecoratorContext
  ) {
    const cache = new Map<string, any>();
    return function (this: any, ...args: any[]) {
      const key = JSON.stringify(args);
      if (cache.has(key)) return cache.get(key);
      const result = target.call(this, ...args);
      if (cache.size >= maxSize) {
        // Evict the oldest entry; Map iterates keys in insertion order
        const oldestKey = cache.keys().next().value as string;
        cache.delete(oldestKey);
      }
      cache.set(key, result);
      return result;
    };
  };
}

class MathService {
  @memoize(50)
  fibonacci(n: number): number {
    if (n <= 1) return n;
    return this.fibonacci(n - 1) + this.fibonacci(n - 2);
  }
}

The Decorator Context Object

Every TC39 decorator receives a context object as its second argument. This object is the standardized replacement for the old descriptor-based approach. Its shape varies by decorator kind, but the core properties are shared:

typescript
interface DecoratorContext {
  kind: "class" | "method" | "getter" | "setter" | "field" | "accessor";
  name: string | symbol;
  static: boolean;              // true if decorating a static member
  private: boolean;             // true if decorating a private member
  access: {                     // read/write hooks (varies by kind)
    get?(obj: any): any;
    set?(obj: any, value: any): void;
    has?(obj: any): boolean;
  };
  addInitializer(fn: () => void): void; // run code during construction
  metadata: Record<string | symbol, any>;  // shared metadata object
}

The addInitializer method is particularly powerful. For class decorators it runs after the class is fully defined; for method/field decorators it runs during instance construction. This replaces many use cases that previously needed constructor patching.
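As a sketch of what addInitializer enables, here is the classic auto-binding pattern: a hypothetical bound decorator that rebinds a method to each instance during construction, so the method survives being detached from its object:

```typescript
function bound(originalMethod: any, context: ClassMethodDecoratorContext) {
  const methodName = context.name;
  // Runs during construction, before user code touches the instance
  context.addInitializer(function (this: any) {
    this[methodName] = this[methodName].bind(this);
  });
}

class Counter {
  count = 0;

  @bound
  increment() {
    this.count++;
  }
}

const counter = new Counter();
const detached = counter.increment; // would lose `this` without @bound
detached();
console.log(counter.count); // 1
```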

Decorator Execution Order

Decorators evaluate in a specific, deterministic order, and understanding it is critical when decorators have side effects or depend on each other. The rules: decorator expressions (including factory calls) evaluate top-to-bottom in source order, stacked decorators on a single member apply bottom-to-top (inside-out), and initializers run top-down during construction.

mermaid
sequenceDiagram
    participant E as Engine
    participant F as Field Decorators
    participant A as Accessor Decorators
    participant M as Method Decorators
    participant C as Class Decorator

    Note over E: Class definition begins
    E->>F: 1. Evaluate field decorators (top to bottom)
    E->>A: 2. Evaluate accessor decorators (top to bottom)
    E->>M: 3. Evaluate method decorators (top to bottom)
    Note over M: For each method: if multiple decorators, evaluate right-to-left (bottom-to-top)
    E->>C: 4. Evaluate class decorators (bottom-to-top)
    Note over E: Class definition complete
    Note over E: Instance construction (new)
    E->>M: 5. Method initializers run (top to bottom)
    E->>F: 6. Field initializers run (top to bottom)
    E->>A: 7. Accessor initializers run (top to bottom)
    Note over E: Instance ready

When multiple decorators are stacked on the same target, they compose like mathematical functions — the outermost decorator wraps the result of the inner one. @A @B method() evaluates as A(B(method)).
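You can observe both orders with a factory that logs when it is evaluated and when its decorator is applied (a minimal sketch):

```typescript
const order: string[] = [];

function deco(label: string) {
  order.push(`eval ${label}`); // factory calls run top-to-bottom
  return function (value: any, _context: ClassMethodDecoratorContext) {
    order.push(`apply ${label}`); // decorators apply bottom-to-top
    return value;
  };
}

class Demo {
  @deco("A")
  @deco("B")
  method() {}
}

console.log(order); // ["eval A", "eval B", "apply B", "apply A"]
```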

Practical Decorator Patterns

Dependency Injection with Class Decorators

typescript
const registry = new Map<string, any>();

function injectable(target: any, context: ClassDecoratorContext) {
  const name = String(context.name);
  context.addInitializer(function () {
    registry.set(name, target);
  });
}

function inject(serviceName: string) {
  return function (
    _value: undefined,
    context: ClassFieldDecoratorContext
  ) {
    return function () {
      const ServiceClass = registry.get(serviceName);
      if (!ServiceClass) throw new Error(`Service "${serviceName}" not registered`);
      return new ServiceClass();
    };
  };
}

@injectable
class Logger {
  log(msg: string) { console.log(`[LOG] ${msg}`); }
}

class App {
  @inject("Logger")
  logger!: Logger;

  run() {
    this.logger.log("Application started");
  }
}

Legacy Decorators vs. TC39 Stage 3 Decorators

If you've worked with TypeScript before 5.0, you've likely used the experimentalDecorators flag. Those "legacy" decorators are based on an abandoned Stage 2 proposal and differ significantly from the standardized version. Both exist in TypeScript today, but they are mutually exclusive — you use one or the other, never both.

| Aspect | Legacy (experimentalDecorators) | TC39 Stage 3 (TS 5.0+) |
|---|---|---|
| Enabled via | "experimentalDecorators": true in tsconfig | No flag needed (default in TS 5.0+) |
| Parameter decorators | ✅ Supported | ❌ Not part of the proposal |
| API surface | (target, key, descriptor) | (value, context) |
| Metadata | Via reflect-metadata polyfill | Built-in context.metadata |
| accessor keyword | Not applicable | ✅ Auto-accessor fields |
| Standardization | Abandoned proposal | TC39 Stage 3 — shipping in browsers |
| Used by | Angular, NestJS, TypeORM (current) | New projects, future framework versions |

Migration Path

Frameworks like Angular and NestJS still rely on legacy decorators with reflect-metadata for parameter-based dependency injection. Don't migrate these projects to TC39 decorators until the framework itself migrates. For greenfield projects with no framework dependency on legacy decorators, use the TC39 standard directly.

Private Class Fields: #field vs. private

TypeScript offers two ways to mark a field as private, and they behave very differently at runtime. The private keyword is a compile-time-only annotation — it's erased in the emitted JavaScript, meaning any code can still access the field at runtime. The #field syntax (ECMAScript private fields) is enforced at runtime by the JavaScript engine itself.

typescript
class Wallet {
  private softSecret = "ts-private";  // compile-time only
  #hardSecret = "runtime-private";    // true JS private field

  reveal() {
    return { soft: this.softSecret, hard: this.#hardSecret };
  }
}

const w = new Wallet();

// At compile time:
w.softSecret;  // ❌ TS error: Property 'softSecret' is private
w.#hardSecret; // ❌ TS error: Property '#hardSecret' is not accessible

// At runtime (plain JS, or with type assertion):
(w as any).softSecret;  // ✅ "ts-private" — accessible!
(w as any).#hardSecret; // ❌ SyntaxError — truly private

| Feature | private (TS keyword) | #field (ES private) |
|---|---|---|
| Runtime enforcement | ❌ None — erased in JS | ✅ Engine-level enforcement |
| Visible in Object.keys() | ✅ Yes | ❌ No |
| Accessible via (x as any).field | ✅ Yes | ❌ No |
| Subclass access | ❌ TS error (but works at runtime) | ❌ Error at both compile & runtime |
| Works with decorators | ✅ Fully | ⚠️ Limited — decorators use context.access |
| Performance | Same as public fields | Slight overhead (WeakMap in older targets) |
| Serialization (JSON.stringify) | ✅ Included | ❌ Not included |

When to use which

Use private for general application code where compile-time checking is sufficient, tests may need to access internals, and you want easy serialization. Use #field for library code, security-sensitive fields, or any scenario where you need a hard runtime guarantee that external code cannot touch the value — even through as any casts.

The satisfies Operator & const Assertions

TypeScript's type inference is powerful, but sometimes it infers types that are either too wide or too narrow for your needs. Two features — as const assertions and the satisfies operator — give you precise control over how types are inferred without resorting to explicit type annotations that erase useful information.

This section covers each feature in depth, then shows how combining them unlocks the most type-safe patterns available in modern TypeScript.

const Assertions with as const

By default, TypeScript widens literal values. A let variable assigned "hello" gets type string, and an object's properties become their widened types. The as const assertion tells the compiler: "treat this entire value as deeply readonly with the narrowest possible literal types."

Preserving Literal Types on Variables

Without as const, even a const variable holding an object has mutable, widened property types. Compare:

typescript
const config = {
  env: "production",
  port: 3000,
  features: ["auth", "logging"],
};
// Type: { env: string; port: number; features: string[] }
// config.env could be reassigned to any string
// config.features could be pushed to
typescript
const config = {
  env: "production",
  port: 3000,
  features: ["auth", "logging"],
} as const;
// Type: {
//   readonly env: "production";
//   readonly port: 3000;
//   readonly features: readonly ["auth", "logging"];
// }

With as const, every property becomes readonly, string values become string literal types, numbers become numeric literal types, and arrays become readonly tuples. This is recursive — nested objects and arrays are all frozen at the type level.
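One caveat worth making explicit: as const is purely a type-level freeze. It does not call Object.freeze, so the restriction vanishes at runtime. A minimal sketch:

```typescript
const settings = {
  retries: 3,
  endpoints: { api: "/api" },
} as const;

// settings.retries = 5;          // ❌ compile error: read-only property
// settings.endpoints.api = "/x"; // ❌ compile error: readonly is recursive

// But nothing is frozen at runtime:
console.log(Object.isFrozen(settings)); // false
(settings as any).retries = 5;          // sails through at runtime
console.log((settings as any).retries); // 5
```

If you also need the runtime guarantee, pair as const with Object.freeze.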

Tuple Types from Arrays

One of the most practical uses of as const is converting arrays into tuple types. Without it, TypeScript has no way to know the exact length or positional types of an array literal.

typescript
// Without as const: inferred as string[]
const widened = ["red", "green", "blue"];

// With as const: inferred as readonly ["red", "green", "blue"]
const colors = ["red", "green", "blue"] as const;

// Now you can derive a union type from the tuple
type Color = (typeof colors)[number]; // "red" | "green" | "blue"

This pattern is a powerful alternative to enums. You define values once, and the type is derived automatically — no duplication, no runtime overhead beyond a plain array.

Const Objects as Enum Alternatives

TypeScript enums have quirks: numeric enums allow reverse mapping, const enum has bundler compatibility issues, and string enums can't be iterated. A const object with as const avoids all of these problems.

typescript
const HttpStatus = {
  OK: 200,
  NOT_FOUND: 404,
  INTERNAL_ERROR: 500,
} as const;

// Derive the union type from values
type HttpStatus = (typeof HttpStatus)[keyof typeof HttpStatus];
// 200 | 404 | 500

function handleStatus(status: HttpStatus) {
  // status is narrowed to 200 | 404 | 500
}

// You can also iterate over it — unlike string enums
Object.values(HttpStatus).forEach(console.log);

Note

The pattern type HttpStatus = (typeof HttpStatus)[keyof typeof HttpStatus] works because TypeScript allows a type and a value to share the same name. The const creates the value; the type alias creates the companion type. This is exactly what enum does under the hood.

The satisfies Operator (TypeScript 4.9+)

Before satisfies, you had two choices when defining a value: let TypeScript infer the type (risking missing properties or shape mismatches), or annotate the type explicitly with : Type (which widens the inferred type and loses literal information). The satisfies operator gives you both: validation and narrow inference.

The Problem: Type Annotations Widen

Consider a configuration object where you want to ensure it matches a specific interface but still preserve the exact literal values.

typescript
type RouteConfig = Record<string, { path: string; auth: boolean }>;

// Approach 1: Type annotation — validates shape, but widens keys
const routes: RouteConfig = {
  home: { path: "/", auth: false },
  dashboard: { path: "/dash", auth: true },
};
routes.home;      // ✅ compiles — but type is { path: string; auth: boolean }
routes.oops;      // ✅ also compiles! — any string key is valid
routes.home.path; // type is string, not "/"

The annotation : RouteConfig forces the variable's type to be Record<string, ...>. TypeScript no longer knows which keys actually exist — routes.oops compiles without error, and all literal values are widened.

The Solution: satisfies

The satisfies operator validates assignability to a type without replacing the inferred type. TypeScript checks that the expression matches RouteConfig, but the variable keeps its narrow inferred type.

typescript
type RouteConfig = Record<string, { path: string; auth: boolean }>;

const routes = {
  home: { path: "/", auth: false },
  dashboard: { path: "/dash", auth: true },
} satisfies RouteConfig;

routes.home;      // ✅ compiles — type is { path: string; auth: boolean }
routes.oops;      // ❌ ERROR: Property 'oops' does not exist
routes.home.path; // type is string (still widened — we'll fix this next)

Now TypeScript knows the exact keys (home and dashboard), rejects unknown keys like oops, and still validates that every entry matches the { path: string; auth: boolean } shape. However, notice that routes.home.path is still string, not "/". That's where combining satisfies with as const comes in.

Combining as const + satisfies

The real power emerges when you use both features together. as const preserves every literal type, and satisfies validates the shape — giving you maximum type precision with compile-time safety.

typescript
type RouteConfig = Record<string, { path: string; auth: boolean }>;

const routes = {
  home: { path: "/", auth: false },
  dashboard: { path: "/dash", auth: true },
} as const satisfies RouteConfig;

routes.home.path; // type is "/" (literal!)
routes.dashboard.auth; // type is true (literal!)
routes.oops;      // ❌ ERROR — unknown key still caught

// Derive exact types from the definition
type RouteName = keyof typeof routes; // "home" | "dashboard"
type HomePath = (typeof routes)["home"]["path"]; // "/"

The syntax is expression as const satisfies Typeas const comes first, satisfies second. Think of it as: "freeze this value, then verify it matches the shape." This is the gold standard for type-safe configuration objects in modern TypeScript.

Pattern: Type-Safe Record Definitions

A common challenge is defining a record where you need both exhaustive key coverage and narrow value types. The satisfies operator pairs well with mapped types to enforce that every key is present.

typescript
type Theme = "light" | "dark" | "system";

type ThemeColors = Record<Theme, { bg: string; fg: string }>;

const themeColors = {
  light: { bg: "#fff", fg: "#000" },
  dark:  { bg: "#1a1a1a", fg: "#eee" },
  system: { bg: "auto", fg: "auto" },
} as const satisfies ThemeColors;
// ✅ If you remove "system", TypeScript errors immediately
// ✅ Each value retains its literal type: bg is "#fff", not string

// Add a new variant to Theme? You get an instant compile error here
// until you add the matching entry.

This pattern guarantees exhaustive coverage: if someone adds a new Theme variant, the satisfies check forces you to add the corresponding entry in themeColors. No runtime surprises.
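The same exhaustiveness guarantee can be enforced in control flow with a never-typed default branch; a sketch reusing the Theme union:

```typescript
type Theme = "light" | "dark" | "system";

function themeLabel(theme: Theme): string {
  switch (theme) {
    case "light": return "Light";
    case "dark": return "Dark";
    case "system": return "Follow OS";
    default: {
      // If a new Theme variant is added, `theme` no longer narrows
      // to `never` here and this assignment becomes a compile error.
      const unreachable: never = theme;
      return unreachable;
    }
  }
}

console.log(themeLabel("dark")); // "Dark"
```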

Decision Framework: : Type vs. satisfies Type vs. as Type

These three constructs look similar but serve fundamentally different purposes. Choosing the wrong one leads to either type errors you don't understand or type safety gaps you don't notice.

| Syntax | What it Does | Narrows Type? | Validates Shape? | Safe? |
|---|---|---|---|---|
| const x: Type = expr | Declares the variable's type; inferred type is replaced | ❌ Widens to Type | ✅ Yes | ✅ Yes |
| const x = expr satisfies Type | Validates assignability; keeps inferred type | ✅ Keeps narrow type | ✅ Yes | ✅ Yes |
| const x = expr as Type | Type assertion — overrides the compiler's judgment | ⚠️ Forces the asserted type | ❌ No | ⚠️ Unsafe |

When to Use Each

  • : Type — Use when the variable will be reassigned or when you intentionally want the wider type (e.g., function parameters, mutable state, or when the literal values don't matter downstream).
  • satisfies Type — Use when you want to validate shape without losing type information. Ideal for configuration objects, route tables, theme definitions, or any readonly data where you need both safety and narrow types.
  • as Type — Use only as a last resort when you know something TypeScript can't prove. Common legitimate uses: DOM element casting (document.getElementById("x") as HTMLCanvasElement) or narrowing after external data validation.

Warning

Don't confuse as const with as Type. Despite sharing the as keyword, they do opposite things. as const narrows the type (making it more precise); as Type overrides the type (potentially lying to the compiler). The former is always safe; the latter can hide bugs.

Edge Cases and Limitations

as const Only Works on Literals

You can only apply as const to literal expressions — object literals, array literals, string/number/boolean literals, and template literal expressions. You cannot use it on a variable reference or function return value.

typescript
const obj = { a: 1 };
const frozen = obj as const;
//             ^^^^^^^^^^^^
// ❌ Error: 'const' assertions can only be applied to
// string, number, boolean, array, or object literals.

// ✅ Must apply directly to the literal
const alsoFrozen = { a: 1 } as const;

satisfies Doesn't Create a New Type

The satisfies operator is purely a compile-time check. It doesn't change the runtime value or create a new type — it only validates and preserves. This means you can't use it in type positions, and it has no effect on emitted JavaScript.
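A quick sketch of both points: the emitted value is untouched, and the operator cannot appear in type positions:

```typescript
type Point = { x: number; y: number };

const raw = { x: 1, y: 2 };
const checked = { x: 1, y: 2 } satisfies Point;

// Identical runtime values: `satisfies` leaves no trace in the JS output
console.log(JSON.stringify(checked) === JSON.stringify(raw)); // true

// And it is not usable where a type is expected:
// type P = typeof raw satisfies Point; // ❌ syntax error
```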

satisfies and Excess Property Checks

satisfies does perform excess property checks when the target type has known keys. If you satisfies against a type with specific properties (not Record<string, ...>), extra properties will be flagged.

typescript
type AppConfig = { port: number; host: string };

const config = {
  port: 3000,
  host: "localhost",
  debug: true, // ❌ Error: 'debug' does not exist in type 'AppConfig'
} satisfies AppConfig;

Order Matters: as const satisfies, not satisfies as const

When combining, the order is always as const satisfies Type. Writing it the other way around is a syntax error. Conceptually, as const modifies the expression first (narrowing all literals), and then satisfies validates the resulting narrowed type against your target type.

Tip

When as const satisfies validation fails, the error can be confusing because the as const has already narrowed your types to literals (e.g., 200 instead of number). If the target type expects number, it still works because 200 is assignable to number. But if the target type expects a specific literal like 201, you'll see an error about 200 not being assignable to 201.

Declaration Merging & Module Augmentation

Declaration merging is one of TypeScript's most unique features — and one of the least understood. It allows multiple declarations with the same name to combine into a single definition. Module augmentation builds on this to let you extend third-party types without touching their source code. Together, they're the foundation for typing plugins, middleware, and library extensions.

Which Declarations Can Merge?

Not all declarations are created equal when it comes to merging. TypeScript follows strict rules about what can combine with what. The most important rule: interfaces merge, type aliases don't. If you declare the same type alias twice, you get an error. But two interface declarations with the same name silently combine into one.
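A minimal sketch of that asymmetry:

```typescript
// Two interface declarations with the same name merge silently:
interface Box { width: number; }
interface Box { height: number; }

const b: Box = { width: 1, height: 2 }; // ✅ merged shape requires both

// Type aliases never merge; redeclaring one is a hard error:
// type Size = { width: number };
// type Size = { height: number }; // ❌ Duplicate identifier 'Size'

console.log(b.width + b.height); // 3
```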

| Declaration | Merges with Interface | Merges with Namespace | Merges with Class | Merges with Function | Merges with Enum |
|---|---|---|---|---|---|
| Interface | ✅ Yes | ✅ Yes | ✅ Yes | ❌ No | ❌ No |
| Namespace | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes |
| Type Alias | ❌ No | ❌ No | ❌ No | ❌ No | ❌ No |
| Class | ✅ Yes | ✅ Yes | ❌ No | ❌ No | ❌ No |

Namespaces are the most flexible merger — they can attach to classes, functions, and enums. This is what powers many advanced patterns in the TypeScript ecosystem.

Interface Merging in Detail

When two interfaces share the same name in the same scope, TypeScript merges their members into a single interface. This sounds simple, but the merging rules have specific behaviors you need to know about — especially around property types and method overloads.

Property Types Must Be Identical

If both interface declarations define a property with the same name, the types must be exactly the same. TypeScript won't attempt a union or intersection — it simply errors if there's a mismatch.

typescript
interface User {
  id: number;
  name: string;
}

interface User {
  id: number;    // ✅ OK — same type
  email: string; // ✅ OK — new property
}

// Merged result:
// interface User { id: number; name: string; email: string; }

interface User {
  id: string; // ❌ Error! 'string' is not 'number'
}

Method Overloads: Later Declarations Win

For methods, each redeclaration is treated as an overload. The key detail: overloads from later interface declarations are placed first in the overload list, giving them higher priority during resolution. Within a single interface block, overloads retain their original order.

typescript
interface Logger {
  log(msg: string): void;        // overload A
  log(msg: string[]): void;      // overload B
}

interface Logger {
  log(msg: number): void;        // overload C
  log(msg: boolean): void;       // overload D
}

// Effective overload order: C, D, A, B
// Later interface's methods come first!

Note

There's one exception to the "later wins" rule: overloads with a string literal parameter type are always promoted to the top, regardless of declaration order. This ensures that specific string-based dispatching (like event names) takes priority over generic handlers.
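This promotion is what makes event-style APIs resolve correctly after merging. A minimal sketch with a hypothetical Bus interface:

```typescript
interface Bus {
  // Generic fallback declared first...
  on(event: string, handler: (data: unknown) => void): void;
}

interface Bus {
  // ...but the string-literal overload is promoted to the top of the
  // overload list, so "error" handlers see the specific Error type.
  on(event: "error", handler: (data: Error) => void): void;
}

// Minimal runtime implementation for illustration
const bus: Bus = {
  on(event: string, handler: (data: any) => void) {
    if (event === "error") handler(new Error("boom"));
  },
};

let message = "";
bus.on("error", (e) => { message = e.message; }); // e is typed as Error
console.log(message); // "boom"
```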

Namespace Merging with Classes (Companion Object Pattern)

One of the most practical merging patterns is combining a namespace with a class. The namespace's exported members become static properties on the class. This is the "companion object" pattern — common in libraries that need both a class and related static utilities or types under the same name.

typescript
class Validator {
  validate(input: string): boolean {
    return Validator.emailRegex.test(input);
  }
}

namespace Validator {
  export const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  export type Result = { valid: boolean; errors: string[] };
}

// Usage — static members live on the class
const v = new Validator();
v.validate("test@example.com");

// Namespace members accessible as statics
Validator.emailRegex;
const result: Validator.Result = { valid: true, errors: [] };

The namespace must be declared after the class. Exported values become static properties; exported types become associated types accessible via Validator.Result.

Namespace Merging with Functions and Enums

The same pattern works for functions and enums. Merging a namespace with a function lets you attach properties to a callable — a common JavaScript pattern that TypeScript can now type correctly.

typescript
// Function + Namespace: callable with properties
function greet(name: string): string {
  return `Hello, ${name}!`;
}
namespace greet {
  export const defaultName = "World";
  export function loud(name: string): string {
    return greet(name).toUpperCase();
  }
}

greet("Alice");            // "Hello, Alice!"
greet.defaultName;         // "World"
greet.loud("Bob");         // "HELLO, BOB!"
typescript
// Enum + Namespace: adding computed/utility members
enum Color {
  Red = "RED",
  Green = "GREEN",
  Blue = "BLUE",
}
namespace Color {
  export function fromHex(hex: string): Color {
    const map: Record<string, Color> = {
      "#FF0000": Color.Red,
      "#00FF00": Color.Green,
      "#0000FF": Color.Blue,
    };
    return map[hex.toUpperCase()] ?? Color.Red;
  }
}

Color.fromHex("#00FF00"); // Color.Green

Module Augmentation

Declaration merging works within a single codebase, but what about extending types from node_modules? That's where module augmentation comes in. By using declare module 'module-name' inside a module file (one that has a top-level import or export), you can add new members to existing module interfaces.

Extending Express Request & Response

The most common real-world use: adding custom properties to Express's Request object after middleware attaches them. You augment the express-serve-static-core module, since that's where Express's actual Request interface lives.

typescript
// types/express.d.ts
import { User } from "../models/user";

declare module "express-serve-static-core" {
  interface Request {
    user?: User;
    requestId: string;
  }
}

// Now in your route handlers:
// app.get("/profile", (req, res) => {
//   console.log(req.user);      // ✅ typed as User | undefined
//   console.log(req.requestId); // ✅ typed as string
// });

Extending Window and ProcessEnv

The same technique works for browser globals and Node.js environment variables. You target the module or global interface that owns the type you want to extend.

typescript
// types/env.d.ts
export {};  // Make this a module file

declare global {
  interface Window {
    analytics: {
      track(event: string, data?: Record<string, unknown>): void;
    };
  }
}

// Node.js environment variables
declare global {
  namespace NodeJS {
    interface ProcessEnv {
      DATABASE_URL: string;
      API_KEY: string;
      NODE_ENV: "development" | "production" | "test";
    }
  }
}

// Now fully typed:
// window.analytics.track("page_view");
// process.env.DATABASE_URL  // string, not string | undefined

Global Augmentation

The declare global { ... } block in the example above is global augmentation. It lets you add to the global scope from within a module file. Without it, any declarations in a module file would be scoped to that module. The declare global wrapper "breaks out" of module scope to modify global types like Window, Array, or String.

The critical detail: the file must be a module (contain at least one import or export at the top level). That lone export {} you often see at the top of .d.ts files exists solely to make the file a module so that declare global works.

typescript
// global-extensions.ts
export {};  // ← required to make this a module

declare global {
  interface Array<T> {
    sortedBy(key: keyof T): T[];
  }
}

// Implementation (separate file or same file)
Array.prototype.sortedBy = function <T>(this: T[], key: keyof T): T[] {
  return [...this].sort((a, b) => {
    // Cast needed: relational operators require number/string operands,
    // which T[keyof T] can't guarantee. Works at runtime for both.
    const x = a[key] as unknown as number;
    const y = b[key] as unknown as number;
    return x > y ? 1 : x < y ? -1 : 0;
  });
};

Declaration Files (.d.ts) and the declare Keyword

Ambient declarations describe types for code that exists at runtime but wasn't written in TypeScript. The declare keyword tells the compiler "trust me — this exists at runtime" without emitting any JavaScript. Files ending in .d.ts are declaration files — they contain only type information, no runtime code.

| Syntax | Purpose | Where |
| --- | --- | --- |
| declare module 'x' | Describe or augment an external module | .d.ts or module file |
| declare global { } | Add to global scope from a module | Module files only |
| declare namespace X | Describe a global namespace object | .d.ts files |
| declare function / declare const | Describe a global value | .d.ts files |
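
The last two rows describe plain ambient values. As a sketch, here is a global script declaration file for a hypothetical CDN-loaded snippet — every name below is illustrative, not a real library:

```typescript
// globals.d.ts — no top-level import/export, so this stays a global script

// A global constant injected by the build pipeline or a <script> tag
declare const BUILD_VERSION: string;

// A global function the snippet exposes
declare function trackLite(event: string, params?: Record<string, unknown>): void;

// A global namespace object with members
declare namespace AnalyticsSDK {
  function identify(userId: string): void;
  const initialized: boolean;
}
```

Because the file has no top-level import or export, these declarations are visible project-wide without any import.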

import type vs import in Ambient Contexts

In declaration files, you often need to reference types from other modules. A regular import at the top level turns the .d.ts file into a module, which changes how its declarations are scoped. If you want the file to remain a global script (ambient declarations visible everywhere), use import('module') inline inside a type position instead.

typescript
// ❌ Top-level import makes this a module — ambient declarations stop working
import { User } from "./models/user";
declare const currentUser: User;  // only visible in this module

// ✅ Inline import() keeps the file as a global script
declare const currentUser: import("./models/user").User;  // visible everywhere
Warning

A .d.ts file with a top-level import or export becomes a module, and its ambient declarations (declare const, declare function) are no longer globally visible. This is the #1 source of "my types aren't being picked up" bugs. When augmenting modules, you want module scope. For global ambient declarations, avoid top-level imports.

Practical Patterns

Augmenting a React Component Library

Libraries like Material UI expose theme interfaces that you're expected to extend with your own design tokens. Module augmentation makes this type-safe.

typescript
// types/mui-theme.d.ts
import "@mui/material/styles";

declare module "@mui/material/styles" {
  interface Palette {
    brand: Palette["primary"];
  }
  interface PaletteOptions {
    brand?: PaletteOptions["primary"];
  }
  interface Theme {
    layout: { maxWidth: number; sidebarWidth: number };
  }
  interface ThemeOptions {
    layout?: { maxWidth?: number; sidebarWidth?: number };
  }
}

// Now theme.palette.brand and theme.layout are fully typed

Adding Custom Matchers to Jest

Custom Jest matchers (via expect.extend) need type declarations so TypeScript knows about expect(x).toBeWithinRange(). You augment the jest module's Matchers interface.

typescript
// types/jest-custom.d.ts
export {};

declare global {
  namespace jest {
    interface Matchers<R> {
      toBeWithinRange(floor: number, ceiling: number): R;
      toBeISODate(): R;
    }
  }
}

Extending Vue's Component Options

Vue allows plugins to add custom component options. The pattern is the same — augment the module that defines ComponentCustomOptions.

typescript
// types/vue-custom.d.ts
import "vue";

declare module "vue" {
  interface ComponentCustomOptions {
    permissions?: string[];
  }
  interface ComponentCustomProperties {
    $auth: {
      user: { id: string; role: string } | null;
      logout(): Promise<void>;
    };
  }
}

// In components:
// this.$auth.user?.role  ← fully typed
// export default { permissions: ["admin"] }  ← typed option
Tip

Place all your augmentation files in a types/ directory and make sure it's included in tsconfig.json's "include" array (or referenced via "typeRoots"). A common convention is types/express.d.ts, types/jest-custom.d.ts, etc. — one file per library you're augmenting. This keeps augmentations discoverable and prevents them from silently getting excluded from compilation.
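
A minimal sketch of the corresponding tsconfig.json — the paths are illustrative and should match your own layout:

```json
{
  "compilerOptions": {
    "strict": true
  },
  "include": ["src/**/*", "types/**/*.d.ts"]
}
```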

Advanced Error Handling with Result/Either Types

Exceptions are the default error-handling mechanism in JavaScript and TypeScript, but they come with a fundamental flaw: they are invisible in the type system. A function signature like function parseJSON(input: string): User reveals nothing about the fact that it can throw. This section introduces Result<T, E> — a type-safe alternative that makes errors explicit, composable, and exhaustively checkable at compile time.

The Problem with Exceptions

Consider a seemingly innocent function call. TypeScript's type checker sees a clean return type, but the runtime reality is far messier.

typescript
// This type signature LIES — it can throw at least 3 different errors
async function getUser(id: string): Promise<User> {
  const response = await fetch(`/api/users/${id}`); // NetworkError
  const data = JSON.parse(await response.text());   // SyntaxError
  return validateUser(data);                        // ValidationError
}

// The caller has no idea what to catch
try {
  const user = await getUser("123");
} catch (e) {
  // 'e' is 'unknown' — TypeScript can't help you here
  // Did fetch fail? Did parsing fail? Was the data invalid?
}

There are three concrete problems here. First, invisible failure modes — nothing in getUser's signature tells you it can fail, let alone how. Second, untyped catch blocks — with useUnknownInCatchVariables (introduced in TypeScript 4.4 and enabled under strict), caught errors are typed unknown, so you lose all type information at the exact moment you need it most. Third, no compiler enforcement — if a new error case is added to getUser, the compiler won't tell callers to handle it.

The Result Type: Making Errors Explicit

The Result<T, E> type is a discriminated union with two variants: Ok carrying a success value, and Err carrying an error value. Because the error is part of the return type, the compiler forces every caller to acknowledge it.

typescript
type Result<T, E> = 
  | { ok: true; value: T }
  | { ok: false; error: E };

// Now the type signature tells the FULL story
function getUser(id: string): Result<User, NetworkError | ParseError | ValidationError> {
  // ...
}

const result = getUser("123");
if (result.ok) {
  console.log(result.value.name); // ✅ TypeScript knows 'value' exists
} else {
  console.error(result.error);    // ✅ TypeScript knows 'error' exists
}
Discriminated Unions Are the Key

The ok field acts as the discriminant. When you check result.ok, TypeScript narrows the type automatically — you get value in the true branch and error in the false branch. This is the same mechanism that powers tagged unions everywhere in TypeScript.

Implementing Result Utility Functions

A bare discriminated union works, but becomes tedious to construct and compose. These utility functions form the minimal toolkit you need to work with Result ergonomically.

Constructors: ok() and err()

typescript
function ok<T>(value: T): Result<T, never> {
  return { ok: true, value };
}

function err<E>(error: E): Result<never, E> {
  return { ok: false, error };
}

// Usage — clean and readable
function divide(a: number, b: number): Result<number, string> {
  return b === 0 ? err("Division by zero") : ok(a / b);
}

Using never in the unused type parameter is deliberate. It makes ok(42) assignable to Result<number, E> for any E, and err("oops") assignable to Result<T, string> for any T. This is what makes composition work seamlessly.
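
To see this assignability in action, here is a small compile-time check (the definitions are restated so the snippet stands alone):

```typescript
// Definitions restated so this snippet is self-contained
type Result<T, E> = { ok: true; value: T } | { ok: false; error: E };
const ok = <T>(value: T): Result<T, never> => ({ ok: true, value });
const err = <E>(error: E): Result<never, E> => ({ ok: false, error });

// The same ok(42) satisfies wildly different error types,
// because never is assignable to everything
const a: Result<number, string> = ok(42);
const b: Result<number, { _tag: "AuthError" }> = ok(42);

// Likewise, err("boom") satisfies any success type
const c: Result<string, string> = err("boom");
```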

Transformers: map(), flatMap(), match(), and unwrapOr()

typescript
// Transform the success value, skip if error
function map<T, U, E>(
  result: Result<T, E>,
  fn: (value: T) => U
): Result<U, E> {
  return result.ok ? ok(fn(result.value)) : result;
}

// Chain operations that themselves return Result
function flatMap<T, U, E>(
  result: Result<T, E>,
  fn: (value: T) => Result<U, E>
): Result<U, E> {
  return result.ok ? fn(result.value) : result;
}

// Pattern match on the result — forces handling both cases
function match<T, E, U>(
  result: Result<T, E>,
  handlers: { ok: (value: T) => U; err: (error: E) => U }
): U {
  return result.ok ? handlers.ok(result.value) : handlers.err(result.error);
}

// Extract the value or fall back to a default
function unwrapOr<T, E>(result: Result<T, E>, defaultValue: T): T {
  return result.ok ? result.value : defaultValue;
}

The critical distinction is between map and flatMap. Use map when your transformation is pure (can't fail): map(result, user => user.name). Use flatMap when your transformation itself returns a Result: flatMap(result, user => validateAge(user)). Confusing the two leads to a nested Result<Result<T, E>, E> — if you see that type, you need flatMap.

Railway-Oriented Programming: Chaining Operations

The real power of Result emerges when you chain multiple fallible operations into a pipeline. This pattern — often called railway-oriented programming — models your data flow as two parallel tracks: the happy path (Ok) and the error path (Err). Each step either continues on the happy path or short-circuits to the error track.

mermaid
graph LR
    A["fetchUser()"] -->|Ok| B["parseJSON()"]
    A -->|Err| ERR1["⚠️ NetworkError"]
    B -->|Ok| C["validateSchema()"]
    B -->|Err| ERR2["⚠️ ParseError"]
    C -->|Ok| D["transformData()"]
    C -->|Err| ERR3["⚠️ ValidationError"]
    D -->|Ok| E["✅ Result<User>"]
    D -->|Err| ERR4["⚠️ TransformError"]

    ERR1 --> F["❌ Result<Error>"]
    ERR2 --> F
    ERR3 --> F
    ERR4 --> F

    style E fill:#10b981,color:#fff,stroke:#059669
    style F fill:#ef4444,color:#fff,stroke:#dc2626
    style ERR1 fill:#fbbf24,color:#000,stroke:#f59e0b
    style ERR2 fill:#fbbf24,color:#000,stroke:#f59e0b
    style ERR3 fill:#fbbf24,color:#000,stroke:#f59e0b
    style ERR4 fill:#fbbf24,color:#000,stroke:#f59e0b
    

Each function only handles the happy case. If any step fails, the error propagates directly to the end without executing subsequent steps. Here's what this looks like in code:

typescript
// Each function returns Result with a specific error type
function fetchUser(id: string): Result<string, NetworkError> { /* ... */ }
function parseJSON(raw: string): Result<unknown, ParseError> { /* ... */ }
function validateUser(data: unknown): Result<User, ValidationError> { /* ... */ }
function enrichProfile(user: User): Result<EnrichedUser, TransformError> { /* ... */ }

// Build the pipeline with flatMap
type PipelineError = NetworkError | ParseError | ValidationError | TransformError;

function getUserProfile(id: string): Result<EnrichedUser, PipelineError> {
  return flatMap(
    flatMap(
      flatMap(
        fetchUser(id),
        parseJSON
      ),
      validateUser
    ),
    enrichProfile
  );
}

// Or with a pipe helper for readability
function pipe<T, E>(result: Result<T, E>) {
  return {
    then: <U, E2>(fn: (v: T) => Result<U, E2>) =>
      pipe(flatMap(result as Result<T, E | E2>, fn as any)),
    done: () => result,
  };
}

const profile = pipe(fetchUser("123"))
  .then(parseJSON)
  .then(validateUser)
  .then(enrichProfile)
  .done();

Error Type Hierarchies with Discriminated Unions

Using string as your error type throws away most of the benefit. Instead, model your errors as a discriminated union so the compiler can enforce exhaustive handling.

typescript
type NetworkError = { _tag: "NetworkError"; url: string; status: number };
type ValidationError = { _tag: "ValidationError"; field: string; message: string };
type AuthError = { _tag: "AuthError"; reason: "expired" | "invalid" | "missing" };

type AppError = NetworkError | ValidationError | AuthError;

function handleError(error: AppError): string {
  switch (error._tag) {
    case "NetworkError":
      return `Request to ${error.url} failed with ${error.status}`;
    case "ValidationError":
      return `Invalid ${error.field}: ${error.message}`;
    case "AuthError":
      return `Authentication failed: ${error.reason}`;
    // No default needed — TypeScript enforces exhaustiveness!
    // Adding a new error type to AppError causes a compile error here
  }
}

The _tag convention (borrowed from fp-ts) is widely used, but you can use kind, type, or any other discriminant field. The key is that every error variant carries structured data relevant to that specific failure — not just a message string. This lets you build rich error UIs, retry logic, and telemetry without parsing strings.
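
When the exhaustive switch doesn't cover the whole function body (or you want the guarantee to be explicit), a common companion is an assertNever guard. A sketch using a trimmed two-variant union for brevity:

```typescript
// Exhaustiveness guard: the parameter type never means this call only
// type-checks if every variant has already been handled above it
function assertNever(x: never): never {
  throw new Error(`Unhandled variant: ${JSON.stringify(x)}`);
}

// Trimmed two-variant union for brevity
type AppError =
  | { _tag: "NetworkError"; url: string; status: number }
  | { _tag: "AuthError"; reason: "expired" | "invalid" | "missing" };

function toStatusCode(error: AppError): number {
  switch (error._tag) {
    case "NetworkError": return 502;
    case "AuthError":    return 401;
    // Adding a variant to AppError makes this line a compile error
    default:             return assertNever(error);
  }
}
```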

Either<L, R> — The FP Tradition

Result<T, E> and Either<L, R> are the same concept with different naming. In functional programming, Either has a "left" (conventionally the error) and "right" (conventionally the success — as in "the right answer"). The mapping is straightforward:

| Result | Either | Meaning |
| --- | --- | --- |
| Ok<T> | Right<R> | Success value |
| Err<E> | Left<L> | Error / alternative value |
| map() | map() | Transform success, pass through error |
| flatMap() | chain() / flatMap() | Chain fallible operations |
| match() | fold() | Collapse both branches into one type |

One subtle difference: Either is more general — Left doesn't have to represent an error. You might use Either<CachedData, FreshData> where both branches are valid. Result carries the semantic intent that one branch is a failure. When in doubt, prefer Result for error handling — its naming is self-documenting.
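
A minimal Either sketch following the common fp-ts-style naming (this is an illustration, not fp-ts's actual API):

```typescript
// Minimal Either: Left and Right, discriminated by _tag
type Either<L, R> =
  | { _tag: "Left"; left: L }
  | { _tag: "Right"; right: R };

const left = <L>(l: L): Either<L, never> => ({ _tag: "Left", left: l });
const right = <R>(r: R): Either<never, R> => ({ _tag: "Right", right: r });

// fold collapses both branches into a single result type
const fold = <L, R, U>(
  e: Either<L, R>,
  onLeft: (l: L) => U,
  onRight: (r: R) => U
): U => (e._tag === "Left" ? onLeft(e.left) : onRight(e.right));

// Both branches are valid outcomes here: a cache hit is not an error
type Lookup = Either<{ source: "cache" }, { source: "network" }>;
const hit: Lookup = left({ source: "cache" });
```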

Async Operations with Result

Most real-world error handling involves async I/O. The simplest approach wraps a Result in a Promise, but this creates an awkward double-unwrapping problem: you await the Promise, then check the Result.

typescript
// Approach 1: Promise<Result<T, E>> — simple but verbose
async function fetchUser(id: string): Promise<Result<User, AppError>> {
  try {
    const res = await fetch(`/api/users/${id}`);
    if (!res.ok) return err({ _tag: "NetworkError", url: res.url, status: res.status });
    const data = await res.json();
    return ok(data as User);
  } catch {
    return err({ _tag: "NetworkError", url: `/api/users/${id}`, status: 0 });
  }
}

// Chaining requires nested awaits — gets messy fast
const result = await fetchUser("123");
const profile = result.ok
  ? map(await fetchProfile(result.value.id), enrichProfile)
  : result;
typescript
// Approach 2: ResultAsync — encapsulates Promise + Result
class ResultAsync<T, E> {
  constructor(private promise: Promise<Result<T, E>>) {}

  static from<T, E>(fn: () => Promise<Result<T, E>>): ResultAsync<T, E> {
    return new ResultAsync(fn());
  }

  // Named andThen rather than then: a `then` method would make this class a
  // thenable, hijacking `await` on its instances. Note it must not be marked
  // async, because it returns ResultAsync, not a Promise.
  andThen<U>(fn: (v: T) => Promise<Result<U, E>>): ResultAsync<U, E> {
    return new ResultAsync(
      this.promise.then(r => (r.ok ? fn(r.value) : r))
    );
  }

  unwrap(): Promise<Result<T, E>> {
    return this.promise;
  }
}

// Clean async chaining
const profile = await ResultAsync.from(() => fetchUser("123"))
  .andThen(user => fetchProfile(user.id))
  .andThen(profile => enrichProfile(profile))
  .unwrap();

Library Comparison

You don't need to build this from scratch. Several libraries provide battle-tested Result/Either implementations with different philosophies.

| Library | Approach | Async Support | Best For |
| --- | --- | --- | --- |
| neverthrow | Result<T, E> with methods | Built-in ResultAsync | Teams adopting Result incrementally |
| fp-ts Either | FP-style Either<E, A> with pipe | Via TaskEither | Teams committed to FP patterns |
| Effect | Full effect system with typed errors | Native (everything is effectful) | Teams wanting a complete framework |
| ts-results | Rust-inspired Result/Option | Limited | Developers coming from Rust |
| DIY (this section) | Plain discriminated unions | Manual | Minimal deps, full understanding |

neverthrow is the pragmatic choice for most teams — it has a small API surface, good TypeScript inference, and built-in async support. fp-ts provides a comprehensive FP toolkit but has a steep learning curve and relies heavily on pipe-based composition. Effect is the most ambitious — it replaces Promise, gives you typed errors, dependency injection, retries, and more, but it's a paradigm shift, not a library drop-in.

When to Use Result vs. Exceptions

Result types and exceptions serve different purposes. Using the wrong one for a given situation makes your code harder to work with, not easier. Here's a practical decision framework:

| Use Result<T, E> | Use Exceptions (throw) |
| --- | --- |
| Expected failures (validation, not found, conflict) | Programmer errors (bugs, assertion failures) |
| Callers need to distinguish error types | Callers can't meaningfully recover |
| Domain/business logic layer | Infrastructure/framework code |
| Error is part of the function's contract | Error indicates broken invariants |
| You want exhaustive handling at compile time | You want to crash fast and fix the bug |
Don't Mix Result and Throw in the Same Function

A function that returns Result<T, E> should never also throw. If it does, you've defeated the entire purpose — callers trust the return type and skip try/catch. Wrap any internal code that might throw (like JSON.parse) in a tryCatch helper before returning.
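
The split in the decision table can be sketched in a few lines; parsePort and invariant are illustrative names:

```typescript
type Result<T, E> = { ok: true; value: T } | { ok: false; error: E };

// Expected failure: bad user input is part of the contract, so return Result
function parsePort(raw: string): Result<number, string> {
  const n = Number(raw);
  return Number.isInteger(n) && n > 0 && n <= 65535
    ? { ok: true, value: n }
    : { ok: false, error: `invalid port: ${raw}` };
}

// Programmer error: a broken invariant should crash fast, so throw
function invariant(condition: boolean, message: string): asserts condition {
  if (!condition) throw new Error(message);
}
```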

Integration Patterns

You will almost certainly need to bridge between exceptions and Result, especially when working with third-party libraries. These patterns form the boundary layer.

Wrapping Functions That Throw

typescript
// Generic wrapper: catch exceptions and convert to Result
function tryCatch<T, E>(
  fn: () => T,
  onError: (e: unknown) => E
): Result<T, E> {
  try {
    return ok(fn());
  } catch (e) {
    return err(onError(e));
  }
}

// Async version
async function tryCatchAsync<T, E>(
  fn: () => Promise<T>,
  onError: (e: unknown) => E
): Promise<Result<T, E>> {
  try {
    return ok(await fn());
  } catch (e) {
    return err(onError(e));
  }
}

// Wrapping JSON.parse
function safeParseJSON(input: string): Result<unknown, ParseError> {
  return tryCatch(
    () => JSON.parse(input),
    (e) => ({ _tag: "ParseError" as const, message: String(e) })
  );
}

Using Result at API Boundaries

typescript
// Service layer returns Result
function createUser(input: CreateUserDTO): Result<User, ValidationError | ConflictError> {
  // ... domain logic with typed errors
}

// Controller converts Result back to HTTP responses
app.post("/users", (req, res) => {
  const result = createUser(req.body);

  match(result, {
    ok: (user) => res.status(201).json(user),
    err: (error) => {
      switch (error._tag) {
        case "ValidationError":
          return res.status(400).json({ error: error.message });
        case "ConflictError":
          return res.status(409).json({ error: error.message });
      }
    },
  });
});
The Boundary Pattern

Keep Result types inside your domain and service layers. At the edges of your application — HTTP handlers, CLI entry points, event handlers — convert Result to the appropriate output format. This gives you type-safe error handling internally without forcing it on your framework or external callers.

Design Patterns Implemented with Full Type Safety

Design patterns become dramatically more powerful when TypeScript's type system enforces their contracts at compile time. Instead of relying on runtime checks or developer discipline, you encode the rules directly into the types — making illegal states unrepresentable and invalid operations impossible to express.

Each pattern below shows the naive approach first, then the fully type-safe version. The difference isn't cosmetic — it's the difference between bugs caught in production and bugs caught before your code ever runs.

Builder Pattern with Phantom Types

The builder pattern constructs objects step by step. The classic problem: how do you ensure required fields are set before .build() is called? At runtime, you throw errors. At compile time, you use phantom types — generic parameters that exist only for the type checker, carrying "has this been set?" information through the chain.

Naive version — runtime errors only

typescript
class UserBuilder {
  private _name?: string;
  private _email?: string;
  private _age?: number;

  name(n: string) { this._name = n; return this; }
  email(e: string) { this._email = e; return this; }
  age(a: number) { this._age = a; return this; }

  build() {
    if (!this._name || !this._email) {
      throw new Error("name and email are required"); // Runtime 💥
    }
    return { name: this._name, email: this._email, age: this._age };
  }
}

// Compiles fine, explodes at runtime
const user = new UserBuilder().age(25).build();

Type-safe version — compile-time enforcement

We track which fields have been set using a generic record of boolean flags. The build method only appears when both HasName and HasEmail are true.

typescript
interface User {
  name: string;
  email: string;
  age?: number;
}

interface BuilderState {
  HasName: boolean;
  HasEmail: boolean;
}

class UserBuilder<S extends BuilderState = { HasName: false; HasEmail: false }> {
  private data: Partial<User> = {};

  name(n: string): UserBuilder<S & { HasName: true }> {
    this.data.name = n;
    return this as any;
  }

  email(e: string): UserBuilder<S & { HasEmail: true }> {
    this.data.email = e;
    return this as any;
  }

  age(a: number): UserBuilder<S> {
    this.data.age = a;
    return this as any;
  }

  build(this: UserBuilder<{ HasName: true; HasEmail: true }>): User {
    return this.data as User;
  }
}

// ✅ Compiles — both required fields set
const user = new UserBuilder().name("Alice").email("alice@dev.io").age(30).build();

// ❌ Compile error — email not set
// const bad = new UserBuilder().name("Alice").build();
// Property 'build' does not exist on type 'UserBuilder<{ HasName: true; HasEmail: false }>'
Why as any?

The as any casts are the "escape hatch" inside the builder — the implementation mutates internal state, but the return type changes with each call. TypeScript can't infer that mutation is safe, so we tell it to trust us internally while keeping the external API perfectly typed. This is a deliberate, contained trade-off.

Type-Safe State Machine

State machines are everywhere — order processing, UI flows, authentication. The dangerous part is invalid transitions: going from "shipped" directly to "draft", or from "cancelled" to "paid". TypeScript can make these transitions a compile-time error by encoding the allowed transition map as a type.

Naive version — stringly typed

typescript
class OrderMachine {
  state: string = "draft";

  transition(next: string) {
    this.state = next; // No validation at all!
  }
}

const order = new OrderMachine();
order.transition("shipped"); // draft → shipped? Shouldn't be allowed.

Type-safe version — transitions enforced at compile time

typescript
// 1. Define all states
type OrderState = "draft" | "pending" | "paid" | "shipped" | "delivered" | "cancelled";

// 2. Map each state to its allowed transitions
type TransitionMap = {
  draft:     "pending" | "cancelled";
  pending:   "paid" | "cancelled";
  paid:      "shipped" | "cancelled";
  shipped:   "delivered";
  delivered: never;  // terminal state
  cancelled: never;  // terminal state
};

// 3. The machine tracks its current state as a generic
class StateMachine<Current extends OrderState> {
  constructor(public readonly state: Current) {}

  transition<Next extends TransitionMap[Current]>(
    next: Next
  ): StateMachine<Next> {
    return new StateMachine(next);
  }
}

// ✅ Valid transitions compile
const order = new StateMachine("draft")
  .transition("pending")
  .transition("paid")
  .transition("shipped")
  .transition("delivered");

// ❌ Compile error — "draft" → "shipped" is not allowed
// new StateMachine("draft").transition("shipped");

// ❌ Compile error — "delivered" is terminal (Next = never)
// order.transition("cancelled");

Repository Pattern with Generics

The repository pattern abstracts data access behind a clean API. In most implementations, query methods accept loose string keys — meaning typos compile fine and wrong value types silently pass through. With keyof T and indexed access types, you can make findBy reject both bad field names and mismatched value types.

Naive version

typescript
class NaiveRepo {
  private items: any[] = [];

  findBy(field: string, value: any): any[] {
    // No idea if 'field' actually exists on the entity
    return this.items.filter((item: any) => item[field] === value);
  }
}

Type-safe version

typescript
interface Entity {
  id: string;
}

interface Product extends Entity {
  name: string;
  price: number;
  inStock: boolean;
}

class Repository<T extends Entity> {
  private items: T[] = [];

  save(entity: T): void {
    this.items.push(entity);
  }

  findById(id: string): T | undefined {
    return this.items.find(item => item.id === id);
  }

  // Key: K must be an actual key of T
  // Value: must match the type of T[K]
  findBy<K extends keyof T>(field: K, value: T[K]): T[] {
    return this.items.filter(item => item[field] === value);
  }

  // Partial queries — every key/value pair is type-checked
  findWhere(query: Partial<T>): T[] {
    return this.items.filter(item =>
      (Object.keys(query) as Array<keyof T>).every(key => item[key] === query[key])
    );
  }
}

const products = new Repository<Product>();

// ✅ field is "price", value must be number
products.findBy("price", 29.99);

// ✅ field is "inStock", value must be boolean
products.findBy("inStock", true);

// ❌ Compile error — "price" expects number, not string
// products.findBy("price", "cheap");

// ❌ Compile error — "colour" is not keyof Product
// products.findBy("colour", "red");

Strongly-Typed Event Emitter

Event systems are a breeding ground for runtime bugs: misspelled event names, wrong payload shapes, missing handlers. A typed event emitter maps each event name to its exact payload type, so emit and on are both fully checked.

Naive version

typescript
class NaiveEmitter {
  private listeners: Record<string, Function[]> = {};

  on(event: string, fn: Function) { /* ... */ }
  emit(event: string, data: any) { /* ... */ }
}

const emitter = new NaiveEmitter();
emitter.emit("clck", { x: 10 }); // Typo "clck" — no error

Type-safe version

typescript
// Define the contract: event name → payload type
interface AppEvents {
  click:    { x: number; y: number };
  submit:   { formId: string; data: Record<string, unknown> };
  logout:   void;
  error:    { code: number; message: string };
}

class TypedEmitter<Events extends Record<string, any>> {
  private listeners = new Map<keyof Events, Set<Function>>();

  on<E extends keyof Events>(
    event: E,
    handler: Events[E] extends void ? () => void : (data: Events[E]) => void
  ): void {
    if (!this.listeners.has(event)) this.listeners.set(event, new Set());
    this.listeners.get(event)!.add(handler);
  }

  emit<E extends keyof Events>(
    ...args: Events[E] extends void ? [event: E] : [event: E, data: Events[E]]
  ): void {
    const [event, data] = args;
    this.listeners.get(event)?.forEach(fn => (fn as any)(data));
  }

  off<E extends keyof Events>(event: E, handler: Function): void {
    this.listeners.get(event)?.delete(handler);
  }
}

const bus = new TypedEmitter<AppEvents>();

// ✅ Handler receives correctly typed payload
bus.on("click", (data) => console.log(data.x, data.y));

// ✅ Void event — no data argument needed
bus.emit("logout");

// ✅ Payload matches the registered type
bus.emit("error", { code: 404, message: "Not found" });

// ❌ Compile error — wrong payload shape
// bus.emit("click", { x: 10 });  // missing 'y'

// ❌ Compile error — event name doesn't exist
// bus.emit("clck", { x: 10, y: 20 });

Dependency Injection with Type Tokens

Dependency injection containers typically rely on string keys, losing all type information. The solution: use typed token objects, a Token class carrying a phantom type parameter, mapped to interface types. The container then knows that requesting the LoggerToken returns a Logger, not some arbitrary any.

typescript
// A token carries the type it resolves to
class Token<T> {
  // The phantom property ensures each Token<X> is structurally unique
  private _phantom!: T;
  constructor(public readonly description: string) {}
}

// Define interfaces and their tokens
interface Logger {
  log(message: string): void;
}

interface UserService {
  getUser(id: string): Promise<{ id: string; name: string }>;
}

const LoggerToken = new Token<Logger>("Logger");
const UserServiceToken = new Token<UserService>("UserService");

// Type-safe DI container
class Container {
  private bindings = new Map<Token<any>, () => any>();

  bind<T>(token: Token<T>, factory: () => T): void {
    this.bindings.set(token, factory);
  }

  // Return type is inferred from the token's generic
  resolve<T>(token: Token<T>): T {
    const factory = this.bindings.get(token);
    if (!factory) throw new Error(`No binding for ${token.description}`);
    return factory();
  }
}

const container = new Container();
container.bind(LoggerToken, () => ({
  log: (msg: string) => console.log(msg)
}));

// ✅ Resolved type is Logger — full autocomplete on .log()
const logger = container.resolve(LoggerToken);
logger.log("DI works!");

// ❌ Compile error — resolve returns Logger, not UserService
// const svc: UserService = container.resolve(LoggerToken);

Visitor Pattern with Discriminated Unions

The visitor pattern in classical OOP requires double dispatch and a pile of boilerplate. TypeScript's discriminated unions make it dramatically simpler — and exhaustiveness checking ensures you handle every case. If you add a new variant, the compiler tells you everywhere you forgot to handle it.

Naive version — untyped property checks

typescript
function area(shape: any): number {
  if (shape.kind === "circle") return Math.PI * shape.radius ** 2;
  if (shape.kind === "rect") return shape.w * shape.h;
  return 0; // Silent fallthrough for unknown shapes
}

Type-safe version — exhaustive visitor

typescript
// Discriminated union — each variant has a unique `kind`
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; width: number; height: number }
  | { kind: "triangle"; base: number; height: number };

// Visitor: one handler per variant, generic return type
type ShapeVisitor<R> = {
  [K in Shape["kind"]]: (shape: Extract<Shape, { kind: K }>) => R;
};

function visitShape<R>(shape: Shape, visitor: ShapeVisitor<R>): R {
  return (visitor[shape.kind] as any)(shape);
}

// Exhaustive check — you MUST handle all three kinds
const area = (s: Shape) => visitShape(s, {
  circle:   (c) => Math.PI * c.radius ** 2,
  rect:     (r) => r.width * r.height,
  triangle: (t) => 0.5 * t.base * t.height,
});

// If you add { kind: "polygon"; sides: number[] } to Shape:
// ❌ Compile error — Property 'polygon' is missing in the visitor

Command Pattern with Generic Command/Handler Pairs

The command pattern decouples "what to do" from "how to do it." The challenge is ensuring each command is dispatched to a handler that accepts exactly that command's type — not some base class with an any payload. A command bus with a generic registry solves this cleanly.

typescript
// Each command declares its return type via a phantom generic
interface Command<TResult> {
  readonly _resultType?: TResult; // phantom — never assigned at runtime
}

// Concrete commands
interface CreateUser extends Command<{ id: string; name: string }> {
  kind: "CreateUser";
  name: string;
  email: string;
}

interface DeleteUser extends Command<boolean> {
  kind: "DeleteUser";
  userId: string;
}

// Handler: takes Command C and returns C's result type
type Handler<C extends Command<any>> =
  C extends Command<infer R> ? (cmd: C) => Promise<R> : never;

// Command bus — maps command kinds to handlers
class CommandBus {
  private handlers = new Map<string, Handler<any>>();

  register<C extends Command<any> & { kind: string }>(
    kind: C["kind"],
    handler: Handler<C>
  ): void {
    this.handlers.set(kind, handler);
  }

  async dispatch<R>(cmd: Command<R> & { kind: string }): Promise<R> {
    const handler = this.handlers.get(cmd.kind);
    if (!handler) throw new Error(`No handler for ${cmd.kind}`);
    return handler(cmd as any);
  }
}

const cmdBus = new CommandBus();

cmdBus.register<CreateUser>("CreateUser", async (cmd) => {
  // cmd is fully typed as CreateUser
  return { id: crypto.randomUUID(), name: cmd.name };
});

cmdBus.register<DeleteUser>("DeleteUser", async (cmd) => {
  // cmd is fully typed as DeleteUser
  console.log(`Deleting user ${cmd.userId}`);
  return true;
});

// ✅ Typing the command as CreateUser lets dispatch infer R as { id: string; name: string }
const createCmd: CreateUser = {
  kind: "CreateUser", name: "Alice", email: "alice@dev.io"
};
const newUser = await cmdBus.dispatch(createCmd);

Middleware Pattern with Typed Context Accumulation

Express-style middleware chains have a notorious type problem: each middleware adds properties to the request context, but downstream handlers have no way to know what's been added. TypeScript can solve this by accumulating context types through the chain — each middleware declares what it adds, and the final handler sees the full merged type.

typescript
// Base context — what every request starts with
interface BaseContext {
  url: string;
  method: string;
}

// A middleware accepts any context In (at least BaseContext) and returns In & Out.
// Keeping In generic is essential: it lets the same middleware slot anywhere in
// the chain without discarding properties added by earlier middlewares.
type Middleware<Out> = <In extends BaseContext>(ctx: In) => In & Out;

// Pipeline: chains middlewares, accumulating their context types
class Pipeline<Ctx extends BaseContext> {
  private middlewares: Middleware<any>[] = [];

  constructor(private initial: Ctx) {}

  // Each .use() extends the context type
  use<Added>(mw: Middleware<Added>): Pipeline<Ctx & Added> {
    this.middlewares.push(mw);
    return this as any;
  }

  // Final handler sees the fully accumulated context
  handle(handler: (ctx: Ctx) => Response): Response {
    let ctx: any = this.initial;
    for (const mw of this.middlewares) {
      ctx = mw(ctx);
    }
    return handler(ctx);
  }
}

// Middleware that adds auth info
const authMiddleware: Middleware<{ userId: string; role: string }> = (ctx) => ({
  ...ctx,
  userId: "u_123",
  role: "admin",
});

// Middleware that adds parsed body
const bodyParser: Middleware<{ body: Record<string, unknown> }> = (ctx) => ({
  ...ctx,
  body: { title: "Hello" },
});

const app = new Pipeline({ url: "/api/posts", method: "POST" })
  .use(authMiddleware)
  .use(bodyParser);

app.handle((ctx) => {
  // ✅ ctx has: url, method, userId, role, body — all fully typed
  console.log(ctx.userId);  // string
  console.log(ctx.body);    // Record<string, unknown>

  // ❌ Compile error — 'sessionId' was never added by any middleware
  // console.log(ctx.sessionId);

  return new Response("OK");
});

When to reach for these patterns

Not every project needs phantom-typed builders or generic command buses. Use these when the cost of a runtime bug is high (payment flows, state machines, public APIs) or when many developers touch the same abstraction. For small, single-owner utilities, a simple function with good types is often enough.

Pattern Comparison at a Glance

| Pattern | Core Type Mechanism | What It Prevents |
| --- | --- | --- |
| Builder (phantom types) | Generic boolean flags tracked across chained calls | Calling .build() before required fields are set |
| State Machine | Mapped type from state → allowed transitions | Invalid state transitions |
| Repository | keyof T and indexed access T[K] | Querying nonexistent fields or wrong value types |
| Event Emitter | Generic event map interface | Misspelled events and wrong payloads |
| DI Container | Phantom generics on token objects | Resolving the wrong type from a token |
| Visitor | Discriminated unions + mapped types | Forgetting to handle a variant |
| Command Bus | Phantom Command<TResult> + inference | Command/handler type mismatches |
| Middleware | Intersection type accumulation via generics | Accessing context properties never added |

Module System Deep Dive: Declaration Files & Ambient Modules

TypeScript's module system is the bridge between your source code and the outside world — npm packages, bundlers, runtimes, and other files in your project. Understanding how TypeScript resolves imports, generates declaration files, and handles the CJS/ESM divide is essential for anyone authoring libraries or working in complex monorepo setups.

This section covers the full machinery: resolution algorithms, declaration files, ambient modules, triple-slash directives, and the compiler flags that control module interop.

Module Resolution Algorithms

When you write import { foo } from "bar", TypeScript needs a strategy to locate the file or declaration behind "bar". The moduleResolution compiler option selects which algorithm to use. There are four main strategies, and choosing the wrong one is a common source of "cannot find module" errors.

| Strategy | Use When | Key Behavior |
| --- | --- | --- |
| node10 | Legacy Node.js (CJS only) | Mimics Node's require() resolution. Ignores exports in package.json. |
| node16 / nodenext | Node.js 16+ with ESM support | Respects exports field, distinguishes .mjs/.cjs, requires file extensions in relative imports. |
| bundler | Webpack, Vite, esbuild, etc. | Respects exports but allows extensionless imports like bundlers do. |
| classic | Never (legacy TS pre-2.0) | Walks up directory tree. No node_modules lookup. Effectively obsolete. |

node10 — Classic Node Resolution

This mirrors how Node.js resolved require() calls before ESM existed. For a bare specifier like "lodash", it walks up node_modules directories. For relative imports, it tries appending .ts, .tsx, .d.ts, then looks for index files in directories. It reads the main and types fields from package.json but completely ignores exports.

If you're targeting Node.js and your project still uses CommonJS exclusively, node10 works fine. But the moment you need conditional exports or ESM interop, it falls short.

node16 / nodenext — Modern Node with ESM

This is the algorithm you should use when targeting Node.js 16 or later. It faithfully models Node's actual resolution, including the dual CJS/ESM system. The key differences from node10: it reads the exports field in package.json, it requires file extensions on relative imports in ESM files, and it determines whether a file is ESM or CJS based on the nearest package.json's "type" field.

jsonc
// tsconfig.json for Node.js 16+
{
  "compilerOptions": {
    "module": "node16",
    "moduleResolution": "node16",
    "target": "es2022"
  }
}

Under node16, a relative import in an ESM file must include the file extension — and you write .js even though the source file is .ts:

typescript
// ✅ Correct — use .js extension (TypeScript finds utils.ts)
import { formatDate } from "./utils.js";

// ❌ Error under node16 — extensionless relative imports not allowed
import { formatDate } from "./utils";

Why .js in imports?

TypeScript does not rewrite import specifiers in emitted JavaScript. Since Node.js will see the .js file at runtime, you must write .js in your import — TypeScript maps it back to the .ts source during compilation.

bundler — For Modern Bundlers

Introduced in TypeScript 5.0, the bundler strategy is a pragmatic middle ground. It respects exports in package.json (like node16) but doesn't require file extensions on relative imports (because bundlers like Webpack, Vite, and esbuild don't require them). If you're building a frontend app or anything processed by a bundler, this is typically the right choice.

json
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler",
    "noEmit": true
  }
}

Resolution Flowchart: node16 / nodenext

The following diagram shows how TypeScript resolves an import specifier under the node16/nodenext strategy. The algorithm differs depending on whether the import is a relative path or a bare package specifier.

mermaid
flowchart TD
    A["import 'specifier'"] --> B{"Relative path?\n(starts with ./ or ../)"}

    B -->|Yes| C{"Has file extension?"}
    C -->|Yes| D["Resolve exact path:\nspecifier.ts → specifier.js\nspecifier.d.ts → specifier.js"]
    C -->|No| E{"ESM or CJS context?"}
    E -->|"ESM (.mts or type: module)"| F["❌ Error: Extension required\nin ESM relative imports"]
    E -->|"CJS (.cts or type: commonjs)"| G["Try extensions:\n.ts → .tsx → .d.ts → .js\nThen try /index.*"]

    B -->|"No (bare specifier)"| H["Walk up node_modules"]
    H --> I{"package.json\nhas 'exports'?"}
    I -->|Yes| J{"Match condition:\nimport / require / types"}
    J --> K["Resolve matched\nexport entry"]
    J -->|"No match"| L["❌ Module not found"]

    I -->|No| M{"Read package.json\n'types' or 'main' field"}
    M --> N["Resolve to types file\nor infer .d.ts from main"]
    M -->|"No fields"| O["Try index.ts,\nindex.d.ts, index.js"]

    D --> P["✅ Module resolved"]
    G --> P
    K --> P
    N --> P
    O --> P

    style F fill:#f8d7da,stroke:#dc3545,color:#333
    style L fill:#f8d7da,stroke:#dc3545,color:#333
    style P fill:#d4edda,stroke:#28a745,color:#333
    

Package.json exports and imports Fields

The exports field in package.json is the modern way to control what a package exposes. It replaces the older main and types fields with a more powerful system that supports conditional exports — serving different files depending on the consumer's environment.

json
{
  "name": "my-library",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.cjs"
    },
    "./utils": {
      "types": "./dist/utils.d.ts",
      "import": "./dist/esm/utils.js",
      "require": "./dist/cjs/utils.cjs"
    }
  }
}

The "types" condition must come first within each export entry. TypeScript matches conditions top-to-bottom, and if "types" isn't listed before "import" or "require", TypeScript may resolve the JavaScript file instead of the declaration file.

The imports field works similarly but defines private, package-internal aliases — useful for avoiding deeply nested relative paths within your own project:

json
{
  "imports": {
    "#utils/*": "./src/utils/*.ts",
    "#db": "./src/database/client.ts"
  }
}

Declaration Files (.d.ts)

Declaration files describe the shape of JavaScript code without containing any implementation. They are how TypeScript understands third-party libraries, native APIs, and your own compiled output. There are two ways to get them: let the compiler generate them, or write them by hand.

Generating Declarations Automatically

Set declaration: true in your tsconfig and TypeScript will emit a .d.ts file alongside each .js output. For libraries, you should also set declarationMap: true to generate source maps that let consumers "Go to Definition" and land in your .ts source instead of the .d.ts file.

json
{
  "compilerOptions": {
    "declaration": true,
    "declarationMap": true,
    "declarationDir": "./dist/types",
    "emitDeclarationOnly": true
  }
}

With emitDeclarationOnly: true, TypeScript produces only .d.ts files — no JavaScript. This is the standard setup when you use a separate tool (like esbuild or SWC) for the actual transpilation.

Writing Declarations Manually

When you install a package that ships no types and has no @types/* package on DefinitelyTyped, you can write a declaration file yourself. Create a .d.ts file anywhere in your project (commonly in a types/ or typings/ directory) and declare the module:

typescript
// types/legacy-chart-lib.d.ts
declare module "legacy-chart-lib" {
  export interface ChartOptions {
    width: number;
    height: number;
    title?: string;
  }

  export function renderChart(
    element: HTMLElement,
    data: number[],
    options?: ChartOptions
  ): void;
}

The types Field in package.json

For published packages, the "types" (or "typings") field in package.json tells TypeScript where to find the root declaration file. Under the node10 resolution strategy, this is the primary way TypeScript discovers types for a package. Under node16 and bundler, the "types" condition inside "exports" takes precedence when present.

json
{
  "name": "my-library",
  "main": "./dist/cjs/index.js",
  "module": "./dist/esm/index.js",
  "types": "./dist/types/index.d.ts"
}

Triple-Slash Directives

Triple-slash directives are special comments that instruct the compiler to include additional files or type packages. They look archaic, but they still serve specific purposes that import cannot replace.

typescript
/// <reference types="vite/client" />
/// <reference path="./legacy-globals.d.ts" />
/// <reference lib="dom" />

| Directive | Purpose | When to Use |
| --- | --- | --- |
| /// <reference types="..." /> | Includes types from an @types package or a package's type declarations | In .d.ts files to declare a dependency on global types (e.g., vite/client for import.meta) |
| /// <reference path="..." /> | Includes another file in the compilation | Rare — mostly in non-module scripts or generated code that needs explicit file ordering |
| /// <reference lib="..." /> | Includes a built-in lib (e.g., dom, es2022) | In .d.ts files that need specific lib types without requiring the consumer to set them in tsconfig |

In modern TypeScript projects, you'll most commonly encounter /// <reference types="..." /> in an env.d.ts or global.d.ts file to pull in ambient type augmentations from tools like Vite or Jest.

Ambient Modules

An ambient module declaration tells TypeScript "trust me, this module exists and has this shape" without providing an actual implementation. This is your escape hatch for untyped packages, non-JS imports, and global module augmentation.

Typing an Untyped Package

If a package has no types and no @types/* counterpart, declare a module with whatever level of detail you need. A minimal declaration that at least silences the "cannot find module" error:

typescript
// types/untyped-packages.d.ts

// Minimal: everything is `any`
declare module "some-untyped-lib";

// Better: declare the parts you actually use
declare module "some-untyped-lib" {
  export function parse(input: string): Record<string, unknown>;
  export const version: string;
}

Wildcard Module Declarations

Wildcard patterns handle non-JavaScript imports that bundlers understand but TypeScript doesn't — CSS modules, images, SVGs, and other asset files. One declaration covers an entire file extension:

typescript
// types/assets.d.ts
declare module "*.css" {
  const classes: Record<string, string>;
  export default classes;
}

declare module "*.svg" {
  const content: string;
  export default content;
}

declare module "*.png" {
  const src: string;
  export default src;
}

isolatedModules and Its Constraints

The isolatedModules flag tells TypeScript to enforce that every file can be transpiled in isolation — without knowledge of other files. This is critical when your build tool (esbuild, SWC, Babel) processes files one-at-a-time rather than doing a full program compilation.

With isolatedModules: true, three patterns become errors:

typescript
// ❌ 1. Re-exporting types without `type` keyword
//    A single-file transpiler can't know if `MyType` is a type or value
export { MyType } from "./types";
// ✅ Fix:
export type { MyType } from "./types";

// ❌ 2. const enum declarations (values are inlined from other files)
const enum Direction { Up, Down, Left, Right }
// ✅ Fix: use a regular enum
enum Direction { Up, Down, Left, Right }

// ❌ 3. Files that are not modules (no import/export)
//    Non-module files have ambiguous semantics for single-file transpilers
const x = 1;
// ✅ Fix: add an empty export
const y = 1;
export {};

Always enable isolatedModules

Even if you're using tsc today, enabling isolatedModules keeps your code compatible with faster transpilers. It's a requirement for Vite, Next.js, and most modern frameworks.

verbatimModuleSyntax (TypeScript 5.0+)

Introduced in TypeScript 5.0, verbatimModuleSyntax replaces the older isolatedModules re-export checks and the importsNotUsedAsValues flag with a single, stricter rule: any import that gets erased in the output must use import type.

The logic is simple. If you write import { X }, TypeScript will emit that import in the JavaScript output. If X is only a type and you didn't write import type { X }, you get an error — because the emitted JavaScript would contain an import for something that doesn't exist at runtime.

typescript
// With verbatimModuleSyntax: true

// ❌ Error: `User` is a type but imported as a value
import { User } from "./models";
type Admin = User & { role: "admin" };

// ✅ Correct: use `import type`
import type { User } from "./models";
type Admin = User & { role: "admin" };

// ✅ Mixed imports: inline `type` keyword for individual specifiers
import { createUser, type User } from "./models";

Module Interop: esModuleInterop and allowSyntheticDefaultImports

These two flags address the mismatch between how CJS and ESM handle default exports. In CommonJS, a module's entire export is module.exports — there's no concept of a "default" export separate from named exports. ESM expects export default to be a distinct thing.

esModuleInterop

When true, TypeScript emits helper functions (__importDefault, __importStar) that let you write natural ESM-style default imports for CJS modules. Without it, you'd need clunky namespace imports:

typescript
// express ships as CJS: module.exports = createApp;

// Without esModuleInterop:
import * as express from "express";  // works but awkward
const app = express();

// With esModuleInterop: true
import express from "express";        // clean default import
const app = express();

allowSyntheticDefaultImports

This is the type-checking-only counterpart. It lets you write import X from "cjs-module" without an error, but does not emit any helper code. It's useful when your runtime already handles this interop (Node.js with ESM does, and so do all bundlers). Note that esModuleInterop automatically implies allowSyntheticDefaultImports.

Don't confuse the two

allowSyntheticDefaultImports only silences the type checker — it doesn't change emitted code. If you're emitting CJS with tsc and your CJS module doesn't have a .default property, you'll get a runtime crash. Use esModuleInterop when you need TypeScript to emit the interop helpers.

Dual CJS/ESM Package Authoring with Types

Publishing a library that works for both require() and import consumers is the trickiest part of the module system. You need to ship separate CJS and ESM builds, with matching declaration files for each, and wire everything through exports in package.json.

Here is a complete package.json for a dual-format library:

json
{
  "name": "my-dual-lib",
  "version": "1.0.0",
  "type": "module",
  "exports": {
    ".": {
      "types": {
        "import": "./dist/esm/index.d.ts",
        "require": "./dist/cjs/index.d.cts"
      },
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.cjs"
    }
  },
  "main": "./dist/cjs/index.cjs",
  "types": "./dist/esm/index.d.ts"
}

The key detail is the nested "types" condition with separate "import" and "require" sub-conditions. This ensures that a CJS consumer gets .d.cts declarations (which TypeScript treats as CJS-typed) while an ESM consumer gets .d.ts declarations (ESM-typed). The fallback "main" and "types" fields at the root cover older tools that don't understand exports.

To build both formats, use two tsconfig files — one for each output:

jsonc
// tsconfig.esm.json
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "node16",
    "outDir": "./dist/esm",
    "declaration": true,
    "declarationMap": true
  }
}

jsonc
// tsconfig.cjs.json
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "commonjs",
    "outDir": "./dist/cjs",
    "declaration": true,
    "declarationMap": true
  }
}

Then build both with:

bash
tsc -p tsconfig.esm.json && tsc -p tsconfig.cjs.json

Consider going ESM-only

Dual publishing adds real complexity. If your library's consumers are predominantly ESM (frontend frameworks, modern Node.js), shipping ESM-only with "type": "module" dramatically simplifies your build. Node.js can require() ESM modules since v22.12, further reducing the need for CJS builds.
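
For comparison, a minimal ESM-only package.json might look like this (package name and paths are hypothetical):

```json
{
  "name": "my-esm-lib",
  "version": "1.0.0",
  "type": "module",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "default": "./dist/index.js"
    }
  }
}
```

One build, one set of declarations, and the "types" condition still comes first within the export entry.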

Advanced tsconfig.json Options & Their Implications

Most TypeScript projects start with "strict": true and never look back. But tsconfig.json has dozens of options that quietly affect what your code compiles to, how it resolves modules, and whether your build tools can even process your files. This section breaks down the options that matter most — and the real-world consequences of getting them wrong.

The Strict Family: Beyond "strict": true

Setting "strict": true is a shorthand that enables a group of flags. Understanding each individually matters when you're migrating a legacy codebase (you can enable them one at a time) or debugging why a specific pattern is flagged as an error.

strictNullChecks

This is arguably the single most impactful flag. Without it, null and undefined are assignable to every type, which means TypeScript silently ignores the #1 source of runtime errors. With it enabled, you must explicitly handle nullable values.

typescript
// Without strictNullChecks — compiles fine, crashes at runtime
function getLength(s: string) {
  return s.length; // 💥 s could be null
}
getLength(null);

// With strictNullChecks — forces you to handle it
function getSafeLength(s: string | null) {
  if (s === null) return 0;
  return s.length; // safe
}

strictFunctionTypes

Enables contravariant checking of function parameter types. Without it, function parameters are checked bivariantly, which is unsound. The classic example involves callback assignments where a more specific handler gets assigned where a general one is expected.

typescript
type Handler = (event: Event) => void;
const mouseHandler: (event: MouseEvent) => void = (e) => {
  console.log(e.clientX); // MouseEvent-specific property
};

// strictFunctionTypes catches this: MouseEvent is narrower than Event
const handler: Handler = mouseHandler; // Error!

strictBindCallApply

Ensures that Function.prototype.bind, .call, and .apply are typed correctly rather than returning any. Without this, calling fn.call(thisArg, wrongArgs) produces no error.
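
A small sketch of what the flag checks, using a hypothetical greet function with a declared this type:

```typescript
// Hypothetical function with an explicit `this` parameter
function greet(this: { name: string }, punctuation: string): string {
  return `Hello, ${this.name}${punctuation}`;
}

const user = { name: "Alice" };

// With strictBindCallApply, arguments to .call/.apply/.bind are checked
// against greet's real signature instead of being treated as any:
const called = greet.call(user, "!");    // string
// greet.call(user, 42);                 // ❌ number is not assignable to string
// greet.apply(user, ["!", "?"]);        // ❌ too many arguments

const bound = greet.bind(user);          // typed as (punctuation: string) => string
const fromBound = bound("?");
```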

strictPropertyInitialization

Requires that class properties declared without a ? modifier are assigned either in the declaration or in the constructor. This catches a common bug where you declare a property, forget to initialize it, and end up with undefined at runtime.

typescript
class User {
  name: string;  // Error: not assigned in constructor
  age!: number;  // OK — definite assignment assertion (use sparingly)

  constructor(name: string) {
    this.name = name;
    // 'age' is not assigned, but the '!' suppresses the check
  }
}

noUncheckedIndexedAccess

This flag is not included in "strict": true, but many teams consider it essential. It adds undefined to the type of any index signature access, forcing you to check before using the value. Without it, accessing obj[key] on a Record<string, number> gives you number — even though the key might not exist.

typescript
const scores: Record<string, number> = { alice: 95 };

// Without noUncheckedIndexedAccess
const bobScore = scores["bob"]; // type: number (wrong — it's undefined!)

// With noUncheckedIndexedAccess
const checkedScore = scores["bob"]; // type: number | undefined
if (checkedScore !== undefined) {
  console.log(checkedScore.toFixed(2)); // safe
}

exactOptionalPropertyTypes

Introduced in TypeScript 4.4, this flag distinguishes between "property is missing" and "property is explicitly set to undefined". With it enabled, you can't assign undefined to a property marked ?: — you must omit it entirely.

typescript
interface Config {
  debug?: boolean;
}

// With exactOptionalPropertyTypes
const bad: Config = { debug: undefined }; // Error!
const good: Config = {};                  // OK — property is absent

Migration strategy for strict flags

Don't enable all strict flags at once on a legacy codebase. Enable them one by one in this order: strictNullChecksstrictFunctionTypesstrictBindCallApplystrictPropertyInitializationnoUncheckedIndexedAccess. Each flag has a decreasing surface area of breakage, so you fix the most impactful issues first.
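
A sketch of an intermediate tsconfig during such a migration (the flag names are real; the staged combination is the assumption):

```jsonc
{
  "compilerOptions": {
    // Goal state is "strict": true; until then, opt in flag by flag:
    "strict": false,
    "strictNullChecks": true,       // step 1: biggest payoff, most breakage
    "strictFunctionTypes": true     // step 2
    // next: strictBindCallApply, strictPropertyInitialization,
    // and finally noUncheckedIndexedAccess
  }
}
```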

Module Resolution: The Most Confusing Part of TypeScript

Three options — module, moduleResolution, and moduleDetection — interact to determine how TypeScript finds and interprets your imports. Getting these wrong leads to "Cannot find module" errors that feel impossible to debug. The correct combination depends entirely on your runtime and build target.

module

Controls the output format of your emitted JavaScript modules. This tells TypeScript what kind of import/export/require syntax to produce. Common values: commonjs, es2015/es2020/es2022/esnext, node16/nodenext, preserve.

moduleResolution

Controls the algorithm TypeScript uses to find the file behind an import specifier. This is separate from module — one is about the output, the other is about the lookup strategy. Values: node10 (the legacy default), node16/nodenext, bundler.

moduleDetection

Determines how TypeScript decides whether a file is a module or a script. Set to "force" in most modern projects so every file is treated as a module (even without top-level import/export). The default "auto" can cause subtle bugs when a utility file has no imports or exports.
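
A sketch of the pitfall, assuming a standalone utility file:

```typescript
// utils.ts: a file with no import/export anywhere.
// Under moduleDetection: "auto", such a file is a *script*: its top-level
// declarations become global, so a `const name = ...` here would collide
// with the global `name` declared by lib.dom.
const double = (n: number): number => n * 2;

// Under moduleDetection: "force" (or by adding this empty export),
// the file is always a module and `double` stays file-scoped.
export {};
```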

The Correct Combinations

| Project Target | module | moduleResolution | Notes |
| --- | --- | --- | --- |
| Node.js (CJS) | commonjs | node10 | Legacy but still common. Doesn't support exports in package.json. |
| Node.js (ESM or dual) | node16 / nodenext | node16 / nodenext | Enforces .js extensions in imports. Reads exports map. Must match each other. |
| Bundler (Vite, webpack, esbuild) | esnext / preserve | bundler | No extension enforcement. Supports exports map. Most flexible for apps. |

Don't mix node16 module with bundler resolution

Using "module": "node16" with "moduleResolution": "bundler" is an invalid combination. TypeScript will either error or silently behave incorrectly. When using node16/nodenext for module, the moduleResolution must match. The bundler resolution is designed for "module": "esnext" or "module": "preserve".

Path Mapping: paths and baseUrl

Path aliases let you replace deeply nested relative imports like ../../../utils/logger with clean paths like @/utils/logger. You configure these with the paths option, and optionally baseUrl.

json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"],
      "@components/*": ["./src/components/*"]
    }
  }
}

Here's the critical point that trips people up: TypeScript does not rewrite these paths in the emitted JavaScript. The paths option only affects type resolution during compilation. Your output .js files will still contain import ... from "@/utils/logger", which Node.js doesn't understand.

You need a separate mechanism to resolve these at runtime:

  • Bundlers (Vite, webpack, esbuild) — configure corresponding aliases in their config
  • tsc-alias — a post-compilation tool that rewrites paths in emitted .js files
  • Node.js subpath imports — use the "imports" field in package.json (works natively, no extra tooling needed)

Monorepo Project References: composite and references

In a monorepo, you typically have multiple packages that depend on each other. TypeScript's project references let you build them incrementally — only recompiling packages whose source files actually changed.

The setup requires two things: the referenced project must set "composite": true, and the consuming project must list it in "references".

jsonc
// packages/shared/tsconfig.json
{
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "declarationMap": true,
    "outDir": "./dist"
  }
}

// packages/app/tsconfig.json
{
  "compilerOptions": { "outDir": "./dist" },
  "references": [
    { "path": "../shared" }
  ]
}

When "composite": true is set, TypeScript enforces that "declaration" is also true (it needs .d.ts files to type-check dependents without re-parsing source). You then build with tsc --build (or tsc -b), which compiles projects in dependency order and skips up-to-date ones using .tsbuildinfo files.

Library Authoring: declaration, declarationMap, sourceMap

If you're publishing a library to npm, the compiled output needs to include type information so consumers get autocompletion and type checking. Three options control this:

  • declaration — emits .d.ts files alongside .js files. Consumers use these for type information.
  • declarationMap — emits .d.ts.map files that link declaration positions back to the original .ts source. This enables "Go to Definition" to jump to your TypeScript source rather than the .d.ts file.
  • sourceMap — emits .js.map files linking compiled JavaScript back to TypeScript source. Required for debugging in tools like VS Code, Chrome DevTools, or error-reporting services.

For library authoring, enable all three. The cost is slightly larger package size, but the developer experience for consumers is vastly better.
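
As a sketch, the three flags together in a library tsconfig (a minimal fragment, not a complete config):

```jsonc
{
  "compilerOptions": {
    "declaration": true,     // emit .d.ts beside each .js for consumers
    "declarationMap": true,  // .d.ts.map: Go to Definition lands in .ts source
    "sourceMap": true        // .js.map: debuggers map output back to .ts
  }
}
```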

skipLibCheck vs. skipDefaultLibCheck

These two flags are commonly confused, and one of them is far more aggressive than it appears.

| Option | What it skips | When to use |
| --- | --- | --- |
| skipLibCheck | Skips type checking of all .d.ts files — your own, third-party, and built-in | When you need fast builds and accept the risk of hidden type errors in declarations |
| skipDefaultLibCheck | Skips type checking of only the built-in lib.*.d.ts files (e.g., lib.dom.d.ts) | When built-in lib conflicts cause issues but you still want to check third-party types |

In practice, nearly every project uses "skipLibCheck": true because checking all .d.ts files is slow and often surfaces errors in third-party packages you can't fix anyway. But be aware: it means your own .d.ts output files (if you emit them) are also unchecked. Bugs in your type declarations can ship silently.

Transpiler Compatibility: isolatedModules, verbatimModuleSyntax, isolatedDeclarations

These flags exist because modern build tools like esbuild, swc, and Babel transpile each file independently — they don't perform cross-file type analysis like tsc. Certain TypeScript patterns are impossible to compile without whole-program knowledge, and these flags prevent you from writing them.

isolatedModules

The original flag for single-file transpiler safety. It disallows patterns that require cross-file information to emit correct JavaScript. Key restrictions: no const enum across files (the transpiler can't inline values it can't see), no re-exporting types without the type keyword, and no files that aren't modules.

typescript
// ❌ Error with isolatedModules: re-exporting a type needs 'type' keyword
export { SomeType } from "./types";

// ✅ Correct
export type { SomeType } from "./types";

// ❌ Error: const enums can't be inlined by single-file transpilers
const enum Direction { Up, Down, Left, Right }

// ✅ Use regular enum or union type instead
enum Direction { Up, Down, Left, Right }

verbatimModuleSyntax (TS 5.0+)

The modern replacement for isolatedModules combined with importsNotUsedAsValues and preserveValueImports (both now deprecated). The rule is simple: anything imported with the type keyword is dropped entirely from the output; everything else is preserved exactly as written.

typescript
import type { User } from "./models";    // completely erased in output
import { formatDate } from "./utils";     // preserved in output
import { type Role, fetchUser } from "./api"; // Role erased, fetchUser kept

isolatedDeclarations (TS 5.5+)

This is the newest addition and enables parallel declaration file generation. Just as isolatedModules requires each file to be transpilable in isolation, isolatedDeclarations requires each file to have enough type annotations that its .d.ts can be generated without cross-file inference. In practice, this means you need explicit return types on exported functions.

typescript
// ❌ Error with isolatedDeclarations: return type must be explicit
export function createUser(name: string) {
  return { id: crypto.randomUUID(), name, createdAt: new Date() };
}

// ✅ Explicit return type allows parallel .d.ts generation
export function createUser(name: string): {
  id: string;
  name: string;
  createdAt: Date;
} {
  return { id: crypto.randomUUID(), name, createdAt: new Date() };
}

noEmit: Type-Checking Only

When your build pipeline uses a separate tool for transpilation (esbuild, swc, Vite, Next.js), TypeScript's job becomes purely type checking. Setting "noEmit": true tells tsc to not produce any output files — no .js, no .d.ts, no source maps.

This is the standard setup for modern application projects. You run tsc --noEmit in CI as a type-check step, while your bundler handles the actual compilation. It's fast because tsc skips all the emit work.

json
{
  "compilerOptions": {
    "noEmit": true,
    "isolatedModules": true,
    "verbatimModuleSyntax": true,
    "moduleResolution": "bundler",
    "module": "esnext"
  }
}
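On the scripts side, this typically pairs with something like the following package.json (the script names and the Vite build command are illustrative):

```json
{
  "scripts": {
    "build": "vite build",
    "typecheck": "tsc --noEmit"
  }
}
```

CI runs `npm run typecheck` and `npm run build` as independent steps, so a type error fails the pipeline even though the bundler itself does not type-check.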

Compiler Target vs. Runtime Polyfills: target and lib

These two options are often confused. They control different things entirely.

target controls syntax downleveling. It tells TypeScript what JavaScript version to emit. Setting "target": "es2015" means tsc will downlevel async/await (ES2017) into generator-based code, but leave arrow functions (ES2015) as-is. Setting "target": "es2022" preserves top-level await and class fields natively.

lib controls which type declarations are available. It determines what APIs TypeScript considers to exist in your runtime. Setting "lib": ["es2022", "dom"] means you get type definitions for Promise, Array.prototype.at(), structuredClone(), and all DOM APIs.

target does not polyfill APIs

Setting "target": "es5" downlevels syntax (arrow functions, classes, destructuring), but it does not polyfill APIs like Promise, Map, Array.from(), or fetch. If you target an older runtime, you must provide polyfills yourself (e.g., via core-js). Conversely, lib only adds type definitions — it doesn't make the API actually exist at runtime.
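A small illustration of the split, assuming "lib": ["es2022"] — Array.prototype.at() type-checks because of lib, but it only exists at runtime if the engine (or a polyfill) provides it:

```typescript
const scores = [81, 92, 73];

// `lib` supplies the *declaration* for .at(), so this type-checks.
// Whether it *runs* depends on the engine, not on `target`:
const last = scores.at(-1); // typed as number | undefined
console.log(last); // 73 on any ES2022-capable runtime

// On an older engine with no polyfill, the call above throws a
// TypeError at runtime even though tsc compiled it without complaint.
```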

Recommended Configurations

Below are battle-tested starting points for the three most common project types. Adjust individual flags based on your specific constraints.

Application (Vite / esbuild / webpack)

json
{
  "compilerOptions": {
    "target": "es2022",
    "lib": ["es2023", "dom", "dom.iterable"],
    "module": "esnext",
    "moduleResolution": "bundler",
    "moduleDetection": "force",
    "strict": true,
    "noUncheckedIndexedAccess": true,
    "noEmit": true,
    "isolatedModules": true,
    "verbatimModuleSyntax": true,
    "skipLibCheck": true,
    "jsx": "react-jsx"
  },
  "include": ["src"]
}

Type-check only — your bundler (Vite, esbuild, webpack) handles emit. The bundler module resolution lets you skip file extensions in imports. noEmit means tsc produces zero output files.

Library Published to npm

json
{
  "compilerOptions": {
    "target": "es2020",
    "lib": ["es2020"],
    "module": "node16",
    "moduleResolution": "node16",
    "strict": true,
    "noUncheckedIndexedAccess": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true,
    "outDir": "./dist",
    "isolatedModules": true,
    "verbatimModuleSyntax": true,
    "skipLibCheck": true
  },
  "include": ["src"]
}

Emits JS, .d.ts, and source maps. Uses node16 resolution to enforce correct ESM conventions — file extensions in imports, proper exports map support. Set target conservatively based on your consumers' minimum Node.js version.

Monorepo Root (Project References)

json
{
  "compilerOptions": {
    "target": "es2022",
    "module": "node16",
    "moduleResolution": "node16",
    "strict": true,
    "noUncheckedIndexedAccess": true,
    "declaration": true,
    "declarationMap": true,
    "composite": true,
    "sourceMap": true,
    "isolatedModules": true,
    "verbatimModuleSyntax": true,
    "skipLibCheck": true
  },
  "references": [
    { "path": "./packages/shared" },
    { "path": "./packages/core" },
    { "path": "./packages/app" }
  ],
  "include": []
}

Root config for a monorepo using project references. The "include": [] means the root project itself compiles nothing — it only orchestrates builds via tsc -b. Each referenced package has its own tsconfig.json extending a shared base, with "composite": true set.

Performance: Type-Level Optimization & Project References

TypeScript's type system is extraordinarily powerful — but power has a cost. Complex types can slow your IDE to a crawl, and large codebases can take minutes to compile. This section covers two complementary performance domains: type-level optimization (making the compiler think less) and build-level optimization (making the compiler do less).

Why Types Get Slow

The TypeScript compiler resolves types by expanding them fully. When you write a deeply recursive conditional type or distribute a union across a complex mapped type, the compiler must evaluate every branch and every combination. A union of 20 string literals distributed through 3 layers of conditional types can generate thousands of intermediate type nodes.

The three most common patterns that cause type explosion are:

  • Deeply nested conditional types — Each level of recursion multiplies the work. A type that recurses 50+ levels to parse a string literal or walk a deeply nested object can hit the compiler's recursion limit or just take seconds per keystroke.
  • Large union distributions — Distributive conditional types iterate over every member of a union independently. A union of 100 members through a complex conditional means 100 separate evaluations.
  • Unresolved generics in complex positions — When the compiler can't narrow a generic early, it carries the full generic expression through every subsequent operation, deferring resolution and bloating intermediate representations.

typescript
// ❌ Expensive: deeply recursive + distributive over large unions
type DeepFlatten<T> = T extends ReadonlyArray<infer U>
  ? DeepFlatten<U>
  : T;

// With a union of 50 array types, this evaluates 50 separate
// recursive chains — each potentially many levels deep.
type Result = DeepFlatten<A | B | C | /* ... 50 members */ >;

Identifying Expensive Types with --generateTrace

Before optimizing, you need to measure. The --generateTrace flag outputs a Chrome-compatible trace file that shows exactly where the compiler spends time. Open it in chrome://tracing or Perfetto UI to see a flame chart of type checking.

bash
# Generate a trace directory with JSON files
tsc --generateTrace ./trace-output

# Then open trace.json in chrome://tracing or Perfetto UI
# Look for long bars in "checkSourceFile" and "checkExpression"

In the trace, look for type instantiations that appear thousands of times or single type checks that take hundreds of milliseconds. These are your optimization targets. The trace also shows which specific source locations trigger the expensive type work.

Type-Level Optimization Strategies

Use interface extends Instead of Type Intersections

This is one of the highest-impact changes you can make. The TypeScript compiler caches interface types but must recompute intersection types (&) every time they're encountered. If you're building up object types through intersections, switching to interface inheritance can dramatically reduce check times.

typescript
// ❌ Slow: intersection must be recomputed at each usage site
type UserWithPermissions = User & Permissions & AuditFields;

// ✅ Fast: interface is computed once and cached
interface UserWithPermissions extends User, Permissions, AuditFields {}

Avoid Unnecessary Distribution

Distributive conditional types (where T is a bare type parameter in T extends ...) automatically iterate over union members. If you don't need that behavior, wrap the type parameter in a tuple to prevent distribution.

typescript
// ❌ Distributive: evaluates once per union member
type IsArray<T> = T extends any[] ? true : false;
type R1 = IsArray<string | number[]>;  // boolean (distributes!)

// ✅ Non-distributive: wrapping in a tuple checks the union as a whole
type IsArrayStrict<T> = [T] extends [any[]] ? true : false;
type R2 = IsArrayStrict<string | number[]>;  // false (no distribution)

Use NoInfer to Reduce Inference Work

TypeScript 5.4 introduced the NoInfer<T> utility type. It tells the compiler not to use a particular parameter position for type inference, which reduces the number of inference candidates the compiler must reconcile. This is especially useful when a type parameter appears in multiple positions and you only want one of them to drive inference.

typescript
// Without NoInfer, TS tries to infer T from both `value` and `fallback`:
//   function getOrDefault<T>(value: T | null, fallback: T): T

// With NoInfer, inference is driven only by `value`
function getOrDefault<T>(value: T | null, fallback: NoInfer<T>): T {
  return value ?? fallback;
}

// Now only `value` determines T — less inference work, clearer errors
getOrDefault("hello", 42);  // Error: number is not assignable to string

Simplify Type Constraints

Overly complex generic constraints force the compiler to carry large type expressions throughout a function body. Prefer simple, named interface constraints over inline complex types. If a constraint is longer than one line, extract it into a named type or interface.
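A minimal before/after sketch of the refactor (the Trackable shape and all names here are illustrative):

```typescript
// Inline constraint — a long anonymous object type the compiler carries
// through the whole body, and readers must re-parse at every call site:
//   function describe<T extends { id: string; meta: { tags: string[];
//     owner: { name: string } } }>(item: T): string

// Named interface constraint — computed once, cached, self-documenting:
interface Trackable {
  id: string;
  meta: { tags: string[]; owner: { name: string } };
}

function describe<T extends Trackable>(item: T): string {
  return `${item.id} (owner: ${item.meta.owner.name})`;
}

const task = {
  id: "t-1",
  meta: { tags: ["infra"], owner: { name: "Dana" } },
  priority: 2, // extra properties still flow through T
};
console.log(describe(task)); // "t-1 (owner: Dana)"
```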

The Caching Rule of Thumb

Interfaces are structurally cached by the compiler. Type aliases using intersections, mapped types, or conditional types are re-evaluated each time they appear. If you notice a complex type appearing in many places, converting it to an interface (when possible) can yield significant speedups.

Build Performance: Incremental Compilation

Even with optimized types, compiling a large project from scratch is slow. Incremental compilation lets TypeScript skip re-checking files that haven't changed since the last build. Enable it with a single flag — TypeScript writes a .tsbuildinfo file that records the state of the last successful build.

json
// tsconfig.json
{
  "compilerOptions": {
    "incremental": true,
    "tsBuildInfoFile": "./dist/.tsbuildinfo"
  }
}

On subsequent builds, TypeScript reads the .tsbuildinfo file to determine which files are unchanged and skips them entirely. For a 500-file project, this can reduce rebuild times from 30 seconds to 2-3 seconds when only a few files change.

Project References for Monorepos

Incremental compilation works within a single project. Project references extend this across multiple projects in a monorepo. Each sub-project declares itself as composite and lists its dependencies via the references field. The tsc --build (tsc -b) command then builds projects in dependency order, skipping any that are already up to date.

The key benefits are: build orchestration respects dependency order automatically, each project is type-checked independently (parallelizable), and declaration files (.d.ts) serve as the boundary between projects — downstream projects never re-parse upstream source files.

graph TD
    ST["shared-types\ntsconfig.json (composite)"]
    CL["core-lib\ntsconfig.json (composite)"]
    API["api-server\ntsconfig.json"]
    WEB["web-app\ntsconfig.json"]
    CLI["cli-tool\ntsconfig.json"]
    ST --> CL
    CL --> API
    CL --> WEB
    CL --> CLI
    ST -.->|"direct ref"| API
    ST -.->|"direct ref"| WEB
    style ST fill:#4a9eff,color:#fff,stroke:#2d7fd4
    style CL fill:#6c5ce7,color:#fff,stroke:#5041b2
    style API fill:#00b894,color:#fff,stroke:#009874
    style WEB fill:#00b894,color:#fff,stroke:#009874
    style CLI fill:#00b894,color:#fff,stroke:#009874

Solid arrows indicate primary build dependencies. Dashed arrows indicate direct references where leaf projects also import shared types. The build order is: shared-types → core-lib → (api-server, web-app, cli-tool in parallel).

Setting Up the Project Structure

Here's a concrete monorepo setup. Each package has its own tsconfig.json with composite: true, and a root tsconfig.json orchestrates the build.

json
// packages/shared-types/tsconfig.json
{
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "declarationMap": true,
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src"]
}

json
// packages/core-lib/tsconfig.json
{
  "compilerOptions": {
    "composite": true,
    "declaration": true,
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "references": [
    { "path": "../shared-types" }
  ],
  "include": ["src"]
}

json
// packages/api-server/tsconfig.json
{
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "references": [
    { "path": "../shared-types" },
    { "path": "../core-lib" }
  ],
  "include": ["src"]
}

json
// tsconfig.json (root — build orchestrator)
{
  "files": [],
  "references": [
    { "path": "packages/shared-types" },
    { "path": "packages/core-lib" },
    { "path": "packages/api-server" },
    { "path": "packages/web-app" },
    { "path": "packages/cli-tool" }
  ]
}

Build everything in the correct order with a single command:

bash
# Build all projects in dependency order, skip up-to-date ones
tsc -b

# Force a clean rebuild
tsc -b --clean

# Build in watch mode
tsc -b --watch

Watch Mode & Editor Performance

In large codebases, even watch mode can feel sluggish because TypeScript re-checks all transitive dependents of a changed file. The assumeChangesOnlyAffectDirectDependencies flag tells the compiler to only re-check files that directly import the changed file, skipping transitive dependents. This trades correctness for speed — rare transitive type errors may go undetected until a full build.

json
{
  "watchOptions": {
    "assumeChangesOnlyAffectDirectDependencies": true
  }
}

For editor performance specifically, two quick wins:

  • skipLibCheck: true — Skips type-checking of all .d.ts files (including node_modules). This is almost always safe and can cut check times by 30-50% in projects with many dependencies.
  • Avoid @ts-check in large JS codebases — If you're gradually migrating a JavaScript project, applying @ts-check to every file forces the language service to type-check all of them. Migrate files to .ts incrementally instead.

VS Code–Specific Tip

If your editor is sluggish, open the TypeScript output panel (Ctrl+Shift+U → select "TypeScript") to see what the language server is doing. You can also set "typescript.tsserver.maxTsServerMemory" in VS Code settings to increase the memory limit for very large projects (default is ~3 GB).

Measuring & Benchmarking

You can't optimize what you don't measure. TypeScript provides several built-in diagnostics flags that give you hard numbers on compile performance.

bash
# Basic timing breakdown
tsc --diagnostics

# Detailed breakdown including memory, I/O, and type counts
tsc --extendedDiagnostics

# Full trace for Chrome DevTools profiling
tsc --generateTrace ./trace-output

Here's what to look for in the --extendedDiagnostics output:

| Metric | What It Tells You | Red Flag Threshold |
| --- | --- | --- |
| Check time | Time spent on type-checking alone | > 60% of total time |
| Types | Total number of type nodes created | > 100,000 in a mid-size project |
| Instantiations | Number of generic type instantiations | > 500,000 suggests type explosion |
| Memory used | Peak memory consumption | > 2 GB for a single project |
| I/O Read time | Time spent reading files from disk | High values suggest too many files in scope |

For continuous monitoring, add a CI step that runs tsc --extendedDiagnostics and parses the output. Track the Instantiations and Check time metrics over time — if they spike after a PR, you know exactly which change caused the regression.
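One way to wire this up — a sketch of a CI budget check. The threshold and file handling are illustrative; in a real pipeline, diag.txt would come from an actual tsc run rather than the sample written here:

```bash
# In CI, generate the diagnostics first:
#   tsc --extendedDiagnostics --noEmit > diag.txt
# For illustration, fake a diagnostics file with the same line format:
printf 'Instantiations:    612345\nCheck time:        4.21s\n' > diag.txt

# Extract the instantiation count and compare against a budget.
instantiations=$(awk '/^Instantiations:/ {print $2}' diag.txt)
if [ "$instantiations" -gt 500000 ]; then
  echo "Type instantiation budget exceeded: $instantiations"
fi
```

In a real pipeline you would `exit 1` inside the `if` to fail the job, and commit the budget number alongside the config so regressions are caught in review.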

Don't Optimize Prematurely

Most projects never need type-level optimization. Start with skipLibCheck: true, incremental: true, and project references. Only dive into --generateTrace and type-level rewrites when you have measurable evidence of a bottleneck — a specific file or type that shows up in the trace as disproportionately expensive.

TypeScript Compiler API & Custom Transformers

The typescript npm package is not just a CLI tool — it exposes a full programmatic API that lets you parse, analyze, transform, and emit TypeScript code. This is the same API that powers tsc, VS Code's TypeScript integration, and dozens of build tools. Mastering it unlocks capabilities like custom linters with full type awareness, compile-time code generation, and automated refactoring at scale.

This section walks through the compiler's internal architecture, shows you how to work with the AST and type system programmatically, and guides you through writing custom transformers that modify code during compilation.

The Compilation Pipeline

Before writing any compiler API code, you need a mental model of how TypeScript processes source files. The pipeline has distinct stages, each producing a data structure that the next stage consumes. Custom transformers hook into this pipeline just before the final emission step.

graph LR
    A["Source Files\n(.ts, .tsx)"] --> B["Scanner\n(Tokens)"]
    B --> C["Parser\n(AST / SourceFile)"]
    C --> D["Binder\n(Symbols)"]
    D --> E["Type Checker\n(Types)"]
    E --> F["Emitter\n(.js, .d.ts)"]
    T["Custom\nTransformers"] -.->|"before / after /\nafterDeclarations"| F
    style T fill:#f59e0b,stroke:#d97706,color:#000
    style F fill:#10b981,stroke:#059669,color:#000
    

The Scanner tokenizes raw text into a stream of tokens (keywords, identifiers, punctuation). The Parser consumes tokens and builds an Abstract Syntax Tree (AST), represented as SourceFile nodes. The Binder walks the AST and creates Symbol objects, connecting declarations to their usage sites. The Type Checker resolves all types, performs assignability checks, and populates each symbol with its resolved type. Finally, the Emitter walks the (possibly transformed) AST and produces JavaScript output, declaration files, and source maps.
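The first two stages can be poked at directly — ts.createSourceFile runs only the scanner and parser, with no binding or type checking. A small sketch:

```typescript
import * as ts from "typescript";

// Parser stage only: build an AST without creating a full Program.
const source = ts.createSourceFile(
  "demo.ts",
  "const x = 1; function greet() {}",
  ts.ScriptTarget.ES2022,
  /*setParentNodes*/ true
);

// The SourceFile is the root node; its statements are the top-level AST nodes.
for (const stmt of source.statements) {
  if (ts.isFunctionDeclaration(stmt)) console.log("function declaration");
  else if (ts.isVariableStatement(stmt)) console.log("variable statement");
}
```

Because no Binder or TypeChecker ran, there are no symbols or types here — only syntax. That is why parse-only tooling (formatters, syntax highlighters) is fast, and type-aware tooling needs the full Program described next.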

Core API Objects

Four objects form the foundation of every compiler API interaction. Understanding their relationships is essential before you write a single line of code.

| Object | What It Represents | How You Get It |
| --- | --- | --- |
| Program | The entire compilation unit — all source files plus config | ts.createProgram(files, options) |
| SourceFile | A single file's AST. The root node of the tree for that file | program.getSourceFile(path) |
| TypeChecker | The query interface for type information, symbol resolution, and diagnostics | program.getTypeChecker() |
| Node | A single AST node (function declaration, identifier, expression, etc.) | Traversing a SourceFile tree |

Creating a Program and Querying Types

The entry point to the compiler API is ts.createProgram. You pass it a list of root file names and a compiler options object, and it returns a Program that has parsed and bound all reachable files. From the program, you can obtain the TypeChecker — your window into the full type system.

typescript
import * as ts from "typescript";

// Create a program from a tsconfig or explicit file list
const program = ts.createProgram(["src/index.ts"], {
  target: ts.ScriptTarget.ES2022,
  module: ts.ModuleKind.ESNext,
  strict: true,
});

const checker = program.getTypeChecker();
const sourceFile = program.getSourceFile("src/index.ts")!;

// Walk top-level statements and inspect types
ts.forEachChild(sourceFile, (node) => {
  if (ts.isVariableStatement(node)) {
    for (const decl of node.declarationList.declarations) {
      const symbol = checker.getSymbolAtLocation(decl.name);
      if (symbol) {
        const type = checker.getTypeOfSymbolAtLocation(symbol, decl);
        const typeName = checker.typeToString(type);
        console.log(`${symbol.getName()}: ${typeName}`);
      }
    }
  }
});

This snippet creates a program, obtains the type checker, and iterates over every top-level variable declaration in src/index.ts — printing each variable's name alongside its resolved type. The TypeChecker methods like getTypeOfSymbolAtLocation and typeToString give you the same type information that VS Code shows on hover.

Walking the AST: Visitor Patterns

TypeScript provides two primary ways to traverse the AST. The simpler ts.forEachChild visits direct children of a node (one level deep). For recursive, depth-first traversal — or when you need to transform nodes — you use the visitor pattern with ts.visitNode and ts.visitEachChild.

typescript
// Simple recursive walk — find all function names in a file
function findAllFunctions(sourceFile: ts.SourceFile): string[] {
  const names: string[] = [];

  function visit(node: ts.Node) {
    if (ts.isFunctionDeclaration(node) && node.name) {
      names.push(node.name.text);
    }
    // Recurse into children
    ts.forEachChild(node, visit);
  }

  visit(sourceFile);
  return names;
}

// Visitor pattern for transformations (returns new nodes)
function visitor(ctx: ts.TransformationContext) {
  return (rootNode: ts.SourceFile): ts.SourceFile => {
    function visit(node: ts.Node): ts.Node {
      // Transform or return node as-is
      node = ts.visitEachChild(node, visit, ctx);
      return node;
    }
    return ts.visitNode(rootNode, visit) as ts.SourceFile;
  };
}

Immutable AST

The TypeScript AST is immutable. You never mutate a node in place. Instead, transformer visitors return new nodes (created with ts.factory methods) to replace existing ones. Returning the original node means "no change." This is why ts.visitEachChild returns a new subtree rather than modifying the existing one.
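A tiny sketch of the factory-plus-printer workflow — build a fresh node from scratch and render it, leaving any existing tree untouched:

```typescript
import * as ts from "typescript";

// Build the statement `const answer = 42;` entirely from factory calls.
const decl = ts.factory.createVariableStatement(
  undefined, // no modifiers
  ts.factory.createVariableDeclarationList(
    [
      ts.factory.createVariableDeclaration(
        "answer",
        undefined, // no exclamation token
        undefined, // no explicit type annotation
        ts.factory.createNumericLiteral(42)
      ),
    ],
    ts.NodeFlags.Const
  )
);

// A printer renders synthetic nodes back to source text.
const printer = ts.createPrinter();
const file = ts.createSourceFile("scratch.ts", "", ts.ScriptTarget.ES2022);
console.log(printer.printNode(ts.EmitHint.Unspecified, decl, file));
// prints: const answer = 42;
```

This is exactly what transformer visitors do on a larger scale: produce new nodes with ts.factory and return them in place of the originals.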

Custom Transformers: Before, After, and AfterDeclarations

Custom transformers are functions that receive a TransformationContext and return a visitor. They hook into the emitter phase at three points:

  • before — Runs on the AST before TypeScript's built-in transformers (like JSX to createElement, or enum to object). Your code sees the full TypeScript AST including type annotations.
  • after — Runs after TypeScript's transformers have downleveled the code. You see near-final JavaScript AST.
  • afterDeclarations — Runs on .d.ts output only. Useful for modifying emitted type declarations.

Example: Auto-Inject Logging into Functions

This transformer prepends a console.log call at the beginning of every function body, printing the function name and arguments. It operates in the before phase to access the original function names.

typescript
import * as ts from "typescript";

const addLoggingTransformer: ts.TransformerFactory<ts.SourceFile> =
  (context) => {
    return (sourceFile) => {
      function visit(node: ts.Node): ts.Node {
        if (ts.isFunctionDeclaration(node) && node.name && node.body) {
          const fnName = node.name.text;

          // Create: console.log("entering fnName", arguments)
          const logStatement = ts.factory.createExpressionStatement(
            ts.factory.createCallExpression(
              ts.factory.createPropertyAccessExpression(
                ts.factory.createIdentifier("console"),
                "log"
              ),
              undefined,
              [
                ts.factory.createStringLiteral(`entering ${fnName}`),
                ts.factory.createIdentifier("arguments"),
              ]
            )
          );

          // Prepend the log statement to the function body
          const newBody = ts.factory.createBlock(
            [logStatement, ...node.body.statements],
            true
          );

          return ts.factory.updateFunctionDeclaration(
            node,
            node.modifiers,
            node.asteriskToken,
            node.name,
            node.typeParameters,
            node.parameters,
            node.type,
            newBody
          );
        }
        return ts.visitEachChild(node, visit, context);
      }

      return ts.visitNode(sourceFile, visit) as ts.SourceFile;
    };
  };

Running Transformers Programmatically

To use a custom transformer, pass it to program.emit via the customTransformers option:

typescript
const program = ts.createProgram(["src/index.ts"], compilerOptions);

const emitResult = program.emit(
  undefined, // targetSourceFile — undefined = all files
  undefined, // writeFile — undefined = default disk write
  undefined, // cancellationToken
  false,     // emitOnlyDtsFiles
  {
    before: [addLoggingTransformer],
    after: [],
    afterDeclarations: [],
  }
);

// Check for diagnostics
const diagnostics = ts.getPreEmitDiagnostics(program)
  .concat(emitResult.diagnostics);
for (const d of diagnostics) {
  console.error(ts.flattenDiagnosticMessageText(d.messageText, "\n"));
}

Building a Custom Linter with Type Information

Unlike ESLint rules that operate on syntax alone (or use @typescript-eslint/parser for limited type info), a compiler API–based linter has access to the full type checker. This means you can write rules like "flag any function call where the return type is Promise but the result is not awaited."

typescript
function lintUnawaitedPromises(program: ts.Program) {
  const checker = program.getTypeChecker();
  const issues: { file: string; line: number; text: string }[] = [];

  for (const sourceFile of program.getSourceFiles()) {
    if (sourceFile.isDeclarationFile) continue;

    function visit(node: ts.Node) {
      // Look for expression statements (not assignments, not awaits)
      if (
        ts.isExpressionStatement(node) &&
        ts.isCallExpression(node.expression)
      ) {
        const type = checker.getTypeAtLocation(node.expression);
        const typeName = checker.typeToString(type);

        if (typeName.startsWith("Promise<")) {
          const { line } = sourceFile.getLineAndCharacterOfPosition(
            node.getStart()
          );
          issues.push({
            file: sourceFile.fileName,
            line: line + 1,
            text: `Unawaited Promise (type: ${typeName})`,
          });
        }
      }
      ts.forEachChild(node, visit);
    }

    visit(sourceFile);
  }
  return issues;
}

Integration with Build Tools

Running transformers programmatically works fine for scripts, but most projects use tsc or a bundler. Several tools bridge this gap and let you plug custom transformers into your standard build workflow.

| Tool | Approach | Config Location |
| --- | --- | --- |
| ts-patch | Patches the installed typescript package to support a plugins array in tsconfig.json | tsconfig.json compilerOptions.plugins |
| ts-loader | Webpack loader with a getCustomTransformers option | webpack.config.js |
| @swc/core | SWC plugins (Rust-based, different API) for high-perf transforms | .swcrc |
| Programmatic | Call ts.createProgram + program.emit directly in a build script | Custom script |

Using ts-patch

ts-patch is the most popular way to use custom transformers with standard tsc. After installing and patching, you declare transformers directly in tsconfig.json:

json
{
  "compilerOptions": {
    "plugins": [
      { "transform": "./transforms/add-logging.ts", "type": "program" },
      { "transform": "my-published-transformer", "afterDeclarations": true }
    ]
  }
}

bash
npm install -D ts-patch
npx ts-patch install   # patches node_modules/typescript
npx tsc                # now reads "plugins" and applies transformers

Language Service Plugins

Language service plugins extend the editor experience rather than the compilation output. They run inside VS Code (or any editor using tsserver) and can add custom completions, diagnostics, quick fixes, and hover information. The plugin API is a decorator around the existing LanguageService interface.

typescript
// A language service plugin that adds custom diagnostics
function init(modules: { typescript: typeof ts }) {
  const tsModule = modules.typescript;

  function create(info: ts.server.PluginCreateInfo) {
    const proxy = Object.create(null) as ts.LanguageService;
    const oldService = info.languageService;

    // Proxy all existing methods
    for (const k of Object.keys(oldService) as Array<keyof ts.LanguageService>) {
      (proxy as any)[k] = (...args: any[]) =>
        (oldService as any)[k](...args);
    }

    // Override getSemanticDiagnostics to add custom checks
    proxy.getSemanticDiagnostics = (fileName) => {
      const prior = oldService.getSemanticDiagnostics(fileName);
      const source = oldService.getProgram()?.getSourceFile(fileName);
      if (!source) return prior;

      const custom: ts.Diagnostic[] = [];
      // ... walk AST and push custom diagnostics
      return [...prior, ...custom];
    };

    return proxy;
  }

  return { create };
}

export = init;

Register the plugin in tsconfig.json under compilerOptions.plugins — no patching required, since language service plugins are natively supported by TypeScript.
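For example (the plugin package name here is hypothetical):

```json
{
  "compilerOptions": {
    "plugins": [{ "name": "my-language-service-plugin" }]
  }
}
```

Note that these entries affect only the editor's tsserver — running tsc on the command line ignores language service plugins entirely.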

ts-morph: A Higher-Level Alternative

The raw compiler API is powerful but verbose. Creating a simple string literal requires four nested ts.factory calls. The ts-morph library wraps the compiler API in a fluent, discoverable interface that feels closer to working with a DOM.

typescript
// Raw compiler API: find all exported interfaces and list their properties
const program = ts.createProgram(["src/models.ts"], options);
const source = program.getSourceFile("src/models.ts")!;
const checker = program.getTypeChecker();

ts.forEachChild(source, (node) => {
  if (
    ts.isInterfaceDeclaration(node) &&
    node.modifiers?.some(
      (m) => m.kind === ts.SyntaxKind.ExportKeyword
    )
  ) {
    const symbol = checker.getSymbolAtLocation(node.name)!;
    const type = checker.getDeclaredTypeOfSymbol(symbol);
    for (const prop of type.getProperties()) {
      const propType = checker.getTypeOfSymbolAtLocation(prop, node);
      console.log(
        `${symbol.getName()}.${prop.getName()}: ${checker.typeToString(propType)}`
      );
    }
  }
});

typescript
// Same task with ts-morph — drastically simpler
import { Project } from "ts-morph";

const project = new Project({ tsConfigFilePath: "tsconfig.json" });
const source = project.getSourceFileOrThrow("src/models.ts");

for (const iface of source.getInterfaces()) {
  if (!iface.isExported()) continue;
  for (const prop of iface.getProperties()) {
    console.log(
      `${iface.getName()}.${prop.getName()}: ${prop.getType().getText()}`
    );
  }
}

ts-morph is ideal for code analysis scripts, codemods, and code generation. For custom transformers that run during compilation, you still need the raw API — transformers operate on ts.Node objects that ts-morph cannot intercept at emit time.

Practical Project: Compile-Time Schema-to-Type Generator

Let's build a real tool that reads runtime JSON schemas and generates TypeScript types at compile time. This transformer scans for special marker calls and replaces them with concrete type definitions derived from JSON schema files on disk.

  1. Define the marker function and schema

    Your source code uses a placeholder call that the transformer will replace. At runtime, this function would be a no-op. At compile time, the transformer reads the referenced schema file and generates a real type.

    typescript
    // src/schemas/user.schema.json
    // { "name": "string", "age": "number", "email": "string" }
    
    // src/models.ts — before transformation
    declare function GenerateType<T>(schemaPath: string): T;
    
    type User = GenerateType<any>("./schemas/user.schema.json");
  2. Write the transformer

    The transformer finds GenerateType call expressions in type alias declarations, reads the JSON schema at the given path, and produces a TypeLiteral node with the correct properties.

    typescript
    import * as ts from "typescript";
    import * as fs from "fs";
    import * as path from "path";
    
    const schemaTypeMap: Record<string, ts.KeywordTypeSyntaxKind> = {
      string: ts.SyntaxKind.StringKeyword,
      number: ts.SyntaxKind.NumberKeyword,
      boolean: ts.SyntaxKind.BooleanKeyword,
    };
    
    function schemaToTypeTransformer(
      program: ts.Program
    ): ts.TransformerFactory<ts.SourceFile> {
      return (context) => (sourceFile) => {
        function visit(node: ts.Node): ts.Node {
          // Match: type X = GenerateType<any>("path")
          if (
            ts.isTypeAliasDeclaration(node) &&
            ts.isCallExpression(node.type as any)
          ) {
            const call = node.type as unknown as ts.CallExpression;
            const callee = call.expression;
    
            if (ts.isIdentifier(callee) && callee.text === "GenerateType") {
              const arg = call.arguments[0];
              if (ts.isStringLiteral(arg)) {
                const schemaPath = path.resolve(
                  path.dirname(sourceFile.fileName),
                  arg.text
                );
                const schema = JSON.parse(
                  fs.readFileSync(schemaPath, "utf-8")
                );
    
                // Build type literal: { name: string; age: number; ... }
                const members = Object.entries(schema).map(
                  ([key, typeStr]) =>
                    ts.factory.createPropertySignature(
                      undefined,
                      ts.factory.createIdentifier(key),
                      undefined,
                      ts.factory.createKeywordTypeNode(
                        schemaTypeMap[typeStr as string]
                          ?? ts.SyntaxKind.AnyKeyword
                      )
                    )
                );
    
                const typeLiteral =
                  ts.factory.createTypeLiteralNode(members);
    
                return ts.factory.updateTypeAliasDeclaration(
                  node,
                  node.modifiers,
                  node.name,
                  node.typeParameters,
                  typeLiteral
                );
              }
            }
          }
          return ts.visitEachChild(node, visit, context);
        }
        return ts.visitNode(sourceFile, visit) as ts.SourceFile;
      };
    }
  3. Run the transformer and verify output

    Wire everything together in a build script and inspect the emitted .d.ts to confirm the generated type is correct.

    typescript
    const program = ts.createProgram(["src/models.ts"], {
      target: ts.ScriptTarget.ES2022,
      module: ts.ModuleKind.ESNext,
      declaration: true,
      strict: true,
    });
    
    program.emit(undefined, undefined, undefined, false, {
      before: [schemaToTypeTransformer(program)],
    });
    
    // Output in dist/models.d.ts:
    // type User = { name: string; age: number; email: string; }
Transformer Stability

The TypeScript compiler API has no guaranteed stability across versions. Internal node kinds, factory methods, and emit behavior can change between minor releases. Pin your typescript dependency version, test transformers against upgrades, and prefer ts-morph for non-transformer use cases since it smooths over many API changes.

Debugging Tip

Use ts-ast-viewer.com to visualize the AST of any TypeScript snippet. It shows node kinds, properties, and the exact ts.factory calls needed to construct each node — invaluable when building transformers.
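The same kind of inspection can be done programmatically. Here is a minimal sketch (assuming the typescript package is installed) that parses a snippet in memory and lists every node kind, much like the viewer does:

```typescript
import * as ts from "typescript";

// Parse a snippet without touching the filesystem
const source = ts.createSourceFile(
  "snippet.ts",
  "type Id = string;",
  ts.ScriptTarget.Latest,
  /* setParentNodes */ true
);

// Collect the human-readable kind name of every node, depth-first
const kinds: string[] = [];
function walk(node: ts.Node): void {
  kinds.push(ts.SyntaxKind[node.kind]);
  ts.forEachChild(node, walk);
}
walk(source);

console.log(kinds);
```

Comparing this output against what ts-ast-viewer shows for the same snippet is a quick sanity check while developing a transformer.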

Integrating with JavaScript: DefinitelyTyped & Migration Strategies

TypeScript doesn't exist in a vacuum. Most real projects depend on dozens—sometimes hundreds—of JavaScript libraries that ship without type information. Bridging this gap is one of the most important practical skills in TypeScript, and it's where @types/* packages, declaration files, and migration strategies come into play.

This section walks you through the full lifecycle: consuming untyped JavaScript, gradually migrating existing codebases, and publishing your own type-safe libraries.

DefinitelyTyped and @types/* Packages

DefinitelyTyped is a community-maintained repository containing type declarations for thousands of npm packages. When you install @types/lodash, you're pulling hand-written (or auto-generated) .d.ts files that tell TypeScript about Lodash's API. The compiler automatically picks these up from node_modules/@types.

Version Alignment

The @types package version should match the major and minor version of the library it types. For example, if you use express@4.18.2, install @types/express@4.18.*. The patch version of the @types package is independent—it reflects revisions to the type definitions themselves, not the library.

bash
# Install the library and its matching types
npm install express@4.18.2
npm install --save-dev @types/express@4.18

# Check what version you have
npm ls @types/express
Bundled Types vs. DefinitelyTyped

Many modern libraries (e.g., axios, zod, date-fns) ship their own .d.ts files alongside the JavaScript. You don't need a separate @types/* package for those—TypeScript resolves the types or typings field in their package.json automatically.

Contributing and Fixing Types

When you encounter incorrect or outdated types, you have three options: submit a PR to DefinitelyTyped, patch locally with patch-package, or override with a local declaration file. For quick fixes in your own project, a local .d.ts override is often fastest.

typescript
// src/types/express-override.d.ts
// Augment Express Request to fix a missing property
import "express";

declare module "express-serve-static-core" {
  interface Request {
    userId?: string;    // added by your auth middleware
    sessionId?: string; // missing from @types/express
  }
}

Working with Untyped Libraries

Not every library has types on DefinitelyTyped. When you import a package and TypeScript complains with "Could not find a declaration file," you have a spectrum of options—from quick-and-dirty to fully typed.

The declare module Escape Hatch

The fastest way to silence the compiler is a wildcard module declaration. This gives you any for every import from that package. It's zero type safety, but it unblocks you immediately.

typescript
// src/types/untyped-libs.d.ts

// Quick escape hatch — everything is `any`
declare module "some-legacy-lib";

// Slightly better — type the parts you actually use
declare module "some-legacy-lib" {
  export function parse(input: string): Record<string, unknown>;
  export function stringify(obj: unknown): string;
}

Creating a Full Declaration File

For libraries you use heavily, writing a proper .d.ts gives you real type safety. Focus on the API surface you actually consume—you don't have to type every function the library exports. Place these files in a src/types/ directory and make sure it's included in your tsconfig.json's include or typeRoots.

typescript
// src/types/legacy-csv-parser.d.ts
declare module "legacy-csv-parser" {
  interface ParseOptions {
    delimiter?: string;
    headers?: boolean;
    skipEmptyLines?: boolean;
  }

  interface Row {
    [column: string]: string;
  }

  export function parse(csv: string, options?: ParseOptions): Row[];
  export function parseFile(path: string, options?: ParseOptions): Promise<Row[]>;
}
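For TypeScript to pick these files up, the directory must be covered by your tsconfig.json include globs. A minimal sketch, assuming the src/types/ layout above:

```json
{
  "compilerOptions": {
    "strict": true
  },
  "include": ["src/**/*"]
}
```

Note that typeRoots is a different mechanism: it replaces the default node_modules/@types lookup and expects package-shaped folders, so plain .d.ts files are usually easier to reach via include.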

Type-Safe Wrappers Around Untyped Code

Rather than sprinkling any casts throughout your codebase, isolate the untyped boundary in a wrapper module. Your application code imports the wrapper (which has full types), and only the wrapper touches the untyped library. This contains the "unsafe zone" to a single file.

typescript
// src/lib/chart-wrapper.ts
// @ts-expect-error — no types available for legacy-charts
import charts from "legacy-charts";

interface ChartConfig {
  type: "bar" | "line" | "pie";
  data: number[];
  labels: string[];
}

interface ChartInstance {
  render(container: HTMLElement): void;
  destroy(): void;
}

export function createChart(config: ChartConfig): ChartInstance {
  // All `any` usage is confined here
  const instance = charts.create(config.type, {
    values: config.data,
    categories: config.labels,
  });
  return {
    render: (el: HTMLElement) => instance.mount(el),
    destroy: () => instance.teardown(),
  };
}

Migration Strategies: JavaScript to TypeScript

There is no single "right" way to migrate a JavaScript codebase to TypeScript. The best approach depends on your team size, codebase age, test coverage, and risk tolerance. Here are the three most common strategies.

Strategy 1: Incremental Migration

This is the safest and most common approach. You enable allowJs in tsconfig.json so TypeScript and JavaScript files coexist. Then you rename files from .js to .ts one at a time, fixing type errors as you go. There's no big switch—the project is always buildable.

  1. Set up tsconfig.json with allowJs

    Start with a permissive config. allowJs lets TypeScript compile .js files. checkJs optionally type-checks them too (using inferred types and JSDoc).

    json
    {
      "compilerOptions": {
        "allowJs": true,
        "checkJs": false,
        "outDir": "./dist",
        "target": "ES2020",
        "module": "commonjs",
        "strict": false
      },
      "include": ["src/**/*"]
    }
  2. Rename files one at a time

    Start with leaf modules (files that don't import much). Rename utils.js → utils.ts, fix the errors, and commit. Repeat outward toward entry points.

  3. Turn on checkJs for remaining JS files

    Once most files are converted, enable checkJs: true to catch type errors in the remaining JavaScript. Use // @ts-nocheck at the top of any file that's too noisy to fix yet.

  4. Enable strict mode

    After all files are .ts, flip "strict": true. This is the finish line—your codebase now has full type coverage.

Strategy 2: Strict-Gradually

Instead of migrating files one at a time, you convert the entire codebase to .ts at once—but with a very loose tsconfig. Then you enable stricter compiler options one at a time, fixing errors in batches. This works well when you want all files in TypeScript quickly but can't afford to fix everything at once.

json
{
  "compilerOptions": {
    "strict": false,
    "noImplicitAny": false,
    "strictNullChecks": false,
    "strictFunctionTypes": false,
    "strictBindCallApply": false,
    "strictPropertyInitialization": false,
    "noImplicitThis": false,
    "alwaysStrict": false
  }
}

Enable them in this recommended order: noImplicitAny → strictNullChecks → strictFunctionTypes → strictBindCallApply → noImplicitThis → strictPropertyInitialization → alwaysStrict. Each step produces a focused batch of errors. Once all are enabled, replace the individual flags with "strict": true.
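To see the kind of batch each flag surfaces, here's a hypothetical function that compiles with strictNullChecks off but gets flagged as soon as the option is enabled:

```typescript
// With strictNullChecks off, `s.charAt(0)` would compile even though
// `s` may be undefined. Enabling the flag forces explicit handling.
function firstChar(s: string | undefined): string {
  // return s.charAt(0);     // ❌ error once strictNullChecks is on
  return s?.charAt(0) ?? ""; // ✅ optional chaining satisfies the flag
}

console.log(firstChar("abc"));     // "a"
console.log(firstChar(undefined)); // ""
```

Each flag you enable produces a similar, mechanical class of fix, which is what makes the batches reviewable.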

Strategy 3: Big-Bang

Rename everything, enable strict, and fix all errors in one PR. This only works for small codebases (a few thousand lines) or when you have excellent test coverage to catch regressions. The advantage is zero interim weirdness—you go from JavaScript to strict TypeScript in one commit. The downside is a massive, hard-to-review pull request.

JSDoc Annotations: Types Without .ts Files

You don't have to rename a single file to get TypeScript's benefits. When checkJs is enabled, TypeScript reads JSDoc annotations in .js files and uses them for type checking. This is a legitimate long-term strategy for teams that can't or won't adopt .ts files—not just a stepping stone.

javascript
// @ts-check

/**
 * @typedef {Object} User
 * @property {string} id
 * @property {string} name
 * @property {string} [email]  — optional property
 */

/**
 * Fetches a user by ID.
 * @param {string} id
 * @returns {Promise<User>}
 */
async function getUser(id) {
  const res = await fetch(`/api/users/${id}`);
  return /** @type {User} */ (await res.json());
}

/**
 * Generic identity function.
 * @template T
 * @param {T} value
 * @returns {T}
 */
function identity(value) {
  return value;
}

Key JSDoc tags for TypeScript: @type for inline type annotations, @param and @returns for function signatures, @typedef for reusable type definitions, and @template for generics. Add // @ts-check at the top of each file to opt in to type checking, or set "checkJs": true in tsconfig.json for all files.
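A few of these tags in action — a hypothetical counter module using @type for a variable annotation and @param/@returns for the function signature, all checked without a single .ts file:

```javascript
// @ts-check

/** @type {Map<string, number>} */
const counts = new Map();

/**
 * Increment the count for a key.
 * @param {string} key
 * @returns {number} the new count
 */
function bump(key) {
  const next = (counts.get(key) ?? 0) + 1;
  counts.set(key, next);
  return next;
}

bump("a");
console.log(bump("a")); // 2
```

With checkJs enabled, passing a number to bump or assigning a string into counts is a compile error, exactly as it would be in a .ts file.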

Type Assertions and Escape Hatches

Even in well-typed codebases, you sometimes know more about a value's type than TypeScript does. Assertions and escape hatches let you override the compiler—but each one is a potential bug if your assumption is wrong. Use them deliberately and sparingly.

| Escape Hatch | Syntax | When to Use | Risk Level |
| --- | --- | --- | --- |
| as T | value as string | When you have runtime evidence the type is correct | Medium |
| Double assertion | value as unknown as T | When types are structurally incompatible but you know it's safe | High |
| ! (non-null) | map.get(key)! | When you've guarded against null in a way TS can't see | Medium |
| @ts-expect-error | Comment above line | Intentionally using incorrect types (e.g., testing error paths) | Low-Medium |
| @ts-ignore | Comment above line | Legacy code you can't fix yet | High |
typescript
// ✅ Simple assertion — you parsed JSON and know the shape
const config = JSON.parse(raw) as AppConfig;

// ✅ Non-null assertion — you just checked .has()
const users = new Map<string, User>();
if (users.has(id)) {
  const user = users.get(id)!; // safe: we checked has()
}

// ⚠️ Double assertion — last resort for incompatible types
const handler = legacyCallback as unknown as ModernHandler;

// ✅ @ts-expect-error — testing that a function rejects bad input
// @ts-expect-error — passing wrong type intentionally
const result = parseAge("not a number");
Prefer @ts-expect-error over @ts-ignore

@ts-expect-error will raise a compiler error if the line it suppresses stops having an error—meaning you'll be notified when the fix lands and the suppression is no longer needed. @ts-ignore stays silent forever, hiding potential regressions.
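A pattern that often removes the need for as entirely is a runtime type guard: instead of asserting the shape of parsed JSON, check it. A sketch using a hypothetical AppConfig shape:

```typescript
interface AppConfig {
  port: number;
  host: string;
}

// A type guard replaces the `as AppConfig` cast with a check
// the compiler can actually verify.
function isAppConfig(value: unknown): value is AppConfig {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Record<string, unknown>).port === "number" &&
    typeof (value as Record<string, unknown>).host === "string"
  );
}

const raw = '{"port": 8080, "host": "localhost"}';
const parsed: unknown = JSON.parse(raw);

if (isAppConfig(parsed)) {
  console.log(parsed.port); // narrowed to AppConfig here
}
```

The guard costs a few lines but turns a silent assumption into a checked one — malformed input fails the guard at runtime instead of corrupting typed code downstream.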

Publishing a Library with Types

If you maintain a library, shipping types alongside your code saves consumers from hunting for @types/* packages. TypeScript looks at the types (or typings) field in package.json to find your declaration entry point.

Package Configuration

json
{
  "name": "my-awesome-lib",
  "version": "2.1.0",
  "main": "./dist/cjs/index.js",
  "module": "./dist/esm/index.js",
  "types": "./dist/types/index.d.ts",
  "exports": {
    ".": {
      "import": {
        "types": "./dist/types/index.d.ts",
        "default": "./dist/esm/index.js"
      },
      "require": {
        "types": "./dist/types/index.d.cts",
        "default": "./dist/cjs/index.cjs"
      }
    }
  },
  "files": ["dist"]
}

The exports field with conditional types entries ensures both ESM (import) and CJS (require) consumers get correct type resolution. The types condition must come first within each export condition—TypeScript stops at the first match.

Generating Declarations

Set "declaration": true and "declarationDir": "./dist/types" in your tsconfig.json. For libraries that use a bundler (Rollup, esbuild) for the JavaScript output, run tsc --emitDeclarationOnly as a separate step to generate just the .d.ts files.

json
{
  "compilerOptions": {
    "declaration": true,
    "declarationDir": "./dist/types",
    "emitDeclarationOnly": true,
    "declarationMap": true,
    "sourceMap": true
  }
}

Enable declarationMap so consumers can "Go to Definition" and land in your .ts source rather than the generated .d.ts. This is a small touch that dramatically improves the developer experience for users of your library.
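Wiring the two-step build into package.json scripts might look like the following sketch (the script names and tsconfig.build.json are illustrative, assuming Rollup handles the JavaScript output):

```json
{
  "scripts": {
    "build:js": "rollup -c",
    "build:types": "tsc -p tsconfig.build.json --emitDeclarationOnly",
    "build": "npm run build:js && npm run build:types"
  }
}
```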

The Role of lib.d.ts and Environment Types

TypeScript ships with built-in declaration files (called "libs") that describe JavaScript's standard library and host environments. The lib compiler option controls which of these are included. By default, it's set based on your target, but you should set it explicitly when your code runs in a specific environment.

json
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022", "DOM", "DOM.Iterable"]
  }
}

A browser-targeted project: includes browser globals like document, window, fetch, and the DOM element types.

json
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022"]
  }
}

A Node.js project: no DOM types. Node-specific globals such as process and Buffer come from @types/node rather than from lib, so install @types/node separately.

json
{
  "compilerOptions": {
    "target": "ES2020",
    "lib": ["ES2020"]
  }
}

For code that must run in both Node and browser, include only ECMAScript libs. This ensures you don't accidentally use document in shared code.

Common lib values include ES2015 through ES2023 (each adds that year's language features), DOM (browser APIs), DOM.Iterable (iterable DOM collections like NodeList), WebWorker (service/web worker globals), and ESNext (bleeding-edge proposals). You can also include granular libs like ES2021.String to add only String.prototype.replaceAll without pulling in the full ES2021 set.

Catch Environment Mistakes at Compile Time

Omitting "DOM" from lib in a Node.js project is intentional — it means document.querySelector() will produce a compile error if someone accidentally uses it in server code. This is a feature, not a bug.