Data Models and Validation with Zod
A good data modeling strategy should:
- Validate at runtime, not just compile time — TypeScript types disappear at runtime; malformed API responses silently corrupt your app
- Catch data issues early — missing or malformed data often goes unnoticed until it causes a bug downstream
- Avoid over-permissive optionals — marking fields optional because they’re sometimes omitted hides real data integrity issues
- Stay consistent across frontend and backend — drift between client expectations and server responses is a common source of bugs
- Transform data at boundaries — dates arrive as ISO strings but should be Date objects in your code
Solution Overview
- Shared schema package — define all models as Zod schemas in a shared package; both frontend and backend import from the same source
- Validate at API boundaries — parse incoming and outgoing data on both sides to catch issues immediately
- Different models for different purposes — separate schemas for summaries vs full objects, and for create vs patch vs fetch, to avoid lax validations that mask problems
- Types derived from schemas — never define TypeScript types separately; always infer from Zod to maintain a single source of truth
- Transform during parse — convert wire formats (ISO strings) to runtime types (Date objects) automatically
Single Source of Truth
All API models are defined as Zod schemas in packages/shared/src/schemas/. Both webapp and service import types and schemas from this package, eliminating type drift.
packages/shared/src/schemas/
├── common.ts # Enums, Address, GPS, DateRange
├── location.ts # Location, LocationSummary, DTOs
├── user.ts # User, Doctor, MedicalStaff, DTOs
├── job.ts # Job, JobSummary, DTOs
├── job-application.ts # JobApplication, DTOs
├── contact.ts # ContactInquiry, DTOs
└── index.ts # Re-exports all schemas + types
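As a sketch of what the barrel file can contain (the exact contents here are an assumption), index.ts simply re-exports every schema module:
// packages/shared/src/schemas/index.ts (sketch)
export * from "./common";
export * from "./location";
export * from "./user";
export * from "./job";
export * from "./job-application";
export * from "./contact";
Webapp and service code then import from the package root, e.g. import { JobSchema, type Job } from "@myapp/shared", rather than from individual files, so both sides resolve the same definitions.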
Type Inference
Always infer TypeScript types from Zod schemas:
export const JobSchema = z.object({ ... });
export type Job = z.infer<typeof JobSchema>;
Never define types separately from schemas — this defeats the purpose of having a single source of truth.
Naming Conventions
| Schema | Type | Description |
|---|---|---|
| XxxSchema | Xxx | Full entity |
| XxxSummarySchema | XxxSummary | List/embedded version |
| CreateXxxSchema | CreateXxxDto | POST request body |
| PatchXxxSchema | PatchXxxDto | PATCH request body |
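Applied to the Job entity, the convention produces names like the following (an illustrative sketch; the real Job schemas appear later in this section):
// Naming convention applied to the Job entity (sketch)
export const JobSummarySchema = z.object({ id: z.string().uuid(), title: z.string() /* ... */ });
export type JobSummary = z.infer<typeof JobSummarySchema>;   // list/embedded version
export const JobSchema = JobSummarySchema.extend({ description: z.string() /* ... */ });
export type Job = z.infer<typeof JobSchema>;                 // full entity
export const CreateJobSchema = JobSchema.omit({ id: true });
export type CreateJobDto = z.infer<typeof CreateJobSchema>;  // POST request body
export const PatchJobSchema = CreateJobSchema.partial();
export type PatchJobDto = z.infer<typeof PatchJobSchema>;    // PATCH request body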
Schemas Also Transform
Zod schemas do more than validate — they can transform data from its wire format to the representation you want in code.
Enums
Use native TypeScript enums for values that need runtime iteration (e.g., Object.values()):
export enum DocumentationType {
POLICE_CHECK = "Police Check",
WORK_VISA = "Work Visa",
// ...
}
export const DocumentationTypeSchema = z.nativeEnum(DocumentationType);
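Because DocumentationType is a real object at runtime, it can be iterated, e.g. to populate a select input (illustrative sketch):
// Runtime iteration over the native enum values
const documentationOptions = Object.values(DocumentationType).map((label) => ({
  value: label,
  label,
}));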
Use z.enum() for string literals that don’t need runtime iteration:
export const LocationTypeSchema = z.enum(["general-practice", "hospital"]);
export type LocationType = z.infer<typeof LocationTypeSchema>;
Timestamps
Timestamps use DateTimeSchema which transforms ISO strings to Date objects on parse:
// Wire format: ISO 8601 string ("2025-01-15T00:00:00.000Z")
// In code: native Date object
export const DateTimeSchema = z
.string()
.datetime()
.transform((str) => new Date(str));
export const NullableDateTimeSchema = z
.string()
.datetime()
.nullable()
.transform((str) => (str ? new Date(str) : null));
export const EntityTimestampsSchema = z.object({
createdAt: DateTimeSchema, // Date in code
updatedAt: DateTimeSchema, // Date in code
});
Serialization: JSON.stringify() automatically converts Date objects back to ISO strings via Date.toJSON(), so no manual conversion is needed when sending requests.
const job: Job = { createdAt: new Date(), ... };
JSON.stringify(job); // createdAt becomes "2025-01-15T00:00:00.000Z"
Date Strings
Use DateStringSchema for date-only values (no time component) — these remain as strings:
// Stays as string - no transform
export const DateStringSchema = z.string().regex(/^\d{4}-\d{2}-\d{2}$/);
// Example: date ranges use date strings
export const DateRangeSchema = z.object({
startDate: DateStringSchema, // "2025-03-01"
endDate: DateStringSchema, // "2025-03-31"
});
| Schema | Wire Format | In Code |
|---|---|---|
| DateStringSchema | "2025-01-15" | string |
| DateTimeSchema | "2025-01-15T00:00:00.000Z" | Date |
| NullableDateTimeSchema | "2025-01-15T00:00:00.000Z" or null | Date \| null |
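In practice the three schemas behave as follows on parse (example values):
DateTimeSchema.parse("2025-01-15T00:00:00.000Z"); // => Date instance
NullableDateTimeSchema.parse(null);               // => null
DateStringSchema.parse("2025-03-01");             // => "2025-03-01" (string, unchanged)
DateTimeSchema.parse("2025-01-15");               // throws ZodError: not a full ISO datetime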
Nullable vs Optional
- nullable(): Field exists but can be null (database columns)
- optional(): Field may be omitted (request DTOs)
- nullish(): Either null or undefined
// Database entity - field always present, may be null
publishedAt: z.string().nullable();
// Request DTO - field may be omitted
publishedAt: z.string().nullable().optional();
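nullish() is the permissive combination of the two; as a sketch, it suits bodies where a field may be omitted entirely or explicitly cleared:
// Accepts a string, null, or an omitted/undefined value
publishedAt: z.string().nullish();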
Different Models for Different Purposes
Using a single schema for all purposes leads to overly permissive definitions that hide real data integrity issues. Instead, create purpose-specific schemas.
Summary vs Full
Use different schemas for list endpoints vs detail endpoints:
| Variant | Usage | Characteristics |
|---|---|---|
| XxxSummary | List endpoints, embedded objects | Reduced fields, no nested entities |
| Xxx | Detail endpoints | All fields, includes nested entities |
// Summary - for lists (no nested objects)
export const JobSummarySchema = z.object({
id: z.string().uuid(),
title: z.string(),
hourlyRate: z.number(),
locationId: z.string().uuid(), // ID only
// ... reduced fields
});
// Full - for detail views (includes nested objects)
export const JobSchema = JobSummarySchema.extend({
description: z.string(),
location: LocationSummarySchema.optional(), // Nested object
suitability: z.array(DoctorJobSuitabilitySchema).optional(),
});
This avoids the trap of making fields optional “because they’re not always included” — instead, you have explicit schemas for each use case.
Create and Patch DTOs
Derive Create/Patch DTOs from the full schema when possible:
// Derive from full schema
export const CreateJobSchema = JobSchema.omit({
id: true,
createdAt: true,
updatedAt: true,
location: true, // Derived from locationId
suitability: true, // Computed field
});
// Patch is partial of Create
export const PatchJobSchema = CreateJobSchema.partial();
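Because PatchJobSchema marks every field optional, a sparse payload validates while wrong types are still rejected (illustrative):
PatchJobSchema.parse({ title: "Senior GP" });  // OK: all other fields omitted
PatchJobSchema.parse({ hourlyRate: "high" });  // throws ZodError: expected number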
Exception: Define the DTO explicitly when it differs significantly from the full model. Describing a schema through its differences from another schema can be more confusing than helpful when the two share little in common.
// Explicit definition - create differs significantly from fetch
export const CreateJobSchema = z.object({
locationId: z.string().uuid(),
title: z.string(),
description: z.string(),
// ... no requiredDocumentation (set on location)
// ... publishedAt is optional (defaults to null/draft)
});
Validate on Both Sides
Both frontend and backend import the same schemas from the shared package, ensuring consistent validation at every boundary.
Webapp (ApiClient)
Sending requests: Validate before sending (optional but catches errors early)
async createJob(data: CreateJobDto): Promise<Job> {
const validatedData = CreateJobSchema.parse(data); // Validate
const response = await this.fetch(`/jobs`, {
method: 'POST',
body: JSON.stringify(validatedData), // Date → ISO string automatically
});
const result = await response.json();
return JobSchema.parse(result); // Validate + transform response
}
Receiving responses: Always parse to validate and transform
async getJob(id: string): Promise<Job> {
const response = await this.fetch(`/jobs/${id}`);
const json = await response.json();
// json.createdAt is "2025-01-15T00:00:00.000Z" (string)
const job = JobSchema.parse(json);
// job.createdAt is Date object
return job;
}
For array responses, wrap the schema:
async getJobs(): Promise<Job[]> {
const response = await this.fetch(`/jobs`);
const result = await response.json();
return z.array(JobSchema).parse(result);
}
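Where a thrown ZodError is undesirable at a particular call site, Zod's standard safeParse can be used instead; it validates and transforms the same way but returns a result object (sketch):
// safeParse validates without throwing; it returns a discriminated result
const result = z.array(JobSchema).safeParse(await response.json());
if (!result.success) {
  console.error(result.error.issues); // structured list of validation problems
  return []; // or surface the error however the app prefers
}
return result.data; // Job[] with Date fields already transformed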
Service (NestJS)
Use a ZodValidationPipe:
import { CreateJobSchema, type CreateJobDto } from "@myapp/shared";
import { ZodValidationPipe } from "../common/zod-validation.pipe";
@Post()
async create(
@Body(new ZodValidationPipe(CreateJobSchema)) data: CreateJobDto
) {
return this.jobsService.create(data);
}
The pipe returns a 400 Bad Request with detailed error messages on validation failure.
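The pipe itself lives in the service (imported above from ../common/zod-validation.pipe); a minimal sketch, assuming NestJS's PipeTransform interface and Zod's safeParse, could look like this:
import { BadRequestException, Injectable, type PipeTransform } from "@nestjs/common";
import type { ZodSchema } from "zod";

@Injectable()
export class ZodValidationPipe implements PipeTransform {
  constructor(private readonly schema: ZodSchema) {}

  transform(value: unknown) {
    const result = this.schema.safeParse(value);
    if (!result.success) {
      // Surface Zod's structured issues as a 400 Bad Request
      throw new BadRequestException(result.error.issues);
    }
    return result.data; // validated (and transformed) value reaches the handler
  }
}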