How tRPC eliminates the API contract problem, works with Next.js App Router, handles auth middleware, file uploads, and subscriptions. A real-world tRPC setup from scratch.
You know the drill. You change a field name in your API response. You update the backend type. You deploy. Then the frontend breaks in production because someone forgot to update the fetch call on line 247 of Dashboard.tsx. The field is now undefined, the component renders blank, and your error tracking lights up at 2 AM.
This is the API contract problem. It's not a technology problem. It's a coordination problem. And no amount of Swagger docs or GraphQL schemas will fix the fact that your frontend and backend types can drift apart silently.
tRPC fixes this by refusing to let them drift. There's no schema file. No code generation step. No separate contract to maintain. You write a TypeScript function on the server, and the client knows its exact input and output types at compile time. If you rename a field, the frontend won't compile until you fix it.
That's the promise. Let me show you how it actually works, where it shines, and where you should absolutely not use it.
Let's look at how most teams build APIs today.
REST + OpenAPI: You write your endpoints. Maybe you add Swagger annotations. Maybe you generate a client SDK from the OpenAPI spec. But the spec is a separate artifact. It can go stale. The generation step is another thing in your CI pipeline that can break or be forgotten. And the generated types are often ugly — deeply nested paths["/api/users"]["get"]["responses"]["200"]["content"]["application/json"] monsters.
GraphQL: Better type safety, but enormous ceremony. You write a schema in SDL. You write resolvers. You generate types from the schema. You write queries on the client. You generate types from the queries. That's at least two code generation steps, a schema file, and a build step that everyone has to remember to run. For a team that already controls both frontend and backend, this is a lot of infrastructure for a problem that has a simpler solution.
Manual fetch calls: The most common approach and the most dangerous. You write fetch("/api/users"), cast the result to User[], and hope for the best. There's zero compile-time safety. The type assertion is a lie you tell TypeScript.
// The lie every frontend developer has told
const res = await fetch("/api/users");
const users = (await res.json()) as User[]; // 🙏 hope this is right

tRPC takes a different approach entirely. Instead of describing your API in a separate format and generating types, you write plain TypeScript functions on the server and import their types directly into the client. No generation step. No schema file. No drift.
Before we set anything up, let's understand the mental model.
A tRPC router is a collection of procedures grouped together. Think of it like a controller in MVC, except it's just a plain object with type inference baked in.
import { initTRPC } from "@trpc/server";
const t = initTRPC.create();
export const appRouter = t.router({
user: t.router({
list: t.procedure.query(/* ... */),
byId: t.procedure.input(/* ... */).query(/* ... */),
create: t.procedure.input(/* ... */).mutation(/* ... */),
}),
post: t.router({
list: t.procedure.query(/* ... */),
publish: t.procedure.input(/* ... */).mutation(/* ... */),
}),
});
export type AppRouter = typeof appRouter;

That AppRouter type export is the entire magic trick. The client imports this type — not the runtime code, just the type — and gets full autocompletion and type checking for every procedure.
A procedure is a single endpoint. There are three kinds: queries for reading data, mutations for writing data, and subscriptions for pushing data to the client over time.
Context is the request-scoped data available to every procedure. Database connections, the authenticated user, request headers — anything you'd put in Express's req object goes here.
Middleware transforms the context or gates access. The most common pattern is an auth middleware that checks for a valid session and adds ctx.user.
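The narrowing trick behind that pattern can be sketched without tRPC at all. A minimal, framework-free approximation (hypothetical names, not the real middleware API):

```typescript
// Framework-free sketch of middleware-style context narrowing.
// Hypothetical names; tRPC's real API is t.middleware / next({ ctx }).
type Ctx = { user: { id: string } | null };
type AuthedCtx = { user: { id: string } };

function isAuthed(ctx: Ctx): AuthedCtx {
  if (!ctx.user) throw new Error("UNAUTHORIZED");
  // The returned context has a non-nullable user type.
  return { user: ctx.user };
}

// A "protected procedure" only ever sees the narrowed context.
function me(ctx: AuthedCtx): string {
  return ctx.user.id; // safe: no null check needed here
}

console.log(me(isAuthed({ user: { id: "user-1" } }))); // user-1
```

The guard both throws at runtime and narrows the type, which is exactly what tRPC's auth middleware does at scale.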
This is the critical mental model. When you define a procedure like this:
t.procedure
.input(z.object({ id: z.string() }))
.query(({ input }) => {
// input is typed as { id: string }
return db.user.findUnique({ where: { id: input.id } });
});

The return type flows all the way to the client. If db.user.findUnique returns User | null, the client's useQuery hook will have data typed as User | null. No manual typing. No casting. It's inferred end-to-end.
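The inference machinery here is ordinary TypeScript. A toy version with no tRPC involved (getUser and the User shape are invented for illustration):

```typescript
// Toy version of end-to-end inference: the client-side data type is
// derived from the server function itself. getUser and User are
// invented for illustration; no tRPC involved.
type User = { id: string; name: string } | null;

// "Server": a plain async function.
async function getUser(id: string): Promise<User> {
  return id === "user-1" ? { id, name: "Ada" } : null;
}

// "Client": derive the data type instead of hand-writing a contract.
type GetUserData = Awaited<ReturnType<typeof getUser>>; // User | null

getUser("user-1").then((data: GetUserData) => {
  console.log(data?.name); // Ada
});
```

tRPC applies the same Awaited/ReturnType derivation across every procedure in the router, which is why there's nothing to generate or keep in sync.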
Let's build this from scratch. I'll assume you have a Next.js 14+ project with the App Router.
npm install @trpc/server @trpc/client @trpc/react-query @trpc/next @tanstack/react-query zod

Create your tRPC instance and define the context type.
// src/server/trpc.ts
import { initTRPC, TRPCError } from "@trpc/server";
import { type FetchCreateContextFnOptions } from "@trpc/server/adapters/fetch";
import superjson from "superjson";
import { ZodError } from "zod";
export async function createTRPCContext(opts: FetchCreateContextFnOptions) {
const session = await getSession(opts.req); // your auth helper (NextAuth, Lucia, etc.)
return {
session,
db: prisma, // your shared Prisma client instance
req: opts.req,
};
}
const t = initTRPC.context<typeof createTRPCContext>().create({
transformer: superjson,
errorFormatter({ shape, error }) {
return {
...shape,
data: {
...shape.data,
zodError:
error.cause instanceof ZodError ? error.cause.flatten() : null,
},
};
},
});
export const createCallerFactory = t.createCallerFactory;
export const router = t.router;
export const publicProcedure = t.procedure;

A few things to notice:
superjson transformer: tRPC serializes data as JSON by default, which means Date objects, Map, Set, and other non-JSON types get lost. superjson preserves them.

createTRPCContext: This function runs on every request. It's where you parse the session, set up the database connection, and build the context object.

With the instance in place, define your first router:

// src/server/routers/user.ts
import { z } from "zod";
import { TRPCError } from "@trpc/server";
import { router, publicProcedure, protectedProcedure } from "../trpc";
export const userRouter = router({
me: protectedProcedure.query(async ({ ctx }) => {
const user = await ctx.db.user.findUnique({
where: { id: ctx.user.id },
select: {
id: true,
name: true,
email: true,
image: true,
createdAt: true,
},
});
if (!user) {
throw new TRPCError({
code: "NOT_FOUND",
message: "User not found",
});
}
return user;
}),
updateProfile: protectedProcedure
.input(
z.object({
name: z.string().min(1).max(100),
bio: z.string().max(500).optional(),
})
)
.mutation(async ({ ctx, input }) => {
const updated = await ctx.db.user.update({
where: { id: ctx.user.id },
data: {
name: input.name,
bio: input.bio,
},
});
return updated;
}),
byId: publicProcedure
.input(z.object({ id: z.string().uuid() }))
.query(async ({ ctx, input }) => {
const user = await ctx.db.user.findUnique({
where: { id: input.id },
select: {
id: true,
name: true,
image: true,
bio: true,
},
});
if (!user) {
throw new TRPCError({
code: "NOT_FOUND",
message: "User not found",
});
}
return user;
}),
});

// src/server/routers/_app.ts
import { router } from "../trpc";
import { userRouter } from "./user";
import { postRouter } from "./post";
import { notificationRouter } from "./notification";
export const appRouter = router({
user: userRouter,
post: postRouter,
notification: notificationRouter,
});
export type AppRouter = typeof appRouter;

In the App Router, tRPC runs as a standard Route Handler. No custom server, no special Next.js plugin.
// src/app/api/trpc/[trpc]/route.ts
import { fetchRequestHandler } from "@trpc/server/adapters/fetch";
import { appRouter } from "@/server/routers/_app";
import { createTRPCContext } from "@/server/trpc";
const handler = (req: Request) =>
fetchRequestHandler({
endpoint: "/api/trpc",
req,
router: appRouter,
createContext: createTRPCContext,
onError:
process.env.NODE_ENV === "development"
? ({ path, error }) => {
console.error(
`❌ tRPC failed on ${path ?? "<no-path>"}: ${error.message}`
);
}
: undefined,
});
export { handler as GET, handler as POST };

That's it. Both GET and POST are handled. Queries go through GET (with URL-encoded input), mutations go through POST.
// src/lib/trpc.ts
import { createTRPCReact } from "@trpc/react-query";
import type { AppRouter } from "@/server/routers/_app";
export const trpc = createTRPCReact<AppRouter>();

Note: we import AppRouter as a type only. No server code leaks into the client bundle.
// src/components/providers/TRPCProvider.tsx
"use client";
import { useState } from "react";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { httpBatchLink, loggerLink } from "@trpc/client";
import { trpc } from "@/lib/trpc";
import superjson from "superjson";
function getBaseUrl() {
if (typeof window !== "undefined") return "";
if (process.env.VERCEL_URL) return `https://${process.env.VERCEL_URL}`;
return `http://localhost:${process.env.PORT ?? 3000}`;
}
export function TRPCProvider({ children }: { children: React.ReactNode }) {
const [queryClient] = useState(
() =>
new QueryClient({
defaultOptions: {
queries: {
staleTime: 5 * 60 * 1000,
retry: 1,
},
},
})
);
const [trpcClient] = useState(() =>
trpc.createClient({
links: [
loggerLink({
enabled: (op) =>
process.env.NODE_ENV === "development" ||
(op.direction === "down" && op.result instanceof Error),
}),
httpBatchLink({
url: `${getBaseUrl()}/api/trpc`,
transformer: superjson,
}),
],
})
);
return (
<trpc.Provider client={trpcClient} queryClient={queryClient}>
<QueryClientProvider client={queryClient}>
{children}
</QueryClientProvider>
</trpc.Provider>
);
}

// src/app/layout.tsx (relevant part)
import { TRPCProvider } from "@/components/providers/TRPCProvider";
export default function RootLayout({ children }: { children: React.ReactNode }) {
return (
<html>
<body>
<TRPCProvider>{children}</TRPCProvider>
</body>
</html>
);
}

// src/app/dashboard/page.tsx
"use client";
import { trpc } from "@/lib/trpc";
export default function DashboardPage() {
const { data: user, isLoading, error } = trpc.user.me.useQuery();
if (isLoading) return <DashboardSkeleton />;
if (error) return <ErrorDisplay message={error.message} />;
return (
<div>
<h1>Welcome back, {user.name}</h1>
<p>Member since {user.createdAt.toLocaleDateString()}</p>
</div>
);
}

That user.name is fully typed. If you misspell it as user.nme, TypeScript catches it immediately. If you change the server to return displayName instead of name, every client usage will show a compile error. No runtime surprises.
Context and middleware are where tRPC goes from "neat type trick" to "production-ready framework."
The context function runs on every request. Here's a real-world version:
// src/server/trpc.ts
import { type FetchCreateContextFnOptions } from "@trpc/server/adapters/fetch";
import { getServerSession } from "next-auth";
import { authOptions } from "@/lib/auth";
import { prisma } from "@/lib/prisma";
export async function createTRPCContext(opts: FetchCreateContextFnOptions) {
const session = await getServerSession(authOptions);
return {
session,
user: session?.user ?? null,
db: prisma,
headers: Object.fromEntries(opts.req.headers),
};
}
type Context = Awaited<ReturnType<typeof createTRPCContext>>;

The most common middleware pattern is separating public from protected procedures:
// src/server/trpc.ts
const isAuthed = t.middleware(async ({ ctx, next }) => {
if (!ctx.user) {
throw new TRPCError({
code: "UNAUTHORIZED",
message: "You must be logged in to perform this action",
});
}
return next({
ctx: {
// Override the context type — user is no longer nullable
user: ctx.user,
},
});
});
export const protectedProcedure = t.procedure.use(isAuthed);

After this middleware runs, ctx.user in any protectedProcedure is guaranteed to be non-null. The type system enforces this. You can't accidentally access ctx.user.id in a public procedure without TypeScript complaining.
You can compose middleware for more granular access control:
const isAdmin = t.middleware(async ({ ctx, next }) => {
if (!ctx.user) {
throw new TRPCError({ code: "UNAUTHORIZED" });
}
if (ctx.user.role !== "ADMIN") {
throw new TRPCError({
code: "FORBIDDEN",
message: "Admin access required",
});
}
return next({
ctx: {
user: ctx.user,
},
});
});
export const adminProcedure = t.procedure.use(isAdmin);

Middleware isn't just for auth. Here's a performance logging middleware:
const loggerMiddleware = t.middleware(async ({ path, type, next }) => {
const start = Date.now();
const result = await next();
const duration = Date.now() - start;
if (duration > 1000) {
console.warn(`⚠️ Slow ${type} ${path}: ${duration}ms`);
}
return result;
});
// Apply to all procedures
export const publicProcedure = t.procedure.use(loggerMiddleware);

Or a rate-limiting middleware backed by Upstash Redis:

import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(20, "10 s"),
});
const rateLimitMiddleware = t.middleware(async ({ ctx, next }) => {
const ip = ctx.headers["x-forwarded-for"] ?? "unknown";
const { success, remaining } = await ratelimit.limit(ip);
if (!success) {
throw new TRPCError({
code: "TOO_MANY_REQUESTS",
message: `Rate limit exceeded. Try again later.`,
});
}
return next();
});tRPC uses Zod for input validation. This is not optional decoration — it's the mechanism that ensures inputs are safe on both client and server.
const postRouter = router({
create: protectedProcedure
.input(
z.object({
title: z.string().min(1, "Title is required").max(200, "Title too long"),
content: z.string().min(10, "Content must be at least 10 characters"),
categoryId: z.string().uuid("Invalid category ID"),
tags: z.array(z.string()).max(5, "Maximum 5 tags").default([]),
published: z.boolean().default(false),
})
)
.mutation(async ({ ctx, input }) => {
// input is fully typed:
// {
// title: string;
// content: string;
// categoryId: string;
// tags: string[];
// published: boolean;
// }
return ctx.db.post.create({
data: {
...input,
authorId: ctx.user.id,
},
});
}),
});

Here's something subtle: the schema does double duty. At compile time, the client's calls are checked against the schema's inferred input type, so a malformed payload won't even compile. At runtime, the server validates every incoming request with Zod before your resolver runs, so hand-crafted requests can't sneak past the types.
Validation failures come back as structured errors you can map straight onto form fields:
"use client";
import { trpc } from "@/lib/trpc";
import { useState } from "react";
export function CreatePostForm() {
const [title, setTitle] = useState("");
const utils = trpc.useUtils();
const createPost = trpc.post.create.useMutation({
onSuccess: () => {
utils.post.list.invalidate();
},
onError: (error) => {
// Zod errors arrive structured
if (error.data?.zodError) {
const fieldErrors = error.data.zodError.fieldErrors;
// fieldErrors.title?: string[]
// fieldErrors.content?: string[]
console.log(fieldErrors);
}
},
});
return (
<form
onSubmit={(e) => {
e.preventDefault();
createPost.mutate({
title,
content: "...",
categoryId: "...",
});
}}
>
<input value={title} onChange={(e) => setTitle(e.target.value)} />
{createPost.error?.data?.zodError?.fieldErrors.title && (
<span className="text-red-500">
{createPost.error.data.zodError.fieldErrors.title[0]}
</span>
)}
<button type="submit" disabled={createPost.isPending}>
{createPost.isPending ? "Creating..." : "Create Post"}
</button>
</form>
);
}

Zod's richer combinators work just as well for inputs:

// Discriminated unions
const searchInput = z.discriminatedUnion("type", [
z.object({
type: z.literal("user"),
query: z.string(),
includeInactive: z.boolean().default(false),
}),
z.object({
type: z.literal("post"),
query: z.string(),
category: z.string().optional(),
}),
]);
// Pagination input reused across procedures
const paginationInput = z.object({
cursor: z.string().nullish(),
limit: z.number().min(1).max(100).default(20),
});
const postRouter = router({
infiniteList: publicProcedure
.input(
z.object({
...paginationInput.shape,
category: z.string().optional(),
sortBy: z.enum(["newest", "popular", "trending"]).default("newest"),
})
)
.query(async ({ ctx, input }) => {
const { cursor, limit, category, sortBy } = input;
const posts = await ctx.db.post.findMany({
take: limit + 1,
cursor: cursor ? { id: cursor } : undefined,
where: category ? { categoryId: category } : undefined,
orderBy:
sortBy === "newest"
? { createdAt: "desc" }
: sortBy === "popular"
? { likes: "desc" }
: { score: "desc" },
});
let nextCursor: string | undefined;
if (posts.length > limit) {
const nextItem = posts.pop();
nextCursor = nextItem?.id;
}
return {
posts,
nextCursor,
};
}),
});

Not every procedure needs input. Queries often don't:
const statsRouter = router({
// No input needed
overview: publicProcedure.query(async ({ ctx }) => {
const [userCount, postCount, commentCount] = await Promise.all([
ctx.db.user.count(),
ctx.db.post.count(),
ctx.db.comment.count(),
]);
return { userCount, postCount, commentCount };
}),
// Optional filters
detailed: publicProcedure
.input(
z
.object({
from: z.date().optional(),
to: z.date().optional(),
})
.optional()
)
.query(async ({ ctx, input }) => {
const where = {
...(input?.from && { createdAt: { gte: input.from } }),
...(input?.to && { createdAt: { lte: input.to } }),
};
return ctx.db.post.groupBy({
by: ["categoryId"],
where,
_count: true,
});
}),
});

tRPC's error handling is structured, type-safe, and integrates cleanly with both HTTP semantics and client-side UI.
import { TRPCError } from "@trpc/server";
const postRouter = router({
delete: protectedProcedure
.input(z.object({ id: z.string().uuid() }))
.mutation(async ({ ctx, input }) => {
const post = await ctx.db.post.findUnique({
where: { id: input.id },
});
if (!post) {
throw new TRPCError({
code: "NOT_FOUND",
message: "Post not found",
});
}
if (post.authorId !== ctx.user.id) {
throw new TRPCError({
code: "FORBIDDEN",
message: "You can only delete your own posts",
});
}
await ctx.db.post.delete({ where: { id: input.id } });
return { success: true };
}),
});

tRPC error codes map to HTTP status codes:
| tRPC Code | HTTP Status | When to Use |
|---|---|---|
| BAD_REQUEST | 400 | Invalid input beyond Zod validation |
| UNAUTHORIZED | 401 | Not logged in |
| FORBIDDEN | 403 | Logged in but insufficient permissions |
| NOT_FOUND | 404 | Resource doesn't exist |
| CONFLICT | 409 | Duplicate resource |
| TOO_MANY_REQUESTS | 429 | Rate limit exceeded |
| INTERNAL_SERVER_ERROR | 500 | Unexpected server error |
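tRPC performs this mapping internally when it writes the HTTP response. A miniature sketch of the contract (illustrative code, not tRPC's actual implementation):

```typescript
// Miniature version of the code-to-status mapping from the table above.
// tRPC implements this internally; this sketch is only illustrative.
const CODE_TO_HTTP_STATUS: Record<string, number> = {
  BAD_REQUEST: 400,
  UNAUTHORIZED: 401,
  FORBIDDEN: 403,
  NOT_FOUND: 404,
  CONFLICT: 409,
  TOO_MANY_REQUESTS: 429,
  INTERNAL_SERVER_ERROR: 500,
};

function httpStatusFromTRPCCode(code: string): number {
  // Unknown codes fall back to 500, the safe default for unexpected errors.
  return CODE_TO_HTTP_STATUS[code] ?? 500;
}

console.log(httpStatusFromTRPCCode("FORBIDDEN")); // 403
```

The practical consequence: you throw semantic codes in procedures and get correct HTTP semantics at the transport layer for free.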
Remember the error formatter from our setup? Here's how it works in practice:
const t = initTRPC.context<typeof createTRPCContext>().create({
transformer: superjson,
errorFormatter({ shape, error }) {
return {
...shape,
data: {
...shape.data,
zodError:
error.cause instanceof ZodError ? error.cause.flatten() : null,
// Add custom fields
timestamp: new Date().toISOString(),
requestId: crypto.randomUUID(),
},
};
},
});

On the client, you can branch on the error code to show the right UI:

"use client";
import { trpc } from "@/lib/trpc";
import { toast } from "sonner";
export function DeletePostButton({ postId }: { postId: string }) {
const utils = trpc.useUtils();
const deletePost = trpc.post.delete.useMutation({
onSuccess: () => {
toast.success("Post deleted");
utils.post.list.invalidate();
},
onError: (error) => {
switch (error.data?.code) {
case "FORBIDDEN":
toast.error("You don't have permission to delete this post");
break;
case "NOT_FOUND":
toast.error("This post no longer exists");
utils.post.list.invalidate();
break;
default:
toast.error("Something went wrong. Please try again.");
}
},
});
return (
<button
onClick={() => deletePost.mutate({ id: postId })}
disabled={deletePost.isPending}
>
{deletePost.isPending ? "Deleting..." : "Delete"}
</button>
);
}

You can set up a global error handler that catches all unhandled tRPC errors:
// In your TRPCProvider
import { TRPCClientError } from "@trpc/client";
const [queryClient] = useState(
() =>
new QueryClient({
defaultOptions: {
mutations: {
onError: (error) => {
// Global fallback for unhandled mutation errors
if (error instanceof TRPCClientError) {
if (error.data?.code === "UNAUTHORIZED") {
// Redirect to login
window.location.href = "/login";
return;
}
toast.error(error.message);
}
},
},
},
})
);

Mutations are where tRPC really integrates well with TanStack Query. Let's look at a real-world pattern: a like button with optimistic updates.
// Server
const postRouter = router({
toggleLike: protectedProcedure
.input(z.object({ postId: z.string().uuid() }))
.mutation(async ({ ctx, input }) => {
const existing = await ctx.db.like.findUnique({
where: {
userId_postId: {
userId: ctx.user.id,
postId: input.postId,
},
},
});
if (existing) {
await ctx.db.like.delete({
where: { id: existing.id },
});
return { liked: false };
}
await ctx.db.like.create({
data: {
userId: ctx.user.id,
postId: input.postId,
},
});
return { liked: true };
}),
});

The user clicks "Like." You don't want to wait 200ms for the server response to update the UI. Optimistic updates solve this: update the UI immediately, then roll back if the server rejects it.
"use client";
import { trpc } from "@/lib/trpc";
export function LikeButton({ postId, initialLiked, initialCount }: {
postId: string;
initialLiked: boolean;
initialCount: number;
}) {
const utils = trpc.useUtils();
const toggleLike = trpc.post.toggleLike.useMutation({
onMutate: async ({ postId }) => {
// Cancel outgoing refetches so they don't overwrite our optimistic update
await utils.post.byId.cancel({ id: postId });
// Snapshot the previous value
const previousPost = utils.post.byId.getData({ id: postId });
// Optimistically update the cache
utils.post.byId.setData({ id: postId }, (old) => {
if (!old) return old;
return {
...old,
liked: !old.liked,
likeCount: old.liked ? old.likeCount - 1 : old.likeCount + 1,
};
});
// Return the snapshot for rollback
return { previousPost };
},
onError: (_error, { postId }, context) => {
// Roll back to the previous value on error
if (context?.previousPost) {
utils.post.byId.setData({ id: postId }, context.previousPost);
}
},
onSettled: (_data, _error, { postId }) => {
// Always refetch after error or success to ensure server state
utils.post.byId.invalidate({ id: postId });
},
});
// Render from the cached query so the optimistic flip is actually
// visible, falling back to the server-rendered initial props.
const { data: post } = trpc.post.byId.useQuery({ id: postId });
const liked = post?.liked ?? initialLiked;
const count = post?.likeCount ?? initialCount;
return (
<button
onClick={() => toggleLike.mutate({ postId })}
className={liked ? "text-red-500" : "text-gray-400"}
>
♥ {count}
</button>
);
}

The pattern is always the same:
onMutate: Cancel queries, snapshot current data, apply optimistic update, return snapshot.

onError: Roll back using the snapshot.

onSettled: Invalidate the query so it refetches from the server, regardless of success or error.

This three-step dance ensures the UI is always responsive and eventually consistent with the server.
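The same dance can be simulated framework-free against a plain in-memory cache (hypothetical names, not TanStack Query's API):

```typescript
// Minimal simulation of the onMutate / onError / onSettled dance
// against an in-memory cache. Names are illustrative only.
type Post = { liked: boolean; likeCount: number };

const cache = new Map<string, Post>();
cache.set("post-1", { liked: false, likeCount: 10 });

async function toggleLikeOptimistically(
  postId: string,
  serverCall: () => Promise<void>
) {
  // onMutate: snapshot, then apply the optimistic update.
  const previous = cache.get(postId);
  if (previous) {
    cache.set(postId, {
      liked: !previous.liked,
      likeCount: previous.liked
        ? previous.likeCount - 1
        : previous.likeCount + 1,
    });
  }
  try {
    await serverCall();
  } catch {
    // onError: roll back to the snapshot.
    if (previous) cache.set(postId, previous);
  }
  // onSettled: in the real pattern you'd invalidate and refetch here.
}

// A failing server call leaves the cache exactly as it started.
toggleLikeOptimistically("post-1", async () => {
  throw new Error("server rejected");
}).then(() => {
  console.log(cache.get("post-1")); // { liked: false, likeCount: 10 }
});
```

The snapshot is the whole safety mechanism: as long as onMutate returns it and onError restores it, the UI can never be left in a state the server rejected.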
After a mutation, you often need to refresh related data. tRPC's useUtils() makes this ergonomic:
const utils = trpc.useUtils();
const createComment = trpc.comment.create.useMutation({
onSuccess: (_data, variables) => {
// Invalidate the post's comment list
utils.comment.listByPost.invalidate({ postId: variables.postId });
// Invalidate the post itself (comment count changed)
utils.post.byId.invalidate({ id: variables.postId });
// Invalidate ALL post lists (comment counts in list views)
utils.post.list.invalidate();
},
});

By default, tRPC with httpBatchLink combines multiple simultaneous requests into a single HTTP call. If a component renders and fires three queries:
function Dashboard() {
const user = trpc.user.me.useQuery();
const posts = trpc.post.list.useQuery({ limit: 10 });
const stats = trpc.stats.overview.useQuery();
// ...
}

These three queries are automatically batched into a single HTTP request: GET /api/trpc/user.me,post.list,stats.overview?batch=1&input=...
The server processes all three, returns all three results in a single response, and TanStack Query distributes the results to each hook. No configuration needed.
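To make the wire format concrete, here's a sketch of how such a batched GET URL is assembled: procedure paths are comma-joined and inputs go into a single JSON object keyed by position. This is an approximation of the encoding, not tRPC's exact serializer:

```typescript
// Sketch of httpBatchLink's GET encoding: comma-joined paths plus a
// position-keyed JSON "input" param. Approximate, not tRPC's own code.
function buildBatchUrl(
  base: string,
  calls: { path: string; input?: unknown }[]
): string {
  const paths = calls.map((c) => c.path).join(",");
  const inputs: Record<number, unknown> = {};
  calls.forEach((c, i) => {
    // Only calls that actually take input get an entry.
    if (c.input !== undefined) inputs[i] = c.input;
  });
  const input = encodeURIComponent(JSON.stringify(inputs));
  return `${base}/${paths}?batch=1&input=${input}`;
}

const url = buildBatchUrl("/api/trpc", [
  { path: "user.me" },
  { path: "post.list", input: { limit: 10 } },
  { path: "stats.overview" },
]);
console.log(url);
// → /api/trpc/user.me,post.list,stats.overview?batch=1&input=%7B%221%22%3A%7B%22limit%22%3A10%7D%7D
```

One consequence worth knowing: large inputs can blow past URL length limits, which is exactly why httpBatchLink exposes a maxURLLength option (used later in this article).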
You can disable batching for specific calls if needed:
// In your provider, use splitLink to route specific procedures differently
import { splitLink, httpBatchLink, httpLink } from "@trpc/client";
const [trpcClient] = useState(() =>
trpc.createClient({
links: [
splitLink({
condition: (op) => op.path === "post.infiniteList",
true: httpLink({
url: `${getBaseUrl()}/api/trpc`,
transformer: superjson,
}),
false: httpBatchLink({
url: `${getBaseUrl()}/api/trpc`,
transformer: superjson,
maxURLLength: 2048,
}),
}),
],
})
);

For real-time features, tRPC supports subscriptions over WebSockets. This requires a separate WebSocket server (Next.js doesn't natively support WebSockets in Route Handlers).
// src/server/routers/notification.ts
import { observable } from "@trpc/server/observable";
// In-memory event emitter (use Redis pub/sub in production)
import { EventEmitter } from "events";
const eventEmitter = new EventEmitter();
export const notificationRouter = router({
onNew: protectedProcedure.subscription(({ ctx }) => {
return observable<Notification>((emit) => {
const handler = (notification: Notification) => {
if (notification.userId === ctx.user.id) {
emit.next(notification);
}
};
eventEmitter.on("notification", handler);
return () => {
eventEmitter.off("notification", handler);
};
});
}),
// Mutation that triggers a notification
markAsRead: protectedProcedure
.input(z.object({ id: z.string() }))
.mutation(async ({ ctx, input }) => {
const notification = await ctx.db.notification.update({
where: { id: input.id, userId: ctx.user.id },
data: { readAt: new Date() },
});
return notification;
}),
});

On the client:
"use client";
import { trpc } from "@/lib/trpc";
import { useState } from "react";
import { toast } from "sonner";
export function NotificationBell() {
const [notifications, setNotifications] = useState<Notification[]>([]);
trpc.notification.onNew.useSubscription(undefined, {
onData: (notification) => {
setNotifications((prev) => [notification, ...prev]);
toast.info(notification.message);
},
onError: (error) => {
console.error("Subscription error:", error);
},
});
return (
<div>
<span className="badge">{notifications.length}</span>
{/* notification list UI */}
</div>
);
}

For the WebSocket transport, you need a dedicated server process. Here's a minimal setup with the ws library:
// ws-server.ts (separate process)
import { applyWSSHandler } from "@trpc/server/adapters/ws";
import { WebSocketServer } from "ws";
import { appRouter } from "./server/routers/_app";
import { createTRPCContext } from "./server/trpc";
const wss = new WebSocketServer({ port: 3001 });
const handler = applyWSSHandler({
wss,
router: appRouter,
createContext: createTRPCContext,
});
wss.on("connection", (ws) => {
console.log(`Connection opened (${wss.clients.size} total)`);
ws.once("close", () => {
console.log(`Connection closed (${wss.clients.size} total)`);
});
});
process.on("SIGTERM", () => {
handler.broadcastReconnectNotification();
wss.close();
});
console.log("WebSocket server listening on ws://localhost:3001");

And the client needs a wsLink for subscriptions:
import { wsLink, createWSClient } from "@trpc/client";
const wsClient = createWSClient({
url: "ws://localhost:3001",
});
// Use splitLink to route subscriptions through WebSocket
const [trpcClient] = useState(() =>
trpc.createClient({
links: [
splitLink({
condition: (op) => op.type === "subscription",
true: wsLink({ client: wsClient, transformer: superjson }),
false: httpBatchLink({
url: `${getBaseUrl()}/api/trpc`,
transformer: superjson,
}),
}),
],
})
);

tRPC doesn't handle file uploads natively. It's a JSON-RPC protocol — binary data isn't in its wheelhouse. But you can build a type-safe upload flow by combining tRPC with presigned URLs.
The pattern: the client asks tRPC for a presigned upload URL, uploads the file directly to S3 with a plain fetch, then confirms the upload through a second tRPC mutation.
// src/server/routers/upload.ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
const s3 = new S3Client({ region: process.env.AWS_REGION });
export const uploadRouter = router({
getPresignedUrl: protectedProcedure
.input(
z.object({
filename: z.string().min(1).max(255),
contentType: z.string().regex(/^(image|application)\//),
size: z.number().max(10 * 1024 * 1024, "File must be under 10MB"),
})
)
.mutation(async ({ ctx, input }) => {
const key = `uploads/${ctx.user.id}/${crypto.randomUUID()}-${input.filename}`;
const command = new PutObjectCommand({
Bucket: process.env.S3_BUCKET!,
Key: key,
ContentType: input.contentType,
ContentLength: input.size,
});
const presignedUrl = await getSignedUrl(s3, command, {
expiresIn: 300, // 5 minutes
});
// Store pending upload in database
const upload = await ctx.db.upload.create({
data: {
key,
userId: ctx.user.id,
filename: input.filename,
contentType: input.contentType,
size: input.size,
status: "PENDING",
},
});
return {
uploadId: upload.id,
presignedUrl,
key,
};
}),
confirmUpload: protectedProcedure
.input(z.object({ uploadId: z.string().uuid() }))
.mutation(async ({ ctx, input }) => {
const upload = await ctx.db.upload.findUnique({
where: { id: input.uploadId, userId: ctx.user.id },
});
if (!upload) {
throw new TRPCError({ code: "NOT_FOUND" });
}
// Verify the file actually exists in S3
// (optional but recommended)
const confirmed = await ctx.db.upload.update({
where: { id: upload.id },
data: { status: "CONFIRMED" },
});
return {
url: `https://${process.env.S3_BUCKET}.s3.amazonaws.com/${confirmed.key}`,
};
}),
});

On the client:
"use client";
import { trpc } from "@/lib/trpc";
import { useState, useCallback } from "react";
import { toast } from "sonner";
export function FileUpload() {
const [uploading, setUploading] = useState(false);
const getPresignedUrl = trpc.upload.getPresignedUrl.useMutation();
const confirmUpload = trpc.upload.confirmUpload.useMutation();
const handleFileChange = useCallback(
async (e: React.ChangeEvent<HTMLInputElement>) => {
const file = e.target.files?.[0];
if (!file) return;
setUploading(true);
try {
// Step 1: Get presigned URL from tRPC
const { presignedUrl, uploadId } = await getPresignedUrl.mutateAsync({
filename: file.name,
contentType: file.type,
size: file.size,
});
// Step 2: Upload directly to S3
const uploadResponse = await fetch(presignedUrl, {
method: "PUT",
body: file,
headers: {
"Content-Type": file.type,
},
});
if (!uploadResponse.ok) {
throw new Error("Upload failed");
}
// Step 3: Confirm upload via tRPC
const { url } = await confirmUpload.mutateAsync({ uploadId });
toast.success(`File uploaded: ${url}`);
} catch (error) {
toast.error("Upload failed. Please try again.");
} finally {
setUploading(false);
}
},
[getPresignedUrl, confirmUpload]
);
return (
<div>
<input
type="file"
onChange={handleFileChange}
disabled={uploading}
accept="image/*"
/>
{uploading && <p>Uploading...</p>}
</div>
);
}

The entire flow is type-safe. The presigned URL response type, the upload ID type, the confirmation response — all inferred from the server definitions. If you add a new field to the presigned URL response, the client knows about it immediately.
With Next.js App Router, you often want to fetch data in Server Components. tRPC supports this through server-side callers:
// src/server/trpc.ts
export const createCaller = createCallerFactory(appRouter);
// Usage in a Server Component
// src/app/posts/[id]/page.tsx
import { createTRPCContext } from "@/server/trpc";
import { createCaller } from "@/server/trpc";
export default async function PostPage({
params,
}: {
params: { id: string };
}) {
const ctx = await createTRPCContext({
req: new Request("http://localhost"),
resHeaders: new Headers(),
});
const caller = createCaller(ctx);
const post = await caller.post.byId({ id: params.id });
return (
<article>
<h1>{post.title}</h1>
<div>{post.content}</div>
{/* Client components for interactive parts */}
<LikeButton postId={post.id} initialLiked={post.liked} />
<CommentSection postId={post.id} />
</article>
);
}

This gives you the best of both worlds: server-rendered initial data with full type safety, and client-side interactivity for mutations and real-time features.
Testing is straightforward because procedures are just functions. You don't need to spin up an HTTP server.
// src/server/routers/user.test.ts
import { describe, it, expect, vi } from "vitest";
import { appRouter } from "./routers/_app";
import { createCaller } from "./trpc";
describe("user router", () => {
it("returns the current user profile", async () => {
const caller = createCaller({
user: { id: "user-1", email: "test@example.com", role: "USER" },
db: prismaMock,
session: mockSession,
headers: {},
});
prismaMock.user.findUnique.mockResolvedValue({
id: "user-1",
name: "Test User",
email: "test@example.com",
image: null,
createdAt: new Date("2026-01-01"),
});
const result = await caller.user.me();
expect(result).toEqual({
id: "user-1",
name: "Test User",
email: "test@example.com",
image: null,
createdAt: new Date("2026-01-01"),
});
});
it("throws UNAUTHORIZED for unauthenticated requests", async () => {
const caller = createCaller({
user: null,
db: prismaMock,
session: null,
headers: {},
});
await expect(caller.user.me()).rejects.toThrow("UNAUTHORIZED");
});
it("validates input with Zod", async () => {
const caller = createCaller({
user: { id: "user-1", email: "test@example.com", role: "USER" },
db: prismaMock,
session: mockSession,
headers: {},
});
await expect(
caller.user.updateProfile({
name: "", // min length 1
})
).rejects.toThrow();
});
});

No mocking of HTTP layers, no supertest, no route matching. Just call the function and assert the result. This is one of tRPC's underappreciated advantages: testing is trivially simple because the transport layer is an implementation detail.
tRPC is not a universal solution. Here's where it falls apart:
If you're building an API that external developers will consume, tRPC is the wrong choice. External consumers don't have access to your TypeScript types. They need a documented, stable contract — OpenAPI/Swagger for REST, or a GraphQL schema. tRPC's type safety only works when both the client and server share the same TypeScript codebase.
If your mobile app is written in Swift, Kotlin, or Dart, tRPC offers nothing. The types don't cross language boundaries. You could theoretically generate an OpenAPI spec from tRPC routes using trpc-openapi, but at that point you're adding ceremony back in. Just use REST from the start.
tRPC assumes a single TypeScript codebase. If your backend is split across multiple services in different languages, tRPC can't help with inter-service communication. Use gRPC, REST, or message queues for that.
If your frontend and backend live in separate repositories with separate deploy pipelines, you lose tRPC's core advantage. The type sharing requires a monorepo or a shared package. You can publish the AppRouter type as an npm package, but now you have a versioning problem that REST + OpenAPI handles more naturally.
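If you do go the shared-package route, the key is to export only the router's type, never its implementation, so the client bundle stays free of server code. A minimal sketch, assuming a hypothetical workspace package named `@acme/api`:

```typescript
// packages/api/src/index.ts — shared workspace package (hypothetical name "@acme/api")
// Export ONLY the type; the client must never bundle server code.
export type { AppRouter } from "./server/routers/_app";

// apps/web/src/lib/trpc.ts — the frontend imports the type, not the implementation
import { createTRPCReact } from "@trpc/react-query";
import type { AppRouter } from "@acme/api";

export const trpc = createTRPCReact<AppRouter>();
```

Even then, the two sides must deploy in lockstep or tolerate version skew — which is exactly the coordination problem tRPC was supposed to remove.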
If you need HTTP caching headers, content negotiation, ETags, or other REST-specific features, tRPC's abstraction over HTTP will fight you. tRPC treats HTTP as a transport detail, not a feature.
Here's how I decide:
| Scenario | Recommendation |
|---|---|
| Same-repo fullstack TypeScript app | tRPC — maximum benefit, minimum overhead |
| Internal tool / admin dashboard | tRPC — speed of development is the priority |
| Public API for third-party devs | REST + OpenAPI — consumers need docs, not types |
| Mobile + web clients (non-TS mobile) | REST or GraphQL — need language-agnostic contracts |
| Real-time heavy (chat, gaming) | tRPC subscriptions or raw WebSockets depending on complexity |
| Separate frontend/backend teams | GraphQL — schema is the contract between teams |
A few things I've learned from running tRPC in production that aren't in the docs:
Keep routers small. A single router file shouldn't exceed 200 lines. Split by domain: userRouter, postRouter, billingRouter. Each in its own file.
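A sketch of that split, assuming the `router` helper from the earlier setup and hypothetical domain router files:

```typescript
// src/server/routers/_app.ts — compose small domain routers into one app router
import { router } from "../trpc";
import { userRouter } from "./user";       // profile, settings
import { postRouter } from "./post";       // CRUD, feed
import { billingRouter } from "./billing"; // plans, invoices

export const appRouter = router({
  user: userRouter,
  post: postRouter,
  billing: billingRouter,
});

// The only export the client ever touches
export type AppRouter = typeof appRouter;
```

Namespacing falls out for free: the client calls `trpc.billing.invoices.useQuery()`, and each domain file stays small enough to review in one sitting.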
Use createCallerFactory for server-side calls. Don't reach for fetch when calling your own API from a Server Component. The caller factory gives you the same type safety with zero HTTP overhead.
Don't over-optimize batching. The default httpBatchLink is almost always sufficient. I've seen teams spend days setting up splitLink configurations for marginal gains. Profile first.
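For reference, the default setup is this small — a sketch assuming tRPC v11's `createTRPCClient`:

```typescript
// src/lib/trpc-client.ts — the plain batching link; sufficient for most apps
import { createTRPCClient, httpBatchLink } from "@trpc/client";
import type { AppRouter } from "@/server/routers/_app";

export const client = createTRPCClient<AppRouter>({
  links: [
    httpBatchLink({
      url: "/api/trpc",
      // Queries fired in the same tick are coalesced into one HTTP request.
      // maxURLLength guards against proxies that reject very long GET URLs.
      maxURLLength: 2083,
    }),
  ],
});
```

Reach for `splitLink` only after profiling shows batching is actually the bottleneck.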
Set staleTime in QueryClient. The default staleTime of 0 means every focus event triggers a refetch. Set it to something reasonable (30 seconds to 5 minutes) based on your data freshness requirements.
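A sketch of that configuration; the 60-second window is an illustrative choice, not a recommendation for every app:

```typescript
// src/lib/query-client.ts — sane defaults so focus events don't hammer the API
import { QueryClient } from "@tanstack/react-query";

export const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      staleTime: 60 * 1000, // treat data as fresh for 60s: no refetch on focus/remount
      retry: 1,             // one retry is usually enough for transient network failures
    },
  },
});
```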
Use superjson from day one. Adding it later means migrating every client and server simultaneously. It's a one-line configuration that saves you from Date serialization bugs.
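The wiring is two lines, one per side — a sketch assuming tRPC v11, where the client transformer lives on the link (in v10 it was a client-level option):

```typescript
// src/server/trpc.ts — server side: set the transformer once, at init
import { initTRPC } from "@trpc/server";
import superjson from "superjson";

type Context = { userId: string | null }; // hypothetical context shape

const t = initTRPC.context<Context>().create({
  transformer: superjson, // Dates, Maps, Sets, BigInts survive the wire intact
});

export const router = t.router;
export const publicProcedure = t.procedure;

// Client side: the matching transformer goes on the link
// httpBatchLink({ url: "/api/trpc", transformer: superjson })
```

Both sides must agree on the transformer; a mismatch fails at runtime, not compile time, which is why retrofitting it later is painful.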
Error boundaries are your friend. Wrap tRPC-heavy page sections in React error boundaries. A single failed query shouldn't take down the entire page.
"use client";
import { ErrorBoundary } from "react-error-boundary";
function ErrorFallback({ error, resetErrorBoundary }: {
error: Error;
resetErrorBoundary: () => void;
}) {
return (
<div role="alert" className="p-4 bg-red-50 rounded-lg">
<p className="font-medium text-red-800">Something went wrong</p>
<pre className="text-sm text-red-600 mt-2">{error.message}</pre>
<button
onClick={resetErrorBoundary}
className="mt-3 px-4 py-2 bg-red-600 text-white rounded"
>
Try again
</button>
</div>
);
}
export default function DashboardPage() {
return (
<div>
<h1>Dashboard</h1>
<ErrorBoundary FallbackComponent={ErrorFallback}>
<UserStats />
</ErrorBoundary>
<ErrorBoundary FallbackComponent={ErrorFallback}>
<RecentPosts />
</ErrorBoundary>
</div>
);
}

tRPC isn't a replacement for REST or GraphQL. It's a different tool for a specific situation: when you control both the client and the server, both are TypeScript, and you want the shortest possible path from "I changed the backend" to "the frontend knows about it."
In that situation, nothing else comes close. No code generation, no schema files, no drift. Just TypeScript doing what TypeScript does best: catching mistakes before they reach production.
The trade-off is clear: you give up protocol-level interoperability (no non-TypeScript clients) in exchange for development speed and compile-time safety that's difficult to achieve any other way.
For most fullstack TypeScript applications, that's a trade-off worth making.