# Serverless driver (/docs/postgres/database/serverless-driver)

The Prisma Postgres serverless driver ([`@prisma/ppg`](https://www.npmjs.com/package/@prisma/ppg)) is a lightweight client for connecting to Prisma Postgres using raw SQL. It uses HTTP and WebSocket protocols instead of traditional TCP connections, enabling database access in constrained environments where native PostgreSQL drivers cannot run.

> [!WARNING]
> The Prisma Postgres serverless driver is currently in [Early Access](/orm/more/releases#early-access) and not yet recommended for production scenarios.

## Key features [#key-features]

Because the driver communicates over HTTP and WebSockets rather than TCP, it runs in environments where traditional PostgreSQL drivers cannot:

* Compatible with Cloudflare Workers, Vercel Edge Functions, Deno Deploy, AWS Lambda, Bun, and browsers
* Stream results row-by-row to handle large datasets with constant memory usage
* Pipeline multiple queries over a single connection, reducing latency by up to 3x
* SQL template literals with automatic parameterization and full TypeScript support
* Built-in transactions, batch operations, and extensible type system
* Automatic connection pooling across [all available Prisma Postgres regions](/postgres/faq#what-regions-is-prisma-postgres-available-in) for optimal performance

Use this driver for edge/serverless environments without full Node.js support, or when working with large result sets that benefit from streaming.

For standard Node.js environments, use the [`node-postgres` driver](/orm/core-concepts/supported-databases/postgresql#using-driver-adapters) for lower latency with direct TCP connections.

## Prerequisite: Get your connection string [#prerequisite-get-your-connection-string]

The serverless driver requires a Prisma Postgres TCP connection string:

```
postgres://identifier:key@db.prisma.io:5432/postgres?sslmode=require
```

Find and manage your connection string from your [Prisma Postgres dashboard](https://console.prisma.io/?utm_source=docs&utm_medium=content&utm_content=postgres).

If you don't have a Prisma Postgres database, create one using the [`create-db` CLI](/postgres/npx-create-db) tool:

#### npm

```bash
npx create-db@latest
```

#### pnpm

```bash
pnpm dlx create-db@latest
```

#### yarn

```bash
yarn dlx create-db@latest
```

#### bun

```bash
bunx --bun create-db@latest
```

## Installation [#installation]

Install the appropriate package based on your use case:

#### Without Prisma ORM

##### npm

```bash
npm install @prisma/ppg
```

##### pnpm

```bash
pnpm add @prisma/ppg
```

##### yarn

```bash
yarn add @prisma/ppg
```

##### bun

```bash
bun add @prisma/ppg
```

#### With Prisma ORM

##### npm

```bash
npm install @prisma/ppg @prisma/adapter-ppg
```

##### pnpm

```bash
pnpm add @prisma/ppg @prisma/adapter-ppg
```

##### yarn

```bash
yarn add @prisma/ppg @prisma/adapter-ppg
```

##### bun

```bash
bun add @prisma/ppg @prisma/adapter-ppg
```

## Usage [#usage]

### Query with SQL template literals [#query-with-sql-template-literals]

Use the `prismaPostgres()` high-level API with SQL template literals and automatic parameterization:

```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

const ppg = prismaPostgres(defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!));

type User = { id: number; name: string; email: string };

// SQL template literals with automatic parameterization
const users = await ppg.sql<User>`
  SELECT * FROM users WHERE email = ${"user@example.com"}
`.collect();

console.log(users[0].name);
```

### Use with Prisma ORM [#use-with-prisma-orm]

Use the [`PrismaPostgresAdapter`](https://www.npmjs.com/package/@prisma/adapter-ppg) to connect Prisma Client via the serverless driver:

```ts
import { PrismaClient } from "../generated/prisma/client";
import { PrismaPostgresAdapter } from "@prisma/adapter-ppg";

const prisma = new PrismaClient({
  adapter: new PrismaPostgresAdapter({
    connectionString: process.env.PRISMA_DIRECT_TCP_URL!,
  }),
});

const users = await prisma.user.findMany();
```

### Stream results [#stream-results]

Results are returned as `CollectableIterator<T>`. Stream rows one at a time for constant memory usage, or collect all rows into an array:

```ts
type User = { id: number; name: string; email: string };

// Stream rows one at a time (constant memory usage)
for await (const user of ppg.sql<User>`SELECT * FROM users`) {
  console.log(user.name);
}

// Or collect all rows into an array
const allUsers = await ppg.sql<User>`SELECT * FROM users`.collect();
```

### Pipeline queries [#pipeline-queries]

Send multiple queries over a single WebSocket connection without waiting for responses. Queries are sent immediately and results arrive in FIFO order:

```ts
import { client, defaultClientConfig } from "@prisma/ppg";

const cl = client(defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!));
const session = await cl.newSession();

// Send all queries immediately (pipelined)
const [usersResult, ordersResult, productsResult] = await Promise.all([
  session.query("SELECT * FROM users"),
  session.query("SELECT * FROM orders"),
  session.query("SELECT * FROM products"),
]);

session.close();
```

With 100 ms of network latency, three sequential queries take 300 ms (3 × RTT), while the same queries pipelined take only 100 ms (1 × RTT).
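The arithmetic generalizes: n sequential queries cost n round trips, while pipelined queries share one. A back-of-the-envelope helper (plain TypeScript, independent of the driver; it ignores server-side processing time):

```typescript
// Estimated wall-clock time for `queries` queries at a given round-trip time (ms).
// Sequential: each query waits for the previous response, so n RTTs.
// Pipelined: all queries are sent at once and share a single RTT.
function estimateLatencyMs(queries: number, rttMs: number, pipelined: boolean): number {
  return pipelined ? rttMs : queries * rttMs;
}

console.log(estimateLatencyMs(3, 100, false)); // 300 — sequential
console.log(estimateLatencyMs(3, 100, true));  // 100 — pipelined
```

The model also shows where pipelining does not help: a single query costs one RTT either way.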

### Parameter streaming [#parameter-streaming]

Parameters larger than 1 KB are automatically streamed instead of being buffered in memory. For large binary parameters, use `boundedByteStreamParameter()`, which wraps a `ReadableStream` together with its total byte size; the PostgreSQL protocol requires the size to be known in advance:

```ts
import { client, defaultClientConfig, boundedByteStreamParameter, BINARY } from "@prisma/ppg";

const cl = client(defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!));

// Large binary data (e.g., file content)
const stream = getReadableStream(); // Your ReadableStream source
const totalSize = 1024 * 1024; // Total size must be known in advance

// Create a bounded byte stream parameter
const streamParam = boundedByteStreamParameter(stream, BINARY, totalSize);

// Automatically streamed - constant memory usage
await cl.query("INSERT INTO files (data) VALUES ($1)", streamParam);
```

For `Uint8Array` data, use `byteArrayParameter()`:

```ts
import { client, defaultClientConfig, byteArrayParameter, BINARY } from "@prisma/ppg";

const cl = client(defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!));

const bytes = new Uint8Array([1, 2, 3, 4]);
const param = byteArrayParameter(bytes, BINARY);

await cl.query("INSERT INTO files (data) VALUES ($1)", param);
```

### Transactions and batch operations [#transactions-and-batch-operations]

Transactions automatically handle BEGIN, COMMIT, and ROLLBACK:

```ts
const result = await ppg.transaction(async (tx) => {
  await tx.sql.exec`INSERT INTO users (name) VALUES ('Alice')`;
  const users = await tx.sql<User>`SELECT * FROM users WHERE name = 'Alice'`.collect();
  return users[0].name;
});
```
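The commit/rollback contract can be sketched with a small mock (an illustration of the semantics, not `@prisma/ppg` internals): BEGIN is issued before the callback runs, COMMIT follows a successful return, and a thrown error triggers ROLLBACK and re-throws:

```typescript
// Mock of automatic transaction handling (NOT the driver's implementation).
// `log` records the statements that would be sent to the database.
async function transaction<T>(
  run: (exec: (sql: string) => void) => Promise<T>,
  log: string[],
): Promise<T> {
  log.push("BEGIN");
  try {
    const result = await run((sql) => log.push(sql));
    log.push("COMMIT"); // callback returned normally
    return result;
  } catch (error) {
    log.push("ROLLBACK"); // callback threw; undo and propagate
    throw error;
  }
}

async function main() {
  const log: string[] = [];
  // Success path: COMMIT is appended after the statement.
  await transaction(async (exec) => exec("INSERT INTO users (name) VALUES ('Alice')"), log);
  console.log(log); // ["BEGIN", "INSERT INTO users (name) VALUES ('Alice')", "COMMIT"]

  // Failure path: ROLLBACK is appended and the error propagates to the caller.
  await transaction(async () => { throw new Error("constraint violation"); }, log).catch(() => {});
  console.log(log.at(-1)); // "ROLLBACK"
}
main();
```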

Batch operations execute multiple statements in a single round-trip within an automatic transaction:

```ts
const [users, affected] = await ppg.batch<[User[], number]>(
  { query: "SELECT * FROM users WHERE id < $1", parameters: [5] },
  { exec: "INSERT INTO users (name) VALUES ($1)", parameters: ["Charlie"] },
);
```

### Type handling [#type-handling]

When using `defaultClientConfig()`, common PostgreSQL types are automatically parsed (`boolean`, `int2`, `int4`, `int8`, `float4`, `float8`, `text`, `varchar`, `json`, `jsonb`, `date`, `timestamp`, `timestamptz`):

```ts
import { prismaPostgres, defaultClientConfig } from "@prisma/ppg";

const ppg = prismaPostgres(defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!));

// JSON/JSONB automatically parsed
const rows = await ppg.sql<{ data: { key: string } }>`
  SELECT '{"key": "value"}'::jsonb as data
`.collect();
console.log(rows[0].data.key); // "value"

// BigInt parsed to JavaScript BigInt
const bigints = await ppg.sql<{
  big: bigint;
}>`SELECT 9007199254740991::int8 as big`.collect();

// Dates parsed to Date objects
const dates = await ppg.sql<{
  created: Date;
}>`SELECT NOW() as created`.collect();
```

### Custom parsers and serializers [#custom-parsers-and-serializers]

Extend or override the type system with custom parsers (by PostgreSQL OID) and serializers (by type guard):

```ts
import { client, defaultClientConfig } from "@prisma/ppg";
import type { ValueParser } from "@prisma/ppg";

// Custom parser for UUID type
const uuidParser: ValueParser<string> = {
  oid: 2950,
  parse: (value) => (value ? value.toUpperCase() : null),
};

const config = defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!);
const cl = client({
  ...config,
  parsers: [...config.parsers, uuidParser], // Append to defaults
});
```

For custom serializers, place them before defaults so they take precedence:

```ts
import type { ValueSerializer } from "@prisma/ppg";

class Point {
  constructor(
    public x: number,
    public y: number,
  ) {}
}

const pointSerializer: ValueSerializer<Point> = {
  supports: (value: unknown): value is Point => value instanceof Point,
  serialize: (value: Point) => `(${value.x},${value.y})`,
};

const config = defaultClientConfig(process.env.PRISMA_DIRECT_TCP_URL!);
const cl = client({
  ...config,
  serializers: [pointSerializer, ...config.serializers], // Your serializer first
});

await cl.query("INSERT INTO locations (point) VALUES ($1)", new Point(10, 20));
```
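Why order matters: selection is presumably first-match-wins over the `supports()` guards, so a serializer listed earlier shadows later ones. A standalone sketch of that assumed lookup (`pickSerializer` is illustrative, not part of the driver):

```typescript
// Sketch of order-based serializer selection (assumed behavior: the first
// serializer whose `supports` guard accepts the value wins).
type Serializer = {
  supports: (value: unknown) => boolean;
  serialize: (value: unknown) => string;
};

function pickSerializer(serializers: Serializer[], value: unknown): Serializer {
  const match = serializers.find((s) => s.supports(value));
  if (!match) throw new Error("no serializer for value");
  return match;
}

// A catch-all default, like a built-in fallback.
const fallback: Serializer = {
  supports: () => true,
  serialize: (v) => String(v),
};

// A custom serializer for point-like objects.
const pointLike: Serializer = {
  supports: (v) => typeof v === "object" && v !== null && "x" in v && "y" in v,
  serialize: (v) => {
    const p = v as { x: number; y: number };
    return `(${p.x},${p.y})`;
  },
};

// Custom serializer first, so it shadows the catch-all default.
const chain = [pointLike, fallback];
console.log(pickSerializer(chain, { x: 10, y: 20 }).serialize({ x: 10, y: 20 })); // "(10,20)"
console.log(pickSerializer(chain, 42).serialize(42)); // "42" — falls through to the default
```

Listed the other way around, the catch-all would match every value and the custom serializer would never run.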

See the [npm package documentation](https://www.npmjs.com/package/@prisma/ppg) for more details.

## Platform compatibility [#platform-compatibility]

The driver works in any environment with `fetch` and `WebSocket` APIs:

| Platform              | HTTP Transport | WebSocket Transport |
| --------------------- | -------------- | ------------------- |
| Cloudflare Workers    | ✅              | ✅                   |
| Vercel Edge Functions | ✅              | ✅                   |
| AWS Lambda            | ✅              | ✅                   |
| Deno Deploy           | ✅              | ✅                   |
| Bun                   | ✅              | ✅                   |
| Node.js 18+           | ✅              | ✅                   |
| Browsers              | ✅              | ✅ (with CORS)       |

## Transport modes [#transport-modes]

* **HTTP transport (stateless):** Each query is an independent HTTP request. Best for simple queries and edge functions.
* **WebSocket transport (stateful):** Persistent connection for multiplexed queries. Best for transactions, pipelining, and multiple queries. Create a session with `client().newSession()`.

## API overview [#api-overview]

### prismaPostgres(config) [#prismapostgresconfig]

High-level API with SQL template literals, transactions, and batch operations. Recommended for most use cases.

### client(config) [#clientconfig]

Low-level API with explicit parameter passing and session management. Use when you need fine-grained control.

See the [npm package](https://www.npmjs.com/package/@prisma/ppg) for complete API documentation.

## Error handling [#error-handling]

Structured error types are provided: `DatabaseError`, `HttpResponseError`, `WebSocketError`, `ValidationError`.

```ts
import { DatabaseError } from "@prisma/ppg";

try {
  await ppg.sql`SELECT * FROM invalid_table`.collect();
} catch (error) {
  if (error instanceof DatabaseError) {
    console.log(error.code);
  }
}
```

## Connection pooling enabled by default [#connection-pooling-enabled-by-default]

The serverless driver pools connections automatically [across all available Prisma Postgres regions](/postgres/faq#what-regions-is-prisma-postgres-available-in), with no additional configuration. This reduces connection overhead and improves query performance regardless of your deployment region.

## Limitations [#limitations]

* Requires a Prisma Postgres instance and does not work with [local development](/postgres/database/local-development) databases
* Currently in Early Access and not yet recommended for production

## Learn more [#learn-more]

* [`@prisma/ppg` npm package](https://www.npmjs.com/package/@prisma/ppg)
* [prisma/ppg-client GitHub repository](https://github.com/prisma/ppg-client)
* [Prisma Postgres documentation](/postgres)

## Related pages

- [`Backups`](https://www.prisma.io/docs/postgres/database/backups): Manage and restore database backups in Prisma Postgres.
- [`Connecting to your database`](https://www.prisma.io/docs/postgres/database/connecting-to-your-database): Choose the right Prisma Postgres connection string for your runtime, tool, and workload.
- [`Connection pooling`](https://www.prisma.io/docs/postgres/database/connection-pooling): Use Prisma Postgres connection pooling for concurrent application traffic.
- [`Extensions`](https://www.prisma.io/docs/postgres/database/postgres-extensions): Enable and use standard PostgreSQL extensions with Prisma Postgres.
- [`Local development`](https://www.prisma.io/docs/postgres/database/local-development): Set up and use Prisma Postgres for local development.