Adapters & runtimes

The DaloyJS core only ever sees Request → Response. Runtime-specific concerns — sockets, signals, edge handlers — live in thin adapters at the edge.

Node.js

ts
import { serve } from "@daloyjs/core/node";

const { port, close } = serve(app, {
  port: 3000,
  hostname: "0.0.0.0",
  connectionTimeoutMs: 30_000,
  shutdownTimeoutMs: 10_000,
  handleSignals: true,       // SIGTERM / SIGINT trigger graceful shutdown
  maxHeaderBytes: 16 * 1024, // 16 KiB cap (default)
  trustProxy: false,         // set true only behind a trusted reverse proxy
});

// later
await close();

The Node adapter wires requestTimeout, headersTimeout, and keepAliveTimeout to safe values, and listens for SIGTERM/SIGINT for zero-downtime rolling deploys. When trustProxy is enabled the adapter honors x-forwarded-proto and x-forwarded-host when constructing the request URL — leave it off (the default) unless you terminate TLS at a known proxy.
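The effect of trustProxy can be illustrated with a small standalone helper. This is a sketch of the header logic described above, not the adapter's actual code, and publicUrl is a made-up name:

```ts
// Sketch: rebuild the public-facing URL from forwarded headers when the
// proxy is trusted; otherwise fall back to the socket-level URL.
function publicUrl(headers: Headers, fallback: URL, trustProxy: boolean): URL {
  if (!trustProxy) return fallback;
  const proto = headers.get("x-forwarded-proto") ?? fallback.protocol.replace(":", "");
  const host = headers.get("x-forwarded-host") ?? fallback.host;
  return new URL(`${proto}://${host}${fallback.pathname}${fallback.search}`);
}
```

With trustProxy off, a request that arrived over plain HTTP at the container keeps its internal URL; with it on, the URL reflects what the client actually sent to the proxy.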

Bun

ts
import { serve } from "@daloyjs/core/bun";

const handle = serve(app, {
  port: 3000,
  idleTimeout: 30,              // seconds; Bun default is 10
  development: false,           // disables Bun dev error pages
  // unix: "/tmp/daloy.sock",   // alternative to TCP
  // tls: { cert, key },        // HTTPS
});
console.log("Listening on " + handle.url);

Deno

ts
import { serve } from "@daloyjs/core/deno";

serve(app, {
  port: 3000,
  // HTTPS:
  // cert: Deno.readTextFileSync("./cert.pem"),
  // key:  Deno.readTextFileSync("./key.pem"),
  onListen: ({ hostname, port }) =>
    console.log("Listening on http://" + hostname + ":" + port),
});

The Deno adapter wires an internal AbortController into Deno.serve and listens for SIGTERM/SIGINT, so calling the returned shutdown() drains in-flight requests before resolving.
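The drain behavior can be sketched in isolation. makeShutdown here is an illustrative stand-in, not the adapter's source:

```ts
// Sketch of the shutdown pattern: aborting stops new work, then in-flight
// requests are awaited before the returned promise resolves.
function makeShutdown(pending: Set<Promise<unknown>>) {
  const controller = new AbortController();
  return {
    signal: controller.signal, // the signal an adapter would pass to Deno.serve
    async shutdown(): Promise<void> {
      controller.abort();                     // stop accepting new requests
      await Promise.allSettled([...pending]); // drain in-flight requests
    },
  };
}
```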

Cloudflare Workers

ts
// worker.ts
import { toFetchHandler } from "@daloyjs/core/cloudflare";
import { app } from "./src/server.js";

interface Env { /* your bindings */ }

// toFetchHandler returns the { fetch } object Workers expect as the default
// export. Do NOT wrap it again — export the result directly.
export default toFetchHandler<Env>(app);

Vercel (Edge or Node.js)

Vercel now recommends the Node.js runtime over Edge for new functions. The DaloyJS adapter is runtime-agnostic, so the same code runs on either; pick a runtime per function and the adapter does not change.

ts
// app/api/[[...slug]]/route.ts (Next.js App Router)
import { toRouteHandlers } from "@daloyjs/core/vercel";
import { app } from "@/server";

export const runtime = "nodejs"; // or "edge"
export const { GET, POST, PUT, PATCH, DELETE, OPTIONS, HEAD } =
  toRouteHandlers(app);
ts
// api/[...path].ts (Vercel Node.js Functions, non-Next.js)
import { toFetchHandler } from "@daloyjs/core/vercel";
import { app } from "../src/server.js";

// Node.js is the default runtime. Vercel expects a default { fetch } export.
export default toFetchHandler(app);
ts
// api/[...path].ts (Vercel Edge Functions, non-Next.js)
import { toWebHandler } from "@daloyjs/core/vercel";
import { app } from "../src/server.js";

export const config = { runtime: "edge" };
export default toWebHandler(app);

toEdgeHandler remains exported as a backward-compatible alias of toWebHandler; new code should prefer toWebHandler or toRouteHandlers for Edge/Next route handlers, and toFetchHandler for the default export shape used by Vercel Node.js Functions.

Netlify Edge Functions

Netlify Edge Functions run on a Deno-based runtime with the standard Request/Response API, so the Vercel adapter works unchanged.

ts
// netlify/edge-functions/api.ts
import { toWebHandler } from "@daloyjs/core/vercel";
import { app } from "../../src/server.ts";

export default toWebHandler(app);
export const config = { path: "/api/*" };

Fastly Compute

Fastly Compute uses a fetch-event listener model. The Fastly adapter exposes both a plain handler and a one-call listener installer.

ts
// src/index.ts (Fastly Compute JavaScript starter)
import { installFastlyListener } from "@daloyjs/core/fastly";
import { app } from "./server.js";

installFastlyListener(app);

Fastly Compute does not expose node:* modules; avoid Node-only middleware (Node session store, Redis rate-limit store, multipart helpers that depend on node:stream).
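What installFastlyListener does can be approximated with a self-contained sketch. installListener, AppLike, and FetchEventLike are illustrative names, not adapter exports:

```ts
// Illustration only (not the adapter's source): a listener installer is a
// thin wrapper over the standard addEventListener("fetch", ...) pattern.
type AppLike = { fetch(req: Request): Response | Promise<Response> };
type FetchEventLike = Event & {
  request: Request;
  respondWith(r: Response | Promise<Response>): void;
};

function installListener(app: AppLike, target: EventTarget = globalThis): void {
  target.addEventListener("fetch", (event) => {
    const e = event as FetchEventLike;
    e.respondWith(app.fetch(e.request)); // hand every request to the app
  });
}
```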

AWS Lambda / Netlify Functions / Lambda Function URLs

The Lambda adapter supports API Gateway HTTP API payload format 2.0, API Gateway REST API payload format 1.0, Lambda Function URLs, and Netlify Functions. It handles base64-encoded bodies, v2 cookies, v1 multiValueHeaders, and proxies the request method, path, query, and headers.

ts
// netlify/functions/api.ts
import { toLambdaHandler } from "@daloyjs/core/lambda";
import { app } from "../../src/server.js";

export const handler = toLambdaHandler(app);
export const config = { path: "/api/*" };
ts
// AWS Lambda Function URL or API Gateway HTTP API
import { toLambdaHandler } from "@daloyjs/core/lambda";
import { app } from "./server.js";

export const handler = toLambdaHandler(app);
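The base64 body handling mentioned above can be sketched as a standalone function. eventBody is an illustrative name showing what an adapter does with the two body fields shared by the v1 and v2 payload formats, not the adapter's actual source:

```ts
// Illustration: normalizing a Lambda event body into bytes.
function eventBody(event: { body?: string | null; isBase64Encoded?: boolean }): Uint8Array | undefined {
  if (event.body == null) return undefined; // e.g. GET requests carry no body
  return event.isBase64Encoded
    ? Uint8Array.from(atob(event.body), (c) => c.charCodeAt(0)) // decode base64 payloads
    : new TextEncoder().encode(event.body);                     // pass text through as UTF-8
}
```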

Heroku, Railway, Render, Fly.io (and any Node PaaS)

These platforms run a long-lived Node process. Use the Node adapter as-is and listen on the platform-provided PORT. Graceful shutdown is wired automatically — the adapter listens for SIGTERM so rolling deploys drain in-flight requests.

ts
// src/server.ts
import { serve } from "@daloyjs/core/node";
import { app } from "./app.js";

serve(app, {
  port: Number(process.env.PORT ?? 3000),
  hostname: "0.0.0.0",
});

For Heroku and Railway, add a start script and a Procfile (Heroku only):

bash
# Procfile
web: node dist/server.js

For Render, set the start command to node dist/server.js and the health-check path to a cheap route like /healthz.
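A health-check route should do no I/O and return immediately. A minimal handler, assuming DaloyJS handlers are plain Request-to-Response functions; the registration call in the comment is hypothetical, so adapt it to your app's routing API:

```ts
// Cheap health check: no database, no auth, just a 200.
const healthz = (): Response => new Response("ok", { status: 200 });

// Hypothetical registration — adjust to your router:
// app.get("/healthz", healthz);
```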

For Fly.io, ship a Dockerfile (see the Deployment page for the distroless template) and a fly.toml with a matching internal port:

toml
# fly.toml
app = "my-daloy-api"

[http_service]
  internal_port = 3000
  force_https = true
  auto_stop_machines = "stop"
  auto_start_machines = true

  [http_service.concurrency]
    type = "requests"
    soft_limit = 200
    hard_limit = 250

[[http_service.checks]]
  interval = "10s"
  timeout = "2s"
  grace_period = "5s"
  method = "GET"
  path = "/healthz"

Roll your own

If your runtime exposes the fetch standard, you don't need an adapter:

ts
addEventListener("fetch", (event) => event.respondWith(app.fetch(event.request)));