Passes the request through to the next Function or to the asset server if no other Function is available.
* `env` [EnvWithFetch](#envwithfetch)
* `params` Params\
Holds the values from [dynamic routing](/pages/functions/routing/#dynamic-routes).
In the following example, you have a dynamic path that is `/users/[user].js`. When you visit the site at `/users/nevi`, the `params` object would look like:
```js
{
  user: "nevi"
}
```
This allows you to fetch the dynamic value from the path:
```js
export function onRequest(context) {
  return new Response(`Hello ${context.params.user}`);
}
```
Which would return `"Hello nevi"`.
* `data` Data
### `EnvWithFetch`
Holds the environment variables, secrets, and bindings for a Function. This also holds the `ASSETS` binding, which is how you can fall back to the asset-serving behavior.
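For example, a Function can hand a request back to the static asset pipeline by calling `fetch` on the `ASSETS` binding. Below is a minimal sketch; the path check is illustrative:
```js
export async function onRequest(context) {
  const url = new URL(context.request.url);

  // Illustrative check: only run Function logic for /api paths
  if (!url.pathname.startsWith("/api/")) {
    // Fall back to serving the matching static asset
    return context.env.ASSETS.fetch(context.request);
  }

  return new Response("Handled by the Function");
}
```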
---
# Bindings
URL: https://developers.cloudflare.com/pages/functions/bindings/
import { Render, TabItem, Tabs, WranglerConfig } from "~/components";
A [binding](/workers/runtime-apis/bindings/) enables your Pages Functions to interact with resources on the Cloudflare developer platform. Use bindings to integrate your Pages Functions with Cloudflare resources like [KV](/kv/concepts/how-kv-works/), [Durable Objects](/durable-objects/), [R2](/r2/), and [D1](/d1/). You can set bindings for both production and preview environments.
This guide will instruct you on configuring a binding for your Pages Function. You must already have a Cloudflare Developer Platform resource set up to continue.
:::note
Pages Functions only support a subset of all [bindings](/workers/runtime-apis/bindings/), which are listed on this page.
:::
## KV namespaces
[Workers KV](/kv/concepts/kv-namespaces/) is Cloudflare's key-value storage solution.
To bind your KV namespace to your Pages Function, you can configure a KV namespace binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#kv-namespaces) or the Cloudflare dashboard.
To configure a KV namespace binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **KV namespace**.
5. Give your binding a name under **Variable name**.
6. Under **KV namespace**, select your desired namespace.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use KV in your Function. In the following example, your KV namespace binding is called `TODO_LIST` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequest(context) {
  const task = await context.env.TODO_LIST.get("Task:123");
  return new Response(task);
}
```
```ts
interface Env {
  TODO_LIST: KVNamespace;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  const task = await context.env.TODO_LIST.get("Task:123");
  return new Response(task);
};
```
### Interact with your KV namespaces locally
You can interact with your KV namespace bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
To interact with your KV namespace binding locally by passing arguments to the Wrangler CLI, add `-k <BINDING_NAME>` or `--kv=<BINDING_NAME>` to the `wrangler pages dev` command. For example, if your KV namespace is bound to your Function via the `TODO_LIST` binding, access the KV namespace in local development by running:
```sh
npx wrangler pages dev --kv=TODO_LIST
```
## Durable Objects
[Durable Objects](/durable-objects/) (DO) are Cloudflare's strongly consistent data store, powering capabilities such as connecting WebSockets and handling state.
To bind your Durable Object to your Pages Function, you can configure a Durable Object binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#durable-objects) or the Cloudflare dashboard.
To configure a Durable Object binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **Durable Object**.
5. Give your binding a name under **Variable name**.
6. Under **Durable Object namespace**, select your desired namespace.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use Durable Objects in your Function. In the following example, your DO binding is called `DURABLE_OBJECT` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequestGet(context) {
  const id = context.env.DURABLE_OBJECT.newUniqueId();
  const stub = context.env.DURABLE_OBJECT.get(id);

  // Pass the request down to the durable object
  return stub.fetch(context.request);
}
```
```ts
interface Env {
  DURABLE_OBJECT: DurableObjectNamespace;
}

export const onRequestGet: PagesFunction<Env> = async (context) => {
  const id = context.env.DURABLE_OBJECT.newUniqueId();
  const stub = context.env.DURABLE_OBJECT.get(id);

  // Pass the request down to the durable object
  return stub.fetch(context.request);
};
```
### Interact with your Durable Object namespaces locally
You can interact with your Durable Object bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
While developing locally, to interact with a Durable Object namespace, run `wrangler dev` in the directory of the Worker exporting the Durable Object. In another terminal, run `wrangler pages dev` in the directory of your Pages project.
To interact with your Durable Object namespace locally via the Wrangler CLI, append `--do <BINDING_NAME>=<CLASS_NAME>@<SCRIPT_NAME>` to `wrangler pages dev`, where `CLASS_NAME` indicates the Durable Object class name and `SCRIPT_NAME` the name of the Worker that exports it.
For example, if your Worker is called `do-worker` and it declares a Durable Object class called `DurableObjectExample`, access this Durable Object by running `npx wrangler dev` in the `do-worker` directory. At the same time, run `npx wrangler pages dev --do MY_DO=DurableObjectExample@do-worker` in your Pages project directory. Interact with the `MY_DO` binding in your Function code by using `context.env` (for example, `context.env.MY_DO`).
## R2 buckets
[R2](/r2/) is Cloudflare's blob storage solution that allows developers to store large amounts of unstructured data without egress fees.
To bind your R2 bucket to your Pages Function, you can configure an R2 bucket binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#r2-buckets) or the Cloudflare dashboard.
To configure an R2 bucket binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **R2 bucket**.
5. Give your binding a name under **Variable name**.
6. Under **R2 bucket**, select your desired R2 bucket.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use R2 buckets in your Function. In the following example, your R2 bucket binding is called `BUCKET` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequest(context) {
  const obj = await context.env.BUCKET.get("some-key");
  if (obj === null) {
    return new Response("Not found", { status: 404 });
  }
  return new Response(obj.body);
}
```
```ts
interface Env {
  BUCKET: R2Bucket;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  const obj = await context.env.BUCKET.get("some-key");
  if (obj === null) {
    return new Response("Not found", { status: 404 });
  }
  return new Response(obj.body);
};
```
### Interact with your R2 buckets locally
You can interact with your R2 bucket bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
:::note
By default, Wrangler automatically persists data to local storage. For more information, refer to [Local development](/workers/local-development/).
:::
To interact with an R2 bucket locally via the Wrangler CLI, add `--r2=<BINDING_NAME>` to the `wrangler pages dev` command. If your R2 bucket is bound to your Function with the `BUCKET` binding, access this R2 bucket in local development by running:
```sh
npx wrangler pages dev --r2=BUCKET
```
Interact with this binding by using `context.env` (for example, `context.env.BUCKET`).
## D1 databases
[D1](/d1/) is Cloudflare’s native serverless database.
To bind your D1 database to your Pages Function, you can configure a D1 database binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#d1-databases) or the Cloudflare dashboard.
To configure a D1 database binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **D1 database**.
5. Give your binding a name under **Variable name**.
6. Under **D1 database**, select your desired D1 database.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use D1 in your Function. In the following example, your D1 database binding is `NORTHWIND_DB` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequest(context) {
  // Create a prepared statement with our query
  const ps = context.env.NORTHWIND_DB.prepare("SELECT * from users");
  const data = await ps.first();

  return Response.json(data);
}
```
```ts
interface Env {
  NORTHWIND_DB: D1Database;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  // Create a prepared statement with our query
  const ps = context.env.NORTHWIND_DB.prepare("SELECT * from users");
  const data = await ps.first();

  return Response.json(data);
};
```
### Interact with your D1 databases locally
You can interact with your D1 database bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
To interact with a D1 database via the Wrangler CLI while [developing locally](/d1/best-practices/local-development/#develop-locally-with-pages), add `--d1 <BINDING_NAME>=<DATABASE_ID>` to the `wrangler pages dev` command.
If your D1 database is bound to your Pages Function via the `NORTHWIND_DB` binding and the `database_id` in your Wrangler file is `xxxx-xxxx-xxxx-xxxx-xxxx`, access this database in local development by running:
```sh
npx wrangler pages dev --d1 NORTHWIND_DB=xxxx-xxxx-xxxx-xxxx-xxxx
```
Interact with this binding by using `context.env` (for example, `context.env.NORTHWIND_DB`).
:::note
By default, Wrangler automatically persists data to local storage. For more information, refer to [Local development](/workers/local-development/).
:::
Refer to the [D1 Workers Binding API documentation](/d1/worker-api/) for the API methods available on your D1 binding.
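As a sketch of a parameterized query (the `users` table and `user_id` query parameter below are illustrative, not part of the example above), values can be bound to placeholders on the prepared statement:
```js
export async function onRequest(context) {
  // Illustrative: read an ID from the query string, for example /?user_id=3
  const userId = new URL(context.request.url).searchParams.get("user_id");

  // bind() substitutes the placeholder safely, avoiding SQL injection
  const ps = context.env.NORTHWIND_DB.prepare("SELECT * FROM users WHERE id = ?").bind(userId);
  const { results } = await ps.all();

  return Response.json(results);
}
```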
## Vectorize indexes
[Vectorize](/vectorize/) is Cloudflare’s native vector database.
To bind your Vectorize index to your Pages Function, you can configure a Vectorize index binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#vectorize-indexes) or the Cloudflare dashboard.
To configure a Vectorize index binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Choose whether you would like to set up the binding in your **Production** or **Preview** environment.
4. Select your Pages project > **Settings**.
5. Select your Pages environment > **Bindings** > **Add** > **Vectorize index**.
6. Give your binding a name under **Variable name**.
7. Under **Vectorize index**, select your desired Vectorize index.
8. Redeploy your project for the binding to take effect.
### Use Vectorize index bindings
To use Vectorize index in your Pages Function, you can access your Vectorize index binding in your Pages Function code. In the following example, your Vectorize index binding is called `VECTORIZE_INDEX` and you can access the binding in your Pages Function code on `context.env`.
```js
// Sample vectors: 3 dimensions wide.
//
// Vectors from a machine-learning model are typically ~100 to 1536 dimensions
// wide (or wider still).
const sampleVectors = [
  {
    id: "1",
    values: [32.4, 74.1, 3.2],
    metadata: { url: "/products/sku/13913913" },
  },
  {
    id: "2",
    values: [15.1, 19.2, 15.8],
    metadata: { url: "/products/sku/10148191" },
  },
  {
    id: "3",
    values: [0.16, 1.2, 3.8],
    metadata: { url: "/products/sku/97913813" },
  },
  {
    id: "4",
    values: [75.1, 67.1, 29.9],
    metadata: { url: "/products/sku/418313" },
  },
  {
    id: "5",
    values: [58.8, 6.7, 3.4],
    metadata: { url: "/products/sku/55519183" },
  },
];

export async function onRequest(context) {
  let path = new URL(context.request.url).pathname;
  if (path.startsWith("/favicon")) {
    return new Response("", { status: 404 });
  }

  // You only need to insert vectors into your index once
  if (path.startsWith("/insert")) {
    // Insert some sample vectors into your index
    // In a real application, these vectors would be the output of a machine learning (ML) model,
    // such as Workers AI, OpenAI, or Cohere.
    let inserted = await context.env.VECTORIZE_INDEX.insert(sampleVectors);

    // Return the number of IDs we successfully inserted
    return Response.json(inserted);
  }

  // A handler must always return a Response; fall through for any other path
  return new Response(null, { status: 404 });
}
```
```ts
export interface Env {
  // This makes our vector index methods available on context.env.VECTORIZE_INDEX.*
  // For example, context.env.VECTORIZE_INDEX.insert() or query()
  VECTORIZE_INDEX: VectorizeIndex;
}
// Sample vectors: 3 dimensions wide.
//
// Vectors from a machine-learning model are typically ~100 to 1536 dimensions
// wide (or wider still).
const sampleVectors: Array<VectorizeVector> = [
  {
    id: "1",
    values: [32.4, 74.1, 3.2],
    metadata: { url: "/products/sku/13913913" },
  },
  {
    id: "2",
    values: [15.1, 19.2, 15.8],
    metadata: { url: "/products/sku/10148191" },
  },
  {
    id: "3",
    values: [0.16, 1.2, 3.8],
    metadata: { url: "/products/sku/97913813" },
  },
  {
    id: "4",
    values: [75.1, 67.1, 29.9],
    metadata: { url: "/products/sku/418313" },
  },
  {
    id: "5",
    values: [58.8, 6.7, 3.4],
    metadata: { url: "/products/sku/55519183" },
  },
];
export const onRequest: PagesFunction<Env> = async (context) => {
  let path = new URL(context.request.url).pathname;
  if (path.startsWith("/favicon")) {
    return new Response("", { status: 404 });
  }

  // You only need to insert vectors into your index once
  if (path.startsWith("/insert")) {
    // Insert some sample vectors into your index
    // In a real application, these vectors would be the output of a machine learning (ML) model,
    // such as Workers AI, OpenAI, or Cohere.
    let inserted = await context.env.VECTORIZE_INDEX.insert(sampleVectors);

    // Return the number of IDs we successfully inserted
    return Response.json(inserted);
  }

  // A handler must always return a Response; fall through for any other path
  return new Response(null, { status: 404 });
};
```
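The same binding can also be used to query the index for nearest neighbors. The following is a minimal sketch (not part of the example above) that queries with a hard-coded three-dimensional vector matching the sample data; in a real application the query vector would come from an ML model:
```js
export async function onRequest(context) {
  // Illustrative query vector with the same dimensionality as the sample vectors
  const queryVector = [54.8, 5.5, 3.1];

  // Return the three closest matches stored in the index
  const matches = await context.env.VECTORIZE_INDEX.query(queryVector, { topK: 3 });
  return Response.json(matches);
}
```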
## Workers AI
[Workers AI](/workers-ai/) allows you to run machine learning models, powered by serverless GPUs, on Cloudflare’s global network.
To bind Workers AI to your Pages Function, you can configure a Workers AI binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#workers-ai) or the Cloudflare dashboard.
When developing locally using Wrangler, you can define an AI binding using the `--ai` flag. Start Wrangler in development mode by running [`wrangler pages dev --ai AI`](/workers/wrangler/commands/#dev) to expose the `context.env.AI` binding.
To configure a Workers AI binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **Workers AI**.
5. Give your binding a name under **Variable name**.
6. Redeploy your project for the binding to take effect.
### Use Workers AI bindings
To use Workers AI in your Pages Function, you can access your Workers AI binding in your Pages Function code. In the following example, your Workers AI binding is called `AI` and you can access the binding in your Pages Function code on `context.env`.
```js
export async function onRequest(context) {
  const input = { prompt: "What is the origin of the phrase Hello, World" };

  const answer = await context.env.AI.run(
    "@cf/meta/llama-3.1-8b-instruct",
    input,
  );

  return Response.json(answer);
}
```
```ts
interface Env {
  AI: Ai;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  const input = { prompt: "What is the origin of the phrase Hello, World" };

  const answer = await context.env.AI.run(
    "@cf/meta/llama-3.1-8b-instruct",
    input,
  );

  return Response.json(answer);
};
```
### Interact with your Workers AI binding locally
You can interact with your Workers AI bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
To interact with a Workers AI binding via the Wrangler CLI while developing locally, run:
```sh
npx wrangler pages dev --ai=<BINDING_NAME>
```
## Service bindings
[Service bindings](/workers/runtime-apis/bindings/service-bindings/) enable you to call a Worker from within your Pages Function.
To bind your Pages Function to a Worker, configure a Service binding in your Pages Function using the [Wrangler configuration file](/pages/functions/wrangler-configuration/#service-bindings) or the Cloudflare dashboard.
To configure a Service binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **Service binding**.
5. Give your binding a name under **Variable name**.
6. Under **Service**, select your desired Worker.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use Service bindings in your Function. In the following example, your Service binding is called `SERVICE` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequestGet(context) {
  return context.env.SERVICE.fetch(context.request);
}
```
```ts
interface Env {
  SERVICE: Fetcher;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  return context.env.SERVICE.fetch(context.request);
};
```
### Interact with your Service bindings locally
You can interact with your Service bindings locally in one of two ways:
- Configure your Pages project's Wrangler file and run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
- Pass arguments to `wrangler pages dev` directly.
To interact with a [Service binding](/workers/runtime-apis/bindings/service-bindings/) while developing locally, run the Worker you want to bind to via `wrangler dev` and, in parallel, run `wrangler pages dev` with `--service <BINDING_NAME>=<SCRIPT_NAME>`, where `SCRIPT_NAME` indicates the name of the Worker. For example, if your Worker is called `my-worker`, connect with this Worker by running it via `npx wrangler dev` (in the Worker's directory) alongside `npx wrangler pages dev --service MY_SERVICE=my-worker` (in the Pages project's directory). Interact with this binding by using `context.env` (for example, `context.env.MY_SERVICE`).
If you set up the Service binding via the Cloudflare dashboard, you will still need to append `--service <BINDING_NAME>=<SCRIPT_NAME>` to `wrangler pages dev`, where `BINDING_NAME` is the name of the Service binding and `SCRIPT_NAME` is the name of the Worker.
For example, to develop locally, if your Worker is called `my-worker`, run `npx wrangler dev` in the `my-worker` directory. In a different terminal, also run `npx wrangler pages dev --service MY_SERVICE=my-worker` in your Pages project directory. Interact with this Service binding by using `context.env` (for example, `context.env.MY_SERVICE`).
Wrangler also supports running your Pages project and bound Workers in the same dev session with one command. To try it out, pass multiple `-c` flags to Wrangler, like this: `wrangler pages dev -c wrangler.toml -c ../other-worker/wrangler.toml`. The first argument must point to your Pages configuration file, and the subsequent configurations will be accessible via a Service binding from your Pages project.
:::caution
Support for running multiple Workers in the same dev session with one Wrangler command is experimental, and subject to change as we work on the experience. If you run into bugs or have any feedback, [open an issue on the workers-sdk repository](https://github.com/cloudflare/workers-sdk/issues/new).
:::
## Queue Producers
[Queue Producers](/queues/configuration/javascript-apis/#producer) enable you to send messages into a queue within your Pages Function.
To bind a queue to your Pages Function, configure a queue producer binding in your Pages Function using the [Wrangler configuration file](/pages/functions/wrangler-configuration/#queues-producers) or the Cloudflare dashboard.
To configure a queue producer binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Functions** > **Add** > **Queue**.
5. Give your binding a name under **Variable name**.
6. Under **Queue**, select your desired queue.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use a queue producer binding in your Function. In this example, the binding is named `MY_QUEUE` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequest(context) {
  await context.env.MY_QUEUE.send({
    url: context.request.url,
    method: context.request.method,
    headers: Object.fromEntries(context.request.headers),
  });

  return new Response("Sent!");
}
```
```ts
interface Env {
  MY_QUEUE: Queue;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  await context.env.MY_QUEUE.send({
    url: context.request.url,
    method: context.request.method,
    headers: Object.fromEntries(context.request.headers),
  });

  return new Response("Sent!");
};
```
### Interact with your Queue Producer binding locally
If using a queue producer binding with a Pages Function, you will be able to send events to a queue locally. However, it is not possible to consume events from a queue with a Pages Function. You will have to create a [separate consumer Worker](/queues/get-started/#5-create-your-consumer-worker) with a [queue consumer handler](/queues/configuration/javascript-apis/#consumer) to consume events from the queue. Wrangler does not yet support running separate producer Functions and consumer Workers bound to the same queue locally.
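For reference, the consumer must be a standalone Worker rather than a Pages Function. Below is a minimal sketch of such a consumer Worker; the logging and acknowledgement logic are illustrative:
```js
export default {
  async queue(batch, env) {
    // Process each message delivered from the queue
    for (const message of batch.messages) {
      console.log(`Consumed message: ${JSON.stringify(message.body)}`);
      // Acknowledge the message so it is not redelivered
      message.ack();
    }
  },
};
```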
## Hyperdrive configs
:::note
PostgreSQL drivers like [`Postgres.js`](https://github.com/porsager/postgres) depend on Node.js APIs. Pages Functions with Hyperdrive bindings must be [deployed with Node.js compatibility](/workers/runtime-apis/nodejs).
```toml title="wrangler.toml"
compatibility_flags = [ "nodejs_compat" ]
compatibility_date = "2024-09-23"
```
:::
[Hyperdrive](/hyperdrive/) is a service for connecting to your existing databases from Cloudflare Workers and Pages Functions.
To bind your Hyperdrive config to your Pages Function, you can configure a Hyperdrive binding in the [Wrangler configuration file](/pages/functions/wrangler-configuration/#hyperdrive) or the Cloudflare dashboard.
To configure a Hyperdrive binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **Hyperdrive**.
5. Give your binding a name under **Variable name**.
6. Under **Hyperdrive configuration**, select your desired configuration.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use Hyperdrive in your Function. In the following example, your Hyperdrive config is named `HYPERDRIVE` and you can access the binding in your Function code on `context.env`:
```js
import postgres from "postgres";

export async function onRequest(context) {
  // create connection to postgres database
  const sql = postgres(context.env.HYPERDRIVE.connectionString);

  try {
    const result = await sql`SELECT id, name, value FROM records`;
    return Response.json({ result: result });
  } catch (e) {
    return Response.json({ error: e.message }, { status: 500 });
  }
}
```
```ts
import postgres from "postgres";

interface Env {
  HYPERDRIVE: Hyperdrive;
}

type MyRecord = {
  id: number;
  name: string;
  value: string;
};

export const onRequest: PagesFunction<Env> = async (context) => {
  // create connection to postgres database
  const sql = postgres(context.env.HYPERDRIVE.connectionString);

  try {
    const result = await sql<MyRecord[]>`SELECT id, name, value FROM records`;
    return Response.json({ result: result });
  } catch (e) {
    return Response.json({ error: (e as Error).message }, { status: 500 });
  }
};
```
### Interact with your Hyperdrive binding locally
To interact with your Hyperdrive binding locally, you must provide a local connection string to your database that your Pages project will connect to directly. You can set an environment variable `WRANGLER_HYPERDRIVE_LOCAL_CONNECTION_STRING_<BINDING_NAME>` with the connection string of the database, or use the Wrangler file to configure your Hyperdrive binding with a `localConnectionString` as specified in the [Hyperdrive documentation for local development](/hyperdrive/configuration/local-development/). Then, run [`npx wrangler pages dev`](/workers/wrangler/commands/#dev-1).
## Analytics Engine
The [Analytics Engine](/analytics/analytics-engine/) binding enables you to write analytics within your Pages Function.
To bind an Analytics Engine dataset to your Pages Function, you must configure an Analytics Engine binding using the [Wrangler configuration file](/pages/functions/wrangler-configuration/#analytics-engine-datasets) or the Cloudflare dashboard.
To configure an Analytics Engine binding via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Bindings** > **Add** > **Analytics engine**.
5. Give your binding a name under **Variable name**.
6. Under **Dataset**, input your desired dataset.
7. Redeploy your project for the binding to take effect.
Below is an example of how to use an Analytics Engine binding in your Function. In the following example, the binding is called `ANALYTICS_ENGINE` and you can access the binding in your Function code on `context.env`:
```js
export async function onRequest(context) {
  const url = new URL(context.request.url);

  context.env.ANALYTICS_ENGINE.writeDataPoint({
    indexes: [],
    blobs: [url.hostname, url.pathname],
    doubles: [],
  });

  return new Response("Logged analytic");
}
```
```ts
interface Env {
  ANALYTICS_ENGINE: AnalyticsEngineDataset;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  const url = new URL(context.request.url);

  context.env.ANALYTICS_ENGINE.writeDataPoint({
    indexes: [],
    blobs: [url.hostname, url.pathname],
    doubles: [],
  });

  return new Response("Logged analytic");
};
```
### Interact with your Analytics Engine binding locally
You cannot use an Analytics Engine binding locally.
## Environment variables
An [environment variable](/workers/configuration/environment-variables/) is an injected value that can be accessed by your Functions. Environment variables are a type of binding that allow you to attach text strings or JSON values to your Pages Function. They are stored as plain text. Set your environment variables directly within the Cloudflare dashboard for both your production and preview environments, at runtime and at build-time.
To add environment variables to your Pages project, you can use the [Wrangler configuration file](/pages/functions/wrangler-configuration/#environment-variables) or the Cloudflare dashboard.
To configure an environment variable via the Cloudflare dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > **Settings**.
4. Select your Pages environment > **Variables and Secrets** > **Add**.
5. After setting a variable name and value, select **Save**.
Below is an example of how to use environment variables in your Function. The environment variable in this example is `ENVIRONMENT` and you can access the environment variable on `context.env`:
```js
export function onRequest(context) {
  if (context.env.ENVIRONMENT === "development") {
    return new Response("This is a local environment!");
  } else {
    return new Response("This is a live environment");
  }
}
```
```ts
interface Env {
  ENVIRONMENT: string;
}

export const onRequest: PagesFunction<Env> = async (context) => {
  if (context.env.ENVIRONMENT === "development") {
    return new Response("This is a local environment!");
  } else {
    return new Response("This is a live environment");
  }
};
```
### Interact with your environment variables locally
You can interact with your environment variables locally in one of two ways:
- Configure your Pages project's Wrangler file and run `npx wrangler pages dev`.
- Pass arguments to [`wrangler pages dev`](/workers/wrangler/commands/#dev-1) directly.
To interact with your environment variables locally via the Wrangler CLI, add `--binding=<ENVIRONMENT_VARIABLE_NAME>=<ENVIRONMENT_VARIABLE_VALUE>` to the `wrangler pages dev` command:
```sh
npx wrangler pages dev --binding=<ENVIRONMENT_VARIABLE_NAME>=<ENVIRONMENT_VARIABLE_VALUE>
```
## Secrets
Secrets are a type of binding that allow you to attach encrypted text values to your Pages Function. You cannot see secrets after you set them and can only access secrets programmatically on `context.env`. Secrets are used for storing sensitive information like API keys and auth tokens.
To add secrets to your Pages project:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project > select **Settings**.
4. Select your Pages environment > **Variables and Secrets** > **Add**.
5. Set a variable name and value.
6. Select **Encrypt** to create your secret.
7. Select **Save**.
You use secrets the same way as environment variables. Whether you set secrets with Wrangler or in the Cloudflare dashboard, set them before the deployment that uses them. For more guidance, refer to [Environment variables](#environment-variables).
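As a sketch, a secret named `API_KEY` (an illustrative name) is read from `context.env` exactly like an environment variable; the upstream URL below is also illustrative:
```js
export async function onRequest(context) {
  // The secret value is only available at runtime and is never shown in the dashboard
  const response = await fetch("https://api.example.com/data", {
    headers: { Authorization: `Bearer ${context.env.API_KEY}` },
  });

  return new Response(response.ok ? "Upstream call succeeded" : "Upstream call failed");
}
```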
### Local development with secrets
---
# Debugging and logging
URL: https://developers.cloudflare.com/pages/functions/debugging-and-logging/
Access your Functions logs by using the Cloudflare dashboard or the [Wrangler CLI](/workers/wrangler/commands/#deployment-tail).
Logs are a powerful debugging tool that can help you test and monitor the behavior of your Pages Functions once they have been deployed. Logs are available for every deployment of your Pages project.
Logs provide detailed information about events and can give insight into:
* Successful or failed requests to your Functions.
* Uncaught exceptions thrown by your Functions.
* Custom `console.log`s declared within your Functions.
* Production issues that cannot be easily reproduced.
* Real-time view of incoming requests to your application.
There are two ways to start a logging session:
1. Run `wrangler pages deployment tail` [in your terminal](/pages/functions/debugging-and-logging/#view-logs-with-wrangler).
2. Use the [Cloudflare dashboard](/pages/functions/debugging-and-logging/#view-logs-in-the-cloudflare-dashboard).
## Add custom logs
Custom logs are `console.log()` statements that you can add yourself inside your Functions. When streaming logs for deployments that contain these Functions, the statements will appear in both `wrangler pages deployment tail` and dashboard outputs.
Below is an example of a custom `console.log` statement inside a Pages Function:
```js
export async function onRequest(context) {
  console.log(`[LOGGING FROM /hello]: Request came from ${context.request.url}`);

  return new Response("Hello, world!");
}
```
After you deploy the code above, run `wrangler pages deployment tail` in your terminal, then access the route at which your Function lives. The custom log statement will appear both in your terminal output and in the dashboard's log stream for that deployment.
## View logs with Wrangler
`wrangler pages deployment tail` enables developers to livestream logs for a specific project and deployment.
To get started, run `wrangler pages deployment tail` in your Pages project directory. This will log any incoming requests to your application in your local terminal.
The output of each `wrangler pages deployment tail` log is a structured JSON object:
```js
{
  "outcome": "ok",
  "scriptName": null,
  "exceptions": [
    {
      "stack": "    at src/routes/index.tsx:17:4\n    at new Promise (<anonymous>)\n",
      "name": "Error",
      "message": "An error has occurred",
      "timestamp": 1668542036110
    }
  ],
  "logs": [],
  "eventTimestamp": 1668542036104,
  "event": {
    "request": {
      "url": "https://pages-fns.pages.dev",
      "method": "GET",
      "headers": {},
      "cf": {}
    },
    "response": {
      "status": 200
    }
  },
  "id": 0
}
```
`wrangler pages deployment tail` allows you to customize a logging session to better suit your needs. Refer to the [`wrangler pages deployment tail` documentation](/workers/wrangler/commands/#deployment-tail) for available configuration options.
## View logs in the Cloudflare Dashboard
To view logs for your `production` or `preview` environments associated with any deployment:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/) and select your account.
2. In **Account Home**, select **Workers & Pages**.
3. Select your Pages project, go to the deployment you want to view logs for and select **View details** > **Functions**.
Logging is available for all customers (Free, Paid, Enterprise).
## Limits
The following limits apply to Functions logs:
* Logs are not stored. You can start and stop the stream at any time to view them, but they do not persist.
* Logs will not display if the Function’s requests per second are over 100 for the last five minutes.
* Logs from any [Durable Objects](/pages/functions/bindings/#durable-objects) your Functions bind to will show up in the Cloudflare dashboard.
* A maximum of 10 clients can view a deployment’s logs at one time. This can be a combination of either dashboard sessions or `wrangler pages deployment tail` calls.
## Sourcemaps
If you're debugging an uncaught exception, you might find that the [stack traces](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error/stack) in your logs contain line numbers of generated JavaScript files. Using Pages' support for [source maps](https://web.dev/articles/source-maps), you can get stack traces that match the line numbers and symbols of your original source code.
:::note
When developing full-stack applications, many build tools (including Wrangler for Pages Functions and most full-stack frameworks) will generate source maps for both the client and the server. Ensure your build step is configured to emit only server source maps, or add an additional build step to remove the client source maps. Public source maps might expose the source code of your application to users.
:::
Refer to [Source maps and stack traces](/pages/functions/source-maps/) for an in-depth explanation.
---
# Get started
URL: https://developers.cloudflare.com/pages/functions/get-started/
This guide will instruct you on creating and deploying a Pages Function.
## Prerequisites
You must have a Pages project set up on your local machine or deployed on the Cloudflare dashboard. To create a Pages project, refer to [Get started](/pages/get-started/).
## Create a Function
To get started with generating a Pages Function, create a `/functions` directory. Make sure that the `/functions` directory is at the root of your Pages project (and not in the static root, such as `/dist`).
:::note[Advanced mode]
For existing applications where Pages Functions’ built-in file path based routing and middleware system is not desirable, use [Advanced mode](/pages/functions/advanced-mode/). Advanced mode allows you to develop your Pages Functions with a `_worker.js` file rather than the `/functions` directory.
:::
Writing your Functions files in the `/functions` directory will automatically generate a Worker with custom functionality at predesignated routes.
Copy and paste the following code into a `helloworld.js` file that you create in your `/functions` folder:
```js
export function onRequest(context) {
  return new Response("Hello, world!");
}
```
In the above example code, the `onRequest` handler takes a request [`context`](/pages/functions/api-reference/#eventcontext) object. The handler must return a `Response` or a `Promise` of a `Response`.
This Function will run on the `/helloworld` route and return `"Hello, world!"`. The Function is available on this route because the file is named `helloworld.js`. Similarly, if this file were called `howdyworld.js`, the Function would run on `/howdyworld`.
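Handlers can also be asynchronous and return a `Promise` of a `Response`. Below is a sketch of a `howdyworld.js` Function that awaits a subrequest before responding; the upstream URL is illustrative:
```js
export async function onRequest(context) {
  // Await an upstream subrequest before responding (illustrative URL)
  const upstream = await fetch("https://example.com/");

  return new Response(`Howdy, world! Upstream returned ${upstream.status}`);
}
```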
Refer to [Routing](/pages/functions/routing/) for more information on route customization.
### Runtime features
[Workers runtime features](/workers/runtime-apis/) are configurable on Pages Functions, including [compatibility with a subset of Node.js APIs](/workers/runtime-apis/nodejs) and the ability to set a [compatibility date or compatibility flag](/workers/configuration/compatibility-dates/).
Set these configurations by passing an argument to your [Wrangler](/workers/wrangler/commands/#dev-1) command or by setting them in the dashboard. To set Pages compatibility flags in the Cloudflare dashboard:
1. Log into the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. Select **Workers & Pages** and select your Pages project.
3. Select **Settings** > **Functions** > **Compatibility Flags**.
4. Configure your Production and Preview compatibility flags as needed.
Additionally, use other Cloudflare products such as [D1](/d1/) (serverless DB) and [R2](/r2/) from within your Pages project by configuring [bindings](/pages/functions/bindings/).
## Deploy your Function
After you have set up your Function, deploy your Pages project. Deploy your project by:
* Connecting your [Git provider](/pages/get-started/git-integration/).
* Using [Wrangler](/workers/wrangler/commands/#pages) from the command line.
:::caution
[Direct Upload](/pages/get-started/direct-upload/) from the Cloudflare dashboard is currently not supported with Functions.
:::
## Related resources
* Customize your [Function's routing](/pages/functions/routing/)
* Review the [API reference](/pages/functions/api-reference/)
* Learn how to [debug your Function](/pages/functions/debugging-and-logging/)
---
# Functions
URL: https://developers.cloudflare.com/pages/functions/
import { DirectoryListing } from "~/components"
Pages Functions allows you to build full-stack applications by executing code on the Cloudflare network with [Cloudflare Workers](/workers/). With Functions, you can introduce application aspects such as authenticating, handling form submissions, or working with middleware. [Workers runtime features](/workers/runtime-apis/) are configurable on Pages Functions, including [compatibility with a subset of Node.js APIs](/workers/runtime-apis/nodejs) and the ability to set a [compatibility date or compatibility flag](/workers/configuration/compatibility-dates/). Use Functions to deploy server-side code to enable dynamic functionality without running a dedicated server.
To provide feedback or ask questions on Functions, join the [Cloudflare Developers Discord](https://discord.com/invite/cloudflaredev) and connect with the Cloudflare team in the [#functions channel](https://discord.com/channels/595317990191398933/910978223968518144).
---
# Local development
URL: https://developers.cloudflare.com/pages/functions/local-development/
Run your Pages application locally with our Wrangler Command Line Interface (CLI).
## Install Wrangler
To get started with Wrangler, refer to the [Install/Update Wrangler](/workers/wrangler/install-and-update/).
## Run your Pages project locally
The main command for local development on Pages is `wrangler pages dev`. This will let you run your Pages application locally, which includes serving static assets and running your Functions.
With your folder of static assets set up, run the following command to start local development:
```sh
npx wrangler pages dev
```
This will then start serving your Pages project. You can press `b` to open your local site in the browser (available by default at [http://localhost:8788](http://localhost:8788)).
:::note
If you have a [Wrangler configuration file](/pages/functions/wrangler-configuration/) configured for your Pages project, you can run [`wrangler pages dev`](/workers/wrangler/commands/#dev-1) without specifying a directory.
:::
### HTTPS support
To serve your local development server over HTTPS with a self-signed certificate, you can set `local_protocol` via the [Wrangler configuration file](/pages/functions/wrangler-configuration/#local-development-settings) or you can pass the `--local-protocol=https` argument to [`wrangler pages dev`](/workers/wrangler/commands/#dev-1):
```sh
npx wrangler pages dev --local-protocol=https
```
## Attach bindings to local development
To attach a binding to local development, refer to [Bindings](/pages/functions/bindings/) and find the Cloudflare Developer Platform resource you would like to work with.
## Additional Wrangler configuration
If you are using a Wrangler configuration file in your project, you can set up dev server values like `ip`, `port`, and `local_protocol`. For more information, read about [configuring local development settings](/pages/functions/wrangler-configuration/#local-development-settings).
---
# Metrics
URL: https://developers.cloudflare.com/pages/functions/metrics/
Functions metrics can help you diagnose issues and understand your workloads by showing performance and usage data for your Functions.
## Functions metrics
Functions metrics aggregate request data for an individual Pages project. To view your Functions metrics:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In **Account Home**, select **Workers & Pages** > in **Overview**, select your Pages project.
3. In your Pages project, select **Functions Metrics**.
There are three metrics that can help you understand the health of your Function:
1. Requests success.
2. Requests errors.
3. Invocation Statuses.
### Requests
In **Functions metrics**, you can see historical request counts broken down into total requests, successful requests and errored requests. Information on subrequests is available by selecting **Subrequests**.
* **Total**: All incoming requests registered by a Function. Requests blocked by [Web Application Firewall (WAF)](https://www.cloudflare.com/waf/) or other security features will not count.
* **Success**: Requests that returned a `Success` or `Client Disconnected` [invocation status](#invocation-statuses).
* **Errors**: Requests that returned a `Script Threw Exception`, `Exceeded Resources`, or `Internal Error` [invocation status](#invocation-statuses).
* **Subrequests**: Requests triggered by calling `fetch` from within a Function. When your Function fetches a static asset, it will count as a subrequest. A subrequest that throws an uncaught error will not be counted.
Request traffic data may display a drop off near the last few minutes displayed in the graph for time ranges less than six hours. This does not reflect a drop in traffic, but a slight delay in aggregation and metrics delivery.
### Invocation statuses
Function invocation statuses indicate whether a Function executed successfully or failed to generate a response in the Workers runtime. Invocation statuses differ from HTTP status codes. In some cases, a Function invocation succeeds but does not generate a successful HTTP status because of another error encountered outside of the Workers runtime. Some invocation statuses result in a Workers error code being returned to the client.
| Invocation status | Definition | Workers error code | GraphQL field |
| ---------------------- | ----------------------------------------------------- | ------------------ | -------------------- |
| Success | Worker script executed successfully | | success |
| Client disconnected | HTTP client disconnected before the request completed | | clientDisconnected |
| Script threw exception | Worker script threw an unhandled JavaScript exception | 1101 | scriptThrewException |
| Exceeded resources^1 | Worker script exceeded runtime limits | 1102, 1027 | exceededResources |
| Internal error^2 | Workers runtime encountered an error | | internalError |
1. The Exceeded Resources status may appear when the Worker exceeds a [runtime limit](/workers/platform/limits/#request-limits). The most common cause is excessive CPU time, but it can also be caused by a script exceeding its startup time or free tier limits.
2. The Internal Error status may appear when the Workers runtime fails to process a request due to an internal failure in our system. These errors are not caused by any issue with the Function code nor any resource limit. While requests with Internal Error status are rare, some may appear during normal operation. These requests are not counted towards usage for billing purposes. If you notice an elevated rate of requests with Internal Error status, review [www.cloudflarestatus.com](http://www.cloudflarestatus.com).
To further investigate exceptions, refer to [Debugging and Logging](/pages/functions/debugging-and-logging).
### CPU time per execution
The CPU Time per execution chart shows historical CPU time data broken down into relevant quantiles using [reservoir sampling](https://en.wikipedia.org/wiki/Reservoir_sampling). Learn more about [interpreting quantiles](https://www.statisticshowto.com/quantile-definition-find-easy-steps/).
In some cases, higher quantiles may appear to exceed [CPU time limits](/workers/platform/limits/#cpu-time) without generating invocation errors because of a mechanism in the Workers runtime that allows rollover CPU time for requests below the CPU limit.
### Duration per execution
The **Duration** chart underneath **Median CPU time** in the **Functions metrics** dashboard shows historical [duration](/workers/platform/limits/#duration) per Function execution. The data is broken down into relevant quantiles, similar to the CPU time chart.
Understanding duration on your Function is useful when you are intending to do a significant amount of computation on the Function itself. This is because you may have to use the Standard or Unbound usage model which allows up to 30 seconds of CPU time.
Workers on the [Bundled Usage Model](/workers/platform/pricing/#workers) may have high durations, even with a 50 ms CPU time limit, if they are running many network-bound operations like fetch requests and waiting on responses.
### Metrics retention
Functions metrics can be inspected for up to three months in the past in maximum increments of one week. The **Functions metrics** dashboard in your Pages project includes the charts and information described above.
---
# Middleware
URL: https://developers.cloudflare.com/pages/functions/middleware/
Middleware is reusable logic that can be run before your [`onRequest`](/pages/functions/api-reference/#onrequests) function. Middlewares are typically utility functions. Error handling, user authentication, and logging are typical candidates for middleware within an application.
## Add middleware
Middleware is similar to standard Pages Functions, but middleware is always defined in a `_middleware.js` file in your project's `/functions` directory. A `_middleware.js` file exports an [`onRequest`](/pages/functions/api-reference/#onrequests) function. The middleware will run on requests that match any Pages Functions in the same `/functions` directory, including subdirectories. For example, a `functions/users/_middleware.js` file will match requests for `/users/nevi`, `/users/nevi/123`, and `/users`.
If you want to run a middleware on your entire application, including in front of static files, create a `functions/_middleware.js` file.
In `_middleware.js` files, you may export an `onRequest` handler or any of its method-specific variants. The following is an example middleware which handles any errors thrown in your project's Pages Functions. This example uses the `next()` method available in the request handler's context object:
```js
export async function onRequest(context) {
  try {
    return await context.next();
  } catch (err) {
    return new Response(`${err.message}\n${err.stack}`, { status: 500 });
  }
}
```
## Chain middleware
You can export an array of Pages Functions as your middleware handler. This allows you to chain together multiple middlewares that you want to run. In the following example, you can handle any errors generated from your project's Functions, and check if the user is authenticated:
```js
async function errorHandling(context) {
  try {
    return await context.next();
  } catch (err) {
    return new Response(`${err.message}\n${err.stack}`, { status: 500 });
  }
}

function authentication(context) {
  if (context.request.headers.get("x-email") != "admin@example.com") {
    return new Response("Unauthorized", { status: 403 });
  }

  return context.next();
}

export const onRequest = [errorHandling, authentication];
```
In the above example, the `errorHandling` function will run first. It will capture any errors in the `authentication` function and any errors in any other subsequent Pages Functions.
---
# Module support
URL: https://developers.cloudflare.com/pages/functions/module-support/
Pages Functions provide support for several module types, much like [Workers](https://blog.cloudflare.com/workers-javascript-modules/). This means that you can import and use external modules such as WebAssembly (Wasm), `text` and `binary` files inside your Functions code.
This guide will instruct you on how to use these different module types inside your Pages Functions.
## ECMAScript Modules
ECMAScript modules (ES Modules for short) are the official, [standardized](https://tc39.es/ecma262/#sec-modules) module system for JavaScript. They are the recommended mechanism for writing modular and reusable JavaScript code.
[ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules) are defined by the use of `import` and `export` statements. Below is an example of a script written in ES Modules format, and a Pages Function that imports that module:
```ts
export function greeting(name: string): string {
  return `Hello ${name}!`;
}
```
```js
import { greeting } from "../src/greeting.ts";

export async function onRequest(context) {
  return new Response(`${greeting("Pages Functions")}`);
}
```
## WebAssembly Modules
[WebAssembly](/workers/runtime-apis/webassembly/) (abbreviated Wasm) allows you to compile languages like Rust, Go, or C to a binary format that can run in a wide variety of environments, including web browsers, Cloudflare Workers, Cloudflare Pages Functions, and other WebAssembly runtimes.
The distributable, loadable, and executable unit of code in WebAssembly is called a [module](https://webassembly.github.io/spec/core/syntax/modules.html).
Below is a basic example of how you can import Wasm Modules inside your Pages Functions code:
```js
import addModule from "add.wasm";

export async function onRequest() {
  const addInstance = await WebAssembly.instantiate(addModule);
  return new Response(
    `The meaning of life is ${addInstance.exports.add(20, 1)}`,
  );
}
```
## Text Modules
Text Modules are a non-standardized means of importing resources such as HTML files as a `String`.
To import the below HTML file into your Pages Functions code:
```html
<h1>Hello Pages Functions!</h1>
```
Use the following script:
```js
import html from "../index.html";

export async function onRequest() {
  return new Response(html, {
    headers: { "Content-Type": "text/html" },
  });
}
```
## Binary Modules
Binary Modules are a non-standardized way of importing binary data such as images as an `ArrayBuffer`.
Below is a basic example of how you can import the data from a binary file inside your Pages Functions code:
```js
import data from "../my-data.bin";

export async function onRequest() {
  return new Response(data, {
    headers: { "Content-Type": "application/octet-stream" },
  });
}
```
---
# Pricing
URL: https://developers.cloudflare.com/pages/functions/pricing/
Requests to your Functions are billed as Cloudflare Workers requests. Workers plans and pricing can be found [in the Workers documentation](/workers/platform/pricing/).
## Paid Plans
Requests to your Pages functions count towards your quota for Workers Paid plans, including requests from your Function to KV or Durable Object bindings.
Pages supports the [Standard usage model](/workers/platform/pricing/#example-pricing-standard-usage-model).
:::note
Workers Enterprise accounts are billed based on the usage model specified in their contract. To switch to the Standard usage model, reach out to your Customer Success Manager (CSM). Some Workers Enterprise customers maintain the ability to [change usage models](/workers/platform/pricing/#how-to-switch-usage-models).
:::
### Static asset requests
On both free and paid plans, requests to static assets are free and unlimited. A request is considered static when it does not invoke Functions. Refer to [Functions invocation routes](/pages/functions/routing/#functions-invocation-routes) to learn more about when Functions are invoked.
## Free Plan
Requests to your Pages Functions count towards your quota for the Workers Free plan. For example, you could use 50,000 Functions requests and 50,000 Workers requests to use your full 100,000 daily request usage. The free plan daily request limit resets at midnight UTC.
---
# Routing
URL: https://developers.cloudflare.com/pages/functions/routing/
import { FileTree } from "~/components";
Functions utilize file-based routing. Your `/functions` directory structure determines the designated routes that your Functions will run on. You can create a `/functions` directory with as many levels as needed for your project's use case. Review the following directory:
- ...
- functions
- index.js
- helloworld.js
- howdyworld.js
- fruits
- index.js
- apple.js
- banana.js
The following routes will be generated based on the above file structure. These routes map the URL pattern to the `/functions` file that will be invoked when a visitor goes to the URL:
| File path | Route |
| --------------------------- | ------------------------- |
| /functions/index.js | example.com |
| /functions/helloworld.js | example.com/helloworld |
| /functions/howdyworld.js | example.com/howdyworld |
| /functions/fruits/index.js | example.com/fruits |
| /functions/fruits/apple.js | example.com/fruits/apple |
| /functions/fruits/banana.js | example.com/fruits/banana |
:::note[Trailing slash]
Trailing slash is optional. Both `/foo` and `/foo/` will be routed to `/functions/foo.js` or `/functions/foo/index.js`. If your project has both a `/functions/foo.js` and `/functions/foo/index.js` file, `/foo` and `/foo/` would route to `/functions/foo/index.js`.
:::
If no Function is matched, the request will fall back to a static asset if there is one. Otherwise, it will fall back to the [default routing behavior](/pages/configuration/serving-pages/) for Pages' static assets.
## Dynamic routes
Dynamic routes allow you to match URLs with parameterized segments. This can be useful if you are building dynamic applications. You can accept dynamic values which map to a single path by changing your filename.
### Single path segments
To create a dynamic route, place one set of brackets around your filename – for example, `/users/[user].js`. By doing this, you are creating a placeholder for a single path segment:
| Path | Matches? |
| ------------------ | -------- |
| /users/nevi | Yes |
| /users/daniel | Yes |
| /profile/nevi | No |
| /users/nevi/foobar | No |
| /nevi | No |
### Multipath segments
By placing two sets of brackets around your filename – for example, `/users/[[user]].js` – you are matching any depth of route after `/users/`:
| Path | Matches? |
| --------------------- | -------- |
| /users/nevi | Yes |
| /users/daniel | Yes |
| /profile/nevi | No |
| /users/nevi/foobar | Yes |
| /users/daniel/xyz/123 | Yes |
| /nevi | No |
:::note[Route specificity]
More specific routes (routes with fewer wildcards) take precedence over less specific routes.
:::
#### Dynamic route examples
Review the following `/functions/` directory structure:
- ...
- functions
- date.js
- users
- special.js
- [user].js
- [[catchall]].js
The following requests will match the following files:
| Request | File |
| --------------------- | ------------------------------------------------- |
| /foo | Will route to a static asset if one is available. |
| /date | /date.js |
| /users/daniel | /users/\[user].js |
| /users/nevi | /users/\[user].js |
| /users/special | /users/special.js |
| /users/daniel/xyz/123 | /users/\[\[catchall]].js |
The URL segment(s) that match the placeholder (`[user]`) will be available in the request [`context`](/pages/functions/api-reference/#eventcontext) object. The [`context.params`](/pages/functions/api-reference/#eventcontext) object can be used to find the matched value for a given filename placeholder.
For files which match a single URL segment (use a single set of brackets), the values are returned as a string:
```js
export function onRequest(context) {
return new Response(context.params.user)
}
```
The above logic will return `daniel` for requests to `/users/daniel`.
For files which match against multiple URL segments (use a double set of brackets), the values are returned as an array:
```js
export function onRequest(context) {
return new Response(JSON.stringify(context.params.catchall))
}
```
The above logic will return `["daniel", "xyz", "123"]` for requests to `/users/daniel/xyz/123`.
## Functions invocation routes
On a purely static project, Pages offers unlimited free requests. However, once you add Functions on a Pages project, all requests by default will invoke your Function. To continue receiving unlimited free static requests, exclude your project's static routes by creating a `_routes.json` file. This file will be automatically generated if a `functions` directory is detected in your project when you publish your project with Pages CI or Wrangler.
:::note
Some frameworks (such as Remix, SvelteKit) will also automatically generate a `_routes.json` file. However, if your preferred framework does not, create an issue on their framework repository with a link to this page or let us know on [Discord](https://discord.cloudflare.com). Refer to the [Framework guide](/pages/framework-guides/) for more information on full-stack frameworks.
:::
### Create a `_routes.json` file
Create a `_routes.json` file to control when your Function is invoked. It should be placed in the output directory of your project.
This file will include three different properties:
* **version**: Defines the version of the schema. Currently there is only one version of the schema (version 1), however, we may add more in the future and aim to be backwards compatible.
* **include**: Defines routes that will be invoked by Functions. Accepts wildcard behavior.
* **exclude**: Defines routes that will not be invoked by Functions. Accepts wildcard behavior. `exclude` always takes priority over `include`.
:::note
Wildcards match any number of path segments (slashes). For example, `/users/*` will match everything after the `/users/` path.
:::
#### Example configuration
Below is an example of a `_routes.json`.
```json
{
"version": 1,
"include": ["/*"],
"exclude": []
}
```
This `_routes.json` will invoke your Functions on all routes.
Below is another example of a `_routes.json` file. Any route inside the `/build` directory will not invoke the Function and will not incur a Functions invocation charge.
```json
{
"version": 1,
"include": ["/*"],
"exclude": ["/build/*"]
}
```
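A common pattern, assuming your Functions are served under an `/api` path (an assumption for this sketch), is to invoke Functions only on that path so every other request is served as a free static asset:
```json
{
	"version": 1,
	"include": ["/api/*"],
	"exclude": []
}
```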
### Limits
Functions invocation routes have the following limits:
* You must have at least one include rule.
* You may have no more than 100 include/exclude rules combined.
* Each rule may have no more than 100 characters.
---
# Smart Placement
URL: https://developers.cloudflare.com/pages/functions/smart-placement/
By default, [Workers](/workers/) and [Pages Functions](/pages/functions/) are invoked in a data center closest to where the request was received. If you are running back-end logic in a Pages Function, it may be more performant to run that Pages Function closer to your back-end infrastructure rather than the end user. Smart Placement (beta) automatically places your workloads in an optimal location that minimizes latency and speeds up your applications.
## Background
Smart Placement applies to Pages Functions and middleware. Normally, assets are served globally, from locations closest to your users.
Smart Placement on Pages currently has some caveats. While assets are always meant to be served from a location closest to the user, there are two exceptions to this behavior:
1. If you use middleware for every request (`functions/_middleware.js`) and Smart Placement is enabled, all assets will be served from a location closest to your back-end infrastructure. This may result in an unexpected increase in latency.
2. When using [`env.ASSETS.fetch`](https://developers.cloudflare.com/pages/functions/advanced-mode/), assets served via the `ASSETS` fetcher from your Pages Function are served from the same location as your Function. This could be the location closest to your back-end infrastructure and not the user.
:::note
To understand how Smart Placement works, refer to [Smart Placement](/workers/configuration/smart-placement/).
:::
## Enable Smart Placement (beta)
Smart Placement is available on all plans.
### Enable Smart Placement via the dashboard
To enable Smart Placement via the dashboard:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In Account Home, select **Workers & Pages**.
3. In **Overview**, select your Pages project.
4. Select **Settings** > **Functions**.
5. Under **Placement**, choose **Smart**.
6. Send some initial traffic (approximately 20-30 requests) to your Pages Functions. It takes a few minutes after you have sent traffic to your Pages Function for Smart Placement to take effect.
7. View your Pages Function's [request duration metrics](/workers/observability/metrics-and-analytics/) under Functions Metrics.
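If you manage your project with a [Wrangler configuration file](/pages/functions/wrangler-configuration/), the `placement` key can be used for the same purpose. The following is a minimal sketch, assuming the same `[placement]` table syntax that Workers uses:
```toml
name = "my-pages-app"
pages_build_output_dir = "./dist"

# Assumption: Smart Placement is enabled with the same syntax as Workers
[placement]
mode = "smart"
```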
## Give feedback on Smart Placement
Smart Placement is in beta. To share your thoughts and experience with Smart Placement, join the [Cloudflare Developer Discord](https://discord.cloudflare.com).
---
# Source maps and stack traces
URL: https://developers.cloudflare.com/pages/functions/source-maps/
import { Render, WranglerConfig } from "~/components"
:::caution
Support for uploading source maps for Pages is available now in open beta. Minimum required Wrangler version: 3.60.0.
:::
## Source Maps
To enable source maps, provide the `--upload-source-maps` flag to [`wrangler pages deploy`](/workers/wrangler/commands/#deploy-1) or add the following to your Pages application's [Wrangler configuration file](/pages/functions/wrangler-configuration/) if you are using the Pages build environment:
```toml
upload_source_maps = true
```
When uploading source maps is enabled, Wrangler will automatically generate and upload source map files when you run [`wrangler pages deploy`](/workers/wrangler/commands/#deploy-1).
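For example, if you deploy from your own machine or CI rather than the Pages build environment, you can pass the flag directly. The `./dist` directory below is an assumption; substitute your project's build output directory:
```sh
# Deploy the build output and upload source maps alongside it
npx wrangler pages deploy ./dist --upload-source-maps
```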
## Stack traces
When your application throws an uncaught exception, we fetch the source map and use it to map the stack trace of the exception back to lines of your application’s original source code.
You can then view the stack trace when streaming [real-time logs](/pages/functions/debugging-and-logging/).
:::note
The source map is retrieved after your Pages Function invocation completes — it is an asynchronous process that does not impact your application's CPU utilization or performance. Source maps are not accessible inside the application at runtime; if you `console.log()` the [stack property](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error/stack), you will not get a deobfuscated stack trace.
:::
## Limits
| Description | Limit |
| ------------------------------ | ------------- |
| Maximum Source Map Size | 15 MB gzipped |
## Related resources
* [Real-time logs](/pages/functions/debugging-and-logging/) - Learn how to capture Pages logs in real-time.
---
# TypeScript
URL: https://developers.cloudflare.com/pages/functions/typescript/
import { PackageManagers, Render } from "~/components";
Pages Functions supports TypeScript. Author any files in your `/functions` directory with a `.ts` extension instead of a `.js` extension to start using TypeScript.
You can add runtime types and `Env` types by running the [`wrangler types`](/workers/wrangler/commands/#types) command. For example, with npm (the output path below is an assumption chosen to match the `tsconfig.json` that follows):
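```sh
# Output path is an assumption; adjust it to match the "types" entry in your tsconfig.json
npx wrangler types --path='./functions/types.d.ts'
```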
Then configure the types by creating a `functions/tsconfig.json` file:
```json
{
"compilerOptions": {
"target": "esnext",
"module": "esnext",
"lib": ["esnext"],
"types": ["./types.d.ts"]
}
}
```
See [the `wrangler types` command docs](/workers/wrangler/commands/#types) for more details.
If you already have a `tsconfig.json` at the root of your project, you may wish to explicitly exclude the `/functions` directory to avoid conflicts. To exclude the `/functions` directory:
```json
{
"include": ["src/**/*"],
"exclude": ["functions/**/*"],
"compilerOptions": {}
}
```
Pages Functions can be typed using the `PagesFunction` type. This type accepts an `Env` parameter. The `Env` type should have been generated by `wrangler types` and can be found at the top of `types.d.ts`.
Alternatively, you can define the `Env` type manually. For example:
```ts
interface Env {
KV: KVNamespace;
}
export const onRequest: PagesFunction<Env> = async (context) => {
const value = await context.env.KV.get("example");
return new Response(value);
};
```
If you are using `nodejs_compat`, make sure you have installed `@types/node` and updated your `tsconfig.json`.
```json
{
"compilerOptions": {
"target": "esnext",
"module": "esnext",
"lib": ["esnext"],
"types": ["./types.d.ts", "node"]
}
}
```
:::note
If you were previously using `@cloudflare/workers-types` instead of the runtime types generated by `wrangler types`, you can refer to this [migration guide](/workers/languages/typescript/#migrating).
:::
---
# Configuration
URL: https://developers.cloudflare.com/pages/functions/wrangler-configuration/
import { Render, TabItem, Tabs, Type, MetaInfo, WranglerConfig } from "~/components";
:::caution
If your project contains an existing Wrangler file that you [previously used for local development](/pages/functions/local-development/), verify that it matches your project settings in the Cloudflare dashboard before opting in to deploy your Pages project with the Wrangler configuration file. Instead of writing your Wrangler file by hand, Cloudflare recommends using `npx wrangler pages download config` to download your current project settings into a Wrangler file.
:::
:::note
As of Wrangler v3.91.0, Wrangler supports both JSON (`wrangler.json` or `wrangler.jsonc`) and TOML (`wrangler.toml`) for its configuration file. Prior to that version, only `wrangler.toml` was supported.
:::
Pages Functions can be configured two ways, either via the [Cloudflare dashboard](https://dash.cloudflare.com) or the Wrangler configuration file, a file used to customize the development and deployment setup for [Workers](/workers/) and Pages Functions.
This page serves as a reference on how to configure your Pages project via the Wrangler configuration file.
If using a Wrangler configuration file, you must treat your file as the [source of truth](/pages/functions/wrangler-configuration/#source-of-truth) for your Pages project configuration.
Using the Wrangler configuration file to configure your Pages project allows you to:
- **Store your configuration file in source control:** Keep your configuration in your repository alongside the rest of your code.
- **Edit your configuration via your code editor:** Remove the need to switch back and forth between interfaces.
- **Write configuration that is shared across environments:** Define configuration like [bindings](/pages/functions/bindings/) for local development, preview and production in one file.
- **Ensure better access control:** By using a configuration file in your project repository, you can control who has access to make changes without giving access to your Cloudflare dashboard.
## Example Wrangler file
```toml
name = "my-pages-app"
pages_build_output_dir = "./dist"
[[kv_namespaces]]
binding = "KV"
id = ""
[[d1_databases]]
binding = "DB"
database_name = "northwind-demo"
database_id = ""
[vars]
API_KEY = "1234567asdf"
```
## Requirements
### V2 build system
Pages Functions configuration via the Wrangler configuration file requires the [V2 build system](/pages/configuration/build-image/#v2-build-system) or later. To update from V1, refer to the [V2 build system migration instructions](/pages/configuration/build-image/#v1-to-v2-migration).
### Wrangler
You must have Wrangler version 3.45.0 or higher to use a Wrangler configuration file for your Pages project's configuration. To check your Wrangler version, update Wrangler or install Wrangler, refer to [Install/Update Wrangler](/workers/wrangler/install-and-update/).
## Migrate from dashboard configuration
The migration instructions differ depending on whether your Pages project already has a Wrangler file. Read the instructions that match your situation carefully to avoid errors in production.
### Projects with existing Wrangler file
Before you could use the Wrangler configuration file to define your preview and production configuration, it was possible to use the file to define which [bindings](/pages/functions/bindings/) should be available to your Pages project in local development.
If you have been using a Wrangler configuration file for local development, you may already have a file in your Pages project that looks like this:
```toml
[[kv_namespaces]]
binding = "KV"
id = ""
```
If you would like to use your existing Wrangler file for your Pages project configuration, you must:
1. Add the `pages_build_output_dir` key with the appropriate value of your [build output directory](/pages/configuration/build-configuration/#build-commands-and-directories) (for example, `pages_build_output_dir = "./dist"`).
2. Review your existing Wrangler configuration carefully to make sure it aligns with your desired project configuration before deploying.
If you add the `pages_build_output_dir` key to your Wrangler configuration file and deploy your Pages project, Pages will use whatever configuration was defined for local use, which is very likely to be non-production. Do not deploy until you are confident that your Wrangler configuration file is ready for production use.
:::caution[Overwriting configuration]
Running [`wrangler pages download config`](/pages/functions/wrangler-configuration/#projects-without-existing-wranglertoml-file) will overwrite your existing Wrangler file with a generated Wrangler file based on your Cloudflare dashboard configuration. Run this command only if you want to discard your previous Wrangler file that you used for local development and start over with configuration pulled from the Cloudflare dashboard.
:::
You can continue to use your Wrangler file for local development without migrating it for production use by not adding a `pages_build_output_dir` key. If you do not add a `pages_build_output_dir` key and run `wrangler pages deploy`, you will see a warning message telling you that fields are missing and that the file will continue to be used for local development only.
### Projects without existing Wrangler file
If you have an existing Pages project with configuration set up via the Cloudflare dashboard and do not have an existing Wrangler file in your Project, run the `wrangler pages download config` command in your Pages project directory. The `wrangler pages download config` command will download your existing Cloudflare dashboard configuration and generate a valid Wrangler file in your Pages project directory.
```sh
npx wrangler pages download config
```
```sh
yarn wrangler pages download config
```
```sh
pnpm wrangler pages download config
```
Review your generated Wrangler file. To start using the Wrangler configuration file for your Pages project's configuration, create a new deployment, via [Git integration](/pages/get-started/git-integration/) or [Direct Upload](/pages/get-started/direct-upload/).
### Handling compatibility dates set to "Latest"
In the Cloudflare dashboard, you can set compatibility dates for preview deployments to "Latest". This will ensure your project is always using the latest compatibility date without the need to explicitly set it yourself.
If you download a Wrangler configuration file from a project configured with "Latest" using the `wrangler pages download` command, your Wrangler configuration file will have the latest compatibility date available at the time you downloaded the configuration file. Wrangler does not support the "Latest" functionality like the dashboard. Compatibility dates must be explicitly set when using a Wrangler configuration file.
Refer to [this guide](/workers/configuration/compatibility-dates/) for more information on what compatibility dates are and how they work.
## Differences using a Wrangler configuration file for Pages Functions and Workers
If you have used [Workers](/workers), you may already be familiar with the [Wrangler configuration file](/workers/wrangler/configuration/). There are a few key differences to be aware of when using this file with your Pages Functions project:
- The configuration fields **do not match exactly** between Pages Functions Wrangler file and the Workers equivalent. For example, configuration keys like `main`, which are Workers specific, do not apply to a Pages Function's Wrangler configuration file. Some functionality supported by Workers, such as [module aliasing](/workers/wrangler/configuration/#module-aliasing) cannot yet be used by Cloudflare Pages projects.
- The Pages' Wrangler configuration file introduces a new key, `pages_build_output_dir`, which is only used for Pages projects.
- The concept of [environments](/pages/functions/wrangler-configuration/#configure-environments) and configuration inheritance in this file **is not** the same as Workers.
- This file becomes the [source of truth](/pages/functions/wrangler-configuration/#source-of-truth) when used, meaning that you **can not edit the same fields in the dashboard** once you are using this file.
## Configure environments
With a Wrangler configuration file, you can quickly set configuration across your local environment, preview deployments, and production.
### Local development
The Wrangler configuration file applies locally when using `wrangler pages dev`. This means that you can test out configuration changes quickly without needing to log in to the Cloudflare dashboard. Refer to the following config file for an example:
```toml
name = "my-pages-app"
pages_build_output_dir = "./dist"
compatibility_date = "2023-10-12"
compatibility_flags = ["nodejs_compat"]
[[kv_namespaces]]
binding = "KV"
id = ""
```
This Wrangler configuration file adds the `nodejs_compat` compatibility flag and a KV namespace binding to your Pages project. Running `wrangler pages dev` in a Pages project directory with this Wrangler configuration file will apply the `nodejs_compat` compatibility flag locally, and expose the `KV` binding in your Pages Function code at `context.env.KV`.
:::note
For a full list of configuration keys, refer to [inheritable keys](#inheritable-keys) and [non-inheritable keys](#non-inheritable-keys).
:::
### Production and preview deployments
Once you are ready to deploy your project, you can set the configuration for production and preview deployments by creating a new deployment containing a Wrangler file.
:::note
For the following commands, if you are using git it is important to remember the branch that you set as your [production branch](/pages/configuration/branch-build-controls/#production-branch-control) as well as your [preview branch settings](/pages/configuration/branch-build-controls/#preview-branch-control).
:::
To use the example above as your configuration for production, make a new production deployment using:
```sh
npx wrangler pages deploy
```
or more specifically:
```sh
npx wrangler pages deploy --branch <BRANCH_NAME>
```
To deploy the configuration for preview deployments, you can run the same command as above while on a branch you have configured to work with [preview deployments](/pages/configuration/branch-build-controls/#preview-branch-control). This will set the configuration for all preview deployments, not just the deployments from a specific branch. Pages does not currently support branch-based configuration.
:::note
The `--branch` flag is optional with `wrangler pages deploy`. If you use git integration, Wrangler will infer the branch you are on from the repository you are currently in and implicitly add it to the command.
:::
### Environment-specific overrides
There are times that you might want to use different configuration across local, preview deployments, and production. It is possible to override configuration for production and preview deployments by using `[env.production]` or `[env.preview]`.
:::note
Unlike [Workers Environments](/workers/wrangler/configuration/#environments), `production` and `preview` are the only two options available via `[env.<ENVIRONMENT>]`.
:::
Refer to the following Wrangler configuration file for an example of how to override preview deployment configuration:
```toml
name = "my-pages-site"
pages_build_output_dir = "./dist"
[[kv_namespaces]]
binding = "KV"
id = ""
[vars]
API_KEY = "1234567asdf"
[[env.preview.kv_namespaces]]
binding = "KV"
id = ""
[env.preview.vars]
API_KEY = "8901234bfgd"
```
If you deployed this file via `wrangler pages deploy`, `name`, `pages_build_output_dir`, `kv_namespaces`, and `vars` would apply the configuration to local and production, while `env.preview` would override `kv_namespaces` and `vars` for preview deployments.
If you wanted to have configuration values apply to local and preview, but override production, your file would look like this:
```toml
name = "my-pages-site"
pages_build_output_dir = "./dist"
[[kv_namespaces]]
binding = "KV"
id = ""
[vars]
API_KEY = "1234567asdf"
[[env.production.kv_namespaces]]
binding = "KV"
id = ""
[env.production.vars]
API_KEY = "8901234bfgd"
```
You can always be explicit and override both preview and production:
```toml
name = "my-pages-site"
pages_build_output_dir = "./dist"
[[kv_namespaces]]
binding = "KV"
id = ""
[vars]
API_KEY = "1234567asdf"
[[env.preview.kv_namespaces]]
binding = "KV"
id = ""
[env.preview.vars]
API_KEY = "8901234bfgd"
[[env.production.kv_namespaces]]
binding = "KV"
id = ""
[env.production.vars]
API_KEY = "6567875fvgt"
```
## Inheritable keys
Inheritable keys are configurable at the top-level, and can be inherited (or overridden) by environment-specific configuration.
- `name`
- The name of your Pages project. Alphanumeric and dashes only.
- `pages_build_output_dir`
- The path to your project's build output folder. For example: `./dist`.
- `compatibility_date`
- A date in the form `yyyy-mm-dd`, which will be used to determine which version of the Workers runtime is used. Refer to [Compatibility dates](/workers/configuration/compatibility-dates/).
- `compatibility_flags` string\[] optional
- A list of flags that enable upcoming features of the Workers runtime, usually used together with `compatibility_date`. Refer to [compatibility dates](/workers/configuration/compatibility-dates/).
- `send_metrics`
- Whether Wrangler should send usage data to Cloudflare for this project. Defaults to `true`. You can learn more about this in our [data policy](https://github.com/cloudflare/workers-sdk/tree/main/packages/wrangler/telemetry.md).
- `limits` Limits optional
- Configures limits to be imposed on execution at runtime. Refer to [Limits](#limits).
- `placement` Placement optional
- Specify how Pages Functions should be located to minimize round-trip time. Refer to [Smart Placement](/workers/configuration/smart-placement/).
- `upload_source_maps` boolean
- When `upload_source_maps` is set to `true`, Wrangler will upload any server-side source maps that are part of your Pages project to give corrected stack traces in logs.
## Non-inheritable keys
Non-inheritable keys are configurable at the top-level, but, if any one non-inheritable key is overridden for any environment (for example, `[[env.production.kv_namespaces]]`), all non-inheritable keys must also be specified in the environment configuration and overridden.
For example, this configuration will not work:
```toml
name = "my-pages-site"
pages_build_output_dir = "./dist"
[[kv_namespaces]]
binding = "KV"
id = ""
[vars]
API_KEY = "1234567asdf"
[env.production.vars]
API_KEY = "8901234bfgd"
```
`[env.production.vars]` is set to override `[vars]`. Because of this, `[[kv_namespaces]]` must also be overridden by defining `[[env.production.kv_namespaces]]`.
This will work for local development, but will fail to validate when you try to deploy.
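For example, adding the following override for `kv_namespaces` to the configuration above would make it validate (the empty `id` mirrors the placeholder style used throughout this page):
```toml
[[env.production.kv_namespaces]]
binding = "KV"
id = ""
```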
- `vars`
- A map of environment variables to set when deploying your Function. Refer to [Environment variables](/pages/functions/bindings/#environment-variables).
- `d1_databases`
- A list of D1 databases that your Function should be bound to. Refer to [D1 databases](/pages/functions/bindings/#d1-databases).
- `durable_objects`
- A list of Durable Objects that your Function should be bound to. Refer to [Durable Objects](/pages/functions/bindings/#durable-objects).
- `hyperdrive`
- Specifies Hyperdrive configs that your Function should be bound to. Refer to [Hyperdrive](/hyperdrive/).
- `kv_namespaces`
- A list of KV namespaces that your Function should be bound to. Refer to [KV namespaces](/pages/functions/bindings/#kv-namespaces).
- `queues.producers`
- Specifies Queues Producers that are bound to this Function. Refer to [Queues Producers](/queues/get-started/#4-set-up-your-producer-worker).
- `r2_buckets`
- A list of R2 buckets that your Function should be bound to. Refer to [R2 buckets](/pages/functions/bindings/#r2-buckets).
- `vectorize`
- A list of Vectorize indexes that your Function should be bound to. Refer to [Vectorize indexes](/vectorize/get-started/intro/#3-bind-your-worker-to-your-index).
- `services`
- A list of service bindings that your Function should be bound to. Refer to [service bindings](/pages/functions/bindings/#service-bindings).
- `analytics_engine_datasets`
- Specifies analytics engine datasets that are bound to this Function. Refer to [Workers Analytics Engine](/analytics/analytics-engine/get-started/).
- `ai`
- Specifies an AI binding to this Function. Refer to [Workers AI](/pages/functions/bindings/#workers-ai).
## Limits
You can configure limits for your Pages project in the same way you can for Workers. Read [this guide](/workers/wrangler/configuration/#limits) for more details.
## Bindings
A [binding](/pages/functions/bindings/) enables your Pages Functions to interact with resources on the Cloudflare Developer Platform. Use bindings to integrate your Pages Functions with Cloudflare resources like [KV](/kv/), [Durable Objects](/durable-objects/), [R2](/r2/), and [D1](/d1/). You can set bindings for both production and preview environments.
### D1 databases
[D1](/d1/) is Cloudflare's serverless SQL database. A Function can query a D1 database (or databases) by creating a [binding](/workers/runtime-apis/bindings/) to each database for [D1 Workers Binding API](/d1/worker-api/).
:::note
When using Wrangler in the default local development mode, files will be written to local storage instead of the preview or production database. Refer to [Local development](/workers/local-development/) for more details.
:::
- Configure D1 database bindings via your [Wrangler file](/workers/wrangler/configuration/#d1-databases) the same way they are configured with Cloudflare Workers.
- Interact with your [D1 Database binding](/pages/functions/bindings/#d1-databases).
### Durable Objects
[Durable Objects](/durable-objects/) provide low-latency coordination and consistent storage for the Workers platform.
- Configure Durable Object namespace bindings via your [Wrangler file](/workers/wrangler/configuration/#durable-objects) the same way they are configured with Cloudflare Workers.
:::caution
Durable Object bindings configured in a Pages project's Wrangler configuration file require the `script_name` key. For Workers, the `script_name` key is optional.
:::
- Interact with your [Durable Object namespace binding](/pages/functions/bindings/#durable-objects).
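For illustration, a Durable Object binding in a Pages project's Wrangler file might look like the following sketch, where `Counter` and `counter-worker` are assumed names for the Durable Object class and the Worker that implements it:
```toml
[[durable_objects.bindings]]
name = "COUNTER"
class_name = "Counter"
# Required for Pages projects: the Worker that defines the Durable Object class
script_name = "counter-worker"
```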
### Environment variables
[Environment variables](/workers/configuration/environment-variables/) are a type of binding that allow you to attach text strings or JSON values to your Pages Function.
- Configure environment variables via your [Wrangler file](/workers/wrangler/configuration/#environment-variables) the same way they are configured with Cloudflare Workers.
- Interact with your [environment variables](/pages/functions/bindings/#environment-variables).
### Hyperdrive
[Hyperdrive](/hyperdrive/) bindings allow you to interact with and query any Postgres database from within a Pages Function.
- Configure Hyperdrive bindings via your [Wrangler file](/workers/wrangler/configuration/#hyperdrive) the same way they are configured with Cloudflare Workers.
### KV namespaces
[Workers KV](/kv/api/) is a global, low-latency, key-value data store. It stores data in a small number of centralized data centers, then caches that data in Cloudflare’s data centers after access.
:::note
When using Wrangler in the default local development mode, files will be written to local storage instead of the preview or production namespace. Refer to [Local development](/workers/local-development/) for more details.
:::
- Configure KV namespace bindings via your [Wrangler file](/workers/wrangler/configuration/#kv-namespaces) the same way they are configured with Cloudflare Workers.
- Interact with your [KV namespace binding](/pages/functions/bindings/#kv-namespaces).
### Queues Producers
[Queues](/queues/) is Cloudflare's global message queueing service, providing [guaranteed delivery](/queues/reference/delivery-guarantees/) and [message batching](/queues/configuration/batching-retries/). [Queue Producers](/queues/configuration/javascript-apis/#producer) enable you to send messages into a queue within your Pages Function.
:::note
You cannot currently configure a [queues consumer](/queues/reference/how-queues-works/#consumers) with Pages Functions.
:::
- Configure Queues Producer bindings via your [Wrangler file](/workers/wrangler/configuration/#queues) the same way they are configured with Cloudflare Workers.
- Interact with your [Queues Producer binding](/pages/functions/bindings/#queue-producers).
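As a sketch of producing a message from a Pages Function, assuming a producer binding named `MY_QUEUE`:
```js
export async function onRequest(context) {
	// MY_QUEUE is an assumed binding name; send() accepts any structured-cloneable value
	await context.env.MY_QUEUE.send({ url: context.request.url });
	return new Response("Message queued", { status: 202 });
}
```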
### R2 buckets
[Cloudflare R2 Storage](/r2) allows developers to store large amounts of unstructured data without the costly egress bandwidth fees associated with typical cloud storage services.
:::note
When using Wrangler in the default local development mode, files will be written to local storage instead of the preview or production bucket. Refer to [Local development](/workers/local-development/) for more details.
:::
- Configure R2 bucket bindings via your [Wrangler file](/workers/wrangler/configuration/#r2-buckets) the same way they are configured with Cloudflare Workers.
- Interact with your [R2 bucket bindings](/pages/functions/bindings/#r2-buckets).
### Vectorize indexes
A [Vectorize index](/vectorize/) allows you to insert and query vector embeddings for semantic search, classification and other vector search use-cases.
- Configure Vectorize bindings via your [Wrangler file](/workers/wrangler/configuration/#vectorize-indexes) the same way they are configured with Cloudflare Workers.
### Service bindings
A service binding allows you to call a Worker from within your Pages Function. Binding a Pages Function to a Worker allows you to send HTTP requests to the Worker without those requests going over the Internet. The request immediately invokes the downstream Worker, reducing latency as compared to a request to a third-party service. Refer to [About Service bindings](/workers/runtime-apis/bindings/service-bindings/).
- Configure service bindings via your [Wrangler file](/workers/wrangler/configuration/#service-bindings) the same way they are configured with Cloudflare Workers.
- Interact with your [service bindings](/pages/functions/bindings/#service-bindings).
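As a sketch, assuming a service binding named `MY_WORKER`, a Pages Function can forward the incoming request to the bound Worker like this:
```js
export async function onRequest(context) {
	// Proxy the incoming request to the bound Worker without leaving Cloudflare's network
	return context.env.MY_WORKER.fetch(context.request);
}
```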
### Analytics Engine Datasets
[Workers Analytics Engine](/analytics/analytics-engine/) provides analytics, observability and data logging from Pages Functions. Write data points from your Pages Function, then query the data using the [SQL API](/analytics/analytics-engine/sql-api/).
- Configure Analytics Engine Dataset bindings via your [Wrangler file](/workers/wrangler/configuration/#analytics-engine-datasets) the same way they are configured with Cloudflare Workers.
- Interact with your [Analytics Engine Dataset](/pages/functions/bindings/#analytics-engine).
### Workers AI
[Workers AI](/workers-ai/) allows you to run machine learning models, on the Cloudflare network, from your own code – whether that be from Workers, Pages, or anywhere via REST API.
Unlike other bindings, this binding is limited to one AI binding per Pages Function project.
- Configure Workers AI bindings via your [Wrangler file](/workers/wrangler/configuration/#workers-ai) the same way they are configured with Cloudflare Workers.
- Interact with your [Workers AI binding](/pages/functions/bindings/#workers-ai).
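As a sketch, assuming an AI binding named `AI` (the model identifier is also an assumption and should be replaced with one you actually use):
```js
export async function onRequest(context) {
	// Run an inference request against the Workers AI binding
	const result = await context.env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
		prompt: "What is Cloudflare Pages?",
	});
	return Response.json(result);
}
```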
## Local development settings
The local development settings that you can configure are the same for Pages Functions and Cloudflare Workers. Read [this guide](/workers/wrangler/configuration/#local-development-settings) for more details.
## Source of truth
When used in your Pages Functions projects, your Wrangler file is the source of truth. You will be able to see, but not edit, the same fields when you log into the Cloudflare dashboard.
If you decide that you do not want to use a Wrangler configuration file for configuration, you can safely delete it and create a new deployment. Configuration values from your last deployment will still apply and you will be able to edit them from the dashboard.
---
# Changelog
URL: https://developers.cloudflare.com/pages/platform/changelog/
import { ProductReleaseNotes } from "~/components";
{/* */}
---
# Platform
URL: https://developers.cloudflare.com/pages/platform/
import { DirectoryListing } from "~/components"
---
# Known issues
URL: https://developers.cloudflare.com/pages/platform/known-issues/
Here are some known bugs and issues with Cloudflare Pages:
## Builds and deployment
- GitHub and GitLab are currently the only supported platforms for automatic CI/CD builds. [Direct Upload](/pages/get-started/direct-upload/) allows you to integrate your own build platform or upload from your local computer.
- Incremental builds are currently not supported in Cloudflare Pages.
- Uploading a `/functions` directory through the dashboard's Direct Upload option does not work (refer to [Using Functions in Direct Upload](/pages/get-started/direct-upload/#functions)).
- Commits/PRs from forked repositories will not create a preview. Support for this will come in the future.
## Git configuration
- If you deploy using the Git integration, you cannot switch to Direct Upload later. However, if you already use a Git-integrated project and do not want to trigger deployments every time you push a commit, you can [disable/pause automatic deployments](/pages/configuration/git-integration/#disable-automatic-deployments). Alternatively, you can delete your Pages project and create a new one pointing at a different repository if you need to update it.
## Build configuration
- `*.pages.dev` subdomains currently cannot be changed. If you need to change your `*.pages.dev` subdomain, delete your project and create a new one.
- Hugo builds automatically use an old Hugo version. To run a newer version of Hugo (for example, `0.101.0`), set the `HUGO_VERSION` environment variable to `0.101.0` or the Hugo version of your choice.
- By default, Cloudflare uses Node `12.18.0` in the Pages build environment. If you need to use a newer Node version, refer to the [Build configuration page](/pages/configuration/build-configuration/) for configuration options.
- For users migrating from Netlify, Cloudflare does not support Netlify's Forms feature. [Pages Functions](/pages/functions/) are available as an equivalent to Netlify's Serverless Functions.
## Custom Domains
- It is currently not possible to add a custom domain with
- a wildcard, for example, `*.domain.com`.
- a Worker already routed on that domain.
- It is currently not possible to add a custom domain with a Cloudflare Access policy already enabled on that domain.
- Cloudflare's Load Balancer does not work with `*.pages.dev` projects; an `Error 1000: DNS points to prohibited IP` will appear.
- When adding a custom domain, the domain will not verify if Cloudflare cannot validate a request for an SSL certificate on that hostname. In order for the SSL to validate, ensure Cloudflare Access or a Cloudflare Worker is allowing requests to the validation path: `http://{domain_name}/.well-known/acme-challenge/*`.
- [Advanced Certificates](/ssl/edge-certificates/advanced-certificate-manager/) cannot be used with Cloudflare Pages due to Cloudflare for SaaS's [certificate prioritization](/ssl/reference/certificate-and-hostname-priority/).
## Pages Functions
- [Functions](/pages/functions/) does not currently support adding/removing polyfills, so your bundler (for example, webpack) may not run.
- `passThroughOnException()` is not currently available for Advanced Mode Pages Functions (Pages Functions which use an `_worker.js` file).
- `passThroughOnException()` is not currently as resilient as it is in Workers. We currently wrap Pages Functions code in a `try`/`catch` block and fallback to calling `env.ASSETS.fetch()`. This means that any critical failures (such as exceeding CPU time or exceeding memory) may still throw an error.
## Enable Access on your `*.pages.dev` domain
If you would like to enable [Cloudflare Access](https://www.cloudflare.com/teams-access/) for your preview deployments and your `*.pages.dev` domain, you must:
1. Log in to [Cloudflare dashboard](https://dash.cloudflare.com/login).
2. From Account Home, select **Workers & Pages**.
3. In **Overview**, select your Pages project.
4. Go to **Settings** > **Enable access policy**.
5. Select **Edit** on the Access policy created for your preview deployments.
6. In Edit, go to **Overview**.
7. In the **Subdomain** field, delete the wildcard (`*`) and select **Save application**. You may need to change the **Application name** at this step to avoid an error.
At this step, your `*.pages.dev` domain has been secured behind Access. To resecure your preview deployments:
8. Go back to your Pages project > **Settings** > **General** > and reselect **Enable access policy**.
9. Review that two Access policies, one for your `*.pages.dev` domain and one for your preview deployments (`*.<your-project>.pages.dev`), have been created.
If you have a custom domain and protected your `*.pages.dev` domain behind Access, you must:
10. Select **Add an application** > **Self hosted** in [Cloudflare Zero Trust](https://one.dash.cloudflare.com/).
11. Input an **Application name** and select your custom domain from the _Domain_ dropdown menu.
12. Select **Next** and configure your access rules to define who can reach the Access authentication page.
13. Select **Add application**.
:::caution
If you do not configure an Access policy for your custom domain, the Access authentication page will render for visitors on your custom domain but authentication will not work. If your Pages project has a custom domain, make sure to add an Access policy as described above in steps 10 through 13 to avoid any authentication issues.
:::
If you have an issue that you do not see listed, let the team know in the Cloudflare Workers Discord. Get your invite at [discord.cloudflare.com](https://discord.cloudflare.com), and share your bug report in the #pages-general channel.
## Delete a project with a high number of deployments
You may not be able to delete your Pages project if it has a high number (over 100) of deployments. The Cloudflare team is tracking this issue.
As a workaround, review the following steps to delete all deployments in your Pages project. After you delete your deployments, you will be able to delete your Pages project.
1. Download the `delete-all-deployments.zip` file by going to the following link: [https://pub-505c82ba1c844ba788b97b1ed9415e75.r2.dev/delete-all-deployments.zip](https://pub-505c82ba1c844ba788b97b1ed9415e75.r2.dev/delete-all-deployments.zip).
2. Extract the `delete-all-deployments.zip` file.
3. Open your terminal and `cd` into the `delete-all-deployments` directory.
4. In the `delete-all-deployments` directory, run `npm install` to install dependencies.
5. Review the following commands to decide which deletion you would like to proceed with:
- To delete all deployments except for the live production deployment (excluding [aliased deployments](https://developers.cloudflare.com/pages/configuration/preview-deployments/#preview-aliases)):
```sh
CF_API_TOKEN=<CF_API_TOKEN> CF_ACCOUNT_ID=<ACCOUNT_ID> CF_PAGES_PROJECT_NAME=<PROJECT_NAME> npm start
```
- To delete all deployments except for the live production deployment (including [aliased deployments](https://developers.cloudflare.com/pages/configuration/preview-deployments/#preview-aliases), for example, `staging.example.pages.dev`):
```sh
CF_API_TOKEN=<CF_API_TOKEN> CF_ACCOUNT_ID=<ACCOUNT_ID> CF_PAGES_PROJECT_NAME=<PROJECT_NAME> CF_DELETE_ALIASED_DEPLOYMENTS=true npm start
```
To find your Cloudflare API token, log in to the [Cloudflare dashboard](https://dash.cloudflare.com), select the user icon on the upper righthand side of your screen > go to **My Profile** > **API Tokens**.
To find your Account ID, refer to [Find your zone and account ID](/fundamentals/setup/find-account-and-zone-ids/).
## Use Pages as Origin in Cloudflare Load Balancer
[Cloudflare Load Balancing](/load-balancing/) will not work without the host header set. To use a Pages project as target, make sure to select **Add host header** when [creating a pool](/load-balancing/pools/create-pool/#create-a-pool), and set both the host header value and the endpoint address to your `pages.dev` domain.
Refer to [Use Cloudflare Pages as origin](/load-balancing/pools/cloudflare-pages-origin/) for a complete tutorial.
---
# Limits
URL: https://developers.cloudflare.com/pages/platform/limits/
import { Render } from "~/components"
Below are limits observed by the Cloudflare Free plan. For more details on removing these limits, refer to the [Cloudflare plans](https://www.cloudflare.com/plans) page.
## Builds
Each time you push new code to your Git repository, Pages will build and deploy your site. You can build up to 500 times per month on the Free plan. Refer to the Pro and Business plans in [Pricing](https://pages.cloudflare.com/#pricing) if you need more builds.
Builds will timeout after 20 minutes. Concurrent builds are counted per account.
## Custom domains
Based on your Cloudflare plan type, a Pages project is limited to a specific number of custom domains. This limit is on a per-project basis.
| Free | Pro | Business | Enterprise |
| ---- | --- | -------- | ---------- |
| 100 | 250 | 500 | 500[^1] |
[^1]: If you need more custom domains, contact your account team.
## Files
Pages uploads each file on your site to Cloudflare's globally distributed network to deliver a low latency experience to every user that visits your site. Cloudflare Pages sites can contain up to 20,000 files.
## File size
The maximum file size for a single Cloudflare Pages site asset is 25 MiB.
:::note[Larger Files]
To serve larger files, consider uploading them to [R2](/r2/) and utilizing the [public bucket](/r2/buckets/public-buckets/) feature. You can also use [custom domains](/r2/buckets/public-buckets/#connect-a-bucket-to-a-custom-domain), such as `static.example.com`, for serving these files.
:::
## Headers
A `_headers` file can have a maximum of 100 header rules.
An individual header in a `_headers` file can have a maximum of 2,000 characters. For managing larger headers, it is recommended to implement [Pages Functions](/pages/functions/).
## Preview deployments
You can have an unlimited number of [preview deployments](/pages/configuration/preview-deployments/) active on your project at a time.
## Redirects
A `_redirects` file can have a maximum of 2,000 static redirects and 100 dynamic redirects, for a combined total of 2,100 redirects. It is recommended to use [Bulk Redirects](/pages/configuration/redirects/#surpass-_redirects-limits) when you have a need for more than the `_redirects` file supports.
## Users
Your Pages site can be managed by an unlimited number of users via the Cloudflare dashboard. Note that this does not correlate with your Git project – you can manage both public and private repositories, open issues, and accept pull requests via your Git provider without impacting your Pages site.
## Projects
Cloudflare Pages has a soft limit of 100 projects within your account in order to prevent abuse. If you need this limit raised, contact your Cloudflare account team or use the Limit Increase Request Form at the top of this page.
In order to protect against abuse of the service, Cloudflare may temporarily disable your ability to create new Pages projects, if you are deploying a large number of applications in a short amount of time. Contact support if you need this limit increased.
---
# Add custom HTTP headers
URL: https://developers.cloudflare.com/pages/how-to/add-custom-http-headers/
import { WranglerConfig } from "~/components";
:::note
Cloudflare provides HTTP header customization for Pages projects by adding a `_headers` file to your project. Refer to the [documentation](/pages/configuration/headers/) for more information.
:::
More advanced customization of HTTP headers is available through Cloudflare Workers [serverless functions](https://www.cloudflare.com/learning/serverless/what-is-serverless/).
If you have not deployed a Worker before, get started with our [tutorial](/workers/get-started/guide/). For the purpose of this tutorial, accomplish steps one (Sign up for a Workers account) through four (Generate a new project) before returning to this page.
Before continuing, ensure that your Cloudflare Pages project is connected to a [custom domain](/pages/configuration/custom-domains/#add-a-custom-domain).
## Writing a Workers function
Workers functions are written in [JavaScript](https://www.cloudflare.com/learning/serverless/serverless-javascript/). When a Worker makes a request to a Cloudflare Pages application, it will receive a response. The response a Worker receives is immutable, meaning it cannot be changed. In order to add, delete, or alter headers, clone the response and modify the headers on a new `Response` instance. Return the new response to the browser with your desired header changes. An example of this is shown below:
```js title="Setting custom headers with a Workers function"
export default {
async fetch(request) {
// This proxies your Pages application under the condition that your Worker script is deployed on the same custom domain as your Pages project
const response = await fetch(request);
// Clone the response so that it is no longer immutable
const newResponse = new Response(response.body, response);
// Add a custom header with a value
newResponse.headers.append(
"x-workers-hello",
"Hello from Cloudflare Workers",
);
// Delete headers
newResponse.headers.delete("x-header-to-delete");
newResponse.headers.delete("x-header2-to-delete");
// Adjust the value for an existing header
newResponse.headers.set("x-header-to-change", "NewValue");
return newResponse;
},
};
```
## Deploying a Workers function in the dashboard
The easiest way to start deploying your Workers function is by typing [workers.new](https://workers.new/) in the browser. Log in to your account to be automatically directed to the Workers & Pages dashboard. From the Workers & Pages dashboard, write your function or use one of the [examples from the Workers documentation](/workers/examples/).
Select **Save and Deploy** when your script is ready and set a [route](/workers/configuration/routing/routes/) in your domain's zone settings.
For example, [here is a Workers script](/workers/examples/security-headers/) you can copy and paste into the Workers dashboard that sets common security headers whenever a request hits your Pages URL, such as X-XSS-Protection, X-Frame-Options, X-Content-Type-Options, Strict-Transport-Security, Content-Security-Policy (CSP), and more.
## Deploying a Workers function using the CLI
If you would like to skip writing this file yourself, you can use our `custom-headers-example` [template](https://github.com/kristianfreeman/custom-headers-example) to generate a new Workers function with [wrangler](/workers/wrangler/install-and-update/), the Workers CLI tool.
```sh title="Generating a serverless function with wrangler"
git clone https://github.com/cloudflare/custom-headers-example
cd custom-headers-example
npm install
```
To operate your Workers function alongside your Pages application, deploy it to the same custom domain as your Pages application. To do this, update the Wrangler file in your project with your account and zone details:
```toml null {4,6,7}
name = "custom-headers-example"
account_id = "FILL-IN-YOUR-ACCOUNT-ID"
workers_dev = false
route = "FILL-IN-YOUR-WEBSITE.com/*"
zone_id = "FILL-IN-YOUR-ZONE-ID"
```
If you do not know how to find your Account ID and Zone ID, refer to [our guide](/fundamentals/setup/find-account-and-zone-ids/).
Once you have configured your [Wrangler configuration file](/pages/functions/wrangler-configuration/), run `npx wrangler deploy` in your terminal to deploy your Worker:
```sh
npx wrangler deploy
```
After you have deployed your Worker, your desired HTTP header adjustments will take effect. While the Worker is deployed, you should continue to see the content from your Pages application as normal.
---
# Set build commands per branch
URL: https://developers.cloudflare.com/pages/how-to/build-commands-branches/
This guide will instruct you how to set build commands on specific branches. You will use the `CF_PAGES_BRANCH` environment variable to run a script on a specified branch as opposed to your Production branch. This guide assumes that you have a Cloudflare account and a Pages project.
## Set up
Create a `.sh` file in your project directory. You can choose your file's name, but we recommend you name the file `build.sh`.
In the following script, you will use the `CF_PAGES_BRANCH` environment variable to check which branch is currently being built. Populate your `.sh` file with the following:
```bash
#!/bin/bash
if [ "$CF_PAGES_BRANCH" == "production" ]; then
# Run the "production" script in `package.json` on the "production" branch
# "production" should be replaced with the name of your Production branch
npm run production
elif [ "$CF_PAGES_BRANCH" == "staging" ]; then
# Run the "staging" script in `package.json` on the "staging" branch
# "staging" should be replaced with the name of your specific branch
npm run staging
else
# Else run the dev script
npm run dev
fi
```
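For the script above to work, the referenced scripts must exist in your `package.json`. A minimal sketch follows; the build commands shown (using Vite) are placeholders for whatever your project actually uses:
```json
{
	"scripts": {
		"production": "vite build --mode production",
		"staging": "vite build --mode staging",
		"dev": "vite build --mode development"
	}
}
```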
## Publish your changes
To put your changes into effect:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In Account Home, select **Workers & Pages** > in **Overview**, select your Pages project.
3. Go to **Settings** > **Build & deployments** > **Build configurations** > **Edit configurations**.
4. Update the **Build command** field value to `bash build.sh` and select **Save**.
To test that your build is successful, deploy your project.
---
# Add a custom domain to a branch
URL: https://developers.cloudflare.com/pages/how-to/custom-branch-aliases/
In this guide, you will learn how to add a custom domain (`staging.example.com`) that will point to a specific branch (`staging`) on your Pages project.
This will allow you to have a custom domain that will always show the latest build for a specific branch on your Pages project.
:::note
Currently, this setup is only supported when using Cloudflare DNS.
If you attempt to follow this guide using an external DNS provider, your custom alias will be sent to the production branch of your Pages project.
:::
First, make sure that you have a successful deployment on the branch you would like to set up a custom domain for.
Next, add a custom domain under your Pages project for your desired custom domain, for example, `staging.example.com`.

To do this:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login).
2. In Account Home, go to **Workers & Pages**.
3. Select your Pages project.
4. Select **Custom domains** > **Setup a custom domain**.
5. Input the domain you would like to use, such as `staging.example.com`.
6. Select **Continue** > **Activate domain**.

After activating your custom domain, go to [DNS](https://dash.cloudflare.com/?to=/:account/:zone/dns) for the `example.com` zone and find the `CNAME` record with the name `staging` and change the target to include your branch alias.
In this instance, change `your-project.pages.dev` to `staging.your-project.pages.dev`.

Now the `staging` branch of your Pages project will be available on `staging.example.com`.
---
# Deploy a static WordPress site
URL: https://developers.cloudflare.com/pages/how-to/deploy-a-wordpress-site/
## Overview
In this guide, you will use a WordPress plugin, [Simply Static](https://wordpress.org/plugins/simply-static/), to convert your existing WordPress site to a static website deployed with Cloudflare Pages.
## Prerequisites
This guide assumes that you are:
* The Administrator account on your WordPress site.
* Able to install WordPress plugins on the site.
## Setup
To start, install the [Simply Static](https://wordpress.org/plugins/simply-static/) plugin to export your WordPress site. In your WordPress dashboard, go to **Plugins** > **Add New**.
Search for `Simply Static` and confirm that the plugin you will be installing matches the plugin below.

Select **Install** on the plugin. After it has finished installing, select **Activate**.
### Export your WordPress site
After you have installed the plugin, go to your WordPress dashboard > **Simply Static** > **GENERATE STATIC FILES**.
In the **Activity Log**, find the **ZIP archive created** message and select **Click here to download** to download your ZIP file.
### Deploy your WordPress site with Pages
With your ZIP file downloaded, deploy your site to Pages:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In Account Home, select **Workers & Pages** > **Create application** > **Pages** > **Upload assets**.
3. Name your project > **Create project**.
4. Drag and drop your ZIP file (or unzipped folder of assets) or select it from your computer.
5. After your files have been uploaded, select **Deploy site**.
Your WordPress site will now be live on Pages.
Every time you make a change to your WordPress site, you will need to download a new ZIP file from the WordPress dashboard and redeploy to Cloudflare Pages. Automatic updates are not available with the free version of Simply Static.
## Limitations
There are some features available in WordPress sites that will not be supported in a static site environment:
* WordPress Forms.
* WordPress Comments.
* Any links to `/wp-admin` or similar internal WordPress routes.
## Conclusion
By following this guide, you have successfully deployed a static version of your WordPress site to Cloudflare Pages.
With a static version of your site being served, you can:
* Move your WordPress site to a custom domain or subdomain. Refer to [Custom domains](/pages/configuration/custom-domains/) to learn more.
* Run your WordPress instance locally, or put your WordPress site behind [Cloudflare Access](/pages/configuration/preview-deployments/#customize-preview-deployments-access) to only give access to your contributors. This has a significant effect on the number of attack vectors for your WordPress site and its content.
* Downgrade your WordPress hosting plan to a cheaper plan. Because the memory and bandwidth requirements for your WordPress instance are now smaller, you can often host it on a cheaper plan or move it to shared hosting.
Connect with the [Cloudflare Developer community on Discord](https://discord.cloudflare.com) to ask questions and discuss the platform with other developers.
---
# Enable Zaraz
URL: https://developers.cloudflare.com/pages/how-to/enable-zaraz/
import { Render } from "~/components"
## Enable
To enable Zaraz on Cloudflare Pages, you need a [custom domain](/pages/configuration/custom-domains/) associated with your project.
After that, [set up Zaraz](/zaraz/get-started/) on the custom domain.
---
# How to
URL: https://developers.cloudflare.com/pages/how-to/
import { DirectoryListing } from "~/components"
---
# Install private packages
URL: https://developers.cloudflare.com/pages/how-to/npm-private-registry/
Cloudflare Pages supports custom package registries, allowing you to include private dependencies in your application. While this walkthrough focuses specifically on [npm](https://www.npmjs.com/), the Node package manager and registry, the same approach can be applied to other registry tools.
You will be adjusting the [environment variables](/pages/configuration/build-configuration/#environment-variables) in your Pages project's **Settings**. An existing project can be modified at any time, and new projects can be initialized with these settings, too. Either way, changes to the project settings will not take effect until the next deployment.
:::caution
Be sure to trigger a new deployment after changing any settings.
:::
## Registry Access Token
Every package registry should have a means of issuing new access tokens. Ideally, you should create a new token specifically for Pages, as you would with any other CI/CD platform.
With npm, you can [create and view tokens through its website](https://docs.npmjs.com/creating-and-viewing-access-tokens) or you can use the `npm` CLI. If you have the CLI set up locally and are authenticated, run the following commands in your terminal:
```sh
# Verify the current npm user is correct
npm whoami
# Create a readonly token
npm token create --read-only
#-> Enter password, if prompted
#-> Enter 2FA code, if configured
```
This will produce a read-only token that looks like a UUID string. Save this value for a later step.
## Private modules on the npm registry
The following section applies to users with applications that are only using private modules from the npm registry.
In your Pages project's **Settings** > **Environment variables**, add a new [environment variable](/pages/configuration/build-configuration/#environment-variables) named `NPM_TOKEN` to the **Production** and **Preview** environments and paste the [read-only token you created](#registry-access-token) as its value.
:::caution
Add the `NPM_TOKEN` variable to both the **Production** and **Preview** environments.
:::
By default, `npm` looks for an environment variable named `NPM_TOKEN` and because you did not define a [custom registry endpoint](#custom-registry-endpoints), the npm registry is assumed. Local development should continue to work as expected, provided that you and your teammates are authenticated with npm accounts (see `npm whoami` and `npm login`) that have been granted access to the private package(s).
## Custom registry endpoints
When multiple registries are in use, a project will need to define its own root-level [`.npmrc`](https://docs.npmjs.com/cli/v7/configuring-npm/npmrc) configuration file. An example `.npmrc` file may look like this:
```ini
@foobar:registry=https://npm.pkg.github.com
//registry.npmjs.org/:_authToken=${TOKEN_FOR_NPM}
//npm.pkg.github.com/:_authToken=${TOKEN_FOR_GITHUB}
```
Here, all packages under the `@foobar` scope are directed towards the GitHub Packages registry. Then the registries are assigned their own access tokens via their respective environment variable names.
:::note
You only need to define an Access Token for the npm registry (refer to `TOKEN_FOR_NPM` in the example) if it is hosting private packages that your application requires.
:::
Your Pages project must then have the matching [environment variables](/pages/configuration/build-configuration/#environment-variables) defined for all environments. In our example, that means `TOKEN_FOR_NPM` must contain [the read-only npm token](#registry-access-token) value and `TOKEN_FOR_GITHUB` must contain its own [personal access token](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token#creating-a-token).
### Managing multiple environments
If local development no longer works with your new `.npmrc` file, you will need to make some additional changes:
1. Rename the Pages-specific `.npmrc` file to `.npmrc.pages`. This is the version that references environment variables.
2. Restore your previous `.npmrc` file – the version that was previously working for you and your teammates.
3. Go to your Pages project > **Settings** > **Environment variables**, add a new [environment variable](/pages/configuration/build-configuration/#environment-variables) named [`NPM_CONFIG_USERCONFIG`](https://docs.npmjs.com/cli/v6/using-npm/config#npmrc-files) and set its value to `/opt/buildhome/repo/.npmrc.pages`. If your `.npmrc.pages` file is not in your project's root directory, adjust this path accordingly.
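To verify locally that the Pages-specific configuration resolves your private packages, you can point npm at it for a single install. This is a sketch that assumes `.npmrc.pages` sits in your project root and reuses the `TOKEN_FOR_NPM` and `TOKEN_FOR_GITHUB` names from the example above:
```sh
# Use the Pages-specific config for this install only, supplying the same
# tokens that the Pages build environment would provide.
TOKEN_FOR_NPM="<npm-read-only-token>" \
TOKEN_FOR_GITHUB="<github-personal-access-token>" \
NPM_CONFIG_USERCONFIG="./.npmrc.pages" \
npm install
```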
---
# Preview Local Projects with Cloudflare Tunnel
URL: https://developers.cloudflare.com/pages/how-to/preview-with-cloudflare-tunnel/
[Cloudflare Tunnel](/cloudflare-one/connections/connect-networks/) runs a lightweight daemon (`cloudflared`) in your infrastructure that establishes outbound connections (Tunnels) between your origin web server and the Cloudflare global network. In practical terms, you can use Cloudflare Tunnel to allow remote access to services running on your local machine. It is an alternative to popular tools like [Ngrok](https://ngrok.com), and provides free, long-running tunnels via the [TryCloudflare](/cloudflare-one/connections/connect-networks/do-more-with-tunnels/trycloudflare/) service.
While Cloudflare Pages provides unique [deploy preview URLs](/pages/configuration/preview-deployments/) for new branches and commits on your projects, Cloudflare Tunnel can be used to provide access to locally running applications and servers during the development process. In this guide, you will install Cloudflare Tunnel, and create a new tunnel to provide access to a locally running application. You will need a Cloudflare account to begin using Cloudflare Tunnel.
## Installing Cloudflare Tunnel
Cloudflare Tunnel can be installed on Windows, Linux, and macOS. To learn about installing Cloudflare Tunnel, refer to the [Install cloudflared](/cloudflare-one/connections/connect-networks/downloads/) page in the Cloudflare for Teams documentation.
Confirm that `cloudflared` is installed correctly by running `cloudflared --version` in your command line:
```sh
cloudflared --version
```
```sh output
cloudflared version 2021.5.9 (built 2021-05-21-1541 UTC)
```
## Run a local service
The easiest way to get up and running with Cloudflare Tunnel is to have an application running locally, such as a [React](/pages/framework-guides/deploy-a-react-site/) or [SvelteKit](/pages/framework-guides/deploy-a-svelte-kit-site/) site. When you are developing an application with these frameworks, they will often make use of an `npm run develop` script, or something similar, which mounts the application and runs it on a `localhost` port. For example, the popular `vite` tool runs your in-development React application on port `5173`, making it accessible at the `http://localhost:5173` address.
## Start a Cloudflare Tunnel
With a local development server running, a new Cloudflare Tunnel can be instantiated by running `cloudflared tunnel` in a new command line window, passing in the `--url` flag with your `localhost` URL and port. `cloudflared` will output logs to your command line, including a banner with a tunnel URL:
```sh
cloudflared tunnel --url http://localhost:5173
```
```sh output
2021-07-15T20:11:29Z INF Cannot determine default configuration path. No file [config.yml config.yaml] in [~/.cloudflared ~/.cloudflare-warp ~/cloudflare-warp /etc/cloudflared /usr/local/etc/cloudflared]
2021-07-15T20:11:29Z INF Version 2021.5.9
2021-07-15T20:11:29Z INF GOOS: linux, GOVersion: devel +11087322f8 Fri Nov 13 03:04:52 2020 +0100, GoArch: amd64
2021-07-15T20:11:29Z INF Settings: map[url:http://localhost:5173]
2021-07-15T20:11:29Z INF cloudflared will not automatically update when run from the shell. To enable auto-updates, run cloudflared as a service: https://developers.cloudflare.com/argo-tunnel/reference/service/
2021-07-15T20:11:29Z INF Initial protocol h2mux
2021-07-15T20:11:29Z INF Starting metrics server on 127.0.0.1:42527/metrics
2021-07-15T20:11:29Z WRN Your version 2021.5.9 is outdated. We recommend upgrading it to 2021.7.0
2021-07-15T20:11:29Z INF Connection established connIndex=0 location=ATL
2021-07-15T20:11:32Z INF Each HA connection's tunnel IDs: map[0:cx0nsiqs81fhrfb82pcq075kgs6cybr86v9vdv8vbcgu91y2nthg]
2021-07-15T20:11:32Z INF +-------------------------------------------------------------+
2021-07-15T20:11:32Z INF | Your free tunnel has started! Visit it: |
2021-07-15T20:11:32Z INF | https://seasonal-deck-organisms-sf.trycloudflare.com |
2021-07-15T20:11:32Z INF +-------------------------------------------------------------+
```
In this example, the randomly-generated URL `https://seasonal-deck-organisms-sf.trycloudflare.com` has been created and assigned to your tunnel instance. Visiting this URL in a browser will show the application running, with requests being securely forwarded through Cloudflare's global network, through the tunnel running on your machine, to `localhost:5173`:

## Next Steps
Cloudflare Tunnel can be configured in a variety of ways and can be used beyond providing access to your in-development applications. For example, you can provide `cloudflared` with a [configuration file](/cloudflare-one/connections/connect-networks/do-more-with-tunnels/local-management/configuration-file/) to add more complex routing and tunnel setups that go beyond a simple `--url` flag. You can also [attach a Cloudflare DNS record](/cloudflare-one/connections/connect-networks/routing-to-tunnel/dns/) to a domain or subdomain for an easily accessible, long-lived tunnel to your local machine.
Finally, by incorporating Cloudflare Access, you can provide [secure access to your tunnels](/cloudflare-one/applications/configure-apps/self-hosted-public-app/) without exposing your entire server, or compromising on security. Refer to the [Cloudflare for Teams documentation](/cloudflare-one/) to learn more about what you can do with Cloudflare's entire suite of Zero Trust tools.
---
# Redirecting *.pages.dev to a Custom Domain
URL: https://developers.cloudflare.com/pages/how-to/redirect-to-custom-domain/
import { Example } from "~/components"
Learn how to use [Bulk Redirects](/rules/url-forwarding/bulk-redirects/) to redirect your `*.pages.dev` subdomain to your [custom domain](/pages/configuration/custom-domains/).
You may want to do this to ensure that your site's content is served only on the custom domain, and not the `.pages.dev` site automatically generated on your first Pages deployment.
## Setup
To redirect a `.pages.dev` subdomain to your custom domain:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/?to=/:account/pages/view/:pages-project/domains), and select your account.
2. Select **Workers & Pages** and select your Pages application.
3. Go to **Custom domains** and make sure that your custom domain is listed. If it is not, add it by clicking **Set up a custom domain**.
4. Go to **Bulk Redirects**.
5. [Create a bulk redirect list](/rules/url-forwarding/bulk-redirects/create-dashboard/#1-create-a-bulk-redirect-list) modeled after the following (but replacing the values as appropriate):
| Source URL   | Target URL            | Status | Parameters                                                                                 |
| ------------ | --------------------- | ------ | ------------------------------------------------------------------------------------------ |
| `.pages.dev` | `https://example.com` | `301`  | Preserve query string<br/>Subpath matching<br/>Preserve path suffix<br/>Include subdomains |
6. [Create a bulk redirect rule](/rules/url-forwarding/bulk-redirects/create-dashboard/#2-create-a-bulk-redirect-rule) using the list you just created.
To test that your redirect worked, go to your `.pages.dev` domain. If the URL is now set to your custom domain, then the rule has propagated.
## Related resources
* [Redirect www to domain apex](/pages/how-to/www-redirect/)
* [Handle redirects with Bulk Redirects](/rules/url-forwarding/bulk-redirects/)
---
# Refactor a Worker to a Pages Function
URL: https://developers.cloudflare.com/pages/how-to/refactor-a-worker-to-pages-functions/
In this guide, you will learn how to refactor a Worker that handles form submissions into a Pages Function that can be hosted on your Cloudflare Pages application. [Pages Functions](/pages/functions/) are serverless functions that live within the same project directory as your application and are deployed with Cloudflare Pages. They enable you to run server-side code that adds dynamic functionality without running a dedicated server. You may want to refactor a Worker to a Pages Function for one of these reasons:
1. If you manage a serverless function that your Pages application depends on and wish to ship the logic without managing a Worker as a separate service.
2. If you are migrating your Worker to Pages Functions and want to use the routing and middleware capabilities of Pages Functions.
:::note
You can import your Worker to a Pages project without using Functions by creating a `_worker.js` file in the output directory of your Pages project. This [Advanced mode](/pages/functions/advanced-mode/) requires writing your Worker with [Module syntax](/workers/reference/migrate-to-module-workers/).
However, when using the `_worker.js` file in Pages, the entire `/functions` directory is ignored – including its routing and middleware characteristics.
:::
## General refactoring steps
1. Remove the `fetch` handler and replace it with the appropriate `onRequest` method (see the sketch after this list). Refer to [Functions](/pages/functions/get-started/) to select the appropriate method for your Function.
2. Pass the `context` object as an argument to your new `onRequest` method to access its properties: `request`, `env`, `params`, and `next`.
3. Use middleware to handle logic that must be executed before or after route handlers. Learn more about [using Middleware](/pages/functions/middleware/) in the Functions documentation.
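As a rough sketch of steps 1 and 2 (the file names and response bodies are illustrative), the shape of the change looks like this:
```js
// Before: a Module Worker (for example, src/index.js) with a fetch handler.
export default {
  async fetch(request, env, ctx) {
    return new Response(`Hello from ${new URL(request.url).pathname}`);
  },
};
```
```js
// After: a Pages Function (for example, functions/hello.js). The single
// `context` argument carries `request`, `env`, `params`, and `next`.
export async function onRequest(context) {
  const { request, env, params, next } = context;
  return new Response(`Hello from ${new URL(request.url).pathname}`);
}
```
The rest of the handler body can usually be carried over unchanged once it reads `request` and `env` from `context`.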
## Background
To explain the process of refactoring, this guide uses a simple form submission example.
Form submissions can be handled by Workers but can also be a good use case for Pages Functions, since forms are usually specific to a particular application.
Assuming you are already using a Worker to handle your form, you would have deployed this Worker and then added the URL to your form action attribute in your HTML form. This means that when you change how the Worker handles your submissions, you must make changes to the Worker script. If the logic in your Worker is used by more than one application, Pages Functions would not be a good use case.
However, it can be beneficial to use a [Pages Function](/pages/functions/) when you would like to organize your function logic in the same project directory as your application.
Building your application using Pages Functions can help you manage your client and serverless logic from the same place and make it easier to write and debug your code.
## Handle form entries with Airtable and Workers
[Airtable](https://airtable.com/) is a low-code platform for building collaborative applications. It helps you customize workflows, collaborate, and handle form submissions. For this example, you will use Airtable to store form submissions.
[Airtable](https://airtable.com/) can be used to store entries of information in different tables for the same account. When creating a Worker for handling the submission logic, the first step is to use [Wrangler](/workers/wrangler/install-and-update/) to initialize a new Worker within a specific folder or at the root of your application.
This step creates the boilerplate to write your Airtable submission Worker. After writing your Worker, you can deploy it to Cloudflare's global network after you [configure your project for deployment](/workers/wrangler/configuration/). Refer to the Workers documentation for a full tutorial on how to [handle form submission with Workers](/workers/tutorials/handle-form-submissions-with-airtable/).
The following code block shows an example of a Worker that handles Airtable form submission.
The async `submitHandler` function is called if the pathname of the request URL is `/submit`. This function checks that the request method is a `POST` request and then proceeds to parse and post the form entries to Airtable using your credentials, which you can store using [Wrangler `secret`](/workers/wrangler/commands/#secret).
```js
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    if (url.pathname === "/submit") {
      return submitHandler(request, env);
    }
    return fetch(request.url);
  },
};

async function submitHandler(request, env) {
  if (request.method !== "POST") {
    return new Response("Method not allowed", {
      status: 405,
    });
  }
  const body = await request.formData();
  const { first_name, last_name, email, phone, subject, message } =
    Object.fromEntries(body);
  const reqBody = {
    fields: {
      "First Name": first_name,
      "Last Name": last_name,
      Email: email,
      "Phone number": phone,
      Subject: subject,
      Message: message,
    },
  };
  return HandleAirtableData(reqBody, env);
}

const HandleAirtableData = (body, env) => {
  return fetch(
    `https://api.airtable.com/v0/${env.AIRTABLE_BASE_ID}/${encodeURIComponent(
      env.AIRTABLE_TABLE_NAME,
    )}`,
    {
      method: "POST",
      body: JSON.stringify(body),
      headers: {
        Authorization: `Bearer ${env.AIRTABLE_API_KEY}`,
        "Content-type": `application/json`,
      },
    },
  );
};
```
### Refactor your Worker
To refactor the above Worker, go to your Pages project directory and create a `/functions` folder. In `/functions`, create a `form.js` file. This file will handle form submissions.
Then, in the `form.js` file, export a single `onRequestPost`:
```js
export async function onRequestPost(context) {
  return await submitHandler(context);
}
```
Every Worker defines a `fetch` handler (or an `addEventListener` in Service Worker syntax) to respond to requests, but you will not need this in a Pages Function. Instead, you `export` a single `onRequest` function and name it according to the HTTP method it handles. Refer to the [Functions documentation](/pages/functions/get-started/) to select the appropriate method for your function.
The above code takes the `context` object as an argument and passes it down to the `submitHandler` function, which remains largely unchanged from the [original Worker](#handle-form-entries-with-airtable-and-workers). However, because Functions let you target a specific HTTP method, you can remove the `request.method` check from your Worker; naming the handler `onRequestPost` handles this for you.
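As a sketch of that convention (the handler bodies are illustrative), method-specific exports replace the manual check:
```js
// functions/form.js (illustrative): each export runs only for its HTTP method,
// so there is no need to inspect request.method inside the handler.
export async function onRequestPost(context) {
  return new Response("Form received", { status: 201 });
}

export async function onRequestGet(context) {
  return new Response("Use POST to submit the form", { status: 405 });
}
```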
Now, introduce the `submitHandler` function, which passes `context.env` down as a property so that the `HandleAirtableData` function below can access your environment. `HandleAirtableData` makes a `POST` request to Airtable using your Airtable credentials:
```js null {4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22}
export async function onRequestPost(context) {
  return await submitHandler(context);
}
async function submitHandler(context) {
  const body = await context.request.formData();
  const { first_name, last_name, email, phone, subject, message } =
    Object.fromEntries(body);
  const reqBody = {
    fields: {
      "First Name": first_name,
      "Last Name": last_name,
      Email: email,
      "Phone number": phone,
      Subject: subject,
      Message: message,
    },
  };
  return HandleAirtableData({ body: reqBody, env: context.env });
}
```
Finally, create a `HandleAirtableData` function. This function will send a `fetch` request to Airtable with your Airtable credentials and the body of your request:
```js
// ..
const HandleAirtableData = async function onRequest({ body, env }) {
  return fetch(
    `https://api.airtable.com/v0/${env.AIRTABLE_BASE_ID}/${encodeURIComponent(
      env.AIRTABLE_TABLE_NAME,
    )}`,
    {
      method: "POST",
      body: JSON.stringify(body),
      headers: {
        Authorization: `Bearer ${env.AIRTABLE_API_KEY}`,
        "Content-type": `application/json`,
      },
    },
  );
};
```
By completing this guide, you have successfully refactored your form submission Worker into a form submission Pages Function. You can test your Function [locally using Wrangler](/pages/functions/local-development/) before deploying.
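A minimal local run might look like the following sketch; the `./dist` directory is an assumption, so substitute your project's build output directory:
```sh
# Serve the prebuilt static assets together with the /functions directory locally.
npx wrangler pages dev ./dist
```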
## Related resources
- [HTML forms](/pages/tutorials/forms/)
- [Plugins documentation](/pages/functions/plugins/)
- [Functions documentation](/pages/functions/)
---
# Use Direct Upload with continuous integration
URL: https://developers.cloudflare.com/pages/how-to/use-direct-upload-with-continuous-integration/
Cloudflare Pages supports directly uploading prebuilt assets, allowing you to use custom build steps for your applications and deploy to Pages with [Wrangler](/workers/wrangler/install-and-update/). This guide will teach you how to deploy your application to Pages, using continuous integration.
## Deploy with Wrangler
In your project directory, install [Wrangler](/workers/wrangler/install-and-update/) so you can deploy a folder of prebuilt assets by running the following command:
```sh
# Publish created project
$ CLOUDFLARE_ACCOUNT_ID= npx wrangler pages deploy --project-name=
```
## Get credentials from Cloudflare
### Generate an API Token
To generate an API token:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/profile/api-tokens).
2. Select **My Profile** from the dropdown menu of your user icon on the top right of your dashboard.
3. Select **API Tokens** > **Create Token**.
4. Under **Custom Token**, select **Get started**.
5. Name your API Token in the **Token name** field.
6. Under **Permissions**, select *Account*, *Cloudflare Pages*, and *Edit*.
7. Select **Continue to summary** > **Create Token**.

Now that you have created your API token, you can use it to push your project from continuous integration platforms.
### Get project account ID
To find your account ID, log in to the Cloudflare dashboard > select your zone in **Account Home** > find your account ID in **Overview** under **API** on the right-side menu. If you have not added a zone, add one by selecting **Add site**. You can purchase a domain from [Cloudflare's registrar](/registrar/).
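With both credentials in hand, a manual deploy from any CI environment looks roughly like the following sketch. The angle-bracket values are placeholders, and `./dist` is an assumed build output directory:
```sh
# Authenticate Wrangler non-interactively through environment variables,
# then upload the prebuilt assets to your Pages project.
CLOUDFLARE_ACCOUNT_ID=<ACCOUNT_ID> \
CLOUDFLARE_API_TOKEN=<API_TOKEN> \
npx wrangler pages deploy ./dist --project-name=<PROJECT_NAME>
```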
## Use GitHub Actions
[GitHub Actions](https://docs.github.com/en/actions) is a continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline when using GitHub. You can create workflows that build and test every pull request to your repository or deploy merged pull requests to production.
After setting up your project, you can set up a GitHub Action to automate your subsequent deployments with Wrangler.
### Add Cloudflare credentials to GitHub secrets
In the GitHub Action you have set up, environment variables are needed to push your project up to Cloudflare Pages. To add the values of these environment variables in your project's GitHub repository:
1. Go to your project's repository in GitHub.
2. Under your repository's name, select **Settings**.
3. Select **Secrets** > **Actions** > **New repository secret**.
4. Create one secret and put **CLOUDFLARE\_ACCOUNT\_ID** as the name with the value being your Cloudflare account ID.
5. Create another secret and put **CLOUDFLARE\_API\_TOKEN** as the name with the value being your Cloudflare API token.
Storing your Cloudflare account ID and API token as repository secrets keeps them secure, and each time your Action runs, it can access them.
### Set up a workflow
Create a `.github/workflows/pages-deployment.yaml` file at the root of your project. This file contains the jobs that run on the trigger you specify, which in this case is `on: [push]`. You can also trigger the workflow on pull requests. For a detailed explanation of GitHub Actions syntax, refer to the [official documentation](https://docs.github.com/en/actions).
In your `pages-deployment.yaml` file, copy the following content:
```yaml
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      deployments: write
    name: Deploy to Cloudflare Pages
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      # Run your project's build step
      # - name: Build
      #   run: npm install && npm run build
      - name: Publish
        uses: cloudflare/pages-action@v1
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
          projectName: YOUR_PROJECT_NAME # e.g. 'my-project'
          directory: YOUR_DIRECTORY_OF_STATIC_ASSETS # e.g. 'dist'
          gitHubToken: ${{ secrets.GITHUB_TOKEN }}
```
In the above code block, you have set up an Action that runs when you push code to the repository. Replace `YOUR_PROJECT_NAME` with your Cloudflare Pages project name and `YOUR_DIRECTORY_OF_STATIC_ASSETS` with your project's output directory.
The `${{ secrets.GITHUB_TOKEN }}` is automatically provided by GitHub Actions with the `contents: read` and `deployments: write` permissions. This enables the Cloudflare Pages action to create a Deployment on your behalf.
:::note
This workflow automatically triggers on the current git branch, unless you add a `branch` option to the `with` section.
:::
## Using CircleCI for CI/CD
[CircleCI](https://circleci.com/) is another continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline. It can be configured to efficiently run complex pipelines with caching, Docker layer caching, and resource classes.
Similar to GitHub Actions, CircleCI can use Wrangler to continuously deploy your projects each time you push code.
### Add Cloudflare credentials to CircleCI
After you have generated your Cloudflare API token and found your account ID in the dashboard, you will need to add them to your CircleCI dashboard to use your environment variables in your project.
To add environment variables, in the CircleCI web application:
1. Go to your Pages project > **Settings**.
2. Select **Projects** in the side menu.
3. Select the ellipsis (...) button in the project's row. You will see the option to add environment variables.
4. Select **Environment Variables** > **Add Environment Variable**.
5. Enter the name and value of the new environment variable, which is your Cloudflare credentials (`CLOUDFLARE_ACCOUNT_ID` and `CLOUDFLARE_API_TOKEN`).

### Set up a workflow
Create a `.circleci/config.yml` file at the root of your project. This file contains the jobs that will be executed based on the order of your workflow. In your `config.yml` file, copy the following content:
```yaml
version: 2.1
jobs:
  Publish-to-Pages:
    docker:
      - image: cimg/node:18.7.0
    steps:
      - checkout
      # Run your project's build step
      - run: npm install && npm run build
      # Publish with wrangler
      - run: npx wrangler pages deploy dist --project-name= # Replace dist with the name of your build folder and input your project name
workflows:
  Publish-to-Pages-workflow:
    jobs:
      - Publish-to-Pages
```
When using CircleCI, your continuous integration workflow is broken down into jobs. In the code block above, you first define a list of jobs that run on each commit. The job runs on a prebuilt Docker image, `cimg/node:18.7.0`, and first checks out the repository using the Node version specified in the image.
:::note[Note]
Wrangler requires a Node version of at least `16.17.0`. You must upgrade your Node.js version if your version is lower than `16.17.0`.
:::
You can modify the Wrangler command with any [`wrangler pages deploy` options](/workers/wrangler/commands/#deploy-1).
After all the specified steps, define a `workflow` at the end of your file. You can learn more about creating a custom process with CircleCI from the [official documentation](https://circleci.com/docs/2.0/concepts/).
## Travis CI for CI/CD
Travis CI is an open-source continuous integration tool that handles specific tasks, such as pull requests and code pushes for your project workflow. Travis CI can be integrated into your GitHub projects, databases, and other preinstalled services enabled in your build configuration. To use Travis CI, you should have a GitHub, Bitbucket, GitLab, or Assembla account.
### Add Cloudflare credentials to TravisCI
In your Travis project, add the Cloudflare credentials you generated from the Cloudflare dashboard so that you can access them in your `.travis.yml` file. Go to your Travis CI dashboard and select your current project > **More options** > **Settings** > **Environment Variables**.
Set the environment variable's name and value and the branch you want it to be attached to. You can also set the privacy of the value.
### Setup
Go to [Travis-ci.com](https://Travis-ci.com) and enable your repository by logging in with your preferred provider. This guide uses GitHub. Next, create a `.travis.yml` file and copy the following into the file:
```yaml
language: node_js
node_js:
  - "18.0.0" # You can specify more versions of Node you want your CI process to support
branches:
  only:
    - travis-ci-test # Specify what branch you want your CI process to run on
install:
  - npm install
script:
  - npm run build # Switch this out with your build command or remove it if you don't have a build step
  - npx wrangler pages deploy dist --project-name=
env:
  - CLOUDFLARE_ACCOUNT_ID: { $CLOUDFLARE_ACCOUNT_ID }
  - CLOUDFLARE_API_TOKEN: { $CLOUDFLARE_API_TOKEN }
```
This will set the Node.js version to 18. You have also set branches you want your continuous integration to run on. Finally, input your `PROJECT NAME` in the script section and your CI process should work as expected.
You can also modify the Wrangler command with any [`wrangler pages deploy` options](/workers/wrangler/commands/#deploy-1).
---
# Use Pages Functions for A/B testing
URL: https://developers.cloudflare.com/pages/how-to/use-worker-for-ab-testing-in-pages/
In this guide, you will learn how to use [Pages Functions](/pages/functions/) for A/B testing in your Pages projects. A/B testing is a user experience research methodology applied when comparing two or more versions of a web page or application. With A/B testing, you can serve two or more versions of a webpage to users and divide traffic to your site.
## Overview
Configuring different versions of your application for A/B testing will be unique to your specific use case. For all developers, A/B testing setup can be simplified into a few helpful principles.
Depending on the number of application versions you have (this guide uses two), you can assign your users into experimental groups. The experimental groups in this guide are the base route `/` and the test route `/test`.
To ensure that a user stays in the group they were assigned, you will set and store a cookie in the browser and serve the corresponding route based on that cookie's value.
## Set up your Pages Function
In your project, you can handle the logic for A/B testing using [Pages Functions](/pages/functions/). Pages Functions allows you to handle server logic from within your Pages project.
To begin:
1. Go to your Pages project directory on your local machine.
2. Create a `/functions` directory. Your application server logic will live in the `/functions` directory.
## Add middleware logic
Pages Functions have utility functions that can reuse chunks of logic which are executed before and/or after route handlers. These are called [middleware](/pages/functions/middleware/). Following this guide, middleware will allow you to intercept requests to your Pages project before they reach your site.
In your `/functions` directory, create a `_middleware.js` file.
:::note
When you create your `_middleware.js` file at the base of your `/functions` folder, the middleware will run for all routes on your project. Learn more about [middleware routing](/pages/functions/middleware/).
:::
Following the Functions naming convention, the `_middleware.js` file exports a single async `onRequest` handler that receives `request`, `env`, and `next` from the context object.
```js
const abTest = async ({ request, next, env }) => {
  /*
    Todo:
    1. Conditional statements to check for the cookie
    2. Assign cookies based on percentage, then serve
  */
}
export const onRequest = [abTest]
```
To set the cookie, create the `cookieName` variable and assign any value. Then create the `newHomepagePathName` variable and assign it `/test`:
```js null {1,2}
const cookieName = "ab-test-cookie"
const newHomepagePathName = "/test"
const abTest = async ({ request, next, env }) => {
  /*
    Todo:
    1. Conditional statements to check for the cookie
    2. Assign cookie based on percentage then serve
  */
}
export const onRequest = [abTest]
```
## Set up conditional logic
Based on the URL pathname, check that the cookie value is equal to `new`. If the value is `new`, then `newHomepagePathName` will be served.
```js null {7,8,9,10,11,12,13,14,15,16,17,18,19}
const cookieName = "ab-test-cookie"
const newHomepagePathName = "/test"
const abTest = async ({ request, next, env }) => {
  /*
    Todo:
    1. Assign cookies based on randomly generated percentage, then serve
  */
  const url = new URL(request.url)
  if (url.pathname === "/") {
    // if cookie ab-test-cookie=new then change the request to go to /test
    // if no cookie set, pass x% of traffic and set a cookie value to "current" or "new"
    let cookie = request.headers.get("cookie")
    // is cookie set?
    if (cookie && cookie.includes(`${cookieName}=new`)) {
      // Change the request to go to /test (as set in the newHomepagePathName variable)
      url.pathname = newHomepagePathName
      return env.ASSETS.fetch(url)
    }
  }
}
export const onRequest = [abTest]
```
If the cookie value is not present, you will have to assign one. Generate a percentage (from 0 to 99) by using `Math.floor(Math.random() * 100)`. Your default cookie version is given a value of `current`.
If the generated number is lower than `50`, assign the cookie version to `new`. Based on the randomly generated percentage, you will set the cookie and serve the corresponding assets. After the conditional block, pass the request to `next()`, which passes it on to Pages. This results in 50% of users getting the `/test` homepage.
The `env.ASSETS.fetch()` function will allow you to send the user to a modified path which is defined through the `url` parameter. `env` is the object that contains your environment variables and bindings. `ASSETS` is a default Function binding that allows communication between your Function and Pages' asset serving resource. `fetch()` calls to the Pages asset-serving resource and returns the asset (`/test` homepage) to your website's visitor.
:::note[Binding]
A Function is a Worker that executes on your Pages project to add dynamic functionality. A binding is how your Function (Worker) interacts with external resources. A binding is a runtime variable that the Workers runtime provides to your code.
:::
```js null {20-36}
const cookieName = "ab-test-cookie"
const newHomepagePathName = "/test"
const abTest = async (context) => {
  const url = new URL(context.request.url)
  // if homepage
  if (url.pathname === "/") {
    // if cookie ab-test-cookie=new then change the request to go to /test
    // if no cookie set, pass x% of traffic and set a cookie value to "current" or "new"
    let cookie = context.request.headers.get("cookie")
    // is cookie set?
    if (cookie && cookie.includes(`${cookieName}=new`)) {
      // pass the request to /test
      url.pathname = newHomepagePathName
      return context.env.ASSETS.fetch(url)
    } else {
      const percentage = Math.floor(Math.random() * 100)
      let version = "current" // default version
      // change pathname and version name for 50% of traffic
      if (percentage < 50) {
        url.pathname = newHomepagePathName
        version = "new"
      }
      // get the static file from ASSETS, and attach a cookie
      const asset = await context.env.ASSETS.fetch(url)
      let response = new Response(asset.body, asset)
      response.headers.append("Set-Cookie", `${cookieName}=${version}; path=/`)
      return response
    }
  }
  return context.next()
};
export const onRequest = [abTest];
```
## Deploy to Cloudflare Pages
After you have set up your `functions/_middleware.js` file in your project, you are ready to deploy with Pages. Push your project changes to GitHub/GitLab.
After you have deployed your application, review your middleware Function:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. In Account Home, select **Workers & Pages**.
3. In **Overview**, select your Pages project > **Settings** > **Functions** > **Configuration**.
---
# Enable Web Analytics
URL: https://developers.cloudflare.com/pages/how-to/web-analytics/
import { Render } from "~/components"
## Enable on Pages project
Cloudflare Pages offers a one-click setup for Web Analytics:
## View metrics
To view the metrics associated with your Pages project:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login).
2. From Account Home, select **Analytics & Logs** > **Web Analytics**.
3. Select the analytics associated with your Pages project.
For more details about how to use Web Analytics, refer to the [Web Analytics documentation](/web-analytics/data-metrics/).
## Troubleshooting
---
# Redirecting www to domain apex
URL: https://developers.cloudflare.com/pages/how-to/www-redirect/
import { Example } from "~/components";
Learn how to redirect a `www` subdomain to your apex domain (`example.com`).
This setup assumes that you already have a [custom domain](/pages/configuration/custom-domains/) attached to your Pages project.
## Setup
To redirect your `www` subdomain to your domain apex:
1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com) and select your account.
2. Go to **Bulk Redirects**.
3. [Create a bulk redirect list](/rules/url-forwarding/bulk-redirects/create-dashboard/#1-create-a-bulk-redirect-list) modeled after the following (but replacing the values as appropriate):
| Source URL        | Target URL            | Status | Parameters                                                                                 |
| ----------------- | --------------------- | ------ | ------------------------------------------------------------------------------------------ |
| `www.example.com` | `https://example.com` | `301`  | Preserve query string<br/>Subpath matching<br/>Preserve path suffix<br/>Include subdomains |
4. [Create a bulk redirect rule](/rules/url-forwarding/bulk-redirects/create-dashboard/#2-create-a-bulk-redirect-rule) using the list you just created.
5. Go to **DNS**.
6. [Create a DNS record](/dns/manage-dns-records/how-to/create-dns-records/#create-dns-records) for the `www` subdomain using the following values:
| Type | Name | IPv4 address | Proxy status |
| ---- | ----- | ------------ | ------------ |
| `A` | `www` | `192.0.2.1` | Proxied |
It may take a moment for this DNS change to propagate, but once complete, you can run the following command in your terminal.
```sh
curl --head -i https://www.example.com/
```
Then, inspect the output to verify that the `location` header and status code are being set as configured.
## Related resources
- [Redirect `*.pages.dev` to a custom domain](/pages/how-to/redirect-to-custom-domain/)
- [Handle redirects with Bulk Redirects](/rules/url-forwarding/bulk-redirects/)
---
# Tutorials
URL: https://developers.cloudflare.com/pages/tutorials/
import { GlossaryTooltip, ListTutorials } from "~/components"
View tutorials to help you get started with Pages.
---
# GitHub integration
URL: https://developers.cloudflare.com/pages/configuration/git-integration/github-integration/
You can connect each Cloudflare Pages project to a GitHub repository, and Cloudflare will automatically deploy your code every time you push a change to a branch.
## Features
Beyond automatic deployments, the Cloudflare GitHub integration lets you monitor, manage, and preview deployments directly in GitHub, keeping you informed without leaving your workflow.
### Custom branches
Pages will default to setting your [production environment](/pages/configuration/branch-build-controls/#production-branch-control) to the branch you first push. If a branch other than the default branch (e.g. `main`) represents your project's production branch, go to **Settings** > **Builds** > **Branch control** and change the production branch by selecting a different branch from the **Production branch** dropdown menu.
You can also use [preview deployments](/pages/configuration/preview-deployments/) to preview versions of your project before merging into your production branch and deploying to production. Pages allows you to configure which of your preview branches are automatically deployed using [branch build controls](/pages/configuration/branch-build-controls/). To configure, go to **Settings** > **Builds** > **Branch control** and select an option under **Preview branch**. Use [**Custom branches**](/pages/configuration/branch-build-controls/) to specify branches you wish to include or exclude from automatic preview deployments.
### Preview URLs
Every time you open a new pull request on your GitHub repository, Cloudflare Pages will create a unique preview URL, which will stay updated as you continue to push new commits to the branch. Note that preview URLs will not be created for pull requests created from forks of your repository. Learn more in [Preview Deployments](/pages/configuration/preview-deployments/).

### Skipping a build via a commit message
Without any configuration required, you can skip a deployment on an ad hoc basis. Add the `[CI Skip]`, `[CI-Skip]`, `[Skip CI]`, `[Skip-CI]`, or `[CF-Pages-Skip]` flag as a prefix in your commit message, and Pages will omit that deployment. The prefixes are not case sensitive.
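For example, a commit like the following (the message text is illustrative) will not trigger a Pages deployment:
```sh
# The [CI Skip] prefix tells Pages to skip deploying this commit.
git commit -m "[CI Skip] Fix typos in the README"
git push
```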
### Check runs
If you have one or multiple projects connected to a repository (i.e. a [monorepo](/pages/configuration/monorepos/)), you can check on the status of each build within GitHub via [GitHub check runs](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/collaborating-on-repositories-with-code-quality-features/about-status-checks#checks).
You can see the checks by selecting the status icon next to a commit within your GitHub repository. In the example below, you can select the green check mark to see the results of the check run.

Check runs will appear like the following in your repository.

If a build skips for any reason (i.e. CI Skip, build watch paths, or branch deployment controls), the check run/commit status will not appear.
## Manage access
You can deploy projects to Cloudflare Pages from your company or side project on GitHub using the [Cloudflare Workers & Pages GitHub App](https://github.com/apps/cloudflare-workers-and-pages).
### Organizational access
You can deploy projects to Cloudflare Pages from your company or side project on both GitHub and GitLab.
When authorizing Cloudflare Pages to access a GitHub account, you can specify access to your individual account or an organization that you belong to on GitHub. In order to be able to add the Cloudflare Pages installation to that organization, your user account must be an owner or have the appropriate role within the organization (that is, the GitHub Apps Manager role). More information on these roles can be seen on [GitHub's documentation](https://docs.github.com/en/organizations/managing-peoples-access-to-your-organization-with-roles/roles-in-an-organization#github-app-managers).
:::caution[GitHub security consideration]
A GitHub account should only point to one Cloudflare account. If you are setting up Cloudflare with GitHub for your organization, Cloudflare recommends that you limit the scope of the application to only the repositories you intend to build with Pages. To modify these permissions, go to the [Applications page](https://github.com/settings/installations) on GitHub and select **Switch settings context** to access your GitHub organization settings. Then, select **Cloudflare Workers & Pages** > For **Repository access**, select **Only select repositories** > select your repositories.
:::
### Remove access
You can remove Cloudflare Pages' access to your GitHub repository or account by going to the [Applications page](https://github.com/settings/installations) on GitHub (if you are in an organization, select Switch settings context to access your GitHub organization settings). The GitHub App is named Cloudflare Workers and Pages, and it is shared between Workers and Pages projects.
#### Remove Cloudflare access to a GitHub repository
To remove access to an individual GitHub repository, you can navigate to **Repository access**. Select the **Only select repositories** option, and configure which repositories you would like Cloudflare to have access to.

#### Remove Cloudflare access to the entire GitHub account
To remove Cloudflare Workers and Pages access to your entire Git account, you can navigate to **Uninstall "Cloudflare Workers and Pages"**, then select **Uninstall**. Removing access to the Cloudflare Workers and Pages app will revoke Cloudflare's access to _all repositories_ from that GitHub account. If you want to only disable automatic builds and deployments, follow the [Disable Build](/workers/ci-cd/builds/#disconnecting-builds) instructions.
Note that removing access to GitHub will disable new builds for Workers and Pages projects that were connected to those repositories, though your previous deployments will continue to be hosted by Cloudflare.
### Reinstall the Cloudflare GitHub app
If you see errors where Cloudflare Pages cannot access your git repository, you should attempt to uninstall and reinstall the GitHub application associated with the Cloudflare Pages installation.
1. Go to the installation settings page on GitHub:
- Navigate to **Settings > Builds** for the Pages project and select **Manage** under Git Repository.
- Alternatively, visit these links to find the Cloudflare Workers and Pages installation and select **Configure**:
| | |
| ---------------- | ---------------------------------------------------------------------------------- |
| **Individual** | `https://github.com/settings/installations` |
| **Organization** | `https://github.com/organizations//settings/installations` |
2. In the Cloudflare Workers and Pages GitHub App settings page, navigate to **Uninstall "Cloudflare Workers and Pages"** and select **Uninstall**.
3. Go back to the [**Workers & Pages** overview](https://dash.cloudflare.com) page. Select **Create application** > **Pages** > **Connect to Git**.
4. Select the **+ Add account** button, select the GitHub account you want to add, and then select **Install & Authorize**.
5. You should be redirected to the create project page with your GitHub account or organization in the account list.
6. Attempt to make a new deployment with your project which was previously broken.
---
# Git integration
URL: https://developers.cloudflare.com/pages/configuration/git-integration/
You can connect each Cloudflare Pages project to a [GitHub](/pages/configuration/git-integration/github-integration) or [GitLab](/pages/configuration/git-integration/gitlab-integration) repository, and Cloudflare will automatically deploy your code every time you push a change to a branch.
:::note
Cloudflare Workers now also supports Git integrations to automatically build and deploy Workers from your connected Git repository. Learn more in [Workers Builds](/workers/ci-cd/builds/).
:::
When you connect a git repository to your Cloudflare Pages project, Cloudflare will also:
- **Preview deployments for custom branches**, generating preview URLs for a commit to any branch in the repository without affecting your production deployment.
- **Preview URLs in pull requests** (PRs) to the repository.
- **Build and deployment status checks** within the Git repository.
- **Skipping builds using a commit message**.
These features allow you to manage your deployments directly within GitHub or GitLab without leaving your team's regular development workflow.
:::caution[You cannot switch to Direct Upload later]
If you deploy using the Git integration, you cannot switch to [Direct Upload](/pages/get-started/direct-upload/) later. However, if you already use a Git-integrated project and do not want to trigger deployments every time you push a commit, you can [disable automatic deployments](/pages/configuration/git-integration/#disable-automatic-deployments) on all branches. Then, you can use Wrangler to deploy directly to your Pages projects and make changes to your Git repository without automatically triggering a build.
:::
## Supported Git providers
Cloudflare supports connecting Cloudflare Pages to your GitHub and GitLab repositories. Pages does not currently support connecting self-hosted instances of GitHub or GitLab.
If you are using a different Git provider (for example, Bitbucket) or a self-hosted instance, you can start with a Direct Upload project and deploy using a CI/CD provider (for example, GitHub Actions) with the [Wrangler CLI](/pages/how-to/use-direct-upload-with-continuous-integration/).
## Add a Git integration
If you do not have a Git account linked to your Cloudflare account, you will be prompted to set up an installation to GitHub or GitLab when [connecting to Git](/pages/get-started/git-integration/) for the first time, or when adding a new Git account. Follow the prompts and authorize the Cloudflare Git integration.
You can check the following pages to see if your Git integration has been installed:
- [GitHub Applications page](https://github.com/settings/installations) (if you're in an organization, select **Switch settings context** to access your GitHub organization settings)
- [GitLab Authorized Applications page](https://gitlab.com/-/profile/applications)
For details on providing access to organization accounts, see the [GitHub](/pages/configuration/git-integration/github-integration/#organizational-access) and [GitLab](/pages/configuration/git-integration/gitlab-integration/#organizational-access) guides.
## Manage a Git integration
You can manage the Git installation associated with your repository connection by navigating to the Pages project, then going to **Settings** > **Builds** and selecting **Manage** under **Git Repository**.
This can be useful for managing repository access or troubleshooting installation issues by reinstalling. For more details, see the [GitHub](/pages/configuration/git-integration/github-integration/#managing-access) and [GitLab](/pages/configuration/git-integration/gitlab-integration/#managing-access) guides.
## Disable automatic deployments
If you are using a Git-integrated project and do not want to trigger deployments every time you push a commit, you can use [branch control](/pages/configuration/branch-build-controls/) to disable/pause builds:
1. Go to the **Settings** of your **Pages project** in the [Cloudflare dashboard](https://dash.cloudflare.com).
2. Navigate to **Builds** > edit **Branch control** > turn off **Enable automatic production branch deployments**.
3. You can also change your Preview branch to **None (Disable automatic branch deployments)** to pause automatic preview deployments.
Then, you can use Wrangler to deploy directly to your Pages project and make changes to your Git repository without automatically triggering a build.
---
# Migrating from Netlify to Pages
URL: https://developers.cloudflare.com/pages/migrations/migrating-from-netlify/
In this tutorial, you will learn how to migrate your Netlify application to Cloudflare Pages.
## Finding your build command and build directory
To move your application to Cloudflare Pages, find your build command and build directory. Cloudflare Pages will use this information to build and deploy your application.
In your Netlify Dashboard, find the project that you want to deploy. It should be configured to deploy from a GitHub repository.

Inside of your site dashboard, select **Site Settings**, and then **Build & Deploy**.


In the **Build & Deploy** tab, find the **Build settings** panel, which will have the **Build command** and **Publish directory** fields. Save these for deploying to Cloudflare Pages. In the below image, **Build command** is `yarn build`, and **Publish directory** is `build/`.

## Migrating redirects and headers
If your site includes a `_redirects` file in your publish directory, you can use the same file in Cloudflare Pages and your redirects will execute successfully. If your redirects are in your `netlify.toml` file, you will need to add them to the `_redirects` file. Cloudflare Pages currently offers limited [support for advanced redirects](/pages/configuration/redirects/). If you have over 2000 static and/or 100 dynamic redirect rules, it is recommended to use [Bulk Redirects](/rules/url-forwarding/bulk-redirects/create-dashboard/).
Your header files can also be moved into a `_headers` file in your publish directory. It is important to note that custom headers defined in the `_headers` file are not currently applied to responses from functions, even if the function route matches the URL pattern. To learn more about how to [handle headers, refer to Headers](/pages/configuration/headers/).
:::note
Redirects execute before headers. In the case of a request matching rules in both files, the redirect will take precedence.
:::
## Forms
In your form component, remove the `data-netlify = "true"` attribute or the Netlify attribute from the `