Vercel provides AI infrastructure for you to build agents. Further, agents can programmatically create and manage deployments on Vercel, then later allow users to transfer ownership of the created deployment to their account.
- How to build agents with the AI SDK
- Build your own MCP server
- Build an MCP client with the AI SDK
- Fluid compute
- AI agent examples in production
- Vercel SDK for Deployments
- REST API for Deployments
- MCP Server
- Allowing users to claim deployments
- Sign in with Vercel (private beta)
Vercel provides infrastructure and SDKs to deploy AI agents. You can deploy single JavaScript or TypeScript files that run single- or multi-step workflows. In addition, Vercel can integrate with databases, AI inference, and more through the Vercel Marketplace with first-party billing.
The AI SDK provides a unified interface to switch between different AI models. Additionally, it has full support for tool-calling and building agentic systems.
Learn more in the AI SDK docs about the following patterns:
- Sequential Processing - Steps executed in order
- Parallel Processing - Independent tasks run simultaneously
- Evaluation/Feedback Loops - Results checked and improved iteratively
- Orchestration - Coordinating multiple components
- Routing - Directing work based on context
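To make the first two patterns concrete, here is a minimal, provider-agnostic sketch. The `callModel` stub is a placeholder standing in for a real model call (such as `generateText` from the AI SDK); only the control flow is the point:

```typescript
// Stub standing in for a real model call (e.g. generateText from the AI SDK).
async function callModel(prompt: string): Promise<string> {
  return `result(${prompt})`;
}

// Sequential processing: each step consumes the previous step's output.
async function runSequential(steps: string[]): Promise<string> {
  let output = "";
  for (const step of steps) {
    output = await callModel(`${step}: ${output}`);
  }
  return output;
}

// Parallel processing: independent tasks run simultaneously.
async function runParallel(tasks: string[]): Promise<string[]> {
  return Promise.all(tasks.map((task) => callModel(task)));
}
```

The other patterns compose these two primitives: a router picks which step list to run, and a feedback loop wraps `runSequential` in a retry that re-checks the result.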
If you want to build your own MCP server on Vercel’s infrastructure, you can use the following templates, either with a framework (Next.js) or without one:
Deploying an MCP server to Vercel with Fluid compute enabled ensures your AI-driven requests always have low latency and can scale automatically. If your agents need ephemeral storage, consider using Redis from the Vercel Marketplace.
In addition to running your own MCP server, you can connect to it from your AI applications using Vercel’s AI SDK. By initializing an MCP client, your AI code can discover and call tools published by any MCP server—whether local or remote—through a standardized interface.
For more on MCP with the AI SDK, see the official docs.
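Conceptually, an MCP client discovers the tools a server publishes and then invokes them by name. The sketch below models that discover-and-call loop with a toy in-memory registry; in a real application, the AI SDK's MCP client handles the transport and schema plumbing, and the tool names here are made up for illustration:

```typescript
// A toy registry standing in for tools an MCP server would publish.
type Tool = {
  name: string;
  execute: (args: Record<string, unknown>) => unknown;
};

const serverTools: Tool[] = [
  { name: "listProjects", execute: () => ["my-site", "my-api"] },
  { name: "echo", execute: (args) => args.message },
];

// Discover the tool names the "server" exposes.
function discoverTools(): string[] {
  return serverTools.map((t) => t.name);
}

// Call a discovered tool by name, as an MCP client would.
function callTool(name: string, args: Record<string, unknown> = {}): unknown {
  const tool = serverTools.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.execute(args);
}
```

The standardized interface is what matters: the calling code never needs to know how a tool is implemented, only its name and arguments.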
Fluid compute is Vercel’s next-generation compute platform. It allows your code—generated or orchestrated by an AI agent—to:
- Scale automatically while minimizing cold starts
- Run background tasks after responding to the user (via `after` or `waitUntil`)
- Run concurrent workloads in a cost-effective way
This is especially useful for agentic code that may spin up ephemeral processes, generate complex responses, or coordinate multiple steps without the timeouts typical of traditional serverless environments.
We strongly recommend enabling Fluid compute for your AI agent applications. You might additionally want to use integrations like Inngest or Upstash QStash with your Vercel Function.
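The respond-then-work pattern looks roughly like this. The `waitUntil` here is a minimal local stand-in written for this sketch; in a real Vercel Function you would import the platform helper of the same name from `@vercel/functions`:

```typescript
// Minimal stand-in for the platform's waitUntil: track background promises.
const pending: Promise<unknown>[] = [];
function waitUntil(task: Promise<unknown>): void {
  pending.push(task);
}

const log: string[] = [];

async function handler(): Promise<string> {
  // Kick off background work that outlives the response.
  waitUntil(
    new Promise<void>((resolve) =>
      setTimeout(() => {
        log.push("analytics recorded");
        resolve();
      }, 10)
    )
  );
  // Respond immediately; the background task keeps running.
  log.push("response sent");
  return "ok";
}
```

The user gets the response as soon as `handler` returns, while the runtime keeps the function alive until the tracked background work settles.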
Here are some examples of AI applications built on Vercel’s agent infrastructure:
- v0.dev – Vercel’s agent for building web apps
- glif.app – Fun AI apps anyone can build
- ohara.ai – Create apps & games and buy a stake in them
- a0.dev – Mobile apps in minutes
- gumloop.com – AI automation framework
The Vercel SDK is a TypeScript toolkit that exposes the same operations as the REST API, with conveniences such as streaming file uploads, typed responses, and integrated error handling.
```typescript
import { Vercel } from "@vercel/sdk";
import crypto from "node:crypto";

const vercel = new Vercel({ bearerToken: "<YOUR_BEARER_TOKEN>" });
const teamId = process.env.TEAM_ID;

async function deployToVercel() {
  // Create your project files as a buffer
  const fileBuffer = /* your zipped project files as a buffer */;

  // Calculate SHA1 hash of the file
  const fileSha = crypto.createHash("sha1").update(fileBuffer).digest("hex");

  // 1. Upload the file to Vercel
  await vercel.deployments.uploadFile({
    teamId,
    headers: {
      "Content-Type": "application/octet-stream",
      "x-vercel-digest": fileSha,
      "Content-Length": fileBuffer.length.toString(),
    },
    body: fileBuffer,
  });

  // 2. Create deployment with the uploaded file
  const deployment = await vercel.deployments.createDeployment({
    teamId,
    requestBody: {
      files: [
        {
          file: ".vercel/source.tgz",
          sha: fileSha,
        },
      ],
      projectSettings: {
        framework: "nextjs",
      },
      name: `nextjs-deployment-${Date.now().toString(36)}`,
    },
  });

  return deployment;
}
```
You can call the Vercel REST API directly from your AI agent. A typical flow might be:
- Obtain an access token (how to create one).
- Upload your build artifacts with `POST /files`.
- Create a deployment with `POST /deployments`.
- Optionally, assign a domain or set up aliases.
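As a sketch of that flow, the helper below only assembles the two requests rather than sending them. The versioned paths (`/v2/files`, `/v13/deployments`) and exact body shape are assumptions to verify against the REST API reference:

```typescript
type ApiRequest = {
  method: string;
  url: string;
  headers: Record<string, string>;
  body?: unknown;
};

// Assumed API base URL; check against the REST API reference.
const BASE_URL = "https://api.vercel.com";

// Step 2: upload an artifact, identified by its SHA1 digest.
function buildUploadRequest(token: string, sha: string): ApiRequest {
  return {
    method: "POST",
    url: `${BASE_URL}/v2/files`,
    headers: {
      Authorization: `Bearer ${token}`,
      "x-vercel-digest": sha,
      "Content-Type": "application/octet-stream",
    },
  };
}

// Step 3: create a deployment referencing the uploaded file.
function buildDeploymentRequest(token: string, name: string, sha: string): ApiRequest {
  return {
    method: "POST",
    url: `${BASE_URL}/v13/deployments`,
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: {
      name,
      files: [{ file: ".vercel/source.tgz", sha }],
      projectSettings: { framework: "nextjs" },
    },
  };
}
```

An agent would pass each descriptor to `fetch`, checking the response status between steps before moving on.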
For more detail, see the claim deployments demo and its source code.
Vercel’s Model Context Protocol (MCP) server allows AI models to talk to your Vercel account. Through tool calling, the MCP server can do the following and more:
- List Vercel projects
- Create deployments
- Upload files for a deployment
- Manage domains and DNS
With MCP, you run a local or hosted server that speaks this protocol. AI agents (e.g., using Claude or Cursor) can connect to it and automatically call Vercel operations without you writing custom code.
For example, configure the Vercel MCP server in your client:
```bash
npx --package @vercel/sdk mcp start --bearer-token "<YOUR_BEARER_TOKEN>"
```
Then, integrate with AI IDEs like Claude Desktop or Cursor by configuring `.cursor/mcp.json` or `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "Vercel": {
      "command": "npx",
      "args": [
        "-y",
        "--package",
        "@vercel/sdk",
        "--",
        "mcp",
        "start",
        "--bearer-token",
        "..."
      ]
    }
  }
}
```
You can then begin prompting your AI agent to use tools.
For example, you could ask things like “deploy to Vercel” or “list my Vercel projects”. When the updated MCP specification is released with support for OAuth, we will move to a remote server to simplify this process.
AI agents can create deployments on your (the agent’s) Vercel account, but often you want the end user to “take over” that deployment. Claimable deployments is Vercel’s flow for letting users transfer ownership of a deployment by visiting a claim URL.
- Agent sets up the deployment with minimal code.
- Agent provides a Claim Deployments URL to the user.
- User logs in to Vercel and chooses which team or personal account to transfer the deployment to.
- Upload files via `POST /files`.
- Create the deployment with `POST /deployments`.
- Generate the claim URL: `/claim-deployment?code=YOUR_CODE&returnUrl=https://acme.com`
- Share the link with the user to complete the transfer.
- User claims the deployment, optionally selects a team, and finalizes the ownership transfer.
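Building the claim URL from the steps above is simple query-string assembly. The path and parameter names come from the flow above; treat the host as an assumption to verify against the claim deployments demo:

```typescript
// Build a claim URL of the form /claim-deployment?code=...&returnUrl=...
// The base host is an assumption for this sketch.
function buildClaimUrl(
  code: string,
  returnUrl: string,
  base = "https://vercel.com"
): string {
  const params = new URLSearchParams({ code, returnUrl });
  return `${base}/claim-deployment?${params.toString()}`;
}
```

Using `URLSearchParams` ensures the `returnUrl` value is percent-encoded, so the redirect back to your app survives the round trip intact.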
For more detail, see the claim deployments demo and its source code.
We’re working on an OAuth provider so agents or third-party apps can integrate with Vercel.
This will allow your applications to securely access Vercel account information when authorized by the user. If you are interested in building an AI agent that could take advantage of this, please reach out to our team.