Vercel Blog
GPT 5.1 models now available in Vercel AI Gateway
You can now access the two GPT-5.1 models through Vercel's AI Gateway, with no other provider accounts required.
Rollbar joins the Vercel Marketplace
Connect Rollbar to your Vercel projects to track errors, monitor releases, and resolve issues faster. Get real-time observability for Next.js, React, and serverless apps.
Vercel: The anti-vendor-lock-in cloud
Framework-defined infrastructure interprets your code and provisions what you need, keeping your application portable across any platform.
How Nous Research used BotID to block automated abuse at scale
Vercel BotID Deep Analysis protected Nous Research by blocking advanced automated abuse targeting their application.
What we learned building agents at Vercel
We're presenting a simple methodology for discovering successful agent projects that perform well with current-generation AI.
Free Vercel BotID Deep Analysis through January 15
BotID is Vercel's most advanced bot solution, and it will be free for all Pro and Enterprise customers from Black Friday and Cyber Monday through the holidays.
Free BotID Deep Analysis through end of the year
BotID is Vercel's most advanced bot solution, and it will be free for all Pro and Enterprise customers from Black Friday and Cyber Monday through the holidays.
Route build traffic through Static IPs
You can now choose whether build traffic, such as calls to external APIs or CMS data sources during the build process, routes through your Static IPs.
Build and deploy data applications on Snowflake with v0
The v0 Snowflake integration lets you build and deploy Next.js data applications with natural language. Your data stays secure in your Snowflake account.
Redirects and Rewrites now available in Observability
Improved observability for redirects and rewrites is now generally available under the Observability tab in Vercel's dashboard. Redirect and rewrite request logs are now drainable to any configured Drain.
Microfrontends now generally available
Microfrontends are now generally available, enabling you to split large applications into smaller units that render as one cohesive experience for users.
Zero-configuration support for Fastify
Vercel now detects and deploys Fastify, a fast and low overhead web framework, with zero configuration.
BotID Deep Analysis catches a sophisticated bot network in real-time
BotID Deep Analysis is a sophisticated, invisible bot detection product. This article is about how BotID Deep Analysis adapted to a novel attack in real time, and successfully classified sessions that would have slipped through other services.
Vercel Agent can now run AI investigations
Vercel Agent Investigation intelligently conducts incident response investigations to alert, analyze, and suggest remediation steps
Caching details now available in Runtime Logs
Details on ISR cache keys, cache tags, and cache revalidation reasons are now available in Runtime Logs for all customers
OpenAI's GPT-OSS-Safeguard-20B now available in Vercel AI Gateway
You can now access OpenAI's GPT-OSS-Safeguard-20B with Vercel's AI Gateway with no other provider accounts required.
Vercel achieves TISAX AL2
Vercel has achieved TISAX Assessment Level 2, a security standard used in the automotive and manufacturing industries.
Vercel achieves TISAX AL2 compliance to serve automotive partners
Vercel has achieved the TISAX Assessment Level 2 security standard to align with the automotive and manufacturing industries.
Bun runtime now in Public Beta for Vercel Functions
The Bun runtime is now available in Public Beta for Vercel Functions. Benchmarks show Bun reduced average latency by 28% for CPU-bound Next.js rendering compared to Node.js.
Bun runtime on Vercel Functions
Vercel Functions now supports the Bun runtime, giving developers faster performance options and greater flexibility for optimizing JavaScript workloads.
David Totten Joins Vercel to Lead Global Field Engineering
David Totten joins Vercel as VP of Global Field Engineering from Databricks to oversee Sales Engineering, Developer Success, Professional Services, and Customer Support Engineering under one integrated organization
Vercel Ship AI 2025 recap
Earlier this year we introduced the foundations of the AI Cloud: a platform for building intelligent systems that think, plan, and act. At Ship AI, we showed what comes next: what to build with the AI Cloud, and how.
AI Chat now available on Vercel docs
AI Chat is now live on Vercel docs. Ask questions, load docs as context for page-aware answers, and copy chats as Markdown for easy sharing.
Manage Next.js Server Actions in the Vercel Firewall
With Vercel Firewall and Observability Plus, you can now configure custom rules targeting specific Server Actions.
Open source Workflow Development Kit is now in public beta
Turn TypeScript functions into durable, reliable workflows with automatic retries, persistence, and observability built in.
Vercel Python SDK is now available in beta
Vercel now has a native Python library hosted on PyPI to simplify tasks such as executing code in Vercel Sandboxes, transferring objects to Vercel Blob, working with the Runtime Cache, and retrieving project tokens.
Introducing AI agents & services on the Vercel Marketplace
Discover AI Agents & Services on the Vercel Marketplace. Integrate agentic tools, automate workflows, and build with unified billing and observability.
Vercel Agent Investigations now in Public Beta
Vercel Agent can now automatically run AI investigations on anomalous events for faster incident response.
Zero-config backends on Vercel AI Cloud
Build, scale, and orchestrate AI backends on Vercel. Deploy Python or Node frameworks with zero config and optimized compute for agents and workflows.
Introducing Vercel Agent: Your new Vercel teammate
Vercel Agent provides AI-powered code reviews and production investigations, delivering accurate, context-aware insights to help you ship reliable software.
Built-in durability: Introducing Workflow Development Kit
The Workflow Development Kit (WDK) makes async workflows in TypeScript reliable, durable, fault-tolerant, and portable across any cloud.
AI agents and services on the Vercel Marketplace
Agents and Tools are available in the Vercel Marketplace, enabling AI-powered workflows in your projects with native integrations, unified billing, and built-in observability.
You can just ship agents
Vercel AI Cloud combines unified model routing and failover, elastic cost-efficient compute that only bills for active CPU time, isolated execution for untrusted code, and workflow durability that survives restarts, deploys, and long pauses.
Faster builds with Turbo build machines
Turbo build machines are now available for all paid plans. With 30 CPUs and 60 GB memory they are our fastest yet, ideal for Turbopack builds and large monorepos.
Dynamically extend timeout of an active Sandbox
You can now extend the duration of a running Vercel Sandbox using the new extendTimeout method. This lets you keep long-running sandboxes alive beyond their initial timeout, which is useful for workflows that take longer than expected.
Update regarding Vercel service disruption on October 20, 2025
Update regarding Vercel service disruption on October 20, 2025. Read the summary of impact, timeline, root cause, and steps we’re taking to improve reliability.
Preview links between microfrontends projects now serve all paths
Teams using microfrontends can now visit all routes from any domain in the microfrontends group, enabling teams to test their full site experience without broken links or missing pages.
Zero-configuration support for NestJS
Vercel now detects and deploys NestJS, a framework for building efficient, scalable Node.js server-side applications, with zero configuration.
Braintrust joins the Vercel Marketplace
Braintrust joins the Vercel Marketplace with native support for the Vercel AI SDK and AI Gateway, enabling developers to monitor, evaluate, and improve AI application performance in real time.
Introducing Trace Drains on the Vercel Marketplace
Vercel Drains now power Marketplace integrations, letting developers stream logs, traces, and analytics from Vercel projects to tools like Braintrust for real-time observability and debugging.
Agents at work, a partnership with Salesforce and Slack
Vercel and Salesforce are partnering to help teams build, ship, and scale AI agents across the Salesforce ecosystem, starting with Slack.
Running Next.js inside ChatGPT: A deep dive into native app integration
Next.js now runs natively in ChatGPT with working navigation, React Server Components, and full features. Learn how we made this possible behind ChatGPT's triple-iframe architecture, and deploy our starter template to get started.
Talha Tariq joins Vercel as CTO (Security)
Talha Tariq joins Vercel as CTO of Security, bringing expertise from HashiCorp and IBM to lead security innovation in the AI era.
Just another (Black) Friday
Vercel customers can treat Black Friday like just another day, ready to scale to billions of requests.
Commits to the same branch now build with no queues
Building and deploying applications within the same GitHub branch, such as main, is now faster when using concurrent builds.
Anomaly alerts now in public beta
You can now subscribe to receive alerts when anomalous spikes in usage or 5xx errors are detected in your Vercel application.
Zero-configuration Flask backends
Flask, one of the most popular Python web application frameworks, can now be deployed instantly at Vercel with no configuration changes needed.
Expanded Role-Based Access Control (RBAC) for Enterprise teams
Vercel’s Role-Based Access Control (RBAC) system now supports multiple roles per user and introduces extended permissions for finer-grained access control across Enterprise teams.
ChatGPT apps support on Vercel
Use the Apps SDK, Next.js, and mcp-handler to build and deploy ChatGPT apps on Vercel, complete with custom UI and app-specific functionality.
Server rendering benchmarks: Fluid Compute and Cloudflare Workers
Fluid Compute outperforms Cloudflare Workers by 1.2x–5x in server-side rendering benchmarks, offering faster, more consistent response times through an optimized in-region architecture.
Block Vercel deployment promotions with GitHub Actions
You can now automatically block a deployment from releasing to production until selected GitHub Actions complete successfully.
New Domains Registrar API for domain search, pricing, purchase, and management
Vercel Domains Registrar APIs let you search TLDs, fetch pricing, check availability, purchase or renew domains, manage nameservers, and handle transfers.
Anomaly alerts now available via email
Subscribe to receive alerts by email when anomalous spikes in usage or HTTP 500 errors occur on your Vercel projects.
Python package manager uv is now available for builds with zero configuration
Vercel now uses uv, a fast Python package manager written in Rust, as the default package manager during the installation step for all Python builds.
Invalidate the CDN cache by tag
You can now invalidate CDN cache contents by tag, providing a way to revalidate content without increasing latency for your users.
Static IPs are now available for more secure connectivity
Teams on Pro and Enterprise can now access Static IPs to connect to services that require IP allowlisting. Static IPs give your projects consistent outbound IPs without needing the full networking stack of Secure Compute.
Deployment-level configuration for Fluid compute
You can now enable Fluid compute on a per-deployment basis by setting "fluid": true in your vercel.json.
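Based on the setting named in the announcement, a minimal vercel.json enabling this might look like:

```json
{
  "fluid": true
}
```

Other fields in your existing vercel.json are unaffected; this only opts the deployment into Fluid compute.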
Faster time-to-start for v0 builds
Publishing v0 apps on Vercel got 1.1 seconds faster on average, due to optimizations in how source files are sent during deployment creation.
Towards the AI Cloud: Our Series F
Today, Vercel announced an important milestone: a Series F funding round valuing our company at $9.3 billion.
Stripe is now available in beta on the Vercel Marketplace
Stripe claimable sandboxes are now available on the Vercel Marketplace and in v0. You can test your flow fully in test mode, then claim the sandbox when you're ready to go live.
View & query bot verification data in Vercel Observability
Analyze traffic to your Vercel projects by bot name, bot category, and bot verification status in Vercel Observability
Collaborating with Anthropic on Claude Sonnet 4.5 to power intelligent coding agents
Claude Sonnet 4.5 is now available on Vercel AI Gateway and across the Vercel AI Cloud. Also introducing a new coding agent platform template.
Node.js Vercel Functions now support per-path request cancellation
Vercel Functions using Node.js can now detect when a request is cancelled and stop execution before completion. This includes actions like navigating away, closing a tab, or hitting stop on an AI chat to terminate compute processing early.
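A sketch of what detecting cancellation can look like, based on the web-standard AbortSignal exposed on the incoming Request (the work loop and 499 status here are illustrative, not Vercel's API):

```typescript
// Handler that checks for client cancellation between units of work.
// If the client navigates away or hits stop, the signal aborts and we
// end early instead of burning compute.
export async function GET(request: Request): Promise<Response> {
  const results: number[] = [];
  for (let i = 0; i < 5; i++) {
    if (request.signal.aborted) {
      return new Response(null, { status: 499 }); // client went away
    }
    results.push(i * i); // stand-in for a real unit of work
    await new Promise((resolve) => setTimeout(resolve, 5));
  }
  return Response.json({ results });
}
```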
Zero-configuration FastAPI backends
FastAPI, a modern, high-performance, web framework for building APIs with Python based on standard Python type hints, is now supported with zero-configuration.
Vercel Domains overhauled with instant search and at-cost pricing
We rebuilt the Vercel Domains experience to make search and checkout significantly faster and more reliable.
Request collapsing for ISR cache misses
Vercel now does regional request collapsing on cache miss for Incremental Static Regeneration (ISR).
Preventing the stampede: Request collapsing in the Vercel CDN
The Vercel CDN now supports request collapsing for ISR routes. For a given path, only one function invocation per region runs at once, no matter how many concurrent requests arrive.
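The core idea behind request collapsing can be sketched as promise deduplication: concurrent callers for the same key share one in-flight computation. This is an illustrative sketch of the general technique, not Vercel's implementation:

```typescript
// Concurrent requests for the same key share one in-flight promise,
// so the backend (e.g. an ISR render) runs at most once at a time per key.
const inflight = new Map<string, Promise<string>>();

async function collapsed(
  key: string,
  produce: () => Promise<string>
): Promise<string> {
  const existing = inflight.get(key);
  if (existing) return existing; // join the in-flight request

  const p = produce().finally(() => inflight.delete(key));
  inflight.set(key, p);
  return p;
}
```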
Query data on external API requests in Vercel Observability
It's now possible to run custom queries against all external API requests made from Vercel Functions.
Claimed deployments now include third-party resources
Vercel now supports transferring resources like databases between teams as part of the Claim Deployments flow. Developers building AI agents, no-code tools, and workflow apps can instantly deploy projects and resources.
Anomaly alerts now include error spikes
Enterprise customers with Observability Plus can now receive anomaly alerts for errors through the limited beta
BotID uncovers hidden SEO poisoning
A financial institution's suspicious bot traffic turned out to be Google bots crawling SEO-poisoned URLs from years ago. Here's how BotID revealed the real problem.
Filter deployments by author
Easily filter deployments on the Vercel dashboard by Vercel username, email, or Git username (if applicable).
How we made global routing faster with Bloom filters
We replaced slow JSON path lookups with Bloom filters in our global routing service, cutting memory usage by 15% and reducing 99th percentile lookup times from hundreds of milliseconds to under 1 ms. Here’s how we did it.
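For intuition, a Bloom filter is a compact bit set that answers "definitely not present" or "possibly present" with a few hash probes. This is an illustrative sketch of the data structure in general, not Vercel's routing implementation:

```typescript
// Minimal Bloom filter: k seeded hashes set/check bits in a fixed-size
// bit array. Lookups never false-negative; false positives are possible
// but tunable via size and hash count.
class BloomFilter {
  private bits: Uint8Array;

  constructor(private size: number, private hashes: number) {
    this.bits = new Uint8Array(Math.ceil(size / 8));
  }

  // FNV-1a-style hash, seeded so each of the k probes differs.
  private hash(value: string, seed: number): number {
    let h = 2166136261 ^ seed;
    for (let i = 0; i < value.length; i++) {
      h ^= value.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.size;
  }

  add(value: string): void {
    for (let k = 0; k < this.hashes; k++) {
      const idx = this.hash(value, k);
      this.bits[idx >> 3] |= 1 << (idx & 7);
    }
  }

  mightContain(value: string): boolean {
    for (let k = 0; k < this.hashes; k++) {
      const idx = this.hash(value, k);
      if (!(this.bits[idx >> 3] & (1 << (idx & 7)))) return false; // definitely absent
    }
    return true; // possibly present
  }
}
```

The constant-time, cache-friendly bit probes are what make this so much faster than walking a large JSON structure per lookup.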
Observability Plus replacing legacy Monitoring
Observability Plus will replace the legacy Monitoring subscription. Pro customers using Monitoring should migrate to Observability Plus to author custom queries on their Vercel data.
AI code reviews by Vercel Agent now in Public Beta
Vercel Agent now provides high signal AI code reviews and fix suggestions to speed up your development process
What you need to know about vibe coding
Vibe coding is revolutionizing how we work. English is now the fastest growing programming language. Our state of vibe coding report outlines what you need to know.
Scale to one: How Fluid solves cold starts
Learn how Vercel solves serverless cold starts with scale to one, Fluid compute, predictive scaling, and caching to keep functions warm and fast.
Generate static AI SDK tools from MCP servers with mcp-to-ai-sdk
Use mcp-to-ai-sdk to generate MCP tools directly into your project. Gain security, reliability, and prompt-tuned control while avoiding dynamic MCP risks.
Shai-Hulud Supply Chain Campaign — Expanded Impact & Vercel Response
Ongoing Shai-Hulud npm supply chain attacks affected popular packages. Vercel responded swiftly, secured builds, and notified impacted users.
AI agents at scale: Rox’s Vercel-powered revenue operating system
Learn more about how Rox runs global, AI-driven sales ops on fast, reliable infrastructure thanks to Vercel
Builds now start up to 30% faster
Vercel builds now initialize faster due to optimized cache downloads. We use a worker pool to download the build cache, reducing initialization times by up to 30%.
Helly Hansen migrated to Vercel and drove 80% Black Friday growth
The 150-year-old Norwegian brand leveraged Next.js and Vercel to achieve 154% Black Friday growth and 30%+ conversion lift while competing against industry titans in a crowded space
Updated defaults for deployment retention
Vercel is updating the default retention policy for deployments. Unlimited retention is no longer available.
Introducing Vercel Drains: Complete observability data, anywhere
Vercel Drains give you a single way to stream observability data out of Vercel and into the systems your team already relies on.
Qwen3-Next models are now supported in Vercel AI Gateway
You can now access Qwen3-Next, two models from QwenLM designed to be ultra-efficient, using Vercel's AI Gateway with no other provider accounts required.
x402-mcp enables x402 payments in MCP
Introducing x402-mcp, a library that integrates with the AI SDK to bring x402 paywalls to Model Context Protocol (MCP) servers, letting agents discover and pay for MCP tools easily and securely.
Introducing x402-mcp: Open protocol payments for MCP tools
We built x402-mcp to integrate x402 payments with Model Context Protocol (MCP) servers and the Vercel AI SDK.
New Vercel CLI login flow
Simplified Vercel CLI login with OAuth 2.0 Device Flow. Sign in securely from any browser. Email and provider-based logins deprecated Feb 1. Upgrade now.
LongCat-Flash Chat model is now supported in Vercel AI Gateway
You can now access LongCat-Flash Chat from Meituan using Vercel AI Gateway, with no Meituan account required.
ChatGPT can now integrate with Vercel MCP
Use Vercel MCP with ChatGPT to explore projects, view logs, share access to protected deployments, and more.
Vercel Sandbox maximum duration extended to 5 hours
Pro and Enterprise teams can now run Vercel Sandboxes for up to 5 hours (up from 45 minutes). This extension unlocks new possibilities for workloads that require longer runtimes.
MongoDB Atlas joins the Vercel Marketplace
MongoDB Atlas is now available on the Vercel Marketplace, making it simple to provision and scale databases directly from your Vercel dashboard
MongoDB Atlas is now available on the Vercel Marketplace
This article explains the latest features of Next.js in detail, with particular emphasis on image optimization and new data-fetching techniques that help developers improve performance. The new API route functionality simplifies server-side data handling and improves development efficiency, and strengthened TypeScript integration brings better type safety. These features are especially beneficial for developers building large-scale applications.
The second wave of MCP: Building for LLMs, not developers
Explore the evolution of MCP as it shifts from developer-focused tools to LLM-native integrations, and discover the future of AI connectivity.
HIPAA BAAs are now available to Pro teams
Pro teams can now access and sign a Business Associate Agreement (BAA) to enable HIPAA-compliant workloads on Vercel. The BAA is self-serve and available from the team billing dashboard.
Free Viewer seats now available on Pro
Pro teams can now add unlimited free Viewer seats so team members can collaborate more flexibly and cost-efficiently.
Spend Management now enabled by default on Pro
Spend Management is now enabled by default for Pro teams, helping you set a usage threshold and get notified, or automatically pause projects, when spend exceeds it.
Included Pro usage is now credit-based
The Pro plan now includes $20 in monthly usage credit instead of fixed allocations across metrics like data transfer, compute, caching, and more.
No build queues: On-demand concurrent builds now on by default
Teams on the new Pro pricing model will now have on-demand concurrent builds enabled by default. This ensures builds across projects start immediately without waiting in a queue, except when multiple builds target the same Git branch.
A more flexible Pro plan for modern teams
We’re updating Vercel’s Pro plan to better align with how modern teams collaborate, how applications consume infrastructure, and how workloads are changing shape with AI.
AI SDK and AI Gateway now integrated in GitHub Actions
You can now access the AI SDK and AI Gateway with the vercel/ai-action@v2 GitHub Action. Use it to generate text or structured JSON directly in your workflows by configuring a prompt, model, and api-key. Learn more in the docs.
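A workflow step using the action might look like the sketch below. The prompt, model, and api-key inputs are the ones named in the announcement; the model id and secret name are illustrative assumptions:

```yaml
# Sketch of a GitHub Actions step using vercel/ai-action@v2.
- name: Summarize changes
  uses: vercel/ai-action@v2
  with:
    prompt: "Summarize this pull request in one sentence."
    model: "openai/gpt-5"            # illustrative model id
    api-key: ${{ secrets.AI_GATEWAY_API_KEY }}  # illustrative secret name
```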
Package installation for v0 builds is now ~70% faster
npm package installation got faster for v0 builds, going from 5 seconds to 1.5 seconds on average, a 70% reduction.
Critical npm supply chain attack response - September 8, 2025
How Vercel responded to the September 2025 npm supply chain attack on chalk, debug and 16 other packages. Incident timeline, impact analysis, and customer remediation.
Vercel Functions now support graceful shutdown
Vercel Functions running on the Node.js or Python runtimes now support graceful shutdown, allowing them to run cleanup tasks for up to 500 milliseconds before shutting down.
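A sketch of what a cleanup hook can look like, assuming the platform delivers shutdown notice via the conventional SIGTERM signal (the flush logic is illustrative):

```typescript
// Run cleanup just before shutdown. Per the announcement you get up to
// ~500ms, so keep this fast: flush buffered logs, close connections, etc.
let flushed = false;

function flushMetrics(): void {
  // Stand-in for real cleanup work.
  flushed = true;
}

process.on("SIGTERM", () => {
  flushMetrics();
});
```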
Export traces, web analytics events, and speed insights datapoints to any destination
You can now export logs, traces, web analytics events, and speed insights datapoints with Vercel Drains.
Zero-configuration Express backends
Vercel now detects and deploys Express, a fast, unopinionated, minimalist web framework for Node.js, with zero configuration.
Stress testing Biome's noFloatingPromises lint rule
We partnered with Biome to push their noFloatingPromises lint rule to the limit, uncovering edge cases and showing how we solve hard problems together.
Open SDK strategy
Vercel’s Open SDK strategy commits to building frameworks, SDKs, and tools in the open, under permissive licenses. Learn how we’re avoiding lock-in, ensuring portability, and investing in open source to build a better web for everyone.
CVE-2025-55173
A vulnerability affecting Next.js Image Optimization has been addressed. It impacted versions prior to v15.4.5 and v14.2.31.
CVE-2025-57822
A vulnerability affecting Next.js Middleware has been addressed. It impacted versions prior to v14.2.32 and v15.4.7.
CVE-2025-57752
A vulnerability affecting Next.js Image Optimization has been addressed. It impacted versions prior to v15.4.5 and v14.2.31.
Preparing for the worst: Our core database failover test
On July 24, 2025, we successfully performed a full production failover of our core control-plane database from Azure West US to East US 2 with zero customer impact.
s1ngularity: supply chain attack in Nx packages
A critical vulnerability was published in Nx and some of its supporting libraries. Vercel builds are safe from this vulnerability by default.
Anomaly alerts now in limited beta for Enterprise customers
Enterprise customers with Observability Plus can now receive anomaly alerts through the limited beta
Build Slack agents with @vercel/slack-bolt
Deploy your Slack agent to Vercel's AI Cloud using @vercel/slack-bolt to take advantage of AI Gateway, Fluid compute, and more.
Deploy xmcp servers with zero-configuration
xmcp, a framework for building and shipping MCP applications with TypeScript, can now be deployed to Vercel with zero-configuration.
AI-powered prototyping with design systems
Why AI-native design systems unlock true brand-ready, production-aligned prototyping for teams using v0
Introducing Streamdown: Open source Markdown for AI streaming
Streamdown is a new open source, drop-in Markdown renderer built for AI streaming. It powers the AI Elements Response component, but can also be used standalone.
AI Gateway is now generally available
AI Gateway is now generally available, providing a single interface to access hundreds of AI models with transparent pricing and built-in observability.
AI Gateway: Production-ready reliability for your AI apps
AI Gateway, now generally available, ensures availability when a provider fails, avoiding low rate limits and providing consistent reliability for AI workloads.
<script type="text/llms.txt">
llms.txt is an emerging standard for making content such as docs available for direct consumption by AIs. We’re proposing a convention to include such content directly in HTML responses.
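Following the convention named in the title, the embedded content might look like the sketch below (the Markdown body is illustrative):

```html
<!-- Proposed convention: embed llms.txt-style Markdown directly in the
     HTML response so AIs can consume it without a separate fetch. -->
<script type="text/llms.txt">
# Acme Docs
- [Getting started](/docs/start): install and configure Acme
</script>
```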
Agents can now access protected deployments via Vercel’s MCP server
Vercel's MCP server now lets agents access deployments behind authentication, enabling them to act on your behalf.
Node.js Vercel Functions now support fetch web handlers
Vercel Functions running on the Node.js runtime now support fetch web handlers to enhance interoperability across runtimes and frameworks.
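A fetch-style handler of this shape might look like the following sketch, using only web-standard Request/Response objects (global in Node 18+); the route and payload are illustrative:

```typescript
// Web-standard fetch handler: take a Request, return a Response.
// The same shape is portable across runtimes and frameworks.
export function GET(request: Request): Response {
  const url = new URL(request.url);
  const name = url.searchParams.get("name") ?? "world";
  return Response.json({ greeting: `Hello, ${name}!` });
}
```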
If agents are building your app, who gets the W-2?
If agents can design, build, test, and deploy features, their work should be treated like a developer's under GAAP. With modern AI logging, you can tie usage directly to capitalizable development activity.
Vercel Sandbox increases concurrency and port limits
Run up to 2,000 sandboxes at the same time. Sandboxes also now support listening on up to 4 different ports.
Improved fake hardware detection with Vercel BotID
Vercel BotID Deep Analysis now uses an updated detection model that expands fingerprinting coverage for bespoke headless browsers and simulated device hardware.
How Coxwave delivers GenAI value faster with Vercel
Coxwave's journey to cutting deployment times by 85% and building AI-native products faster with Vercel
Cutting delivery times in half with v0
Learn how Ready.net uses v0 to reduce ambiguity and accelerate feedback loops with limited resources
v0.dev -> v0.app
v0.dev is now v0.app, the AI builder for everyone: founders, designers, developers, marketers, sales, finance, and more
Cursor now supported on Vercel MCP
Connect Cursor to Vercel MCP to manage projects and deployments, analyze logs, search docs, and more
How Zapier scales product partnerships with v0
The team behind Zapier’s embedded platform uses v0 to turn partner conversations into scalable integrations
Improved metrics search in Observability Plus
We’ve improved the metrics search and navigation experience in Vercel Observability, making it faster and easier to build custom queries.
Vercel collaborates with OpenAI for GPT-5 launch
The GPT-5 family of models, released today, is now available through AI Gateway and in production in our own v0.dev applications. Thanks to OpenAI, Vercel has been testing these models for a few weeks in v0, Next.js, the AI SDK, and Vercel Sandbox.
Vercel is the only vendor to be recognized as a Visionary in the 2025 Gartner® Magic Quadrant™ for Cloud-Native Application Platforms
We’re honored to be the only vendor recognized as a Visionary in the 2025 Gartner® Magic Quadrant™ for Cloud Native Application Platforms.
Introducing AI Elements: build AI interfaces faster
Focus on your AI’s intelligence, not the UI scaffolding. AI Elements is now available as a new Vercel product to help frontend engineers build AI-driven interfaces in a fraction of the time.
Introducing Vercel MCP: Connect Vercel to your AI tools
Vercel now has an official hosted MCP server (aka Vercel MCP), which you can use to connect your favorite AI tools, such as Claude or VS Code, directly to Vercel.
Claude 4.1 Opus is now supported in Vercel AI Gateway
You can now access Claude Opus 4.1, a new model released by Anthropic today, using Vercel's AI Gateway with no other provider accounts required.
gpt-oss-20b and gpt-oss-120b are now supported in Vercel AI Gateway
You can now access gpt-oss-20b and gpt-oss-120b, OpenAI's open-weight reasoning models designed to push the open model frontier, using Vercel's AI Gateway with no other provider accounts required.
v0: vibe coding, securely
Vibe coding makes it possible for anyone to ship a viral app. But every line of AI-generated code is a potential vulnerability. Security cannot be an afterthought, it must be the foundation. Turn ideas into secure apps with v0.
New custom visualization in Vercel Observability
You can now customize how you visualize Observability data with line charts, volume charts, table views, or a big number.
A new wave of software, shipped on Vercel
We're launching a new way to showcase standout products built and shipped on Vercel. Submit your project.
Deploy Hono backends with zero configuration
Vercel now detects and deploys Hono, a fast, lightweight web application framework built on web standards, with zero configuration.
AI SDK 5
Introducing type-safe chat, agentic loop control, new specification, tool enhancements, speech generation, and more.
Join the v0 Ambassador Program
Apply today to join the v0 Ambassador Program and help others discover the magic of what's possible with v0.
Z.ai's GLM-4.5 and GLM-4.5 Air are now supported in Vercel AI Gateway
You can now access GLM-4.5 and GLM-4.5 Air, new flagship models from Z.ai designed to unify frontier reasoning, coding, and agentic capabilities, using Vercel's AI Gateway with no other provider accounts required.
Fluid: How we built serverless servers
Fluid Compute cuts cold starts and compute costs by up to 95%, scaling I/O-bound and AI workloads efficiently across 45B+ weekly requests.
Model Context Protocol (MCP) explained: An FAQ
Model Context Protocol (MCP) is a new spec that helps standardize the way large language models (LLMs) access data and systems, extending what they can do beyond their training data.
Vercel and Solara6 partner to build better ecommerce experiences
Solara6 is partnering with us to help ecommerce brands ship faster and deploy with confidence. Through this partnership, ecommerce teams working with Solara6 can expect improved SEO, site speed, and reliability during peak traffic moments.
Qwen3-Coder is now supported in Vercel AI Gateway
You can now access Qwen3-Coder using Vercel's AI Gateway, with no other provider accounts required.
GrowthBook joins the Vercel Marketplace
Add feature flags and A/B testing to your Vercel projects with GrowthBook, now available on the Vercel Marketplace.
Build your own AI app builder with the v0 Platform API
Learn how to build, extend, and automate AI-generated apps like BI tools and website builders with v0 Platform API
Transform rules are now available in vercel.json
Transform rules allow you to modify request headers, response headers, and request query parameters through the vercel.json.
OpenAI-compatible API endpoints now supported in AI Gateway
OpenAI-compatible API endpoints are now supported in AI Gateway, giving you access to hundreds of models with no code rewrites required.
Open Vercel documentation pages in AI providers
Copy Vercel documentation pages as markdown, or open them in AI providers, such as v0, Claude, or ChatGPT.
Grep a million GitHub repositories via MCP
Search 1M+ GitHub repositories from your AI agent using Grep's MCP server. Your agent can now reference coding patterns and solutions used in open source projects to solve problems.
Moonshot AI's Kimi K2 model is now supported in Vercel AI Gateway
You can now access Kimi K2 from Moonshot AI using Vercel's AI Gateway, with no Moonshot AI account required.
OAuth support added to MCP Adapter
Vercel's open-source MCP adapter now supports the latest MCP Authorization spec, letting you securely ship OAuth-enabled MCP servers.
Search any public GitHub repo with Grep
You can now use Grep to search any public repository on GitHub, no longer limited to the 1M+ pre-indexed repos. Get quick, full-text search across the repo without any setup.
Clerk joins the Vercel Marketplace
Clerk is now available on the Vercel Marketplace. Add secure auth, user management, and SSO to your app with a native integration and fully integrated billing.
More Secure Deployment Protection
Standard Deployment Protection now covers all domains except production custom domains, enhancing protection for the automatic aliases of production deployments.
The AI Cloud: A unified platform for AI workloads
We made it simple to build, preview, and ship any frontend, from marketing pages to dynamic apps, without managing infrastructure. Now we’re introducing the next layer: the Vercel AI Cloud.
Vercel Blob now available in all Vercel Regions
You can now select the storage region for your Vercel Blob store when creating it. This allows you to store your files in the region closest to your users for reduced latency.
v0 Platform API now in beta
The v0 Platform API enables developers to programmatically generate, retrieve, and manage full stack web apps using RESTful endpoints and TypeScript SDK. Integrate v0 into your workflows, tools, or automation pipelines.
Web Application Firewall control now available with vercel.json
You can now configure firewall mitigation rules through your vercel.json project configuration file. This is in addition to the existing dashboard and API support.
Inngest joins the Vercel Marketplace
Build background jobs and AI workflows with Inngest, now on the Vercel Marketplace. Native support for Next.js, preview environments, and branching.
NuxtLabs joins Vercel
NuxtLabs, creators of Nuxt and Nitro, are joining Vercel. Same license, roadmap, and open governance, but now in a joint mission to build the best web.
Sandbox now supports sudo and installing RPM packages
You can now run commands with sudo inside Vercel Sandbox, giving you full control to install packages at runtime, just like on a traditional Linux system.
Correlate logs and traces with OpenTelemetry in Vercel Log Drains
Correlate Vercel logs and traces with OpenTelemetry (OTel) in Log Drains sent to Datadog and Dash0.
CVE-2025-49005
A cache poisoning vulnerability affecting Next.js App Router >=15.3.0 <15.3.3 and Vercel CLI 41.4.1–42.2.0 has been resolved. The issue allowed page requests for HTML content to return a React Server Component (RSC) payload instead.
CVE-2025-49826
A vulnerability affecting Next.js has been addressed. It impacted versions >=15.1.0 <15.1.8 and involved a cache poisoning bug leading to a Denial of Service (DoS) condition.
Zero-configuration support for Nitro
Vercel now detects and deploys Nitro, a server toolkit for building web servers, with zero configuration.
New usage dashboard for Pro customers
We’ve launched a new usage dashboard for Pro teams to analyze Vercel usage and costs with detailed breakdowns and export options.
New webhook events for domain management
You can now subscribe to webhook events for deeper visibility into domain operations on Vercel. These events make it easier to automate domain workflows, especially in multi-tenant platforms or when managing a large number of domains.
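When automating domain workflows off these events, a consumer should first verify that a payload really came from Vercel. A minimal sketch, assuming an HMAC-based signature header such as the documented `x-vercel-signature` scheme (verify the header name, algorithm, and secret source against current Vercel webhook docs — the event shape and secret below are hypothetical):

```python
import hashlib
import hmac


def verify_signature(body: bytes, secret: str, signature: str) -> bool:
    """Return True if the hex signature matches HMAC-SHA1(secret, body)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha1).hexdigest()
    # Constant-time comparison avoids leaking match position via timing
    return hmac.compare_digest(expected, signature)


# Hypothetical domain event payload and client secret for illustration
body = b'{"type": "domain.created", "payload": {"name": "example.com"}}'
secret = "whsec_demo"

good = hmac.new(secret.encode(), body, hashlib.sha1).hexdigest()
print(verify_signature(body, secret, good))        # valid signature
print(verify_signature(body, secret, "deadbeef"))  # tampered signature
```

Only after the signature checks out should the handler dispatch on the event type to drive multi-tenant domain automation.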
Vercel Ship 2025 recap
Vercel Ship 2025 added new building blocks for an AI era: Fast, flexible, and secure by default. Lower costs with Fluid's Active CPU pricing, Rolling Releases for safer deployments, invisible CAPTCHA with BotID. See these and more in our recap.
Introducing BotID, invisible bot filtering for critical routes
BotID is a new invisible CAPTCHA layer of protection that stops sophisticated bots before they reach your backend. It's built to secure critical routes like checkouts, logins, and signups, or actions that trigger expensive calls like LLM-powered APIs.
Edge Middleware and Edge Functions are now powered by Vercel Functions
The Edge runtime now runs on Vercel Functions, unifying pricing across all compute and remaining available both before and after the cache. Edge Middleware and Edge Functions are now deprecated.
Run untrusted code with Vercel Sandbox
Vercel Sandbox securely runs untrusted code, like AI-generated code, in isolated cloud environments. Create ephemeral, isolated microVMs using the new Sandbox SDK, with up to 45 minutes of execution time. Now in Beta and available to customers on all plans.
Lower pricing with Active CPU pricing for Fluid compute
Pricing for Vercel Functions on Fluid compute has been reduced. All Fluid-based compute now uses an Active CPU pricing model, offering up to 90% savings in addition to the cost efficiency already delivered by Fluid's concurrency model.
Higher defaults and limits for Vercel Functions running Fluid compute
Vercel Functions using Fluid compute now have longer execution times, more memory, and more CPU. The default execution time, for all projects on all plans, is now 300 seconds.
Introducing Active CPU pricing for Fluid compute
Fluid compute now uses Active CPU pricing. Only pay CPU rates when your function is actively computing. Building on existing Fluid gains, this brings additional savings of up to 90% for workloads like LLM calls, AI agents, or tasks with idle time.
WPP and Vercel: Bringing AI to the creative process
Announcing an expansion of our partnership with WPP, a first-of-its-kind agency collaboration that now brings v0 and AI SDK directly to WPP's global network of creative teams and their clients.
Vercel Blob CLI is now available
The Vercel CLI now includes blob commands, allowing you to manage your Vercel Blob storage directly from the terminal.
Keith Messick joins Vercel as CMO
We’re welcoming Keith Messick as Chief Marketing Officer to support our growth, engage on more channels, and (as always) amplify the voice of the developer. Keith is a longtime enterprise CMO and comes to Vercel from database leader Redis.
Turso Cloud joins the Vercel Marketplace
Turso now offers a native integration on the Vercel Marketplace—deploy fast, edge-optimized SQLite databases with one-click setup and unified billing.
Two-factor authentication (2FA) team enforcement
Team owners can now enforce two-factor authentication (2FA) for every member of their team via a toggle in Security & Privacy under team settings.
Create and share queries with notebooks in Vercel Observability
Observability Plus users can now create a collection of queries in notebooks to collaboratively explore their observability data.
Introducing the Dubai Vercel region (dxb1)
Dubai (dxb1) is now part of Vercel’s global edge network, improving latency for users in the Middle East, Africa, and Central Asia.
Tray.ai cut build times from a day to minutes with Vercel
Tray.ai cut build times from a full day to just two minutes after migrating to Vercel. By consolidating infrastructure and updating their tech stack, they now deliver over a million monthly page views with a faster, more resilient site.
Improved unhandled Node.js errors in Fluid compute
Fluid compute now gracefully handles Node.js uncaught exceptions and unhandled rejections to provide better isolation between requests.
Improved team overview page
The Vercel team overview now sorts by your activity, can be filtered by repository, and shows a window into your usage.
Building efficient MCP servers
MCP is becoming the standard for building AI model integrations. See how you can use Vercel's open-source MCP adapter to quickly build your own MCP server, like the teams at Zapier, Composio, and Solana.
Designing and building the Vercel Ship conference platform
Here's how we designed and built our Vercel Ship conference platform. We generated 15,000+ images and videos with tools like Flux, Veo 2, Runway, and Ideogram. Then, we moved to v0 for prototyping. See our iterations, examples, tech stack, and more.
Filter runtime logs for fatal function errors
You can now filter your runtime logs to identify fatal function errors, such as Node.js crashes, in the Vercel Logs UI.
How we’re adapting SEO for LLMs and AI search
AI is changing how content gets discovered. Now, SEO ranking ≠ LLM visibility. No one has all the answers, but here's how we're adapting our approach to SEO for LLMs and AI search.
Observability added to AI Gateway alpha
Vercel Observability now includes a dedicated AI section to surface metrics related to the AI Gateway.
Claude Code and Cursor Agent no longer require a team seat
Claude Code and Cursor Agent can now trigger builds on Vercel without a team seat, as part of our bot detection policies.
Bot Protection is now generally available
Vercel's Bot Protection managed ruleset allows users to mitigate unwanted bot activity on their projects with a single click.
Pre-generate SSL certs, now in the Domains dashboard
The Domains Dashboard now enables zero-downtime migration by allowing SSL certificates to be pre-provisioned before migrating domains.
The no-nonsense approach to AI agent development
Learn how to build reliable, domain-specific AI agents by simulating tasks manually, structuring logic with code, and optimizing with real-world feedback. A clear, hands-on approach to practical automation.
New firewall challenge metrics now available
You can now monitor and query for challenge outcomes with two new metrics, available in the Firewall and Observability dashboards.