Vercel Blog
Node.js Vercel Functions now support per-path request cancellation
Vercel Functions using the Node.js runtime can now detect when a request is cancelled and stop execution before completion. Actions like navigating away, closing a tab, or hitting stop on an AI chat now terminate compute early.
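
In the Node.js runtime, cancellation surfaces through the standard Web `AbortSignal` on the incoming `Request`. A minimal sketch, assuming a route-handler-style function (the handler and step names are illustrative, not Vercel-specific API):

```typescript
// Sketch of a Node.js Vercel Function that stops work once the client
// disconnects. `request.signal` is the standard Web AbortSignal; the
// loop checks it between chunks of work and bails out early.
export async function GET(request: Request): Promise<Response> {
  const results: string[] = [];
  for (let step = 0; step < 10; step++) {
    if (request.signal.aborted) {
      // Client navigated away, closed the tab, or hit stop on a chat:
      // stop paying for compute and return immediately.
      return new Response(null, { status: 499 });
    }
    results.push(await doExpensiveStep(step));
  }
  return Response.json(results);
}

// Stand-in for real work (model inference, database query, etc.).
async function doExpensiveStep(step: number): Promise<string> {
  return `step-${step}`;
}
```

Checking `signal.aborted` between units of work keeps the handler responsive to cancellation without restructuring it around event listeners.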

Request collapsing for ISR cache misses
Vercel now does regional request collapsing on cache miss for Incremental Static Regeneration (ISR).

Preventing the stampede: Request collapsing in the Vercel CDN
The Vercel CDN now supports request collapsing for ISR routes. For a given path, only one function invocation per region runs at once, no matter how many concurrent requests arrive.

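
The collapsing behavior can be pictured as a per-path map of in-flight work. A simplified sketch (not Vercel's implementation) in which concurrent misses for the same path share a single regeneration:

```typescript
// Request collapsing in miniature: the first cache miss for a path starts
// a regeneration; every concurrent miss for the same path awaits that
// same promise instead of invoking the origin function again.
const inFlight = new Map<string, Promise<string>>();

export async function collapsed(
  path: string,
  regenerate: (path: string) => Promise<string>
): Promise<string> {
  const existing = inFlight.get(path);
  if (existing) return existing; // piggyback on the running regeneration
  const p = regenerate(path).finally(() => inFlight.delete(path));
  inFlight.set(path, p);
  return p;
}
```

Because the map entry is installed synchronously before the first `await`, a burst of simultaneous requests produces exactly one origin invocation.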
Query data on external API requests in Vercel Observability
It's now possible to run custom queries against all external API requests made from Vercel Functions.

Claimed deployments now include third-party resources
Vercel now supports transferring resources like databases between teams as part of the Claim Deployments flow. Developers building AI agents, no-code tools, and workflow apps can instantly deploy projects and resources.

Anomaly alerts now include error spikes
Enterprise customers with Observability Plus can now receive anomaly alerts for error spikes through a limited beta.

BotID uncovers hidden SEO poisoning
A financial institution's suspicious bot traffic turned out to be Google bots crawling SEO-poisoned URLs from years ago. Here's how BotID revealed the real problem.

Filter deployments by author
Easily filter deployments on the Vercel dashboard by Vercel username, email, or Git username (if applicable).

How we made global routing faster with Bloom filters
We replaced slow JSON path lookups with Bloom filters in our global routing service, cutting memory usage by 15% and reducing 99th percentile lookup times from hundreds of milliseconds to under 1 ms. Here’s how we did it.

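
For readers unfamiliar with the data structure: a Bloom filter answers "might this key exist?" with no false negatives and a tunable false-positive rate, using only a bit array and k hash functions. A toy sketch (not Vercel's routing code):

```typescript
// Minimal Bloom filter: k seeded hashes each set one bit on add; a lookup
// returns true only if all k bits are set. False positives are possible,
// false negatives are not, so a "no" can skip the slower path lookup.
class BloomFilter {
  private bits: Uint8Array;

  constructor(private size: number, private hashes: number) {
    this.bits = new Uint8Array(Math.ceil(size / 8));
  }

  // FNV-1a variant with a seed; good enough for a sketch.
  private hash(item: string, seed: number): number {
    let h = 2166136261 ^ seed;
    for (let i = 0; i < item.length; i++) {
      h ^= item.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.size;
  }

  add(item: string): void {
    for (let s = 0; s < this.hashes; s++) {
      const idx = this.hash(item, s);
      this.bits[idx >> 3] |= 1 << (idx & 7);
    }
  }

  mightContain(item: string): boolean {
    for (let s = 0; s < this.hashes; s++) {
      const idx = this.hash(item, s);
      if (!(this.bits[idx >> 3] & (1 << (idx & 7)))) return false;
    }
    return true;
  }
}
```

The memory win comes from storing a fixed-size bit array instead of the keys themselves; the latency win comes from k array reads replacing a structured path traversal.
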
Observability Plus replacing legacy Monitoring
Observability Plus will replace the legacy Monitoring subscription. Pro customers using Monitoring should migrate to Observability Plus to author custom queries on their Vercel data.

What you need to know about vibe coding
Vibe coding is revolutionizing how we work. English is now the fastest growing programming language. Our state of vibe coding report outlines what you need to know.

Scale to one: How Fluid solves cold starts
Learn how Vercel solves serverless cold starts with scale to one, Fluid compute, predictive scaling, and caching to keep functions warm and fast.

Generate static AI SDK tools from MCP servers with mcp-to-ai-sdk
Use mcp-to-ai-sdk to generate MCP tools directly into your project. Gain security, reliability, and prompt-tuned control while avoiding dynamic MCP risks.

AI agents at scale: Rox’s Vercel-powered revenue operating system
Learn more about how Rox runs global, AI-driven sales ops on fast, reliable infrastructure, thanks to Vercel.

Builds now start up to 30% faster
Vercel builds now initialize faster thanks to optimized cache downloads. We now use a worker pool to download the build cache, reducing initialization times by up to 30%.

Updated defaults for deployment retention
Vercel is updating the default retention policy for deployments. Unlimited retention is no longer available.

Introducing Vercel Drains: Complete observability data, anywhere
Vercel Drains give you a single way to stream observability data out of Vercel and into the systems your team already relies on.

x402-mcp enables x402 payments in MCP
Introducing x402-mcp, a library that integrates with the AI SDK to bring x402 paywalls to Model Context Protocol (MCP) servers, letting agents discover and pay for MCP tools easily and securely.

Introducing x402-mcp: Open protocol payments for MCP tools
We built x402-mcp to integrate x402 payments with Model Context Protocol (MCP) servers and the Vercel AI SDK.

New Vercel CLI login flow
Simplified Vercel CLI login with the OAuth 2.0 Device Flow. Sign in securely from any browser. Email and provider-based logins are deprecated as of Feb 1. Upgrade now.

LongCat-Flash Chat model is now supported in Vercel AI Gateway
You can now access LongCat-Flash Chat from Meituan using Vercel AI Gateway, with no Meituan account required.

Vercel Sandbox maximum duration extended to 5 hours
Pro and Enterprise teams can now run Vercel Sandboxes for up to 5 hours (up from 45 minutes). This extension unlocks new possibilities for workloads that require longer runtimes.

MongoDB Atlas joins the Vercel Marketplace
MongoDB Atlas is now available on the Vercel Marketplace, making it simple to provision and scale databases directly from your Vercel dashboard.

This article takes a detailed look at new features in Next.js, a modern JavaScript framework. It highlights image optimization and new data-fetching methods that help developers improve performance, new API route functionality that simplifies server-side data handling, and strengthened TypeScript integration for better type safety. These features are especially valuable when building large-scale applications.

HIPAA BAAs are now available to Pro teams
Pro teams can now access and sign a Business Associate Agreement (BAA) to enable HIPAA-compliant workloads on Vercel. The BAA is self-serve and available from the team billing dashboard.

Free Viewer seats now available on Pro
Pro teams can now add unlimited free Viewer seats so team members can collaborate more flexibly and cost-efficiently.

Spend Management now enabled by default on Pro
Pro teams now have Spend Management enabled by default, with usage notifications to help prevent unexpected overages.

Included Pro usage is now credit-based
The Pro plan now includes $20 in monthly usage credit instead of fixed allocations across metrics like data transfer, compute, caching, and more.

No build queues: On-demand concurrent builds now on by default
Teams on the new Pro pricing model will now have on-demand concurrent builds enabled by default. This ensures builds across projects start immediately without waiting in a queue, except when multiple builds target the same Git branch.

A more flexible Pro plan for modern teams
We’re updating Vercel’s Pro plan to better align with how modern teams collaborate, how applications consume infrastructure, and how workloads are changing shape with AI.

Critical npm supply chain attack response - September 8, 2025
How Vercel responded to the September 2025 npm supply chain attack on chalk, debug, and 16 other packages. Incident timeline, impact analysis, and customer remediation.

Vercel Functions now support graceful shutdown
Vercel Functions running on the Node.js or Python runtimes now support graceful shutdown, allowing them to run cleanup tasks for up to 500 milliseconds before shutting down.
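
A hedged sketch of what that window can be used for, assuming a simple in-memory telemetry buffer (the buffer and flush function are illustrative, not a Vercel API):

```typescript
// Flush buffered telemetry before the function instance shuts down.
// The platform allows up to 500 ms of cleanup, so the SIGTERM handler
// races the flush against a slightly shorter deadline.
export const pendingEvents: string[] = [];

export async function flushEvents(): Promise<number> {
  // Stand-in for shipping events to a telemetry backend.
  const shipped = pendingEvents.length;
  pendingEvents.length = 0;
  return shipped;
}

process.on("SIGTERM", () => {
  void Promise.race([
    flushEvents(),
    // Give up after 450 ms so we exit before the hard cutoff.
    new Promise<void>((resolve) => setTimeout(resolve, 450)),
  ]).then(() => process.exit(0));
});
```

Racing the cleanup against a deadline shorter than the platform's limit ensures the process exits cleanly rather than being killed mid-flush.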

Export traces, web analytics events, and speed insights datapoints to any destination
You can now export logs, traces, web analytics events, and speed insights datapoints with Vercel Drains.

Zero-configuration Express backends
Vercel now detects and deploys Express, the fast, unopinionated, minimalist web framework for Node.js, with zero configuration.

Stress testing Biome's noFloatingPromises lint rule
We partnered with Biome to push their noFloatingPromises lint rule to the limit, uncovering edge cases and showing how we solve hard problems together.

Open SDK strategy
Vercel’s Open SDK strategy commits to building frameworks, SDKs, and tools in the open, under permissive licenses. Learn how we’re avoiding lock-in, ensuring portability, and investing in open source to build a better web for everyone.

CVE-2025-55173
A vulnerability affecting Next.js Image Optimization has been addressed. It impacted versions prior to v15.4.5 and v14.2.31.
CVE-2025-57822
A vulnerability affecting Next.js Middleware has been addressed. It impacted versions prior to v14.2.32 and v15.4.7.
CVE-2025-57752
A vulnerability affecting Next.js Image Optimization has been addressed. It impacted versions prior to v15.4.5 and v14.2.31.