Vercel Blog
Introducing the Dubai Vercel region (dxb1)
Dubai (dxb1) is now part of Vercel’s global edge network, improving latency for users in the Middle East, Africa, and Central Asia.

Tray.ai cut build times from a day to minutes with Vercel
Tray.ai cut build times from a full day to just two minutes after migrating to Vercel. By consolidating infrastructure and updating their tech stack, they now deliver over a million monthly page views with a faster, more resilient site.

Improved unhandled Node.js errors in Fluid compute
Fluid compute now gracefully handles Node.js uncaught exceptions and unhandled rejections to provide better isolation between requests.

Improved team overview page
The Vercel team overview now sorts by your activity, can be filtered by repository, and gives you an at-a-glance view of your usage.

Building efficient MCP servers
MCP is becoming the standard for building AI model integrations. See how you can use Vercel's open-source MCP adapter to quickly build your own MCP server, like the teams at Zapier, Composio, and Solana.
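
As a rough illustration of the adapter in practice, here is a minimal MCP server route sketched with the @vercel/mcp-adapter package. The route path, the "echo" tool, and its schema are invented for this example, so treat the exact handler shape as an assumption and check the adapter's README.

```ts
// app/api/mcp/[transport]/route.ts
// Minimal sketch of an MCP server on Vercel using @vercel/mcp-adapter.
// The "echo" tool and its schema are illustrative, not from the post.
import { createMcpHandler } from '@vercel/mcp-adapter';
import { z } from 'zod';

const handler = createMcpHandler((server) => {
  // Register a single tool that echoes its input back to the model
  server.tool(
    'echo',
    'Echoes the provided message',
    { message: z.string() },
    async ({ message }) => ({
      content: [{ type: 'text', text: `You said: ${message}` }],
    })
  );
});

export { handler as GET, handler as POST };
```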

Designing and building the Vercel Ship conference platform
Here's how we designed and built our Vercel Ship conference platform. We generated 15,000+ images and videos with tools like Flux, Veo 2, Runway, and Ideogram. Then, we moved to v0 for prototyping. See our iterations, examples, tech stack, and more.

Filter runtime logs for fatal function errors
You can now filter your runtime logs to identify fatal function errors, such as Node.js crashes, in the Vercel Logs UI.

How we’re adapting SEO for LLMs and AI search
AI is changing how content gets discovered. Now, SEO ranking ≠ LLM visibility. No one has all the answers, but here's how we're adapting our approach to SEO for LLMs and AI search.

Observability added to AI Gateway alpha
Vercel Observability now includes a dedicated AI section to surface metrics related to the AI Gateway.

Claude Code and Cursor Agent no longer require a team seat
Claude Code and Cursor Agent can now trigger builds on Vercel without a team seat, as part of our bot detection policies.

Bot Protection is now generally available
Vercel's Bot Protection managed ruleset allows users to mitigate unwanted bot activity on their projects in a single click.

Pre-generate SSL certs, now in the Domains dashboard
The Domains Dashboard now enables zero-downtime migration by allowing SSL certificates to be pre-provisioned before migrating domains.

The no-nonsense approach to AI agent development
Learn how to build reliable, domain-specific AI agents by simulating tasks manually, structuring logic with code, and optimizing with real-world feedback. A clear, hands-on approach to practical automation.
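
To make "structuring logic with code" concrete, here is a hypothetical sketch in which the workflow lives in ordinary TypeScript and the model handles only one narrow, schema-constrained step. The classifyTicket, escalateToOnCall, and routeToQueue names, the model string, and the schema are all invented for illustration.

```ts
// Hypothetical sketch: a domain-specific agent structured as plain code,
// with the LLM limited to a single, well-defined classification step.
import { generateObject } from 'ai';
import { z } from 'zod';

async function classifyTicket(ticket: string) {
  const { object } = await generateObject({
    model: 'openai/gpt-4o', // placeholder model string
    schema: z.object({
      category: z.enum(['billing', 'bug', 'feature-request']),
      urgent: z.boolean(),
    }),
    prompt: `Classify this support ticket:\n${ticket}`,
  });
  return object;
}

export async function handleTicket(ticket: string) {
  // Deterministic control flow stays in code, not in the prompt
  const { category, urgent } = await classifyTicket(ticket);
  if (urgent) return escalateToOnCall(ticket);
  return routeToQueue(category, ticket);
}

// Placeholder integrations, assumed to exist elsewhere in the app
declare function escalateToOnCall(ticket: string): Promise<void>;
declare function routeToQueue(category: string, ticket: string): Promise<void>;
```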

New firewall challenge metrics now available
You can now monitor and query for challenge outcomes with two new metrics, available in the Firewall and Observability dashboards.

Introducing the v0 composite model family
Learn how v0's composite AI models combine RAG, frontier LLMs, and AutoFix to build accurate, up-to-date web app code with fewer errors and faster output.

Fluid compute now supports ISR background and on-demand revalidation
Fluid compute now supports both background and on-demand revalidations across all Vercel projects. This update brings the same performance and efficiency improvements to on-demand cache updates, with no configuration changes required.
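
For context, on-demand revalidation is typically triggered from a Next.js route handler or server action. Here is a minimal sketch using the framework's revalidateTag helper; the 'posts' tag, the endpoint path, and the secret check are illustrative, not from the announcement.

```ts
// app/api/revalidate/route.ts
// Minimal sketch of on-demand ISR revalidation in a Next.js route handler.
import { revalidateTag } from 'next/cache';
import { NextRequest, NextResponse } from 'next/server';

export async function POST(request: NextRequest) {
  // Guard the endpoint with a shared secret (placeholder env var name)
  if (request.nextUrl.searchParams.get('secret') !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ message: 'Invalid secret' }, { status: 401 });
  }

  // Invalidate every cached fetch tagged 'posts'; with this update, Fluid
  // compute handles the resulting on-demand revalidation as well.
  revalidateTag('posts');
  return NextResponse.json({ revalidated: true, now: Date.now() });
}
```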

Faster login flow and new Google Sign-in support
We’ve improved the login experience with a new design and support for Google sign-in, including Google One Tap. Signing in with Google is now a single-click experience.

Fluid compute: Evolving serverless for AI workloads
Fluid, our newly announced compute model, eliminates wasted compute by maximizing resource efficiency. Instead of launching a new function for every request, it intelligently reuses available capacity, ensuring that compute isn’t sitting idle.

CVE-2025-48068
A low-severity vulnerability in the Next.js dev server has been addressed. It affects versions 13.0.0 through 14.2.29 and 15.0.0 through 15.2.1 when using the App Router, and the exploit relies on cross-site WebSocket hijacking (CSWSH).

AI query prompting now available in Observability Plus
AI query prompting is now available in Observability Plus, allowing users to write or edit log queries using natural language. Generate shareable, bookmarkable queries without writing syntax.

Faster CDN proxying to external origins
Vercel’s upgraded CDN connection pooling speeds up proxying to external backends by up to 60%, cutting latency for both low-traffic and high-traffic apps.

Middleware insights now available in Vercel Observability
The Vercel Observability dashboard now includes a dedicated view for middleware, showing invocation counts and performance metrics.

Rate limiting now available on Hobby, with higher included usage on Pro
The first 1,000,000 allowed rate limit requests per month are now included. Hobby teams also get 1 free rate limit rule per project, up to the same included allotment.
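
Rate limits are configured as rules on your project and can also be checked from application code. The sketch below is hypothetical: it assumes the @vercel/firewall package's checkRateLimit helper and a rule ID of 'api-limit' configured in the dashboard, so verify the exact API against the rate limiting docs.

```ts
// app/api/hello/route.ts
// Hypothetical sketch: check a dashboard-configured rate limit rule
// before doing any work. Assumes @vercel/firewall exposes checkRateLimit.
import { checkRateLimit } from '@vercel/firewall';

export async function GET(request: Request) {
  // 'api-limit' is a placeholder rule ID configured in the Vercel dashboard
  const { rateLimited } = await checkRateLimit('api-limit', { request });

  if (rateLimited) {
    return new Response('Too many requests', { status: 429 });
  }

  return Response.json({ ok: true });
}
```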

Vercel security roundup: improved bot defenses, DoS mitigations, and insights
Since February, Vercel blocked over 148 billion attacks from 108 million IPs. This roundup highlights improvements to bot protection, DoS mitigation, and firewall tooling to help teams build securely by default.

External API caching insights now in Observability
Caching insights are now available in Observability for external API calls that use the Vercel Data Cache.
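
For reference, an external API call flows through the Vercel Data Cache when it opts into Next.js fetch caching, which is what these insights report on. A minimal sketch, with a placeholder URL and a one-hour revalidation window:

```tsx
// app/products/page.tsx
// Minimal sketch: an external API call cached via the Vercel Data Cache.
export default async function ProductsPage() {
  const res = await fetch('https://api.example.com/products', {
    // Cache the response for one hour and tag it for later invalidation
    next: { revalidate: 3600, tags: ['products'] },
  });
  const products: { id: string; name: string }[] = await res.json();

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```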

Vercel Blob is now generally available
Vercel Blob is now generally available, bringing high-performance storage integrated with the Vercel application delivery platform.
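
As a quick reference, a server-side upload with the @vercel/blob package looks roughly like the sketch below; the endpoint path, the 'avatars/' prefix, and public access are illustrative choices, not part of the announcement.

```ts
// app/api/upload/route.ts
// Minimal sketch of a server-side upload with @vercel/blob.
import { put } from '@vercel/blob';

export async function POST(request: Request): Promise<Response> {
  const filename =
    new URL(request.url).searchParams.get('filename') ?? 'upload.bin';

  // Stream the request body straight into Blob storage and return its URL
  const blob = await put(`avatars/${filename}`, request.body!, {
    access: 'public',
  });

  return Response.json(blob);
}
```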

How Vapi built their MCP server on Vercel
Vapi used Vercel's MCP Adapter to deploy and host their MCP server on Vercel, leveraging the benefits of Fluid compute.

Vercel Blob is now generally available: Cost-efficient, durable storage
Vercel Blob is now generally available, providing durable object storage that's integrated with Vercel's application delivery network.

Introducing the AI Gateway
With the AI Gateway, build with any model instantly. No API keys, no configuration, no vendor lock-in.
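
As a rough sketch of what that looks like in code: with the AI SDK, models behind the gateway are addressed by a provider/model string instead of per-provider keys. The model string and prompt below are placeholders, and the exact wiring may differ by AI SDK version.

```ts
// Minimal sketch of calling a model through the AI Gateway with the AI SDK.
import { generateText } from 'ai';

const { text } = await generateText({
  // Models are addressed as 'provider/model' strings; the gateway handles
  // routing and credentials, so no per-provider API keys are configured here.
  model: 'openai/gpt-4o',
  prompt: 'Summarize the latest Vercel changelog in one sentence.',
});

console.log(text);
```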