
Serverless in 2026: Lambda vs Vercel vs Workers

Serverless concepts, platform comparison between AWS Lambda, Vercel Functions, and Cloudflare Workers, plus real use cases and limitations.

Serverless architecture on the cloud

No servers to manage. Upload code and it runs. Traffic spikes get handled automatically. Zero cost when nothing's happening. That's the pitch of serverless.

Of course, servers still exist — you just don't manage them. The cloud provider handles provisioning, OS patching, scaling, and availability. You write code. That's it.

FaaS and BaaS

Serverless splits into two categories.

FaaS (Function as a Service) — Run code at the function level. AWS Lambda, Cloudflare Workers, Vercel Functions. An HTTP request comes in, the function executes, returns a result, done. No resources consumed between invocations.

BaaS (Backend as a Service) — Backend capabilities served as APIs. Firebase (auth, DB, storage), Supabase, Auth0. Instead of building a backend, you use pre-built services.

This article focuses on FaaS.

AWS Lambda

The original serverless FaaS. Launched in 2014, it's the most mature and feature-rich option.

What it offers:

  • Broad language support. Node.js, Python, Java, Go, .NET, Ruby — and custom runtimes for almost anything else
  • Deep AWS service integration. S3 upload triggers Lambda, DynamoDB change triggers Lambda, SQS message triggers Lambda. Event-driven architectures are straightforward to build
  • Maximum execution time of 15 minutes. Short tasks fit naturally; longer batch jobs pair with Step Functions
  • Memory ranges from 128MB to 10GB, with CPU allocated proportionally
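The programming model is a plain exported function. A minimal sketch of a Node.js handler, assuming an API Gateway proxy-style event (the event and result shapes below are simplified stand-ins for the real proxy integration types):

```typescript
// Minimal Lambda-style handler sketch. Event/result shapes are simplified
// stand-ins for API Gateway's proxy integration types.
interface ApiEvent {
  queryStringParameters?: Record<string, string> | null;
}
interface ApiResult {
  statusCode: number;
  body: string;
}

export const handler = async (event: ApiEvent): Promise<ApiResult> => {
  // Work happens only while this function runs; nothing is held between invocations.
  const name = event.queryStringParameters?.name ?? "world";
  return { statusCode: 200, body: JSON.stringify({ message: `hello, ${name}` }) };
};
```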

Pricing: Per-request plus execution time. The free tier includes 1 million requests and 400,000 GB-seconds of compute (~111 GB-hours) per month, which means small services can run for practically nothing.

Cost = (requests × $0.0000002) + (duration in GB-seconds × $0.0000166667)

A million requests costs $0.20 in request charges. Genuinely cheap.
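The formula plugged into code (rates as published above; this sketch ignores the free tier):

```typescript
// Lambda cost estimate from the formula above: per-request charge plus
// GB-seconds of execution time. Free-tier allowances are not subtracted.
function lambdaCostUSD(requests: number, avgDurationMs: number, memoryGB: number): number {
  const requestCost = requests * 0.0000002;
  const gbSeconds = requests * (avgDurationMs / 1000) * memoryGB;
  return requestCost + gbSeconds * 0.0000166667;
}

// 1M requests averaging 100ms at 128MB:
// $0.20 in request charges + ~$0.21 for 12,500 GB-seconds ≈ $0.41/month
const monthly = lambdaCostUSD(1_000_000, 100, 0.125);
```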

The main downside is cold starts. When a Lambda function hasn't been invoked for a while, the instance gets torn down. The next request has to spin up a new one, adding latency. Node.js cold starts typically run 100-500ms; Java can hit 1-5 seconds. Provisioned Concurrency keeps instances warm, but that means paying for idle capacity.

Vercel Functions

Serverless functions from the team behind Next.js. Probably the most familiar serverless environment for frontend developers.

What it offers:

  • Next.js API Routes and Server Actions automatically deploy as serverless functions. Drop a file in app/api/ and you're done
  • Git push deploys automatically. No CI/CD pipeline to configure
  • Preview deployments on every PR, which makes review much easier
  • Edge Functions for running code at CDN edge locations
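For example, a hypothetical app/api/hello/route.ts becomes a deployed function with no extra configuration — a sketch using the App Router's Route Handler convention:

```typescript
// Hypothetical app/api/hello/route.ts — Vercel deploys this Route Handler
// as a serverless function on git push; no infrastructure config needed.
export async function GET(request: Request): Promise<Response> {
  const { searchParams } = new URL(request.url);
  const name = searchParams.get("name") ?? "world";
  return Response.json({ message: `hello, ${name}` });
}
```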

Pricing: The Hobby plan (free) includes 100K requests/month — plenty for side projects. The Pro plan ($20/user/month) is where production workloads make sense. Watch out for bandwidth: overage runs ~$0.15/GB beyond the 1 TB included on Pro. Spend Management can cap costs, but traffic spikes or DDoS events can still get expensive if limits aren't configured.

Primary use case: Vercel Functions aren't general-purpose serverless. They're optimized for backend logic in frontend framework projects. API routes, server-side rendering, ISR — these are what Vercel does well. Triggering functions from S3 events or connecting to SQS isn't Vercel's territory.

Cloudflare Workers

Cloudflare's edge serverless platform. The most architecturally distinct option of the bunch.

What it offers:

  • Runs on V8 isolates, not Node.js. Code executes in isolated instances of the V8 JavaScript engine (the same engine Chrome uses) — lighter and faster to start than containers or VMs
  • Cold starts are essentially zero. Instance startup under 5ms. The cold start problem that plagues Lambda barely exists here
  • Runs across 300+ edge locations worldwide. Functions execute at the server closest to the user, yielding very low latency
  • Its own storage ecosystem: Workers KV (key-value), Durable Objects (stateful coordination), R2 (S3-compatible storage), D1 (SQLite-based DB)
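A minimal Worker sketch pairing the fetch handler with Workers KV as a cache — the CACHE binding name and the hand-written Env typing are assumptions here; in a real project, bindings are declared in wrangler.toml:

```typescript
// Workers-style module: a fetch handler plus a KV binding used as a cache.
// Env is typed by hand for illustration; real bindings mirror wrangler.toml.
interface Env {
  CACHE: {
    get(key: string): Promise<string | null>;
    put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
  };
}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { pathname } = new URL(request.url);
    const cached = await env.CACHE.get(pathname);
    if (cached !== null) {
      return new Response(cached, { headers: { "x-cache": "hit" } });
    }
    const body = JSON.stringify({ path: pathname });
    await env.CACHE.put(pathname, body, { expirationTtl: 60 }); // cache for 60s
    return new Response(body, { headers: { "x-cache": "miss" } });
  },
};

export default worker;
```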

Pricing: Free plan: 100K requests/day with 10ms CPU time limit. Paid plan ($5/month): 10 million requests/month included, $0.30 per additional million. Billing is based on CPU time only — I/O wait time is excluded, which favors functions that make lots of external API calls. Unlimited bandwidth is a big deal for high-traffic services where Vercel costs would balloon.

Limitations: The V8 isolate environment doesn't support all Node.js APIs. No fs, no net. npm packages that depend on Node.js native modules won't work. Your code needs to be written for the Workers environment or use compatible libraries.

CPU time is capped too. Free plan: 10ms, paid plan: 30 seconds. I/O wait doesn't count, so API-heavy functions are fine, but CPU-intensive work doesn't belong here.

Deno Deploy

A serverless platform running the Deno runtime. Similar to Cloudflare Workers in that it runs at the edge.

What it offers:

  • Native Deno runtime. Built-in TypeScript support, standard Web APIs (fetch, Request, Response)
  • Global edge execution with low latency
  • Natural integration with Fresh (Deno's web framework)
  • Deno KV — a globally distributed key-value store built in
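The programming model is a plain fetch-style handler built on the same Web Standard types — a minimal sketch (on Deno Deploy the handler would be registered with Deno.serve):

```typescript
// Deno Deploy sketch: a standard Web API handler. Request/Response are the
// same Web Standard types used by Workers and modern browsers.
function handler(req: Request): Response {
  const { pathname } = new URL(req.url);
  return Response.json({ path: pathname, runtime: "deno" });
}

// On Deno Deploy, the entry point is:
//   Deno.serve(handler);
```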

Market share is small, but its Web Standard API foundation makes it portable. Worth watching as the Deno ecosystem grows.

Platform Comparison

| Feature          | AWS Lambda    | Vercel Functions | Cloudflare Workers | Deno Deploy  |
| Runtime          | Container     | Node.js / Edge   | V8 isolate         | Deno         |
| Cold start       | 100ms-seconds | Moderate         | Nearly zero (~5ms) | Nearly zero  |
| Max duration     | 15 min        | 60s (Hobby)      | 30s CPU (Paid)     | 50ms CPU     |
| Edge execution   | Lambda@Edge   | Edge Functions   | Default            | Default      |
| Language support | Many          | JS/TS            | JS/TS/Wasm         | JS/TS        |
| Free tier        | 1M req/month  | 100K req/month   | 100K req/day       | 1M req/month |
| AWS integration  | Excellent     | None             | Some (R2 = S3 API) | None         |

Cold Starts — The Chronic Pain of Serverless

The most common complaint about serverless. When a function sits idle, the instance is destroyed. The next request pays the cost of spinning up a fresh one.

Factors that affect cold start time:

  • Runtime — Python and Node.js are fast (100-300ms); Java and .NET are slow (1-5 seconds)
  • Package size — More dependencies means slower startup. Keeping your zip size small matters on Lambda
  • Memory allocation — Higher memory on Lambda means proportionally more CPU, which speeds up cold starts
  • VPC attachment — Putting Lambda inside a VPC adds latency from ENI creation (though this has improved significantly)

Mitigation strategies:

  • Provisioned Concurrency (Lambda) — Pre-warm instances, but at a cost
  • Periodic pings — Call your function on a schedule to keep instances alive. A workaround, not a fix
  • Switch to Cloudflare Workers / Deno Deploy — V8 isolate architecture sidesteps the problem entirely
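The periodic-ping workaround can be sketched as a handler that short-circuits on a scheduled warmer event — the warmer field and event shape are assumptions, but the key mechanism is real: module scope survives across warm invocations on the same instance.

```typescript
// Periodic-ping sketch: module scope persists while the instance stays warm,
// so a flag distinguishes cold from warm invocations. A scheduler (e.g. a
// cron trigger) would send { warmer: true } every few minutes.
let coldStart = true;

export const handler = async (
  event: { warmer?: boolean }
): Promise<{ statusCode: number; wasCold: boolean; body?: string }> => {
  const wasCold = coldStart;
  coldStart = false; // later invocations on this instance are warm
  if (event.warmer) {
    return { statusCode: 204, wasCold }; // warmer ping: skip real work
  }
  return { statusCode: 200, wasCold, body: JSON.stringify({ wasCold }) };
};
```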

When Serverless Fits

API backends — CRUD APIs, webhook handlers, authentication. Runs only when called, which aligns perfectly with serverless.

Event processing — Image resizing on upload, sending emails after payment, log collection.

Variable traffic — Services that are quiet most of the time but spike unpredictably. Serverless auto-scales without any capacity planning.

Prototypes and MVPs — When speed matters most. Skip infrastructure setup and just write code.

When Serverless Doesn't Fit

Long-running tasks — Video encoding, ML training, large batch processing. Execution time limits will bite you, and costs scale worse than containers or VMs.

Stateful workloads — WebSocket connections, game servers. Serverless functions are stateless by design. Durable Objects (Cloudflare) can help, but they add complexity.

High sustained throughput — Thousands of requests per second, consistently. At that volume, always-on containers are cheaper than per-invocation pricing.

Debugging — A universal serverless weakness. Local reproduction is harder, distributed system debugging complexity stacks up, and serverless-specific issues like cold start timeouts add another layer.

The Edge Computing Trend

Edge computing is serverless's next evolution. Traditional serverless runs functions in specific data center regions. Edge serverless runs them at CDN nodes closest to the user.

Cloudflare Workers leads this space. Vercel Edge Functions and Deno Deploy follow the same model. AWS offers Lambda@Edge and CloudFront Functions, but not as comprehensively.

What the edge is good for:

  • A/B testing — Route users at the edge
  • Auth token validation — Process before hitting the origin
  • Geo-based content customization
  • API response caching and transformation
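As a concrete sketch, edge A/B routing often reduces to hashing a stable user identifier into a bucket before the request reaches the origin — the hash function and variant names here are illustrative:

```typescript
// Deterministic A/B bucketing, dependency-free so it runs in any edge
// runtime. A stable ID (e.g. from a cookie) always maps to the same variant.
function abVariant(userId: string, variants: string[] = ["control", "treatment"]): string {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return variants[hash % variants.length];
}
```

An edge function would read the ID from a cookie, pick the variant, and rewrite the origin URL or set a response header accordingly — all before the request leaves the edge location.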

Edge execution comes with tighter memory and duration limits, and database access can be restricted. The right approach is identifying which operations benefit from edge execution rather than moving everything there.

Serverless isn't a silver bullet. But for the right use cases, it eliminates a massive amount of infrastructure burden. Side projects and small services especially benefit — serverless is simpler and cheaper than maintaining even a single server. Match the tool to your project's scale and characteristics.

#Serverless #AWS Lambda #Vercel #Cloudflare Workers #Cloud
