
Infrastructure

Serverless Architecture

10 min read
Last reviewed: March 2026

Serverless means you don't manage servers. You write functions that run on demand. No servers to provision, no infrastructure to manage, no capacity planning. You pay only for what you use.

What Is Serverless?

"Serverless" doesn't mean no servers. It means servers are hidden from you. You write a function, upload it to a serverless platform, and the platform runs it when needed.

Traditional server: you provision a server (2GB RAM, 2 CPUs), pay $20/month whether you use it or not, manage it forever.

Serverless function: you write code, upload it, the platform runs it when invoked. If it runs for 100ms and uses 128MB RAM, you pay for 100ms of 128MB. If it doesn't run at all, you pay nothing.
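As a minimal sketch, a serverless function is just a handler the platform calls on each invocation. This follows the AWS Lambda Python handler shape (an `event` payload plus a `context` object); other platforms use slightly different signatures:

```python
import json

def handler(event, context):
    """Entry point the platform invokes on demand.

    `event` carries the trigger payload (HTTP request, queue message, etc.);
    `context` carries runtime metadata. Note what's absent: no port binding,
    no process management, no server at all -- the platform handles that.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

You never start this code yourself; the platform loads it, invokes it when an event arrives, and bills you only for the milliseconds it runs.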

This is revolutionary for applications with unpredictable traffic. A webhook that processes data occasionally—serverless is perfect. A scheduled job that runs nightly—serverless is perfect. A website with consistent traffic—might not be cost-effective.

Serverless Platforms

AWS Lambda: The original serverless platform. Supports Node.js, Python, Java, Go, C#. Integrates with everything in the AWS ecosystem. Powerful but complex.

Google Cloud Functions: Similar to Lambda. Slightly simpler interface. Good integration with Google Cloud services.

Azure Functions: Microsoft's offering. Good for companies using Azure and Microsoft tools.

Vercel Edge Functions: Run code on Vercel's edge network worldwide. Great for APIs that need global performance.

Cloudflare Workers: Similar to Vercel Edge Functions. Run code on Cloudflare's edge. Lightweight and focused on API/HTTP workloads.

All are similar in concept. Choose based on existing ecosystem and provider preference.

Cold Starts

When a serverless function runs for the first time (or after being idle), the platform must spin up a container, load your code, initialize it. This takes 100ms to several seconds. This is a "cold start."

Subsequent invocations while the container is warm are fast (milliseconds). If the function sits idle for a while, the container is destroyed. The next invocation is cold again.

Cold starts are the main limitation of serverless. For latency-sensitive applications (real-time APIs), cold starts are unacceptable. For batch jobs or webhooks where 1 second latency is fine, they don't matter.

Strategies to minimize cold starts:

  • Keep code small: Large deployment packages take longer to load. Keep functions focused.
  • Use lightweight languages: Node.js and Python cold start faster than Java or C#.
  • Provisioned concurrency: Keep N containers warm at all times. You pay for the warmth but get instant responses. Good for critical APIs.
  • Scheduled warmup: Invoke the function periodically to keep it warm.
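A pattern underlying several of these strategies: anything created at module level (clients, config, parsed models) survives between warm invocations, so you pay initialization once per container rather than once per request. A small sketch, with a counter standing in for an expensive client setup:

```python
import time

# Module-level code runs once per cold start; the container (and
# everything defined here) is then reused for warm invocations.
_init_count = 0

def _expensive_init():
    """Stand-in for loading config, opening a database client, etc."""
    global _init_count
    _init_count += 1
    time.sleep(0.01)  # simulated startup cost
    return {"client": "ready"}

_client = _expensive_init()  # paid once per cold start

def handler(event, context):
    # Warm invocations reuse _client; only a fresh container re-runs
    # _expensive_init, so keep heavy setup out of the handler body.
    return {"init_count": _init_count, "client": _client["client"]}
```

Calling the handler repeatedly shows the init ran only once, which is exactly the saving a warm container gives you.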
Note
Cold starts are improving. Platforms are getting faster, languages are optimizing startup time. In 5 years, cold starts might be negligible.

When Serverless Is Ideal

Event-driven workloads: User uploads file → trigger image processing function → save results to S3. Perfect for serverless.
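The upload-to-processing flow above can be sketched as a handler that pulls the bucket and key out of an S3-style event (the nested record structure below follows the shape AWS sends to Lambda; the actual image processing is left as a placeholder):

```python
def process_image_handler(event, context):
    """Triggered when objects land in the bucket; one record per object."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Real code would download the object, resize or transform it,
        # and write the result back to S3. Here we just report the target.
        results.append(f"processed s3://{bucket}/{key}")
    return results
```

The platform wires the trigger for you: upload a file, the function fires, you pay for the milliseconds of processing and nothing in between.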

APIs with variable traffic: Webhook from third-party service → process and store data. Might receive 1 request/hour or 100 requests/minute. Serverless scales automatically.

Scheduled jobs: Run something daily, weekly, monthly. Schedule with CloudWatch Events or similar. Only pay for execution time.

Real-time data processing: Stream data to Kinesis → Lambda processes and stores. Scales automatically with data volume.

When Serverless Is Not Ideal

Long-running processes: Functions have 15-minute timeouts typically. Cannot run 1-hour jobs. Use containers or VMs instead.

Persistent connections: WebSockets, gRPC, anything that needs to keep a connection open. Serverless functions are short-lived.

Tight latency requirements: High-frequency trading, real-time gaming, anything where 100ms latency matters. Cold starts are unacceptable.

Consistent traffic: Website with 1000 steady requests/second. You pay per invocation. A VM running 24/7 is cheaper. Calculate the math before choosing serverless.
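"Calculate the math" can be made concrete. Using Lambda's published rates ($0.20 per million requests, $0.0000166667 per GB-second), a sketch that finds the monthly volume where serverless overtakes a fixed-price VM; the $20/month VM and the per-call profile are illustrative assumptions:

```python
# AWS Lambda's published rates; VM price and workload profile are
# illustrative assumptions, not a recommendation.
PER_MILLION_REQUESTS = 0.20   # USD per 1M invocations
PER_GB_SECOND = 0.0000166667  # USD per GB-second
VM_MONTHLY = 20.0             # USD, fixed cost

def serverless_cost(invocations, seconds_per_call=1.0, memory_gb=0.25):
    """Monthly serverless bill: request charge plus duration charge."""
    request_cost = invocations / 1_000_000 * PER_MILLION_REQUESTS
    duration_cost = invocations * seconds_per_call * memory_gb * PER_GB_SECOND
    return request_cost + duration_cost

def break_even_invocations(seconds_per_call=1.0, memory_gb=0.25):
    """Monthly invocations at which serverless cost equals the VM."""
    per_call = (PER_MILLION_REQUESTS / 1_000_000
                + seconds_per_call * memory_gb * PER_GB_SECOND)
    return VM_MONTHLY / per_call
```

For a 1-second, 256MB function the break-even lands around 4.6 million invocations per month: below that, serverless is cheaper than the VM; above it, the VM wins on raw price (before counting the ops burden the VM adds).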

Serverless Databases

Traditional databases don't pair well with serverless. Opening a database connection takes time; for a function that runs 100ms, that overhead is significant. Worse, databases cap the number of concurrent connections, and hundreds of function instances each opening their own connection can exhaust the pool.

Serverless databases solve this:

PlanetScale (MySQL): MySQL database as a service. Uses query API instead of connections. Scales to millions of serverless functions.

Neon (PostgreSQL): PostgreSQL as a service. HTTP API or traditional connections. Good for serverless applications.

Supabase (PostgreSQL): PostgreSQL with real-time, authentication, storage. Designed for serverless and next-gen apps.

DynamoDB (AWS): NoSQL database integrated with Lambda. Pay per read/write request. Good for serverless workloads.

These solve the connection problem. Pick based on your data model (SQL vs NoSQL) and existing ecosystem.

Serverless Cost Model

You pay for:

  • Invocations: Number of times the function runs
  • Duration: How long it runs (in 1ms increments typically)
  • Memory allocated: Memory size of the function

AWS Lambda example: $0.20 per million invocations, $0.0000166667 per GB-second.

If you have 1 million invocations/month, each using 256MB for 1 second:

Invocation cost: 1,000,000 * $0.20 / 1,000,000 = $0.20
Duration cost: 1,000,000 * 1s * (256MB / 1024) GB * $0.0000166667 = ~$4.17
Total: ~$4.37

Very cheap. But the numbers scale linearly. A heavier function, say 10 seconds at 1GB, costs roughly $0.00017 per invocation in duration alone; at 1 million invocations/month that's about $170, far more than a $20/month VM that could handle the same steady load.

Always calculate the cost for your specific use case before assuming serverless is cheaper.

Vendor Lock-In

Serverless APIs are platform-specific. AWS Lambda code doesn't run on Google Cloud Functions. If you decide to switch platforms, you rewrite functions.

This is a real concern for long-term projects. However, most teams find that the benefits of serverless (no ops, auto-scaling) outweigh lock-in risk. And switching is usually a one-time cost, not ongoing.

Mitigation: keep your serverless functions thin and focused on business logic. Abstract external concerns (database access, HTTP calls) behind your own interfaces so the platform-specific surface stays small. That makes switching easier.
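A sketch of that separation: business logic lives in a plain function with no platform types, and the platform-specific handler is a thin adapter. Only the adapter needs rewriting on a provider switch (all names here are illustrative):

```python
import json

# Business logic: pure Python, no platform imports. This is the part
# that survives a move between Lambda, Cloud Functions, and friends.
def calculate_discount(order_total, is_member):
    """Hypothetical domain rule: members get 10% off."""
    rate = 0.10 if is_member else 0.0
    return round(order_total * (1 - rate), 2)

# Platform adapter: the only Lambda-shaped code. Porting to another
# provider means rewriting just this wrapper around the same logic.
def lambda_handler(event, context):
    body = json.loads(event["body"])
    total = calculate_discount(body["order_total"], body["is_member"])
    return {"statusCode": 200, "body": json.dumps({"total": total})}
```

The domain function is also trivially unit-testable on its own, without mocking any platform event shapes.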

The Serverless Framework

The Serverless Framework lets you define, deploy, and manage serverless functions:

# serverless.yml
service: my-service

provider:
  name: aws
  region: us-east-1

functions:
  processImage:
    handler: handler.processImage
    events:
      - s3:
          bucket: my-bucket
          event: s3:ObjectCreated:*

This defines a function that triggers when files are uploaded to S3. Deploy with `serverless deploy`. Alternatives: AWS SAM, Terraform, CDK (AWS infrastructure as code in programming languages).

Tip
Serverless is powerful but not free. Calculate costs for your specific workload before committing. For some use cases (batch processing, variable-traffic APIs), it's dramatically cheaper. For others, a simple VPS is cheaper.

Serverless vs Containers vs VMs

Serverless: Pay per invocation; very cheap for spiky traffic. Instant auto-scaling. Limits: 15-minute timeouts, cold starts, limited ecosystem.

Containers: Pay per runtime hour; good value for steady traffic. Fast auto-scaling with orchestration. Limits: needs orchestration infrastructure, more complex.

VMs/VPS: Fixed monthly cost; good for steady traffic. Manual or slow auto-scaling. Limits: you manage everything, ongoing ops burden.

The Reality

Serverless is not a silver bullet. It's perfect for some workloads and terrible for others. The key is understanding your traffic pattern and cost model.

For APIs with variable traffic, event-driven workloads, and scheduled jobs, serverless is great. It means you don't need a DevOps team. Your code just runs.

For steady-traffic applications, long-running processes, or latency-critical systems, serverless might not fit. Use what matches your needs.

Many successful companies use a mix: serverless for APIs, containers for background jobs, managed databases. Mix and match.