
System Architecture Overview

Enterprise-grade infrastructure designed for 100M+ concurrent users


High-Level Architecture

WAVE is built on a distributed, microservices-based architecture optimized for global scale

Client Layer
Edge Network (Cloudflare - 275+ Locations)
API Gateway & Load Balancer
Ingest Services
Processing Pipeline
Distribution Services
Data Layer (Supabase PostgreSQL + Redis Cache)

Core Components

Key architectural components and their responsibilities

Frontend Architecture

Next.js 15 App Router

  • React 19 with Server Components for optimal performance
  • Edge runtime for sub-100ms response times globally
  • Streaming SSR for progressive content delivery
  • Automatic code splitting and route-based chunking

TypeScript 5.3+

  • Strict type safety across entire application
  • Auto-generated API types from OpenAPI spec
  • End-to-end type safety from DB to UI
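
To illustrate the end-to-end type safety described above, here is a minimal sketch. The `Stream` interface and its fields are hypothetical stand-ins; in the real system the equivalent types would be auto-generated from the OpenAPI spec rather than written by hand.

```typescript
// Hypothetical shape of a generated API type (illustrative, not the real spec).
interface Stream {
  id: string;
  title: string;
  status: "idle" | "live" | "ended";
  bitrateKbps: number;
}

// Because the same type flows from the DB layer through the API client to the
// UI, a schema change surfaces as a compile error instead of a runtime bug.
function isLive(stream: Stream): boolean {
  return stream.status === "live";
}
```

A UI component consuming `Stream` gets autocomplete and exhaustive checks on `status` for free.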

State Management

  • React Server Components for server state
  • Zustand for global client state
  • TanStack Query for server data caching
  • WebSocket for real-time updates
// Example: Server Component with streaming
import { Suspense } from 'react';

export default async function StreamDashboard() {
  // Server-side fetch; runs on the server, never ships to the client bundle
  const streams = await api.streams.list();

  return (
    <Suspense fallback={<StreamSkeleton />}>
      <StreamList streams={streams} />
    </Suspense>
  );
}

Data Flow Architecture

How data flows through the WAVE system from ingest to playback

1. Ingest

  1. Encoder connects to ingest endpoint
  2. Authentication and authorization
  3. Protocol handshake (WHIP/SRT/RTMP)
  4. Stream data reception begins
  5. Quality analysis and validation
  6. Forward to processing pipeline
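
The admission steps above (authenticate, then negotiate a protocol, then accept media) can be sketched as a small handler. Everything here is illustrative: the function and field names are assumptions, not the actual WAVE ingest API.

```typescript
type Protocol = "WHIP" | "SRT" | "RTMP";

interface IngestRequest {
  streamKey: string;
  protocol: Protocol;
}

interface IngestResult {
  accepted: boolean;
  reason?: string;
}

function handleIngest(
  req: IngestRequest,
  validKeys: Set<string>,
  supported: Set<Protocol>
): IngestResult {
  // Steps 1-2: authenticate and authorize the encoder via its stream key.
  if (!validKeys.has(req.streamKey)) {
    return { accepted: false, reason: "unauthorized" };
  }
  // Step 3: protocol handshake; reject protocols this endpoint does not speak.
  if (!supported.has(req.protocol)) {
    return { accepted: false, reason: "unsupported protocol" };
  }
  // Steps 4-6: reception, quality validation, and forwarding would follow here.
  return { accepted: true };
}
```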

2. Processing

  1. Transcode to multiple bitrates
  2. Generate ABR ladder (1080p to 360p)
  3. Audio normalization
  4. Create HLS/DASH manifests
  5. Extract thumbnails
  6. Publish to distribution
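
A 1080p-to-360p ABR ladder like the one in step 2, and the master-manifest variants it yields (step 4), might look like the sketch below. The specific renditions and bitrates are illustrative assumptions, not WAVE's actual encoding presets.

```typescript
interface Rendition {
  name: string;
  height: number;
  videoKbps: number;
}

// Illustrative ABR ladder; a real ladder is tuned per content type.
const abrLadder: Rendition[] = [
  { name: "1080p", height: 1080, videoKbps: 6000 },
  { name: "720p",  height: 720,  videoKbps: 3500 },
  { name: "480p",  height: 480,  videoKbps: 1500 },
  { name: "360p",  height: 360,  videoKbps: 800 },
];

// An HLS master manifest lists one #EXT-X-STREAM-INF entry per rendition,
// with BANDWIDTH in bits per second.
function masterManifestLines(ladder: Rendition[]): string[] {
  return ladder.map(
    (r) => `#EXT-X-STREAM-INF:BANDWIDTH=${r.videoKbps * 1000},NAME="${r.name}"`
  );
}
```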

3. Distribution

  1. Content cached at edge locations
  2. Player requests manifest
  3. Edge serves cached segments
  4. Adaptive bitrate selection
  5. Analytics collection
  6. Real-time metrics update
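
Step 4, adaptive bitrate selection, is typically a client-side decision: pick the highest rendition that fits within a safety fraction of measured throughput. This is a generic sketch of that logic, not the player algorithm WAVE actually ships; the 0.8 safety factor is an assumed value.

```typescript
interface Variant {
  name: string;
  kbps: number;
}

function selectVariant(
  variants: Variant[],
  measuredKbps: number,
  safety = 0.8 // leave headroom so throughput dips don't stall playback
): Variant {
  const budget = measuredKbps * safety;
  const sorted = [...variants].sort((a, b) => b.kbps - a.kbps);
  // Highest rendition that fits the budget; fall back to the lowest one.
  return sorted.find((v) => v.kbps <= budget) ?? sorted[sorted.length - 1];
}
```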

Scalability Architecture

How WAVE scales to support 100M+ concurrent users

Horizontal Scaling

  • Stateless API services for unlimited horizontal scale
  • Auto-scaling based on CPU, memory, and request metrics
  • Kubernetes orchestration with HPA and VPA
  • Zero-downtime deployments with rolling updates

Edge Computing

  • 275+ Cloudflare edge locations globally
  • Sub-50ms latency for 95% of global population
  • Edge caching for static and dynamic content
  • Intelligent request routing to nearest edge

Database Scaling

  • Read replicas for query load distribution
  • Connection pooling with PgBouncer
  • Redis caching layer for hot data
  • Sharding strategy for multi-region support
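
The Redis layer for hot data described above is usually wired in as a cache-aside pattern. The sketch below uses a stand-in `Cache` interface; a real service would back it with a Redis client (e.g. ioredis) exposing roughly the same get/set calls, and the key format and TTL are illustrative.

```typescript
interface Cache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

async function getStreamTitle(
  cache: Cache,
  loadFromDb: (id: string) => Promise<string>,
  streamId: string
): Promise<string> {
  const key = `stream:title:${streamId}`;
  const hit = await cache.get(key);
  if (hit !== null) return hit; // served from Redis, no DB round trip

  const title = await loadFromDb(streamId); // cache miss: hit Postgres
  await cache.set(key, title, 60); // short TTL keeps hot data fresh
  return title;
}
```

The same pattern shields the read replicas: repeated reads of hot rows are absorbed by Redis instead of the connection pool.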

Streaming Scale

  • Origin servers scale independently from edge
  • CDN pull-through caching reduces origin load
  • Multi-CDN strategy for redundancy
  • Automatic failover and traffic shifting
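
Multi-CDN failover and traffic shifting reduce, at their core, to choosing among healthy providers. This is a deliberately simplified sketch: the CDN names, health model, and deterministic highest-weight choice are assumptions, and a production router would do weighted-random or latency-based steering.

```typescript
interface CdnStatus {
  name: string;
  healthy: boolean;
  weight: number; // relative share of traffic when healthy
}

function pickCdn(cdns: CdnStatus[]): CdnStatus | null {
  const healthy = cdns.filter((c) => c.healthy);
  if (healthy.length === 0) return null; // total outage: nothing to route to
  // Deterministically take the highest-weight healthy CDN for simplicity;
  // if the primary fails its health check, traffic shifts automatically.
  return healthy.reduce((best, c) => (c.weight > best.weight ? c : best));
}
```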

Technical Specifications

Performance Targets

  • API Response Time: <100ms (p95)
  • Edge Response Time: <50ms (p95)
  • WebRTC Latency: 300-800ms
  • HLS Latency: 2-5 seconds
  • Uptime SLA: 99.99%

Capacity Limits

  • Concurrent Streams: 10M+
  • Concurrent Viewers: 100M+
  • API Rate Limit (Pro): 10K req/min
  • Max Stream Duration: Unlimited
  • Max Video Bitrate: 20 Mbps