Upstash

database, Redis, Kafka, serverless, edge, pay-per-request, global distribution, real-time

Database Platform


Overview

Upstash is a serverless data platform that provides Redis and Kafka as managed services. It uses a pay-per-request pricing model, so you pay only for what you actually use, and it can replicate data across 8+ regions for low-latency access worldwide. With integrations for edge computing platforms such as Vercel Edge Functions, Cloudflare Workers, and Fastly, it is optimized for modern serverless application development.

Details

Serverless Redis

Upstash Redis exposes an HTTP/REST API in addition to the standard Redis protocol, so it can be used even in environments that restrict raw TCP connections (such as Cloudflare Workers). Data can be replicated automatically across multiple regions, enabling low-latency access from the nearest region.
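
Any runtime that can issue HTTPS requests can therefore run Redis commands by mapping them onto the REST URL path. A minimal sketch using fetch, assuming the URL and token have been copied from the Upstash console into environment variables:

// SET greeting "hello" over the REST API: the command and its arguments become path segments
const res = await fetch(`${process.env.UPSTASH_REDIS_REST_URL}/set/greeting/hello`, {
  headers: { Authorization: `Bearer ${process.env.UPSTASH_REDIS_REST_TOKEN}` },
});

const { result } = await res.json();
console.log(result); // "OK"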

Serverless Kafka

A serverless implementation of the Kafka event-streaming platform: serverless and edge functions can produce and consume messages over an HTTP API. Per-message pricing flexibly accommodates traffic fluctuations.

Edge Optimization

Tested and optimized for major edge platforms including Vercel, Cloudflare, and Netlify, delivering optimal performance in globally distributed applications. Dedicated SDKs enhance the development experience in edge environments.

New Services in 2024

  • Vector: Serverless vector database for machine learning and AI applications
  • QStash: HTTP-based messaging service providing CRON scheduling and automatic retry functionality

Pros and Cons

Pros

  • True Pay-Per-Request: Zero idle charges, pay only for usage
  • Price Cap Guarantee: Monthly price ceiling prevents unexpected high bills
  • Global Low Latency: Automatic replication to 8+ regions
  • Edge Ready: Full support for Cloudflare Workers and Vercel Edge Functions
  • Standard Protocol: Redis/Kafka protocol compatibility for easy migration
  • Instant Startup: Request processing without cold starts
  • Developer Experience: Quick development with simple APIs and rich SDKs

Cons

  • Feature Limitations: Some advanced features unavailable compared to full Redis/Kafka
  • Latency Variation: Slight delays due to cross-region replication
  • Data Size Limits: May not be suitable for large datasets
  • Customization Limits: Limited configuration flexibility as managed service
  • Vendor Dependency: The application becomes tied to the Upstash platform

Implementation Examples

Setup and Initial Configuration

# Install Upstash CLI
npm install -g @upstash/cli

# Login
upstash auth login

# Create Redis database
upstash redis create my-redis-db --region us-east-1

# Create Kafka cluster  
upstash kafka create my-kafka-cluster --region us-east-1

Using Redis SDK (Node.js)

// Install @upstash/redis
// npm install @upstash/redis

import { Redis } from '@upstash/redis';

// Initialize Redis connection
const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL,
  token: process.env.UPSTASH_REDIS_REST_TOKEN,
});

// Basic key-value operations
async function basicOperations() {
  // Set value (the SDK serializes objects to JSON automatically)
  await redis.set('user:1', {
    id: 1,
    name: 'Alice',
    email: '[email protected]'
  });

  // Get value (deserialized back into an object automatically)
  const user = await redis.get('user:1');
  console.log('User:', user);

  // Set with TTL (60 seconds)
  await redis.setex('session:abc123', 60, 'user-data');

  // Increment counter
  const views = await redis.incr('page:home:views');
  console.log('Page views:', views);

  // List operations
  await redis.lpush('notifications', 'New message');
  const notification = await redis.rpop('notifications');
}

basicOperations();
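
Because every SDK command travels as its own HTTP request, batching related commands with the SDK's pipeline() helper cuts round trips. A minimal sketch reusing the redis client created above (the keys are the illustrative ones from the previous example):

// Batch several commands into a single HTTP round trip
async function pipelinedOperations() {
  const pipeline = redis.pipeline();
  pipeline.incr('page:home:views');
  pipeline.lpush('notifications', 'Pipelined message');
  pipeline.get('user:1');

  // exec() resolves to the results in command order
  const [views, listLength, user] = await pipeline.exec();
  console.log({ views, listLength, user });
}

pipelinedOperations();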

Using with Cloudflare Workers

// wrangler.toml
// [vars]
// UPSTASH_REDIS_REST_URL = "https://xxx.upstash.io"
// UPSTASH_REDIS_REST_TOKEN = "your-token"

import { Redis } from '@upstash/redis/cloudflare';

export default {
  async fetch(request, env) {
    const redis = Redis.fromEnv(env);
    
    // Implement rate limiting
    const ip = request.headers.get('CF-Connecting-IP');
    const key = `rate:${ip}`;
    
    const current = await redis.incr(key);
    if (current === 1) {
      await redis.expire(key, 60); // Reset after 60 seconds
    }
    
    if (current > 100) {
      return new Response('Rate limit exceeded', { status: 429 });
    }
    
    // Implement caching
    const cacheKey = `cache:${request.url}`;
    const cached = await redis.get(cacheKey);
    
    if (cached) {
      return new Response(cached, {
        headers: { 'X-Cache': 'HIT' }
      });
    }
    
    // Actual processing...
    const response = 'Hello from Edge!';
    await redis.setex(cacheKey, 300, response); // Cache for 5 minutes
    
    return new Response(response, {
      headers: { 'X-Cache': 'MISS' }
    });
  }
};
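
The hand-rolled counter above works, but Upstash also publishes @upstash/ratelimit, which implements fixed- and sliding-window limiting on top of Upstash Redis. A sketch of the same 100-requests-per-60-seconds policy using that package (the limit values are illustrative):

import { Redis } from '@upstash/redis/cloudflare';
import { Ratelimit } from '@upstash/ratelimit';

export default {
  async fetch(request, env, ctx) {
    // Allow 100 requests per 60 seconds per client IP (sliding window)
    const ratelimit = new Ratelimit({
      redis: Redis.fromEnv(env),
      limiter: Ratelimit.slidingWindow(100, '60 s'),
    });

    const ip = request.headers.get('CF-Connecting-IP') ?? 'anonymous';
    const { success, remaining, pending } = await ratelimit.limit(ip);

    // Let background work finish without blocking the response
    ctx.waitUntil(pending);

    if (!success) {
      return new Response('Rate limit exceeded', { status: 429 });
    }
    return new Response(`OK (${remaining} requests remaining)`);
  }
};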

Using Kafka

import { Kafka } from '@upstash/kafka';

const kafka = new Kafka({
  url: process.env.UPSTASH_KAFKA_REST_URL,
  username: process.env.UPSTASH_KAFKA_REST_USERNAME,
  password: process.env.UPSTASH_KAFKA_REST_PASSWORD,
});

// Producer
async function publishEvent() {
  const producer = kafka.producer();
  
  const message = {
    eventType: 'user.created',
    userId: '12345',
    timestamp: new Date().toISOString(),
    data: {
      name: 'Alice',
      email: '[email protected]'
    }
  };
  
  const response = await producer.produce('user-events', message);
  console.log('Message published:', response);
}

// Consumer
async function consumeEvents() {
  const consumer = kafka.consumer();
  
  const messages = await consumer.consume({
    consumerGroupId: 'my-group',
    instanceId: 'my-instance',
    topics: ['user-events'],
    autoOffsetReset: 'earliest',
  });
  
  messages.forEach(message => {
    console.log('Received:', message.value);
    // Message processing logic
  });
}

Using Vector (Vector Database)

import { Index } from '@upstash/vector';

const index = new Index({
  url: process.env.UPSTASH_VECTOR_REST_URL,
  token: process.env.UPSTASH_VECTOR_REST_TOKEN,
});

// Add vectors
async function addVectors() {
  await index.upsert([
    {
      id: 'doc1',
      vector: [0.1, 0.2, 0.3, 0.4],
      metadata: { title: 'Introduction to AI' }
    },
    {
      id: 'doc2', 
      vector: [0.2, 0.3, 0.4, 0.5],
      metadata: { title: 'Machine Learning Basics' }
    }
  ]);
}

// Similarity search
async function searchSimilar() {
  const queryVector = [0.15, 0.25, 0.35, 0.45];
  const results = await index.query({
    vector: queryVector,
    topK: 5,
    includeMetadata: true
  });
  
  results.forEach(result => {
    console.log(`Score: ${result.score}, Doc: ${result.metadata.title}`);
  });
}
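
Queries can also be narrowed with a metadata filter expression, which becomes useful once documents carry richer metadata. A short sketch assuming the title field stored above:

// Restrict results to vectors whose metadata matches a filter expression
async function searchWithFilter() {
  const results = await index.query({
    vector: [0.15, 0.25, 0.35, 0.45],
    topK: 5,
    includeMetadata: true,
    filter: "title = 'Machine Learning Basics'",
  });

  results.forEach(result => {
    console.log(`Score: ${result.score}, Doc: ${result.metadata.title}`);
  });
}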

Using QStash (Message Queue)

import { Client } from '@upstash/qstash';

// The SDK exposes a Client class; QSTASH_TOKEN comes from the Upstash console
const qstash = new Client({
  token: process.env.QSTASH_TOKEN,
});

// Send message
async function sendMessage() {
  await qstash.publishJSON({
    url: 'https://my-api.com/webhook',
    body: {
      action: 'send-email',
      to: '[email protected]',
      subject: 'Welcome!'
    },
    retries: 3,
    delay: '10s', // Deliver after 10 seconds
  });
}

// Setup CRON job
async function setupCron() {
  await qstash.schedules.create({
    destination: 'https://my-api.com/daily-report',
    cron: '0 9 * * *', // Daily at 9 AM
    body: JSON.stringify({ type: 'daily-report' })
  });
}
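
On the receiving side, QStash signs every delivery with the Upstash-Signature header, and the @upstash/qstash Receiver can check that signature before the webhook is processed. A framework-agnostic sketch, assuming the signing keys from the QStash console are exposed as environment variables:

import { Receiver } from '@upstash/qstash';

const receiver = new Receiver({
  currentSigningKey: process.env.QSTASH_CURRENT_SIGNING_KEY,
  nextSigningKey: process.env.QSTASH_NEXT_SIGNING_KEY,
});

// Verify and handle an incoming QStash delivery (Request/Response style handler)
async function handleWebhook(request) {
  const body = await request.text();
  const isValid = await receiver.verify({
    signature: request.headers.get('Upstash-Signature'),
    body,
  });

  if (!isValid) {
    return new Response('Invalid signature', { status: 401 });
  }

  const payload = JSON.parse(body);
  // ...process the message, e.g. send the email described in the payload
  return new Response('OK');
}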