learn.sol

Optimizing RPC Performance for Production

Master performance optimization techniques for GetBlock RPC to build lightning-fast Solana applications that scale with reliable Blockchain-as-a-Service infrastructure.

What You'll Learn

  • Commitment level selection for different use cases
  • Caching strategies to reduce RPC calls
  • Connection pooling for high-throughput apps
  • Performance patterns for DeFi, NFTs, and gaming
  • Real-world optimization case studies

Prerequisites

  • Working knowledge of TypeScript and @solana/web3.js
  • A Solana RPC endpoint (the examples read one from NEXT_PUBLIC_SOLANA_RPC_MAINNET)
Understanding Commitment Levels

Choosing the right commitment level is crucial for performance:

commitment-selection.ts
import { Connection, Commitment, PublicKey } from "@solana/web3.js";

interface UseCase {
  name: string;
  commitment: Commitment;
  reason: string;
  avgLatency: string;
}

const COMMITMENT_USE_CASES: UseCase[] = [
  {
    name: "Real-time price feeds",
    commitment: "processed",
    reason: "Need immediate updates, can handle occasional rollbacks",
    avgLatency: "lowest (queries the most recent slot)"
  },
  {
    name: "DeFi transactions",
    commitment: "confirmed",
    reason: "Balance between speed and certainty",
    avgLatency: "~0.5-1s (one to two slots behind the tip)"
  },
  {
    name: "NFT minting/transfers",
    commitment: "confirmed",
    reason: "Fast confirmation with low rollback risk",
    avgLatency: "~0.5-1s (one to two slots behind the tip)"
  },
  {
    name: "Settlement/finality",
    commitment: "finalized",
    reason: "Absolute certainty required",
    avgLatency: "~13 seconds (≈32 slots)"
  },
  {
    name: "Account polling",
    commitment: "confirmed",
    reason: "Good balance for most monitoring",
    avgLatency: "~0.5-1s (one to two slots behind the tip)"
  },
];

// Dynamic commitment selection based on use case
class SmartConnection {
  private connections: Map<Commitment, Connection>;
  
  constructor(endpoint: string) {
    this.connections = new Map([
      ["processed", new Connection(endpoint, "processed")],
      ["confirmed", new Connection(endpoint, "confirmed")],
      ["finalized", new Connection(endpoint, "finalized")],
    ]);
  }
  
  getConnection(commitment: Commitment): Connection {
    return this.connections.get(commitment)!;
  }
  
  // Fast reads (can tolerate rollbacks)
  async getFastAccountInfo(publicKey: PublicKey) {
    return this.connections.get("processed")!.getAccountInfo(publicKey);
  }
  
  // Reliable reads (recommended for most cases)
  async getReliableAccountInfo(publicKey: PublicKey) {
    return this.connections.get("confirmed")!.getAccountInfo(publicKey);
  }
  
  // Finalized reads (for critical operations)
  async getFinalizedAccountInfo(publicKey: PublicKey) {
    return this.connections.get("finalized")!.getAccountInfo(publicKey);
  }
}

Performance Tip
Use processed for UI updates that are immediately visible but not critical. Use confirmed for transaction submissions. Reserve finalized for operations where certainty is paramount.
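Holding one Connection per commitment level works, but it is optional: most web3.js read methods also accept a per-call commitment override. The sketch below maps a use case to a commitment level, mirroring the table above; the use-case names and the `commitmentFor` helper are illustrative, not part of web3.js.

```typescript
// Map a use case to a commitment level, mirroring the table above.
// The string values match @solana/web3.js's Commitment type.
type Level = "processed" | "confirmed" | "finalized";
type UseCaseKind = "price-feed" | "transaction" | "settlement";

function commitmentFor(useCase: UseCaseKind): Level {
  switch (useCase) {
    case "price-feed":
      return "processed"; // immediate updates, rollback-tolerant
    case "transaction":
      return "confirmed"; // balance of speed and certainty
    case "settlement":
      return "finalized"; // absolute certainty
  }
}

// With a single Connection, the commitment can be overridden per request:
//   connection.getAccountInfo(publicKey, commitmentFor("settlement"));
```

This keeps one Connection (and one WebSocket, if you subscribe) while still tuning freshness per call.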


Caching Strategies

Reduce RPC calls with intelligent caching:

caching.ts
import { Connection, PublicKey, AccountInfo } from "@solana/web3.js";

interface CacheEntry<T> {
  data: T;
  timestamp: number;
  expiresAt: number;
}

class CachedConnection {
  private connection: Connection;
  private cache: Map<string, CacheEntry<any>> = new Map();
  private readonly DEFAULT_TTL = 5000; // 5 seconds
  
  constructor(endpoint: string) {
    this.connection = new Connection(endpoint, "confirmed");
    
    // Cleanup expired entries every minute
    setInterval(() => this.cleanup(), 60000);
  }
  
  private getCacheKey(method: string, args: any[]): string {
    return `${method}:${JSON.stringify(args)}`;
  }
  
  private isExpired(entry: CacheEntry<any>): boolean {
    return Date.now() > entry.expiresAt;
  }
  
  private cleanup() {
    const now = Date.now();
    for (const [key, entry] of this.cache.entries()) {
      if (now > entry.expiresAt) {
        this.cache.delete(key);
      }
    }
  }
  
  async getAccountInfo(
    publicKey: PublicKey,
    ttl: number = this.DEFAULT_TTL
  ): Promise<AccountInfo<Buffer> | null> {
    const cacheKey = this.getCacheKey("getAccountInfo", [publicKey.toBase58()]);
    const cached = this.cache.get(cacheKey);
    
    if (cached && !this.isExpired(cached)) {
      console.log("Cache hit:", publicKey.toBase58());
      return cached.data;
    }
    
    console.log("Cache miss, fetching:", publicKey.toBase58());
    const data = await this.connection.getAccountInfo(publicKey);
    
    this.cache.set(cacheKey, {
      data,
      timestamp: Date.now(),
      expiresAt: Date.now() + ttl,
    });
    
    return data;
  }
  
  async getMultipleAccountsWithCache(
    publicKeys: PublicKey[],
    ttl: number = this.DEFAULT_TTL
  ): Promise<(AccountInfo<Buffer> | null)[]> {
    const results: (AccountInfo<Buffer> | null)[] = new Array(publicKeys.length);
    const uncachedIndices: number[] = [];
    const uncachedKeys: PublicKey[] = [];
    
    // Check cache for each key
    publicKeys.forEach((pk, index) => {
      const cacheKey = this.getCacheKey("getAccountInfo", [pk.toBase58()]);
      const cached = this.cache.get(cacheKey);
      
      if (cached && !this.isExpired(cached)) {
        results[index] = cached.data;
      } else {
        uncachedIndices.push(index);
        uncachedKeys.push(pk);
      }
    });
    
    // Fetch uncached accounts in one batch
    if (uncachedKeys.length > 0) {
      const fetchedAccounts = await this.connection.getMultipleAccountsInfo(uncachedKeys);
      
      fetchedAccounts.forEach((account, i) => {
        const originalIndex = uncachedIndices[i];
        results[originalIndex] = account;
        
        // Cache the result
        const cacheKey = this.getCacheKey("getAccountInfo", [uncachedKeys[i].toBase58()]);
        this.cache.set(cacheKey, {
          data: account,
          timestamp: Date.now(),
          expiresAt: Date.now() + ttl,
        });
      });
    }
    
    return results;
  }
  
  // Invalidate cache for specific account (useful after mutations)
  invalidate(publicKey: PublicKey) {
    const cacheKey = this.getCacheKey("getAccountInfo", [publicKey.toBase58()]);
    this.cache.delete(cacheKey);
  }
  
  // Clear entire cache
  clearCache() {
    this.cache.clear();
  }
  
  getCacheStats() {
    return {
      size: this.cache.size,
      entries: Array.from(this.cache.entries()).map(([key, entry]) => ({
        key,
        age: Date.now() - entry.timestamp,
        ttl: entry.expiresAt - Date.now(),
      })),
    };
  }
}

// Usage with different TTLs for different data types
const cachedConnection = new CachedConnection(
  process.env.NEXT_PUBLIC_SOLANA_RPC_MAINNET!
);

// Static data (program accounts) - cache longer
await cachedConnection.getAccountInfo(programId, 60000); // 1 minute

// Dynamic data (user balances) - cache shorter
await cachedConnection.getAccountInfo(userAccount, 5000); // 5 seconds

// Very dynamic data (token prices) - minimal cache
await cachedConnection.getAccountInfo(priceAccount, 1000); // 1 second

Connection Pooling

For high-throughput applications, use connection pools:

connection-pool.ts
import { Connection, ConnectionConfig, PublicKey, AccountInfo } from "@solana/web3.js";

class ConnectionPool {
  private connections: Connection[] = [];
  private currentIndex: number = 0;
  private readonly poolSize: number;
  
  constructor(
    endpoint: string,
    poolSize: number = 5,
    config?: ConnectionConfig
  ) {
    this.poolSize = poolSize;
    
    // Create pool of connections
    for (let i = 0; i < poolSize; i++) {
      this.connections.push(new Connection(endpoint, config));
    }
    
    console.log(`Created connection pool with ${poolSize} connections`);
  }
  
  getConnection(): Connection {
    // Round-robin selection
    const connection = this.connections[this.currentIndex];
    this.currentIndex = (this.currentIndex + 1) % this.poolSize;
    return connection;
  }
  
  async executeParallel<T>(
    operations: (() => Promise<T>)[]
  ): Promise<T[]> {
    // Distribute operations across pool
    const chunks = this.chunkArray(operations, this.poolSize);
    const results: T[] = [];
    
    for (const chunk of chunks) {
      const chunkResults = await Promise.all(
        chunk.map(op => op())
      );
      results.push(...chunkResults);
    }
    
    return results;
  }
  
  private chunkArray<T>(array: T[], size: number): T[][] {
    const chunks: T[][] = [];
    for (let i = 0; i < array.length; i += size) {
      chunks.push(array.slice(i, i + size));
    }
    return chunks;
  }
  
  async getAllAccountInfos(publicKeys: PublicKey[]): Promise<(AccountInfo<Buffer> | null)[]> {
    const operations = publicKeys.map(pk => 
      async () => {
        const conn = this.getConnection();
        return conn.getAccountInfo(pk);
      }
    );
    
    return this.executeParallel(operations);
  }
}

// Usage for high-throughput scenarios
const pool = new ConnectionPool(
  process.env.NEXT_PUBLIC_SOLANA_RPC_MAINNET!,
  10, // 10 concurrent connections
  { commitment: "confirmed" }
);

// Fetch 1000 accounts efficiently
const accounts = await pool.getAllAccountInfos(thousandPublicKeys);

DeFi-Specific Optimizations

Performance patterns for DeFi applications:

defi-optimizations.ts
import { Connection, PublicKey, Transaction, Signer } from "@solana/web3.js";

class DeFiOptimizedConnection {
  private connection: Connection;
  private processedConnection: Connection;
  private priceCache: Map<string, { price: number; timestamp: number }> = new Map();
  private readonly PRICE_CACHE_TTL = 1000; // 1 second for prices
  
  constructor(endpoint: string) {
    this.connection = new Connection(endpoint, {
      commitment: "confirmed",
      confirmTransactionInitialTimeout: 60000,
    });
    
    // Reuse one processed-commitment connection instead of creating a new one per call
    this.processedConnection = new Connection(endpoint, "processed");
  }
  
  // Optimized for AMM pool data: processed gives the freshest view
  async getPoolState(poolAddress: PublicKey) {
    return this.processedConnection.getAccountInfo(poolAddress);
  }
  
  // Batch fetch multiple token prices
  async getBatchTokenPrices(tokenMints: PublicKey[]): Promise<Map<string, number>> {
    const prices = new Map<string, number>();
    const now = Date.now();
    const toFetch: PublicKey[] = [];
    
    // Check cache first
    tokenMints.forEach(mint => {
      const cached = this.priceCache.get(mint.toBase58());
      if (cached && (now - cached.timestamp) < this.PRICE_CACHE_TTL) {
        prices.set(mint.toBase58(), cached.price);
      } else {
        toFetch.push(mint);
      }
    });
    
    // Fetch uncached prices
    if (toFetch.length > 0) {
      // Fetch price accounts in parallel
      const priceAccounts = await this.connection.getMultipleAccountsInfo(toFetch);
      
      priceAccounts.forEach((account, i) => {
        if (account) {
          // Parse price from account data (implementation depends on oracle)
          const price = this.parsePrice(account.data);
          const mint = toFetch[i].toBase58();
          
          prices.set(mint, price);
          this.priceCache.set(mint, { price, timestamp: now });
        }
      });
    }
    
    return prices;
  }
  
  private parsePrice(data: Buffer): number {
    // Implement based on your price oracle (Pyth, Switchboard, etc.)
    // This is a placeholder
    return 0;
  }
  
  // Subscribe to pool updates for real-time trading
  subscribeToPoolUpdates(
    poolAddress: PublicKey,
    callback: (data: any) => void
  ): number {
    return this.connection.onAccountChange(
      poolAddress,
      (accountInfo) => {
        callback(accountInfo);
      },
      "processed" // Fastest updates for trading
    );
  }
  
  // Optimized transaction sending for time-sensitive trades
  async sendTradeTransaction(
    transaction: Transaction,
    signers: Signer[]
  ): Promise<string> {
    // Skip preflight for speed
    const signature = await this.connection.sendTransaction(
      transaction,
      signers,
      {
        skipPreflight: true,
        preflightCommitment: "processed",
        maxRetries: 0, // Don't retry, send immediately
      }
    );
    
    // Confirm in background
    this.connection.confirmTransaction(signature, "confirmed")
      .catch(err => console.error("Transaction confirmation failed:", err));
    
    return signature;
  }
}

DeFi Performance Trade-offs
Using processed commitment and skipping preflight gives maximum speed but increases risk. Use this pattern only for time-sensitive operations where speed is critical.
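One way to contain that risk is to verify in the background that a skipped-preflight transaction actually landed. Below is a generic polling sketch; the helper name, timeout, and interval are illustrative assumptions. In practice, `check` would call `connection.getSignatureStatuses([signature])` and return the status once it reaches the desired commitment.

```typescript
// Poll a check until it yields a non-null result or the deadline passes.
// For trades, `check` would query the signature status and return it
// once the transaction is confirmed; null means "not there yet".
async function pollUntil<T>(
  check: () => Promise<T | null>,
  timeoutMs = 30_000,
  intervalMs = 500
): Promise<T | null> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const result = await check();
    if (result !== null) return result;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return null; // timed out: treat the trade as failed and alert
}
```

A timeout here should feed your alerting, since a dropped transaction with skipPreflight produces no error on send.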


NFT-Specific Optimizations

Performance patterns for NFT applications:

nft-optimizations.ts
import { Connection, PublicKey } from "@solana/web3.js";
import { Metaplex } from "@metaplex-foundation/js";

class NFTOptimizedConnection {
  private connection: Connection;
  private metaplex: Metaplex;
  private metadataCache: Map<string, any> = new Map();
  
  constructor(endpoint: string) {
    this.connection = new Connection(endpoint, "confirmed");
    this.metaplex = new Metaplex(this.connection);
  }
  
  // Batch fetch NFT metadata efficiently
  async getBatchNFTMetadata(mintAddresses: PublicKey[]): Promise<any[]> {
    // Check cache
    const uncached: PublicKey[] = [];
    const results: any[] = new Array(mintAddresses.length);
    
    mintAddresses.forEach((mint, index) => {
      const cached = this.metadataCache.get(mint.toBase58());
      if (cached) {
        results[index] = cached;
      } else {
        uncached.push(mint);
      }
    });
    
    // Fetch uncached in batches of 100
    const BATCH_SIZE = 100;
    for (let i = 0; i < uncached.length; i += BATCH_SIZE) {
      const batch = uncached.slice(i, i + BATCH_SIZE);
      
      const metadatas = await this.metaplex.nfts().findAllByMintList({
        mints: batch,
      });
      
      metadatas.forEach((metadata, batchIndex) => {
        const originalIndex = mintAddresses.findIndex(
          m => m.equals(batch[batchIndex])
        );
        
        results[originalIndex] = metadata;
        this.metadataCache.set(batch[batchIndex].toBase58(), metadata);
      });
    }
    
    return results;
  }
  
  // Optimized for NFT collection queries
  async getCollectionNFTs(
    collectionAddress: PublicKey,
    limit: number = 100
  ): Promise<any[]> {
    // Use getProgramAccounts with filters for efficiency
    const TOKEN_METADATA_PROGRAM = new PublicKey(
      "metaqbxxUerdq28cj1RbAWkYQm3ybzjb6a8bt518x1s"
    );
    
    const accounts = await this.connection.getProgramAccounts(
      TOKEN_METADATA_PROGRAM,
      {
        filters: [
          {
            memcmp: {
              offset: 326, // Collection field offset (layout-dependent; shifts with the creators array)
              bytes: collectionAddress.toBase58(),
            },
            },
          },
        ],
        dataSlice: { offset: 0, length: 0 }, // Only fetch pubkeys first
      }
    );
    
    // Then fetch full metadata for first `limit` items
    const mints = accounts.slice(0, limit).map(a => a.pubkey);
    return this.getBatchNFTMetadata(mints);
  }
  
  // Monitor NFT transfers in real-time
  subscribeToNFTTransfers(
    mintAddress: PublicKey,
    callback: (transfer: any) => void
  ): number {
    // Subscribe to token account changes
    return this.connection.onProgramAccountChange(
      new PublicKey("TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA"),
      (accountInfo) => {
        // Parse and filter for this mint
        callback(accountInfo);
      },
      "confirmed",
      [
        {
          memcmp: {
            offset: 0,
            bytes: mintAddress.toBase58(),
          },
        },
      ]
    );
  }
}

Gaming-Specific Optimizations

Performance patterns for gaming applications:

gaming-optimizations.ts
import { Connection, PublicKey, Transaction, Signer } from "@solana/web3.js";

class GamingOptimizedConnection {
  private connection: Connection;
  private stateCache: Map<string, { state: any; slot: number }> = new Map();
  
  constructor(endpoint: string) {
    // Use processed for real-time gaming
    this.connection = new Connection(endpoint, "processed");
  }
  
  // Ultra-low latency game state updates
  async getGameState(gameAccount: PublicKey): Promise<any> {
    // getAccountInfoAndContext returns the slot alongside the data,
    // saving a separate getSlot round trip
    const { context, value: accountInfo } =
      await this.connection.getAccountInfoAndContext(gameAccount);
    
    if (!accountInfo) return null;
    
    // Parse game state from account data
    const gameState = this.parseGameState(accountInfo.data);
    
    // Cache with slot number for conflict resolution
    this.stateCache.set(gameAccount.toBase58(), { state: gameState, slot: context.slot });
    
    return gameState;
  }
  
  private parseGameState(data: Buffer): any {
    // Implement based on your game state structure
    return {};
  }
  
  // Subscribe to game events with minimal latency
  subscribeToGameEvents(
    gameAccount: PublicKey,
    callback: (event: any) => void
  ): number {
    return this.connection.onAccountChange(
      gameAccount,
      (accountInfo, context) => {
        const gameState = this.parseGameState(accountInfo.data);
        callback({ gameState, slot: context.slot });
      },
      "processed" // Fastest possible updates
    );
  }
  
  // Send player actions with minimal validation for speed
  async sendGameAction(
    transaction: Transaction,
    signers: Signer[]
  ): Promise<string> {
    // Sign before serializing: serialize() requires a fully signed transaction
    transaction.sign(...signers);
    
    const signature = await this.connection.sendRawTransaction(
      transaction.serialize(),
      {
        skipPreflight: true,
        preflightCommitment: "processed",
      }
    );
    
    return signature;
  }
  
  // Predictive state management for smooth gameplay
  async getPredictedState(gameAccount: PublicKey): Promise<any> {
    const cached = this.stateCache.get(gameAccount.toBase58());
    
    if (cached) {
      // Return cached state immediately for responsiveness;
      // refresh in the background (log errors instead of throwing)
      this.getGameState(gameAccount).catch(err =>
        console.error("Background state refresh failed:", err)
      );
      return cached.state;
    }
    
    // No cache, fetch fresh
    return this.getGameState(gameAccount);
  }
}

Performance Monitoring Dashboard

Create a real-time performance dashboard:

performance-dashboard.ts
import { Connection } from "@solana/web3.js";

interface PerformanceSnapshot {
  timestamp: number;
  tps: number;
  avgLatency: number;
  p95Latency: number;
  activeSubscriptions: number;
  cacheHitRate: number;
  errorRate: number;
}

class PerformanceDashboard {
  private connection: Connection;
  private snapshots: PerformanceSnapshot[] = [];
  private maxSnapshots = 100;
  
  constructor(connection: Connection) {
    this.connection = connection;
    
    // Collect metrics every 10 seconds
    setInterval(() => this.collectMetrics(), 10000);
  }
  
  private async collectMetrics() {
    const snapshot: PerformanceSnapshot = {
      timestamp: Date.now(),
      tps: await this.calculateTPS(),
      avgLatency: this.calculateAvgLatency(),
      p95Latency: this.calculateP95Latency(),
      activeSubscriptions: this.getActiveSubscriptions(),
      cacheHitRate: this.getCacheHitRate(),
      errorRate: this.getErrorRate(),
    };
    
    this.snapshots.push(snapshot);
    
    if (this.snapshots.length > this.maxSnapshots) {
      this.snapshots.shift();
    }
  }
  
  private async calculateTPS(): Promise<number> {
    const perfSamples = await this.connection.getRecentPerformanceSamples(1);
    const sample = perfSamples[0];
    return sample ? sample.numTransactions / sample.samplePeriodSecs : 0;
  }
  
  private calculateAvgLatency(): number {
    // Implement based on your metrics collection
    return 0;
  }
  
  private calculateP95Latency(): number {
    // Implement based on your metrics collection
    return 0;
  }
  
  private getActiveSubscriptions(): number {
    // Track your WebSocket subscriptions
    return 0;
  }
  
  private getCacheHitRate(): number {
    // Calculate from your cache implementation
    return 0;
  }
  
  private getErrorRate(): number {
    // Calculate from error tracking
    return 0;
  }
  
  // Export for visualization
  getMetricsHistory(): PerformanceSnapshot[] {
    return [...this.snapshots];
  }
  
  // Alert on anomalies
  checkHealth(): { healthy: boolean; issues: string[] } {
    const latest = this.snapshots[this.snapshots.length - 1];
    const issues: string[] = [];
    
    if (!latest) {
      return { healthy: false, issues: ["No metrics collected yet"] };
    }
    
    if (latest.avgLatency > 100) {
      issues.push("High average latency detected");
    }
    
    if (latest.errorRate > 5) {
      issues.push("High error rate detected");
    }
    
    if (latest.cacheHitRate < 50) {
      issues.push("Low cache hit rate");
    }
    
    return {
      healthy: issues.length === 0,
      issues,
    };
  }
}

Production Optimization Checklist

Before deploying optimized code:

Commitment Levels

  • Use processed only for non-critical real-time updates
  • Default to confirmed for most operations
  • Reserve finalized for critical finality requirements

Caching

  • Implement TTL-based caching for frequently accessed data
  • Invalidate cache after mutations
  • Monitor cache hit rates
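To act on the "monitor cache hit rates" item, a counter like the sketch below can be wired into a cache wrapper such as CachedConnection above (increment on each hit and miss in getAccountInfo). The class name and API are illustrative.

```typescript
// Minimal hit/miss tracker; call recordHit()/recordMiss() from the cache wrapper.
class CacheStats {
  private hits = 0;
  private misses = 0;

  recordHit(): void {
    this.hits++;
  }

  recordMiss(): void {
    this.misses++;
  }

  // Hit rate as a percentage; 0 before any lookups are recorded.
  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : (this.hits / total) * 100;
  }
}
```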

Connection Management

  • Use connection pools for high-throughput scenarios
  • Implement proper connection recycling
  • Monitor connection health
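A simple way to monitor connection health is to periodically time a lightweight call such as getSlot() against each pooled connection and recycle any that respond too slowly. The interface, helper names, and 500 ms threshold below are illustrative assumptions; a web3.js Connection satisfies the interface.

```typescript
// Anything exposing getSlot() can be probed (a web3.js Connection qualifies).
interface SlotSource {
  getSlot(): Promise<number>;
}

// Time a single lightweight RPC round trip.
async function probeLatencyMs(client: SlotSource): Promise<number> {
  const start = Date.now();
  await client.getSlot();
  return Date.now() - start;
}

// Connections over the threshold are candidates for recycling.
function isHealthy(latencyMs: number, thresholdMs = 500): boolean {
  return latencyMs >= 0 && latencyMs <= thresholdMs;
}
```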

Error Handling

  • Always implement retry logic
  • Have failover endpoints ready
  • Log performance metrics
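The retry and failover items can be combined in one helper: try each endpoint in order, with exponential backoff between attempts. This is a sketch under stated assumptions (helper name, retry counts, and delays are illustrative); in practice `op` would build a Connection for the endpoint and issue the RPC call.

```typescript
// Try `op` against each endpoint in order, retrying with exponential backoff.
async function withFailover<T>(
  endpoints: string[],
  op: (endpoint: string) => Promise<T>,
  retriesPerEndpoint = 2,
  baseDelayMs = 250
): Promise<T> {
  let lastError: unknown = new Error("no endpoints configured");
  for (const endpoint of endpoints) {
    for (let attempt = 0; attempt <= retriesPerEndpoint; attempt++) {
      try {
        return await op(endpoint);
      } catch (err) {
        lastError = err;
        // Backoff: 250ms, 500ms, 1000ms, ... before the next attempt
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```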

Monitoring

  • Track latency percentiles (P50, P95, P99)
  • Monitor RPC request distribution
  • Set up alerts for anomalies
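The placeholder latency methods in the dashboard above can be backed by a percentile function over a sliding window of recorded request latencies. Below is a nearest-rank sketch; the method choice is an assumption (interpolating variants also work).

```typescript
// Nearest-rank percentile: p in [0, 100] over a window of latency samples.
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) return 0;
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Example: compute P50/P95/P99 from the same window
//   const p95 = percentile(latencyWindowMs, 95);
```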

Summary

You've learned production-grade performance optimization including:

✅ Commitment level selection for different use cases
✅ Intelligent caching strategies to reduce RPC load
✅ Connection pooling for high-throughput applications
✅ Domain-specific patterns (DeFi, NFTs, Gaming)
✅ Real-time performance monitoring

Your application is now optimized for maximum performance with GetBlock's infrastructure!
