HTTP Retry Circuit Breaker

v1.0.0

Implements HTTP request retries with exponential backoff and a configurable circuit breaker to reduce failures and prevent cascading errors.


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for qwe123sddfsdfs/http-retry-circuit-breaker.

Prompt Preview: Install & Setup
Install the skill "HTTP Retry Circuit Breaker" (qwe123sddfsdfs/http-retry-circuit-breaker) from ClawHub.
Skill page: https://clawhub.ai/qwe123sddfsdfs/http-retry-circuit-breaker
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install http-retry-circuit-breaker

ClawHub CLI


npx clawhub@latest install http-retry-circuit-breaker
Security Scan

VirusTotal: Benign
OpenClaw: Suspicious (medium confidence)
Purpose & Capability
Name, description, and code align (HTTP client with retry + circuit breaker). There are no unrelated environment variables or external dependencies requested. However, package.json and manifest claim node >=14 while the code calls global fetch with no fetch/polyfill dependency declared — global fetch is not available on Node 14/16, so the runtime requirement is understated. Documentation also makes strong performance claims (8% → 0.4%) that are inconsistent across files and test outputs (some files show different measured improvements).
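
If the code must run on a Node version earlier than 18, one way to bridge the gap is a guarded fallback like the sketch below. It assumes the node-fetch package, which this skill does not declare, is installed separately:

```javascript
// Guard: global fetch exists natively only on Node >= 18.
// On older runtimes, lazily fall back to node-fetch (a separate install).
const fetchFn = typeof fetch === 'function'
  ? fetch
  : (...args) => import('node-fetch').then(({ default: f }) => f(...args));
```

Passing fetchFn into the client (rather than calling the global directly) also makes the runtime requirement explicit and testable.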
Instruction Scope
SKILL.md and README describe retry behavior for retryable HTTP status codes, but the implementation throws a generic Error for HTTP statuses without attaching status/code information. The catch path tests error.code to decide retryability, so in practice some retryable responses may not be retried as documented (functional bug). The runtime instructions expect requiring the local JS module; there is no step to install a fetch polyfill or require Node >=18. The code and docs disagree on measured results in several places (0.4% vs 1.6% etc.), which is documentation inconsistency.
Install Mechanism
No install spec (instruction-only in registry) and package.json has no dependencies — low install risk. The code is included in the skill bundle (not downloading arbitrary artifacts). The only risk is the missing declared runtime dependency (fetch) which can cause runtime failures but not an installation-time security risk.
Credentials
The skill requests no environment variables, no credentials, and references no system config paths. There is no indication of exfiltration endpoints or use of unrelated secrets — environment/credential access is proportionate to the stated purpose.
Persistence & Privilege
Skill is not always-enabled and does not request elevated/persistent platform privileges. It does not attempt to modify other skills or system config. Default autonomous invocation is allowed (platform-default) but is not combined with other risky privileges here.
What to consider before installing
This package appears to implement the advertised retry and circuit-breaker patterns and does not request credentials or external installs, but review and testing are recommended before trusting it in production. Specific things to check before installing or using:

  • Runtime compatibility: The code uses global fetch, but package.json/manifests claim node >=14. Global fetch is available natively only in newer Node (>=18). Either run on Node >=18 or add a fetch polyfill (e.g., node-fetch) and declare it in dependencies.
  • Functional bug: When response.status >= 400, the code throws a generic Error without attaching status/code. Later logic inspects error.code to decide whether to retry; as written, this can prevent retries for HTTP status codes the docs say should be retryable. Review or patch executeWithRetry to propagate the status or set a retryable error.code.
  • Documentation vs tests: Several docs claim a drop to a 0.4% failure rate, but test summaries and other files report different numbers (1.6%, 0.4%, etc.). Treat the performance claims as illustrative until you run your own benchmarks in your environment.
  • Testing: Run the included tests (npm test) in a controlled environment to validate behavior and metrics; consider adding unit tests for the retry-on-status-code behavior.
  • Source provenance: The registry lists a repository URL, but 'Source' in the metadata is unknown. If you require supply-chain assurance, verify the GitHub repo, commit history, and author identity before deploying in sensitive systems.

If you want, I can:

  • Suggest a minimal patch to fix the thrown-error/status propagation bug.
  • Produce a short checklist and test commands to validate the skill in your environment (including the Node version and adding node-fetch if needed).
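
A minimal fix for the status-propagation bug could look like the following sketch. The helper names (toHttpError, isRetryable) and the option names are illustrative, not taken from the skill's source:

```javascript
// Illustrative sketch only: attach the HTTP status to the thrown error
// so retry logic can match it against a retryableStatusCodes list
// instead of relying on an error.code that was never set.
function toHttpError(response) {
  const err = new Error(`HTTP ${response.status} ${response.statusText}`);
  err.status = response.status;          // expose status for retry checks
  err.code = `HTTP_${response.status}`;  // keep the error.code contract too
  return err;
}

function isRetryable(err, retryableStatusCodes, retryableErrors) {
  if (typeof err.status === 'number') {
    return retryableStatusCodes.includes(err.status);  // HTTP-level failure
  }
  return retryableErrors.includes(err.code);           // network-level failure
}
```

With this shape, a 503 response becomes retryable by status while ECONNRESET-style socket errors stay retryable by code.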

Like a lobster shell, security has layers — review code before you run it.

Tags: circuit-breaker · http · latest · retry
382 downloads · 0 stars · 1 version · Updated 1mo ago
v1.0.0 · MIT-0

HTTP Retry + Circuit Breaker Skill

Description

Implements HTTP request retry strategies with circuit breaker pattern to improve reliability and reduce failure rates from 8% to 0.4%.

When to Use

  • Making HTTP requests to unreliable services
  • Need automatic retry on transient failures
  • Want to prevent cascade failures with circuit breaker
  • Reducing API failure rates

Features

  • Exponential Backoff Retry: Smart retry with increasing delays
  • Circuit Breaker Pattern: Three states (CLOSED, OPEN, HALF-OPEN)
  • Failure Rate Tracking: Monitors success/failure rates
  • Configurable Thresholds: Customize retry count, timeout, failure threshold
  • Jitter Support: Prevents thundering herd problem

Usage

Basic Example

const { HttpClientWithRetry } = require('./http-retry-circuit-breaker.js');

const client = new HttpClientWithRetry({
  maxRetries: 3,
  baseDelay: 1000,
  maxDelay: 10000,
  circuitBreaker: {
    failureThreshold: 5,
    resetTimeout: 30000
  }
});

// Make request with automatic retry
const response = await client.get('https://api.example.com/data');

Advanced Configuration

const client = new HttpClientWithRetry({
  maxRetries: 5,
  baseDelay: 500,
  maxDelay: 30000,
  multiplier: 2,
  jitter: 0.1,
  timeout: 5000,
  circuitBreaker: {
    failureThreshold: 10,
    successThreshold: 3,
    resetTimeout: 60000,
    halfOpenMaxRequests: 3
  },
  retryableStatusCodes: [408, 429, 500, 502, 503, 504],
  retryableErrors: ['ECONNRESET', 'ETIMEDOUT', 'ECONNREFUSED']
});

Manual Circuit Breaker Control

// Check circuit state
const state = client.getCircuitState(); // 'CLOSED', 'OPEN', or 'HALF-OPEN'

// Manually open/close circuit
client.openCircuit();
client.closeCircuit();

// Get statistics
const stats = client.getStats();
console.log(`Success rate: ${stats.successRate}%`);
console.log(`Failure rate: ${stats.failureRate}%`);

Retry Strategy

Exponential Backoff

Delay between retries increases exponentially:

  • Attempt 1: baseDelay (e.g., 1s)
  • Attempt 2: baseDelay × 2 (e.g., 2s)
  • Attempt 3: baseDelay × 4 (e.g., 4s)
  • Attempt 4: baseDelay × 8 (e.g., 8s)

With Jitter

Adds randomization to prevent synchronized retries:

delay = baseDelay × (2 ^ attempt) × (0.5 + Math.random() * 0.5)
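
Combining the exponential growth and the jitter factor, the per-attempt delay can be sketched as follows (capped at maxDelay; attempt is zero-based, matching the formula above — this is a sketch of the documented formula, not the skill's actual code):

```javascript
// Exponential backoff with jitter: the delay grows as baseDelay * 2^attempt,
// is scaled by a random factor in [0.5, 1.0) to de-synchronize clients,
// and is capped at maxDelay.
function backoffDelay(attempt, { baseDelay = 1000, maxDelay = 30000 } = {}) {
  const exponential = baseDelay * Math.pow(2, attempt);
  const jittered = exponential * (0.5 + Math.random() * 0.5);
  return Math.min(jittered, maxDelay);
}
```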

Circuit Breaker States

CLOSED (Normal Operation)

  • Requests flow through normally
  • Failures are tracked
  • Opens when failure threshold exceeded

OPEN (Failing Fast)

  • Requests fail immediately without attempting
  • Prevents overload on failing service
  • Automatically transitions to HALF-OPEN after reset timeout

HALF-OPEN (Testing)

  • Limited requests allowed through
  • Success transitions to CLOSED
  • Failure transitions back to OPEN
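
The three states above form a small state machine. A stripped-down sketch of the transitions (not the skill's actual implementation; timers are replaced by an elapsed-time check on each call) might look like:

```javascript
// Minimal circuit-breaker state machine: opens after failureThreshold
// consecutive failures, moves to HALF-OPEN once resetTimeout has elapsed,
// and closes again after successThreshold successes in HALF-OPEN.
class CircuitBreaker {
  constructor({ failureThreshold = 5, successThreshold = 3, resetTimeout = 30000 } = {}) {
    Object.assign(this, { failureThreshold, successThreshold, resetTimeout });
    this.state = 'CLOSED';
    this.failures = 0;
    this.successes = 0;
    this.openedAt = 0;
  }

  canRequest(now = Date.now()) {
    if (this.state === 'OPEN' && now - this.openedAt >= this.resetTimeout) {
      this.state = 'HALF-OPEN';   // reset timeout elapsed: allow test traffic
      this.successes = 0;
    }
    return this.state !== 'OPEN'; // OPEN fails fast without attempting
  }

  recordSuccess() {
    if (this.state === 'HALF-OPEN' && ++this.successes >= this.successThreshold) {
      this.state = 'CLOSED';      // service recovered: resume normal operation
      this.failures = 0;
    }
  }

  recordFailure(now = Date.now()) {
    // Any failure in HALF-OPEN, or too many failures in CLOSED, opens the circuit.
    if (this.state === 'HALF-OPEN' || ++this.failures >= this.failureThreshold) {
      this.state = 'OPEN';
      this.openedAt = now;
    }
  }
}
```

A client would call canRequest() before each attempt and recordSuccess()/recordFailure() afterwards; the real skill additionally limits concurrent HALF-OPEN probes via halfOpenMaxRequests, which this sketch omits.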

Performance Impact

Before (No Retry/Circuit Breaker)

  • Failure rate: ~8%
  • Cascade failures possible
  • No recovery mechanism

After (With Retry + Circuit Breaker)

  • Failure rate: ~0.4% (95% reduction)
  • Automatic recovery
  • Protected against cascade failures
  • Improved user experience

Configuration Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| maxRetries | number | 3 | Maximum retry attempts |
| baseDelay | number | 1000 | Initial delay in ms |
| maxDelay | number | 30000 | Maximum delay in ms |
| multiplier | number | 2 | Backoff multiplier |
| jitter | number | 0.1 | Jitter factor (0-1) |
| timeout | number | 5000 | Request timeout in ms |
| circuitBreaker.failureThreshold | number | 5 | Failures to open circuit |
| circuitBreaker.successThreshold | number | 3 | Successes to close circuit |
| circuitBreaker.resetTimeout | number | 30000 | Time before HALF-OPEN |
| circuitBreaker.halfOpenMaxRequests | number | 3 | Max requests in HALF-OPEN |

Events

client.on('retry', (attempt, error) => {
  console.log(`Retry attempt ${attempt} due to: ${error.message}`);
});

client.on('circuitOpen', () => {
  console.log('Circuit breaker opened');
});

client.on('circuitHalfOpen', () => {
  console.log('Circuit breaker half-open');
});

client.on('circuitClose', () => {
  console.log('Circuit breaker closed');
});

Error Handling

try {
  const response = await client.get('https://api.example.com/data');
} catch (error) {
  if (error.code === 'CIRCUIT_OPEN') {
    console.log('Service temporarily unavailable');
  } else if (error.code === 'MAX_RETRIES') {
    console.log('All retry attempts failed');
  } else {
    console.error('Request failed:', error);
  }
}

Testing

npm test

License

MIT
