Wolfram Alpha (LLM API)

Delegate precise, formalizable computations and factual lookups to Wolfram|Alpha via its LLM API (HTTP) to get verified results and reduce arithmetic/modelin...

MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name/description match the code and instructions. Required binary (python3) and required env var (WOLFRAM_APP_ID) are appropriate for calling the Wolfram|Alpha LLM API. No unrelated services or credentials are requested.
Instruction Scope
SKILL.md instructs use of the bundled wrapper script and documents parameters, localization options, and caching. Instructions do not ask the agent to read unrelated files or secrets. Behavior stays within the stated purpose (making API calls and returning results).
Install Mechanism
There is no external install step or download; the skill is instruction-only with a bundled Python script. No remote code fetch or archive extraction is performed.
Credentials
Only WOLFRAM_APP_ID is required. That single credential is proportional and necessary for the stated API usage. No other tokens/keys/config paths are requested.
Persistence & Privilege
The wrapper uses a local cache directory (~/.cache/openclaw-wolfram-alpha/) and writes responses there by default (7d TTL). This is reasonable for quota savings, but cached responses may contain sensitive query results and should be considered when sharing a machine or backups.
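The 7-day TTL behavior described above follows a common pattern: a cached response is reused only while its file is younger than the TTL. A minimal sketch of that pattern (an illustration, not the wrapper's exact code):

```python
import os
import time

CACHE_TTL_SECONDS = 7 * 24 * 3600  # 604800, the documented default TTL

def is_cache_fresh(path, ttl=CACHE_TTL_SECONDS, now=None):
    """A cached response file is usable if it exists and is younger than the TTL."""
    if not os.path.exists(path):
        return False
    age = (now if now is not None else time.time()) - os.path.getmtime(path)
    return age < ttl
```

Because freshness is judged per file, deleting the cache directory (or individual entries) is always safe; the next call simply refetches.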
Assessment
This skill appears to do what it says and only needs python3 plus your WOLFRAM_APP_ID. Before installing:

  1. Prefer the default --auth bearer mode so your AppID is sent in an Authorization header, keeping it out of URLs and logs. Avoid --auth query, because that places the AppID in the URL and the script prints the URL on HTTP errors.
  2. Be aware the skill writes cached API responses to ~/.cache/openclaw-wolfram-alpha/ (default TTL 7 days); treat that directory as potentially containing sensitive outputs.
  3. The skill makes outbound HTTPS calls to https://www.wolframalpha.com/api/v1/llm-api; ensure your environment/network policy allows this and that you trust the Wolfram AppID's permissions.
  4. The skill source is included in the package (you can inspect scripts/wa_llm.py); if you need higher assurance, review the code yourself.

If any of these concerns are unacceptable (e.g., local caching or outbound network calls), do not install the skill, or run it with the cache disabled and/or in a restricted environment.
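The difference between the two auth modes can be illustrated with the standard library. This sketch only constructs the requests (it does not send them); the endpoint URL is the one named above, and the parameter names mirror the wrapper's flags:

```python
import urllib.parse
import urllib.request

ENDPOINT = "https://www.wolframalpha.com/api/v1/llm-api"

def build_request(query, app_id, auth="bearer"):
    """Bearer mode keeps the AppID in a header; query mode embeds it in the URL."""
    params = {"input": query}
    if auth == "query":
        params["appid"] = app_id  # the AppID becomes part of the URL (and any logged URL)
    url = ENDPOINT + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url)
    if auth == "bearer":
        req.add_header("Authorization", f"Bearer {app_id}")
    return req
```

Since the script prints the URL on HTTP errors, query mode would expose the AppID in that output, which is why bearer mode is the safer default.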


Current version: v0.1.0

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

Runtime requirements

📐 Clawdis
Bins: python3
Env: WOLFRAM_APP_ID

SKILL.md

Wolfram|Alpha (LLM API) skill

Use the bundled wrapper script to call Wolfram|Alpha's LLM API and return concise, model-ingestible results.

Preconditions

  • Environment variable WOLFRAM_APP_ID must be set (your Wolfram|Alpha AppID). If it is not set, ask your human to set it (do not guess or hardcode keys).
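The precondition check can be sketched in a few lines (the error message here is illustrative guidance, not the script's actual output):

```python
import os

def require_app_id():
    """Return the Wolfram|Alpha AppID from the environment, or raise with guidance."""
    app_id = os.environ.get("WOLFRAM_APP_ID")
    if not app_id:
        raise RuntimeError(
            "WOLFRAM_APP_ID is not set; ask your human to export it "
            "(do not guess or hardcode keys)."
        )
    return app_id
```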

Quick start

Run:

# default: cache ON (7d), auth via bearer header (keeps AppID out of URL)
python3 skills/wolfram-alpha-llm/scripts/wa_llm.py \
  --input "solve x^2 + 3x + 2 = 0"

What to send as --input

  • Prefer short English keyword-style queries when possible.
  • If the user asked in another language, translate to English for the API call, then answer in the user’s original language.
  • When you need an exact computation, be explicit (e.g., integrate sin(x)^2 from 0 to pi).
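Note that queries with spaces and symbols such as `^`, `(`, and `)` must be percent-encoded when placed in the request; the standard library handles this, as a quick sketch shows:

```python
from urllib.parse import urlencode

def encode_input(query):
    """Percent-encode a natural-language query for use as the `input` parameter."""
    return urlencode({"input": query})

# e.g. encode_input("integrate sin(x)^2 from 0 to pi")
# yields "input=integrate+sin%28x%29%5E2+from+0+to+pi"
```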

Core parameters (use these most)

  • --input (required): the query.
  • --maxchars (optional, default 2500): cap response length.
  • --units (optional): set the units system when needed for conversions/physics (metric is often a good default when unspecified).
  • --assumption (optional, repeatable): disambiguate when WA returns an irrelevant interpretation or offers assumptions.
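Because --assumption is repeatable, the underlying request carries the parameter once per assumption. A hedged sketch of how such a parameter set could be serialized (parameter names follow the flags above; the exact wire format is defined by the API docs in references/):

```python
from urllib.parse import urlencode

def build_params(query, maxchars=2500, units=None, assumptions=None):
    """Serialize core parameters; repeatable flags become repeated query keys."""
    params = [("input", query), ("maxchars", str(maxchars))]
    if units:
        params.append(("units", units))
    for a in assumptions or []:
        params.append(("assumption", a))  # repeated key, once per assumption
    return urlencode(params)
```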

High-value optional parameters (use when relevant)

  • Localization / context:
    • --countrycode, --languagecode
    • --timezone
    • One of: --ip | --latlong | --location (pick exactly one)
  • Finance:
    • --currency (e.g., USD, EUR)
  • Performance / robustness:
    • --scantimeout, --parsetimeout, --formattimeout, --totaltimeout

Output handling guidance

  • Treat output as computed evidence: quote the key result, then add minimal interpretation.
  • If the result is too long/noisy, rerun with a smaller --maxchars.
    • Heuristic: for simple conversions / arithmetic / single-value answers, try --maxchars 800.
    • Keep default --maxchars 2500 for most multi-line or explanation-heavy results.
  • If the interpretation is wrong:
    1. retry with --assumption ... (use WA-provided suggestions when available),
    2. only then rephrase/simplify --input.
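The retry ladder above can be sketched as a small driver over any fetch function. Here `fetch` is a stand-in for the actual API call and `looks_wrong` is a hypothetical predicate you would define per task; neither name comes from the wrapper itself:

```python
def query_with_retries(fetch, query, assumptions=(), rephrased=None,
                       looks_wrong=lambda r: not r):
    """Try the plain query, then each assumption, then a rephrased query."""
    result = fetch(query)
    if not looks_wrong(result):
        return result
    for a in assumptions:          # step 1: disambiguate with WA-provided assumptions
        result = fetch(query, assumption=a)
        if not looks_wrong(result):
            return result
    if rephrased:                  # step 2: only then rephrase/simplify the input
        return fetch(rephrased)
    return result
```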

Wrapper script

  • Script: skills/wolfram-alpha-llm/scripts/wa_llm.py
  • Auth:
    • default --auth bearer: sends Authorization: Bearer <AppID> header (keeps AppID out of the URL)
    • --auth query: sends appid as URL parameter
  • Cache:
    • default --cache on with --cache-ttl 604800 (7d)
    • stores best-effort results in: ~/.cache/openclaw-wolfram-alpha/
  • Returns:
    • stdout: API text body
    • stderr: errors, HTTP status context
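One common way such a cache is keyed (a sketch of the general pattern, not necessarily how wa_llm.py derives its filenames) is hashing the full request into a filename under the cache directory, so identical queries hit the same entry:

```python
import hashlib
import os

CACHE_DIR = os.path.expanduser("~/.cache/openclaw-wolfram-alpha/")

def cache_path(query, maxchars=2500):
    """Derive a stable filename from the request parameters."""
    key = f"{query}|{maxchars}".encode("utf-8")
    digest = hashlib.sha256(key).hexdigest()
    return os.path.join(CACHE_DIR, digest + ".txt")
```

A content-hashed key means any change to the query or its parameters naturally produces a fresh cache entry, with no invalidation logic needed beyond the TTL.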

For parameter details and error behaviors, see:

  • skills/wolfram-alpha-llm/references/llm-api.md
  • skills/wolfram-alpha-llm/references/full-api-params.md

Files

4 total
