Metatext.AI Inference API

v1.0.2

Metatext.AI Inference API integration. Manage data, records, and automate workflows. Use when the user wants to interact with Metatext.AI Inference API data.

0 · 133 · 0 current · 0 all-time
by Membrane Dev (@membranedev)
License: MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign (view report)
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description identify this as a 'Metatext.AI Inference API' integration, and the SKILL.md explains how to install and use the Membrane CLI to create a connection and run actions against Metatext.AI. The required artifacts and steps (membrane login, connect, action run, request proxy) are coherent with that purpose.
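The login/connect/run flow mentioned above can be sketched as a short shell session. The subcommand names (login, connect, action run) come from the SKILL.md's own description; the specific arguments and flags below are assumptions, so consult the CLI's built-in help for the actual syntax.

```shell
# 1. Authenticate via browser (Membrane manages credentials; no API keys needed)
membrane login

# 2. Create a connection to Metatext.AI
#    (connection name is a hypothetical example)
membrane connect metatext-ai

# 3. Run an action against that connection; requests are proxied
#    through Membrane's service. Action name and --input flag are assumptions.
membrane action run <action-name> --input '{"text": "example"}'
```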
Instruction Scope
Instructions stay within the integration scope (install the Membrane CLI, authenticate via browser, create connections, run actions, and proxy requests). One important runtime behavior: requests and payloads are proxied through Membrane's service, so Membrane sees the request bodies and handles auth. This is expected for a proxy-based integration but worth noting for sensitive data.
Install Mechanism
There is no automatic installer in the package; the SKILL.md recommends 'npm install -g @membranehq/cli' (and uses npx in examples). Installing a public npm CLI is a common pattern, and it carries the usual npm package risks (supply-chain compromise or a backdoor if the package were ever malicious). Because installation is user-driven and from the @membranehq scope (a known project), the risk is moderate and proportionate to the functionality.
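The two install paths mentioned above can be compared side by side. The global install command is quoted from the SKILL.md; the npx invocation is a sketch of the pattern the SKILL.md's examples use (the `--help` flag is an assumption).

```shell
# Global install, as the SKILL.md recommends:
npm install -g @membranehq/cli

# Or run on demand without a global install, via npx:
npx @membranehq/cli --help
```

A local (non-global) install via `npm install @membranehq/cli` in a project directory is a third option if you want the dependency pinned in a lockfile.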
Credentials
The skill declares no required env vars, no config paths, and the instructions explicitly say Membrane manages credentials and you should not supply API keys. There are no unrelated credential requests.
Persistence & Privilege
always is false, the skill is user-invocable, and there is no instruction to modify other skills or system-wide config. No elevated persistence or unexplained privileges are requested.
Assessment
This skill is an instruction-only guide to using the Membrane CLI as a proxy to Metatext.AI and is internally consistent. Before installing or using it:
1. Verify you trust the @membranehq npm package and its version (review the package page, changelog, and publisher).
2. Be aware that request payloads and any data you send will transit through Membrane's service and be visible to them; avoid sending highly sensitive secrets unless you accept that.
3. Prefer npx or a local install if you want to avoid a global npm install.
4. Test with non-sensitive data first, and review Membrane's privacy/security docs (and the Metatext.AI docs) to confirm they meet your requirements.
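The package-vetting step above can be done with standard npm subcommands before anything is installed. These are real npm commands; the `<reviewed-version>` placeholder is for whatever version you audit.

```shell
# Inspect the package's published metadata without installing it:
npm view @membranehq/cli version         # latest published version
npm view @membranehq/cli maintainers     # who publishes it
npm view @membranehq/cli repository.url  # where the source lives

# Pin the exact version you reviewed instead of installing "latest":
npm install -g @membranehq/cli@<reviewed-version>
```

Pinning a reviewed version guards against a later compromised release being pulled in silently by a plain `npm install -g`.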


latest · vk9744wjkrmw2nmm95w777dje3h843mjn

