Tavily Search
v1.0.0. Tavily Search API integration: high-quality web search, news aggregation, and research. Trigger words: 搜索 (search), search, tavily, 新闻 (news).
Security Scan
OpenClaw
Suspicious (high confidence)
Purpose & Capability
Name/description match the implemented behavior: scripts call Tavily MCP endpoints (https://mcp.tavily.com) for search/crawl/extract/research and can save crawl output to local files. However, the skill metadata only declares 'bash' as a required binary while the scripts also rely on curl, jq, base64, date, and npx at runtime—these are not declared. The scripts also attempt to discover an OAuth token from a local cache (~/.mcp-auth), which is consistent with an OAuth-friendly client but is not documented in the registry metadata as a required config/access.
Instruction Scope
Runtime instructions and included scripts will: (a) read files under $HOME/.mcp-auth searching for '*_tokens.json', (b) run npx -y mcp-remote to initiate an OAuth browser flow (downloads and executes code from npm at runtime), (c) make outbound HTTPS requests to mcp.tavily.com, and (d) write crawled pages to any output directory you pass. Reading the local MCP auth cache is scope-relevant for OAuth but may be surprising because it inspects user files and could encounter other token files; the script tries to validate tokens by checking the JWT issuer and expiry, which mitigates but does not eliminate the concern.
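The token-discovery behavior described in (a) can be sketched roughly as follows. This is a reconstruction based on the scan notes, not the skill's actual code; the `access_token` field name and the exact issuer string are assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the described ~/.mcp-auth token discovery: accept a cached token
# only if its JWT issuer matches Tavily's MCP endpoint and it has not expired.
check_token() {
  local file="$1" tok payload iss exp now
  tok=$(jq -r '.access_token // empty' "$file") || return 1
  [ -n "$tok" ] || return 1
  # Decode the JWT payload (second dot-separated segment), converting
  # base64url to base64 and restoring padding before decoding.
  payload=$(printf '%s' "$tok" | cut -d. -f2 | tr '_-' '/+')
  case $(( ${#payload} % 4 )) in
    2) payload="${payload}==" ;;
    3) payload="${payload}="  ;;
  esac
  iss=$(printf '%s' "$payload" | base64 -d 2>/dev/null | jq -r '.iss // empty')
  exp=$(printf '%s' "$payload" | base64 -d 2>/dev/null | jq -r '.exp // 0')
  now=$(date +%s)
  [ "$iss" = "https://mcp.tavily.com/" ] && [ "$exp" -gt "$now" ]
}

# Scan the cache directory; tokens issued for other tools are rejected.
for f in "$HOME"/.mcp-auth/*_tokens.json; do
  [ -e "$f" ] || continue
  if check_token "$f"; then echo "usable token: $f"; fi
done
```

Note that even with the issuer/expiry filter, every `*_tokens.json` file in the cache is still opened and parsed, which is the residual concern flagged above.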
Install Mechanism
There is no formal install spec, but the scripts call npx -y mcp-remote at runtime. npx will fetch and execute a package from the public npm registry on demand (transient download/execute). That is a moderate-to-high install-time risk compared with an instruction-only script that does not fetch code. The rest of the skill files are local shell scripts (no packaged third-party install), so the primary runtime risk is dynamic npx execution.
Credentials
The skill does not declare any required environment variables in registry metadata, but the scripts use TAVILY_API_KEY if present and will attempt to obtain an OAuth token from ~/.mcp-auth if not. Requesting/using a Tavily API token is proportional to the skill's purpose. Two small issues: (1) TAVILY_API_KEY is optional but not documented in the metadata 'requires.env' (user-facing docs in SKILL.md do mention it), and (2) the script reads ~/.mcp-auth token files — while it filters tokens by issuer and expiry, it still inspects local auth caches which could contain tokens for other tools if stored there.
Persistence & Privilege
The skill is not always: true and does not request persistent elevated platform privileges. It writes crawl output only to an explicit output directory you pass, and it does not modify other skills or global agent settings. Runtime npx execution is transient and not a declared persistent install.
What to consider before installing
This skill largely does what it says (Tavily search/crawl/extract/research), but review the following before installing:
1) The scripts rely on tools not listed in metadata (curl, jq, base64, npx). Ensure these binaries are present or install them beforehand.
2) On first run the scripts will call 'npx -y mcp-remote ...', which downloads and executes a package from npm to run an OAuth flow. If you prefer not to fetch code at runtime, set TAVILY_API_KEY manually in your agent settings and avoid the OAuth path.
3) The scripts search your home directory for ~/.mcp-auth/*_tokens.json; they only accept tokens whose JWT issuer matches https://mcp.tavily.com/ and whose expiry has not passed, but this still reads local token cache files. If that is sensitive, run the skill in an isolated environment or inspect/remove that directory first.
4) Network calls target mcp.tavily.com (and the docs mention api.tavily.com); verify these domains are expected.
If you need lower risk, request the same functionality from a version that documents required binaries and avoids runtime npx fetches. If you decide to proceed, set TAVILY_API_KEY manually and run in a sandboxed environment.
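A lower-risk setup along the lines of points 1) and 2) above might look like this. The variable and binary names come from the scan notes; the key value is a placeholder, not a real credential.

```shell
#!/usr/bin/env bash
# Pin the API-key path so the scripts never fall back to the npx/mcp-remote
# OAuth flow (which fetches and executes code from npm at runtime).
export TAVILY_API_KEY="tvly-REPLACE_ME"   # placeholder value

# Verify the undeclared runtime dependencies before first use.
missing=""
for bin in bash curl jq base64 date; do
  command -v "$bin" >/dev/null 2>&1 || missing="$missing $bin"
done
if [ -z "$missing" ]; then
  echo "all runtime binaries present"
else
  echo "missing:$missing"
fi
```

Running this check once before installation surfaces the gap between the declared requirement (bash only) and what the scripts actually need.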
Runtime requirements
Bins: bash
Version: latest
Tavily Search
Tavily is a professional search API that provides high-quality, fast, structured search results.
Features
Web Search
- Smart search - LLM-optimized search results
- Content extraction - automatically extracts summaries of page content
- Relevance scoring - each result carries a relevance score
News Search
- Time filtering - filter by day, week, month, or year
- Domain filtering - restrict the search to specific sites
- Search depth - basic/advanced search modes
Research Tools
- crawl - web crawling
- extract - content extraction
- research - deep research
Usage
Basic search
./search/scripts/search.sh '{"query": "latest AI developments", "max_results": 10}'
News search
./search/scripts/search.sh '{"query": "tech news", "time_range": "week", "max_results": 10}'
Domain filtering
./search/scripts/search.sh '{"query": "machine learning", "include_domains": ["arxiv.org", "github.com"]}'
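For queries containing quotes or non-ASCII text, hand-writing the JSON argument is error-prone. A sketch using jq to build the same payload shape as the examples above (the field names are taken from those examples):

```shell
#!/usr/bin/env bash
# Build the request payload with jq so shell quoting and escaping are handled
# for us, then pass the resulting JSON string to the search script.
query="machine learning"
payload=$(jq -n --arg q "$query" \
  '{query: $q, include_domains: ["arxiv.org", "github.com"], max_results: 5}')
echo "$payload"
# Then invoke:  ./search/scripts/search.sh "$payload"
```

This keeps user input out of the JSON syntax entirely, which matters once queries come from untrusted or interactive sources.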
Sub-skills
- search - web search
- crawl - web crawling
- extract - content extraction
- research - deep research
Authentication
Authenticate with a Tavily API Key or via OAuth.
Get an API Key: https://tavily.com
Tavily Search, smart search 🔍
