全网搜 (Whole Network Search)

Integrates with the web search API to fetch news and articles from Baidu, Google, and a pre-indexed Elasticsearch database, using comprehensive search by default.

MIT-0 · Free to use, modify, and redistribute. No attribution required.
0 current installs · 0 all-time installs
Security Scan
VirusTotal
Benign
OpenClaw
Benign
high confidence
Purpose & Capability
The name/description (global web search across Baidu/Google/ES) matches the SKILL.md and example scripts which perform POST requests to the documented /web_search endpoint and run parallel searches. There are no unrelated env vars, binaries, or config paths requested.
Instruction Scope
Runtime instructions are limited to POSTing form data (keyword, mode, search_source, page) to the remote API and aggregating results; they do not instruct the agent to read local files, credentials, or other system state. Note: all user search keywords and queries will be sent to an external service (IP 101.245.108.220), which may log or retain them.
Install Mechanism
Instruction-only skill with no install spec and no code files to execute on install — minimal install risk. No archive downloads or third-party package installs are present.
Credentials
No environment variables, credentials, or config paths are requested. This is proportionate to a simple web-search integration. Users should still be aware that queries are transmitted to a remote host (no authentication) which may have privacy implications.
Persistence & Privilege
The skill does not request always:true and is not granted elevated persistence. It is user-invocable and can be invoked autonomously by the agent by default (normal platform behavior). It does not modify other skills or system-wide settings.
Assessment
This skill is coherent for web searching: it simply sends user-provided keywords to a remote HTTP API and returns results. Before installing or enabling autonomous use, consider whether you trust the remote service at IP 101.245.108.220 (it is not a recognizable public domain). Avoid sending sensitive or private queries through this skill because the upstream service requires no authentication and could log or retain queries and responses. If you need stronger privacy or reliability guarantees, prefer a vetted/search provider API (with documented ownership, TLS, and an auth model) or run your own indexed search backend. If you are uncomfortable, disable autonomous invocation or only call the skill manually with non-sensitive test queries first.

Like a lobster shell, security has layers — review code before you run it.

Current version: v1.0.10
latest: vk979ahj197dbygefwrxn2pcbad83ybk1

License

MIT-0
Free to use, modify, and redistribute. No attribution required.

Runtime requirements

🔍 Clawdis

SKILL.md

全网搜索 (Whole Network Search)

This skill guides the agent to call the web search API for retrieving articles and news from multiple sources.

When to Use

Apply this skill when the user:

  • Asks to search the web or gather information online
  • Needs news articles or references by keyword
  • Wants to retrieve content from Baidu, Google, or a local ES index
  • Requires real-time web search or pre-indexed warehouse search

By default, this skill performs comprehensive search across ALL available sources simultaneously to provide the most complete results.

API Overview

Endpoint: POST /web_search

Base URL: http://101.245.108.220:9004

Authentication: No authentication required (free service)
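
Because the endpoint takes plain form-encoded POST data with no authentication, a single call can be sketched with only the standard library. This is a minimal sketch; the helper name `build_search_request` is mine, not part of the skill:

```python
from urllib.parse import urlencode
from urllib.request import Request

BASE_URL = "http://101.245.108.220:9004"

def build_search_request(keyword, search_source=None, mode=None, page=1):
    """Build a form-encoded POST request for the /web_search endpoint."""
    params = {"keyword": keyword, "page": page}
    if search_source:
        params["search_source"] = search_source
    if mode:
        params["mode"] = mode
    body = urlencode(params).encode("utf-8")
    return Request(
        f"{BASE_URL}/web_search",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_search_request("AI", search_source="google_search", mode="network")
```

Passing the prepared request to `urllib.request.urlopen(req, timeout=10)` performs the actual call. Note the endpoint is plain HTTP, so queries travel unencrypted.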

Request

For comprehensive search (default behavior), the skill will use the script from overall.md to perform 4 parallel API calls automatically:

  • Call 1: search_source=baidu_search, mode=network (Baidu news/articles)
  • Call 2: search_source=google_search, mode=network (Google news/articles)
  • Call 3: search_source=baidu_search_ai, mode=network (Baidu AI search)
  • Call 4: mode=warehouse (Elasticsearch index; search_source is ignored)

Headers

| Header | Required | Description |
| --- | --- | --- |
| Content-Type | Yes | application/x-www-form-urlencoded (form data) |

Form Parameters

| Parameter | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| keyword | string | Yes | - | Search keyword(s); separate multiple keywords with spaces |
| search_source | string | No | - | Engine: baidu_search, google_search, baidu_search_ai. Ignored when using default comprehensive search |
| mode | string | No | - | network = live crawl, warehouse = ES index. Ignored when using default comprehensive search |
| page | int | No | 1 | Page number (starts from 1) |

Comprehensive Search (All Sources)

When the user requests a comprehensive search (i.e., searching all available sources at once), you must use the script from overall.md rather than the example code below.

When the user wants to search across ALL available sources simultaneously (comprehensive search), issue the four API calls in parallel and merge the results:

Example Implementation (Python asyncio):

import aiohttp
import asyncio

API_URL = "http://101.245.108.220:9004/web_search"

headers = {"Content-Type": "application/x-www-form-urlencoded"}

SEARCH_CONFIGS = [
    {"name": "百度搜索", "mode": "network", "search_source": "baidu_search"},
    {"name": "谷歌搜索", "mode": "network", "search_source": "google_search"},
    {"name": "百度 AI 搜索", "mode": "network", "search_source": "baidu_search_ai"},
    {"name": "全库搜", "mode": "warehouse", "search_source": None}
]

async def fetch_search(session, semaphore, config, keyword, page):
    async with semaphore:
        data = {
            "keyword": keyword,
            "page": page,
            "mode": config['mode'],
        }
        if config['search_source']:
            data["search_source"] = config['search_source']
        
        async with session.post(API_URL, headers=headers, data=data) as response:
            response.raise_for_status()
            # content_type=None tolerates a missing or non-JSON Content-Type header
            result = await response.json(content_type=None)
            return result.get('references', [])

async def comprehensive_search(keyword, page=1):
    async with aiohttp.ClientSession() as session:
        semaphore = asyncio.Semaphore(5)  # Max 5 concurrent requests
        tasks = [fetch_search(session, semaphore, config, keyword, page) 
                 for config in SEARCH_CONFIGS]
        results = await asyncio.gather(*tasks)
        # Flatten all references into one list
        all_references = [ref for refs in results for ref in refs]
        return all_references

if __name__ == "__main__":
    references = asyncio.run(comprehensive_search("人工智能"))
    print(f"Collected {len(references)} references")

Parameter Constraints

  • search_source: One of baidu_search, google_search, baidu_search_ai
  • mode: One of network, warehouse
  • When mode=warehouse, search is performed against the Elasticsearch index (ignores search_source)
  • When mode=network, use search_source to select Baidu, Google, or Baidu AI search
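
The constraints above can be enforced client-side before any request is sent, avoiding the documented 400 responses. A minimal sketch, where the `validate_search_params` helper and its return shape are mine, not part of the skill:

```python
# Allowed values taken from the parameter constraints above
VALID_SOURCES = {"baidu_search", "google_search", "baidu_search_ai"}
VALID_MODES = {"network", "warehouse"}

def validate_search_params(keyword, search_source=None, mode=None, page=1):
    """Raise ValueError for requests the API would reject with a 400."""
    if not keyword or not keyword.strip():
        raise ValueError("keyword is required")
    if mode is not None and mode not in VALID_MODES:
        raise ValueError(f"mode must be one of {sorted(VALID_MODES)}")
    if search_source is not None and search_source not in VALID_SOURCES:
        raise ValueError(f"search_source must be one of {sorted(VALID_SOURCES)}")
    if not isinstance(page, int) or page < 1:
        raise ValueError("page must be an integer >= 1")
    if mode == "warehouse":
        # Documented behavior: warehouse mode ignores search_source entirely
        search_source = None
    return {
        "keyword": keyword,
        "page": page,
        **({"mode": mode} if mode else {}),
        **({"search_source": search_source} if search_source else {}),
    }
```

Dropping `search_source` in warehouse mode keeps the outgoing form body aligned with what the server actually reads.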

Response Format

{
  "code": 200,
  "message": "success",
  "references": [
    {
      "title": "Article title",
      "sourceAddress": "https://example.com/article",
      "origin": "Source name",
      "publishDate": "2025-03-24 12:00:00",
      "summary": "Article summary or snippet"
    }
  ]
}
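
When aggregating results from several sources (as the comprehensive search does), the same article may appear more than once. A sketch for parsing this response shape and de-duplicating by sourceAddress; the helper names are illustrative, not part of the skill:

```python
def extract_references(response):
    """Return the references list from a /web_search payload, raising on non-200."""
    if response.get("code") != 200:
        raise RuntimeError(f"API error {response.get('code')}: {response.get('message')}")
    return response.get("references", [])

def dedupe_references(references):
    """Keep only the first occurrence of each sourceAddress (URL)."""
    seen, unique = set(), []
    for ref in references:
        url = ref.get("sourceAddress")
        if url in seen:
            continue
        seen.add(url)
        unique.append(ref)
    return unique
```

The flattened list returned by `comprehensive_search` above can be passed straight through `dedupe_references` before presenting results.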

Usage Examples

Example 1: Search Baidu news

POST http://101.245.108.220:9004/web_search
Headers: Content-Type: application/x-www-form-urlencoded
Body (form): keyword=人工智能&search_source=baidu_search&mode=network&page=1

Example 2: Search Google news

POST http://101.245.108.220:9004/web_search
Headers: Content-Type: application/x-www-form-urlencoded
Body (form): keyword=AI&search_source=google_search&mode=network&page=1

Example 3: Search warehouse (ES index)

POST http://101.245.108.220:9004/web_search
Headers: Content-Type: application/x-www-form-urlencoded
Body (form): keyword=机器学习&mode=warehouse&page=1

Example 4: cURL

curl -X POST "http://101.245.108.220:9004/web_search" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "keyword=科技新闻&search_source=baidu_search&mode=network&page=1"

Error Codes

| Code | Message | Cause |
| --- | --- | --- |
| 400 | search_source参数错误 | Invalid search_source value |
| 400 | mode参数错误 | Invalid mode value |
| 400 | page参数错误 | Invalid page (non-integer or 0) |
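
A caller can map these documented messages to actionable hints for the user. A sketch, where the `explain_error` helper and its hint strings are my own additions:

```python
# Message strings are taken verbatim from the error-code table above
PARAM_ERRORS = {
    "search_source参数错误": "use baidu_search, google_search, or baidu_search_ai",
    "mode参数错误": "use network or warehouse",
    "page参数错误": "use a positive integer starting at 1",
}

def explain_error(payload):
    """Turn a non-200 payload into a readable hint, or None when the call succeeded."""
    if payload.get("code") == 200:
        return None
    message = payload.get("message", "")
    hint = PARAM_ERRORS.get(message, "unrecognized error")
    return f"code {payload.get('code')}: {message} ({hint})"
```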

Integration Steps

  1. API base URL: http://101.245.108.220:9004; no API key configuration is needed (free service)
  2. By default, the skill performs a comprehensive search across all sources using the script from overall.md. Optionally, choose a specific search_source (Baidu / Google / Baidu AI) or mode (network / warehouse) to override the default comprehensive search behavior
  3. Call POST /web_search with form-encoded parameters
  4. Parse references from the response and use title, sourceAddress, summary as needed

Files

3 total
