feapder
Pass. Audited by ClawScan on May 10, 2026.
Overview
This is a coherent, instruction-only feapder development aid; the main cautions are user-run crawler/database workflows and unsafe eval patterns in vendored reference code, not hidden automatic behavior.
This skill appears safe to install as a feapder reference assistant. Before running any generated spider or CLI command, confirm the target website, Redis key, database name, and credentials; avoid pasting real cookies or secrets into shared prompts or committed files; and be cautious with feapder 1.9.2 internals that use eval on task/debug strings.
Findings (3)
Methodology: artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
Finding 1: If a project runs this kind of code against attacker-controlled Redis/MySQL task data, a crafted task string could execute arbitrary Python code.
The vendored feapder reference implementation contains dynamic evaluation of task data. It is not auto-run by this instruction-only skill, but it is an unsafe pattern if the upstream code or copied logic processes untrusted task strings.
tasks = [eval(task) for task in tasks]
Do not copy eval-based parsing into new spider code. Prefer json.loads or ast.literal_eval, and only run feapder 1.9.2 task/debug flows against trusted data or patched code.
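A minimal sketch of the safer pattern: replacing eval with ast.literal_eval, which only accepts Python literals (numbers, strings, tuples, lists, dicts) and rejects anything containing calls or attribute access. The function name parse_tasks is illustrative, not part of feapder.

```python
import ast

def parse_tasks(raw_tasks):
    """Parse task strings into Python values without executing code.

    ast.literal_eval raises ValueError/SyntaxError for anything that is
    not a plain literal, so a payload like "__import__('os').system(...)"
    is rejected instead of executed.
    """
    parsed = []
    for raw in raw_tasks:
        try:
            parsed.append(ast.literal_eval(raw))
        except (ValueError, SyntaxError):
            raise ValueError(f"rejected non-literal task: {raw!r}")
    return parsed

# A tuple literal, as feapder task rows are often stored, parses cleanly.
tasks = parse_tasks(["(1, 'https://example.com')"])
```

If task payloads are stored as JSON rather than Python literals, json.loads is the equivalent safe choice.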
Finding 2: Running generated batch crawler code against the wrong database or Redis namespace could incorrectly mark many tasks as complete or failed.
The documented BatchSpider workflow includes updating task state in backing storage. This is expected for feapder batch crawlers, but it is still a database/queue mutation when generated code is run.
yield self.update_task_batch(request.task_id, 1)  # update task state to 1
Review redis_key, task_table, task_state, and database settings before running generated spiders, and test against a non-production environment first.
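One way to make that review harder to skip is a small pre-flight guard run before launching a generated spider. Everything here is a hypothetical sketch: the marker list and the setting values are assumptions, not feapder API, and the real setting names should be checked against your own setting.py.

```python
# Hypothetical pre-flight guard: refuse to start if the Redis key or task
# table looks like a production namespace. Adjust PRODUCTION_MARKERS to
# match your own naming conventions.
PRODUCTION_MARKERS = ("prod", "live")

def assert_non_production(redis_key: str, task_table: str) -> None:
    for name, value in (("redis_key", redis_key), ("task_table", task_table)):
        if any(marker in value.lower() for marker in PRODUCTION_MARKERS):
            raise RuntimeError(f"{name}={value!r} looks like production; aborting")

# Passes for clearly dev-scoped names; raises for e.g. "prod_tasks".
assert_non_production("test:spider_task", "spider_task_dev")
```

Calling this at the top of the spider's entry point means a copy-pasted spider fails fast instead of mutating production task state.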
Finding 3: Copying authenticated browser curl commands into prompts, scripts, or shell history can expose session cookies.
The debugging documentation shows raw curl commands that may include session cookies. This is a normal crawler-debugging technique, but cookies are sensitive credentials.
feapder shell --curl 'https://www.baidu.com/' ... -H 'Cookie: PSTM=...; BDUSS=...'
Redact cookies and authorization headers before sharing prompts or committing code, and use temporary test sessions when debugging authenticated pages.
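Redaction can be automated before a curl command ever leaves the terminal. A minimal sketch, assuming headers follow the common `-H 'Name: value'` shape that browser "Copy as cURL" produces; the function name and regex are illustrative, not part of any tool.

```python
import re

# Matches -H 'Cookie: ...' and -H 'Authorization: ...' header arguments,
# keeping the header name and quoting but dropping the secret value.
SENSITIVE_HEADERS = re.compile(
    r"(-H\s+'(?:Cookie|Authorization):)([^']*)(')", re.IGNORECASE
)

def redact_curl(cmd: str) -> str:
    """Return the curl command with cookie/authorization values masked."""
    return SENSITIVE_HEADERS.sub(r"\1 REDACTED\3", cmd)

cmd = "feapder shell --curl 'https://www.baidu.com/' -H 'Cookie: PSTM=1; BDUSS=secret'"
safe = redact_curl(cmd)  # cookie value replaced with REDACTED
```

Running pasted commands through a filter like this (or a shell alias wrapping it) keeps real session tokens out of prompts, scripts, and history files.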
