Daily scraping of convertible-bond (CB) basic data, forced-redemption countdowns, and downward-revision countdowns from Jisilu (集思录), with cookie management and local persistent storage

v1.0.0


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for cchunter/jisulu-cb-daily.

Prompt preview: Install & Setup
Install the skill "每日从集思录抓取可转债基本数据、强赎倒计时、下修倒计时,支持Cookie管理和本地持久化存储" (cchunter/jisulu-cb-daily) from ClawHub.
Skill page: https://clawhub.ai/cchunter/jisulu-cb-daily
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install jisulu-cb-daily

ClawHub CLI


npx clawhub@latest install jisulu-cb-daily
Security Scan
VirusTotal
Benign
OpenClaw
Benign (high confidence)
Purpose & Capability
Name/description (daily scrape of jisilu convertible-bond data) matches the included Python script and SKILL.md. Required inputs (kbzw__user_login cookie) and outputs (CSV under output/) are coherent with the stated purpose; no unrelated services, binaries, or credentials are requested.
Instruction Scope
SKILL.md instructs only to obtain a site login cookie, call three jisilu API endpoints, clean and save data locally, and report a summary. It does not instruct reading unrelated system files or contacting endpoints outside jisilu.cn. The code implements the documented steps.
Install Mechanism
This is an instruction-only skill with a small Python script; no install spec or remote downloads are present. Dependencies (requests, pandas) are reasonable and declared.
Credentials
The skill asks the user to supply the kbzw__user_login cookie and stores it in references/cookie.json. That is necessary for accessing member-only jisilu data, but cookies are sensitive credentials—the request is proportionate to the functionality but users should understand the privacy/security implications of providing and persisting a login cookie.
Persistence & Privilege
The skill writes cookie.json and output CSVs inside its own skill directory (references/ and output/). It does not request always:true or modify other skills or system-wide settings. File writes are limited to the skill folder.
Assessment
This skill appears to do what it says: locally scrape jisilu.cn using a logged-in cookie and save CSVs. Before installing: (1) review the script yourself (it's short and readable); (2) only run it locally or in an environment you control (not on a public cloud) because it requires a login cookie; (3) consider using an account with minimal privileges or rotating/deleting the cookie after use; (4) never paste cookies into public places or share them with others; (5) verify network policies if you need to ensure data does not leave your machine. If you are uncomfortable storing your full session cookie, decline to provide it — the skill will not be able to fetch member-only fields without it.

Like a lobster shell, security has layers — review code before you run it.

Latest: vk9736j61mqap5bsz8vq6gx22r185qat1
43 downloads · 0 stars · 1 version
Updated 7h ago
v1.0.0
MIT-0

Prerequisites

  • Python 3.8+
  • Packages: requests, pandas
  • A Jisilu account (the kbzw__user_login cookie from a logged-in session is required)

Cookie Management

Storage location

The cookie is stored in references/cookie.json inside the skill directory:

{
  "kbzw__user_login": "用户输入的cookie值",
  "updated_at": "2026-04-28"
}

Check logic

  1. Before each run, check that references/cookie.json exists and contains kbzw__user_login

  2. If the file is missing or the value is empty, stop the scrape immediately and send the user this prompt:

    [Jisilu login cookie missing]
    
    This skill needs a Jisilu login cookie to fetch complete data (especially member-only fields such as forced redemption / downward revision).
    
    Please obtain kbzw__user_login as follows:
    1. Open https://www.jisilu.cn/ in Chrome/Edge and log in
    2. Press F12 to open DevTools → Application → Cookies → https://www.jisilu.cn
    3. Find the cookie named kbzw__user_login and copy its Value
    4. Paste the cookie value to me
    
    Once I receive it, I will save it automatically; later daily scrapes will not ask again.
    
  3. After the user provides the cookie, write it to references/cookie.json (a minimal save sketch follows) and continue
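
A minimal sketch of the save step, assuming the cookie value has already been collected from the user; the save_cookie helper and its placement are illustrative, not part of the shipped script:

import json, os
from datetime import date

def save_cookie(skill_dir: str, cookie_value: str) -> str:
    """Illustrative helper: persist the user-provided kbzw__user_login cookie to references/cookie.json."""
    ref_dir = os.path.join(skill_dir, "references")
    os.makedirs(ref_dir, exist_ok=True)
    cookie_path = os.path.join(ref_dir, "cookie.json")
    payload = {
        "kbzw__user_login": cookie_value.strip(),
        "updated_at": date.today().isoformat(),
    }
    with open(cookie_path, "w", encoding="utf-8") as f:
        json.dump(payload, f, ensure_ascii=False, indent=2)
    return cookie_path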

Data Sources & Endpoints

Endpoint 1: CB basic data

  • URL: https://www.jisilu.cn/web/data/cb/list
  • Method: GET
  • Headers:
    User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36
    Referer: https://www.jisilu.cn/web/data/cb/list
    Cookie: kbzw__user_login={cookie value}
    
  • Response format: JSON
    {
      "rows": [
        {
          "id": "123045",
          "cell": {
            "bond_id": "123045",
            "bond_nm": "某转债",
            "price": "105.500",
            "increase_rt": "+1.23%",
            "stock_nm": "某正股",
            "sprice": "10.50",
            "sincrease_rt": "+2.10%",
            "convert_price": "12.50",
            "convert_value": "84.00",
            "premium_rt": "25.60%",
            "force_redeem_price": "16.25",
            "put_convert_price": "8.75",
            "year_left": "3.520",
            "ytm_rt": "2.35%",
            "rating_cd": "AA+",
            "dblow": "131.10",
            "force_redeem": null,
            "maturity_dt": "2027-06-15"
          }
        }
      ]
    }
    

Endpoint 2: Forced-redemption countdown data

  • URL: https://www.jisilu.cn/web/data/cb/redeem
  • Method: GET
  • Headers: same as above; the cookie is required
  • Key fields (expected):
    • bond_id: CB code
    • bond_nm: CB name
    • redeem_count: days the redemption condition has already been met
    • redeem_trigger: redemption trigger condition (e.g. 15/30)
    • redeem_status: redemption status (e.g. "公告强赎" announced, "暂不强赎" not for now, "倒计时中" counting down)
    • redeem_price: redemption price
    • last_redeem_dt: last trading day

Endpoint 3: Downward-revision countdown data

  • URL: https://www.jisilu.cn/web/data/cb/adjust
  • Method: GET
  • Headers: same as above; the cookie is required
  • Key fields (expected):
    • bond_id: CB code
    • bond_nm: CB name
    • adjust_count: days the revision condition has already been met
    • adjust_trigger: revision trigger condition
    • adjust_status: revision status (e.g. "已公告下修" announced, "董事会提议" board proposal, "倒计时中" counting down)
    • adjust_price: proposed revised conversion price (if any)
    • adjust_dt: date of the shareholder meeting on the revision
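
A minimal fetch helper covering all three endpoints, assuming the cookie has already been loaded; the fetch_jisilu name and the usage lines are illustrative, and the actual response shape should be checked against the examples above:

import requests

# Illustrative sketch, not the shipped script.
BASE_HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Referer": "https://www.jisilu.cn/web/data/cb/list",
}

def fetch_jisilu(url: str, kbzw_cookie: str, timeout: int = 30) -> dict:
    """GET one jisilu.cn endpoint with the login cookie and return the parsed JSON."""
    headers = dict(BASE_HEADERS)
    headers["Cookie"] = f"kbzw__user_login={kbzw_cookie}"
    resp = requests.get(url, headers=headers, timeout=timeout)
    resp.raise_for_status()   # surfaces 403 etc. so cookie expiry is visible
    return resp.json()

# Example usage for the three documented endpoints:
# list_data   = fetch_jisilu("https://www.jisilu.cn/web/data/cb/list", kbzw_cookie)
# redeem_data = fetch_jisilu("https://www.jisilu.cn/web/data/cb/redeem", kbzw_cookie)
# adjust_data = fetch_jisilu("https://www.jisilu.cn/web/data/cb/adjust", kbzw_cookie)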

Execution Workflow

Step 1: Cookie check and preparation

import json, os

cookie_path = os.path.join(os.path.dirname(__file__), "../references/cookie.json")

if not os.path.exists(cookie_path):
    # Trigger the missing-cookie prompt and wait for user input
    raise FileNotFoundError("Cookie file missing; provide kbzw__user_login as described in this skill")

with open(cookie_path, "r", encoding="utf-8") as f:
    cookie_data = json.load(f)

kbzw_cookie = cookie_data.get("kbzw__user_login", "")
if not kbzw_cookie:
    raise ValueError("kbzw__user_login is empty; please provide it again")

Step 2: Fetch basic data

  • Call requests.get() on https://www.jisilu.cn/web/data/cb/list
  • Send the cookie: kbzw__user_login={kbzw_cookie}
  • Parse the JSON and extract the fields from cell inside each rows entry (see the sketch below)
  • Convert to a DataFrame; either rename columns to Chinese or keep the English names (keeping the original English field names is recommended for downstream processing)
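
A minimal sketch of the rows-to-DataFrame step, assuming the JSON shape shown under Endpoint 1; the rows_to_df helper is illustrative:

import pandas as pd

def rows_to_df(payload: dict) -> pd.DataFrame:
    """Illustrative helper: flatten {'rows': [{'id': ..., 'cell': {...}}, ...]} into a DataFrame."""
    records = [row["cell"] for row in payload.get("rows", [])]
    return pd.DataFrame.from_records(records)

# df_basic = rows_to_df(list_data)   # one row per convertible bond, English field names kept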

Step 3: Fetch forced-redemption countdown data

  • Fetch https://www.jisilu.cn/web/data/cb/redeem with the same cookie
  • Parse the response and extract the redemption-related fields
  • LEFT JOIN onto the basic data, keyed on bond_id

Step 4: Fetch downward-revision countdown data

  • Fetch https://www.jisilu.cn/web/data/cb/adjust with the same cookie
  • Parse the response and extract the revision-related fields
  • LEFT JOIN onto the combined data, keyed on bond_id (see the merge sketch below)
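
A minimal merge sketch for Steps 3-4, assuming all three responses were flattened with the rows_to_df helper above; the column subsets are illustrative and should match whatever the endpoints actually return:

# Illustrative sketch: keep only the documented key fields before merging.
redeem_cols = ["bond_id", "redeem_status", "redeem_count"]
adjust_cols = ["bond_id", "adjust_status", "adjust_count", "adjust_dt"]

df_redeem = rows_to_df(redeem_data)
df_adjust = rows_to_df(adjust_data)

df = (
    df_basic
    .merge(df_redeem[[c for c in redeem_cols if c in df_redeem.columns]], on="bond_id", how="left")
    .merge(df_adjust[[c for c in adjust_cols if c in df_adjust.columns]], on="bond_id", how="left")
)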

Step 5: Data cleaning and saving

Cleaning rules

  1. Price fields: strip any % sign and convert to float
  2. Change-percentage fields: same treatment
  3. Date fields: normalize to YYYY-MM-DD
  4. Null handling: an empty force_redeem means "暂不强赎" (no redemption for now); an empty adjust_status means "未触发下修" (revision not triggered)
  5. Deduplication: dedupe by bond_id and keep the most recent record (a combined cleaning sketch follows this list)
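
A minimal cleaning sketch under these rules, assuming the merged DataFrame df from Steps 2-4; the column lists are illustrative:

import pandas as pd

# Illustrative sketch.
# 1-2. Strip % signs from numeric-looking columns and convert to float.
numeric_cols = ["price", "sprice", "convert_price", "convert_value", "premium_rt",
                "increase_rt", "sincrease_rt", "ytm_rt", "year_left", "dblow",
                "force_redeem_price", "put_convert_price"]
for col in numeric_cols:
    if col in df.columns:
        df[col] = pd.to_numeric(df[col].astype(str).str.replace("%", "", regex=False),
                                errors="coerce")

# 3. Normalize date columns to YYYY-MM-DD strings.
for col in ["maturity_dt", "adjust_dt"]:
    if col in df.columns:
        df[col] = pd.to_datetime(df[col], errors="coerce").dt.strftime("%Y-%m-%d")

# 4. Empty force_redeem / adjust_status values are kept as-is; per the rules above they mean
#    "暂不强赎" (no redemption for now) and "未触发下修" (revision not triggered).

# 5. Dedupe by bond_id, keeping the most recent record.
df = df.drop_duplicates(subset="bond_id", keep="last")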

Save format

Data is saved as CSV, named by date (a save sketch follows the directory tree):

output/
└── jisilu_cb_2026-04-28.csv
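
A minimal save sketch, assuming the cleaned DataFrame df and a script living under scripts/; the output-directory resolution is illustrative:

import os
from datetime import date

# Illustrative sketch.
data_date = date.today().isoformat()
df["data_date"] = data_date        # per-row data date, as required by the field list below

out_dir = os.path.join(os.path.dirname(__file__), "../output")
os.makedirs(out_dir, exist_ok=True)
out_path = os.path.join(out_dir, f"jisilu_cb_{data_date}.csv")
df.to_csv(out_path, index=False, encoding="utf-8-sig")   # utf-8-sig keeps Chinese text readable in Excel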

The CSV must contain the following core fields:

Field | Source | Description
bond_id | basic data | CB code
bond_nm | basic data | CB name
price | basic data | current CB price
increase_rt | basic data | CB price change %
stock_nm | basic data | underlying stock name
sprice | basic data | current stock price
premium_rt | basic data | conversion premium
convert_price | basic data | conversion price
year_left | basic data | years to maturity
ytm_rt | basic data | yield to maturity
rating_cd | basic data | credit rating
dblow | basic data | double-low value
force_redeem_price | basic data | forced-redemption trigger price
put_convert_price | basic data | put trigger price
redeem_status | redemption endpoint | redemption status
redeem_count | redemption endpoint | days the redemption condition has been met
adjust_status | revision endpoint | revision status
adjust_count | revision endpoint | days the revision condition has been met
adjust_dt | revision endpoint | date of the revision shareholder meeting
data_date | generated | data date (YYYY-MM-DD)

Step 6: Report results

Report the day's data summary to the user (a counting sketch follows the template):

[Jisilu CB data scrape complete]
Date: 2026-04-28
Bonds scraped: XXX
Of which:
- Redemption announced: XX
- Redemption counting down: XX
- Revision announced: XX
- Revision counting down: XX

Data saved to: output/jisilu_cb_2026-04-28.csv
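
A minimal sketch of the counts behind the template, assuming the cleaned df plus the data_date/out_path variables from the save sketch; the matched status substrings follow the examples under Endpoints 2-3, and the exact labels returned by jisilu.cn may differ:

# Illustrative sketch.
def count_status(col: str, label: str) -> int:
    """Count rows whose status column contains the given substring (0 if the column is absent)."""
    if col not in df.columns:
        return 0
    return int(df[col].fillna("").astype(str).str.contains(label).sum())

summary = {
    "Redemption announced":     count_status("redeem_status", "公告强赎"),
    "Redemption counting down": count_status("redeem_status", "倒计时"),
    "Revision announced":       count_status("adjust_status", "公告下修"),
    "Revision counting down":   count_status("adjust_status", "倒计时"),
}
print(f"Scraped {len(df)} bonds on {data_date}; {summary}; saved to {out_path}")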

Exception Handling

Scenario | Handling
Cookie expired (403 or a login page returned) | Prompt the user to re-enter the cookie and delete the old cookie.json
Endpoint returns empty data | Log it and retry up to 3 times; if it still fails, skip that endpoint (see the retry sketch below)
Network timeout | Set timeout=30s and retry 3 times
Missing fields | Fill with nulls; do not abort the run
Date parsing failure | Keep the original string and flag it as a parse error
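
A minimal retry wrapper consistent with the table, assuming the fetch_jisilu helper sketched earlier; the retry and backoff details are illustrative:

import time
from typing import Optional

import requests

def fetch_with_retry(url: str, kbzw_cookie: str, retries: int = 3) -> Optional[dict]:
    """Illustrative helper: fetch one endpoint up to `retries` times; return None so the caller can skip it."""
    for attempt in range(1, retries + 1):
        try:
            payload = fetch_jisilu(url, kbzw_cookie, timeout=30)
            if payload.get("rows"):
                return payload
            print(f"[warn] empty data from {url} (attempt {attempt}/{retries})")
        except requests.HTTPError as exc:
            if exc.response is not None and exc.response.status_code == 403:
                # Expired cookie: per the table, ask the user for a new one and drop the old file.
                raise RuntimeError("Jisilu cookie appears to have expired; please provide a new kbzw__user_login") from exc
            print(f"[warn] HTTP error from {url}: {exc} (attempt {attempt}/{retries})")
        except requests.RequestException as exc:
            print(f"[warn] network error from {url}: {exc} (attempt {attempt}/{retries})")
        time.sleep(2 * attempt)   # simple linear backoff between attempts
    return None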

Scheduled Execution

For daily automated runs, pair local OpenClaw with cron/systemd:

# Run daily at 15:30, after market close
30 15 * * * cd ~/.config/agents/skills/jisilu-cb-daily && python scripts/collect_jisilu_cb.py

Or send this instruction in a Kimi Claw conversation each day:

Run the jisilu-cb-daily Skill and scrape today's convertible-bond data
