Lark Report Collector

v1.0.0

Collect weekly reports from Lark Reports (oa.larksuite.com), summarize into Lark Docs, and notify. Use when: (1) collecting weekly reports from specific team...


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for pengxiao-wang/lark-report-collector.

Prompt Preview: Install & Setup
Install the skill "Lark Report Collector" (pengxiao-wang/lark-report-collector) from ClawHub.
Skill page: https://clawhub.ai/pengxiao-wang/lark-report-collector
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Canonical install target

openclaw skills install pengxiao-wang/lark-report-collector

ClawHub CLI


npx clawhub@latest install lark-report-collector
Security Scan
VirusTotal
Benign
OpenClaw
Suspicious (medium confidence)
Purpose & Capability
The name/description (collect reports, summarize into Lark Docs, notify) aligns with the instructions: browser automation to scrape a SPA, building a doc via the Lark API, and sending a notification. However, the skill assumes an authenticated browser profile and a separate 'lark-api' skill for API auth, yet declares no primary credential or environment variables. This omission is not necessarily malicious, but it reduces transparency.
Instruction Scope
The SKILL.md instructs the agent to use browser automation (navigate, click, run JS evaluate) and to 'append to local file after each extraction'. That file-write step and the scraping of names/departments are data-handling operations not reflected in declared requirements. The instructions also tell sub-agents to follow exact URLs/steps and include custom JS evaluation. Those actions are within the skill's purpose but broaden access to local storage and user data and should be disclosed explicitly.
Install Mechanism
This is an instruction-only skill with no install spec or downloaded code, so it does not write new binaries to disk. That lowers installation risk. The use of browser automation relies on existing platform-provided browser functionality (profile=openclaw).
Credentials
No env vars or primary credential are declared, yet the workflow requires: (1) an active authenticated browser session for oa.larksuite.com and (2) API auth via a separate 'lark-api' skill. The skill also writes extracted data to a local file. Requiring authentication and local file access without declaring required credentials/config is a mismatch and reduces ability to reason about privilege and data exfiltration risk.
Persistence & Privilege
always:false and normal model invocation are used. The skill does not request permanent platform presence or attempt to modify other skills or global configs in the instructions. The main persistence concern is the instruction to append to a local file (data retention), which is an operational detail rather than a platform privilege request.
What to consider before installing

  1. Confirm where extracted data will be stored (exact local path, retention, access controls) and that you are comfortable with the agent writing names and departments to disk.
  2. Ask the author to declare the required credentials or environment variables: specifically, how API auth is supplied (the skill references a separate 'lark-api' skill) and whether an authenticated browser profile is required.
  3. If you will allow this skill to run autonomously, restrict its scope or test it in a sandbox first, because it performs browser automation and can access user-visible data.
  4. Since the source/homepage is unknown, prefer not to grant wide access until the credential and file-storage behavior and the publisher's identity are verified.
  5. If you accept it, store the lark-api credentials securely (least privilege) and audit the created local files and outgoing notifications for the first few runs.

Like a lobster shell, security has layers — review code before you run it.

latest: vk972pw322r3hnygpmhy3gq0sz581mq31
644 downloads
0 stars
1 version
Updated 1mo ago
v1.0.0
MIT-0

Lark Report Collector

Collect weekly report data from Lark Reports, summarize into Lark Docs, and send notifications.

When to Use

  • "Collect this week's/last week's reports for Photo/Bloom/H&F"
  • "Who hasn't submitted their weekly report?"
  • "Summarize weekly reports into a Lark doc"

Hard Rules (battle-tested)

  1. Reports is a SPA — curl/web_fetch returns nothing. Must use browser (profile=openclaw)
  2. Pagination is reversed — Next = older weeks, Previous = newer weeks
  3. Always snapshot to confirm week title after pagination (most common error: collecting wrong week)
  4. One page may show multiple weeks — data is sorted by time, a single page can span 2-3 weeks
  5. block_type mapping — 12=bullet, 13=ordered (NOT 9/10! Those are heading7/heading8)
  6. Never restart gateway inside a sub-agent (kills itself)
  7. Sub-agents need exact URLs and steps — don't let them explore on their own

Complete Workflow

Step 1: Navigate to Reports

browser action=navigate profile=openclaw targetUrl="https://oa.larksuite.com/report/record/entry"

Prerequisites: openclaw browser must have active Lark login session.

Step 2: Select Report Template

Snapshot the page, then click the target template menu item under "Received by me" in the left sidebar.

Step 3: Navigate to Target Week

Page defaults to latest data. Week title format: "Feb 2 ~ Feb 8 Submitted: 18"

Pagination (critical):

  • Next button = older weeks ⬅️
  • Previous button = newer weeks ➡️
  • Page display: "2/25" (page 2 of 25), page 1 is newest

⚠️ Snapshot and confirm the date in the title after every page turn!
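The confirmation step above can be mechanized: before extracting anything, parse the snapshotted week title and compare it to the target. The title format is taken from the page; the helper names below are illustrative sketches, not part of the skill.

```javascript
// Parse a week title like "Feb 2 ~ Feb 8 Submitted: 18" into its parts.
function parseWeekTitle(title) {
  const m = title.match(/^(\w{3} \d{1,2}) ~ (\w{3} \d{1,2})\s+Submitted:\s*(\d+)/);
  if (!m) return null;
  return { start: m[1], end: m[2], submitted: Number(m[3]) };
}

// Guard against the most common error: collecting the wrong week.
function isTargetWeek(title, expectedStart) {
  const parsed = parseWeekTitle(title);
  return parsed !== null && parsed.start === expectedStart;
}
```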

Step 4: Extract Submitted Members Data

  • Same page may show multiple weeks — only extract rows belonging to target week
  • Paginate through all rows for the target week
  • Append to local file after each extraction (prevents data loss)

Step 5: Get Unsubmitted List

"Not submitted: N" button has no snapshot ref. Click via JS evaluate:

(() => {
  // Match buttons whose text looks like "Not submitted: 12".
  const btns = [...document.querySelectorAll('button')].filter(
    (b) => /Not submitted.*\d/.test(b.innerText)
  );
  if (btns.length) { btns[0].click(); return 'clicked'; }
  return 'not found';
})()

Dialog shows: unsubmitted count + names + departments.

Step 6: Create Lark Doc

Create document via Lark Open API (see lark-api skill for auth).

block_type reference (verified):

block_type   Type             JSON field
2            Text             "text"
3            Heading 1        "heading1"
4            Heading 2        "heading2"
5            Heading 3        "heading3"
12           Bullet list ✅   "bullet"
13           Ordered list ✅  "ordered"
22           Divider          "divider"

❌ 9=heading7, 10=heading8. NOT lists!
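A sketch of building block payloads with the codes above. The element shape (`elements`/`text_run`) follows the Lark docx block format as I understand it; verify it against the lark-api skill before relying on it.

```javascript
// Verified block_type codes from the table above.
const BLOCK_TYPES = {
  text: 2, heading1: 3, heading2: 4, heading3: 5,
  bullet: 12, ordered: 13, divider: 22,
};

// Build one block for a create-children request body.
function makeBlock(kind, content) {
  const block = { block_type: BLOCK_TYPES[kind] };
  // The JSON field name matches the kind: "bullet", "ordered", "text", ...
  block[kind] = kind === 'divider'
    ? {}
    : { elements: [{ text_run: { content } }] };
  return block;
}
```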

Step 7: Send Notification

Send message via Lark API with doc link.
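A sketch of the notification body for the Lark IM API (POST /open-apis/im/v1/messages with a receive_id_type query parameter); the chat id and message wording are placeholders, and auth is assumed to come from the lark-api skill.

```javascript
// For msg_type "text", content must itself be a JSON-encoded string.
function makeNotifyPayload(chatId, docUrl) {
  return {
    receive_id: chatId, // interpreted according to receive_id_type
    msg_type: 'text',
    content: JSON.stringify({ text: `Weekly report summary: ${docUrl}` }),
  };
}
```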

Lessons Learned (6 real attempts)

#   Result                 Root Cause                                  Lesson
1   ❌ Self-killed          Sub-agent ran gateway restart               Never restart gateway in sub-agent
2   ⚠️ Wrong week           Collected Feb 10-14 instead of Feb 3-7      Always confirm week title after pagination
3   ❌ 200K tokens burned   Tried curl on SPA                           Reports is a SPA; browser only
4   ❌ 200K tokens burned   Sub-agent explored on its own               Give exact URLs and steps
5   ✅ Success              Precise instructions + correct block_types  Template is key

Known Limitations

  • Lark Report Open API unavailable on international version (returns 404) — browser only
  • Browser login session may expire — re-login needed
  • Export button (Excel) untested — potential alternative
