Skill flagged — suspicious patterns detected

ClawHub Security flagged this skill as suspicious. Review the scan results before using.

Batch Processor 1.0.0


Process multiple documents in bulk with parallel execution

467 downloads · 0 current · 0 all-time
by Leo Wing (@leowing)

Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for leowing/batch-processor-1-0-0.

Prompt Preview: Install & Setup
Install the skill "Batch Processor 1.0.0" (leowing/batch-processor-1-0-0) from ClawHub.
Skill page: https://clawhub.ai/leowing/batch-processor-1-0-0
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Canonical install target

openclaw skills install leowing/batch-processor-1-0-0

ClawHub CLI

Package manager switcher

npx clawhub@latest install batch-processor-1-0-0
Security Scan

VirusTotal: Suspicious (View report →)
OpenClaw: Benign (high confidence)
Purpose & Capability
The name/description (bulk document processing) matches the SKILL.md contents: design patterns, Python code examples using concurrent.futures, checkpointing, and suggested document libraries. The listed pip packages (python-docx, openpyxl, python-pptx, reportlab, jinja2) are relevant for converting and manipulating office files and PDFs.
Instruction Scope
Instructions stick to processing user-provided files, progress tracking, and checkpointing. The examples read and write files within the working directory (e.g., checkpoint.json) which is expected for a batch processor. There are no instructions to exfiltrate data, call unrelated external services, or access system credentials. A minor note: the prose allows the agent discretion ('I'll execute the appropriate operations')—this is normal for instruction skills but means the agent will need permission to run file operations and code execution.
Install Mechanism
This is an instruction-only skill (no install spec). The SKILL.md suggests installing Python packages via pip, which is proportionate. Minor inconsistency: the examples use tqdm for progress bars but tqdm is not included in the pip install command—add it if you expect CLI progress bars. No downloads from unknown hosts or archive extraction are present.
Credentials
The skill requests no environment variables, credentials, or config paths. The only file artifact is a local checkpoint (checkpoint.json) created in the working directory — this is appropriate for resume-safe batch jobs.
Persistence & Privilege
The skill's always flag is false, and it does not request permanent presence or elevated privileges. It writes and reads its own checkpoint file but does not modify other skills or system-wide settings.
Assessment
This skill appears coherent for bulk document processing. Before installing or running: (1) run pip installs in a virtualenv and add tqdm if you want progress bars; (2) ensure the agent/process you give this skill has only the file-system access it needs (it will read input files and create checkpoint.json in the working directory); (3) validate or scan input files (malicious documents can exploit processors); (4) if you plan autonomous runs, restrict the agent's file-path scope to prevent unexpected file access. If you need stronger guarantees, ask the author for explicit input validation and a minimal dependency list.
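One way to honor the path-scope advice above is a small guard that rejects any input resolving outside an allowed root before it reaches the processor. This is a standalone sketch, not part of the skill; is_within and the example paths are illustrative:

```python
from pathlib import Path

def is_within(base: Path, candidate: Path) -> bool:
    """True if candidate resolves inside base (blocks ../ escapes and symlink tricks)."""
    return candidate.resolve().is_relative_to(base.resolve())

# Drop any file that escapes the job directory before processing it
inputs = [Path("/tmp/jobs/a.pdf"), Path("/tmp/jobs/../etc/passwd")]
allowed = [p for p in inputs if is_within(Path("/tmp/jobs"), p)]
```

Path.is_relative_to requires Python 3.9+; on older versions, compare against candidate.parents instead.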

Like a lobster shell, security has layers — review code before you run it.

latest: vk97fd7sxy3a20mhf249n9cnjwd826m8q
467 downloads
0 stars
1 version
Updated 7h ago
v1.0.0
MIT-0

Batch Processor Skill

Overview

This skill enables efficient bulk processing of documents: convert, transform, extract, or analyze hundreds of files with parallel execution and progress tracking.

How to Use

  1. Describe what you want to accomplish
  2. Provide any required input data or files
  3. I'll execute the appropriate operations

Example prompts:

  • "Convert 100 PDFs to Word documents"
  • "Extract text from all images in a folder"
  • "Batch rename and organize files"
  • "Mass update document headers/footers"

Domain Knowledge

Batch Processing Patterns

Input: [file1, file2, ..., fileN]
         │
         ▼
    ┌─────────────┐
    │  Parallel   │  ← Process multiple files concurrently
    │  Workers    │
    └─────────────┘
         │
         ▼
Output: [result1, result2, ..., resultN]

Python Implementation

from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path
from tqdm import tqdm

def process_file(file_path: Path) -> dict:
    """Process a single file."""
    # Your processing logic here
    return {"path": str(file_path), "status": "success"}

def batch_process(input_dir: str, pattern: str = "*.*", max_workers: int = 4):
    """Process all matching files in directory."""
    
    files = list(Path(input_dir).glob(pattern))
    results = []
    
    with ProcessPoolExecutor(max_workers=max_workers) as executor:
        futures = {executor.submit(process_file, f): f for f in files}
        
        for future in tqdm(as_completed(futures), total=len(files)):
            file = futures[future]
            try:
                result = future.result()
                results.append(result)
            except Exception as e:
                results.append({"path": str(file), "error": str(e)})
    
    return results

# Usage (the __main__ guard is required for ProcessPoolExecutor on
# platforms that spawn worker processes, e.g. Windows and macOS)
if __name__ == "__main__":
    results = batch_process("/documents/invoices", "*.pdf", max_workers=8)
    print(f"Processed {len(results)} files")

Error Handling & Resume

import json
from pathlib import Path

class BatchProcessor:
    def __init__(self, checkpoint_file: str = "checkpoint.json"):
        self.checkpoint_file = checkpoint_file
        self.processed = self._load_checkpoint()
    
    def _load_checkpoint(self):
        if Path(self.checkpoint_file).exists():
            with open(self.checkpoint_file) as f:
                return json.load(f)
        return {}
    
    def _save_checkpoint(self):
        with open(self.checkpoint_file, "w") as f:
            json.dump(self.processed, f)
    
    def process(self, files: list, processor_func):
        for file in files:
            if str(file) in self.processed:
                continue  # Skip already processed
            
            try:
                result = processor_func(file)
                self.processed[str(file)] = {"status": "success", **result}
            except Exception as e:
                self.processed[str(file)] = {"status": "error", "error": str(e)}
            
            self._save_checkpoint()  # Resume-safe
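The resume semantics can be seen in a toy run. This standalone sketch mirrors the class above without depending on it; flaky and the file names are illustrative, and the checkpoint goes to a throwaway directory:

```python
import json
import tempfile
from pathlib import Path

# Throwaway checkpoint location so the sketch is side-effect free
checkpoint = Path(tempfile.mkdtemp()) / "checkpoint.json"
processed = json.loads(checkpoint.read_text()) if checkpoint.exists() else {}

def flaky(name):
    """Stand-in processor: fails on one file to show error capture."""
    if name == "bad.pdf":
        raise ValueError("corrupt file")
    return {"pages": 3}

for f in ["a.pdf", "bad.pdf", "a.pdf"]:  # the repeated a.pdf is skipped
    if f in processed:
        continue
    try:
        processed[f] = {"status": "success", **flaky(f)}
    except Exception as e:
        processed[f] = {"status": "error", "error": str(e)}
    checkpoint.write_text(json.dumps(processed))  # saved after every file
```

If the loop is interrupted and rerun, every file already recorded in checkpoint.json is skipped, which is what makes long jobs resumable.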

Best Practices

  1. Use progress bars (tqdm) for user feedback
  2. Implement checkpointing for long jobs
  3. Set reasonable worker counts (CPU cores)
  4. Log failures for later review
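For practice 3, a common heuristic clamps the worker count to the available CPU cores. A sketch assuming CPU-bound work (pick_workers is an illustrative helper, not part of the skill):

```python
import os

def pick_workers(requested=None):
    """Clamp worker count to available cores; default to cores - 1, minimum 1."""
    cores = os.cpu_count() or 1
    if requested is None:
        return max(1, cores - 1)  # leave one core for the main process
    return max(1, min(requested, cores))
```

For I/O-bound work (e.g. downloading before converting), counts above the core count can pay off; this clamp targets CPU-heavy document processing.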

Installation

# Install required dependencies
pip install python-docx openpyxl python-pptx reportlab jinja2
