OpenClaw SEO Content Writer

v1.0.4

Write and ship SEO blog posts with the Tony + Peter workflow, publish QA, deploy verification, and GSC indexing. (SEO blog writing / publishing / deployment / Google indexing)


Install

OpenClaw Prompt Flow

Install with OpenClaw

Best for remote or guided setup. Copy the exact prompt, then paste it into OpenClaw for x-rayluan/openclaw-seo-content-writer.

Prompt preview: Install & Setup
Install the skill "OpenClaw SEO Content Writer" (x-rayluan/openclaw-seo-content-writer) from ClawHub.
Skill page: https://clawhub.ai/x-rayluan/openclaw-seo-content-writer
Keep the work scoped to this skill only.
After install, inspect the skill metadata and help me finish setup.
Use only the metadata you can verify from ClawHub; do not invent missing requirements.
Ask before making any broader environment changes.

Command Line

CLI Commands

Use the direct CLI path if you want to install manually and keep every step visible.

OpenClaw CLI

Bare skill slug

openclaw skills install openclaw-seo-content-writer

ClawHub CLI


npx clawhub@latest install openclaw-seo-content-writer
Security Scan
VirusTotal
Benign
OpenClaw
Benign (high confidence)
Purpose & Capability
The name and description (Tony + Peter SEO pipeline with drafting, deploy, and GSC submission) match the content of SKILL.md and reference files. The skill is a workflow guide and does not claim to perform actions that would require unexpected access (e.g., cloud-wide admin rights). References to deployment pipelines, SSH/CLI, and Google Search Console are expected for the declared purpose.
Instruction Scope
SKILL.md stays within the SEO production domain: drafting, preflight, deploy verification, and GSC submission. It does assume presence of tooling and credentials (SSH/CLI to trigger deploys, a verified GSC property, Google Cloud project with Indexing/Search Console APIs enabled, and a service account JSON or OAuth credentials) but does not include instructions that read unrelated system files or request secret exfiltration. Because the skill is a guide, it delegates implementation (e.g., how to use the Indexing API) to the operator — review any implementation you build from this guide to ensure it handles secrets safely.
Install Mechanism
There is no install spec, and no code files are written to disk at install time; this is low-risk. The registry metadata and package contents are instruction-only documentation and references.
Credentials
The skill does not declare required env vars or ship credentials, which is appropriate for a documentation-only skill. It explicitly expects that the operator provides Google service account/OAuth credentials and deployment access in their environment — these are proportionate to performing GSC submissions and deployments. Users should ensure any service account used has least-privilege (only GSC/Indexing API and necessary site permissions) and that keys are stored/used securely rather than exposed to agents or untrusted processes.
Persistence & Privilege
The always flag is false, and the skill does not request persistent presence or modify other skills or system-wide settings. Autonomous invocation is allowed (platform default), but the skill itself only contains guidance; the real risk depends on the implementation you attach to this workflow.
Assessment
This is a documentation-only workflow for producing and shipping SEO blog posts — it does not install code or require environment variables itself. Before using it operationally: (1) review any scripts or automations you build from this guide to ensure they do not leak credentials; (2) provision a Google service account with the minimum permissions needed for Search Console/Indexing APIs and avoid using overly broad keys; (3) restrict any SSH/CI credentials used to deploy to the minimum required scope and test in a staging environment first; (4) ensure receipts/artifacts do not contain sensitive keys; and (5) if you connect this workflow to an agent that can act autonomously, limit the agent's access to secrets and monitor activity. If you want a higher-assurance check, provide the implementation scripts (deploy hooks, GSC submission code) for review so those pieces can be evaluated as well.

Like a lobster shell, security has layers — review code before you run it.

latest: vk978pcfmrpz9dhwkcs2gf0vnhh841mas
127 downloads · 0 stars · 5 versions
Updated 3w ago
v1.0.4
MIT-0

OpenClaw SEO Content Writer

Use this skill to run a full SEO blog production lane instead of treating “write a blog post” as only a drafting task.

Core model

Split ownership clearly:

  • Tony owns draft generation, safe-template normalization, preflight, content-quality gating, source publish readiness, and artifact receipts.
  • Peter owns deployment, live verification, indexability checks, Google Search Console submission, and indexing-status receipts.

Never collapse these into one fuzzy “content done” state.

Required workflow

  1. Gather same-day inputs
    • keyword / topic target
    • audience + search intent
    • Hunter research / evidence / proof links
    • brand positioning + CTA source
  2. Write the draft batch
    • Use a stable SEO skeleton, not freeform prose.
    • Ensure every draft answers the query directly and has enough structure to survive QA.
  3. Normalize through safe template
    • Stamp template metadata.
    • Ensure every draft contains a quick answer, TL;DR, scannable sections, FAQ, conclusion, CTA, and source/proof notes.
  4. Run publish gates
    • structural preflight
    • content-quality audit
  5. If not publishable, recover correctly
    • Do not pad with filler.
    • Call Hunter for bounded same-day recovery research first.
    • Only downgrade or replace the topic if Hunter also cannot make it longform-worthy.
  6. Source publish
    • Only after preflight + quality audit pass.
    • Write a structured source-publish receipt.
  7. Peter closeout
    • deploy production
    • verify live URL and /blog
    • verify canonical / sitemap / noindex / discovery
    • write deployment + indexability receipts
  8. Search Console follow-through
    • submit only INDEX_READY URLs
    • write GSC submission receipt
    • check indexing status later and write GSC status receipt
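The eight steps above form a strict progression: each gate must pass before the next one is attempted. A minimal sketch of that ordering in TypeScript; the state names other than INDEX_READY are illustrative, not part of the skill:

```typescript
// Truth states of the lane, in gate order. Only INDEX_READY comes
// from the skill text; the rest are invented labels for this sketch.
type LaneState =
  | "DRAFTED"
  | "NORMALIZED"
  | "PREFLIGHT_PASS"
  | "QUALITY_PASS"
  | "SOURCE_PUBLISHED"
  | "DEPLOYED"
  | "LIVE_VERIFIED"
  | "INDEX_READY"
  | "GSC_SUBMITTED";

const ORDER: LaneState[] = [
  "DRAFTED", "NORMALIZED", "PREFLIGHT_PASS", "QUALITY_PASS",
  "SOURCE_PUBLISHED", "DEPLOYED", "LIVE_VERIFIED", "INDEX_READY",
  "GSC_SUBMITTED",
];

// A post may only advance one gate at a time; skipping a gate
// (e.g. GSC submission before INDEX_READY) is rejected.
function advance(current: LaneState, next: LaneState): LaneState {
  const i = ORDER.indexOf(current);
  if (ORDER.indexOf(next) !== i + 1) {
    throw new Error(`cannot jump from ${current} to ${next}`);
  }
  return next;
}
```

Encoding the lane this way makes "content done" impossible to fake: there is no single done flag, only the furthest gate actually passed.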

Non-negotiable rules

  • No filler padding. If word count is thin, add real information or trigger research recovery.
  • No publish without receipts. Preflight, quality audit, source publish, deploy, and indexability should all leave artifacts.
  • No “live” claims without Peter verification. Code-ready is not live.
  • No GSC submission before indexability passes. Live + self-canonical + sitemap + /blog discovery must be true first.
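The fourth rule is a conjunction of four independent checks, and all four must hold before a URL is eligible for submission. A hedged sketch; the field names are assumptions about how you might record Peter's verification results:

```typescript
// The four indexability conditions from the rule above, as one record.
interface IndexabilityChecks {
  live: boolean;          // URL verified responding on production
  selfCanonical: boolean; // canonical tag points at the URL itself
  inSitemap: boolean;     // URL present in the generated sitemap
  blogDiscovery: boolean; // linked from the /blog listing
}

// A URL is submit-eligible only when every check passed.
function isIndexReady(c: IndexabilityChecks): boolean {
  return c.live && c.selfCanonical && c.inSitemap && c.blogDiscovery;
}
```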

Minimum draft contract

Every publish candidate should have:

  • direct answer / quick answer
  • TL;DR
  • 4+ substantial H2 sections
  • comparison table or clear scannable structure
  • FAQ with at least 4 real questions
  • conclusion
  • CTA
  • source / proof notes
  • enough specificity, proof, and search-intent coverage to be publishable
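Most of this contract can be checked mechanically before any human review. A rough structural audit, assuming a markdown draft with `## ` section headings and question-style FAQ headings; your template may differ, so treat the patterns as placeholders:

```typescript
// Returns a list of structural failures; an empty list means the
// draft meets the mechanical part of the contract. Substance checks
// (specificity, proof, intent coverage) still need a human or LLM pass.
function auditDraft(markdown: string): string[] {
  const failures: string[] = [];
  const h2s = markdown.match(/^## .+$/gm) ?? [];
  if (h2s.length < 4) failures.push("fewer than 4 H2 sections");
  if (!/tl;?dr/i.test(markdown)) failures.push("missing TL;DR");
  if (!/^## faq/im.test(markdown)) failures.push("missing FAQ section");
  // Count question-style headings as FAQ entries.
  const questions = markdown.match(/^#{2,3} .+\?$/gm) ?? [];
  if (questions.length < 4) failures.push("fewer than 4 FAQ questions");
  return failures;
}
```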

Recovery logic

If a draft batch fails because it is thin, weakly evidenced, or not publishable:

  1. classify the problem
  2. call Hunter for more research
  3. rewrite once with stronger evidence / FAQ material / angles
  4. rerun gates
  5. only then downgrade or replace the topic if still too weak
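The loop above is deliberately bounded: research once, rerun the gates, and only then downgrade. A sketch of that control flow; the callback names are illustrative, not part of the skill:

```typescript
// One research-backed rewrite attempt, then downgrade. The callbacks
// stand in for your own gate runner, Hunter recovery, and topic swap.
function recoverDraft(
  runGates: () => boolean,
  researchAndRewrite: () => void,
  downgradeTopic: () => void,
): "published" | "downgraded" {
  if (runGates()) return "published";
  researchAndRewrite();          // bounded Hunter recovery, exactly once
  if (runGates()) return "published";
  downgradeTopic();              // only after recovery also failed
  return "downgraded";
}
```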

Receipts to maintain

Use the same style of structured receipts for every lane:

  • preflight receipt
  • content-quality audit receipt
  • source-publish receipt
  • deployment receipt
  • blog indexability receipt
  • GSC submission receipt
  • GSC index-status receipt
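One way to keep receipts uniform across all seven lanes is a single shape with a status derived from the checks it records. The fields below are illustrative, not the skill's required schema:

```typescript
// A shared receipt shape; status is derived, never hand-set.
interface Receipt {
  lane: string;        // e.g. "preflight", "deploy", "gsc-submission"
  subject: string;     // URL or draft path the receipt is about
  status: "PASS" | "FAIL";
  checks: Record<string, boolean>;
  timestamp: string;   // ISO 8601
}

function writeReceipt(
  lane: string,
  subject: string,
  checks: Record<string, boolean>,
): Receipt {
  return {
    lane,
    subject,
    status: Object.values(checks).every(Boolean) ? "PASS" : "FAIL",
    checks,
    timestamp: new Date().toISOString(),
  };
}
```

Deriving status from the checks keeps receipts honest: a lane cannot claim PASS without every underlying check being true.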

References

Read these files when needed:

  • references/tony-pipeline.md — drafting, gating, recovery, and source publish flow
  • references/peter-closeout.md — deploy, live verification, indexability, and closure rules
  • references/gsc-indexing.md — Search Console submission and indexing-status workflow
  • references/receipt-contracts.md — receipt expectations and truth states

Prerequisites

This skill is a workflow guide, not a standalone tool. It assumes you already have:

Deployment

  • A deployment pipeline for your site (e.g. Vercel, Netlify, Cloudflare Pages, or a custom CI/CD setup)
  • SSH or CLI access to trigger deploys
  • A working sitemap generator (e.g. Next.js sitemap.ts, or equivalent)

Google Search Console

  • A verified GSC property for your domain (sc-domain: or URL-prefix)
  • Google Cloud project with the Indexing API and Search Console API enabled
  • A service account JSON key with GSC permissions, or OAuth credentials
  • googleapis npm package or gcloud CLI available in your environment
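For the submission itself, the Indexing API expects a URL notification body. A sketch of building and validating that body before any network call; with the googleapis package, the returned object would be passed as requestBody to indexing.urlNotifications.publish() (that setup is assumed, not shipped by this skill):

```typescript
// Indexing API v3 notification types.
type NotificationType = "URL_UPDATED" | "URL_DELETED";

// Validates the URL and builds the notification body. Only live,
// INDEX_READY URLs should ever reach this point.
function buildNotification(
  url: string,
  type: NotificationType = "URL_UPDATED",
) {
  const u = new URL(url); // throws on malformed input
  if (u.protocol !== "https:") {
    throw new Error("submit only canonical https URLs");
  }
  return { url: u.toString(), type };
}
```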

Content pipeline

  • A content source directory or CMS where drafts are stored
  • A safe-template normalization script or convention (this skill describes the contract, not the implementation)
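Since the skill describes the normalization contract but not the implementation, here is one hypothetical shape the template-metadata stamp could take, using markdown frontmatter; every field name here is invented for illustration:

```typescript
// Stamps (or restamps) frontmatter metadata on a draft during
// safe-template normalization. Field names are illustrative only.
function stampTemplateMeta(markdown: string, templateVersion: string): string {
  const meta = [
    "---",
    "template: safe-seo-skeleton",
    `templateVersion: ${templateVersion}`,
    `normalizedAt: ${new Date().toISOString()}`,
    "---",
  ].join("\n");
  // Replace an existing frontmatter block, or prepend one.
  return /^---\n[\s\S]*?\n---/.test(markdown)
    ? markdown.replace(/^---\n[\s\S]*?\n---/, meta)
    : `${meta}\n\n${markdown}`;
}
```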

Optional but recommended

  • Tavily or web search tool for Hunter research recovery
  • A receipt/artifact directory convention (e.g. mission-control/data/)

No credentials are bundled

This skill does not ship any API keys, service accounts, or deployment configs. All credentials must be configured in your own environment before use.
