Hugging Face CLI

v1.1.0

Manage the Hugging Face Hub via the hf CLI. Use when working with HF AI models, datasets, spaces, or repos.

by Yevhen Diachenko (@yevhendiachenko0)
License: MIT-0

Security Scan

VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
The name and description (Hugging Face CLI) align with the declared requirements: the skill needs the hf binary and an HF_TOKEN. Both are expected and proportional to managing Hub models, datasets, repos, spaces, and jobs.
Instruction Scope
SKILL.md is an instruction-only skill that lists many hf commands (including create/delete repo, delete buckets, deploy endpoints, and run jobs). The instructions do not ask the agent to read unrelated files or environment variables, but they do advise persisting HF_TOKEN in shell profiles and enumerate destructive operations, so the user or agent must limit which commands are run and prefer least-privilege tokens for non-write tasks.
Install Mechanism
No install spec is embedded in the skill (lowest risk). The doc suggests installing via pip or Homebrew (official, expected methods). Nothing in the skill attempts to download arbitrary code or write files.
Credentials
Only HF_TOKEN is requested, which is appropriate. However, a write-scoped HF_TOKEN grants broad power (create/delete/upload, manage endpoints, run jobs). The README correctly distinguishes read and write scopes; we recommend read-only tokens for exploration and minimal-scope tokens for everything else.
Persistence & Privilege
The skill's always flag is false, and it does not request system-wide config changes. Autonomous invocation is allowed (the platform default); combined with a write token this increases the blast radius, so token scope matters.
Assessment
This skill is what it says: a wrapper around the official hf CLI. Before installing or enabling it:

  1. Only provide HF_TOKEN (no other credentials are needed). Use a read-scoped token if you only need to browse/download; use write-scoped tokens only when necessary.
  2. Be cautious persisting the token in shared shell profiles; prefer per-session or least-privilege tokens.
  3. Review any hf commands the agent plans to run (some are destructive: delete repo, delete buckets, upload, deploy).
  4. Install hf from official sources (pip install "huggingface_hub[cli]" or Homebrew).
  5. To limit risk, disable autonomous invocation for this skill or supply a read-only token while exploring.



License

MIT-0
Free to use, modify, and redistribute. No attribution required.

Runtime requirements

🤗 Clawdis
Bins: hf
Env: HF_TOKEN
Primary env: HF_TOKEN

SKILL.md

Hugging Face CLI

Hugging Face (https://huggingface.co) is the leading platform for sharing and collaborating on AI models, datasets, and spaces. This skill enables interaction with the Hub through the official hf CLI.

Installation

Check if hf is available by running hf version. If not installed:

pip install -U "huggingface_hub[cli]"
# or
brew install hf

If the options above do not work, follow the official installation guide.

After installation, run hf version to verify. If the command is not found, run source ~/.bashrc (or source ~/.zshrc for zsh) to reload the PATH, then try again.
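The check-then-install flow above can be scripted defensively. This is a minimal sketch: it only prints the pip hint from this section rather than installing automatically, so it is safe to run anywhere.

```shell
# Check for the hf CLI on PATH; print the install hint if it is missing.
if command -v hf >/dev/null 2>&1; then
  hf version
else
  echo 'hf not found; run: pip install -U "huggingface_hub[cli]"'
fi
```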

Authentication

A Hugging Face User Access Token is required. The token is provided via the HF_TOKEN environment variable.

If authentication fails or the token is missing, instruct the user to:

  1. Go to https://huggingface.co/settings/tokens
  2. Create a new token — there are two permission levels:
    • Read (safer): sufficient for searching, downloading models/datasets, listing repos, browsing papers, and most read-only operations. Choose this if you only need to explore and download.
    • Write (less safe, broader access): required for creating/deleting repos, uploading files, managing discussions, deploying endpoints, and running jobs. Example 3 (create a repo and upload weights) requires a write token.
  3. Set it as an environment variable: export HF_TOKEN="hf_..." (add to shell profile for persistence)

Important: Do NOT run hf auth login interactively — it requires terminal input. Instead, use the environment variable directly. The hf CLI automatically picks up HF_TOKEN from the environment for all commands. To verify authentication, run:

hf auth whoami
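Putting the steps together, a minimal non-interactive setup looks like the sketch below. The token value shown is a placeholder, not a real token; substitute the one you created in the settings page, preferring a read scope.

```shell
# Placeholder value for illustration only; paste the real token from
# https://huggingface.co/settings/tokens (prefer a read-scoped token).
export HF_TOKEN="hf_xxxxxxxxxxxxxxxxxxxx"

# Every hf command now authenticates via the environment; verify with:
# hf auth whoami
```

Add the export line to your shell profile only if you have weighed the persistence caveats above.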

Key Commands

Check current user: hf auth whoami
Download files: hf download <repo_id> [files...] [--local-dir <path>]
Download specific revision: hf download <repo_id> --revision <branch|tag|commit>
Download with filters: hf download <repo_id> --include "*.safetensors" --exclude "*.bin"
Upload files: hf upload <repo_id> <local_path> [path_in_repo]
Upload as PR: hf upload <repo_id> <local_path> [path_in_repo] --create-pr
Upload (private repo): hf upload <repo_id> <local_path> [path_in_repo] --private
Upload large folder: hf upload-large-folder <repo_id> <local_path>
Create a repo: hf repos create <name> [--repo-type model|dataset|space] [--private]
Delete a repo: hf repos delete <repo_id>
Delete files from repo: hf repos delete-files <repo_id> <path>...
Duplicate a repo: hf repos duplicate <repo_id> [--type model|dataset|space]
Repo settings: hf repos settings <repo_id> [--private|--public]
Manage branches: hf repos branch create|delete <repo_id> <branch>
Manage tags: hf repos tag create|delete <repo_id> <tag>
List models: hf models ls [--search <query>] [--sort downloads] [--limit N]
Model info: hf models info <repo_id>
List datasets: hf datasets ls [--search <query>]
Dataset info: hf datasets info <repo_id>
Run SQL on data: hf datasets sql "<SQL>"
List spaces: hf spaces ls [--search <query>]
Space info: hf spaces info <repo_id>
Space dev mode: hf spaces dev-mode <repo_id>
List papers: hf papers ls [--limit N]
List collections: hf collections ls [--owner <user>] [--sort trending]
Create collection: hf collections create "<title>"
Collection info: hf collections info <collection_slug>
Add to collection: hf collections add-item <collection_slug> <repo_id> <type>
Delete collection: hf collections delete <collection_slug>
Run a cloud job: hf jobs run <docker_image> <command>
List jobs: hf jobs ps
Job logs: hf jobs logs <job_id>
Cancel a job: hf jobs cancel <job_id>
Job hardware: hf jobs hardware
Deploy endpoint: hf endpoints deploy <name> --repo <repo_id> --framework <fw> --accelerator <hw> ...
List endpoints: hf endpoints ls
Endpoint info: hf endpoints describe <name>
Pause/resume endpoint: hf endpoints pause|resume <name>
Delete endpoint: hf endpoints delete <name>
List discussions: hf discussions ls <repo_id>
Create discussion: hf discussions create <repo_id> --title "<title>"
Comment on discussion: hf discussions comment <repo_id> <num> --body "<text>"
Close discussion: hf discussions close <repo_id> <num>
Merge PR: hf discussions merge <repo_id> <num>
Manage cache: hf cache ls, hf cache rm <id>, hf cache prune
Delete bucket / files: hf buckets delete <user>/<bucket>, hf buckets rm <user>/<bucket>/<path>
Sync to bucket: hf sync <local_path> hf://buckets/<user>/<bucket>
Print environment: hf env
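To make the download filter flags concrete, here is a sketch that pulls only the safetensors weights from a public repo. The local directory name is our choice, and the download is skipped cleanly when the hf CLI is not installed.

```shell
# Download only *.safetensors weights, skipping legacy *.bin duplicates.
if command -v hf >/dev/null 2>&1; then
  hf download openai-community/gpt2 \
    --include "*.safetensors" --exclude "*.bin" --local-dir ./gpt2
  status="downloaded to ./gpt2"
else
  status="skipped: hf CLI not installed"
fi
echo "$status"
```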

End-to-End Examples

Example 1: Explore trending models, pick one, and preview a download

hf models ls --sort trending_score --limit 5
hf models info openai-community/gpt2
hf download --dry-run openai-community/gpt2 config.json tokenizer.json
hf download openai-community/gpt2 config.json tokenizer.json --local-dir ./gpt2

Example 2: Browse today's papers and find related datasets

hf papers ls --limit 5
hf datasets ls --search "code" --sort downloads --limit 5
hf datasets info bigcode/the-stack

Example 3: Create a private model repo and upload weights

hf repos create my-fine-tuned-model --private
# create returns <your-username>/my-fine-tuned-model — use that full ID below
hf upload <username>/my-fine-tuned-model ./output --commit-message "Add fine-tuned weights"
hf repos tag create <username>/my-fine-tuned-model v1.0 -m "Initial release"

Further Reference

Reference version: hf CLI v1.x

For the full list of commands and options, use built-in help:

hf --help
hf <command> --help

Safety Rules

  • Destructive commands require explicit user confirmation. Before running any of the following, describe what will happen and ask the user to confirm:
    • hf repos delete — permanently deletes a repository
    • hf repos delete-files — deletes files from a repository
    • hf buckets delete / hf buckets rm — deletes buckets or bucket files
    • hf discussions close / hf discussions merge — closes or merges PRs/discussions
    • hf collections delete — permanently deletes a collection
    • hf endpoints delete — permanently deletes an Inference Endpoint
    • hf jobs cancel — cancels a running compute job
    • Any command with --delete flag (e.g., sync with deletion)
    • hf cache rm / hf cache prune — removes cached data from disk (re-downloadable, but may waste bandwidth)
  • Never expose or log the HF_TOKEN value. Do not include it in command output or commit it to files.
  • When uploading, warn the user if the target repo is public and the upload may contain sensitive data.
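When destructive commands are scripted rather than typed, the confirmation rule above can be enforced with a small guard. This is a sketch: confirm() is our own helper, not part of the hf CLI, and the repo id is hypothetical. The default answer is "no", so an empty or interrupted prompt never deletes anything.

```shell
# Ask before running a destructive command; anything but y/Y aborts.
confirm() {
  printf '%s [y/N] ' "$1" >&2
  read -r reply
  [ "$reply" = "y" ] || [ "$reply" = "Y" ]
}

if confirm "Permanently delete repo my-user/old-model?"; then
  hf repos delete my-user/old-model
else
  echo "Aborted."
fi
```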
