{"skill":{"slug":"semantic-grep","displayName":"Semantic Grep","summary":"Local, offline semantic code search that indexes and searches code by meaning using embeddings, with llama.cpp, ONNX, or Ollama backends.","tags":{"latest":"1.0.0"},"stats":{"comments":0,"downloads":103,"installsAllTime":0,"installsCurrent":0,"stars":0,"versions":1},"createdAt":1774304320419,"updatedAt":1774304815150},"latestVersion":{"version":"1.0.0","createdAt":1774304320419,"changelog":"Initial release of semgrepll: local, offline semantic code search.\n\n- Index and search code projects by meaning using embeddings, with commands such as `semgrep index` and `semgrep search`.\n- Supports multiple offline backends: llama.cpp, ONNX, and Ollama.\n- No external API calls; works fully offline and auto-selects the fastest available backend.\n- Embeddings are cached to speed up re-indexing.\n- Simple project management: list or remove indexed projects with `semgrep ls` and `semgrep rm`.\n- Requires Python 3.10+; install with optional ONNX support.","license":"MIT-0"},"metadata":null,"owner":{"handle":"rizperdana","userId":"s17d026c7k8mc7catd2md1zvdd83f1kt","displayName":"rizperdana","image":"https://avatars.githubusercontent.com/u/11896928?v=4"},"moderation":{"isSuspicious":true,"isMalwareBlocked":false,"verdict":"suspicious","reasonCodes":["suspicious.llm_suspicious","suspicious.vt_suspicious"],"summary":"Detected: suspicious.llm_suspicious, suspicious.vt_suspicious","engineVersion":"v2.2.0","updatedAt":1774304815150}}