suspicious.exposed_secret_literal
- Location
- scripts/dbw_client.py:218
- Finding
- File appears to expose a hardcoded API secret or token.
Advisory
Audited by static analysis on May 10, 2026.
Detected: suspicious.exposed_secret_literal
Artifact-based informational review of SKILL.md, metadata, install specs, static scan signals, and capability signals. ClawScan does not execute the skill or run runtime probes.
The skill can act using the database/cloud privileges of the configured credentials, including listing instances, querying metadata/data, and creating change tickets.
The skill loads cloud/API credentials from the environment and .env. This is expected for a database cloud integration, but it gives the skill delegated access to the user's database/workbench account and is not reflected in the registry credential declarations.
self.ak = ak or os.environ.get("VOLCENGINE_ACCESS_KEY")
self.sk = sk or os.environ.get("VOLCENGINE_SECRET_KEY")
self.api_base = os.environ.get("ARK_SKILL_API_BASE")
self.api_key = os.environ.get("ARK_SKILL_API_KEY")

Use least-privilege credentials, scope them to the intended region/instances, and avoid providing production-wide keys unless you intend the skill to access them.
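A pre-flight check along these lines can enforce the least-privilege recommendation before the skill touches any credentials. This is a hedged sketch, not part of the skill: the `ALLOWED_REGION` variable and the `check_credentials` helper are assumptions for illustration; only the two `VOLCENGINE_*` variable names come from the snippet above.

```python
import os

# Hypothetical pre-flight check: fail fast if credentials are missing, and
# warn when broad keys are supplied without any declared scope.
# ALLOWED_REGION is an assumed convention, not part of the skill.
REQUIRED = ("VOLCENGINE_ACCESS_KEY", "VOLCENGINE_SECRET_KEY")

def check_credentials(env=os.environ):
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing credentials: {', '.join(missing)}")
    # No region scope declared alongside the keys: likely production-wide.
    if not env.get("ALLOWED_REGION"):
        print("warning: no ALLOWED_REGION set; keys may be production-wide")
```

Running such a check at startup turns a silent over-grant into an explicit, reviewable warning.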
Database request details may be routed through a configured gateway that users may not realize is in use.
When ARK_SKILL_API_BASE and ARK_SKILL_API_KEY are present, database API requests can be sent through an environment-defined API gateway with a bearer token. The visible SKILL.md mainly documents Volcengine credentials and does not clearly describe this gateway trust boundary.
url = f"{self.api_base.rstrip('/')}/?Action={action}&Version=2018-01-01"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {self.api_key}",
}

Before use, verify ARK_SKILL_API_BASE points to a trusted service, or unset the ARK gateway variables and use explicitly scoped Volcengine credentials.
Sensitive database query results or local files used for analysis can remain on disk and may be reused or exposed in later sessions.
The analyzer persists registered dataframes/files into a DuckDB database under /tmp and explicitly keeps the file after exit, with no visible retention limit or default cleanup.
DB_PATH = os.path.join(tempfile.gettempdir(), 'multi_source_analyzer.duckdb')
...
# Do not delete the file on exit; it can be reused next session
pass
...
self._duck_conn.execute(f"CREATE OR REPLACE TABLE {name} AS SELECT * FROM df")

Only analyze data you are comfortable storing locally, and delete /tmp/multi_source_analyzer.duckdb and generated workspace artifacts after sensitive analyses.
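The cleanup step the recommendation calls for can be scripted. A sketch under the assumption that the DuckDB path matches the `DB_PATH` shown in the snippet; the `cleanup_analyzer_state` helper is illustrative, not part of the skill:

```python
import os
import tempfile

# Same path construction as the analyzer snippet above.
DB_PATH = os.path.join(tempfile.gettempdir(), "multi_source_analyzer.duckdb")

def cleanup_analyzer_state(db_path: str = DB_PATH) -> bool:
    """Delete the persisted DuckDB file if present; return True if removed."""
    if os.path.exists(db_path):
        os.remove(db_path)
        return True
    return False
```

Running this after a sensitive analysis ensures query results registered into DuckDB do not linger into later sessions.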
The skill can perform powerful database operations, but the documented workflow attempts to prevent direct high-risk mutations.
The skill exposes SQL execution and database-change workflows, but the instructions explicitly constrain direct SQL execution to read-only operations and route writes through work orders.
execute_sql may only run read-only operations (SELECT, SHOW, EXPLAIN). You must never execute INSERT/UPDATE/DELETE/DDL through execute_sql... write operations must go through the work-order functions.
Review every generated SQL statement and any DML/DDL change ticket before approval, especially on production databases.
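A reviewer can also apply a mechanical first pass matching the documented rule. This is a minimal sketch of such a guard, not the skill's actual enforcement code; it only checks that every statement begins with SELECT, SHOW, or EXPLAIN:

```python
import re

# Allow only the verbs the skill's instructions permit for execute_sql.
READ_ONLY = re.compile(r"^\s*(SELECT|SHOW|EXPLAIN)\b", re.IGNORECASE)

def is_read_only(sql: str) -> bool:
    """True if every statement in the string starts with a read-only verb."""
    # Split on ';' so a write cannot hide behind a leading SELECT.
    statements = [s for s in sql.split(";") if s.strip()]
    return bool(statements) and all(READ_ONLY.match(s) for s in statements)
```

A prefix check like this is a screen, not a proof of safety (e.g. SELECT ... INTO can still write on some engines), so manual review of generated SQL remains necessary.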