Snowflake MCP Connection

Connect to the Snowflake Managed MCP server with Clawdbot or other MCP clients. Use when wiring Snowflake MCP endpoints, validating connectivity, or configuring Cortex AI services.

MIT-0 · Free to use, modify, and redistribute. No attribution required.
Security Scan
VirusTotal: Benign
OpenClaw: Benign (high confidence)
Purpose & Capability
Name, description, and all included files (SKILL.md, SQL examples, client setup, config template) consistently describe creating/using Snowflake MCP servers and configuring clients like Clawdbot. Nothing requested by the skill (no env vars, no binaries, no installs) is unrelated to that purpose.
Instruction Scope
Instructions are scoped to creating MCP servers, testing with curl, and configuring Clawdbot or a local MCP client. They do recommend placing PATs or credentials in config files (mcp.json, ~/.snowflake/connections.toml, service args) and invoking local commands, which is expected for the task but creates potential secret-exposure risks if users save tokens/passwords in project files or pass them on the command line.
Install Mechanism
There is no install spec and no code to install — the skill is instruction-only. One doc suggests installing a package manager 'uv' via brew or pip for a local MCP client; this is reasonable documentation rather than an opaque download-from-URL install.
Credentials
The skill does not request environment variables or platform credentials. It expects the user to create/use Snowflake PATs, key files, or passwords when configuring clients. This is proportionate to the purpose, but the guidance to store credentials in config files or pass passwords as CLI args is a security consideration (use least-privilege tokens, avoid checking into source control, prefer key-based auth or secret stores).
Persistence & Privilege
The skill does not request persistent presence (always:false) and is user-invocable. It does not attempt to modify other skills or system-wide settings.
Assessment
This skill is documentation and SQL examples for configuring Snowflake MCP servers and connecting Clawdbot — it does what it says. Before using:

  1. Do not paste PATs or passwords into public repos or project files (mcp.json, connections.toml); use a secrets manager or environment-specific secure storage.
  2. Avoid passing plaintext passwords on the command line where possible (use key-based auth or protected connection files).
  3. Grant the smallest Snowflake role/privileges necessary (avoid ACCOUNTADMIN unless required).
  4. When running a local MCP client (snowflake-labs-mcp), install software only from trusted sources and review the project first.
  5. Review sql_statement_permissions in the provided template to ensure DDL/DML/administrative statements are appropriately restricted.

If you need the agent to act autonomously with credentials present, be aware that giving the agent access to stored tokens increases risk — prefer manual steps or tightly scoped tokens.


Current version: v2.0.2


SKILL.md

Snowflake MCP Connection

Use this skill to integrate the Snowflake Managed MCP server with Clawdbot. It covers endpoint creation, authentication, and tool validation so Snowflake data can be accessed through MCP.

Quick Start

Prerequisites

  • Snowflake account with ACCOUNTADMIN role
  • Programmatic Access Token (PAT) from Snowflake
  • Clawdbot or any MCP-compatible client

Step 1: Create Programmatic Access Token (PAT)

  1. In Snowsight, go to your user menu → My Profile
  2. Select Programmatic Access Tokens
  3. Click Create Token for your role
  4. Copy and save the token securely

Step 2: Create MCP Server in Snowflake

Run this SQL in a Snowsight worksheet to create your MCP server:

CREATE OR REPLACE MCP SERVER my_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "SQL Execution Tool"
    type: "SYSTEM_EXECUTE_SQL"
    description: "Execute SQL queries against the Snowflake database."
    title: "SQL Execution Tool"
$$;
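If you create several servers, the statement above can also be rendered programmatically. The helper below is a minimal sketch (not part of the skill's files) that emits the same CREATE MCP SERVER statement for a SQL-execution-only server; the function name and defaults are illustrative.

```python
def create_mcp_server_sql(server_name: str, tool_name: str = "SQL Execution Tool") -> str:
    """Render a CREATE MCP SERVER statement with a single SYSTEM_EXECUTE_SQL tool."""
    spec = (
        "tools:\n"
        f'  - name: "{tool_name}"\n'
        '    type: "SYSTEM_EXECUTE_SQL"\n'
        '    description: "Execute SQL queries against the Snowflake database."\n'
        f'    title: "{tool_name}"'
    )
    # The YAML specification is wrapped in $$ ... $$ dollar-quoting, as in Snowsight.
    return (
        f"CREATE OR REPLACE MCP SERVER {server_name} FROM SPECIFICATION\n"
        f"$$\n{spec}\n$$;"
    )

print(create_mcp_server_sql("my_mcp_server"))
```

Paste the generated statement into a Snowsight worksheet as in the step above.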

Step 3: Test the Connection

Verify with curl (replace placeholders):

curl -X POST "https://YOUR-ORG-YOUR-ACCOUNT.snowflakecomputing.com/api/v2/databases/YOUR_DB/schemas/YOUR_SCHEMA/mcp-servers/my_mcp_server" \
  --header 'Content-Type: application/json' \
  --header 'Accept: application/json' \
  --header "Authorization: Bearer YOUR-PAT-TOKEN" \
  --data '{
    "jsonrpc": "2.0",
    "id": 12345,
    "method": "tools/list",
    "params": {}
  }'
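The same tools/list probe can be issued from Python's standard library instead of curl. This is a hedged sketch: the helper name is made up for illustration, and sending the request requires a real endpoint and PAT, so the network call is left commented out.

```python
import json
import urllib.request


def build_tools_list_request(endpoint_url: str, pat: str) -> urllib.request.Request:
    """Build the JSON-RPC 'tools/list' POST request for a Snowflake MCP endpoint."""
    payload = {"jsonrpc": "2.0", "id": 12345, "method": "tools/list", "params": {}}
    return urllib.request.Request(
        endpoint_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "Authorization": f"Bearer {pat}",
        },
        method="POST",
    )


# To actually send it (needs a valid URL and PAT):
# req = build_tools_list_request("https://YOUR-ORG-YOUR-ACCOUNT.snowflakecomputing.com/...", "YOUR-PAT-TOKEN")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

A successful response lists the tools declared in the server specification.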

Step 4: Configure Clawdbot

Create mcp.json at your project root (this is the MCP configuration Clawdbot can load for a session):

{
  "mcpServers": {
    "Snowflake MCP Server": {
      "url": "https://YOUR-ORG-YOUR-ACCOUNT.snowflakecomputing.com/api/v2/databases/YOUR_DB/schemas/YOUR_SCHEMA/mcp-servers/my_mcp_server",
      "headers": {
        "Authorization": "Bearer YOUR-PAT-TOKEN"
      }
    }
  }
}

Start a new Clawdbot session and load mcp.json so the MCP connection is active. The Snowflake tools should appear in your session.
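A quick sanity check on mcp.json before starting a session can catch malformed configs early. The validator below is a sketch under the assumption that each server entry needs a url and an Authorization header of the form "Bearer <PAT>"; the function is illustrative, not part of the skill.

```python
import json

REQUIRED_SERVER_KEYS = {"url", "headers"}


def validate_mcp_config(text: str) -> list:
    """Return a list of problems found in an mcp.json document (empty list = OK)."""
    cfg = json.loads(text)
    servers = cfg.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    problems = []
    for name, server in servers.items():
        missing = REQUIRED_SERVER_KEYS - server.keys()
        if missing:
            problems.append(f"{name}: missing {sorted(missing)}")
        elif not server["headers"].get("Authorization", "").startswith("Bearer "):
            problems.append(f"{name}: Authorization header should be 'Bearer <PAT>'")
    return problems


# Example: validate the file at the project root.
# with open("mcp.json") as f:
#     print(validate_mcp_config(f.read()) or "config looks OK")
```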

Step 5: Verify in Clawdbot

  1. Start a new Clawdbot session
  2. Load mcp.json for the session
  3. Ask a question that triggers Snowflake tools (for example, a SQL query)

MCP Server Examples

Basic SQL Execution Only

CREATE OR REPLACE MCP SERVER sql_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "SQL Execution Tool"
    type: "SYSTEM_EXECUTE_SQL"
    description: "Execute SQL queries against Snowflake."
    title: "SQL Execution"
$$;

With Cortex Search (RAG)

First create a Cortex Search service in Snowsight (AI & ML → Cortex Search), then:

CREATE OR REPLACE MCP SERVER search_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "Document Search"
    identifier: "MY_DB.MY_SCHEMA.MY_SEARCH_SERVICE"
    type: "CORTEX_SEARCH_SERVICE_QUERY"
    description: "Search and retrieve information from documents using vector search."
    title: "Document Search"
  - name: "SQL Execution Tool"
    type: "SYSTEM_EXECUTE_SQL"
    description: "Execute SQL queries."
    title: "SQL Execution"
$$;

With Cortex Analyst (Semantic Views)

First upload a semantic YAML or create a Semantic View, then:

CREATE OR REPLACE MCP SERVER analyst_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "Sales Analytics"
    identifier: "MY_DB.MY_SCHEMA.SALES_SEMANTIC_VIEW"
    type: "CORTEX_ANALYST_MESSAGE"
    description: "Query sales metrics and KPIs using natural language."
    title: "Sales Analytics"
  - name: "SQL Execution Tool"
    type: "SYSTEM_EXECUTE_SQL"
    description: "Execute SQL queries."
    title: "SQL Execution"
$$;

With Cortex Agent

CREATE OR REPLACE MCP SERVER agent_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "Documentation Agent"
    identifier: "MY_DB.MY_SCHEMA.MY_AGENT"
    type: "CORTEX_AGENT_RUN"
    description: "An agent that answers questions using documentation."
    title: "Documentation Agent"
$$;

Full Featured Server

CREATE OR REPLACE MCP SERVER full_mcp_server FROM SPECIFICATION
$$
tools:
  - name: "Analytics Semantic View"
    identifier: "ANALYTICS_DB.DATA.FINANCIAL_ANALYTICS"
    type: "CORTEX_ANALYST_MESSAGE"
    description: "Query financial metrics, customer data, and business KPIs."
    title: "Financial Analytics"
  - name: "Support Tickets Search"
    identifier: "SUPPORT_DB.DATA.TICKETS_SEARCH"
    type: "CORTEX_SEARCH_SERVICE_QUERY"
    description: "Search support tickets and customer interactions."
    title: "Support Search"
  - name: "SQL Execution Tool"
    type: "SYSTEM_EXECUTE_SQL"
    description: "Execute SQL queries against Snowflake."
    title: "SQL Execution"
  - name: "Send_Email"
    identifier: "MY_DB.DATA.SEND_EMAIL"
    type: "GENERIC"
    description: "Send emails to verified addresses."
    title: "Send Email"
    config:
      type: "procedure"
      warehouse: "COMPUTE_WH"
      input_schema:
        type: "object"
        properties:
          body:
            description: "Email body in HTML format."
            type: "string"
          recipient_email:
            description: "Recipient email address."
            type: "string"
          subject:
            description: "Email subject line."
            type: "string"
$$;

Tool Types Reference

Type                          Purpose
SYSTEM_EXECUTE_SQL            Execute arbitrary SQL queries
CORTEX_SEARCH_SERVICE_QUERY   RAG over unstructured data
CORTEX_ANALYST_MESSAGE        Natural language queries on semantic models
CORTEX_AGENT_RUN              Invoke Cortex Agents
GENERIC                       Custom tools (procedures/functions)

Benefits

  • Governed by Design: Same RBAC policies apply as your data
  • No Infrastructure: No local server deployment needed
  • Reduced Integration: Connect any MCP-compatible client
  • Extensible: Add custom tools via procedures/functions

Troubleshooting

Connection Issues

  • SSL Error: Use hyphens instead of underscores in the account name
  • 401 Unauthorized: Verify the PAT is valid and not expired
  • 404 Not Found: Check the database, schema, and MCP server names
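The underscore-to-hyphen fix can be baked into a small URL builder so the SSL error never appears. This is a hypothetical helper for illustration; it assumes the endpoint path shown in the curl examples above.

```python
def mcp_server_url(account: str, database: str, schema: str, server: str) -> str:
    """Build the MCP endpoint URL.

    Underscores in the account identifier must become hyphens in the hostname,
    otherwise the TLS certificate will not match and the connection fails.
    """
    host = account.replace("_", "-").lower()
    return (
        f"https://{host}.snowflakecomputing.com/api/v2/databases/"
        f"{database}/schemas/{schema}/mcp-servers/{server}"
    )

print(mcp_server_url("MY-ORG-MY_ACCOUNT", "MY_DB", "MY_SCHEMA", "my_mcp_server"))
```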

Testing Tools

List available tools:

curl -X POST "https://YOUR-ACCOUNT.snowflakecomputing.com/api/v2/databases/DB/schemas/SCHEMA/mcp-servers/SERVER" \
  -H "Authorization: Bearer PAT" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'

PAT Token Notes

  • PATs don't evaluate secondary roles
  • When creating a PAT, select a single role that has all required permissions
  • To change roles, create a new PAT

Alternative: Local MCP Server

For local deployment using the snowflake-labs-mcp package, see mcp-client-setup.md.

Resources

Files

4 total
