Snowflake MCP Server

Create a powerful Model Context Protocol (MCP) server for Snowflake in minutes with our AI Gateway. This guide walks you through setting up seamless data warehouse integration with enterprise-grade security and instant OAuth authentication.

About Snowflake API

Snowflake is the Data Cloud platform that enables organizations to store, analyze, and share data across clouds. The Snowflake API provides programmatic access to data operations, from querying to administration.

Key Capabilities

  • SQL Execution: Run queries and stored procedures
  • Data Loading: Bulk data ingestion and streaming
  • Database Management: Create and manage databases, schemas, tables
  • User Management: Control access and permissions
  • Resource Monitoring: Track usage and costs
  • Data Sharing: Secure data collaboration
  • Time Travel: Access historical data versions
  • Zero-Copy Cloning: Instant database duplication

API Features

  • REST API: SQL execution and management
  • Snowpipe API: Continuous data ingestion
  • OAuth 2.0: Secure authentication
  • Multi-Factor Auth: Enhanced security
  • Row-Level Security: Granular access control
  • Result Caching: Improved performance
  • Auto-Scaling: Dynamic compute resources
  • Multi-Cloud: AWS, Azure, GCP support

What You Can Do with Snowflake MCP Server

The MCP server transforms Snowflake's API into a natural language interface, enabling AI agents to:

Data Querying & Analysis

  • SQL Operations

    • "Show total sales by region for last quarter"
    • "Find top 10 customers by revenue"
    • "Calculate year-over-year growth metrics"
    • "Join customer and order tables"
  • Data Exploration

    • "Describe columns in sales table"
    • "Show sample data from transactions"
    • "Find tables containing customer information"
    • "List all available databases"
  • Advanced Analytics

    • "Run time series analysis on daily revenue"
    • "Calculate moving averages for stock prices"
    • "Perform cohort analysis on user signups"
    • "Generate statistical summaries"

Database Management

  • Schema Operations

    • "Create new database for analytics"
    • "Set up schema for staging data"
    • "Clone production database for testing"
    • "Drop temporary tables older than 7 days"
  • Table Management

    • "Create customer dimension table"
    • "Add clustering key on transaction date"
    • "Alter table to add new columns"
    • "Truncate staging tables"
  • View & Procedure Management

    • "Create view for monthly reports"
    • "Update stored procedure logic"
    • "List all materialized views"
    • "Refresh materialized view"

Data Loading & ETL

  • Bulk Loading

    • "Load CSV files from S3 bucket"
    • "Import data from Azure blob storage"
    • "Stage files for processing"
    • "Copy data into target table"
  • Streaming Ingestion

    • "Set up Snowpipe for real-time data"
    • "Configure auto-ingestion from cloud storage"
    • "Monitor streaming pipeline status"
    • "Handle failed file loads"
  • Data Transformation

    • "Transform JSON data to relational format"
    • "Merge incremental updates"
    • "Deduplicate records"
    • "Apply business rules to data"

Access Control & Security

  • User Management

    • "Create read-only user for analytics team"
    • "Grant access to specific schemas"
    • "Revoke permissions from departed employee"
    • "Set up role hierarchy"
  • Role Management

    • "Create custom role for data scientists"
    • "Assign warehouse usage to role"
    • "Configure row-level security"
    • "Manage privilege inheritance"
  • Security Policies

    • "Enable multi-factor authentication"
    • "Set up network policies"
    • "Configure session timeout"
    • "Implement data masking"

Performance & Optimization

  • Query Optimization

    • "Analyze query execution plan"
    • "Identify slow-running queries"
    • "Recommend clustering keys"
    • "Cache frequently used results"
  • Warehouse Management

    • "Scale warehouse for heavy workload"
    • "Suspend idle warehouses"
    • "Monitor warehouse utilization"
    • "Configure auto-suspend settings"
  • Cost Management

    • "Show compute costs by warehouse"
    • "Track storage usage trends"
    • "Identify expensive queries"
    • "Set up resource monitors"

Data Sharing & Collaboration

  • Share Management

    • "Create secure share for partner"
    • "Add tables to existing share"
    • "Grant access to consumer account"
    • "Monitor share usage"
  • Data Marketplace

    • "Publish dataset to marketplace"
    • "Subscribe to third-party data"
    • "Manage data listings"
    • "Track data monetization"
  • Collaboration Features

    • "Share query results with team"
    • "Create data documentation"
    • "Tag objects for discovery"
    • "Manage data lineage"

Prerequisites

  • Access to Cequence AI Gateway
  • Snowflake account with appropriate privileges
  • OAuth integration setup capabilities
  • Understanding of Snowflake security model

Step 1: Configure Snowflake OAuth

1.1 Create OAuth Security Integration

Connect to Snowflake and run:

CREATE OR REPLACE SECURITY INTEGRATION ai_gateway_oauth
TYPE = OAUTH
ENABLED = TRUE
OAUTH_CLIENT = CUSTOM
OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
OAUTH_REDIRECT_URI = 'https://auth.aigateway.cequence.ai/v1/outbound/oauth/callback'
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = 86400
BLOCKED_ROLES_LIST = ('ACCOUNTADMIN', 'SECURITYADMIN')
COMMENT = 'OAuth integration for AI Gateway MCP';

1.2 Get OAuth Parameters

DESCRIBE SECURITY INTEGRATION ai_gateway_oauth;

Note these values from the output:

  • OAUTH_CLIENT_ID
  • OAUTH_AUTHORIZATION_ENDPOINT
  • OAUTH_TOKEN_ENDPOINT
  • OAUTH_CLIENT_SECRET (not shown by DESCRIBE; retrieve it with SYSTEM$SHOW_OAUTH_CLIENT_SECRETS)
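
The same values can be pulled programmatically. The DESCRIBE output can be filtered via RESULT_SCAN in the same session, and the client secret comes from a system function (a sketch using the integration name created in step 1.1):

```sql
-- Filter the DESCRIBE output in the same session
DESC SECURITY INTEGRATION ai_gateway_oauth;

SELECT "property", "property_value"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "property" IN ('OAUTH_CLIENT_ID',
                     'OAUTH_AUTHORIZATION_ENDPOINT',
                     'OAUTH_TOKEN_ENDPOINT');

-- The client secret is returned only by this system function
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('AI_GATEWAY_OAUTH');
```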

1.3 Create OAuth User (Optional)

CREATE USER IF NOT EXISTS ai_gateway_user
PASSWORD = 'StrongPassword123!'
DEFAULT_ROLE = 'AI_GATEWAY_ROLE'
MUST_CHANGE_PASSWORD = FALSE;

CREATE ROLE IF NOT EXISTS AI_GATEWAY_ROLE;

GRANT ROLE AI_GATEWAY_ROLE TO USER ai_gateway_user;

-- Grant necessary privileges
GRANT USAGE ON WAREHOUSE compute_wh TO ROLE AI_GATEWAY_ROLE;
GRANT USAGE ON DATABASE your_database TO ROLE AI_GATEWAY_ROLE;
GRANT USAGE ON SCHEMA your_schema TO ROLE AI_GATEWAY_ROLE;
GRANT SELECT ON ALL TABLES IN SCHEMA your_schema TO ROLE AI_GATEWAY_ROLE;
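
To confirm the role is wired up correctly before moving on, inspect its grants (a quick sanity check, assuming the role and user created above):

```sql
-- Verify the role's privileges and its assignment to the user
SHOW GRANTS TO ROLE AI_GATEWAY_ROLE;
SHOW GRANTS TO USER ai_gateway_user;
```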

Steps 2-4: Standard Setup

Follow standard steps to access AI Gateway, find Snowflake API, and create MCP server.

Step 5: Configure API Endpoints

  1. Base URL: Your Snowflake account URL
    • Format: https://{account_identifier}.snowflakecomputing.com
    • Examples: https://xy12345.us-east-1.snowflakecomputing.com (legacy account locator) or https://myorg-myaccount.snowflakecomputing.com (organization-account identifier)
  2. Select endpoints to expose
  3. Click Next

Step 6: MCP Server Configuration

  1. Name: "Snowflake Data Warehouse"
  2. Description: "SQL analytics and data management"
  3. Configure production mode
  4. Click Next

Step 7: Configure Authentication

  1. Authentication Type: OAuth 2.0
  2. Authorization URL: From DESCRIBE command
  3. Token URL: From DESCRIBE command
  4. Enter Client ID and Secret
  5. Configure additional parameters:
    • Account: Your Snowflake account identifier
    • Role: Default role to use
    • Warehouse: Default compute warehouse

Available Snowflake OAuth Scopes

Snowflake relies on role-based access control rather than fine-grained OAuth scopes; custom OAuth integrations support only the refresh_token and session:role:<role_name> scopes, which select a role at session start. Effective access is determined by:

Database Privileges

  • USAGE: Access database
  • CREATE SCHEMA: Create schemas
  • MODIFY: Alter database
  • MONITOR: View usage

Schema Privileges

  • USAGE: Access schema
  • CREATE TABLE: Create tables
  • CREATE VIEW: Create views
  • CREATE FUNCTION: Create UDFs

Table Privileges

  • SELECT: Read data
  • INSERT: Add data
  • UPDATE: Modify data
  • DELETE: Remove data
  • TRUNCATE: Clear table

Warehouse Privileges

  • USAGE: Use compute
  • OPERATE: Start/stop
  • MODIFY: Change size
  • MONITOR: View metrics

For Read-Only Analytics:

GRANT USAGE ON WAREHOUSE compute_wh TO ROLE analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON ALL SCHEMAS IN DATABASE analytics TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN DATABASE analytics TO ROLE analyst_role;
GRANT SELECT ON ALL VIEWS IN DATABASE analytics TO ROLE analyst_role;
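
The ALL TABLES / ALL VIEWS grants above cover only objects that exist at grant time. If tables created later should be readable automatically, add FUTURE grants as well (a sketch using the same role and database as above):

```sql
-- Extend read access to objects created after the grant
GRANT USAGE ON FUTURE SCHEMAS IN DATABASE analytics TO ROLE analyst_role;
GRANT SELECT ON FUTURE TABLES IN DATABASE analytics TO ROLE analyst_role;
GRANT SELECT ON FUTURE VIEWS IN DATABASE analytics TO ROLE analyst_role;
```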

For Data Engineering:

GRANT ALL ON WAREHOUSE etl_wh TO ROLE engineer_role;
GRANT ALL ON DATABASE raw_data TO ROLE engineer_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE raw_data TO ROLE engineer_role;
GRANT ALL ON FUTURE TABLES IN DATABASE raw_data TO ROLE engineer_role;

Steps 8-10: Complete Setup

Configure security, choose deployment, and deploy.

Using Your Snowflake MCP Server

With Claude Desktop

{
  "servers": {
    "snowflake": {
      "url": "your-mcp-server-url",
      "auth": {
        "type": "oauth2",
        "client_id": "your-client-id"
      }
    }
  }
}

Natural Language Commands

  • "Show total revenue by month for 2024"
  • "Create a staging table for customer data"
  • "Load CSV files from S3 into raw_data schema"
  • "Grant read access to analytics team on sales data"
  • "Optimize the slow-running inventory query"

API Integration Example

// Initialize MCP client
const mcpClient = new MCPClient({
  serverUrl: 'your-mcp-server-url',
  auth: {
    type: 'oauth2',
    token: 'access-token'
  }
});

// Execute query
const results = await mcpClient.snowflake.query({
  statement: `
    SELECT
      DATE_TRUNC('month', order_date) as month,
      SUM(amount) as total_revenue,
      COUNT(DISTINCT customer_id) as unique_customers
    FROM sales.orders
    WHERE order_date >= '2024-01-01'
    GROUP BY 1
    ORDER BY 1
  `,
  warehouse: 'COMPUTE_WH',
  database: 'ANALYTICS',
  schema: 'SALES'
});

// Create table
await mcpClient.snowflake.execute({
  statement: `
    CREATE OR REPLACE TABLE staging.customer_temp (
      customer_id INTEGER,
      email VARCHAR,
      created_at TIMESTAMP,
      attributes VARIANT
    )
  `
});

// Load data from S3
await mcpClient.snowflake.copyInto({
  table: 'staging.customer_temp',
  from: '@my_s3_stage/customers/',
  fileFormat: 'CSV',
  onError: 'CONTINUE',
  pattern: '.*customer.*[.]csv'
});

// Create view
await mcpClient.snowflake.execute({
  statement: `
    CREATE OR REPLACE VIEW analytics.customer_summary AS
    SELECT
      c.customer_id,
      c.email,
      COUNT(o.order_id) as order_count,
      SUM(o.amount) as lifetime_value
    FROM customers c
    LEFT JOIN orders o ON c.customer_id = o.customer_id
    GROUP BY 1, 2
  `
});

// Monitor query history
const queryHistory = await mcpClient.snowflake.queryHistory({
  resultLimit: 100,
  filters: {
    startTime: '2025-01-01T00:00:00Z',
    queryType: 'SELECT',
    warehouseName: 'COMPUTE_WH'
  }
});

Common Use Cases

Analytics & Reporting

  • Ad-hoc business queries
  • Scheduled report generation
  • Dashboard data preparation
  • KPI calculations

Data Engineering

  • ETL pipeline development
  • Data quality checks
  • Schema migrations
  • Performance tuning

Data Governance

  • Access control management
  • Data lineage tracking
  • Compliance reporting
  • Audit logging

Cost Optimization

  • Query optimization
  • Warehouse right-sizing
  • Storage management
  • Resource monitoring

Security Best Practices

  1. Authentication:

    • Use OAuth over passwords
    • Enable MFA
    • Rotate credentials regularly
    • Monitor login activity
  2. Access Control:

    • Principle of least privilege
    • Role-based access
    • Row-level security
    • Column masking
  3. Network Security:

    • IP allowlisting
    • Private endpoints
    • Encrypted connections
    • VPN requirements
  4. Data Protection:

    • Encryption at rest
    • Encryption in transit
    • Secure data sharing
    • Time travel retention
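
As one concrete example of the network controls above, a Snowflake network policy restricts which IP ranges can connect (the policy name and CIDR range here are placeholders; substitute your gateway's egress IPs):

```sql
-- Allow only the gateway's egress range; block everything else
CREATE NETWORK POLICY ai_gateway_policy
  ALLOWED_IP_LIST = ('203.0.113.0/24')
  COMMENT = 'Restrict MCP access to AI Gateway egress IPs';

-- Apply at the account level (requires SECURITYADMIN or higher)
ALTER ACCOUNT SET NETWORK_POLICY = ai_gateway_policy;
```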

Troubleshooting

Common Issues

  1. Authentication Errors

    • Verify OAuth integration
    • Check account identifier
    • Validate redirect URI
    • Review blocked roles
  2. Permission Denied

    • Check role grants
    • Verify warehouse access
    • Review object privileges
    • Confirm role hierarchy
  3. Query Timeouts

    • Increase warehouse size
    • Optimize query logic
    • Check result set size
    • Review clustering keys
  4. Cost Overruns

    • Set resource monitors
    • Configure auto-suspend
    • Review query patterns
    • Optimize warehouse usage
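
For the cost overruns above, a resource monitor caps credit consumption and can suspend a runaway warehouse automatically (the quota and thresholds here are illustrative):

```sql
-- Cap monthly spend and suspend the warehouse at the limit
CREATE RESOURCE MONITOR ai_gateway_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE compute_wh SET RESOURCE_MONITOR = ai_gateway_monitor;
```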

Getting Help