You’ve invested in Snowflake — rationalised data, strong models, mature infrastructure. But your customers are still interacting with those capabilities through static dashboards, manual exports, and support tickets. The data is there. The intelligence is there. The customer-facing experience is not.

Building that experience means assembling LLM orchestration, conversational UI, widget rendering, multi-tenant session management, compliance, deployment across web, mobile, and messaging channels, testing infrastructure, and production observability. This is a significant engineering effort, and none of it is unique to your product. Every SaaS company building customer-facing AI needs the same stack.

Mindset AI handles that entire layer. Your team focuses on what differentiates your product: the APIs, business logic, and ML models built on your Snowflake data. The capabilities only you can build.

This section covers how the integration works, how your Snowflake data flows through the system, how to think about scoping and accuracy, and what real deployments look like in practice.

How it works

Every integration follows a three-layer architecture. Your Snowflake instance stays exactly where it is; nothing changes at the data layer.

Your Snowflake data and models (you own) — your warehouse, schemas, Cortex models, and Snowpark pipelines. Credentials, connection strings, and raw queries never leave your infrastructure.

Your MCP servers (you control) — these expose your Snowflake data as callable tools that agents can use. How much work this takes depends on what you need:
  • Simplest: Snowflake-native MCP. Snowflake can spin up a managed MCP server directly from your account — no separate infrastructure, no custom code. Run a CREATE MCP SERVER command, point it at predefined SQL queries, and they’re available as deterministic tools. Authentication and governance are handled by Snowflake. This is hours of configuration, not weeks of development.
  • Ad-hoc analytics: Cortex Analyst. For questions that can’t be answered by a fixed query set, Cortex Analyst translates natural language into SQL using a semantic model you define. Snowflake reports 90%+ accuracy on real-world use cases. Expose it through the same managed MCP server — no custom endpoints required.
  • Custom logic: your own MCP server. When you need business logic beyond what SQL can express — proprietary scoring models, multi-step workflows, calls to other internal services — you build a custom MCP server in whatever language you prefer and host it in your own infrastructure.
Most Snowflake customers start with the first two and only build custom servers when they have a specific reason to. This is where your competitive advantage lives — not in the plumbing, but in the data, the queries, and the models that only you have.

Agentic experience layer (Mindset AI) — sits on top. It receives user intent, resolves it against your registered tool definitions, calls your MCP servers with structured parameters, and renders responses with dynamic UI. It never crafts SQL, never sees your credentials, and never connects to Snowflake directly. It only sees what your MCP servers return.
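To make the middle layer concrete, here is a minimal sketch of what a custom MCP-style tool server looks like from the agent layer’s point of view. This is an illustration of the shape, not the real MCP wire protocol or any Mindset AI API; the tool name, parameters, and fake rows are all hypothetical.

```python
# Illustrative sketch only: a stripped-down "tool server" showing the shape of
# the MCP-server layer. Not the real MCP protocol; all names are hypothetical.

def query_top_accounts(region: str, min_revenue: int) -> list[dict]:
    """Stand-in for a predefined Snowflake query. A real server would run
    parameterised SQL inside your infrastructure and return rows."""
    fake_rows = [
        {"account": "Acme Ltd", "region": "EMEA", "revenue": 120_000},
        {"account": "Globex", "region": "AMER", "revenue": 80_000},
    ]
    return [r for r in fake_rows
            if r["region"] == region and r["revenue"] >= min_revenue]

# The server registers tools; the agent layer only ever sees these definitions.
TOOLS = {
    "top_accounts": {
        "description": "Top accounts by revenue for a region",
        "handler": query_top_accounts,
    },
}

def call_tool(name: str, params: dict) -> dict:
    """Entry point the agent layer hits with structured parameters.
    Only structured results leave the server, never SQL or credentials."""
    tool = TOOLS[name]
    return {"tool": name, "rows": tool["handler"](**params)}

result = call_tool("top_accounts", {"region": "EMEA", "min_revenue": 50_000})
print(result["rows"])
```

The point of the sketch is the boundary: the handler body is yours, and the caller receives only what you choose to return.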

How your Snowflake data flows through the system

Tracing a single request end to end:
  1. A user asks a question or requests an action through the conversational interface.
  2. The request arrives with a tenant ID and user context attached via the Context API.
  3. Intent is resolved against your registered tool definitions. No Snowflake data has moved yet — this is purely matching what the user asked to what tools are available.
  4. Your MCP server is called with structured parameters (e.g. { region: "EMEA", min_revenue: 50000, active: true }).
  5. Your MCP server executes the request — whether that’s Snowflake running a predefined query natively, Cortex Analyst generating SQL from the user’s question, or your custom service running proprietary logic.
  6. Your MCP server returns structured results — only the data your code (or Snowflake’s managed server) decides to expose.
  7. A conversational response is generated and the appropriate widget is rendered based on the returned data shape.
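The seven steps above can be sketched end to end. Everything here is a simplified stand-in: real intent resolution is LLM-driven rather than keyword matching, and the tool registry, MCP call, and widget rules are hypothetical.

```python
# Hedged sketch of the request flow. Intent resolution, the tool registry,
# and the widget rules are simplified stand-ins for the platform's real logic.

TOOL_DEFS = {
    "revenue_by_region": {"keywords": {"revenue", "region"}},
    "open_tickets": {"keywords": {"tickets", "support"}},
}

def resolve_intent(question: str) -> str:
    """Step 3: match the user's question to a registered tool.
    No Snowflake data has moved yet."""
    words = set(question.lower().split())
    return max(TOOL_DEFS, key=lambda t: len(TOOL_DEFS[t]["keywords"] & words))

def call_mcp_server(tool: str, params: dict) -> list[dict]:
    """Steps 4-6: the MCP server executes the request and returns only
    structured results (faked here)."""
    return [{"region": params["region"], "revenue": 120_000}]

def pick_widget(rows: list[dict]) -> str:
    """Step 7: choose a widget from the shape of the returned data."""
    if rows and "revenue" in rows[0]:
        return "bar_chart"
    return "table"

tool = resolve_intent("What was revenue by region last quarter?")
rows = call_mcp_server(tool, {"region": "EMEA"})
print(tool, pick_widget(rows))
```

Note where the boundary sits: steps 1–3 touch no Snowflake data at all, and step 7 operates only on the structured result the MCP server chose to return.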
Mindset AI never stores or has access to your Snowflake credentials, connection strings, or raw queries. For Snowflake-native MCP connections, credential management is handled by a dedicated integration partner with secure token storage.

Authentication and credential management

Connecting Mindset AI to your MCP servers requires credentials. How those credentials are managed depends on the type of MCP server.

Custom MCP servers (you host): You generate an API key and register it with Mindset AI. Mindset AI includes it as a Bearer token on every request. Simple, static, and entirely under your control.

Snowflake-native MCP servers: Snowflake’s managed MCP server uses OAuth 2.0 authentication. This means credentials need to be exchanged for short-lived access tokens, and those tokens need to be refreshed. Rather than building this token lifecycle into the Mindset AI platform directly, credential management for Snowflake connections is handled through an integration partner that specialises in secure credential storage and token exchange. Your team connects their Snowflake account once through a managed authentication flow. The integration layer stores credentials securely, handles token refresh, and provides valid access tokens to Mindset AI at request time. Your Snowflake credentials are never stored in the Mindset AI platform.
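The two credential models differ mainly in lifecycle. A static API key never changes; an OAuth-style token expires and must be re-fetched. The sketch below shows that difference generically; it is not the integration partner’s actual implementation, and `fetch_token` is a hypothetical stand-in for the token exchange.

```python
import time

# Sketch of the two credential models. The refresh logic is a generic
# OAuth-style pattern, not the integration partner's implementation.

class StaticBearerAuth:
    """Custom MCP servers: one API key you generate, sent as a Bearer token."""
    def __init__(self, api_key: str):
        self.api_key = api_key

    def header(self) -> dict:
        return {"Authorization": f"Bearer {self.api_key}"}

class RefreshingTokenAuth:
    """Snowflake-native MCP: short-lived access tokens that must be refreshed.
    `fetch_token` stands in for the integration layer's token exchange."""
    def __init__(self, fetch_token, lifetime_s: float = 900.0):
        self._fetch = fetch_token
        self._lifetime = lifetime_s
        self._token = None
        self._expires_at = 0.0

    def header(self) -> dict:
        now = time.monotonic()
        if self._token is None or now >= self._expires_at:
            self._token = self._fetch()  # exchange stored credentials
            self._expires_at = now + self._lifetime
        return {"Authorization": f"Bearer {self._token}"}

# Usage: the caller asks for a header at request time; refresh is invisible.
counter = {"n": 0}
def fake_exchange():
    counter["n"] += 1
    return f"token-{counter['n']}"

auth = RefreshingTokenAuth(fake_exchange, lifetime_s=0.0)  # expire instantly
auth.header(); auth.header()
print(counter["n"])  # fetched twice because each token expires immediately
```

The design point: the requesting side never holds long-lived Snowflake credentials, only the current short-lived token.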

Access control: roles vs users

An important distinction when using Snowflake’s managed MCP server: access control is role-based, not user-based.

When the Mindset AI platform connects to a Snowflake-native MCP server, it authenticates as a specific Snowflake role. Snowflake’s RBAC determines what that role can see — which tools are accessible, which schemas, which data. Row-level security policies apply based on that role’s permissions. Every request through that connection sees the same data.

This is the right model when all users within a tenant should see the same Snowflake data — analytics dashboards, aggregate metrics, shared reporting. The role maps to the tenant, and Snowflake’s governance applies at the tenant level. When individual users need to see different data — personal records, user-specific permissions, per-user row-level security — you need either:
  • A custom MCP server that reads the user identity passed by Mindset AI (via the x-user-id header) and applies user-level filtering in your own code before returning results.
  • Per-user authentication where each user connects their own Snowflake account, and the integration layer manages individual tokens. Snowflake then sees each request as a distinct user, and per-user RLS applies natively.
Most deployments use a combination: Snowflake-native MCP for tenant-scoped analytics, and custom MCP servers for user-specific operations.
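The first option can be sketched briefly: a custom MCP server reads the identity header and filters before returning anything. The `x-user-id` header comes from the text above; the permissions table and rows are hypothetical.

```python
# Sketch of user-level filtering in a custom MCP server. The permissions
# table and rows are hypothetical; only the x-user-id header is from the docs.

USER_ALLOWED_REGIONS = {
    "user-42": {"EMEA"},
    "user-7": {"EMEA", "AMER"},
}

ROWS = [
    {"account": "Acme Ltd", "region": "EMEA"},
    {"account": "Globex", "region": "AMER"},
]

def handle_request(headers: dict, rows: list[dict]) -> list[dict]:
    """Read the identity Mindset AI passes, then filter before returning.
    On a shared-role connection, Snowflake's own RLS cannot do this per user,
    so the filtering lives in your code."""
    user_id = headers.get("x-user-id")
    allowed = USER_ALLOWED_REGIONS.get(user_id, set())
    return [r for r in rows if r["region"] in allowed]

print(handle_request({"x-user-id": "user-42"}, ROWS))
```

An unknown or missing user id falls through to an empty allow-set, so the safe default is returning nothing.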

Why Snowflake customers get to production faster

The integration is database-agnostic — the same architecture works with Postgres, BigQuery, or any data source behind an MCP server. But Snowflake customers skip the most time-consuming step.
  • No MCP server to build. With most databases, you need to build, host, and maintain a custom MCP server. Snowflake eliminates that entirely: run CREATE MCP SERVER, define your tools, and you’re live.
  • Two tool strategies from day one. Predefined SQL queries give you deterministic, auditable tools. Cortex Analyst gives you ad-hoc natural language to SQL with 90%+ reported accuracy. Both are exposed through the same managed MCP server, with no custom code required.
  • Governed by default. Snowflake’s RBAC and audit logging apply to the MCP server and its tools automatically. You’re not bolting governance onto a custom service after the fact — it’s inherited from your existing Snowflake configuration.
  • Analysts can own it. Someone who knows the data and the queries can define tools, scoping rules, and semantic models without writing a service. The path from “I know this SQL” to “this is a callable tool” doesn’t require engineering resource.