Choose your path
Building it yourself? If your engineering team wants to build and deploy MCP servers independently, head to Integration paths. That documentation covers everything you need to connect Mindset AI to your Snowflake data on your own terms.
Want Mindset AI to work with you? If you want hands-on support getting your first MCP integration live, the process below is how we get started together. We run a structured technical enablement engagement designed to deliver a production use case that demonstrates real value, and the enablement across your team that makes everything that follows self-sustaining.
Why do we run this process?
The goal isn’t just to ship one agent use case. It’s to create the conditions for wide-scale adoption across your organisation. Wide-scale agent deployment happens when two things happen together: your team launches a use case that is genuinely exciting and valuable to your customers, and the key people across your product organisation understand how to use the platform to build your own IP on top of it. One without the other doesn’t work.
The types of knowledge your team will acquire:
- Thinking in agents — which problems are a good fit, how to scope for high accuracy, how to identify agentic vs non-agentic zones, how to define success criteria
- How the infrastructure connects — how LLM orchestration, tool routing, context management, tenant isolation, and generative UI work together
- Agent limitations — where they’re reliable, where they’re not, when to hand off to native UI, how to set guardrails and system prompts
- Taking the right actions — what level of API abstraction the agent needs, how to optimise tool descriptions for LLM routing, handoff protocol design, testing and iterating on agent behaviour
Summary
The engagement follows five phases: discovery, technical discovery and design review, build, validation and testing, and production handover. Your team’s main responsibility is exposing your Snowflake-backed capabilities as MCP tools. Mindset AI handles the agentic infrastructure. The whole process typically takes 8–12 weeks.
The technical enablement process
Discovery (weeks 1–2)
What happens: We align on business objectives and identify the first slice of value to take into production. Almost every time, the use case discussed during sales is significantly larger than what should be tackled first. “Customer agent” is a huge feature. “How a user raises a support ticket” is a use case. The journey workshop is where we whittle down to that first slice — exciting enough to demonstrate value, narrow enough to ship with confidence, concrete enough to build against.
What we work through:
- What are the broader business objectives?
- What is the first slice? Not the full feature — the specific, well-defined workflow that involves querying Snowflake data and taking an action. “Find qualified candidates and assign them” rather than “show assessment scores.”
- What does the ideal user interaction look like?
- What is the agent’s decision logic? When should it query data, take action, or hand off?
- What are the agentic vs non-agentic zones?
- What does success look like?
Technical discovery and design review (weeks 2–4)
What happens: We examine your Snowflake data model, existing APIs, business logic, and infrastructure to figure out how to deliver the validated use case.
Key questions we work through:
- Which Snowflake tables and views back the target use case? What are the key joins?
- Do external APIs need to be called alongside Snowflake, or can that data be replicated into the warehouse via ETL?
- What tenant scoping logic exists? Row-level security, application-level WHERE clauses, or both?
- What business logic needs to be encoded in the MCP tools vs the agent configuration?
- Where would you intend to host the MCP server? AWS, Google Cloud, Azure, or elsewhere?
- Do you have a preferred language — Python or TypeScript?
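To make the build phase concrete, here is a minimal sketch of the kind of narrow, well-scoped tool an MCP server might expose — the "available departments for a hospital" example discussed below. All names (`list_departments`, the `departments` table, `hospital_id`) are hypothetical; a real implementation would execute the query with the Snowflake Python connector rather than returning it.

```python
# Illustrative sketch of a narrow MCP tool backed by Snowflake.
# Table and column names are assumptions for illustration only.

def list_departments(tenant_id: str) -> tuple[str, dict]:
    """Return the SQL and bind parameters for one hospital tenant's
    available departments.

    A tool this narrow has almost no failure modes: the SQL is fixed,
    the only input is the tenant identifier, and tenant scoping is
    applied in the WHERE clause rather than left to the agent.
    """
    query = (
        "SELECT department_name "
        "FROM departments "
        "WHERE hospital_id = %(tenant_id)s "
        "ORDER BY department_name"
    )
    # Bind parameters instead of interpolating strings, so user or
    # agent input can never alter the SQL itself.
    params = {"tenant_id": tenant_id}
    return query, params
```

In a real server the returned query and parameters would be passed to the Snowflake connector's `cursor.execute`, with `tenant_id` taken from the authenticated session rather than from the model.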
Build (weeks 4–7)
Your team builds (or we build on your behalf): MCP servers wrapping your Snowflake queries and APIs, tool definitions, and tenant scoping logic.
Mindset AI configures: agent behaviour (system prompts, tool orchestration rules, guardrails), widget mappings, and multi-surface deployment configuration.
Validation and testing (weeks 7–9)
What we validate:
- End-to-end completion — users can achieve the target use case from first request to final outcome
- Connection chain integrity — the full path from Mindset AI to your MCP server to Snowflake and back
- Tool selection accuracy — the agent calls the right Snowflake-backed tools for a given user intent
- Tenant isolation — scoping logic correctly filters Snowflake data per customer
- At-scale performance — concurrent users and high query volumes against your Snowflake-backed MCP servers
- Edge cases — what happens when a query returns no results, intent is ambiguous, or an external API is unavailable
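Those edge cases are easiest to validate when tools return a structured status instead of raising, so the agent can explain "no results" or "service unavailable" to the user rather than failing opaquely. A minimal sketch of such a contract — field names and statuses are illustrative, not a Mindset AI API:

```python
# Illustrative edge-case contract for tool results. Returning a
# structured status lets the agent hand a clear explanation back to
# the user instead of surfacing a raw error.

def wrap_tool_result(rows: list, *, upstream_ok: bool = True) -> dict:
    if not upstream_ok:
        # External API or warehouse could not be reached.
        return {
            "status": "unavailable",
            "message": "The upstream service could not be reached. Try again later.",
        }
    if not rows:
        # Query succeeded but matched nothing.
        return {"status": "empty", "message": "The query matched no records."}
    return {"status": "ok", "rows": rows}
```

The agent's system prompt can then instruct distinct behaviour per status: apologise and retry later on "unavailable", suggest broadening the request on "empty".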
Production handover and team enablement (weeks 9–12)
What happens: The use case is live and validated. We gradually step back while transferring day-to-day ownership to your team.
Team enablement includes:
- Knowledge transfer sessions with your product and engineering teams covering how MCP servers work, how to define new tools, how to configure agent behaviour, and how to validate new use cases
- Platform walkthrough for product managers and stakeholders
- Documentation of patterns established during the engagement — a reusable playbook
- Identification of the next 2–3 use cases with recommended prioritisation and rough scoping
Key technical decisions
Which Snowflake capabilities to expose first
Start narrow. A tool that pulls available departments for a hospital has almost no failure modes. A “query anything” tool will fail regularly. Pick a use case where the Snowflake queries are well-defined, the user intent is clear, and there’s a tangible action at the end.
Where to push filtering logic
Push filtering to Snowflake wherever possible. If operational data from external APIs can be replicated into Snowflake via ETL, a single joined SQL query replaces hundreds of individual API calls.
Cortex vs custom MCP servers
If the use case is primarily analytical and queries are well-defined, Cortex as an MCP service accelerates setup significantly. If the use case involves transactional operations or complex multi-step logic across Snowflake and external APIs, custom MCP servers give you more control. Most production deployments use both.
Tool description quality
Tool descriptions are business logic. “Queries Snowflake data” leads to poor tool selection. “Returns assessment scores for clinicians in a given department, filtered by minimum tenure and training completion status, ordered by score descending” leads to accurate routing. Consider using LLM-generated first drafts, then refining with domain experts.
Tenant scoping strategy
Most teams use both layers: Snowflake row-level security as a safety net, and application-level WHERE clauses in MCP servers for fine-grained control.
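The two decisions above meet in the tool itself: a description specific enough for accurate LLM routing, and an application-level WHERE clause as the fine-grained scoping layer (with row-level security assumed underneath as the safety net). A hedged sketch — the tool name, parameters, and table are hypothetical, using the assessment-scores example from this section:

```python
# Illustrative tool definition and query builder. Names, fields, and
# the assessment_scores table are assumptions for illustration.

ASSESSMENT_SCORES_TOOL = {
    "name": "get_assessment_scores",
    # A specific description is what the LLM routes on.
    "description": (
        "Returns assessment scores for clinicians in a given department, "
        "filtered by minimum tenure and training completion status, "
        "ordered by score descending."
    ),
    "parameters": {
        "department": "string",
        "min_tenure_months": "integer",
        "training_complete": "boolean",
    },
}

def build_query(tenant_id: str, department: str,
                min_tenure_months: int, training_complete: bool) -> tuple[str, dict]:
    # tenant_id comes from the authenticated session, never from the
    # model, so the agent cannot widen the scope it queries.
    sql = (
        "SELECT clinician_name, score FROM assessment_scores "
        "WHERE tenant_id = %(tenant_id)s "
        "AND department = %(department)s "
        "AND tenure_months >= %(min_tenure_months)s "
        "AND training_complete = %(training_complete)s "
        "ORDER BY score DESC"
    )
    return sql, {
        "tenant_id": tenant_id,
        "department": department,
        "min_tenure_months": min_tenure_months,
        "training_complete": training_complete,
    }
```

Note the division of labour: the description encodes the business meaning for routing, while the query builder enforces scoping regardless of what the agent asks for.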
What to bring to the discovery call
- A target use case — one well-defined workflow involving querying Snowflake data and taking an action
- A sense of the user experience — what does the user ask, what should the agent do, what does a good outcome look like
- A list of Snowflake-backed capabilities your customers would benefit from accessing conversationally — think in terms of actions, not just data
- Existing API documentation for any services that sit on top of your Snowflake data
- Tenant scoping requirements — how do you currently filter Snowflake data per customer
- A named technical lead who understands the Snowflake data model and can make decisions about tool scoping and business logic