Each example includes a summary, the architecture pattern, and a developer-focused section showing what you’d actually build.

Healthcare SaaS: multi-source orchestration on Snowflake

Summary: A healthcare platform with 10,000+ customers uses Snowflake alongside an LMS and competency API. An agent cross-references all three to help administrators find qualified mentor candidates and assign them — replacing a manual process that involved multiple dashboards and CSV exports.

The problem

Administrators needed to identify qualified mentor candidates for new clinicians. This required checking assessment scores (in Snowflake), training completion records (in an LMS API), and competency certifications (in a separate competency API). There was no automated process — administrators manually searched across separate dashboards and exported CSVs to compare data.

How the agent works

  1. Define criteria: An administrator describes what they need — department, minimum assessment scores, required training completion, years of tenure, role/title.
  2. Query Snowflake + external APIs: The agent calls three MCP tools in sequence: a Snowflake MCP tool queries assessment results and builds a candidate pool; an LMS API MCP tool checks training completion; a competency API MCP tool verifies relevant skills and certifications.
  3. Present candidates: The agent returns a filtered, enriched candidate list rendered as an interactive widget.
  4. Take action: The agent calls the LMS API to create a new assignment group with the selected candidates. A multi-step admin workflow becomes a single conversation.
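The four steps above can be sketched as a single orchestration function. Every tool function below is a hypothetical stand-in for an MCP tool call, returning canned data so the flow is runnable end to end:

```python
# Illustrative sketch of the mentor-matching orchestration.
# All tool functions are hypothetical stand-ins for the MCP tools
# described above; the data they return is canned.

def get_candidates(department, min_score, min_tenure):
    """Stand-in for the Snowflake MCP tool: assessment-based candidate pool."""
    pool = [
        {"user_id": "u1", "score": 92, "tenure_years": 5},
        {"user_id": "u2", "score": 78, "tenure_years": 6},
        {"user_id": "u3", "score": 88, "tenure_years": 2},
    ]
    return [c for c in pool
            if c["score"] >= min_score and c["tenure_years"] >= min_tenure]

def check_training(user_ids):
    """Stand-in for the LMS API MCP tool: training completion per user."""
    completed = {"u1"}
    return {uid: uid in completed for uid in user_ids}

def verify_competencies(user_ids, competency_codes):
    """Stand-in for the competency API MCP tool: pass/fail per user."""
    certified = {"u1", "u2"}
    return {uid: uid in certified for uid in user_ids}

def find_mentor_candidates(department, min_score, min_tenure, codes):
    """Steps 1-3: build the pool, then enrich and filter it."""
    candidates = get_candidates(department, min_score, min_tenure)
    ids = [c["user_id"] for c in candidates]
    trained = check_training(ids)
    competent = verify_competencies(ids, codes)
    return [c for c in candidates
            if trained[c["user_id"]] and competent[c["user_id"]]]

print(find_mentor_candidates("oncology", 85, 3, ["COMP-101"]))
# → [{'user_id': 'u1', 'score': 92, 'tenure_years': 5}]
```

In production the filtering in step 1 would happen inside Snowflake rather than in Python, as the architecture pattern below describes.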

Architecture patterns

Push filtering to Snowflake, not the agent. The anti-pattern to avoid: Snowflake returns 500 candidates, then the agent makes 500 individual API calls to check each one. The correct approach: replicate LMS and competency data into Snowflake via standard ETL and run a single joined query.

Narrow tool scope = high accuracy. Each MCP tool does one thing: one pulls assessment data, one checks training completion, one verifies competencies, one creates assignment groups.
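Assuming the LMS and competency data have been replicated into Snowflake, the single joined query might look like the sketch below. All table and column names are hypothetical:

```python
# A single joined query replacing the N+1 pattern. Table and column names
# are illustrative; %(name)s placeholders follow the pyformat binding style
# used by snowflake-connector-python.
CANDIDATE_QUERY = """
SELECT a.user_id, a.score, u.tenure_years
FROM assessment_results a
JOIN user_profiles u      ON u.user_id = a.user_id
JOIN lms_training t       ON t.user_id = a.user_id   -- replicated LMS data
JOIN competency_results c ON c.user_id = a.user_id   -- replicated competency data
WHERE u.department = %(department)s
  AND a.score >= %(min_score)s
  AND u.tenure_years >= %(min_tenure)s
  AND t.completed = TRUE
  AND c.status = 'PASS'
"""

# With snowflake-connector-python, execution would look roughly like:
#   cursor.execute(CANDIDATE_QUERY, {"department": "oncology",
#                                    "min_score": 85, "min_tenure": 3})
```

One round trip to the warehouse replaces hundreds of per-candidate API calls.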

For developers: what you’d build

  1. get-candidates — accepts department, min_score, min_tenure. Runs a Snowflake SQL query joining assessment results with user profiles. Returns a filtered candidate list as JSON.
  2. check-training — accepts a list of user IDs. Calls your LMS API to check training completion. Returns completion status per user.
  3. verify-competencies — accepts a list of user IDs and required competency codes. Calls your competency API. Returns pass/fail per user per competency.
  4. create-group — accepts a list of user IDs and a group name. Calls your LMS API to create the student group. Returns confirmation.
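One way these four tools might be declared is sketched below. The parameter schemas are simplified placeholders rather than real MCP JSON Schema, but the narrow, specific descriptions are the point: the model routes on them.

```python
# Illustrative tool declarations for the four tools above. Schemas are
# simplified placeholders, not the actual MCP JSON Schema format.
TOOLS = [
    {
        "name": "get-candidates",
        "description": ("Returns clinicians in a given department whose "
                        "assessment score and tenure meet the supplied minimums."),
        "parameters": {"department": "string", "min_score": "number",
                       "min_tenure": "number"},
    },
    {
        "name": "check-training",
        "description": "Returns training completion status for a list of user IDs.",
        "parameters": {"user_ids": "array<string>"},
    },
    {
        "name": "verify-competencies",
        "description": ("Returns pass/fail per user per competency for the "
                        "supplied user IDs and competency codes."),
        "parameters": {"user_ids": "array<string>",
                       "competency_codes": "array<string>"},
    },
    {
        "name": "create-group",
        "description": "Creates an LMS assignment group containing the given users.",
        "parameters": {"user_ids": "array<string>", "group_name": "string"},
    },
]
```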

Travel commerce: agent-guided purchasing on Snowflake data

Summary: A travel platform uses Snowflake for availability, pricing, and cancellation data. An agent guides new users through a multi-stage upgrade purchasing flow, using Snowflake data at each decision point to reduce anxiety and increase conversion.

The problem

New users had high abandonment rates in the upgrade purchase flow. The platform had rich Snowflake data (average winning bids, success rates per route, cancellation-driven availability) that could reduce friction, but it was locked behind static UI.

How the agent works

  1. Intent capture: User expresses interest in an upgrade. The agent confirms class and route.
  2. Process education: Before showing options, the agent explains how bidding and instant purchase work — no commitment, no payment yet.
  3. Show available upgrades: Queries Snowflake-backed MCP tools for real-time availability, enriched with social proof — average winning bid for the route, number of successful upgrades, potential savings.
  4. Journey selection: User selects a journey, chooses between bidding or purchasing instantly.
  5. Bid placement: User enters a bid amount. The agent shows potential savings calculated from Snowflake pricing data.
  6. Resolution: Instant win triggers celebration UI with savings displayed. Otherwise, handoff to native auction screen.

Architecture patterns

Agentic and non-agentic zones. Search stays in the native app. Payment and notification modals are native components triggered from conversation with callback protocols. The agent handles the guided flow where Snowflake data adds the most value.

Progressive commitment. No hard commitment until bid placement. Every earlier stage is explicitly non-binding.

For developers: what you’d build

  1. get-available-upgrades — accepts route, date, class preference. Queries Snowflake for journeys with upgrade availability.
  2. get-social-proof — accepts route and class. Queries Snowflake for average winning bid, success rate, and volume data.
  3. calculate-savings — accepts bid amount and journey ID. Queries Snowflake for standard upgrade price. Returns savings calculation.
  4. place-bid — accepts journey ID, bid amount, user ID. Calls your bidding API. Returns bid confirmation and instant-win status.
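The savings calculation in calculate-savings might look like the sketch below. The standard-price lookup is a stub standing in for the Snowflake query, and all names are hypothetical:

```python
# Hypothetical calculate-savings logic: compare a user's bid against the
# standard upgrade price that would normally come from Snowflake.

def get_standard_price(journey_id):
    """Stand-in for a Snowflake lookup of the standard upgrade price."""
    prices = {"J-100": 240.0}
    return prices[journey_id]

def calculate_savings(journey_id, bid_amount):
    standard = get_standard_price(journey_id)
    savings = standard - bid_amount
    return {
        "standard_price": standard,
        "bid_amount": bid_amount,
        "savings": round(savings, 2),
        "savings_pct": round(100 * savings / standard, 1),
    }

print(calculate_savings("J-100", 150.0))
# → {'standard_price': 240.0, 'bid_amount': 150.0,
#    'savings': 90.0, 'savings_pct': 37.5}
```

The agent renders this payload at bid placement (step 5 of the flow) and again in the celebration UI on an instant win.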

HR and recruitment: replacing UX flows with conversations

Summary: An HR platform with 8+ product areas replaced dedicated UX flows with conversational agents calling the same underlying Snowflake-backed APIs. New capabilities now take days to ship instead of weeks.

Architecture pattern

New capabilities = new API endpoints, not new UI flows. Add data model (days), expose API endpoint (days), document for agents (hours). The pattern: write a Snowflake query, wrap it in an MCP server, write a good tool description. No frontend code. No design cycle. The agent handles rendering based on the response shape.
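A minimal sketch of that query-to-tool pattern is below. The registry and decorator are illustrative, not a real MCP SDK; the point is that a new capability is one decorated function with a routing-quality description:

```python
# Illustrative "query -> tool" pattern: register a Snowflake-backed function
# with a specific description and no frontend code. The registry and
# decorator stand in for a real MCP server framework.

TOOL_REGISTRY = {}

def tool(description):
    def register(fn):
        TOOL_REGISTRY[fn.__name__.replace("_", "-")] = {
            "fn": fn,
            "description": description,
        }
        return fn
    return register

@tool("Returns headcount by department for a given month, from the HR warehouse.")
def get_headcount(department, month):
    # In production this would run a parameterized Snowflake query;
    # the return value here is canned.
    return {"department": department, "month": month, "headcount": 42}

# Shipping a new capability = write the query, add one decorated function.
```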

Scenario: fintech — portfolio insights and trade execution

Summary: A B2B fintech platform with Snowflake powering portfolio analytics, risk models, and transaction data. Wealth advisors manage hundreds of client portfolios and need to surface insights, run scenarios, and execute rebalancing actions.

For developers: what you’d build

  • query-portfolio-composition — accepts client IDs, sector filter, allocation threshold. Runs a Snowflake query joining holdings, positions, and sector classification tables.
  • run-risk-assessment — accepts client IDs and risk factor. Queries Snowflake risk model outputs. Returns risk scores and flagged portfolios.
  • generate-rebalance-orders — accepts client IDs and target allocation model. Queries current positions from Snowflake, calculates deltas, generates order set.
  • execute-orders — accepts order set. Calls your OMS API to submit. Returns execution confirmation.
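The delta math inside generate-rebalance-orders could look like this sketch. Positions would come from Snowflake in practice; here they are inlined, and all names are hypothetical:

```python
# Hypothetical generate-rebalance-orders math: compare current position
# values against a target allocation model and emit buy/sell deltas.

def generate_rebalance_orders(positions, target_allocation):
    """positions: {symbol: market_value}; target_allocation: {symbol: weight}."""
    total = sum(positions.values())
    orders = []
    for symbol, weight in target_allocation.items():
        target_value = total * weight
        delta = target_value - positions.get(symbol, 0.0)
        if abs(delta) > 1e-9:
            orders.append({
                "symbol": symbol,
                "side": "BUY" if delta > 0 else "SELL",
                "amount": round(abs(delta), 2),
            })
    return orders

print(generate_rebalance_orders(
    {"AAA": 60_000.0, "BBB": 40_000.0},
    {"AAA": 0.5, "BBB": 0.5},
))
# → [{'symbol': 'AAA', 'side': 'SELL', 'amount': 10000.0},
#    {'symbol': 'BBB', 'side': 'BUY', 'amount': 10000.0}]
```

The resulting order set is what execute-orders would hand to the OMS API.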

Scenario: logistics — shipment visibility and exception handling

Summary: A logistics SaaS platform with Snowflake storing shipment tracking, carrier performance, and SLA compliance data. Operations managers need to identify exceptions, understand root causes, and take corrective action.

For developers: what you’d build

  • get-sla-exceptions — accepts date range, customer filter, severity threshold. Queries Snowflake joining shipment tracking with SLA definitions.
  • get-carrier-performance — accepts carrier ID, lane, time period. Queries Snowflake for on-time delivery rates, average delay, and trend data.
  • reroute-shipment — accepts shipment IDs and target carrier. Calls your routing API.
  • update-tracking — accepts shipment IDs and new status/carrier. Writes back to Snowflake tracking table.
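The aggregation behind get-carrier-performance might look like this sketch, with shipment rows inlined in place of the Snowflake result set:

```python
# Hypothetical get-carrier-performance aggregation: on-time rate and average
# delay from shipment rows that would normally come out of Snowflake.

def carrier_performance(shipments):
    """shipments: list of {"delay_hours": float}; on-time means delay <= 0."""
    on_time = sum(1 for s in shipments if s["delay_hours"] <= 0)
    delays = [s["delay_hours"] for s in shipments if s["delay_hours"] > 0]
    return {
        "on_time_rate": round(on_time / len(shipments), 3),
        "avg_delay_hours": round(sum(delays) / len(delays), 2) if delays else 0.0,
    }

print(carrier_performance([
    {"delay_hours": -1.0}, {"delay_hours": 0.0},
    {"delay_hours": 4.0}, {"delay_hours": 2.0},
]))
# → {'on_time_rate': 0.5, 'avg_delay_hours': 3.0}
```

In production this computation belongs in the Snowflake query itself, per the push-filtering pattern; the Python version just makes the shape of the answer concrete.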

Scenario: edtech — learner progress and intervention

Summary: An edtech platform with Snowflake storing learner engagement, assessment scores, course completion, and intervention outcomes. Instructors need to identify who’s falling behind and what to do about it.

For developers: what you’d build

  • get-at-risk-learners — accepts cohort ID, risk threshold, assessment target. Queries Snowflake joining engagement logs, interim scores, and a predictive model output table.
  • get-intervention-recommendations — accepts learner profiles. Queries Snowflake for historical intervention outcomes on similar profiles.
  • assign-interventions — accepts learner IDs and intervention types. Calls your LMS API to create enrolments.
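The flagging logic inside get-at-risk-learners could combine the model output with raw engagement signals, roughly as sketched below (all field names hypothetical):

```python
# Hypothetical get-at-risk-learners filter: flag learners by the predictive
# model's risk score and by low engagement, recording which signal fired.

def at_risk_learners(learners, risk_threshold):
    flagged = []
    for learner in learners:
        signals = []
        if learner["predicted_risk"] >= risk_threshold:
            signals.append("model")
        if learner["engagement_pct"] < 40:
            signals.append("low_engagement")
        if signals:
            flagged.append({"learner_id": learner["learner_id"],
                            "signals": signals})
    return flagged

cohort = [
    {"learner_id": "s1", "predicted_risk": 0.82, "engagement_pct": 25},
    {"learner_id": "s2", "predicted_risk": 0.35, "engagement_pct": 80},
    {"learner_id": "s3", "predicted_risk": 0.61, "engagement_pct": 55},
]
print(at_risk_learners(cohort, 0.6))
# → [{'learner_id': 's1', 'signals': ['model', 'low_engagement']},
#    {'learner_id': 's3', 'signals': ['model']}]
```

Returning the triggering signals alongside each learner gives the instructor context for choosing an intervention.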

Scenario: insurance — claims triage and adjudication support

Summary: An insurtech platform with Snowflake storing claims data, policy details, historical adjudication patterns, and fraud indicators. Claims adjusters need to triage, assess, and route claims efficiently.

For developers: what you’d build

  • get-new-claims — accepts date range, adjuster ID, complexity threshold. Queries Snowflake joining claims, policies, and a complexity scoring model.
  • find-similar-claims — accepts claim attributes. Queries Snowflake for historical matches with outcomes and settlement data.
  • get-fraud-indicators — accepts claim IDs. Queries Snowflake fraud scoring table. Returns flagged claims with indicator details.
  • route-claims — accepts claim IDs and target team. Calls your claims management API.
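A simple version of the matching inside find-similar-claims is sketched below: score historical claims by shared attributes and return the best matches with their settlements. The attributes and data are hypothetical, and the real matching would run in Snowflake:

```python
# Hypothetical find-similar-claims scoring: rank historical claims by how
# many attributes they share with a new claim.

def find_similar_claims(claim, history, top_n=2):
    keys = ("loss_type", "region", "policy_tier")
    scored = []
    for past in history:
        score = sum(1 for k in keys if past[k] == claim[k])
        scored.append((score, past["claim_id"], past["settlement"]))
    scored.sort(key=lambda t: t[0], reverse=True)  # stable sort keeps tie order
    return [{"claim_id": cid, "match_score": s, "settlement": amt}
            for s, cid, amt in scored[:top_n]]

claim = {"loss_type": "water", "region": "west", "policy_tier": "gold"}
history = [
    {"claim_id": "C-1", "loss_type": "water", "region": "west",
     "policy_tier": "gold", "settlement": 12000},
    {"claim_id": "C-2", "loss_type": "fire", "region": "west",
     "policy_tier": "gold", "settlement": 8000},
    {"claim_id": "C-3", "loss_type": "water", "region": "east",
     "policy_tier": "silver", "settlement": 5000},
]
print(find_similar_claims(claim, history))
# → [{'claim_id': 'C-1', 'match_score': 3, 'settlement': 12000},
#    {'claim_id': 'C-2', 'match_score': 2, 'settlement': 8000}]
```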

Common anti-patterns

The N+1 API call problem. Snowflake returns a list of entities, then the agent calls an external API once per entity to enrich or filter. Fix: replicate external data into Snowflake via ETL and run a single joined query.
The “query anything” tool. A single MCP tool that accepts arbitrary natural language and generates SQL against your entire warehouse. Accuracy is low because the tool scope is too broad. Fix: narrow, purpose-built tools that each query a specific subset of your data.
Vague tool descriptions. The LLM uses tool descriptions to decide which tool to call. “Queries data from Snowflake” leads to misrouting. “Returns assessment scores for clinicians in a given department filtered by minimum tenure and training completion status” leads to accurate routing. Tool descriptions are business logic: invest in them.

Ignoring semantic-level logic. Everything works at the query level, but the agent doesn’t understand your domain vocabulary. Fix: inject business terminology and orchestration rules into the agent configuration.

Mixing agentic and non-agentic responsibilities. Not every part of a user journey should run through the agent. Native app components (search bars, payment modals, OS permissions) are often better handled natively with handoff protocols.