dbt MCP Server
Official MCP server for [dbt (data build tool)](https://www.getdbt.com/product/what-is-dbt), providing integration with the dbt Core/Cloud CLI, project metadata discovery, model information, and semantic layer querying.
Capabilities (13 decomposed)
dbt project metadata discovery and graph traversal
Medium confidence: Exposes 20 discovery tools that parse dbt project manifests and artifacts to retrieve models, sources, tests, macros, exposures, and lineage relationships. Uses a discovery client that loads compiled dbt artifacts (manifest.json, catalog.json) and traverses the dependency graph to answer structural queries about project composition, model relationships, and data lineage. Implements pagination and caching strategies to optimize context delivery for large projects.
Implements a dedicated discovery client architecture that parses compiled dbt manifests and catalogs, enabling structured graph traversal with built-in pagination and caching strategies optimized for large projects. Unlike REST API approaches, it works offline with local artifacts and supports multi-project mode for monorepo dbt setups.
Faster and more complete than querying dbt Cloud Admin API for metadata because it operates on local compiled artifacts without network latency, and supports full lineage traversal including column-level dependencies.
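The graph traversal described above can be illustrated with the `parent_map` key that compiled dbt manifests actually contain. This is a minimal offline sketch, not dbt-mcp's discovery client; the node IDs are made up, and a real manifest carries far more metadata per node:

```python
from collections import deque

# Hypothetical manifest fragment; real manifest.json files from
# `dbt compile` contain many more keys per node.
MANIFEST = {
    "parent_map": {
        "model.proj.orders": ["model.proj.stg_orders"],
        "model.proj.stg_orders": ["source.proj.raw.orders"],
        "source.proj.raw.orders": [],
    }
}

def upstream_lineage(manifest: dict, node_id: str) -> list[str]:
    """Breadth-first walk over parent_map collecting every ancestor of a node."""
    seen, order = set(), []
    queue = deque(manifest["parent_map"].get(node_id, []))
    while queue:
        parent = queue.popleft()
        if parent in seen:
            continue
        seen.add(parent)
        order.append(parent)
        queue.extend(manifest["parent_map"].get(parent, []))
    return order

print(upstream_lineage(MANIFEST, "model.proj.orders"))
# ['model.proj.stg_orders', 'source.proj.raw.orders']
```

Because the walk only touches the pre-compiled artifact, it needs no network access, which is what makes the offline mode described above possible.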
dbt CLI command execution with binary detection and environment isolation
Medium confidence: Provides 10 tools that execute dbt CLI commands (build, run, test, compile, parse, snapshot, seed, freshness, docs generate, retry) by detecting the dbt binary location, validating project structure, and executing commands in isolated subprocess contexts with environment variable injection. Implements CLI binary detection logic that searches the system PATH, virtual environments, and project-local installations, then streams command output and exit codes back to the MCP client with error handling and timeout management.
Implements intelligent dbt binary detection that searches multiple installation contexts (system PATH, venv, project-local) and validates project structure before execution. Uses subprocess isolation with environment variable injection to enable safe, repeatable command execution in agent contexts without modifying global state.
More flexible than direct dbt Python API calls because it supports all CLI commands and respects user-configured dbt profiles, and more reliable than shell invocation because it handles binary detection and environment validation automatically.
credential management and OAuth authentication flow
Medium confidence: Implements a credential management system that securely stores and retrieves dbt Cloud API tokens, data warehouse credentials, and other authentication secrets. Supports multiple authentication methods including environment variables, credential files, and OAuth flows for dbt Cloud. Uses secure credential storage patterns and implements token refresh logic for OAuth-based authentication. Enables agents to authenticate with dbt Cloud and data warehouses without exposing credentials in tool calls.
Implements a pluggable credential provider system that supports multiple authentication methods (environment variables, files, OAuth) with automatic token refresh for OAuth flows. Enables secure credential management without exposing secrets in tool calls or logs.
More secure than hardcoded credentials because it uses OS-level credential storage and implements token refresh, and more flexible than single-method authentication because it supports multiple credential sources with fallback logic.
tool registration, filtering, and auto-disable based on authentication state
Medium confidence: Implements a dynamic tool registration system that enables or disables tools based on available credentials and configuration. Tools that require dbt Cloud credentials are automatically disabled if authentication fails; tools requiring data warehouse access are disabled if connection validation fails. Uses a validation framework that tests each tool's prerequisites at startup and during runtime, filtering the tool list exposed to MCP clients based on actual availability.
Implements automatic tool filtering based on credential validation, ensuring MCP clients only see tools that are actually available. Uses a validation framework that tests prerequisites at startup and provides clear error messages for disabled tools.
More user-friendly than exposing all tools and failing at runtime because it filters unavailable tools upfront, and more maintainable than manual tool lists because validation is automated and reflects actual server state.
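Prerequisite-based filtering can be sketched as a registry of predicates evaluated against the server's configuration. The tool names and config keys below are invented for illustration:

```python
from typing import Callable

# Each tool declares a prerequisite check; names are hypothetical.
TOOLS: dict[str, Callable[[dict], bool]] = {
    "list_models": lambda cfg: True,                   # always available
    "trigger_job": lambda cfg: "cloud_token" in cfg,   # needs dbt Cloud auth
    "execute_sql": lambda cfg: "warehouse_dsn" in cfg, # needs warehouse creds
}

def available_tools(config: dict) -> list[str]:
    """Return only the tools whose prerequisites pass for this config."""
    return [name for name, check in TOOLS.items() if check(config)]

print(available_tools({"cloud_token": "tok"}))
# ['list_models', 'trigger_job']
```

Running the checks once at startup, and again when credentials change, is what keeps the advertised tool list in sync with what will actually succeed.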
caching and pagination strategies for large project contexts
Medium confidence: Implements intelligent caching of dbt artifacts and query results to optimize performance and reduce context size for large projects. Uses pagination tokens to break large result sets into manageable chunks, implements LRU caching for frequently accessed metadata, and provides cache invalidation strategies. Enables agents to work with large dbt projects without overwhelming context windows or causing performance degradation.
Implements a multi-layer caching strategy with LRU eviction and pagination support, optimized for large dbt projects. Provides cache statistics and invalidation controls to enable agents to manage context efficiently.
More scalable than loading entire project metadata at once because it uses pagination and caching, and more transparent than opaque caching because it exposes cache hit rates and pagination tokens to agents.
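The two mechanisms named above, LRU eviction and token-based pagination, can each be sketched in a few lines. This is a generic illustration, not the server's cache implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Small LRU cache; a stand-in for whatever eviction policy the server uses."""
    def __init__(self, capacity: int = 128):
        self.capacity, self.data = capacity, OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

def paginate(items: list, page_size: int, token: int = 0):
    """Return one page plus the token for the next page, or None at the end."""
    page = items[token:token + page_size]
    next_token = token + page_size if token + page_size < len(items) else None
    return page, next_token

page, token = paginate(list(range(5)), page_size=2)
print(page, token)  # [0, 1] 2
```

An agent keeps requesting pages with the returned token until it comes back `None`, so no single tool response has to carry the whole project graph.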
dbt Semantic Layer querying with MetricFlow SQL compilation
Medium confidence: Exposes 6 tools that query the dbt Semantic Layer by translating natural language or structured queries into MetricFlow SQL using the Semantic Layer client. Implements a client architecture that authenticates with dbt Cloud, retrieves semantic model definitions (metrics, dimensions, entities), compiles queries to SQL, and executes them against the data warehouse. Supports both direct SQL execution and query compilation for inspection.
Provides direct integration with dbt Semantic Layer via authenticated client that compiles natural language or structured queries to MetricFlow SQL, enabling metric-driven analytics without requiring users to write SQL. Includes query compilation inspection for transparency into metric calculation logic.
More governance-aware than direct SQL querying because it enforces metric definitions and lineage through the Semantic Layer, and more accessible than MetricFlow CLI because it abstracts authentication and query compilation into simple MCP tools.
dbt Cloud Admin API job orchestration and monitoring
Medium confidence: Exposes 11 tools that interact with the dbt Cloud Admin API to trigger job runs, monitor execution status, retrieve run artifacts, manage job configurations, and query historical run data. Implements an Admin API client that authenticates with dbt Cloud API tokens, constructs API requests, polls for job completion, and parses run artifacts (logs, manifest, run_results.json). Supports async job triggering with status polling and artifact retrieval.
Implements a full-featured Admin API client with async job triggering, status polling, and artifact retrieval, enabling agents to orchestrate dbt Cloud jobs without manual intervention. Includes intelligent polling with configurable timeouts and error handling for network failures.
More complete than dbt Cloud UI automation because it provides programmatic job triggering and artifact access, and more reliable than webhook-based approaches because it uses synchronous polling with guaranteed artifact retrieval.
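The poll-until-terminal pattern described above can be sketched with the status fetcher injected as a callable, so the loop is testable without touching the real Admin API. The terminal state names are assumptions, and `fetch_status` stands in for an HTTP GET against the run endpoint:

```python
import time
from typing import Callable

TERMINAL = {"success", "error", "cancelled"}  # assumed terminal run states

def poll_run(fetch_status: Callable[[], str],
             interval: float = 0.0, timeout: float = 5.0) -> str:
    """Poll a status callable until a terminal state or the deadline.

    In a real client, fetch_status would wrap an authenticated GET against
    a dbt Cloud Admin API run endpoint; here it is injected for testing.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in TERMINAL:
            return status
        time.sleep(interval)  # back off between polls
    raise TimeoutError("run did not reach a terminal state in time")

# Fake status sequence standing in for successive API responses.
states = iter(["queued", "running", "success"])
print(poll_run(lambda: next(states)))  # success
```

Only once a terminal state is observed would the client fetch artifacts like run_results.json, which is what "guaranteed artifact retrieval" depends on.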
SQL execution and natural language to SQL translation
Medium confidence: Provides 2 tools that execute raw SQL queries against the dbt data warehouse and translate natural language descriptions into executable SQL. The SQL execution tool connects to the warehouse using dbt profiles and credentials, executes queries with timeout protection, and returns structured results. The translation tool leverages LLM capabilities (via the MCP client) to convert natural language intent into SQL, which can then be executed or inspected.
Integrates SQL execution with natural language translation in a single tool pair, allowing agents to both generate and execute queries without context switching. Uses dbt profile credentials for seamless warehouse authentication without requiring separate credential management.
More integrated than separate SQL clients because it combines execution and translation, and more secure than direct SQL input because it validates queries before execution and enforces timeout limits.
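The "structured results" shape can be sketched with `sqlite3` standing in for the warehouse connection a dbt profile would normally provide. The result dictionary layout and row cap are illustrative choices, not the server's actual response schema:

```python
import sqlite3

def execute_sql(conn: sqlite3.Connection, sql: str, max_rows: int = 100) -> dict:
    """Run a query and return column names plus a capped row set.

    sqlite3 is a stand-in for a real warehouse connection; the result
    shape here is an assumption for illustration.
    """
    cur = conn.execute(sql)
    columns = [d[0] for d in cur.description]
    rows = cur.fetchmany(max_rows)
    # A leftover row means the cap truncated the result set.
    return {"columns": columns, "rows": rows, "truncated": cur.fetchone() is not None}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
print(execute_sql(conn, "SELECT id, amount FROM orders"))
```

Capping rows serves the same purpose as pagination elsewhere in the server: keeping any single tool response small enough for an agent's context window.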
dbt code generation with YAML scaffolding and model templates
Medium confidence: Exposes 3 tools that generate dbt YAML configurations and SQL templates: source YAML generation from database introspection, model YAML generation with column documentation, and staging model SQL generation. Uses the dbt codegen library to introspect database schemas, extract column metadata, and generate boilerplate YAML and SQL that follows dbt best practices. Outputs are ready-to-use templates that reduce manual scaffolding work.
Wraps dbt codegen library to provide three complementary generation tools (source, model, staging) that work together to accelerate dbt project setup. Generates production-ready YAML and SQL that follows dbt best practices without requiring manual template creation.
More complete than manual YAML writing because it introspects database schemas automatically, and more flexible than dbt Cloud IDE templates because it supports custom generation parameters and integrates with agent workflows.
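The shape of source-YAML scaffolding can be sketched by rendering introspected column metadata into a dbt sources block. This is a heavily simplified stand-in for what the real dbt codegen package emits, and the column descriptions here are placeholder text:

```python
def generate_source_yaml(source_name: str, table: str,
                         columns: dict[str, str]) -> str:
    """Render a minimal dbt source YAML block from column metadata.

    A simplified sketch; the actual codegen output is richer and follows
    dbt's full source-properties schema.
    """
    lines = [
        "version: 2",
        "sources:",
        f"  - name: {source_name}",
        "    tables:",
        f"      - name: {table}",
        "        columns:",
    ]
    for col, dtype in columns.items():
        lines.append(f"          - name: {col}")
        lines.append(f"            description: '{dtype} column'")
    return "\n".join(lines)

print(generate_source_yaml("raw", "orders", {"id": "integer", "amount": "numeric"}))
```

In practice the `columns` dict would come from querying the warehouse's information schema rather than being supplied by hand.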
dbt Language Server Protocol (LSP) integration for column-level lineage
Medium confidence: Provides tools that integrate with the dbt Fusion Language Server to enable column-level lineage analysis and code intelligence. Uses the LSP protocol to query column dependencies, trace data flow through transformations, and provide IDE-like features (hover information, go-to-definition) for dbt SQL code. Requires the LSP server to be running locally or remotely.
Integrates with dbt Fusion LSP to provide column-level lineage analysis that goes beyond model-level dependencies, enabling fine-grained impact analysis and data flow tracing. Uses LSP protocol for standardized code intelligence features.
More precise than model-level lineage because it traces individual columns through transformations, and more interactive than static analysis because it leverages LSP for real-time code intelligence.
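The transport underneath any LSP integration is standardized: JSON-RPC payloads framed with a `Content-Length` header. The framing below follows the LSP base protocol; the request method shown is a standard LSP method used for illustration, since the dbt Fusion server's specific lineage methods are not documented here:

```python
import json

def encode_lsp_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload per the LSP base protocol:
    a Content-Length header, a blank line, then the UTF-8 JSON body."""
    body = json.dumps(payload).encode("utf-8")
    return f"Content-Length: {len(body)}\r\n\r\n".encode("ascii") + body

# Standard LSP request shape; the URI is a made-up example path.
request = {
    "jsonrpc": "2.0", "id": 1, "method": "textDocument/definition",
    "params": {"textDocument": {"uri": "file:///models/orders.sql"}},
}
msg = encode_lsp_message(request)
print(msg[:30])
```

A client writes these framed messages to the language server's stdin (or a socket) and reads responses framed the same way.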
dbt product documentation search and retrieval
Medium confidence: Provides tools that search and retrieve dbt product documentation (docs.getdbt.com) to answer questions about dbt features, best practices, and configuration. Implements a documentation search client that indexes dbt docs, supports semantic search, and returns relevant documentation snippets with source URLs. Enables agents to provide context-aware dbt guidance without requiring manual documentation lookup.
Provides semantic search over dbt product documentation, enabling agents to retrieve relevant guidance without requiring exact keyword matching. Integrates documentation retrieval into agent workflows for context-aware dbt assistance.
More accessible than manual documentation browsing because it uses semantic search to find relevant content, and more comprehensive than hardcoded FAQs because it covers the full dbt documentation corpus.
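To make the retrieval idea concrete, here is a toy keyword-overlap ranker over a few invented doc snippets. The real server presumably uses embedding-based semantic search; this sketch only shows the score-and-rank shape of the tool:

```python
# Invented snippets standing in for indexed docs.getdbt.com content.
DOCS = [
    ("Configuring incremental models", "incremental models update only new rows"),
    ("Using sources", "declare raw tables as sources and test freshness"),
    ("Writing tests", "generic and singular tests validate model output"),
]

def search_docs(query: str, top_k: int = 2) -> list[str]:
    """Rank snippets by keyword overlap with the query; drop zero scores."""
    terms = set(query.lower().split())
    scored = [(len(terms & set(text.lower().split())), title)
              for title, text in DOCS]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [title for score, title in scored[:top_k] if score > 0]

print(search_docs("how do incremental models work"))
# ['Configuring incremental models']
```

Swapping the overlap score for cosine similarity over embeddings turns this keyword ranker into the semantic search the capability describes.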
MCP server metadata and capability discovery
Medium confidence: Exposes tools that describe the dbt-mcp server itself: available tools, their schemas, authentication requirements, and configuration options. Enables MCP clients to dynamically discover server capabilities, validate tool parameters, and understand authentication flows. Implements introspection endpoints that return tool definitions, required credentials, and supported features.
Provides MCP-native introspection that allows clients to dynamically discover available tools and their schemas, enabling adaptive client behavior based on server capabilities. Implements tool filtering based on authentication state to surface only available tools.
More dynamic than static documentation because it reflects actual server configuration and authentication state, and more client-friendly than manual schema lookup because it provides structured metadata.
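Introspection that reflects authentication state can be sketched as a registry filtered at describe time. The tool names and schema fields below echo common MCP tool-schema conventions but are not copied from dbt-mcp:

```python
# Illustrative registry; names and fields are hypothetical.
REGISTRY = [
    {"name": "list_models", "requires_auth": False,
     "input_schema": {"type": "object",
                      "properties": {"limit": {"type": "integer"}}}},
    {"name": "trigger_job", "requires_auth": True,
     "input_schema": {"type": "object",
                      "properties": {"job_id": {"type": "integer"}}}},
]

def describe_server(authenticated: bool) -> dict:
    """Return the tool list a client would see for this auth state."""
    tools = [t for t in REGISTRY if authenticated or not t["requires_auth"]]
    return {"tool_count": len(tools), "tools": [t["name"] for t in tools]}

print(describe_server(authenticated=False))
# {'tool_count': 1, 'tools': ['list_models']}
```

Because the response is computed from live state rather than a static document, an unauthenticated client never sees `trigger_job` in the first place.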
multi-project dbt workspace management with project switching
Medium confidence: Enables dbt-mcp to operate on multiple dbt projects within a single workspace by implementing project context switching and isolated tool execution per project. Implements a configuration system that supports multiple project paths, maintains separate artifact caches per project, and routes tool calls to the appropriate project context. Allows agents to work with monorepo dbt setups or multiple independent dbt projects without restarting the server.
Implements project context switching at the MCP server level, allowing a single server instance to manage multiple dbt projects with isolated artifact caches and tool execution contexts. Enables seamless agent workflows across monorepo dbt setups without requiring separate server instances.
More efficient than running separate dbt-mcp servers per project because it consolidates server overhead, and more flexible than single-project servers because it supports dynamic project switching without restart.
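Per-project context routing can be sketched as a registry holding an isolated path and artifact cache per project, with an active pointer that tool calls resolve through. This is a shape sketch, not dbt-mcp's implementation; the project names and paths are invented:

```python
from pathlib import Path

class ProjectRegistry:
    """Route tool calls to per-project contexts; a multi-project sketch."""
    def __init__(self):
        self.projects: dict[str, dict] = {}
        self.active: str | None = None

    def register(self, name: str, path: str):
        # Each project keeps its own artifact cache, isolated from others.
        self.projects[name] = {"path": Path(path), "cache": {}}

    def switch(self, name: str):
        if name not in self.projects:
            raise KeyError(f"unknown project: {name}")
        self.active = name

    def context(self) -> dict:
        """Resolve the context every tool call should operate in."""
        return self.projects[self.active]

reg = ProjectRegistry()
reg.register("analytics", "/repo/analytics")
reg.register("finance", "/repo/finance")
reg.switch("finance")
print(reg.context()["path"])
```

Keeping caches inside each project entry is what prevents a switch from leaking one project's manifest data into another's results.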
Capabilities are decomposed by AI analysis. Each maps to specific user intents and improves with match feedback.
Related Artifacts (sharing capabilities)
Artifacts that share capabilities with dbt, ranked by overlap. Discovered automatically through the match graph.
- Euno: Transforms data modeling with seamless dbt™ integration and...
- Elementary: Open-source dbt-native data observability and anomaly detection.
- dbt-docs: MCP server for dbt-core (OSS) users, as the official dbt MCP only supports dbt Cloud. Supports project metadata, model- and column-level lineage, and dbt documentation.
- Metaplane: Monitor, manage, and enhance data integrity...
- Dagster: An orchestration platform for the development, production, and observation of data assets.
- Meltano: Open-source DataOps platform built on Singer and dbt.
Best For
- ✓ dbt practitioners building AI-assisted data documentation and lineage tools
- ✓ data engineers automating dbt project analysis and governance
- ✓ teams integrating dbt metadata into LLM-powered agents for data discovery
- ✓ AI agents orchestrating dbt workflows in CI/CD pipelines
- ✓ teams building dbt-integrated automation platforms
- ✓ developers creating LLM-powered dbt assistants that need to execute commands
- ✓ teams deploying dbt-mcp in multi-user environments requiring credential isolation
- ✓ organizations with strict credential management policies
Known Limitations
- ⚠ Requires pre-compiled dbt artifacts (manifest.json, catalog.json); does not parse raw YAML files
- ⚠ Graph traversal performance degrades on projects with 1000+ models unless caching is enabled
- ⚠ Column-level lineage requires dbt 1.5+ with manifest v10+ schema
- ⚠ Does not support real-time updates; artifacts must be re-parsed after each dbt run
- ⚠ Requires the dbt binary to be installed and accessible in PATH or a project virtual environment
- ⚠ Command execution is synchronous; long-running commands (dbt run on large projects) may time out if not configured