TeamCity MCP Server
Free - MCP server for TeamCity; integrates with Claude Desktop and Cursor.
Capabilities (12 decomposed)
MCP protocol translation to TeamCity REST API
Medium confidence: Translates incoming Model Context Protocol (MCP) JSON-RPC 2.0 requests into TeamCity REST API calls through a dedicated protocol handler (internal/mcp/handler.go) that manages session lifecycle, request routing, and response marshaling. The handler implements the full MCP specification including initialization, resource discovery, and tool invocation, converting structured MCP messages into authenticated HTTP requests to TeamCity's /app/rest endpoints.
Implements full MCP specification as a dedicated protocol layer (internal/mcp/handler.go) that decouples MCP concerns from TeamCity API logic, enabling clean separation between protocol translation and business logic — most CI/CD integrations embed protocol handling directly in API client code
Provides native MCP support out-of-the-box for Claude Desktop and Cursor, eliminating the need for custom API wrappers or prompt engineering to interact with TeamCity
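The translation step described above can be sketched as a small routing table from MCP methods to REST calls. This is an illustrative reconstruction, not the actual contents of internal/mcp/handler.go: the `rpcRequest` shape follows JSON-RPC 2.0, but the route table and function names are assumptions.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// rpcRequest models the subset of a JSON-RPC 2.0 envelope the handler needs.
type rpcRequest struct {
	JSONRPC string          `json:"jsonrpc"`
	ID      int             `json:"id"`
	Method  string          `json:"method"`
	Params  json.RawMessage `json:"params"`
}

// restTarget maps an MCP method onto the TeamCity REST call that serves it.
// The table is illustrative; a real handler covers the full MCP surface.
func restTarget(method string) (verb, path string, ok bool) {
	routes := map[string][2]string{
		"resources/list": {"GET", "/app/rest/projects"},
		"tools/call":     {"POST", "/app/rest/buildQueue"},
	}
	r, ok := routes[method]
	if !ok {
		return "", "", false
	}
	return r[0], r[1], true
}

func main() {
	raw := []byte(`{"jsonrpc":"2.0","id":1,"method":"resources/list"}`)
	var req rpcRequest
	if err := json.Unmarshal(raw, &req); err != nil {
		panic(err)
	}
	verb, path, _ := restTarget(req.Method)
	fmt.Println(verb, path)
}
```

Keeping the route table separate from the HTTP client is what allows the protocol layer to stay free of TeamCity-specific logic.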
Multi-transport server with HTTP, WebSocket, and STDIO support
Medium confidence: Implements a production-grade server (internal/server/server.go) supporting three distinct transport mechanisms: HTTP for REST-like access, WebSocket for persistent bidirectional communication, and STDIO for local process integration. The server component handles connection lifecycle management, request routing, and graceful shutdown across all transports, allowing flexible deployment in cloud, desktop, and local development environments.
Implements unified transport abstraction (internal/server/server.go) that handles HTTP, WebSocket, and STDIO through a single request/response pipeline, eliminating transport-specific branching in protocol and API logic — typical MCP servers hardcode one transport or duplicate handler logic per transport
Supports STDIO transport natively for seamless Claude Desktop/Cursor integration without requiring separate proxy servers or network configuration
Response caching and health check monitoring
Medium confidence: Implements a caching layer for frequently accessed TeamCity data (projects, build types, agents) and periodic health checks to monitor TeamCity server availability. The caching system reduces API calls to TeamCity and improves response latency for resource discovery operations. Health checks detect connectivity issues and enable graceful degradation or alerting when TeamCity becomes unavailable.
Combines response caching with active health monitoring in a unified subsystem, allowing the server to serve cached data during TeamCity outages while maintaining visibility into availability — most MCP servers lack built-in caching or health monitoring
Improves response latency and system resilience by caching frequently accessed resources while monitoring TeamCity availability for operational visibility
JSON-RPC 2.0 protocol compliance and error handling
Medium confidence: Implements full JSON-RPC 2.0 specification compliance in the MCP protocol handler, including proper request/response formatting, error code mapping, and exception handling. The handler validates incoming requests, maps TeamCity API errors to JSON-RPC error codes, and returns properly formatted error responses with descriptive messages. This ensures compatibility with standard JSON-RPC clients and enables clear error communication to AI agents.
Implements strict JSON-RPC 2.0 compliance with proper error code mapping and validation in the protocol handler (internal/mcp/handler.go), ensuring compatibility with standard JSON-RPC clients — many MCP implementations use simplified or non-standard JSON-RPC variants
Provides standards-compliant JSON-RPC 2.0 support that integrates with any JSON-RPC 2.0 client, not just MCP-specific tools
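The error-mapping step can be sketched as below. The codes (-32600, -32602, -32603) come from the JSON-RPC 2.0 specification; which HTTP status maps to which code is this sketch's assumption, not documented behavior of the server:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// rpcError follows the JSON-RPC 2.0 error object shape.
type rpcError struct {
	Code    int    `json:"code"`
	Message string `json:"message"`
}

// fromHTTPStatus maps a failed TeamCity REST response onto a JSON-RPC 2.0
// error code with a descriptive message.
func fromHTTPStatus(status int) rpcError {
	switch {
	case status == 404:
		return rpcError{Code: -32602, Message: "invalid params: resource not found"}
	case status == 401 || status == 403:
		return rpcError{Code: -32600, Message: "invalid request: not authorized"}
	default:
		return rpcError{Code: -32603, Message: fmt.Sprintf("internal error: TeamCity returned HTTP %d", status)}
	}
}

func main() {
	b, _ := json.Marshal(fromHTTPStatus(404))
	fmt.Println(string(b))
}
```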
TeamCity resource discovery and listing via MCP resources
Medium confidence: Exposes TeamCity resources (projects, build types, builds, agents) as MCP resource URIs (teamcity://projects, teamcity://buildTypes, teamcity://builds, teamcity://agents) that map directly to TeamCity REST API endpoints (/app/rest/projects, /app/rest/buildTypes, etc.). The resource handler fetches and structures data from TeamCity, enabling AI agents to discover and enumerate CI/CD infrastructure without needing to understand TeamCity's API structure.
Maps TeamCity REST endpoints directly to MCP resource URIs with transparent JSON transformation, allowing AI agents to discover infrastructure through standard MCP resource protocol rather than custom tool invocations — most CI/CD integrations require separate 'list' tools for each resource type
Provides structured, discoverable access to TeamCity infrastructure that AI agents can explore naturally without memorizing API endpoint patterns or parameter structures
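Since the URI-to-endpoint mapping described above is direct, the resolution step reduces to a scheme check and a prefix swap. A minimal sketch, with the function name assumed for illustration:

```go
package main

import (
	"fmt"
	"strings"
)

// resourcePath resolves an MCP resource URI such as teamcity://projects to
// the REST endpoint that backs it.
func resourcePath(uri string) (string, error) {
	const scheme = "teamcity://"
	if !strings.HasPrefix(uri, scheme) {
		return "", fmt.Errorf("unsupported resource URI %q", uri)
	}
	return "/app/rest/" + strings.TrimPrefix(uri, scheme), nil
}

func main() {
	p, _ := resourcePath("teamcity://buildTypes")
	fmt.Println(p)
}
```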
Build triggering with branch and property parameters
Medium confidence: Implements the trigger_build tool that initiates new TeamCity builds with support for specifying target branch, custom build parameters, and build type selection. The tool accepts buildTypeId, branchName, and properties parameters, constructs a TeamCity build request, and returns build ID and status. This enables AI agents to programmatically start CI/CD pipelines with context-specific configuration.
Accepts structured parameters (buildTypeId, branchName, properties) that map directly to TeamCity's build request schema, enabling AI agents to construct valid build triggers without understanding TeamCity's internal parameter format — most CI/CD tools require users to know exact parameter names and types
Allows AI agents to trigger builds with branch and parameter context from natural language, reducing the need for users to manually specify technical build configuration details
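The request the tool constructs roughly mirrors the JSON body TeamCity accepts on POST /app/rest/buildQueue. The struct and function names below are a sketch, and the example buildTypeId and property name are made up:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type property struct {
	Name  string `json:"name"`
	Value string `json:"value"`
}

type buildTypeRef struct {
	ID string `json:"id"`
}

type propertyList struct {
	Property []property `json:"property"`
}

// buildRequest mirrors the body TeamCity expects when queueing a build.
type buildRequest struct {
	BuildType  buildTypeRef `json:"buildType"`
	BranchName string       `json:"branchName,omitempty"`
	Properties propertyList `json:"properties"`
}

// newTrigger assembles the queue request from the trigger_build parameters.
func newTrigger(buildTypeID, branch string, props map[string]string) buildRequest {
	r := buildRequest{BuildType: buildTypeRef{ID: buildTypeID}, BranchName: branch}
	for name, value := range props {
		r.Properties.Property = append(r.Properties.Property, property{Name: name, Value: value})
	}
	return r
}

func main() {
	body, _ := json.Marshal(newTrigger("MyProj_Build", "refs/heads/main", map[string]string{"env.DEPLOY": "true"}))
	fmt.Println(string(body))
}
```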
Build cancellation with comment annotation
Medium confidence: Implements the cancel_build tool that stops running TeamCity builds by buildId with optional comment annotation. The tool sends a cancellation request to TeamCity's build management API, allowing AI agents to halt in-progress builds and provide context about why the cancellation occurred. Comments are stored in TeamCity's build history for audit and debugging purposes.
Combines build cancellation with comment annotation in a single tool invocation, allowing AI agents to provide context about cancellation decisions that persists in TeamCity's audit trail — most CI/CD tools separate cancellation and annotation into distinct operations
Enables AI agents to stop builds with explanatory context, improving team visibility into why builds were halted compared to silent cancellations
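A sketch of the request this tool would send, assuming the buildCancelRequest body shape from TeamCity's REST API; the function name and endpoint layout are this sketch's assumptions:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// cancelRequest mirrors TeamCity's buildCancelRequest body; readdIntoQueue
// stays false so the build is stopped rather than re-queued.
type cancelRequest struct {
	Comment        string `json:"comment"`
	ReaddIntoQueue bool   `json:"readdIntoQueue"`
}

// cancelCall returns the REST path and JSON body for stopping one build,
// carrying the comment so it lands in TeamCity's audit trail.
func cancelCall(buildID int, comment string) (string, []byte) {
	path := fmt.Sprintf("/app/rest/builds/id:%d", buildID)
	body, _ := json.Marshal(cancelRequest{Comment: comment})
	return path, body
}

func main() {
	p, body := cancelCall(42, "superseded by a newer commit")
	fmt.Println(p, string(body))
}
```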
Build pinning to prevent automatic cleanup
Medium confidence: Implements the pin_build tool that marks TeamCity builds as 'pinned' to prevent automatic cleanup and retention policy deletion. The tool accepts buildId, pin (boolean), and optional comment parameters, allowing AI agents to preserve important builds (successful releases, baseline builds) from garbage collection. Pinned builds remain accessible for artifact retrieval and historical analysis.
Provides explicit build pinning as a first-class tool operation with comment annotation, enabling AI agents to make retention decisions and document them in-place — most CI/CD systems require manual UI interaction or complex retention policy configuration to preserve builds
Allows AI agents to programmatically preserve important builds with context, reducing manual intervention in release and artifact management workflows
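The pin/unpin pair maps cleanly onto two HTTP verbs against one endpoint. A sketch, assuming TeamCity's /app/rest/builds/&lt;locator&gt;/pin endpoint (where the comment, when present, travels as a plain-text body); the function name is illustrative:

```go
package main

import "fmt"

// pinCall returns the HTTP method and path for pinning or unpinning a build.
func pinCall(buildID int, pin bool) (method, path string) {
	path = fmt.Sprintf("/app/rest/builds/id:%d/pin", buildID)
	if pin {
		return "PUT", path
	}
	return "DELETE", path
}

func main() {
	m, p := pinCall(42, true)
	fmt.Println(m, p)
}
```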
Build tagging with add/remove operations
Medium confidence: Implements the set_build_tag tool that manages TeamCity build tags through add and remove operations. The tool accepts buildId, tags (array of tag strings to add), and removeTags (array of tag strings to remove) parameters, allowing AI agents to organize and categorize builds with semantic labels. Tags enable filtering, searching, and organizing builds by release version, environment, or custom metadata.
Combines add and remove tag operations in a single tool invocation with separate parameters, allowing atomic tag updates without race conditions — most CI/CD tools require separate API calls for adding and removing tags
Enables AI agents to organize builds with semantic labels and manage tag lifecycle programmatically, improving build discoverability and organization in large TeamCity instances
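The atomic-update property comes from resolving additions and removals in one pass before writing back. A sketch of that resolution step, with names assumed for illustration:

```go
package main

import (
	"fmt"
	"sort"
)

// applyTags resolves one set_build_tag invocation: additions and removals
// are computed together, so callers never observe a half-updated tag set.
func applyTags(current, add, remove []string) []string {
	set := map[string]bool{}
	for _, t := range current {
		set[t] = true
	}
	for _, t := range add {
		set[t] = true
	}
	for _, t := range remove {
		delete(set, t)
	}
	out := make([]string, 0, len(set))
	for t := range set {
		out = append(out, t)
	}
	sort.Strings(out) // deterministic order for the write-back payload
	return out
}

func main() {
	fmt.Println(applyTags([]string{"old", "keep"}, []string{"release-1.2"}, []string{"old"}))
}
```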
Build artifact download and retrieval
Medium confidence: Implements the download_artifact tool that retrieves build artifacts from TeamCity by buildId and artifactPath. The tool constructs artifact download URLs, handles authentication, and streams artifact content to the client. This enables AI agents to access build outputs (binaries, logs, reports) for analysis, deployment, or distribution without requiring direct TeamCity UI access.
Exposes artifact download as a first-class tool operation that handles authentication and path resolution transparently, allowing AI agents to retrieve artifacts without understanding TeamCity's artifact storage structure — most CI/CD integrations require users to construct artifact URLs manually
Enables AI agents to access build artifacts programmatically for analysis and deployment, reducing manual artifact retrieval steps in CI/CD workflows
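The path-resolution part of this tool can be sketched as follows, assuming TeamCity's /artifacts/content endpoint convention; the function name and the example host are made up:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// artifactURL builds the content URL for one build artifact. Each path
// segment is escaped individually so nested artifact paths survive intact.
func artifactURL(base string, buildID int, artifactPath string) string {
	segs := strings.Split(artifactPath, "/")
	for i, s := range segs {
		segs[i] = url.PathEscape(s)
	}
	return fmt.Sprintf("%s/app/rest/builds/id:%d/artifacts/content/%s",
		strings.TrimRight(base, "/"), buildID, strings.Join(segs, "/"))
}

func main() {
	fmt.Println(artifactURL("https://tc.example.com/", 99, "dist/app.zip"))
}
```

Authentication would be attached as a header on the subsequent GET; this sketch covers only URL construction.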
Advanced build search with multi-criteria filtering
Medium confidence: Implements the search_builds tool that queries TeamCity builds using multiple filter criteria: status (success/failure/running), branch, user (who triggered the build), and tags. The tool constructs TeamCity REST API search queries with combined filters, returning matching builds with metadata. This enables AI agents to find specific builds based on complex criteria without enumerating all builds.
Combines multiple search criteria (status, branch, user, tags) in a single tool invocation with AND-based filtering, allowing AI agents to construct complex queries without multiple API calls — most CI/CD tools require separate queries for each filter dimension
Enables AI agents to find specific builds efficiently in large TeamCity instances without enumerating all builds, improving performance and reducing API load
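Combined filters map naturally onto TeamCity's build-locator syntax. A sketch of the query-construction step; the exact dimension syntax (branch:, user:(username:...), tag:) should be checked against the TeamCity REST locator documentation, and the function name is assumed:

```go
package main

import (
	"fmt"
	"strings"
)

// buildLocator assembles a TeamCity build-locator string from the tool's
// filters; empty filters are skipped so any combination of criteria works,
// and the comma-separated dimensions combine as AND.
func buildLocator(status, branch, user string, tags []string) string {
	var parts []string
	if status != "" {
		parts = append(parts, "status:"+status)
	}
	if branch != "" {
		parts = append(parts, "branch:"+branch)
	}
	if user != "" {
		parts = append(parts, "user:(username:"+user+")")
	}
	for _, t := range tags {
		parts = append(parts, "tag:"+t)
	}
	return strings.Join(parts, ",")
}

func main() {
	fmt.Println("/app/rest/builds?locator=" + buildLocator("SUCCESS", "main", "", []string{"release"}))
}
```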
Configuration management with environment variables and secrets
Medium confidence: Manages server configuration through environment variables and configuration files (internal/config/config.go), including TeamCity server URL, authentication credentials, transport settings, and logging configuration. The configuration system supports multiple authentication methods (token-based, username/password) and transport modes (HTTP, WebSocket, STDIO), allowing flexible deployment across different environments without code changes.
Implements configuration through environment variables and structured config files (internal/config/config.go) with support for multiple authentication methods and transport modes, enabling flexible deployment without code changes — most CI/CD integrations hardcode configuration or require complex setup
Supports environment-based configuration that integrates seamlessly with container orchestration (Docker, Kubernetes) and cloud deployment patterns
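An environment-based setup might look like the fragment below. The variable names are assumptions for illustration; the names the server actually reads are defined in internal/config/config.go:

```shell
# Illustrative environment configuration; variable names are assumptions,
# check internal/config/config.go for the names the server actually reads.
export TEAMCITY_URL="https://teamcity.example.com"
export TEAMCITY_TOKEN="********"   # token auth; or supply username/password instead
export MCP_TRANSPORT="stdio"       # stdio | http | websocket
export LOG_LEVEL="info"
```

The same variables can be supplied via a container orchestrator's secret and env mechanisms, which is what makes Docker/Kubernetes deployment configuration-only.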
Related Artifacts (sharing capabilities)
Artifacts that share capabilities with TeamCity, ranked by overlap. Discovered automatically through the match graph.
@claude-flow/mcp
Standalone MCP (Model Context Protocol) server - stdio/http/websocket transports, connection pooling, tool registry
llm-analysis-assistant
A very streamlined MCP client that supports calling and monitoring stdio/sse/streamableHttp, and ca…
mcp-use
The fullstack MCP framework to develop MCP Apps for ChatGPT / Claude & MCP Servers for AI Agents.
playwright-mcp
Playwright MCP server
@clerk/mcp-tools
Tools for writing MCP clients and servers without pain
@modelcontextprotocol/server-map
MCP App Server example with CesiumJS 3D globe and geocoding
Best For
- ✓ AI-powered development tools (Claude Desktop, Cursor) that support MCP
- ✓ Teams building custom LLM agents that need TeamCity integration
- ✓ Organizations standardizing on MCP for CI/CD tool access
- ✓ Teams deploying MCP servers in containerized/cloud infrastructure
- ✓ Desktop application developers integrating with Claude Desktop or Cursor
- ✓ Organizations needing flexible transport options for different deployment scenarios
- ✓ High-traffic MCP server deployments with frequent resource queries
- ✓ Production environments requiring availability monitoring
Known Limitations
- ⚠ Requires an MCP-compatible client (Claude Desktop, Cursor, or a custom MCP consumer)
- ⚠ Protocol overhead adds latency compared to direct REST API calls
- ⚠ MCP specification version compatibility must be maintained across updates
- ⚠ STDIO transport is limited to a single client connection per process instance
- ⚠ WebSocket connections require persistent network availability
- ⚠ HTTP transport lacks built-in request authentication (relies on TeamCity credentials only)
Categories
Alternatives to TeamCity