jupyter-mcp-server
MCP Server · Free
🪐 🔧 Model Context Protocol (MCP) Server for Jupyter.
Capabilities (15 decomposed)
mcp protocol bridging to jupyter environments
Medium confidence: Implements a FastMCP-based server that translates Model Context Protocol messages from AI clients (Claude Desktop, VS Code, Cursor) into Jupyter API calls, using STDIO and HTTP transports with CORS middleware. The server maintains a singleton ServerContext for configuration and routes requests through a tool registry to 15+ specialized notebook operation tools, enabling stateful interaction with Jupyter kernels and notebook documents.
Dual-mode architecture supporting both standalone MCP server (port 4040) and embedded Jupyter Server extension, enabling deployment flexibility without requiring separate infrastructure. Uses FastMCPWithCORS for native HTTP transport with CORS support, differentiating from stdio-only MCP implementations.
Provides native Jupyter integration via standard Jupyter APIs rather than reverse-engineering notebook formats, ensuring compatibility with JupyterHub, Google Colab, and Datalayer Notebooks simultaneously.
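The routing pattern described above — an incoming MCP message dispatched through a tool registry to a named handler — can be sketched in plain Python. This is a hypothetical simplification (the registry decorator, tool name, and handler are illustrative; the real server uses FastMCP's decorator-based registration), not the project's actual code.

```python
# Minimal sketch of MCP-style request routing through a tool registry.
# Hypothetical names throughout; FastMCP handles this via @mcp.tool() in practice.
import json

TOOL_REGISTRY = {}

def tool(name):
    """Register a handler function under a tool name."""
    def register(fn):
        TOOL_REGISTRY[name] = fn
        return fn
    return register

@tool("insert_cell")
def insert_cell(notebook, index, source):
    # Stand-in for a real notebook operation.
    return {"status": "ok", "notebook": notebook, "index": index}

def handle_request(raw_message):
    """Parse a JSON tool-call message and dispatch to the registered handler."""
    msg = json.loads(raw_message)
    handler = TOOL_REGISTRY.get(msg["tool"])
    if handler is None:
        return {"status": "error", "reason": f"unknown tool {msg['tool']!r}"}
    return handler(**msg["arguments"])

result = handle_request(json.dumps({
    "tool": "insert_cell",
    "arguments": {"notebook": "demo.ipynb", "index": 2, "source": "x = 1"},
}))
```

The registry makes the tool set data-driven: adding a tool is one decorated function, and unknown tool names fail gracefully instead of crashing the server.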
multi-notebook session management with concurrent kernel execution
Medium confidence: The NotebookManager component maintains isolated session state for multiple notebooks, tracking kernel connections, cell execution order, and output buffers per notebook. It implements session lifecycle management (open, close, switch) and routes execution requests to the correct kernel instance, enabling AI clients to work with multiple notebooks in parallel without cross-contamination of kernel state or variable scope.
Implements explicit notebook session tracking via NotebookManager with per-notebook kernel references, rather than relying on Jupyter's implicit kernel selection. Enables AI clients to maintain multiple concurrent notebook contexts without manual kernel switching.
Provides programmatic multi-notebook orchestration that Jupyter's native UI lacks, allowing AI agents to coordinate work across multiple notebooks as a single logical workflow.
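The per-notebook isolation described above can be sketched with a toy manager in which each notebook gets its own variable namespace. This is a hypothetical simplification — real kernels are separate processes reached over Jupyter's messaging protocol, not dicts — but it shows why switching sessions never leaks variables between notebooks.

```python
# Sketch of per-notebook session isolation (hypothetical simplification of
# the NotebookManager described above; live kernels replaced by dicts).
class NotebookManager:
    def __init__(self):
        self._sessions = {}
        self.current = None

    def open(self, path):
        # Each notebook gets its own namespace and history; nothing is shared.
        self._sessions.setdefault(path, {"kernel_vars": {}, "history": []})
        self.current = path

    def switch(self, path):
        if path not in self._sessions:
            raise KeyError(f"notebook {path!r} is not open")
        self.current = path

    def execute(self, code):
        session = self._sessions[self.current]
        session["history"].append(code)
        exec(code, session["kernel_vars"])  # isolated namespace per notebook

mgr = NotebookManager()
mgr.open("a.ipynb")
mgr.execute("x = 1")
mgr.open("b.ipynb")
mgr.execute("x = 99")     # does not touch a.ipynb's x
mgr.switch("a.ipynb")
```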
docker containerization with multi-architecture support
Medium confidence: Distributes the MCP server as a multi-architecture Docker image (datalayer/jupyter-mcp-server) supporting amd64 and arm64 platforms. The Dockerfile installs the jupyter-mcp-server package and Jupyter dependencies, enabling one-command deployment in containerized environments. The image includes both standalone server and extension modes, selectable via environment variables or command-line arguments.
Provides multi-architecture Docker images (amd64, arm64) built with GitHub Actions, enabling deployment on diverse infrastructure without requiring local builds.
Eliminates dependency installation and Python version management that manual deployments require, reducing deployment friction in containerized environments.
multimodal output processing with image and plot rendering
Medium confidence: Captures and processes cell execution outputs in multiple MIME types (text/plain, text/html, image/png, image/svg+xml, application/json), converting matplotlib figures and pandas DataFrames into base64-encoded images or HTML. The output processor preserves the original MIME type metadata, allowing clients to render outputs appropriately (display images, render tables, parse JSON).
Preserves MIME type metadata for each output, enabling clients to render outputs appropriately (images as images, HTML as HTML, JSON as structured data) rather than converting everything to text.
Captures and returns rich outputs (plots, tables) that text-only execution APIs discard, enabling AI to reason about visual results and make data-driven decisions.
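The MIME-preserving idea can be sketched with stdlib base64: each output keeps its original MIME type, and binary payloads are base64-encoded for transport. The output shapes here are illustrative assumptions; actual Jupyter output messages carry richer metadata bundles.

```python
# Sketch of MIME-preserving output processing: the mime type travels with the
# data, and binary payloads (e.g. PNG plots) are base64-encoded.
import base64

def process_output(mime_type, payload):
    if mime_type.startswith("image/"):
        data = base64.b64encode(payload).decode("ascii")
    else:
        data = payload
    return {"mime_type": mime_type, "data": data}

png_bytes = b"\x89PNG\r\n\x1a\n"   # fake PNG header standing in for a plot
outputs = [
    process_output("text/plain", "42"),
    process_output("image/png", png_bytes),
]
```

Because the MIME type is never discarded, a client can decode `image/png` entries back to bytes and render them, instead of receiving a lossy text dump.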
configuration management with environment variable and file-based settings
Medium confidence: Implements ServerContext singleton that loads configuration from environment variables and optional config files, managing settings like Jupyter Server URL, authentication tokens, notebook paths, and deployment mode (standalone vs. extension). Configuration is loaded at server startup and cached in memory, allowing clients to query current settings via tools.
Implements ServerContext singleton for centralized configuration management, enabling environment-variable-based configuration suitable for containerized deployments without requiring code changes.
Supports both environment variables and config files, providing flexibility for different deployment scenarios (Docker, Kubernetes, local development) without code changes.
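The singleton-plus-environment-variable pattern can be sketched as follows. The variable names (`JUPYTER_URL`, `JUPYTER_TOKEN`, `MCP_MODE`) are assumptions for illustration — check the project README for the real ones.

```python
# Sketch of env-var-first configuration cached in a singleton.
# Variable names are hypothetical, not the project's documented ones.
import os

class ServerContext:
    _instance = None

    def __init__(self):
        self.jupyter_url = os.environ.get("JUPYTER_URL", "http://localhost:8888")
        self.token = os.environ.get("JUPYTER_TOKEN", "")
        self.mode = os.environ.get("MCP_MODE", "standalone")

    @classmethod
    def get(cls):
        # Loaded once at startup, then served from memory on every access.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

os.environ["MCP_MODE"] = "extension"
ctx = ServerContext.get()
```

Environment-variable configuration is what makes the same image usable in Docker, Kubernetes, and local development: only the deployment manifest changes, never the code.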
error handling and execution failure reporting with detailed diagnostics
Medium confidence: Implements comprehensive error handling that captures kernel errors (syntax errors, runtime exceptions, timeouts), network errors (connection failures, timeouts), and MCP protocol errors (invalid requests, schema violations). Errors are returned to clients with detailed diagnostic information (error type, traceback, execution context) enabling AI clients to understand failures and retry intelligently.
Captures and returns detailed kernel error tracebacks and execution context, enabling AI clients to understand failures and make intelligent retry decisions rather than treating all errors as opaque failures.
Provides detailed error diagnostics that generic execution APIs might suppress, enabling AI agents to debug and recover from failures autonomously.
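The structured-error idea can be sketched in a few lines: instead of collapsing a failure to an opaque string, the error type, message, and full traceback are captured into a response the client can inspect. The field names mirror Jupyter's error-output convention (`ename`, `evalue`, `traceback`) but the function itself is a hypothetical stand-in.

```python
# Sketch of structured failure reporting: type, message, and traceback are
# returned to the client instead of a generic "execution failed".
import traceback

def run_cell(code):
    try:
        namespace = {}
        exec(code, namespace)
        return {"status": "ok"}
    except Exception as exc:
        return {
            "status": "error",
            "ename": type(exc).__name__,    # e.g. "ZeroDivisionError"
            "evalue": str(exc),
            "traceback": traceback.format_exc(),
        }

report = run_cell("1 / 0")
```

An AI client seeing `ename == "ZeroDivisionError"` can retry with corrected code, while a `TimeoutError` might instead trigger a kernel restart — decisions that are impossible with an opaque failure flag.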
prompt templates for notebook-specific ai tasks
Medium confidence: Provides pre-built prompt templates (via MCP's prompts/list and prompts/get endpoints) that guide AI clients in common notebook tasks like code review, debugging, data exploration, and documentation generation. Templates include context about notebook structure and execution state, reducing the need for clients to construct prompts from scratch.
Provides MCP-native prompt templates that guide AI clients in notebook-specific tasks, reducing the need for clients to construct prompts from scratch and standardizing AI behavior across teams.
Offers structured task guidance that generic AI clients lack, enabling consistent and high-quality AI interactions with notebooks without requiring client-side prompt engineering.
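A prompts/get-style lookup can be sketched as a template table with notebook context interpolated at request time. The prompt name and template text here are illustrative assumptions, not the server's actual prompt set.

```python
# Sketch of MCP-style prompt templates keyed by task name, with notebook
# context filled in at request time (names and text are hypothetical).
PROMPTS = {
    "review_notebook": (
        "Review the notebook {path} ({num_cells} cells). "
        "Flag bugs, unclear code, and missing documentation."
    ),
}

def get_prompt(name, **context):
    template = PROMPTS.get(name)
    if template is None:
        raise KeyError(f"unknown prompt {name!r}")
    return template.format(**context)

prompt = get_prompt("review_notebook", path="analysis.ipynb", num_cells=12)
```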
cell-level code reading and writing with ast-aware insertion
Medium confidence: Exposes tools for reading notebook cell contents (code, markdown, raw) and writing new cells with position control (before, after, replace). The implementation preserves notebook structure by respecting cell boundaries and execution order, allowing AI clients to inspect code context before modification and insert cells at semantically meaningful positions without corrupting the notebook document structure.
Implements position-aware cell insertion (before/after/replace) that maintains notebook execution order semantics, rather than simple append-only operations. Preserves cell metadata and execution counts during modifications.
Provides fine-grained cell-level control that notebook UIs typically hide, enabling AI agents to reason about code structure and insertion points programmatically.
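Position-controlled insertion over nbformat-style cell dicts can be sketched as follows. The cell dicts are simplified assumptions — real cells also carry `metadata`, `outputs`, and `execution_count` fields that the server is described as preserving.

```python
# Sketch of position-aware cell insertion (before/after/replace) on a list
# of simplified nbformat-style cell dicts.
def insert_cell(cells, index, source, position="after"):
    new_cell = {"cell_type": "code", "source": source}
    if position == "before":
        cells.insert(index, new_cell)
    elif position == "after":
        cells.insert(index + 1, new_cell)
    elif position == "replace":
        cells[index] = new_cell
    else:
        raise ValueError(f"unknown position {position!r}")
    return cells

cells = [{"cell_type": "code", "source": "import pandas"},
         {"cell_type": "code", "source": "df.head()"}]
# Insert the load step between the import and the inspection cell.
insert_cell(cells, 0, "df = pandas.read_csv('data.csv')", position="after")
```

The `position` argument is what distinguishes this from append-only APIs: the agent can place a cell exactly where execution order requires it.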
synchronous and asynchronous cell execution with output capture
Medium confidence: Executes notebook cells against a live kernel using Jupyter's execute_request message protocol, capturing stdout, stderr, and structured outputs (plots, dataframes, images) in real-time. Supports both blocking execution (wait for completion) and non-blocking execution (poll for results), with output buffering that preserves multimodal content including matplotlib figures and pandas DataFrames rendered as HTML.
Implements dual execution pathways (sync and async) with multimodal output processing that preserves matplotlib figures, pandas DataFrames, and other rich MIME types as base64-encoded images and HTML, rather than converting everything to text.
Captures and returns structured outputs (plots, tables) that text-only execution APIs discard, enabling AI clients to reason about visual results and data structures.
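An execute_request envelope in Jupyter's kernel messaging protocol looks roughly like the sketch below. Header fields are abbreviated — real messages are HMAC-signed and carry session and parent-message ids — so treat this as a shape illustration, not a wire-complete message.

```python
# Sketch of a Jupyter execute_request message envelope (abbreviated; real
# messages are HMAC-signed and include session/parent ids).
import uuid
import datetime

def make_execute_request(code, silent=False):
    return {
        "header": {
            "msg_id": uuid.uuid4().hex,
            "msg_type": "execute_request",
            "date": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        },
        "content": {
            "code": code,
            "silent": silent,              # silent=True suppresses output broadcast
            "store_history": not silent,   # silent cells are kept out of In[]/Out[]
        },
    }

msg = make_execute_request("print('hi')")
```

Blocking execution waits for the matching `execute_reply` and `status: idle` messages; non-blocking execution sends the request and polls for outputs keyed by `msg_id`.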
notebook metadata inspection and kernel status monitoring
Medium confidence: Provides tools to inspect notebook metadata (kernel name, language, nbformat version), query kernel status (idle, busy, dead), and retrieve execution history (cell execution counts, timestamps). Implements polling-based kernel health monitoring that allows AI clients to determine if a kernel is ready for execution or if previous operations are still in flight.
Exposes kernel status and notebook metadata as queryable tools rather than implicit state, enabling AI clients to make execution decisions based on kernel readiness and language type.
Provides explicit kernel health checks that prevent AI agents from attempting execution on dead or busy kernels, reducing error rates in automated workflows.
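The polling-based readiness check can be sketched as a loop over a status source. The `get_status` callable here is a hypothetical stand-in; the real check is described as querying Jupyter's kernel API.

```python
# Sketch of polling-based kernel readiness: wait for 'idle', fail fast on
# 'dead', give up after a timeout (status source is a stand-in callable).
import time

def wait_until_idle(get_status, timeout=5.0, interval=0.01):
    """Poll kernel status until it reports 'idle' or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status == "idle":
            return True
        if status == "dead":
            raise RuntimeError("kernel is dead; restart before executing")
        time.sleep(interval)
    return False

statuses = iter(["busy", "busy", "idle"])  # simulated kernel status sequence
ready = wait_until_idle(lambda: next(statuses))
```

Checking readiness before each execution is what prevents the "execute on a dead kernel" failure mode called out above.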
jupyterlab ui integration with cell focus and navigation
Medium confidence: Implements tools that interact with JupyterLab's frontend state, including setting the active cell, scrolling to a cell, and triggering UI actions (save, run). Uses JupyterLab's message protocol to communicate with the frontend, allowing AI clients to control the notebook view and focus without requiring direct browser automation or UI scripting.
Bridges MCP protocol to JupyterLab's frontend message protocol, enabling AI clients to control the notebook UI view without browser automation or direct DOM manipulation.
Provides native JupyterLab integration that web-scraping or Selenium-based approaches cannot match, with lower latency and no browser dependency.
notebook file i/o with format preservation
Medium confidence: Implements tools to read and write notebook files (.ipynb) from the filesystem, preserving the Jupyter notebook JSON structure, cell metadata, and output artifacts. Uses Jupyter's nbformat library to parse and serialize notebooks, ensuring compatibility with Jupyter's format versioning and preventing corruption of notebook files during read-write cycles.
Uses Jupyter's nbformat library for format-aware parsing and serialization, ensuring compatibility with Jupyter's versioning and preventing format drift that custom JSON parsing might introduce.
Preserves notebook metadata and output artifacts that text-based or line-oriented file I/O would lose, maintaining full notebook fidelity.
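Since .ipynb files are JSON on disk, the round-trip-fidelity idea can be sketched with the stdlib `json` module alone. Note this is only an illustration of lossless read-write cycles — the project is described as using nbformat, which additionally validates and upgrades the schema.

```python
# Sketch of format-preserving notebook I/O using plain JSON (.ipynb files
# are JSON on disk; nbformat adds schema validation on top of this).
import json
import io

notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {"kernelspec": {"name": "python3", "language": "python"}},
    "cells": [{"cell_type": "code", "source": "x = 1", "metadata": {},
               "outputs": [], "execution_count": 1}],
}

buf = io.StringIO()          # stands in for a file on disk
json.dump(notebook, buf, indent=1)
buf.seek(0)
restored = json.load(buf)    # metadata and execution counts survive intact
```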
tool discovery and dynamic capability advertisement
Medium confidence: Implements MCP's tools/list and tools/describe endpoints to dynamically advertise the 15+ available tools with their schemas, parameters, and descriptions. The tool registry is populated at server startup from the tools/ directory, allowing clients to discover capabilities without hardcoding tool names. Each tool includes JSON Schema definitions for input validation and output typing.
Implements MCP's standard tool discovery protocol with JSON Schema validation, enabling generic MCP clients to work with the server without prior knowledge of available tools.
Provides self-documenting tool interfaces that REST APIs or custom protocols would require separate documentation for, reducing integration friction.
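A tools/list response can be sketched as below: each tool advertises a JSON Schema so a generic MCP client can validate arguments before calling. The tool name and schema here are illustrative assumptions, not the server's actual tool set.

```python
# Sketch of an MCP tools/list response with JSON Schema input definitions
# (tool name and schema are illustrative).
def list_tools():
    return {
        "tools": [
            {
                "name": "execute_cell",
                "description": "Run a notebook cell and return its outputs.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "cell_index": {"type": "integer", "minimum": 0},
                        "timeout": {"type": "number", "default": 30},
                    },
                    "required": ["cell_index"],
                },
            },
        ]
    }

catalog = list_tools()
```

Because the schema travels with the tool description, the interface is self-documenting: a client can build a valid call from the listing alone, with no out-of-band API docs.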
standalone server deployment with http transport and cors
Medium confidence: Deploys the MCP server as a standalone process (default port 4040) using FastAPI with CORS middleware, enabling HTTP-based MCP clients to connect without STDIO pipes. The server implements FastMCPWithCORS, a custom FastMCP subclass that adds CORS headers for cross-origin requests, allowing web-based MCP clients and browser extensions to communicate with the server.
Implements FastMCPWithCORS, a custom FastMCP subclass with CORS middleware, enabling HTTP-based MCP clients to connect without STDIO pipes. Provides both standalone and embedded deployment modes from the same codebase.
Offers flexible deployment options (standalone or embedded) that stdio-only MCP servers cannot match, enabling integration with web-based clients and containerized infrastructure.
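The CORS-header injection that a FastMCPWithCORS-style wrapper performs can be sketched as a plain function over a header dict. Header values are illustrative; a production deployment should restrict the allowed origin rather than use `*`.

```python
# Sketch of CORS header injection for cross-origin MCP clients
# (values illustrative; restrict the origin in production).
def add_cors_headers(response_headers, origin="*"):
    response_headers.update({
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type, Authorization",
    })
    return response_headers

headers = add_cors_headers({"Content-Type": "application/json"})
```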
jupyter server extension deployment with native integration
Medium confidence: Deploys the MCP server as a Jupyter Server extension (handlers.py), embedding it directly into the Jupyter Server process without requiring a separate service. The extension registers MCP endpoints alongside Jupyter's native API, allowing clients to connect via the same WebSocket or HTTP connection as the notebook frontend, with automatic authentication inheritance from Jupyter's session management.
Implements dual-mode deployment (standalone and embedded) with automatic authentication inheritance from Jupyter Server, eliminating the need for separate credential management in JupyterHub environments.
Provides seamless JupyterHub integration that standalone servers cannot match, with automatic per-user isolation and authentication without additional configuration.
Capabilities are decomposed by AI analysis. Each maps to specific user intents and improves with match feedback.
Related Artifacts (sharing capabilities)
Artifacts that share capabilities with jupyter-mcp-server, ranked by overlap. Discovered automatically through the match graph.
sandbox
All-in-One Sandbox for AI Agents that combines Browser, Shell, File, MCP and VSCode Server in a single Docker container.
Kubeflow
ML toolkit for Kubernetes — pipelines, notebooks, training, serving, feature store.
playwright-mcp
Playwright MCP server
CoCalc
Unlock advanced compute power with optional GPU support, seamless file synchronization, and versatile software environments, all billed by the second for...
Maven Tools
Enhanced Maven Central integration with intelligent caching, bulk operations, and version classification
Best For
- ✓ AI developers building Claude Desktop or VS Code extensions that need Jupyter integration
- ✓ Teams deploying AI agents that must interact with Jupyter notebooks as a data science tool
- ✓ Organizations running JupyterHub or JupyterLab who want to expose notebooks to MCP clients
- ✓ Data scientists running parallel experiments across multiple notebooks
- ✓ AI agents orchestrating multi-step workflows that span multiple notebook files
- ✓ Teams collaborating on projects with notebook-per-component architecture
- ✓ Kubernetes deployments where the MCP server runs as a sidecar or separate pod
- ✓ Docker Compose setups that bundle Jupyter and MCP server together
Known Limitations
- ⚠ Requires a running Jupyter server instance — cannot operate on offline notebooks
- ⚠ STDIO transport adds latency for high-frequency tool calls due to process serialization overhead
- ⚠ Multi-notebook session management requires explicit notebook ID tracking; no automatic context switching
- ⚠ Session state is ephemeral — restarting the MCP server loses all notebook session references
- ⚠ No built-in persistence of execution history across server restarts
- ⚠ Kernel memory is shared within a notebook session; no automatic isolation between cells
Repository Details
Last commit: Apr 21, 2026