multi-language script execution with auto-inferred json schemas
Executes code in 13+ languages (Python, TypeScript, Go, Bash, Java, Rust, C#, PHP, Deno, Bun, Ansible, Nu, SQL) by routing to language-specific executors in windmill-worker. Per-language parsers (windmill-parser-*) read function signatures and automatically infer JSON schemas, eliminating manual type annotation. Workers poll the PostgreSQL queue table using SELECT ... FOR UPDATE SKIP LOCKED, execute jobs in sandboxed nsjail environments, and store results in the completed_job table or S3, enabling polyglot workflow composition.
Unique: Uses language-specific AST parsers (not regex) to infer JSON schemas directly from function signatures, eliminating manual type annotation while supporting 13+ languages with isolated execution via nsjail per job
vs alternatives: Faster and more flexible than cloud-only solutions like Zapier because execution is local/self-hosted, and more polyglot-friendly than Temporal or Prefect which optimize for Python/TypeScript
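The real inference runs on language-specific ASTs inside the windmill-parser-* crates; a minimal Python sketch of the same idea, using `inspect` and type hints (the mapping table and function names here are illustrative, not Windmill's actual code), looks like this:

```python
import inspect
from typing import get_type_hints

# Illustrative mapping from Python annotations to JSON-schema types.
PY_TO_JSON = {int: "integer", float: "number", str: "string",
              bool: "boolean", list: "array", dict: "object"}

def infer_schema(fn):
    """Build a JSON schema for fn's parameters from its type hints.
    Parameters without a default become required fields."""
    hints = get_type_hints(fn)
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        props[name] = {"type": PY_TO_JSON.get(hints.get(name), "object")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}

def main(name: str, retries: int = 3, dry_run: bool = False):
    pass

schema = infer_schema(main)
```

Because the schema comes straight from the signature, renaming a parameter or changing its type immediately updates the generated API and UI.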
openflow-based workflow orchestration with state tracking
Composes multi-step workflows using the OpenFlow specification (openflow.openapi.yaml), where modules execute sequentially or in parallel with full state tracking in PostgreSQL's flow_status JSONB column. Each step can branch on conditions, loop over arrays, or call other flows/scripts, with intermediate results passed between steps via variable interpolation. The worker processes flow definitions by parsing the DAG, executing modules in dependency order, and persisting state after each step for resumability and debugging.
Unique: Tracks full execution state in PostgreSQL JSONB (not just logs), enabling step-level resumability and debugging; OpenFlow spec is open and language-agnostic unlike proprietary workflow DSLs
vs alternatives: More transparent than Zapier (full state visibility) and simpler than Airflow (no DAG compilation step) while supporting both visual and code-based workflow definition
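The persist-after-each-step pattern can be sketched as follows. This is not Windmill's actual flow_status schema — the field names and the in-memory "persistence" stand in for the JSONB UPDATE described above:

```python
import json

def run_flow(modules, state=None):
    """Execute modules in order, recording status after each step so a
    crashed or interrupted flow can resume from the first module that
    has not yet succeeded. Each module's fn receives prior results,
    standing in for variable interpolation between steps."""
    if state is None:
        state = {"modules": [{"id": m["id"], "status": "waiting"}
                             for m in modules]}
    results = {}
    for i, module in enumerate(modules):
        entry = state["modules"][i]
        if entry["status"] == "succeeded":
            results[module["id"]] = entry["result"]
            continue  # resume: skip already-completed steps
        result = module["fn"](results)
        state["modules"][i] = {"id": module["id"],
                               "status": "succeeded", "result": result}
        # In Windmill this would be an UPDATE of the flow_status JSONB
        # column; here we just prove the state is serializable.
        json.dumps(state)
        results[module["id"]] = result
    return results, state
```

Keeping state (not just logs) means a debugger or retry can reconstruct exactly which step ran with which inputs.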
client libraries for programmatic access (typescript, python, powershell)
Provides official SDKs in TypeScript, Python, and PowerShell for programmatically calling Windmill scripts and flows from external applications. The SDKs handle authentication, request serialization, and response deserialization, with type hints generated from script schemas. Clients support both synchronous and asynchronous execution, polling for job completion, and streaming results. The SDKs are auto-generated from the OpenAPI spec (windmill-api/openapi.yaml), ensuring consistency with the API.
Unique: Auto-generated from OpenAPI spec ensuring consistency; provides type hints based on inferred script schemas; supports both sync and async execution patterns
vs alternatives: More convenient than raw HTTP clients because of type safety and built-in serialization, and more flexible than webhooks for request-response patterns
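Under the hood the SDKs are thin wrappers over the REST API. A hand-rolled sketch of the same pattern is shown below — the endpoint path mirrors Windmill's "run script by path" route, but treat the exact URL shape as an assumption and prefer the generated SDKs in real code:

```python
import json
import urllib.request

class WindmillClient:
    """Minimal illustrative client; real SDKs add retries, polling
    helpers, and typed results generated from the OpenAPI spec."""

    def __init__(self, base_url, workspace, token):
        self.base_url = base_url.rstrip("/")
        self.workspace = workspace
        self.token = token

    def _build(self, method, path, payload=None):
        # Returns a prepared Request; callers pass it to urlopen().
        return urllib.request.Request(
            f"{self.base_url}/api/w/{self.workspace}/{path}",
            data=None if payload is None else json.dumps(payload).encode(),
            headers={"Authorization": f"Bearer {self.token}",
                     "Content-Type": "application/json"},
            method=method,
        )

    def run_script_async(self, script_path, args):
        """Enqueue a job with args mapped to the script's parameters;
        the API responds with the new job's UUID."""
        return self._build("POST", f"jobs/run/p/{script_path}", args)
```

The generated SDKs add what this sketch omits: schema-derived type hints on `args` and sync/async result retrieval.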
job result visualization and artifact management
Stores job results in the PostgreSQL completed_job table with full execution context (inputs, outputs, logs, duration), and provides a web UI for browsing results with filtering by status, date, and user. Large payloads (>1MB) are stored in S3 with references in the database. Results can be visualized as tables, charts, or raw JSON depending on output type, and artifacts (files, exports) are downloadable. The system maintains result history per script/flow for trend analysis and debugging.
Unique: Results stored with full execution context (inputs, outputs, logs, duration) in PostgreSQL; large payloads spilled to S3; web UI provides filtering and visualization
vs alternatives: More integrated than external logging systems because results are stored alongside execution metadata, and simpler than building custom dashboards
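The spill-to-S3 decision described above can be sketched as a single threshold check. The dict-backed `db` and `s3` stand in for the completed_job table and object store; names and the exact key layout are illustrative:

```python
import json

SPILL_THRESHOLD = 1 << 20  # 1 MiB, the threshold described above

def store_result(job_id, result, db, s3):
    """Keep small results inline alongside execution metadata; spill
    large ones to S3 and record only a reference in the database."""
    payload = json.dumps(result).encode()
    if len(payload) <= SPILL_THRESHOLD:
        db[job_id] = {"result": result, "s3_ref": None}
    else:
        key = f"results/{job_id}.json"
        s3[key] = payload
        db[job_id] = {"result": None, "s3_ref": key}
    return db[job_id]
```

Readers resolve the reference transparently, so the UI can render a 100-byte result and a 100 MB result through the same code path.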
dependency management with lockfile generation and caching
Automatically detects dependencies in scripts (imports, requires, use statements) and generates language-specific lockfiles (requirements.txt for Python, package-lock.json for Node.js, go.mod for Go, etc.) to ensure reproducible execution. Dependencies are cached on workers to avoid repeated downloads, and the system detects when lockfiles change to invalidate caches. The parsers (windmill-parser-*) extract imports from code and resolve them to specific versions, supporting both public registries and private package repositories.
Unique: Automatically detects and resolves dependencies from code without manual lockfile editing; generates language-specific lockfiles and caches on workers for fast execution
vs alternatives: More automatic than manual requirements management, and more reproducible than relying on latest versions
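For Python, import extraction of this kind can be done with the standard `ast` module rather than regex. A simplified sketch (the real parsers also handle version pinning and per-language syntaxes) looks like this:

```python
import ast

def extract_imports(source: str) -> list[str]:
    """Collect top-level package names imported by a Python script,
    walking the AST so strings and comments are never mistaken for
    imports (a common failure of regex-based scanners)."""
    packages = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                packages.add(alias.name.split(".")[0])
        elif isinstance(node, ast.ImportFrom):
            # Skip relative imports (level > 0): they are local modules.
            if node.module and node.level == 0:
                packages.add(node.module.split(".")[0])
    return sorted(packages)
```

The extracted package set is what gets resolved against a registry and frozen into the lockfile that workers cache.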
webhook-based job triggering with signature verification
Exposes webhook endpoints for each script/flow that accept HTTP POST requests and enqueue jobs with the request payload as parameters. Webhooks support signature verification (HMAC-SHA256) to ensure requests come from trusted sources, so external services (GitHub, Slack, Stripe, etc.) can trigger them without a Windmill API token. The system generates unique webhook URLs per script and supports custom headers and query parameters for routing. If the triggered job fails, it is retried with exponential backoff.
Unique: Generates unique webhook URLs per script with optional HMAC-SHA256 signature verification; integrates with external services without requiring API keys in Windmill
vs alternatives: More secure than unauthenticated webhooks because of signature verification, and simpler than building custom webhook handlers
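HMAC-SHA256 verification is standard-library territory; a minimal sketch (header names and secret handling vary by sending service) is:

```python
import hashlib
import hmac

def sign(secret: bytes, body: bytes) -> str:
    """Hex HMAC-SHA256 digest the sender attaches to the request,
    typically in a header such as X-Signature."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify(secret: bytes, body: bytes, signature: str) -> bool:
    """Recompute the digest over the raw body and compare in constant
    time, so timing differences leak nothing about the secret."""
    return hmac.compare_digest(sign(secret, body), signature)
```

The important details are signing the raw request body (before any JSON parsing) and using `compare_digest` rather than `==`.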
auto-generated rest apis and uis from scripts
Automatically exposes any script as a REST API endpoint and generates a web form UI by introspecting the inferred JSON schema. The API server (windmill-api) creates routes dynamically for each script, accepting JSON payloads that map to function parameters. The frontend (SvelteKit) renders form components based on schema type (string, number, object, array) with validation, and submits to the API which enqueues a job. Results are returned synchronously for short-running scripts or via polling/webhooks for long-running jobs, eliminating manual API/UI boilerplate.
Unique: Generates both REST API and web UI from a single source (function signature), with schema inference eliminating manual OpenAPI specs; form validation happens client-side and server-side
vs alternatives: Faster iteration than building custom APIs with FastAPI/Express, and more flexible than low-code platforms like Retool which require UI-first thinking
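The server-side half of that validation can be sketched as a walk over the inferred schema. Field names and the primitive-type table are illustrative, not Windmill's implementation:

```python
def validate_args(schema: dict, args: dict) -> list[str]:
    """Check a JSON payload against an inferred schema: required
    fields present, primitive types matching. Returns a list of
    error strings, empty when the payload is valid."""
    checks = {"string": str, "integer": int, "number": (int, float),
              "boolean": bool, "array": list, "object": dict}
    errors = []
    for name in schema.get("required", []):
        if name not in args:
            errors.append(f"missing required field: {name}")
    for name, value in args.items():
        expected = schema["properties"].get(name, {}).get("type")
        if expected and not isinstance(value, checks[expected]):
            errors.append(f"{name}: expected {expected}")
    return errors
```

The same schema drives the SvelteKit form widgets client-side, so both layers stay in sync automatically when the function signature changes.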
cron-based job scheduling with timezone and concurrency control
Schedules scripts and flows to run on cron expressions with timezone awareness, storing schedule definitions in PostgreSQL and using a background scheduler service to enqueue jobs at the specified times. The scheduler respects concurrency limits per script (preventing duplicate runs if previous execution hasn't completed) and supports both simple cron syntax and human-readable schedules. Failed scheduled jobs are retried with exponential backoff, and execution history is logged for audit and debugging.
Unique: Integrates scheduling directly into the platform with concurrency limits and timezone awareness, avoiding separate cron infrastructure; schedule definitions are version-controlled as code
vs alternatives: Simpler than Airflow for basic scheduling (no DAG compilation), and more reliable than system cron because execution is tracked in the database with retry logic
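The concurrency-limit behavior can be sketched as a single scheduler tick. Cron parsing and timezone math are elided here (assume next_run is already computed as an aware datetime); field names are illustrative:

```python
from datetime import datetime, timezone

def due_schedules(schedules, running, now=None):
    """One scheduler tick: return scripts whose next_run has passed,
    skipping any script that already has a running job, which is the
    per-script concurrency limit preventing overlapping runs."""
    if now is None:
        now = datetime.now(timezone.utc)
    return [s["script"] for s in schedules
            if s["next_run"] <= now and s["script"] not in running]
```

Because enqueued jobs land in the same tracked queue as everything else, a missed or failed scheduled run shows up in execution history rather than vanishing the way a failed system-cron job does.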
+6 more capabilities