via “observability-and-logging-with-callback-system”
Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
Unique: Implements a callback-based observability system in which developers register custom callbacks for lifecycle events (pre-request, post-request, on-error); ships a built-in Langfuse integration and supports custom backends via webhook callbacks, enabling flexible logging without tight coupling
vs others: More flexible than provider-native logging; custom callbacks and multiple observability backends can run simultaneously, giving vendor-agnostic observability instead of lock-in to a single provider's dashboard
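The lifecycle-callback pattern described above can be sketched generically. This is an illustrative, self-contained sketch of the pattern, not LiteLLM's actual API: the `CallbackManager` class, event names, and `call_llm` helper are all hypothetical, and the provider call is stubbed out.

```python
# Generic sketch of a callback-based observability system (illustrative only;
# CallbackManager, the event names, and call_llm are hypothetical, not LiteLLM's API).
import time
from typing import Any, Callable, Dict, List

class CallbackManager:
    """Registers callbacks for lifecycle events and fires them around a request."""

    def __init__(self) -> None:
        self._hooks: Dict[str, List[Callable[..., None]]] = {
            "pre_request": [],
            "post_request": [],
            "on_error": [],
        }

    def register(self, event: str, fn: Callable[..., None]) -> None:
        self._hooks[event].append(fn)

    def fire(self, event: str, **payload: Any) -> None:
        # Each registered backend gets the same payload; the caller stays
        # decoupled from how (or where) each backend logs it.
        for fn in self._hooks[event]:
            fn(**payload)

def call_llm(manager: CallbackManager, model: str, prompt: str) -> str:
    """Wraps a (stubbed) provider call with pre/post/error lifecycle hooks."""
    manager.fire("pre_request", model=model, prompt=prompt)
    start = time.time()
    try:
        response = f"echo: {prompt}"  # stand-in for the real provider call
    except Exception as exc:
        manager.fire("on_error", model=model, error=exc)
        raise
    manager.fire("post_request", model=model, response=response,
                 latency=time.time() - start)
    return response

# Usage: two independent "backends" subscribe without coupling to the caller.
events: List[str] = []
mgr = CallbackManager()
mgr.register("pre_request", lambda **kw: events.append(f"pre:{kw['model']}"))
mgr.register("post_request", lambda **kw: events.append(f"post:{kw['model']}"))
print(call_llm(mgr, "gpt-4o", "hi"))  # → echo: hi
print(events)                         # → ['pre:gpt-4o', 'post:gpt-4o']
```

Because backends attach via `register` rather than being hard-coded into the request path, several observability sinks (e.g. a hosted tracer plus a custom webhook) can observe the same request simultaneously, which is the decoupling the entry highlights.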