twinny - AI Code Completion and Chat Extension: fill-in-the-middle (FIM) code completion with real-time inline suggestions
Locally hosted AI code completion plugin for VS Code
Unique: Twinny implements FIM completion by routing requests through OpenAI-compatible API endpoints, enabling seamless switching between a localhost Ollama instance and cloud providers (OpenAI, Anthropic, Groq, Deepseek, Cohere, Mistral, Perplexity, OpenRouter) without code changes. This provider-agnostic architecture uses a single completion-endpoint abstraction rather than provider-specific SDKs, reducing maintenance burden and making new providers quick to add.
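A minimal sketch of what such a single-endpoint abstraction can look like. This is illustrative, not twinny's actual code: the config shape, function name, and FIM prompt tokens (the CodeLlama-style `<PRE>`/`<SUF>`/`<MID>` infill format) are assumptions; the point is that only the base URL, model, and API key change between a local Ollama instance and a cloud provider.

```typescript
// Hypothetical provider config: everything that differs between providers.
interface ProviderConfig {
  baseUrl: string;   // e.g. "http://localhost:11434/v1" for Ollama
  apiKey?: string;   // omitted for localhost providers
  model: string;
}

// Build one OpenAI-compatible completion request, regardless of provider.
function buildCompletionRequest(
  cfg: ProviderConfig,
  prefix: string,
  suffix: string
) {
  return {
    url: `${cfg.baseUrl}/completions`,
    headers: {
      "Content-Type": "application/json",
      // Only cloud providers need an Authorization header.
      ...(cfg.apiKey ? { Authorization: `Bearer ${cfg.apiKey}` } : {}),
    },
    body: {
      model: cfg.model,
      // CodeLlama-style fill-in-the-middle prompt: the model generates
      // the code that belongs between prefix and suffix.
      prompt: `<PRE> ${prefix} <SUF>${suffix} <MID>`,
      stream: true,
    },
  };
}

// Same call site, two different providers.
const local = buildCompletionRequest(
  { baseUrl: "http://localhost:11434/v1", model: "codellama:7b-code" },
  "function add(a: number, b: number) {",
  "}"
);
const cloud = buildCompletionRequest(
  {
    baseUrl: "https://api.openai.com/v1",
    apiKey: "sk-example",
    model: "gpt-3.5-turbo-instruct",
  },
  "function add(a: number, b: number) {",
  "}"
);
```

Because the request shape is identical, swapping providers is a config change rather than a code change, which is what makes the architecture provider-agnostic.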
vs others: Offers more provider flexibility than GitHub Copilot (cloud-only) and better localhost support than Codeium, while maintaining lower latency than cloud-only solutions through optional local Ollama integration.