Feat/add litellm provider #1739

Open

RheagalFire wants to merge 4 commits into anomalyco:dev from RheagalFire:feat/add-litellm-provider
Conversation

@RheagalFire
Adds LiteLLM as a provider with 48 models across 8 providers.

LiteLLM is an AI gateway/proxy that routes to 100+ LLM providers through a single OpenAI-compatible endpoint. Users who run a LiteLLM proxy server can connect it to opencode and access any model behind it.

Provider config

  • npm: @ai-sdk/openai-compatible
  • env: LITELLM_API_KEY
  • api: http://localhost:4000/v1 (default proxy URL)

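As a rough sketch, the provider entry implied by the bullets above might look like the following (the JSON schema and field names here are assumptions for illustration; the actual registry format is in the diff, not this description):

```json
{
  "litellm": {
    "name": "LiteLLM",
    "npm": "@ai-sdk/openai-compatible",
    "env": ["LITELLM_API_KEY"],
    "api": "http://localhost:4000/v1"
  }
}
```

Because LiteLLM exposes an OpenAI-compatible endpoint, the generic `@ai-sdk/openai-compatible` package can talk to it without a dedicated SDK; users pointing at a non-default proxy would override the `api` URL.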
Models added (48)

  • Anthropic (9): Claude Opus 4.6/4.5/4.1, Sonnet 4.6/4.5, Haiku 4.5, 3.7 Sonnet, 3.5 Sonnet v2, 3 Haiku
  • OpenAI (20): GPT-5.5 through GPT-4 Turbo, o3/o4 series, Codex variants
  • Google (5): Gemini 2.5/2.0/1.5 Flash & Pro
  • DeepSeek (2): Chat, Reasoner
  • Mistral (5): Large, Small, NeMo, Codestral, Devstral
  • Cohere (2): Command A, Command R+
  • xAI (3): Grok 3, 3 Mini, 2
  • Groq (2): Llama 3.3 70B, 3.1 8B

All models use `extends` to inherit definitions from the existing providers, so pricing, context limits, and capabilities stay in sync with the canonical entries. `bun validate` passes.
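To illustrate the `extends` mechanism described above, a model entry might look like this (the exact model id and schema are assumptions, not copied from the diff):

```json
{
  "claude-sonnet-4-5": {
    "extends": "anthropic/claude-sonnet-4-5"
  }
}
```

The idea is that the LiteLLM entry carries only its local id and a pointer to the upstream definition it inherits from, rather than duplicating the full model metadata.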

