
feat(providers): add Atomic Chat OpenAI-compatible provider #12365

Open
yanalialiuk wants to merge 1 commit into RooCodeInc:main from yanalialiuk:feat/atomic-chat-local-provider

Conversation


@yanalialiuk yanalialiuk commented May 14, 2026

Summary

Adds Atomic Chat as a local OpenAI-compatible provider for Roo Code. Atomic Chat exposes a local OpenAI-style HTTP API (default: http://127.0.0.1:1337) with support for GET /v1/models and POST /v1/chat/completions, allowing Roo Code to use locally hosted models through the same workflow as other OpenAI-compatible providers.

Project: https://github.com/AtomicBot-ai/Atomic-Chat

What changed

  • Added new provider id: atomic-chat
  • Added provider settings:
    • atomicChatModelId
    • atomicChatBaseUrl
    • atomicChatApiKey
  • Updated:
    • localProviders
    • provider catalog
    • model-id maps
    • secret key handling
    • extension message types for model refresh
  • Added AtomicChatHandler:
    • uses the OpenAI SDK with a configurable baseURL
    • routes requests through /v1 automatically
    • reuses the shared Anthropic → OpenAI message conversion
    • uses the same streaming flow as existing OpenAI-compatible local providers
  • Added model discovery support:
    • getAtomicChatModels
    • modelCache integration branch
  • Added webview request/response flow:
    • requestAtomicChatModels
    • atomicChatModels
    • uses the same refresh/discovery pattern as LM Studio and related local providers
  • Added webview support:
    • provider settings UI
    • validation
    • model picker support
    • localization strings (EN)
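The "configurable baseURL" and "automatic /v1 routing" items above can be sketched as a small helper. This is a minimal illustration, not the PR's actual code: the function name `resolveAtomicChatBaseUrl` and the exact suffix handling are assumptions.

```typescript
// Hypothetical sketch of the base-URL handling described above: fall back
// to the default local endpoint, strip trailing slashes, and append /v1
// only when the configured URL does not already end with it.
function resolveAtomicChatBaseUrl(configured?: string): string {
  const root = (configured ?? "http://127.0.0.1:1337").replace(/\/+$/, "")
  return root.endsWith("/v1") ? root : `${root}/v1`
}
```

Under these assumptions, `resolveAtomicChatBaseUrl()` yields `http://127.0.0.1:1337/v1`, and a user-supplied URL that already includes `/v1` is passed through unchanged.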

How to test

  1. Start Atomic Chat (or another compatible OpenAI-style local server) on a base URL such as http://127.0.0.1:1337.
  2. Ensure the server exposes:
    • GET /v1/models
    • POST /v1/chat/completions
    with the expected model response format:
    {
      "data": [
        {
          "id": "model-name"
        }
      ]
    }
  3. Open Roo Code settings.
  4. Select provider: Atomic Chat.
  5. Configure:
    • base URL (optional if using the default)
    • API key (optional)
    • model
  6. Verify:
    • models populate correctly
    • chat completions succeed
    • streaming responses work normally
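The model-discovery response shape in step 2 takes only a few lines to consume. The sketch below is illustrative (the name `parseAtomicChatModelIds` is not from the PR); it mirrors the `{ "data": [{ "id": ... }] }` format shown above and tolerates malformed entries:

```typescript
// Illustrative parser for the GET /v1/models response shape shown in the
// test steps; non-string or missing ids are silently dropped.
interface ModelsResponse {
  data?: Array<{ id?: unknown }>
}

function parseAtomicChatModelIds(body: ModelsResponse): string[] {
  return (body.data ?? [])
    .map((entry) => entry.id)
    .filter((id): id is string => typeof id === "string" && id.length > 0)
}
```

For example, `parseAtomicChatModelIds({ data: [{ id: "model-name" }] })` returns `["model-name"]`, and an empty or missing `data` array yields `[]`.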

Notes

  • Default base URL matches a common local setup: http://127.0.0.1:1337
  • Users can override the endpoint as needed
  • Model metadata currently follows the same defaults used by other OpenAI-compatible providers until richer metadata becomes available from the upstream API
  • The implementation intentionally follows existing OpenAI-compatible provider patterns to minimize maintenance overhead and behavioral differences
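Since the API key is optional (many local servers accept anonymous requests), the Bearer-auth behavior can be sketched as follows. The helper name `buildAtomicChatHeaders` is hypothetical, not the PR's actual code:

```typescript
// Hypothetical helper: attach an Authorization header only when the user
// has configured an API key; otherwise send unauthenticated requests.
function buildAtomicChatHeaders(apiKey?: string): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" }
  if (apiKey) {
    headers["Authorization"] = `Bearer ${apiKey}`
  }
  return headers
}
```

This matches the common OpenAI-compatible convention: `Authorization: Bearer <key>` when a key is present, and no auth header otherwise.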


- Add atomic-chat provider with configurable base URL (default 127.0.0.1:1337)
- Fetch models from GET /v1/models; optional Bearer API key
- Wire extension host, types, settings UI, validation, and model picker
- Fix validate.spec RouterModels mock for new provider key

Co-authored-by: Cursor <cursoragent@cursor.com>
@dosubot (bot) added the size:XL (this PR changes 500-999 lines, ignoring generated files) and Enhancement (new feature or request) labels on May 14, 2026
From the diff, the AtomicChatHandler constructor defaults the base URL and strips trailing slashes:

    super()
    this.options = options

    const baseRoot = (this.options.atomicChatBaseUrl || "http://127.0.0.1:1337").replace(/\/+$/, "")
