Please verify the following in LLMs_Manager:
- A provider is selected
- The API key is filled in
- Save was clicked
- Enable in Nodes is ON

Then refresh the ComfyUI web UI (Cmd/Ctrl + R).
- Save provider config first
- Refresh browser after config changes
- Confirm the provider is enabled in the manager
This toolkit uses a lightweight OpenAI-compatible API path and does not require transformers for normal use.
If your environment still reports transformers errors:
- Update ComfyUI to the latest stable release
- Reinstall the node's dependencies:
  `pip install -r requirements.txt -U`
- Check for conflicting custom nodes that import an incompatible transformers/torch stack
- Restart ComfyUI and capture the full traceback for issue reports
When reporting, include:
- ComfyUI version
- Python version
- OS
- Full error traceback
- Minimal workflow to reproduce
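Most of the items above can be gathered automatically. A minimal sketch using only the standard library (the helper name and dict keys are ours, not part of the toolkit):

```python
import platform
import sys

def collect_report_info():
    """Gather basic environment details for a bug report."""
    info = {
        "python": sys.version.split()[0],
        "os": f"{platform.system()} {platform.release()}",
    }
    # torch is optional here; include its version only when available
    try:
        import torch
        info["torch"] = torch.__version__
    except ImportError:
        info["torch"] = "not installed"
    return info

print(collect_report_info())
```

Paste the printed dict into the issue along with the ComfyUI version and the full traceback.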
- Ensure the endpoint is OpenAI-compatible (e.g. http://localhost:11434/v1)
- Verify the model name matches the server-side model id
- Test with a simple chat completion request first
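A minimal sketch of such a smoke test, using only the standard library; the base URL and model name below are placeholders, so substitute your own server's values:

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # assumption: an Ollama-style OpenAI-compatible server
MODEL = "llama3"                        # assumption: replace with your server-side model id

def build_chat_request(base_url, model, prompt):
    """Build a minimal OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request(BASE_URL, MODEL, "Say hello")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
# To actually send it (requires a running server):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If this request fails outside ComfyUI, the problem is with the server or model id, not the node.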
- Use `Image Preprocessor` (`prep_img`) before connecting to the adapter node
- Ensure the image tensor/PIL input is valid
Open an issue and include your environment details and the full error traceback.