OpenCode-LLM-proxy – use any OpenCode model via OpenAI/Anthropic/Gemini API

Posted by kochc | 3 hours ago | 2 comments

kochc 3 hours ago

I use OpenCode, which already has GitHub Copilot, Ollama, Anthropic, Gemini, etc. configured. But every other tool — Open WebUI, LangChain, my own scripts — needed the same models re-entered separately. This plugin starts a local HTTP server on port 4010 that translates between the OpenAI, Anthropic, and Gemini API formats and whichever models OpenCode has configured. So you point any tool at http://127.0.0.1:4010/v1 and it just works.
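To make the "point any tool at it" part concrete, here is a minimal sketch of hitting the proxy with a plain OpenAI-format chat request. The model name "gpt-4o" and the running proxy on 127.0.0.1:4010 are assumptions for illustration; any OpenAI-compatible client would work the same way.

```python
import json
import urllib.request

# Assumed: the proxy from the plugin is listening here (per the post).
BASE_URL = "http://127.0.0.1:4010/v1"

# Standard OpenAI chat-completions payload; the model name is hypothetical
# and would map to whatever model OpenCode actually has configured.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from a plain script"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to actually send the request against a running proxy:
# with urllib.request.urlopen(req) as resp:
#     body = json.loads(resp.read())
#     print(body["choices"][0]["message"]["content"])
```

The same idea applies to SDKs: most OpenAI/Anthropic/Gemini client libraries accept a base-URL override, so you'd set it to the local proxy address instead of the vendor endpoint.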

kochc 3 hours ago

Setup: npm install opencode-llm-proxy, add "plugin": ["opencode-llm-proxy"] to opencode.json, start OpenCode, done. Supports streaming for all four formats. 112 tests, MIT license.
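For anyone wondering what the opencode.json change looks like, a minimal sketch of the plugin entry from the setup step above (any other keys in your existing config stay as they are; this fragment is illustrative, not a complete config):

```json
{
  "plugin": ["opencode-llm-proxy"]
}
```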