A coding workbench that wires Ollama, Kimi, MiniMax, OpenRouter, and more — all from one config. Your models. Your tools. Your data.

$ owlcoda init && owlcoda

Why OwlCoda

Local-first by default
Your sessions stay under ~/.owlcoda. No telemetry, no upload — there is no OwlCoda server.
~/.owlcoda/sessions/<id>.json
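Because sessions are plain JSON files on disk, they can be listed, inspected, or backed up with ordinary tools. A minimal sketch, assuming sessions live under ~/.owlcoda/sessions/ as <id>.json; the OWLCODA_HOME override is this example's convenience, not a documented OwlCoda variable:

```sh
# List session ids by stripping the .json suffix from each file in the
# sessions directory. OWLCODA_HOME is an illustrative override so the
# sketch can be pointed at a scratch directory.
OWLCODA_HOME="${OWLCODA_HOME:-$HOME/.owlcoda}"
for f in "$OWLCODA_HOME"/sessions/*.json; do
  [ -e "$f" ] || continue    # glob matched nothing: no sessions yet
  basename "$f" .json
done
```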
Ollama for everyday work, Kimi or MiniMax for the heavy lift, switch with --model.
owlcoda --model heavy → Kimi K2

42+ tools, 69+ slash commands, sessions, learned skills. Drag-select and copy work like any terminal app.
/model · /cost · /budget · /skills

Source install for now; npm / Homebrew / standalone binary planned for 1.0. Requires Node ≥ 18 and any OpenAI-compatible local backend.
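Since npm will happily build on an unsupported runtime, it is worth checking the Node requirement before installing. A small preflight sketch (this is ordinary shell, not an OwlCoda command):

```sh
# Fail fast if the installed Node is older than the required major version 18.
major=$(node -p 'process.versions.node.split(".")[0]')
if [ "$major" -lt 18 ]; then
  echo "Node $(node -v) is too old; OwlCoda needs >= 18" >&2
  exit 1
fi
echo "Node $(node -v) is new enough"
```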
Clone and build OwlCoda
git clone https://github.com/yeemio/owlcoda.git
cd owlcoda && npm install && npm run build

Expose owlcoda globally
npm link   # or: node /path/to/owlcoda/dist/cli.js

Point at a backend (Ollama shown)
owlcoda init --router http://127.0.0.1:11434/v1

Start the REPL
owlcoda

OwlCoda does not ship its own model — you point it at one. Out of the box it speaks to:
http://127.0.0.1:11434/v1
http://127.0.0.1:1234/v1
http://127.0.0.1:8000/v1
any user-supplied OpenAI-compatible endpoint
https://api.moonshot.ai/v1
https://api.kimi.com/coding
https://api.minimax.io/anthropic
https://openrouter.ai/api/v1
https://dashscope.aliyuncs.com/compatible-mode/v1
https://api.openai.com/v1

OwlCoda runs as a native terminal REPL — not a web wrapper.
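Whichever endpoint you choose, it helps to confirm it answers the standard OpenAI-style model listing (GET <base>/models) before pointing OwlCoda at it. A hedged probe sketch; the BACKEND variable and the timeout are this example's choices, not OwlCoda options:

```sh
# Probe an OpenAI-compatible backend: such servers answer GET <base>/models.
# Ollama's default address is shown; set BACKEND to test another endpoint.
BACKEND="${BACKEND:-http://127.0.0.1:11434/v1}"
if curl -fsS --max-time 5 "$BACKEND/models" >/dev/null; then
  echo "backend reachable: $BACKEND"
else
  echo "no backend answering at $BACKEND" >&2
fi
```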