Install
Get OwlCoda running on macOS, Linux, or Windows (via WSL) in under 5 minutes.
OwlCoda is currently distributed as source only; npm, Homebrew, and standalone-binary packages are planned for 1.0.
Prerequisites
- Node.js ≥ 18 (Node 20+ recommended)
- A local OpenAI-compatible inference backend (Ollama / LM Studio / vLLM / any custom router) — required for local-only operation
- macOS, Linux, or Windows (WSL recommended on Windows)
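The Node requirement can be checked from the shell before you clone anything. A minimal sketch that parses a `node --version`-style string — the `version` value is hardcoded here for illustration; in practice set it with `version=$(node --version)`:

```shell
# Check that the Node major version meets the >= 18 requirement.
# Sample string stands in for the real `node --version` output.
version="v20.11.1"
major=${version#v}        # strip the leading "v"
major=${major%%.*}        # keep only the major component
if [ "$major" -ge 18 ]; then
  echo "Node $major OK"
else
  echo "Node $major is too old; install Node 18 or newer"
fi
```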
Steps
git clone https://github.com/yeemio/owlcoda.git
cd owlcoda
npm install
npm run build
npm link # exposes `owlcoda` globally
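After linking, you can confirm the entry point is actually resolvable. A small sketch using the POSIX `command` builtin:

```shell
# Report whether the `owlcoda` entry point is on PATH after npm link.
if command -v owlcoda >/dev/null 2>&1; then
  status="owlcoda found at $(command -v owlcoda)"
else
  status="owlcoda not found on PATH"
fi
echo "$status"
```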
If npm link fails
This usually means your global npm prefix is not writable. Three ways around it:
- Option A — sudo:
sudo npm link
- Option B — user-local prefix:
npm config set prefix ~/.local
export PATH=~/.local/bin:$PATH # add this to your shell profile
cd /path/to/owlcoda && npm link
- Option C — skip the link entirely:
node /path/to/owlcoda/dist/cli.js …
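Option C can be made more ergonomic with a one-line launcher script. A sketch that writes such a wrapper — the target directory uses `mktemp -d` purely for illustration (use something on your PATH, e.g. `~/.local/bin`, for real), and `/path/to/owlcoda` is a placeholder for your clone:

```shell
# Write a tiny launcher so `owlcoda` works without npm link.
bindir=$(mktemp -d)       # illustrative; use a directory on your PATH
cat > "$bindir/owlcoda" <<'EOF'
#!/bin/sh
exec node /path/to/owlcoda/dist/cli.js "$@"
EOF
chmod +x "$bindir/owlcoda"
echo "wrapper written to $bindir/owlcoda"
```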
Verify the install
owlcoda --version
owlcoda doctor
owlcoda doctor checks your Node version, the reachability of the local runtime, and the configured models. See Troubleshooting for what to do when a check fails.
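If the runtime-reachability check fails, you can probe the backend directly before digging further. A minimal sketch assuming Ollama's default port 11434 and its `/api/tags` model-list endpoint — adjust the URL for LM Studio, vLLM, or a custom router:

```shell
# Probe a local OpenAI-compatible backend; prints a status either way.
url="http://localhost:11434/api/tags"   # Ollama default (assumption)
if curl -fsS --max-time 2 "$url" >/dev/null 2>&1; then
  status="backend reachable at $url"
else
  status="backend not reachable at $url"
fi
echo "$status"
```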
Next
Configure your first backend in Configure providers.