Configuring backends

Connect OwlCoda to one or more backends: a local runtime, a cloud provider, or both at once.

owlcoda init writes a starter config.json and auto-detects local runtimes on their standard ports. Provider-specific configurations follow below.

Local: Ollama

owlcoda init --router http://127.0.0.1:11434/v1
owlcoda

Local: LM Studio

owlcoda init --router http://127.0.0.1:1234/v1
owlcoda

Local: vLLM

owlcoda init --router http://127.0.0.1:8000/v1
owlcoda
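Before pointing owlcoda init at a local runtime, it can help to confirm that something is actually listening on the expected port. A small sketch (assuming curl is available and that the runtime exposes the OpenAI-compatible /v1/models route, which Ollama, LM Studio, and vLLM all do by default):

```shell
# Probe a local OpenAI-compatible endpoint; prints "up" or "down" per port.
probe() {
  if curl -sf --max-time 2 "http://127.0.0.1:$1/v1/models" > /dev/null; then
    echo "port $1: up"
  else
    echo "port $1: down"
  fi
}
probe 11434   # Ollama default
probe 1234    # LM Studio default
probe 8000    # vLLM default
```

Whichever port reports "up" is the one to pass to owlcoda init --router.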

Cloud: Kimi (Moonshot)

export KIMI_API_KEY=sk-...
owlcoda init --router https://api.moonshot.ai/v1

Then edit config.json to wire in the key:

{
  "routerUrl": "https://api.moonshot.ai/v1",
  "models": [
    {
      "id": "kimi-k2",
      "label": "Kimi K2",
      "backendModel": "moonshot-v1-128k",
      "endpoint": "https://api.moonshot.ai/v1",
      "apiKeyEnv": "KIMI_API_KEY",
      "aliases": ["default", "kimi"],
      "default": true
    }
  ]
}

Cloud: MiniMax (Messages-style API)

{
  "routerUrl": "https://api.minimax.io/anthropic",
  "models": [
    {
      "id": "minimax-m27",
      "label": "MiniMax M2.7",
      "backendModel": "minimax-m2.7-highspeed",
      "endpoint": "https://api.minimax.io/anthropic",
      "apiKeyEnv": "MINIMAX_API_KEY",
      "localRuntimeProtocol": "anthropic_messages",
      "aliases": ["default", "minimax"],
      "default": true
    }
  ]
}
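Setting localRuntimeProtocol to anthropic_messages switches the request shape OwlCoda sends from OpenAI chat completions to Anthropic-style Messages. As a rough sketch of that shape (field names follow the Anthropic Messages format — max_tokens is required and the system prompt is a top-level field rather than a system-role message; the exact payload OwlCoda emits may differ):

```json
{
  "model": "minimax-m2.7-highspeed",
  "max_tokens": 1024,
  "system": "You are a coding assistant.",
  "messages": [
    { "role": "user", "content": "Hello" }
  ]
}
```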

Cloud: OpenRouter (multi-model gateway)

{
  "routerUrl": "https://openrouter.ai/api/v1",
  "models": [
    {
      "id": "openrouter-default",
      "label": "OpenRouter selection",
      "backendModel": "qwen/qwen3-coder",
      "endpoint": "https://openrouter.ai/api/v1",
      "apiKeyEnv": "OPENROUTER_API_KEY",
      "aliases": ["default"],
      "default": true
    }
  ]
}

Cloud: Alibaba Bailian / DashScope

{
  "routerUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
  "models": [
    {
      "id": "qwen-plus",
      "label": "Qwen Plus",
      "backendModel": "qwen-plus",
      "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1",
      "apiKeyEnv": "BAILIAN_API_KEY",
      "aliases": ["default"],
      "default": true
    }
  ]
}

Mixing local + cloud (multiple models in one config)

{
  "routerUrl": "http://127.0.0.1:11434/v1",
  "models": [
    { "id": "qwen-local", "backendModel": "qwen2.5-coder:7b",
      "aliases": ["default", "fast"], "default": true },
    { "id": "kimi-cloud", "backendModel": "moonshot-v1-128k",
      "endpoint": "https://api.moonshot.ai/v1",
      "apiKeyEnv": "KIMI_API_KEY",
      "aliases": ["heavy", "kimi"] }
  ]
}

owlcoda --model heavy routes to Kimi; the default goes to the local Qwen.

Schema reference

See config.example.json for the full schema. Common per-model fields:

Field                  Purpose
id                     Stable model id used in the API
label                  Friendly name shown in the UI
backendModel           Model id as the backend itself knows it
endpoint               Per-model override of routerUrl
apiKey / apiKeyEnv     Cloud credential (literal value, or env var name)
localRuntimeProtocol   auto / openai_chat / anthropic_messages
aliases                Alternate names accepted by --model
tier                   fast / balanced / heavy (UI grouping)
default                Exactly one model per config is the default
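Putting the fields together, a single model entry exercising most of the options might look like this (all values are illustrative placeholders, not a real provider setup):

```json
{
  "id": "example-model",
  "label": "Example Model",
  "backendModel": "provider-model-v1",
  "endpoint": "https://api.example.com/v1",
  "apiKeyEnv": "EXAMPLE_API_KEY",
  "localRuntimeProtocol": "openai_chat",
  "aliases": ["default", "example"],
  "tier": "balanced",
  "default": true
}
```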