Psi supports custom LLM providers through models.edn files.
This lets you add providers such as MiniMax, Ollama, LM Studio, vLLM, llama.cpp, or any other service that exposes an OpenAI-compatible or Anthropic-compatible API.
You can define custom providers in either or both of these files:
- ~/.psi/agent/models.edn
- <worktree>/.psi/models.edn

If the same custom provider/model pair appears in both places, the project-local entry wins.
Built-in models remain available alongside custom ones.
Each provider entry defines:
- a provider id (the map key), such as "minimax" or "ollama"
- :base-url — the API root for that provider
- :api — which wire protocol psi should use
- :auth settings
- :models

Supported custom-provider API protocols are:

- :openai-completions
- :anthropic-messages
- :openai-codex-responses

In practice, most custom hosted providers fit the first two.
Illustrative example: confirm the provider's current base URL and model ids in
its own docs, then place a definition like this in ~/.psi/agent/models.edn or
.psi/models.edn:
```clojure
{:version 1
 :providers
 {"minimax"
  {:base-url "https://api.minimax.chat/v1"
   :api      :openai-completions
   :auth     {:api-key "env:MINIMAX_API_KEY"}
   :models   [{:id "MiniMax-M1"
               :name "MiniMax M1"
               :supports-reasoning true
               :supports-text true
               :context-window 128000
               :max-tokens 16384
               :latency-tier :medium
               :cost-tier :medium}]}}}
```
Then export your key:
```shell
export MINIMAX_API_KEY=...
```
Notes:
- In this example, the provider id is "minimax" and its :api is :openai-completions.

If a provider exposes an Anthropic Messages-compatible API, configure it the same way but set :api to :anthropic-messages:
```clojure
{:version 1
 :providers
 {"my-anthropic-proxy"
  {:base-url "https://example.com/anthropic"
   :api      :anthropic-messages
   :auth     {:api-key "env:MY_PROXY_API_KEY"}
   :models   [{:id "proxy-sonnet"
               :name "Proxy Sonnet"
               :supports-reasoning true
               :supports-text true
               :context-window 200000
               :max-tokens 8192}]}}}
```
For Anthropic-compatible providers, psi uses the Anthropic transport and will send the configured key through the compatible auth path.
The :auth map supports more than just an API key:
```clojure
{:auth {:api-key "env:LOCAL_LLM_KEY"
        :auth-header? false
        :headers {"X-Client" "psi"}}}
```
Use cases:
- :api-key — a literal key, or "env:VAR_NAME" to read it from an environment variable
- :auth-header? false — omit the normal auth header, for servers that reject it
- :headers — add custom request headers

A common use for :auth-header? false is an OpenAI-compatible local server that accepts requests without a bearer token and rejects unexpected auth headers.
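Putting these options together, a local-server entry might look like the following sketch. The provider id, port, and model id here are illustrative assumptions; check your local server's documentation for its actual OpenAI-compatible endpoint:

```clojure
{:version 1
 :providers
 {"local-llama"
  ;; hypothetical llama.cpp / LM Studio style server on localhost
  {:base-url "http://127.0.0.1:8080/v1"
   :api      :openai-completions
   ;; no bearer token: this server rejects unexpected auth headers
   :auth     {:auth-header? false
              :headers {"X-Client" "psi"}}
   :models   [{:id "local-model"
               :name "Local Model"
               :supports-text true
               :context-window 8192
               :max-tokens 2048}]}}}
```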
If psi is already running, reload the definitions after editing either models file:
/reload-models
That reloads:
- ~/.psi/agent/models.edn
- <worktree>/.psi/models.edn

After reloading, use the normal model-selection surface.
In-session:
/model minimax MiniMax-M1
or, for the Anthropic-compatible example:
/model my-anthropic-proxy proxy-sonnet
Once selected, the custom model behaves like any other model in psi.
You can define multiple providers in the same file, for example:
- minimax
- ollama
- staging-openai
- company-anthropic-proxy

This covers the requested workflow of configuring multiple providers in one config file and switching between them at runtime.
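For instance, two providers in one file might look like the following sketch. The URLs and model ids are illustrative assumptions, not verified values:

```clojure
{:version 1
 :providers
 {"minimax"
  {:base-url "https://api.minimax.chat/v1"
   :api      :openai-completions
   :auth     {:api-key "env:MINIMAX_API_KEY"}
   :models   [{:id "MiniMax-M1" :name "MiniMax M1"
               :supports-text true
               :context-window 128000 :max-tokens 16384}]}
  "ollama"
  ;; assumes Ollama's OpenAI-compatible endpoint on its default port
  {:base-url "http://localhost:11434/v1"
   :api      :openai-completions
   :auth     {:auth-header? false}
   :models   [{:id "llama3.1" :name "Llama 3.1 (local)"
               :supports-text true
               :context-window 8192 :max-tokens 2048}]}}}
```

With this file loaded, /model minimax MiniMax-M1 and /model ollama llama3.1 switch between the two at runtime.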
Precedence rules:

- Edits to either models file take effect after /reload-models.
- If a custom definition uses the same (provider, model-id) as a built-in model, the custom definition is skipped to avoid shadowing built-ins.
- If both the user-level and project-local files define the same (provider, model-id), the project-local definition wins.