The litellm.router namespace provides a configuration-based API for managing multiple LLM providers and models.

```clojure
(require '[litellm.router :as router])
```

The Router API separates configuration from usage: you register named provider configurations up front, then reference them by keyword when making requests.

Register a provider configuration with a keyword name.

```clojure
(router/register! config-name config-map)
```

Simple Configuration:

```clojure
(router/register! :fast
  {:provider :openai
   :model "gpt-4o-mini"
   :config {:api-key "sk-..."}})
```

With Router Function:

```clojure
(router/register! :adaptive
  {:router (fn [{:keys [priority]}]
             (if (= priority :high)
               {:provider :anthropic :model "claude-3-opus-20240229"}
               {:provider :openai :model "gpt-4o-mini"}))
   :configs {:openai {:api-key "sk-..."}
             :anthropic {:api-key "sk-ant-..."}}})
```

Remove a configuration.

```clojure
(router/unregister! :fast)
```

List all registered configuration names.

```clojure
(router/list-configs)
;; => [:fast :smart :adaptive]
```

Retrieve a configuration.

```clojure
(router/get-config :fast)
;; => {:provider :openai :model "gpt-4o-mini" :config {...}}
```

Clear all configurations (useful for testing).

```clojure
(router/clear-router!)
```

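For instance, a minimal clojure.test fixture sketch that resets the router before each test; the fixture itself is illustrative, not part of the library:

```clojure
(require '[clojure.test :refer [use-fixtures]])

;; Hypothetical fixture: start every test from an empty router.
(use-fixtures :each
  (fn [run-test]
    (router/clear-router!)
    (run-test)))
```
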
Main completion function using registered configurations.

```clojure
(router/completion config-name request-map)
```

Examples:

```clojure
;; Basic usage
(def response
  (router/completion :fast
    {:messages [{:role :user :content "Hello!"}]}))

;; With options
(def response
  (router/completion :smart
    {:messages [{:role :user :content "Explain quantum computing"}]
     :temperature 0.7
     :max-tokens 500}))

;; Streaming
(def ch
  (router/completion :fast
    {:messages [{:role :user :content "Tell a story"}]
     :stream true}))
```

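Assuming the streaming call returns a core.async channel that yields chunks and closes when the stream ends, draining it could look like this sketch:

```clojure
(require '[clojure.core.async :as async])

;; Block on each chunk until the channel closes (<!! returns nil).
(loop []
  (when-some [chunk (async/<!! ch)]
    (println chunk) ; chunk shape depends on the provider response
    (recur)))
```
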
Simplified chat interface.

```clojure
(router/chat config-name message & {:keys [system-prompt]})
```

Examples:

```clojure
;; Simple question
(router/chat :fast "What is 2+2?")

;; With system prompt
(router/chat :smart
  "Explain general relativity"
  :system-prompt "You are a physics professor")
```

Auto-configure providers from environment variables.

```clojure
(router/quick-setup!)
```

Sets up a configuration for each provider whose API key is available (see the sketch after this list):

- :openai if OPENAI_API_KEY is set
- :anthropic if ANTHROPIC_API_KEY is set
- :gemini if GEMINI_API_KEY is set
- :mistral if MISTRAL_API_KEY is set
- :openrouter if OPENROUTER_API_KEY is set
- :ollama always (defaults to localhost)

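For example, the registered names can then be inspected with list-configs; which names appear depends on your environment variables, so the output below is only illustrative:

```clojure
(router/quick-setup!)

;; Actual contents depend on your environment.
(router/list-configs)
;; => [:openai :ollama]
```
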
Quick setup for OpenAI.

```clojure
(router/setup-openai! & {:keys [config-name api-key model]})
```

Examples:

```clojure
;; Use defaults (config-name :openai, model "gpt-4o-mini")
(router/setup-openai!)

;; Custom configuration
(router/setup-openai!
  :config-name :gpt4
  :model "gpt-4"
  :api-key "sk-...")
```

Quick setup for Anthropic.

```clojure
(router/setup-anthropic! & {:keys [config-name api-key model]})
```

Examples:

```clojure
;; Defaults: config-name :anthropic, model "claude-3-sonnet-20240229"
(router/setup-anthropic!)

;; Custom
(router/setup-anthropic!
  :config-name :claude-opus
  :model "claude-3-opus-20240229")
```

Quick setup for Google Gemini.

```clojure
(router/setup-gemini! & {:keys [config-name api-key model]})
```

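A usage sketch following the same keyword-argument pattern as the other setup helpers; the config name and model string here are illustrative assumptions, not library defaults:

```clojure
;; Illustrative values; omit keys to fall back on the library defaults.
(router/setup-gemini!
  :config-name :gemini
  :model "gemini-1.5-flash"
  :api-key "...")
```
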
Quick setup for Mistral.

```clojure
(router/setup-mistral! & {:keys [config-name api-key model]})
```

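Likewise, an illustrative sketch; the model name is an assumption, not a documented default:

```clojure
;; Illustrative values.
(router/setup-mistral!
  :config-name :mistral
  :model "mistral-small-latest"
  :api-key "...")
```
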
Quick setup for Ollama (local models).

```clojure
(router/setup-ollama! & {:keys [config-name api-base model]})
```

Example:

```clojure
(router/setup-ollama!
  :config-name :local
  :model "llama3"
  :api-base "http://localhost:11434")
```

Quick setup for OpenRouter.

```clojure
(router/setup-openrouter! & {:keys [config-name api-key model]})
```

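An illustrative sketch; the config name, model id, and key shown are placeholders, not defaults:

```clojure
;; Illustrative values; OpenRouter model ids are namespaced like "vendor/model".
(router/setup-openrouter!
  :config-name :openrouter
  :model "anthropic/claude-3.5-sonnet"
  :api-key "sk-or-...")
```
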
Create a dynamic router configuration.

```clojure
(router/create-router router-fn provider-configs)
```

Example:

```clojure
(def adaptive-router
  (router/create-router
    (fn [request]
      (let [complexity (get-in request [:metadata :complexity])]
        (case complexity
          :high   {:provider :anthropic :model "claude-3-opus-20240229"}
          :medium {:provider :openai :model "gpt-4"}
          :low    {:provider :openai :model "gpt-4o-mini"})))
    {:openai {:api-key "sk-..."}
     :anthropic {:api-key "sk-ant-..."}}))

(router/register! :adaptive adaptive-router)

;; Use with metadata
(router/completion :adaptive
  {:messages [{:role :user :content "Complex task"}]
   :metadata {:complexity :high}})
```

These are re-exported from [[litellm.core]]:

```clojure
;; Extract content
(router/extract-content response)

;; Extract full message
(router/extract-message response)

;; Get usage stats
(router/extract-usage response)
```

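Putting the pieces together, a minimal sketch that sends a request through a registered :fast config (as set up earlier) and pulls out the reply text:

```clojure
;; Thread the completion response straight into the content extractor.
(-> (router/completion :fast
      {:messages [{:role :user :content "Hello!"}]})
    router/extract-content)
;; => the assistant's reply as a string
```
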
Re-exported from [[litellm.core]]:

```clojure
;; List providers
(router/list-providers)

;; Check availability
(router/provider-available? :openai)

;; Get info
(router/provider-info :openai)

;; Check capabilities
(router/supports-streaming? :anthropic)
(router/supports-function-calling? :openai)

;; Validate request
(router/validate-request :openai {...})
```

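As an illustration, validating a concrete request map before sending it; the request shown is an example using the same keys as the completion examples above:

```clojure
;; Check a request against the :openai provider before dispatching it.
(router/validate-request :openai
  {:messages [{:role :user :content "Hello!"}]
   :max-tokens 100})
```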