LiteLLM Clojure provides two complementary APIs for interacting with LLM providers:
## Router API (`litellm.router`)

**Best for:** applications with multiple models, configuration-based workflows, production deployments.
The Router API uses named configurations to manage providers and models:
```clojure
(require '[litellm.router :as router])

;; Register configurations
(router/register! :fast {:provider :openai :model "gpt-4o-mini" :config {...}})
(router/register! :smart {:provider :anthropic :model "claude-3-opus-20240229" :config {...}})

;; Use by name
(router/completion :fast {:messages [...]})
(router/completion :smart {:messages [...]})
```
Benefits:

- Call sites reference a name (`:fast`, `:smart`) rather than a provider, so models can be swapped through configuration
- Credentials and model settings are managed in one place
- Suits production deployments that use several models or providers
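Because call sites only reference a name, a configuration can be swapped without touching them. A minimal sketch, assuming `register!` replaces an existing registration (the model names here are illustrative):

```clojure
(require '[litellm.router :as router])

;; Initially :fast points at OpenAI
(router/register! :fast {:provider :openai
                         :model "gpt-4o-mini"
                         :config {:api-key (System/getenv "OPENAI_API_KEY")}})

;; Re-registering :fast later moves every call site to Anthropic --
;; code that calls (router/completion :fast ...) is unchanged
(router/register! :fast {:provider :anthropic
                         :model "claude-3-haiku-20240307"
                         :config {:api-key (System/getenv "ANTHROPIC_API_KEY")}})
```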
## Core API (`litellm.core`)

**Best for:** simple scripts, direct provider access, learning, prototyping.
The Core API provides direct access to providers:
```clojure
(require '[litellm.core :as core])

;; Direct provider calls
(core/completion :openai "gpt-4o-mini" {...} {:api-key "sk-..."})
(core/completion :anthropic "claude-3-opus-20240229" {...} {:api-key "sk-ant-..."})
```
Benefits:

- No registration step: provider, model, and credentials are passed directly
- Everything a call needs is visible at the call site
- Minimal setup for scripts, prototypes, and learning
## Choosing an API

| Scenario | Recommended API |
|---|---|
| Production application | Router API | 
| Multiple models/providers | Router API | 
| Configuration-driven workflow | Router API | 
| Quick prototype/script | Core API | 
| Learning LiteLLM | Core API | 
| Testing different providers | Either (Router is easier) | 
A typical production setup registers configurations once at startup, then calls them by name:

```clojure
(ns my-app.llm
  (:require [litellm.router :as router]))

;; On application startup
(defn init-llm! []
  (router/register! :default
    {:provider :openai
     :model "gpt-4o-mini"
     :config {:api-key (System/getenv "OPENAI_API_KEY")}})

  (router/register! :advanced
    {:provider :anthropic
     :model "claude-3-opus-20240229"
     :config {:api-key (System/getenv "ANTHROPIC_API_KEY")}}))

;; In your application
(defn simple-query [text]
  (-> (router/completion :default {:messages [{:role :user :content text}]})
      router/extract-content))

(defn complex-query [text]
  (-> (router/completion :advanced {:messages [{:role :user :content text}]})
      router/extract-content))
```
For a one-off script, the Core API needs no setup beyond the call itself:

```clojure
(ns my-script
  (:require [litellm.core :as llm]))

(defn analyze-text [text]
  (let [response (llm/completion :openai "gpt-4"
                   {:messages [{:role :user :content text}]}
                   {:api-key (System/getenv "OPENAI_API_KEY")})]
    (llm/extract-content response)))

(println (analyze-text "Summarize quantum computing"))
```
For detailed API documentation, see the namespace docs for `litellm.router` and `litellm.core`.
## Provider Discovery

Both APIs provide provider discovery functions:

```clojure
;; List all available providers
(router/list-providers)
;; => [:openai :anthropic :gemini :mistral :ollama :openrouter]

;; Check if a provider is available
(router/provider-available? :openai)
;; => true

;; Get provider capabilities
(router/provider-info :openai)
;; => {:streaming true :function-calling true ...}

;; Check specific capabilities
(router/supports-streaming? :anthropic)
;; => true

(router/supports-function-calling? :gemini)
;; => false
```
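These checks can gate features at runtime. A hedged sketch built on the discovery functions above (the candidate lists and fallback choice are illustrative, not recommendations):

```clojure
(require '[litellm.router :as router])

;; Pick the first candidate provider that supports streaming,
;; falling back to a default when none does.
(defn pick-streaming-provider [candidates fallback]
  (or (first (filter router/supports-streaming? candidates))
      fallback))

;; Returns :anthropic whenever it is the first streaming-capable
;; candidate, otherwise the fallback.
(pick-streaming-provider [:anthropic :gemini] :openai)
```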
Both APIs use the same error handling system. See [[Error Handling|/doc/error_handling.md]] for details.
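As an illustrative sketch only: the exception type and `ex-data` contents below are assumptions, not the library's documented error API. Consult the Error Handling guide for the real error shapes before relying on this pattern.

```clojure
(require '[litellm.router :as router])

;; Hypothetical: wrap a completion call, assuming failures surface
;; as ex-info exceptions (verify against the Error Handling guide).
(defn safe-query [text]
  (try
    (-> (router/completion :default {:messages [{:role :user :content text}]})
        router/extract-content)
    (catch clojure.lang.ExceptionInfo e
      (println "LLM call failed:" (ex-message e) (ex-data e))
      nil)))
```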