A Clojure port of the popular LiteLLM library, providing a unified interface for multiple LLM providers with comprehensive observability and thread pool management.
LiteLLM Clojure provides a unified, idiomatic Clojure interface for interacting with multiple Large Language Model (LLM) providers. Whether you're using OpenAI, Anthropic, Google Gemini, or any other supported provider, you can use the same API with consistent patterns.
Key Benefits:

- One idiomatic Clojure API across all supported providers
- Consistent request/response patterns, whatever the backend
- Comprehensive observability and thread pool management built in

Supported providers:
| Provider | Status | Models | Function Calling | Streaming |
|---|---|---|---|---|
| OpenAI | ✅ Supported | GPT-3.5-Turbo, GPT-4, GPT-4o | ✅ | ✅ |
| Anthropic | ✅ Supported | Claude 3 (Opus, Sonnet, Haiku), Claude 2.x | ✅ | ✅ |
| OpenRouter | ✅ Supported | All OpenRouter models | ✅ | ✅ |
| Google Gemini | ✅ Supported | Gemini Pro, Gemini Pro Vision, Gemini Ultra | ❌ | ✅ |
| Mistral | ✅ Supported | Mistral Small/Medium/Large, Codestral, Magistral | ✅ | ✅ |
| Ollama | ✅ Supported | Local models | ❌ | ✅ |
Add to your deps.edn:

```clojure
{:deps {tech.unravel/litellm-clj {:mvn/version "0.2.0"}}}
```
Add to your project.clj:

```clojure
[tech.unravel/litellm-clj "0.2.0"]
```
For configuration-based workflows with named configs:

```clojure
(require '[litellm.router :as router])

;; Quick setup from environment variables
(router/quick-setup!)

;; Or register custom configurations
(router/register! :fast
  {:provider :openai
   :model "gpt-4o-mini"
   :config {:api-key (System/getenv "OPENAI_API_KEY")}})

;; Use registered configs
(def response (router/completion :fast
                {:messages [{:role :user :content "Hello, how are you?"}]}))

;; Access the response
(println (router/extract-content response))
```
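Because configs are registered under keywords, call sites can name a profile without knowing which provider backs it. A minimal sketch of defining a second profile and switching between them (the `:smart` name, its model choice, and the `answer` helper are illustrative assumptions):

```clojure
;; Hypothetical second profile for illustration
(router/register! :smart
  {:provider :anthropic
   :model "claude-3-opus-20240229"
   :config {:api-key (System/getenv "ANTHROPIC_API_KEY")}})

;; Call sites only name the profile, not the provider
(defn answer [profile question]
  (router/extract-content
    (router/completion profile {:messages [{:role :user :content question}]})))

(answer :fast "What is 2+2?")
(answer :smart "Explain transducers in Clojure.")
```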
For simple, direct provider calls:

```clojure
(require '[litellm.core :as core])

;; Direct provider calls without registration
(let [response (core/completion :openai "gpt-4o-mini"
                 {:messages [{:role :user :content "Hello"}]
                  :api-key (System/getenv "OPENAI_API_KEY")})]
  (println (core/extract-content response)))
```
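Since the interface is uniform, switching providers is just a matter of changing the provider keyword and model. A minimal sketch against Anthropic, assuming it accepts the same option-map shape and that the model ID matches one your account can access:

```clojure
;; Same call shape, different provider (sketch; model ID is an assumption)
(let [response (core/completion :anthropic "claude-3-haiku-20240307"
                 {:messages [{:role :user :content "Hello"}]
                  :api-key (System/getenv "ANTHROPIC_API_KEY")})]
  (println (core/extract-content response)))
```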
For streaming responses:

```clojure
(require '[litellm.core :as llm]
         '[litellm.streaming :as streaming]
         '[clojure.core.async :refer [go-loop <!]])

;; Stream responses - returns a core.async channel
(let [ch (llm/completion :openai "gpt-4"
           {:messages [{:role :user :content "Write a poem"}]
            :stream true}
           {:api-key (System/getenv "OPENAI_API_KEY")})]
  ;; Consume the stream with go-loop
  (go-loop []
    (when-let [chunk (<! ch)]
      (when-let [content (streaming/extract-content chunk)]
        (print content)
        (flush))
      (recur))))

;; Or use callback-based API
(let [ch (llm/completion :openai "gpt-4"
           {:messages [{:role :user :content "Write a poem"}]
            :stream true}
           {:api-key (System/getenv "OPENAI_API_KEY")})]
  (streaming/consume-stream-with-callbacks ch
    (fn [chunk] (print (streaming/extract-content chunk)))
    (fn [response] (println "\nStream complete!"))
    (fn [error] (println "Error:" error))))
```
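If you need the full text after the stream finishes, you can reduce the channel yourself with plain core.async. A minimal sketch (the `collect-stream` helper is an illustration, not part of the library; it assumes each chunk yields content or nil via `streaming/extract-content`):

```clojure
(require '[clojure.core.async :refer [go-loop <! <!!]])

;; Accumulate all streamed content into a single string (sketch)
(defn collect-stream
  "Blocks until the channel closes, returning the concatenated content."
  [ch]
  (<!! (go-loop [acc (StringBuilder.)]
         (if-let [chunk (<! ch)]
           (recur (if-let [content (streaming/extract-content chunk)]
                    (.append acc content)
                    acc))
           (str acc)))))
```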
For function calling (tools):

```clojure
(require '[litellm.core :as llm])

(def response (llm/completion :openai "gpt-4"
                {:messages [{:role :user :content "What's the weather in Boston?"}]
                 :tools [{:type "function"
                          :function {:name "get_weather"
                                     :description "Get the current weather"
                                     :parameters {:type "object"
                                                  :properties {:location {:type "string"
                                                                          :description "City name"}}
                                                  :required ["location"]}}}]}
                {:api-key (System/getenv "OPENAI_API_KEY")}))

;; Check for function call
(let [message (llm/extract-message response)]
  (when-let [tool-calls (:tool-calls message)]
    (doseq [tool-call tool-calls]
      (println "Tool to call:" (get-in tool-call [:function :name]))
      (println "Arguments:" (get-in tool-call [:function :arguments])))))
```
This project is licensed under the MIT License - see the LICENSE file for details.