Core API for LiteLLM - Direct provider calls with model names as-is
`(anthropic-completion model request-map & {:as config})`

Direct Anthropic completion.
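A minimal usage sketch, assuming the keyword-arg config accepts the same `:api-key` option as [[completion]]; the model name is illustrative:

```clojure
;; Hypothetical call: :api-key is assumed to mirror completion's config options.
(anthropic-completion "claude-3-sonnet-20240229"
                      {:messages [{:role :user :content "Hello"}]}
                      :api-key "sk-ant-...")
```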
`(calculate-cost provider-name model prompt-tokens completion-tokens)`

Calculate cost for a request/response.
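A sketch of pairing this with [[extract-usage]]; the assumption that the result is a single numeric cost estimate is mine, not documented here:

```clojure
(def response
  (completion :openai "gpt-4"
              {:messages [{:role :user :content "Hello"}]}
              {:api-key "sk-..."}))

(let [{:keys [prompt-tokens completion-tokens]} (extract-usage response)]
  ;; Assumed to return a numeric cost estimate for this request/response pair.
  (calculate-cost :openai "gpt-4" prompt-tokens completion-tokens))
```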
`(chat provider-name model message & {:keys [system-prompt] :as config})`

Simple chat completion function for single user messages.
Convenience wrapper around [[completion]] for simple question-answer interactions.
**Parameters:**
- `provider-name` - Provider keyword (`:openai`, `:anthropic`, etc.)
- `model` - Model name string
- `message` - User message string
- `config` - Optional keyword args including `:system-prompt`, `:api-key`, etc.
**Examples:**
```clojure
;; Simple question
(chat :openai "gpt-4" "What is 2+2?"
      :api-key "sk-...")

;; With system prompt
(chat :openai "gpt-4" "Explain quantum physics"
      :api-key "sk-..."
      :system-prompt "You are a physics professor")
```
**See also:** [[completion]], [[extract-content]]

`(completion provider-name model request-map)`
`(completion provider-name model request-map config)`

Direct completion function - accepts `provider` keyword and `model` name as-is.
Supports both streaming and non-streaming requests.
For streaming requests, returns a `core.async` channel.
**Parameters:**
- `provider` - Provider keyword (`:openai`, `:anthropic`, `:gemini`, `:mistral`, `:ollama`, `:openrouter`)
- `model` - Model name string (e.g., `"gpt-4"`, `"claude-3-opus-20240229"`)
- `request-map` - Request with `:messages`, `:temperature`, `:max-tokens`, etc.
- `config` - Optional config with `:api-key`, `:api-base`, `:timeout`
**Returns:**
- Non-streaming: Response map with `:choices`, `:usage`, etc.
- Streaming: `core.async` channel with response chunks
**Examples:**
```clojure
;; Non-streaming completion
(completion :openai "gpt-4"
            {:messages [{:role :user :content "Hello"}]}
            {:api-key "sk-..."})

;; Streaming completion (returns channel)
(completion :openai "gpt-4"
            {:messages [{:role :user :content "Hello"}]
             :stream true}
            {:api-key "sk-..."})

;; Anthropic Claude
(completion :anthropic "claude-3-sonnet-20240229"
            {:messages [{:role :user :content "Hello"}]}
            {:api-key "sk-ant-..."})
```
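A sketch of draining the streaming channel with `clojure.core.async`; the shape of each chunk is not documented here, so this only collects the raw chunks until the channel closes:

```clojure
(require '[clojure.core.async :as async])

(def stream-ch
  (completion :openai "gpt-4"
              {:messages [{:role :user :content "Hello"}]
               :stream true}
              {:api-key "sk-..."}))

;; Block until the channel closes, accumulating every chunk as-is.
(loop [chunks []]
  (if-let [chunk (async/<!! stream-ch)]
    (recur (conj chunks chunk))
    chunks))
```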
**See also:** [[chat]], [[extract-content]], [[extract-message]]

`(embedding provider-name model request-map)`
`(embedding provider-name model request-map config)`

Generate embeddings for text input.
**Parameters:**
- `provider` - Provider keyword (`:openai`, `:mistral`, `:gemini`)
- `model` - Model name string (e.g., `"text-embedding-3-small"`, `"mistral-embed"`)
- `request-map` - Request with `:input` (string or vector of strings)
- `config` - Optional config with `:api-key`, `:api-base`, `:timeout`
**Returns:**
- Response map with `:data` (vector of embeddings), `:usage`, etc.
**Examples:**
```clojure
;; Single text embedding
(embedding :openai "text-embedding-3-small"
           {:input "Hello world"}
           {:api-key "sk-..."})

;; Multiple texts
(embedding :openai "text-embedding-3-small"
           {:input ["Hello" "World"]}
           {:api-key "sk-..."})

;; Mistral embeddings
(embedding :mistral "mistral-embed"
           {:input "Hello world"}
           {:api-key "..."})

;; Gemini embeddings
(embedding :gemini "text-embedding-004"
           {:input "Hello world"}
           {:api-key "..."})
```
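A sketch of pulling the vectors out of `:data`, assuming each entry follows an OpenAI-style shape with an `:embedding` key; that key name is an assumption, not confirmed by this namespace:

```clojure
(def resp
  (embedding :openai "text-embedding-3-small"
             {:input ["Hello" "World"]}
             {:api-key "sk-..."}))

;; :embedding as the per-entry key for the vector is assumed.
(mapv :embedding (:data resp))
```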
**See also:** [[openai-embedding]], [[mistral-embedding]], [[gemini-embedding]]

`(estimate-request-tokens request)`

Estimate token count for a request.
`(estimate-tokens text)`

Estimate token count for text.
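A sketch of both estimators; the assumption that each returns a plain integer count is mine:

```clojure
;; Rough count for a single string.
(estimate-tokens "How many tokens is this sentence?")

;; Rough count for a whole request map, including all messages.
(estimate-request-tokens
 {:messages [{:role :system :content "You are terse."}
             {:role :user :content "Summarize core.async in one line."}]})
```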
`(extract-content response)`

Extract text content from a completion response.
Retrieves the generated text from the first choice in the response.
**Example:**
```clojure
(def response (completion :openai "gpt-4" {...}))
(extract-content response)
;; => "The generated text content..."
```
**See also:** [[extract-message]], [[extract-usage]]

`(extract-message response)`

Extract the full message object from a completion response.
Returns the complete message including `:content`, `:role`, and `:tool-calls` (if any).
**Example:**
```clojure
(def response (completion :openai "gpt-4" {...}))
(extract-message response)
;; => {:role :assistant :content "..." :tool-calls [...]}
```
**See also:** [[extract-content]], [[extract-usage]]

`(extract-usage response)`

Extract token usage information from a completion response.
Returns a map with `:prompt-tokens`, `:completion-tokens`, and `:total-tokens`.
**Example:**
```clojure
(def response (completion :openai "gpt-4" {...}))
(extract-usage response)
;; => {:prompt-tokens 10 :completion-tokens 20 :total-tokens 30}
```
**See also:** [[extract-content]], [[calculate-cost]]

`(gemini-completion model request-map & {:as config})`

Direct Gemini completion.
`(gemini-embedding model request-map & {:as config})`

Direct Gemini embedding.

`(list-providers)`

List all available providers (registered via multimethods).
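A sketch; the assumption that it returns a collection of provider keywords is mine:

```clojure
(list-providers)
;; => assumed to look like (:openai :anthropic :gemini :mistral :ollama :openrouter)
```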
`(mistral-completion model request-map & {:as config})`

Direct Mistral completion.

`(mistral-embedding model request-map & {:as config})`

Direct Mistral embedding.

`(ollama-completion model request-map & {:as config})`

Direct Ollama completion.
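A sketch of calling a locally hosted model; the `:api-base` keyword arg, the default Ollama URL, and the model name are all assumptions:

```clojure
;; Hypothetical local setup; :api-base is assumed to mirror completion's config options.
(ollama-completion "llama3"
                   {:messages [{:role :user :content "Hello"}]}
                   :api-base "http://localhost:11434")
```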
`(openai-completion model request-map & {:as config})`

Direct OpenAI completion.

`(openai-embedding model request-map & {:as config})`

Direct OpenAI embedding.

`(openrouter-completion model request-map & {:as config})`

Direct OpenRouter completion.

`(provider-available? provider-name)`

Check if a provider is available.

`(provider-info provider-name)`

Get information about a provider.
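A sketch of guarding a call on availability; the shape of the map returned by `provider-info` is not documented here, so it is only printed for inspection:

```clojure
(when (provider-available? :anthropic)
  ;; The info map's keys are unspecified here; inspect at the REPL.
  (println (provider-info :anthropic))
  (chat :anthropic "claude-3-sonnet-20240229" "Hello"
        :api-key "sk-ant-..."))
```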
`(supports-function-calling? provider-name)`

Check if provider supports function calling.

`(supports-streaming? provider-name)`

Check if provider supports streaming.
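A sketch of using the capability predicates before building a request; the assumption that both return booleans is mine:

```clojure
(defn run-prompt
  "Request streaming only when the provider is known to support it."
  [provider model prompt api-key]
  (completion provider model
              {:messages [{:role :user :content prompt}]
               :stream (boolean (supports-streaming? provider))}
              {:api-key api-key}))

;; Check function-calling support the same way before building a tool-calling request.
(supports-function-calling? :openai)
```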
`(validate-request provider-name request)`

Validate a request against provider capabilities.
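A sketch of validating before sending; whether it returns a truthy value or throws on an invalid request is not documented here, so the conditional is an assumption:

```clojure
(let [req {:messages [{:role :user :content "Hello"}]
           :stream true}]
  ;; Assumed to be truthy when the request fits the provider's capabilities.
  (when (validate-request :openai req)
    (completion :openai "gpt-4" req {:api-key "sk-..."})))
```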
`(with-error-handling f)`

Execute function with comprehensive error handling.
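A sketch; since `with-error-handling` takes a function, the call is wrapped in a thunk. What it returns or rethrows on failure is not documented here:

```clojure
(with-error-handling
  (fn []
    (completion :openai "gpt-4"
                {:messages [{:role :user :content "Hello"}]}
                {:api-key "sk-..."})))
```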