instructor-clj.core


create-chat-completion (clj)

(create-chat-completion client-params)

Creates a chat completion using litellm-clj (supports multiple LLM providers).

Argument is a map with keys :messages, :model, :response-model, :provider, and optionally :api-key.
:messages should be a vector of maps, each map representing a message with keys :role and :content.
:model specifies the model to use (e.g., "gpt-3.5-turbo", "claude-3-opus-20240229", "gemini-pro").
:response-model is a Malli schema specifying the expected response structure.
:provider (required) - must be specified explicitly (e.g., :openai, :anthropic, :gemini, :mistral, :ollama).
:api-key (optional) - if not provided, the OPENAI_API_KEY environment variable is used.

Note: API keys can be set via environment variables:

  • OPENAI_API_KEY for OpenAI models
  • ANTHROPIC_API_KEY for Anthropic models
  • GEMINI_API_KEY for Google Gemini models
  • OPENROUTER_API_KEY for OpenRouter models

Example:

(require '[instructor-clj.core :as ic])

(def User
  [:map
   [:name :string]
   [:age :int]])

(ic/create-chat-completion
 {:messages [{:role "user", :content "Jason Liu is 30 years old"}]
  :model "gpt-3.5-turbo"
  :provider :openai
  :response-model User})

Returns a map with extracted information in a structured format.
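
For the example above, the return value is expected to conform to the User schema; a plausible result (illustrative, not captured from a real call):

;; => {:name "Jason Liu", :age 30}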


default-client-params (clj)


instruct (clj)

(instruct prompt
          response-schema
          &
          {:keys [api-key _max-tokens _model _temperature max-retries]
           :as client-params
           :or {max-retries 0}})

Attempts to obtain a valid response from the LLM based on the given prompt and schema, retrying up to max-retries times if necessary.

Note: API keys can be provided via :api-key parameter or OPENAI_API_KEY environment variable.
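
A minimal usage sketch. Only :api-key and :max-retries are named in the signature; the :model and :provider options shown here are assumed to pass through to the underlying completion call:

(require '[instructor-clj.core :as ic])

(def User
  [:map
   [:name :string]
   [:age :int]])

;; Prompt and Malli schema are positional; options are keyword arguments.
(ic/instruct "Jason Liu is 30 years old"
             User
             :model "gpt-3.5-turbo"   ; assumed pass-through option
             :provider :openai        ; assumed pass-through option
             :max-retries 2)
;; => {:name "Jason Liu", :age 30}    ; expected shape on success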


llm->response (clj)

(llm->response {:keys [prompt response-schema max-tokens model temperature
                       api-key provider]})

Performs the LLM call and tries to destructure and extract the actual response. Returns nil in cases where the LLM is not able to generate the expected response.

Supports multiple LLM providers through litellm-clj 0.3.0-alpha. Provider must be specified explicitly via :provider key (e.g., :openai, :anthropic, :gemini).
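
A call sketch using the destructured keys from the signature above (values are illustrative):

(ic/llm->response
 {:prompt "Jason Liu is 30 years old"
  :response-schema [:map [:name :string] [:age :int]]
  :model "gpt-3.5-turbo"
  :provider :openai
  :max-tokens 256
  :temperature 0
  :api-key (System/getenv "OPENAI_API_KEY")})
;; => {:name "Jason Liu", :age 30}  ; or nil if a valid response cannot be extracted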


normalize-message (clj)

(normalize-message message)

Normalizes a message map to ensure :role is a keyword. litellm-clj 0.3.0-alpha requires roles to be keywords.
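
Expected behavior per the docstring (illustrative):

(ic/normalize-message {:role "user" :content "Hello"})
;; => {:role :user, :content "Hello"}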


normalize-messages (clj)

(normalize-messages messages)

Normalizes a collection of messages to ensure all roles are keywords.
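
Expected behavior for a collection (illustrative):

(ic/normalize-messages [{:role "system" :content "You extract structured data."}
                        {:role "user" :content "Hello"}])
;; => [{:role :system, :content "You extract structured data."}
;;     {:role :user, :content "Hello"}]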


parse-generated-body (clj)

(parse-generated-body body)

Parses the body of a response generated by an LLM API call. Extracts and converts the message content into a Clojure map.
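
A sketch assuming an OpenAI-style body whose message content is a JSON string; the exact shape this function expects is an assumption here:

(ic/parse-generated-body
 {:choices [{:message {:content "{\"name\": \"Jason Liu\", \"age\": 30}"}}]})
;; => {:name "Jason Liu", :age 30}   ; assumed: JSON content parsed into a Clojure map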


schema->system-prompt (clj)

(schema->system-prompt schema)

Converts a Malli schema into a JSON schema and generates a system prompt for responses.
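
Illustrative call; the exact wording of the generated prompt is implementation-defined:

(ic/schema->system-prompt [:map [:name :string] [:age :int]])
;; => a system-prompt string embedding the JSON Schema for the map,
;;    instructing the model to reply with matching JSON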

