
instructor-clj.core


call-llm-api

(call-llm-api api-url headers body)

Makes a POST request to the specified LLM API endpoint with the given headers and body.

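A minimal sketch of a call; the endpoint URL, header names, and body shape are assumptions based on the OpenAI chat-completions API, not part of this docstring:

```clojure
;; POST to an OpenAI-compatible endpoint; the api-key is read from
;; the environment, and the body follows the chat-completions format.
(call-llm-api
 "https://api.openai.com/v1/chat/completions"
 {"Authorization" (str "Bearer " (System/getenv "OPENAI_API_KEY"))
  "Content-Type"  "application/json"}
 {:model    "gpt-3.5-turbo"
  :messages [{:role "user" :content "Hello"}]})
```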

create-chat-completion

(create-chat-completion chat-completion-fn client-params)
(create-chat-completion chat-completion-fn client-params opts)


Creates a chat completion using OpenAI API.

This function takes the OpenAI chat completion function as its first argument.

The second argument is a map with keys :messages, :model, and :response-model.
:messages should be a vector of maps, each representing a message with keys :role and :content.
:model specifies the OpenAI model to use for generating completions.
:response-model is a map specifying the schema and name of the response model.

Alternatively, the api-key, organization, and api-endpoint can be passed in
the options argument of each API function.
https://github.com/wkok/openai-clojure/blob/main/doc/01-usage-openai.md#options

Request options may also be set on the underlying hato HTTP client by adding
a :request map to :options, for example to set the request timeout.
https://github.com/wkok/openai-clojure/blob/main/doc/01-usage-openai.md#request-options

Example:
(require '[instructor-clj.core :as ic])
(require '[wkok.openai-clojure.api :as client])

(def User
  [:map
    [:name :string]
    [:age :int]])

(ic/create-chat-completion
 client/create-chat-completion
 {:messages [{:role "user", :content "Jason Liu is 30 years old"}]
  :model "gpt-3.5-turbo"
  :response-model User})

Returns a map with extracted information in a structured format.

default-client-params


instruct

(instruct prompt
          response-schema
          &
          {:keys [api-key _max-tokens _model _temperature max-retries]
           :as client-params
           :or {max-retries 0}})


Attempts to obtain a valid response from the LLM based on the given prompt and schema,
retrying up to `max-retries` times if necessary.
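A minimal sketch of a call; the schema and keyword arguments follow the signature above, while the prompt text and the :api-key source are assumptions for illustration:

```clojure
;; Define a malli schema for the structured result, then ask the LLM
;; to extract it from free text, retrying up to 2 times on failure.
(def User
  [:map
   [:name :string]
   [:age :int]])

(instruct "John Doe is 30 years old"
          User
          :api-key (System/getenv "OPENAI_API_KEY")
          :max-retries 2)
```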

llm->response

(llm->response {:keys [prompt response-schema api-key max-tokens model
                       temperature]})


The function performs the LLM call and tries to destructure the result to get the actual response.
Returns nil when the LLM is unable to generate the expected response.

@TODO Add the ability to plug in different LLMs
@TODO Getting the response is brittle and not extensible for different LLMs
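A minimal sketch of a call; the map keys follow the destructuring in the signature above, while the model, max-tokens, and temperature values are assumptions:

```clojure
;; Perform the LLM call directly with explicit parameters; returns
;; the destructured response, or nil if no valid response is produced.
(llm->response {:prompt          "Jason Liu is 30 years old"
                :response-schema [:map [:name :string] [:age :int]]
                :api-key         (System/getenv "OPENAI_API_KEY")
                :max-tokens      256
                :model           "gpt-3.5-turbo"
                :temperature     0})
```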

parse-generated-body

(parse-generated-body body)


Parses the body of a response generated by an LLM API call.
Extracts and converts the message content into a Clojure map.
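A minimal sketch of a call; the response shape mirrors the OpenAI chat-completions format, which is an assumption here:

```clojure
;; The message content is a JSON string generated by the LLM;
;; parse-generated-body extracts it and converts it to a Clojure map.
(parse-generated-body
 {:choices [{:message {:content "{\"name\": \"Jason Liu\", \"age\": 30}"}}]})
```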

schema->system-prompt

(schema->system-prompt schema)

Converts a malli schema into a JSON schema and generates a system prompt for responses.

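A minimal sketch of a call; the exact wording of the generated prompt is an assumption:

```clojure
;; Produce a system prompt embedding the JSON-schema form of the
;; malli schema, instructing the model to answer in that shape.
(schema->system-prompt
 [:map
  [:name :string]
  [:age :int]])
```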
