
bosquet.llm.generator


->chatmlclj

(->chatml messages)

Convert `messages` in tuple format to ChatML. There is a caveat: `content` might not be a string (a likely case when the result of one LLM call, say JSON, feeds into another call).
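The conversion is straightforward to sketch. Below is a minimal, hypothetical version of the tuple-to-ChatML conversion (not the library's actual implementation); non-string content is serialized with `pr-str` to keep the sketch self-contained:

```clojure
;; Hypothetical sketch of tuple->ChatML conversion, not the library's
;; actual code. Non-string content is serialized with pr-str here.
(defn tuples->chatml [messages]
  (mapv (fn [[role content]]
          {:role    role
           :content (if (string? content) content (pr-str content))})
        messages))

(tuples->chatml [[:system "You are a poet"]
                 [:user {:data [1 2 3]}]])
;; => [{:role :system, :content "You are a poet"}
;;     {:role :user, :content "{:data [1 2 3]}"}]
```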

append-generation-instructionclj

(append-generation-instruction string-template)

If the template does not specify a generation function, append the default one.

call-llmclj

(call-llm llm-config
          {llm-impl wkk/service
           model-params wkk/model-params
           use-cache wkk/cache
           :as properties}
          messages)

Make a call to the LLM service.
- `llm-config` provides a map of LLM service configurations; the LLM to call is specified in
- `properties`, providing model parameters and other details of the LLM invocation
- `messages` contains the context/prompt to be supplied to the LLM.
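The dispatch implied by the destructured `properties` can be sketched roughly as follows; the `:complete-fn` shape and the `:echo` service are illustrative assumptions, not the library's actual configuration format:

```clojure
;; Hypothetical dispatch sketch: look up the service implementation in
;; llm-config and hand it the messages plus model params. The config
;; shape (:complete-fn) is an assumption for illustration only.
(defn call-llm* [llm-config
                 {service :llm/service params :llm/model-params}
                 messages]
  (let [{:keys [complete-fn]} (get llm-config service)]
    (complete-fn {:messages messages :params params})))

(call-llm* {:echo {:complete-fn (fn [{:keys [messages]}]
                                  {:completion (last messages)})}}
           {:llm/service :echo}
           [[:user "hi"]])
;; => {:completion [:user "hi"]}
```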

chatclj

(chat llm-config messages inputs)

Chat completion using:
- `llm-config` holding LLM service configuration
- `messages` chat message tuples:
  ```
  [[:system "You are ..."]
   [:user "Please, ..."]
   [:assistant {...}]]
  ```
- `inputs` data map to fill the template slots
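Before the call is made, `inputs` fills `{{slot}}` placeholders in the message content. A simplified sketch of that substitution (the library uses a full templating engine; `fill-slots` is a hypothetical name):

```clojure
(require '[clojure.string :as string])

;; Hypothetical slot-filling sketch: replaces {{slot}} markers with
;; values from the inputs map. The real library delegates this to a
;; templating engine; this only illustrates the idea.
(defn fill-slots [template inputs]
  (reduce (fn [s [k v]]
            (string/replace s (str "{{" (name k) "}}") (str v)))
          template
          inputs))

(fill-slots "Write a {{format}} about {{topic}}"
            {:format "haiku" :topic "programming"})
;; => "Write a haiku about programming"
```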

complete-graphclj

(complete-graph llm-config graph vars-map)

Completion case when we are processing a prompt graph. The main work here is constructing the output format with `usage` and `completions` sections.

complete-templateclj

(complete-template llm-config template vars-map)

Completion for the case when we have a simple string `prompt`.

completionsclj

Result map key holding LLM generated parts


conversationclj

Result map key holding full chat conversation including generated parts


default-template-completionclj

The simple string template generation case does not create var names for the completions, unlike map generation where the map keys are the var names.

This is the key for the completion entry.

default-template-promptclj

The simple string template generation case does not create var names for the completions, unlike map generation where the map keys are the var names.

This is the key for the prompt entry.

find-refering-templatesclj

(find-refering-templates var-name context-map)

Given all the templates in a `context-map`, find out which ones have references to `var-name`.
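A sketch of how such reference detection could work, assuming `{{var}}` slot syntax (`referring-templates` is an illustrative name, not the library source):

```clojure
(require '[clojure.string :as string])

;; Hypothetical sketch of reference detection: keep the entries of the
;; context map whose template strings mention the {{var}} slot.
(defn referring-templates [var-name context-map]
  (let [slot (str "{{" (name var-name) "}}")]
    (into {}
          (filter (fn [[_ template]]
                    (and (string? template)
                         (string/includes? template slot))))
          context-map)))

(referring-templates :question
                     {:question-answer "Question: {{question}}"
                      :greeting        "Hello there"})
;; => {:question-answer "Question: {{question}}"}
```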

funclj

(fun impl args)

gen-environmentclj

(gen-environment llm-config context vars-map)

generateclj

(generate messages)
(generate messages inputs)
(generate env messages inputs)


Generate completions for various modes. Generation mode is determined
by the type of the `messages`:

- Vector of tuples, triggers `chat` mode completion
  ```
   [[:system "You are ..."]
    [:user "Please, ..."]
    [:assistant {...}]]
  ```
- A map, triggers `graph` mode completion
  ```
  {:question-answer "Question: {{question}}"
   :answer          {...}}
  ```
- A `string` results in a `template` completion mode

`env/config` holds configuration to make LLM calls and `inputs` has a data map
for template slot filling.
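Putting the modes together, a hedged usage sketch: the `:openai` service and `:gpt-4` model are placeholder assumptions, and the calls themselves are shown as comments since they require configured LLM credentials:

```clojure
;; Graph-mode prompt map; :openai and :gpt-4 are placeholder assumptions.
(def qa-graph
  {:question "What is the capital of {{country}}?"
   :answer   {:llm/service      :openai
              :llm/model-params {:model :gpt-4}}})

;; Chat mode:     (generate [[:system "You are ..."] [:user "Please, ..."]])
;; Graph mode:    (generate qa-graph {:country "Lithuania"})
;; Template mode: (generate "Answer briefly: {{question}}" {:question "..."})
```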

llmclj

(llm service-or-model & args)

A helper function to create an LLM spec for calls during the generation process. It returns a map constructed from `service-or-model` and `args`. `service-or-model` can be one of:
- `service` (like openai, mistral, ...); in this case `args` needs to specify `{:llm/model-params {:model :x}}`
- `model`; in this case `service` will be determined from `env/model-providers`, no need to specify `{:llm/model-params {:model :x}}`
```
{:llm/service      service
 :llm/cache        true
 :llm/model-params params}
```
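Given the documented return shape, a simplified version of this helper can be sketched as follows (a hypothetical reconstruction; the real function also resolves bare model names via `env/model-providers`, which is omitted here):

```clojure
;; Hypothetical reconstruction of the helper's documented return shape;
;; model-name resolution via env/model-providers is omitted.
(defn llm-spec [service & {:as args}]
  (merge {:llm/service service} args))

(llm-spec :openai :llm/model-params {:model :gpt-4})
;; => {:llm/service :openai, :llm/model-params {:model :gpt-4}}
```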

run-node-functionclj

(run-node-function node available-data)

Run a function definition in the prompt tree.
- `node` is the function-defining node in the tree
- `available-data` is already-resolved data; must contain the function params

top-level-templateclj

(top-level-template index context)

usageclj

Result map key holding LLM token usage data

