(->chatml messages)
Convert `messages` in tuple format to ChatML. There is a caveat: `content` might not be a string (a likely case when the result of one LLM call, say JSON, feeds into another call).
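The tuple-to-map conversion can be sketched as below. This is an illustrative re-implementation, not the library's actual code; ChatML here simply means maps with `:role` and `:content` keys.

```clojure
;; Illustrative sketch: convert [role content] tuples into ChatML-style maps.
(defn tuples->chatml [messages]
  (mapv (fn [[role content]] {:role role :content content}) messages))

(tuples->chatml [[:system "You are a helpful assistant."]
                 [:user "Please, summarize this text."]])
;; => [{:role :system, :content "You are a helpful assistant."}
;;     {:role :user, :content "Please, summarize this text."}]
```

Note that a non-string `content` (e.g. a parsed JSON map from a previous call) passes through unchanged.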
(append-generation-instruction string-template)
If the template does not specify a generation function, append the default one.
(call-llm llm-config
{llm-impl wkk/service
model-params wkk/model-params
use-cache wkk/cache
:as properties}
messages)
Make a call to the LLM service.
- `llm-config` provides a map containing LLM service configurations; the LLM to call is specified in `properties`
- `properties` provides model parameters and other details of the LLM invocation
- `messages` contains the context/prompt to be supplied to the LLM
(chat llm-config messages inputs)
Chat completion using:
- `llm-config` holding LLM service configuration
- `messages` chat message tuples
```
[[:system "You are ..."]
 [:user "Please, ..."]
 [:assistant {...}]]
```
- `inputs` data map to fill the template slots
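A call might look like the sketch below; the contents of `llm-config`, the system prompt, and the `{{text}}` slot name are assumptions for illustration.

```clojure
;; Illustrative usage sketch; `llm-config` is assumed to hold service
;; credentials/endpoints, and {{text}} is a template slot filled from inputs.
(chat llm-config
      [[:system "You are a translator."]
       [:user "Translate to French: {{text}}"]]
      {:text "Good morning"})
```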
(complete-graph llm-config graph vars-map)
Completion case when we are processing a prompt graph. The main work here is constructing the output format with `usage` and `completions` sections.
(complete-template llm-config template vars-map)
Completion for the case when we have a simple string `prompt`.
Result map key holding LLM generated parts
Result map key holding full chat conversation including generated parts
Simple string template generation does not create var names for the completions, unlike Map generation where map keys are the var names. This is the key for the completion entry.
Simple string template generation does not create var names for the completions, unlike Map generation where map keys are the var names. This is the key for the prompt entry.
(find-refering-templates var-name context-map)
Given all the templates in a `context-map`, find out which ones have references to `var-name`.
(generate messages)
(generate messages inputs)
(generate env messages inputs)
Generate completions for various modes. Generation mode is determined by the type of `messages`:
- A vector of tuples triggers `chat` mode completion
```
[[:system "You are ..."]
 [:user "Please, ..."]
 [:assistant {...}]]
```
- A map triggers `graph` mode completion
```
{:question-answer "Question: {{question}}"
 :answer {...}}
```
- A `string` results in `template` completion mode

`env/config` holds configuration to make LLM calls and `inputs` has a data map for template slot filling.
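The three modes might be invoked as in this sketch. The model name, slot names, and the exact graph wiring are illustrative assumptions, not verified against the library.

```clojure
;; chat mode: a vector of role/content tuples
(generate
 [[:system "You are a helpful assistant."]
  [:user "What is the capital of {{country}}?"]]
 {:country "France"})

;; graph mode: a map where one node's template references another node
(generate
 {:question-answer "Question: {{question}}  Answer: {{answer}}"
  :answer (llm :gpt-4)}
 {:question "What is 2 + 2?"})

;; template mode: a plain string
(generate "Summarize the following text: {{text}}" {:text "..."})
```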
(llm service-or-model & args)
A helper function to create an LLM spec for calls during the generation process. It returns a map constructed from `service-or-model` and `args`.

`service-or-model` can be one of:
- `service` (like openai, mistral, ...); in this case `args` need to specify `{:llm/model-params {:model :x}}`
- `model`; in this case `service` will be determined from `env/model-providers`, no need to specify `{:llm/model-params {:model :x}}`

```
{:llm/service service
 :llm/cache true
 :llm/model-params params}
```
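The two construction paths might look like this; the service/model keywords and the key-value shape of `args` are assumptions for illustration.

```clojure
;; 1. By service: the model must be supplied via the args
(llm :openai :llm/model-params {:model :gpt-4})

;; 2. By model: the service is looked up in `env/model-providers`
(llm :gpt-4)
```

Either call yields a spec map of the `{:llm/service ... :llm/model-params ...}` shape shown above.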
(run-node-function node available-data)
Run a function definition in the prompt tree.
- `node` is the function-defining node in the tree
- `available-data` already resolved data, must contain the function params
Result map key holding LLM token usage data