# LLM

## dbally.llms.base.LLM

`LLM(model_name: str, default_options: Optional[LLMOptions] = None)`

Bases: `Generic[LLMClientOptions]`, `ABC`

Abstract class for interaction with a Large Language Model.

Constructs a new LLM instance.
| PARAMETER | TYPE | DESCRIPTION |
|---|---|---|
| `model_name` | `str` | Name of the model to be used. |
| `default_options` | `Optional[LLMOptions]` | Default options to be used. |
| RAISES | DESCRIPTION |
|---|---|
| `TypeError` | If the subclass is missing the `_options_cls` attribute. |
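The `_options_cls` requirement means every concrete subclass must point at its own options dataclass. A minimal sketch of what that looks like, assuming a hypothetical provider; `MyProviderOptions` and `MyProviderLLM` are illustrative names, and any remaining abstract members of `LLM` are elided:

```python
from dataclasses import dataclass
from typing import Optional

from dbally.llms.base import LLM
from dbally.llms.clients.base import LLMOptions


# Hypothetical options dataclass for a made-up provider.
@dataclass
class MyProviderOptions(LLMOptions):
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None


class MyProviderLLM(LLM[MyProviderOptions]):
    # Omitting `_options_cls` raises TypeError at construction time,
    # as documented in the RAISES table above.
    _options_cls = MyProviderOptions
```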
### count_tokens

`count_tokens(prompt: PromptTemplate) -> int`
Counts tokens in the prompt.
| PARAMETER | TYPE | DESCRIPTION |
|---|---|---|
| `prompt` | `PromptTemplate` | Formatted prompt template with conversation and response parsing configuration. |
| RETURNS | DESCRIPTION |
|---|---|
| `int` | Number of tokens in the prompt. |
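A hedged usage sketch: `count_tokens` makes it easy to guard against oversized prompts before paying for a generation call. Here `llm` is any concrete `LLM` instance and `prompt` a formatted `PromptTemplate`; the 4,000-token budget is an arbitrary illustration:

```python
# Hypothetical guard against oversized prompts; the budget is illustrative.
MAX_PROMPT_TOKENS = 4_000

if llm.count_tokens(prompt) > MAX_PROMPT_TOKENS:
    raise ValueError("Prompt exceeds the configured token budget")
```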
### generate_text `async`

`generate_text(prompt: PromptTemplate, *, event_tracker: Optional[EventTracker] = None, options: Optional[LLMOptions] = None) -> str`
Prepares and sends a prompt to the LLM and returns the response.
| PARAMETER | TYPE | DESCRIPTION |
|---|---|---|
| `prompt` | `PromptTemplate` | Formatted prompt template with conversation and response parsing configuration. |
| `event_tracker` | `Optional[EventTracker]` | Event store used to audit the generation process. |
| `options` | `Optional[LLMOptions]` | Options to use for the LLM client. |
| RETURNS | DESCRIPTION |
|---|---|
| `str` | Text response from the LLM. |
| RAISES | DESCRIPTION |
|---|---|
| `LLMError` | If LLM text generation fails. |
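Putting the pieces together, a sketch of a generation call with error handling. It assumes the `LiteLLM` implementation shipped with db-ally and an `LLMError` import path; both are inferred rather than taken from this page, and `prompt` is a formatted `PromptTemplate` built elsewhere:

```python
import asyncio

from dbally.llms.litellm import LiteLLM  # concrete implementation; path assumed
from dbally.llms.clients.exceptions import LLMError  # path assumed


async def run(prompt) -> None:
    llm = LiteLLM(model_name="gpt-3.5-turbo")
    try:
        # event_tracker and options are keyword-only, per the signature above.
        response = await llm.generate_text(prompt, options=None)
        print(response)
    except LLMError as exc:
        # Raised when text generation fails, per the RAISES table above.
        print(f"Generation failed: {exc}")


# asyncio.run(run(prompt))  # `prompt` must be constructed first
```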
## dbally.llms.clients.base.LLMClient

Bases: `Generic[LLMClientOptions]`, `ABC`

Abstract client for direct communication with an LLM.

Constructs a new LLMClient instance.
| PARAMETER | TYPE | DESCRIPTION |
|---|---|---|
| `model_name` | `str` | Name of the model to be used. |
### call `abstractmethod` `async`

`call(conversation: ChatFormat, options: LLMClientOptions, event: LLMEvent, json_mode: bool = False) -> str`

Calls the LLM inference API.
| PARAMETER | TYPE | DESCRIPTION |
|---|---|---|
| `conversation` | `ChatFormat` | List of dicts with "role" and "content" keys, representing the chat history so far. |
| `options` | `LLMClientOptions` | Additional settings used by the LLM. |
| `event` | `LLMEvent` | LLMEvent instance whose fields should be filled during method execution. |
| `json_mode` | `bool` | Force the response to be in JSON format. Defaults to `False`. |
| RETURNS | DESCRIPTION |
|---|---|
| `str` | Response string from the LLM. |
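For implementers, a sketch of a concrete client. The provider SDK is replaced by a stub that echoes the last message, and the `LLMEvent` field being set is an assumption; only the `call` signature itself comes from the documentation above:

```python
from dataclasses import dataclass
from typing import Optional

from dbally.llms.clients.base import LLMClient, LLMOptions


# Hypothetical options dataclass for this fake provider.
@dataclass
class FakeProviderOptions(LLMOptions):
    temperature: Optional[float] = None


class FakeProviderClient(LLMClient[FakeProviderOptions]):
    async def call(self, conversation, options, event, json_mode=False) -> str:
        # A real implementation would forward `conversation`, `options`,
        # and `json_mode` to the provider's SDK; this stub just echoes
        # the most recent message.
        response = f"echo: {conversation[-1]['content']}"
        # Populate the event as the docs require; the exact field names
        # on LLMEvent are an assumption here.
        event.response = response
        return response
```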
## dbally.llms.clients.base.LLMOptions `dataclass`

Bases: `ABC`
Abstract dataclass that represents all available LLM call options.
### dict

`dict() -> Dict[str, Any]`
Creates a dictionary representation of the LLMOptions instance. If a value is None, it will be replaced with a provider-specific not-given sentinel.
| RETURNS | DESCRIPTION |
|---|---|
| `Dict[str, Any]` | A dictionary representation of the LLMOptions instance. |
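A small sketch of the sentinel behavior described above. `ExampleOptions` is a hypothetical subclass; the point is that fields left as `None` are not serialized as literal nulls but swapped for the provider's not-given marker by `dict()`:

```python
from dataclasses import dataclass
from typing import Optional

from dbally.llms.clients.base import LLMOptions


# Hypothetical options subclass for illustration.
@dataclass
class ExampleOptions(LLMOptions):
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None


opts = ExampleOptions(temperature=0.2)  # max_tokens deliberately left unset
payload = opts.dict()
# payload["temperature"] == 0.2, while payload["max_tokens"] holds the
# provider-specific not-given sentinel rather than None.
```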