LLMClient
LLMClient is an abstract class designed to interact with LLMs.
Concrete implementations for specific LLMs, like OpenAILLMClient, can be found in this section of our documentation.
LLMClient configuration options include: template, format, event tracker, and optional generation parameters like frequency_penalty, max_tokens, and temperature. The client constructs prompts using the PromptBuilder instance.
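For illustration, here is a minimal usage sketch. The OpenAILLMClient import path and constructor arguments, as well as the PromptTemplate construction, are assumptions for this example rather than API documented on this page:

```python
import asyncio

# NOTE: the import paths and constructor arguments below are assumptions.
from dbally.llm_client.openai_client import OpenAILLMClient
from dbally.prompts import PromptTemplate

# A chat-style template with a placeholder filled in via the `fmt` dict.
template = PromptTemplate(chat=(
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{question}"},
))  # constructor assumed

async def main() -> None:
    client = OpenAILLMClient(model_name="gpt-3.5-turbo")  # arguments assumed
    response = await client.text_generation(template, fmt={"question": "What is db-ally?"})
    print(response)

asyncio.run(main())
```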
dbally.llm_client.base.LLMClient
Bases: ABC
Abstract client for interaction with an LLM. It accepts parameters including the template, format, event tracker, and optional generation parameters like frequency_penalty, max_tokens, and temperature (the full list of options is provided by the LLMOptions class). It constructs a prompt using the PromptBuilder instance and generates text using the self.call method.
Source code in src/dbally/llm_client/base.py
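Since call is the only abstract method (its signature is documented below), a subclass can be sketched roughly as follows; everything except the call signature is an assumption, and the string annotations stand in for types whose import paths this page does not show:

```python
from typing import Dict, Optional, Union

from dbally.llm_client.base import LLMClient

class EchoLLMClient(LLMClient):
    """Toy client that echoes the prompt back, e.g. for offline testing."""

    async def call(
        self,
        prompt: Union[str, "ChatFormat"],
        response_format: Optional[Dict[str, str]],
        options: "LLMOptions",
        event: "LLMEvent",
    ) -> str:
        # A real implementation would call an LLM API here and fill the
        # fields of `event` with request/response metadata.
        return prompt if isinstance(prompt, str) else str(prompt)
```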
text_generation (async)

text_generation(template: PromptTemplate, fmt: dict, *, event_tracker: Optional[EventTracker] = None, frequency_penalty: Optional[float] = 0.0, max_tokens: Optional[int] = 128, n: Optional[int] = 1, presence_penalty: Optional[float] = 0.0, seed: Optional[int] = None, stop: Optional[Union[str, List[str]]] = None, temperature: Optional[float] = 1.0, top_p: Optional[float] = 1.0) -> str
For a given PromptTemplate and format dict, creates a prompt and returns the response from the LLM.
| RETURNS | DESCRIPTION |
|---|---|
| str | Text response from the LLM. |
Source code in src/dbally/llm_client/base.py
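Inside an async context, the optional generation parameters from the signature above can be tuned per call. In this sketch, the EventTracker import path and constructor are assumptions:

```python
from dbally.audit.event_tracker import EventTracker  # import path assumed

# `client` and `template` as in the earlier sketch; run inside an async context.
response = await client.text_generation(
    template,
    fmt={"question": "How many users signed up last week?"},
    event_tracker=EventTracker(),  # assumed no-argument constructor
    max_tokens=256,                # allow a longer completion than the 128 default
    temperature=0.2,               # lower temperature for more deterministic output
    frequency_penalty=0.5,
)
```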
call (abstractmethod, async)

call(prompt: Union[str, ChatFormat], response_format: Optional[Dict[str, str]], options: LLMOptions, event: LLMEvent) -> str
Calls the LLM API endpoint.
| PARAMETER | DESCRIPTION | TYPE |
|---|---|---|
| prompt | Prompt passed to the LLM. | Union[str, ChatFormat] |
| response_format | Optional argument used in the OpenAI API to force JSON output. | Optional[Dict[str, str]] |
| options | Additional settings used by the LLM. | LLMOptions |
| event | An LLMEvent instance whose fields should be filled during the method execution. | LLMEvent |
| RETURNS | DESCRIPTION |
|---|---|
| str | Response string from the LLM. |
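Per the class docstring, text_generation builds the prompt and delegates to call, so call is rarely invoked directly. For completeness, a heavily hedged sketch of a direct invocation follows; the LLMOptions and LLMEvent import paths and constructors are assumptions, and the response_format value follows the OpenAI convention for forced JSON output:

```python
from dbally.data_models.llm_options import LLMOptions  # import path assumed
from dbally.data_models.audit import LLMEvent          # import path assumed

# Run inside an async context; `client` is a concrete LLMClient instance.
response = await client.call(
    prompt="Return the row count as JSON.",
    response_format={"type": "json_object"},  # OpenAI-style forced JSON output
    options=LLMOptions(max_tokens=128),       # constructor arguments assumed
    event=LLMEvent(),                         # constructor assumed
)
```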