LLMClient

LLMClient is an abstract class designed to interact with LLMs.

Concrete implementations for specific LLMs, like OpenAILLMClient, can be found in this section of our documentation.

Its text_generation method accepts a prompt template, a format dict, an optional event tracker, and optional generation parameters such as frequency_penalty, max_tokens, and temperature.

Prompts are constructed with a PromptBuilder instance before being sent to the model.
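For illustration, a minimal usage sketch is shown below. It assumes a concrete OpenAILLMClient; the import paths, the OpenAILLMClient constructor arguments, and the PromptTemplate construction are assumptions and may differ in your version of the library.

import asyncio

from dbally.llm_client.openai_client import OpenAILLMClient  # assumed import path
from dbally.prompts import PromptTemplate  # assumed import path

# Hypothetical template: a system message and a user message with a {question} placeholder.
template = PromptTemplate((
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{question}"},
))


async def main() -> None:
    # Constructor arguments of OpenAILLMClient are illustrative and may differ.
    client = OpenAILLMClient(model_name="gpt-3.5-turbo")
    answer = await client.text_generation(
        template,
        fmt={"question": "How many users signed up last week?"},
        max_tokens=64,
        temperature=0.0,
    )
    print(answer)


asyncio.run(main())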

dbally.llm_client.base.LLMClient

LLMClient(model_name: str)

Bases: ABC

Abstract client for interaction with an LLM.

It accepts parameters including the template, format, event tracker, and optional generation parameters like frequency_penalty, max_tokens, and temperature (the full list of options is provided by the LLMOptions class). It constructs a prompt using the PromptBuilder instance and generates text using the self.call method.

Source code in src/dbally/llm_client/base.py
def __init__(self, model_name: str):
    self.model_name = model_name
    self._prompt_builder = PromptBuilder(self.model_name)

model_name instance-attribute

model_name = model_name

text_generation async

text_generation(template: PromptTemplate, fmt: dict, *, event_tracker: Optional[EventTracker] = None, frequency_penalty: Optional[float] = 0.0, max_tokens: Optional[int] = 128, n: Optional[int] = 1, presence_penalty: Optional[float] = 0.0, seed: Optional[int] = None, stop: Optional[Union[str, List[str]]] = None, temperature: Optional[float] = 1.0, top_p: Optional[float] = 1.0) -> str

For a given PromptTemplate and format dict, creates a prompt and returns the response from the LLM.

RETURNS DESCRIPTION
str

Text response from LLM.

Source code in src/dbally/llm_client/base.py
async def text_generation(  # pylint: disable=R0913
    self,
    template: PromptTemplate,
    fmt: dict,
    *,
    event_tracker: Optional[EventTracker] = None,
    frequency_penalty: Optional[float] = 0.0,
    max_tokens: Optional[int] = 128,
    n: Optional[int] = 1,
    presence_penalty: Optional[float] = 0.0,
    seed: Optional[int] = None,
    stop: Optional[Union[str, List[str]]] = None,
    temperature: Optional[float] = 1.0,
    top_p: Optional[float] = 1.0,
) -> str:
    """
    For a given PromptTemplate and format dict, creates a prompt and
    returns the response from the LLM.

    Returns:
        Text response from LLM.
    """

    options = LLMOptions(
        frequency_penalty=frequency_penalty,
        max_tokens=max_tokens,
        n=n,
        presence_penalty=presence_penalty,
        seed=seed,
        stop=stop,
        temperature=temperature,
        top_p=top_p,
    )

    prompt = self._prompt_builder.build(template, fmt)

    event = LLMEvent(prompt=prompt, type=type(template).__name__)

    event_tracker = event_tracker or EventTracker()
    async with event_tracker.track_event(event) as span:
        event.response = await self.call(prompt, template.response_format, options, event)
        span(event)

    return event.response
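The snippet below sketches how the generation parameters are packed into LLMOptions and how an explicit EventTracker can be supplied. Client and template construction follow the example above; the EventTracker import path and all concrete values are assumptions.

from dbally.audit.event_tracker import EventTracker  # assumed import path


async def generate_with_tracking(client, template) -> str:
    # The tracker records the LLMEvent created inside text_generation.
    tracker = EventTracker()
    # Each keyword argument below is packed into an LLMOptions instance
    # before being passed to the client's call method.
    return await client.text_generation(
        template,
        fmt={"question": "List the five most recent orders."},
        event_tracker=tracker,
        max_tokens=256,
        temperature=0.2,
        seed=42,
        stop=["```"],
    )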

call abstractmethod async

call(prompt: Union[str, ChatFormat], response_format: Optional[Dict[str, str]], options: LLMOptions, event: LLMEvent) -> str

Calls the LLM API endpoint.

PARAMETER DESCRIPTION
prompt

prompt passed to the LLM.

TYPE: Union[str, ChatFormat]

response_format

Optional argument used by the OpenAI API to force JSON output.

TYPE: Optional[Dict[str, str]]

options

Additional settings used by the LLM.

TYPE: LLMOptions

event

An LLMEvent instance whose fields should be filled during method execution.

TYPE: LLMEvent

RETURNS DESCRIPTION
str

Response string from LLM.

Source code in src/dbally/llm_client/base.py
@abc.abstractmethod
async def call(
    self,
    prompt: Union[str, ChatFormat],
    response_format: Optional[Dict[str, str]],
    options: LLMOptions,
    event: LLMEvent,
) -> str:
    """
    Calls the LLM API endpoint.

    Args:
        prompt: prompt passed to the LLM.
        response_format: Optional argument used by the OpenAI API to force JSON output.
        options: Additional settings used by the LLM.
        event: An LLMEvent instance whose fields should be filled during method execution.

    Returns:
        Response string from LLM.
    """