# Natural Language Responder
The `NLResponder` class can be used to transform a database output into a natural language response to user queries.
```mermaid
flowchart LR
    Q[Do we have any Data Scientists?]
    Result["`[{'experience': 5, 'position': 'Data Scientist'},
    {'experience': 2, 'position': 'Data Scientist'}]`"]
    Responder[Natural Language Responder]
    Q --> Responder
    Result --> Responder
    Responder --> Answer["Yes, we have 2 Data Scientists in our company."]
```
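In plain data terms, the responder receives the user question together with the raw query rows shown in the diagram. The counting below is only an illustration of what the LLM infers from those rows; the real class delegates all of the reasoning to the model:

```python
# Inputs to the Natural Language Responder, taken from the diagram above.
question = "Do we have any Data Scientists?"
results = [
    {"experience": 5, "position": "Data Scientist"},
    {"experience": 2, "position": "Data Scientist"},
]

# The responder hands both to an LLM, which phrases the answer, e.g.
# "Yes, we have 2 Data Scientists in our company."
n_matches = sum(row["position"] == "Data Scientist" for row in results)
```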
The response is produced by the `generate_response` method, which uses an LLM to generate a natural language answer to the user's question. Here's a breakdown of the steps:
- The results of the query execution are converted to a Markdown table.
- The number of tokens in `nl_responder_prompt_template`, supplemented with the table from step 1, is counted.
- If the token count exceeds a predefined maximum (`self.max_tokens_count`), a response is generated using `iql_explainer_prompt_template`, which returns a description of the results based on the query and omits the table analysis step. Otherwise, a response is generated using `nl_responder_prompt_template`.
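The steps above can be sketched in plain Python. The helper names (`rows_to_markdown`, `pick_template`) and the whitespace-based token count are illustrative assumptions, not db-ally's actual internals (which use the model's tokenizer):

```python
from typing import Any, Dict, List

def rows_to_markdown(rows: List[Dict[str, Any]]) -> str:
    """Render query results as a Markdown table (step 1)."""
    if not rows:
        return ""
    headers = list(rows[0])
    lines = [
        "| " + " | ".join(headers) + " |",
        "|" + "---|" * len(headers),
    ]
    for row in rows:
        lines.append("| " + " | ".join(str(row[h]) for h in headers) + " |")
    return "\n".join(lines)

def pick_template(question: str, table: str, max_tokens_count: int = 4096) -> str:
    """Steps 2-3: count tokens and choose the prompt template."""
    # Crude whitespace token estimate; a real implementation would use
    # the LLM's own tokenizer.
    tokens = len(question.split()) + len(table.split())
    if tokens > max_tokens_count:
        # Table too large: describe the query instead of analysing the rows.
        return "iql_explainer_prompt_template"
    return "nl_responder_prompt_template"

table = rows_to_markdown([{"experience": 5, "position": "Data Scientist"}])
template = pick_template("Do we have any Data Scientists?", table)
```

With a small table the full `nl_responder_prompt_template` path is taken; feeding an oversized table flips the choice to the explainer template.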
**Tip:** To better understand the general idea, visit the NL Responder concept page.
## dbally.nl_responder.nl_responder.NLResponder

```python
NLResponder(
    llm: LLM,
    prompt_template: Optional[PromptTemplate[NLResponsePromptFormat]] = None,
    explainer_prompt_template: Optional[PromptTemplate[QueryExplanationPromptFormat]] = None,
    max_tokens_count: int = 4096,
)
```
Class used to generate a natural language response from the database output.

Constructs a new `NLResponder` instance.
| PARAMETER | DESCRIPTION | TYPE |
|---|---|---|
| `llm` | LLM used to generate the natural language response. | `LLM` |
| `prompt_template` | Template for the prompt used to generate the NL response; if not set, a default template is used. | `Optional[PromptTemplate[NLResponsePromptFormat]]` |
| `max_tokens_count` | Maximum number of tokens that can be used in the prompt. | `int` |
Source code in `src/dbally/nl_responder/nl_responder.py`
### generate_response `async`

```python
generate_response(
    result: ViewExecutionResult,
    question: str,
    event_tracker: EventTracker,
    llm_options: Optional[LLMOptions] = None,
) -> str
```

Uses an LLM to generate a response in natural language form.
| PARAMETER | DESCRIPTION | TYPE |
|---|---|---|
| `result` | Object representing the result of the query execution. | `ViewExecutionResult` |
| `question` | User question. | `str` |
| `event_tracker` | Event store used to audit the generation process. | `EventTracker` |
| `llm_options` | Options to use for the LLM client. | `Optional[LLMOptions]` |
| RETURNS | DESCRIPTION |
|---|---|
| `str` | Natural language response to the user question. |
| RAISES | DESCRIPTION |
|---|---|
| `LLMError` | If LLM text generation fails. |