evolving_ideas.infra.responder¶
Attributes¶
- logger
- LLM_BACKENDS
Classes¶
- LLMResponder – Wraps the underlying LLM for easier substitution/testing.
Functions¶
- get_llm_backend(name)
Module Contents¶
- evolving_ideas.infra.responder.logger¶
- evolving_ideas.infra.responder.LLM_BACKENDS¶
- evolving_ideas.infra.responder.get_llm_backend(name: str) → evolving_ideas.infra.llm_interface.LLMInterface¶
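Example (a minimal sketch; "openai" is a hypothetical backend name, since the keys registered in LLM_BACKENDS are not listed on this page):

    from evolving_ideas.infra.responder import get_llm_backend

    # Resolve a concrete LLMInterface implementation by its registered name.
    backend = get_llm_backend("openai")  # hypothetical key; use a name actually present in LLM_BACKENDS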
- class evolving_ideas.infra.responder.LLMResponder¶
Wraps the underlying LLM for easier substitution/testing.
- ask(prompt: str, context='You are a helpful assistant.') → str¶
Ask the LLM a question with a given prompt and context.
- Parameters:
prompt (str) – The prompt to send to the LLM.
context (str) – Additional context for the LLM (default is “You are a helpful assistant.”).
- Returns:
The LLM’s response.
- Return type:
str
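Example (a minimal sketch, assuming LLMResponder can be constructed without arguments; its constructor is not documented above):

    from evolving_ideas.infra.responder import LLMResponder

    responder = LLMResponder()  # constructor arguments, if any, are not shown on this page
    answer = responder.ask(
        "Summarize the main idea in one sentence.",
        context="You are a concise research assistant.",
    )
    print(answer)  # the LLM's response as a str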
- chat(chatlog: list) → str¶
Start a chat session with the LLM.
- Parameters:
chatlog (list) – A list of messages to include in the chat context.
- Returns:
The LLM’s response.
- Return type:
str
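Example (a minimal sketch; the chatlog message format is assumed to follow the common role/content dict convention, which this page does not specify):

    from evolving_ideas.infra.responder import LLMResponder

    responder = LLMResponder()  # assumed no-argument construction, as above
    chatlog = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What does LLMResponder wrap?"},
    ]
    reply = responder.chat(chatlog)  # returns the LLM's response as a str
    print(reply)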