bob_llm.backend_clients

Classes

OpenAICompatibleClient

Client for OpenAI-compatible APIs.

Module Contents

class bob_llm.backend_clients.OpenAICompatibleClient(api_url: str, api_key: str, model: str, logger, temperature: float = 0.7, top_p: float = 1.0, max_tokens: int = 0, stop: list = None, presence_penalty: float = 0.0, frequency_penalty: float = 0.0, timeout: float = 60.0, response_format: dict = None)

Client for OpenAI-compatible APIs.

api_url
api_key
model
logger
headers
temperature = 0.7
top_p = 1.0
max_tokens = 0
stop = None
presence_penalty = 0.0
frequency_penalty = 0.0
timeout = 60.0
response_format = None
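
The headers attribute is listed above without documentation. As a hedged sketch (the exact construction is an assumption, not taken from the source), OpenAI-compatible endpoints conventionally expect a bearer-token header derived from api_key:

```python
def build_headers(api_key: str) -> dict:
    """Sketch of typical OpenAI-compatible request headers (assumed, not from the source)."""
    headers = {"Content-Type": "application/json"}
    if api_key:  # some self-hosted backends accept an empty key
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

headers = build_headers("sk-example")
```
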
_build_payload(history: list, tools: list = None, stream: bool = False) → dict

Construct the JSON payload for an API request.

Parameters:
  • history – The list of messages in the chat history.

  • tools – An optional list of tool definitions.

  • stream – A boolean indicating whether to enable streaming.

Returns:

A dictionary representing the complete JSON payload.
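
The payload fields mirror the constructor parameters above. A minimal sketch of what such a payload plausibly looks like (the exact field handling, e.g. omitting unset options like max_tokens = 0, is an assumption based on common OpenAI-compatible clients, not confirmed by the source):

```python
def build_payload(model, history, temperature=0.7, top_p=1.0, max_tokens=0,
                  stop=None, presence_penalty=0.0, frequency_penalty=0.0,
                  response_format=None, tools=None, stream=False):
    """Assemble an OpenAI-style chat-completion payload (illustrative sketch)."""
    payload = {
        "model": model,
        "messages": history,
        "temperature": temperature,
        "top_p": top_p,
        "presence_penalty": presence_penalty,
        "frequency_penalty": frequency_penalty,
        "stream": stream,
    }
    if max_tokens > 0:  # 0 is assumed to mean "let the server decide"
        payload["max_tokens"] = max_tokens
    if stop:
        payload["stop"] = stop
    if response_format:
        payload["response_format"] = response_format
    if tools:
        payload["tools"] = tools
    return payload

payload = build_payload("example-model", [{"role": "user", "content": "Hi"}])
```
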

process_prompt(history: list, tools: list = None)

Send a non-streaming request to the LLM to get a complete response.

Parameters:
  • history – The list of messages in the chat history.

  • tools – An optional list of tool definitions.

Returns:

A (success, message) tuple.
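
The (success, message) contract can be illustrated with a sketch of parsing a non-streaming chat-completion response body into that tuple (the actual error handling inside process_prompt is an assumption):

```python
def parse_completion(body: dict):
    """Turn an OpenAI-style response body into a (success, message) tuple (sketch)."""
    try:
        # Standard OpenAI-compatible shape: choices[0].message.content
        return True, body["choices"][0]["message"]["content"]
    except (KeyError, IndexError, TypeError):
        # Fall back to the server's error message if one was returned
        return False, body.get("error", {}).get("message", "malformed response")

ok, text = parse_completion(
    {"choices": [{"message": {"role": "assistant", "content": "Hello!"}}]}
)
```
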

stream_prompt(history: list, tools: list = None)

Send a streaming request to the LLM and yield response chunks.

Parameters:
  • history – The list of messages in the chat history.

  • tools – An optional list of tool definitions.

Yields:

String chunks of the generated text content.
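
Streaming responses from OpenAI-compatible servers arrive as server-sent events. A sketch of extracting the text deltas that stream_prompt would yield (the exact parsing inside the client is an assumption; the SSE line format shown is the common OpenAI convention):

```python
import json

def iter_text_chunks(lines):
    """Yield text deltas from OpenAI-style SSE lines (illustrative sketch)."""
    for line in lines:
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data.strip() == "[DONE]":  # sentinel terminating the stream
            break
        delta = json.loads(data)["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            yield content

chunks = list(iter_text_chunks([
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]))
```
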