bob_llm.llm_node

Classes

LLMNode

ROS 2 node that provides an interface to LLMs and VLMs.

Functions

main([args])

Module Contents

class bob_llm.llm_node.LLMNode

Bases: rclpy.node.Node

ROS 2 node that provides an interface to LLMs and VLMs.

This node handles prompts, manages conversation history, and executes tools.

chat_history = []
_prefix_history_len
_is_generating = False
_cancel_requested = False
_prompt_queue
_queue_timer = None
_queue_timer_period = 0.1
sub
pub_response
pub_stream
pub_latest_turn
_initialize_chat_history()

Populate the initial chat history from ROS parameters.
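The docs say the initial history comes from ROS parameters. A minimal sketch of that seeding logic, assuming a system-prompt parameter (the parameter name and message shape are illustrative, not taken from the node):

```python
# Sketch: seed a fresh chat history from a system-prompt parameter value.
# The "system" role message format follows the common OpenAI-style convention;
# the actual parameter name used by LLMNode is an assumption here.
def initialize_chat_history(system_prompt: str) -> list:
    """Return a new history, seeded with a system message if one is set."""
    history = []
    if system_prompt:
        history.append({"role": "system", "content": system_prompt})
    return history
```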

load_llm_client()

Load and configure the LLM client based on ROS parameters.

_load_tools()

Dynamically load tool modules specified in the 'tool_interfaces' parameter.

Returns:

A tuple containing (all_tools, all_functions).
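The dynamic-loading step can be sketched with `importlib`. This is an illustration only: the assumption that each tool module exposes `tools` (schemas) and `functions` (callables) attributes is hypothetical, chosen to match the documented `(all_tools, all_functions)` return shape:

```python
import importlib

def load_tools(tool_interfaces: list):
    """Import each named module and aggregate its tools and functions.

    Assumes (hypothetically) that a tool module defines a `tools` list of
    schemas and a `functions` dict mapping names to callables; modules
    lacking these attributes contribute nothing.
    """
    all_tools, all_functions = [], {}
    for module_name in tool_interfaces:
        module = importlib.import_module(module_name)
        all_tools.extend(getattr(module, "tools", []))
        all_functions.update(getattr(module, "functions", {}))
    return all_tools, all_functions
```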

_publish_latest_turn(user_prompt: str, assistant_message: dict)

Process the latest conversational turn for publishing and logging.

Parameters:
  • user_prompt – The string content of the user’s latest prompt.

  • assistant_message – The final message dictionary from the assistant.

_get_truncated_history()

Return a copy of the chat history with long strings truncated.
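A sketch of the truncation idea: deep-copy the history so the original is untouched, then shorten long string contents. The 200-character cutoff and the `"..."` suffix are assumed example values:

```python
import copy

def get_truncated_history(history: list, max_len: int = 200) -> list:
    """Return a deep copy of the history with long string contents shortened.

    The cutoff length is an illustrative default, not the node's actual value.
    """
    truncated = copy.deepcopy(history)
    for message in truncated:
        content = message.get("content")
        if isinstance(content, str) and len(content) > max_len:
            message["content"] = content[:max_len] + "..."
    return truncated
```

Deep-copying matters here: a shallow copy would mutate the live `chat_history` dicts when truncating, which is presumably why the docstring stresses that a copy is returned.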

_trim_chat_history()

Prevent the chat history from growing indefinitely.

Trims the oldest conversational turns so the history stays within the configured limit.
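The attribute list above includes `_prefix_history_len`, which suggests a fixed prefix (e.g. the system message) is preserved while older turns are dropped. A hedged sketch of that pattern, with an assumed maximum length:

```python
def trim_chat_history(history: list, prefix_len: int, max_len: int) -> list:
    """Drop the oldest messages after the fixed prefix to fit within max_len.

    `prefix_len` mirrors the node's `_prefix_history_len` idea; the exact
    trimming policy (per-message vs per-turn) is an assumption here.
    """
    if len(history) <= max_len:
        return history
    excess = len(history) - max_len
    # Keep the prefix intact and remove the oldest messages after it.
    return history[:prefix_len] + history[prefix_len + excess:]
```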

prompt_callback(msg)

Process an incoming prompt from the 'llm_prompt' topic.

_process_queue_callback()

Timer callback to process queued prompts.

bob_llm.llm_node.main(args=None)