bob_llama_cpp.llm

Classes

BaseNode

Lifecycle ROS Node

LlmNode

LLM ROS Node wrapper

Functions

main([args])

Module Contents

class bob_llama_cpp.llm.BaseNode

Bases: bob_llama_cpp.lnode.LNode

Lifecycle ROS Node

class bob_llama_cpp.llm.LlmNode

Bases: BaseNode

LLM ROS Node wrapper

system_prompt = None
tools_module = None
n_keep = -1
sub_in = None
sentence = ''
session_thread = None
queue = []
history = []
pub_generator
pub_out
pub_sentence
pub_dialog
configure() → None

Start or restart initialization of the node.

destroy() → None

Stop the LLM thread and destroy the subscriber.

llm_in_callback(msg: std_msgs.msg.String) → None
tokenize(text: str, add_special: bool = False) → list

Tokenize text and return the list of tokens.

jsonfy(dialog, prompt_user) → str

Format a JSON object from an array of user/assistant messages. prompt_user should match ^username:.*
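The package's actual implementation is not shown here; as an illustration, a dependency-free sketch of such a formatter could look like the following. The helper name jsonfy_sketch, the alternating user/assistant role assignment, and the output layout are assumptions, not the module's real behavior.

```python
import json
import re


def jsonfy_sketch(dialog: list, prompt_user: str) -> str:
    """Build a JSON chat payload from stored dialog turns plus a new prompt.

    `prompt_user` is expected to match ^username:.* — the text before the
    first colon names the user, the remainder is the new prompt.
    """
    match = re.match(r"^([^:]+):(.*)$", prompt_user)
    if match is None:
        raise ValueError("prompt_user must match ^username:.*")
    messages = []
    # Assume the history alternates roles, user turn first.
    for i, text in enumerate(dialog):
        role = "user" if i % 2 == 0 else "assistant"
        messages.append({"role": role, "content": text})
    messages.append({"role": "user", "content": match.group(2).strip()})
    return json.dumps({"user": match.group(1), "messages": messages})
```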

generate(prompt: str)

Generate a streaming completion.
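In a streaming completion, tokens are forwarded to a handler as they arrive rather than returned in one batch. A minimal sketch of that loop, with the model stubbed out as any callable returning an iterable of tokens (the real node talks to llama.cpp instead):

```python
def stream_completion(prompt: str, model, token_handler) -> str:
    """Drive a token-by-token completion loop.

    Sketch only: `model` is any callable mapping a prompt to an iterable
    of tokens. Each token is pushed to `token_handler` as it arrives, and
    the joined text is returned once the stream ends.
    """
    pieces = []
    for token in model(prompt):
        token_handler(token)  # deliver immediately, before the stream ends
        pieces.append(token)
    return "".join(pieces)
```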

session() → None

Process incoming queue messages.
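The session runs on its own thread (see the session_thread attribute) and drains queued messages. A simplified stdlib stand-in for that pattern, where `handle` replaces the real node's call into generate() and a None item is used as a shutdown sentinel (both assumptions for illustration):

```python
import queue
import threading


def run_session(messages: "queue.Queue", handle) -> threading.Thread:
    """Drain a message queue on a worker thread, calling `handle` per item.

    Sketch only: a None item stops the loop, standing in for whatever
    shutdown signal the real node uses in destroy().
    """
    def session():
        while True:
            msg = messages.get()
            if msg is None:  # sentinel: stop the worker
                break
            handle(msg)

    thread = threading.Thread(target=session, daemon=True)
    thread.start()
    return thread
```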

token_handler(token: str) → None

Publishes to the generator and sentence publishers.
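Given the sentence attribute and the separate pub_generator/pub_sentence publishers, a plausible reading is that every token is forwarded immediately while complete sentences are buffered and emitted separately. A dependency-free sketch of that buffering (the terminator set and the list stand-ins for the ROS publishers are assumptions):

```python
class SentenceBuffer:
    """Accumulate streamed tokens and emit complete sentences.

    Sketch only: the real node publishes to ROS topics; here the two
    publishers are replaced by plain lists so the logic is testable.
    """

    TERMINATORS = ".!?"

    def __init__(self):
        self.sentence = ""    # mirrors the node's `sentence` attribute
        self.tokens = []      # stand-in for the generator publisher
        self.sentences = []   # stand-in for the sentence publisher

    def token_handler(self, token: str) -> None:
        self.tokens.append(token)  # every token goes out immediately
        self.sentence += token
        if token.rstrip().endswith(tuple(self.TERMINATORS)):
            self.sentences.append(self.sentence.strip())
            self.sentence = ""     # start buffering the next sentence
```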

publish(data: str, pub) → None

Publish data to the given String topic publisher.

print(s: str)

Print to stdout.

bob_llama_cpp.llm.main(args=None)