bob_llama_cpp.prompt_tools

Attributes

tool_calls

tool_functions

model_default

tokenizer

Functions

generate_tool_results(→ str)

Generate a tool result response from the tool_calls list.

parse_tool_calls(→ str)

Parse text for JSON arrays; multiple arrays are merged together.

detect_and_process_tool_calls(→ int)

Check if tool calls exist in the response and execute any that match a tool function.

import_module_from_path(→ Any)

Dynamically import a module from a file path.

apply_chat_template(→ str)

Generate a system prompt with the available tools using the HF AutoTokenizer apply_chat_template function.

Module Contents

bob_llama_cpp.prompt_tools.tool_calls = []
bob_llama_cpp.prompt_tools.tool_functions
bob_llama_cpp.prompt_tools.model_default = 'mistralai/Mistral-7B-Instruct-v0.3'
bob_llama_cpp.prompt_tools.tokenizer = None
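
A quick look at the module defaults (a hypothetical interactive session; tool_calls starts empty and tokenizer unset until the functions below populate them):

>>> from bob_llama_cpp import prompt_tools
>>> prompt_tools.model_default
'mistralai/Mistral-7B-Instruct-v0.3'
>>> prompt_tools.tool_calls
[]
>>> prompt_tools.tokenizer is None
True
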
bob_llama_cpp.prompt_tools.generate_tool_results() → str

Generate a tool result response from the tool_calls list. If the global module variable tokenizer is not initialized, the tool results are generated in llama2 format.

Returns:

The string with the generated tool results, or None if there were no tool calls to process.

Return type:

str
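
A minimal usage sketch (hypothetical; the exact shape of the tool_calls entries is an assumption, following the common name/arguments JSON convention):

>>> from bob_llama_cpp import prompt_tools
>>> # assumed entry shape; the real structure depends on the model's tool-call JSON
>>> prompt_tools.tool_calls.append(
...     {"name": "get_weather", "arguments": {"city": "Oslo"}})
>>> results = prompt_tools.generate_tool_results()  # llama2 format while tokenizer is None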

bob_llama_cpp.prompt_tools.parse_tool_calls(text: str, calls: list) → str

Parse text for JSON arrays; multiple arrays are merged together into the calls argument.

Parameters:
  • text (str) – The string containing the JSON array(s)

  • calls (list) – The list to which parsed array items are appended when a JSON array is found

Returns:

The remaining text with the JSON array(s) removed

Return type:

str
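
For example (illustrative payload; the exact element shape depends on what the model emits):

>>> calls = []
>>> text = 'Adding: [{"name": "add", "arguments": {"a": 1, "b": 2}}] done'
>>> remaining = prompt_tools.parse_tool_calls(text, calls)
>>> len(calls)  # the single array item was appended to calls
1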

bob_llama_cpp.prompt_tools.detect_and_process_tool_calls(response: str) → int

Check if tool calls exist in the response. If one or more are found and they match a tool function, it will try to execute them. If the execution fails, an exception will be raised. The tool detection works with [] and <tool_call></tool_call> pairs.

Parameters:

response (str) – The text in which to look for JSON function tool calls

Returns:

The number of tool functions called

Return type:

int
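
A hedged sketch of a call (hypothetical; assumes a matching tool function is available in tool_functions):

>>> response = '<tool_call>{"name": "add", "arguments": {"a": 1, "b": 2}}</tool_call>'
>>> try:
...     count = prompt_tools.detect_and_process_tool_calls(response)
... except Exception:
...     # raised when a matched tool call fails to execute
...     raise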

bob_llama_cpp.prompt_tools.import_module_from_path(module_name: str, path: str) → Any

Dynamically import a module from a file path.

Parameters:
  • module_name (str) – Name of the module

  • path (str) – Path to the Python module file

Returns:

The loaded python module

Return type:

Any
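
For instance (the module name and path are illustrative):

>>> tools = prompt_tools.import_module_from_path("my_tools", "/path/to/my_tools.py")
>>> # the returned object behaves like a normally imported module
>>> hasattr(tools, "add")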

bob_llama_cpp.prompt_tools.apply_chat_template(conversation: list = None, model_id: str = None) → str

Generate a system prompt with the available tools using the HF AutoTokenizer apply_chat_template function. It takes the model-specific prompt format into account.

Parameters:
  • conversation (list, optional) – List of role/content dicts; defaults to None

  • model_id (str, optional) – The Hugging Face model_id to be used by the tokenizer; defaults to None

Returns:

The generated prompt

Return type:

str
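
A usage sketch (the conversation is illustrative; the model_id shown is the module default, and the tokenizer for it is fetched from the Hugging Face Hub on first use):

>>> conversation = [
...     {"role": "system", "content": "You are a helpful assistant."},
...     {"role": "user", "content": "What is the weather in Oslo?"},
... ]
>>> prompt = prompt_tools.apply_chat_template(
...     conversation=conversation,
...     model_id="mistralai/Mistral-7B-Instruct-v0.3",
... )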