llm_chat_with_tool_call_output: Chat with Tool Integration

Sends a chat request together with a set of tool specifications, allowing the language model to decide which tools to call based on the conversation context.

def llm_chat_with_tool_call_output(
    agent_name: str, 
    messages: List[Dict[str, Any]], 
    tools: List[Dict[str, Any]],
    base_url: str = aios_kernel_url,
    llms: List[Dict[str, Any]] = None
) -> LLMResponse

Parameters:

  • agent_name: Identifier for the agent making the request

  • messages: List of message dictionaries representing the conversation history

  • tools: List of available tools and their specifications (see the sketch after this list)

  • base_url: Base URL of the AIOS kernel API endpoint (defaults to aios_kernel_url)

  • llms: Optional list of LLM configurations
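
For illustration, a tools entry is sketched below using the widely used function-calling schema (a name, a description, and JSON-Schema parameters). This format is an assumption; the exact structure expected by the kernel may differ.

calculator_tool = {
    "type": "function",  # assumed OpenAI-style function tool schema
    "function": {
        "name": "calculator",
        "description": "Evaluate a basic arithmetic expression.",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {
                    "type": "string",
                    "description": "Expression to evaluate, e.g. '17 * 24'",
                }
            },
            "required": ["expression"],
        },
    },
}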

Returns:

  • LLMResponse object containing the tool calls selected by the model

Example:
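
A minimal usage sketch. The import path (cerebrum.llm.apis), the agent name, the message format, and the way tool calls are read off the LLMResponse are assumptions and may differ in your installation; calculator_tool is the hypothetical tool specification sketched above.

from cerebrum.llm.apis import llm_chat_with_tool_call_output  # assumed import path

# Conversation history in role/content form (assumed message format).
messages = [
    {"role": "system", "content": "You are a helpful assistant with access to tools."},
    {"role": "user", "content": "What is 17 * 24?"},
]

# Let the model decide whether and how to call the calculator tool.
response = llm_chat_with_tool_call_output(
    agent_name="example/math_agent",   # hypothetical agent identifier
    messages=messages,
    tools=[calculator_tool],
)

# Inspect the tool calls selected by the model; the attribute name
# "tool_calls" on LLMResponse is an assumption.
for call in getattr(response, "tool_calls", None) or []:
    print(call)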
