llm_call_tool: Direct Tool Invocation

Explicitly instructs the language model to invoke one or more specified tools in response to user input.

def llm_call_tool(
    agent_name: str,
    messages: List[Dict[str, Any]],
    tools: List[Dict[str, Any]],
    base_url: str = aios_kernel_url,
    llms: Optional[List[Dict[str, Any]]] = None
) -> LLMResponse

Parameters:

  • agent_name: Identifier for the agent making the request
  • messages: List of chat messages, each a dict with role and content keys
  • tools: List of tool specifications available to the model
  • base_url: Base URL of the AIOS kernel endpoint (defaults to aios_kernel_url)
  • llms: Optional list of LLM backend configurations (see the sketch below)
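
The llms argument can pin the request to particular model backends instead of relying on the kernel's default routing. A minimal sketch, assuming the same name/backend keys used in the AIOS kernel's LLM configuration (an assumption; adjust to match your deployment):

# Hypothetical backend override for the llms parameter; the
# "name"/"backend" keys mirror the kernel's LLM configuration format.
preferred_llms = [
    {"name": "gpt-4o-mini", "backend": "openai"},
]

Pass preferred_llms as the llms argument of llm_call_tool to restrict which backends serve the request.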

Returns:

  • LLMResponse object containing tool calls and their results

Example:

# Ask the model to answer a weather question using the forecast tool
# (import path assumes the Cerebrum / AIOS-Agent SDK package layout)
from cerebrum.llm.apis import llm_call_tool

response = llm_call_tool(
    "weather_agent",
    messages=[
        {"role": "user", "content": "What's the weather like in New York today?"}
    ],
    tools=[{
        "name": "weather_service/get_forecast",
        "description": "Get weather forecast for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "units": {"type": "string", "enum": ["metric", "imperial"]}
            },
            "required": ["location"]
        }
    }]
)

# Print the model's final message produced after the tool call
print(response["response"]["response_message"])
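
If the agent needs the raw tool invocations rather than the final message, they can be read from the same response. A short sketch, assuming the response dictionary exposes a tool_calls list alongside response_message (the key name is an assumption based on the Returns description above, not confirmed by this page):

# Inspect the tool calls the model made; "tool_calls" is assumed here
# from the LLMResponse description and may differ in your SDK version.
for call in response["response"].get("tool_calls", []):
    print(call)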