llm_chat: Basic Chat Interaction

Sends a conversation, given as a list of role-tagged messages, to the language model and returns its reply, enabling simple chat-style interactions.

def llm_chat(
    agent_name: str, 
    messages: List[Dict[str, Any]], 
    base_url: str = aios_kernel_url,
    llms: List[Dict[str, Any]] = None
) -> LLMResponse

Parameters:

  • agent_name: Identifier for the agent making the request

  • messages: List of message dictionaries, each with a "role" ("system", "user", or "assistant") and a "content" string

  • base_url: API endpoint URL (default: configured AIOS kernel URL)

  • llms: Optional list of LLM configurations to use (see the sketch after this list)
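
The shape of an individual LLM configuration is not spelled out here; the following is a minimal sketch, assuming each entry names a model and its backend. The keys "name" and "backend" and the values shown are illustrative assumptions, not a confirmed schema.

# Hypothetical llms configurations; adjust keys and values to match your AIOS setup.
custom_llms = [
    {"name": "gpt-4o-mini", "backend": "openai"},
    {"name": "llama3", "backend": "ollama"}
]

response = llm_chat(
    "my_assistant",
    messages=[{"role": "user", "content": "Hello!"}],
    llms=custom_llms
)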

Returns:

  • LLMResponse object containing the model's text response

Example:

# Basic chat interaction
response = llm_chat(
    "my_assistant",
    messages=[
        {"role": "system", "content": "You are a helpful AI assistant."},
        {"role": "user", "content": "What are the main features of Python?"}
    ]
)
print(response["response"]["response_message"])
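
To continue a conversation across turns, append the previous assistant reply to the messages list before the next call. A minimal sketch, assuming the reply text is read the same way as in the example above:

# Multi-turn sketch: carry the conversation history forward.
history = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "What are the main features of Python?"}
]
first = llm_chat("my_assistant", messages=history)

# Append the assistant's reply, then ask a follow-up question.
history.append({
    "role": "assistant",
    "content": first["response"]["response_message"]
})
history.append({"role": "user", "content": "Which of those help with data analysis?"})

follow_up = llm_chat("my_assistant", messages=history)
print(follow_up["response"]["response_message"])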
