LLM Core API
API for calling LLM Core(s)
The LLM Core API relies on three important components: LLMQuery, LLMResponse, and the send_request function.
The LLM's response follows the LLMResponse class structure.
Common rules for LLM Core APIs
When leveraging LLM Core APIs, you can specify which LLM(s) to use with the llms parameter.
Note that if you pass multiple LLMs as backends, AIOS automatically applies LLM routing to select one model from the LLMs you passed for each query. The selection is optimized jointly for the model's capability to solve the task and the cost of solving it. This feature is detailed in LLM router.