LiteLLM Compatible Backend
Last updated
LiteLLM is a unified interface that offers a consistent API across many different LLM providers. Supported model providers can be found at .
LiteLLM compatible backends are initialized using string identifiers for models in the format {provider}/{model}, such as:
openai/gpt-4o-mini
anthropic/claude-3.5-sonnet
gemini/gemini-1.5-flash
In the code, these models are initialized and stored as string identifiers in the llms array:
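A minimal sketch of what such an array might look like (the llms name comes from the text above; the concrete model strings are the examples listed earlier):

```python
# Model identifiers in the {provider}/{model} format, stored as plain strings.
llms = [
    "openai/gpt-4o-mini",
    "anthropic/claude-3.5-sonnet",
    "gemini/gemini-1.5-flash",
]

# Each identifier splits into a provider prefix and a model name.
providers = [llm.split("/", 1)[0] for llm in llms]
```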
Standard Text Input
For standard text generation, LiteLLM backends process requests using the completion() function.
The system passes the messages array, temperature, and max_tokens parameters to the completion function.
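As a sketch of that request assembly (the helper name build_completion_kwargs is hypothetical; litellm.completion() itself is the real entry point), the parameters might be collected like this:

```python
def build_completion_kwargs(model: str, messages: list,
                            temperature: float = 0.7,
                            max_tokens: int = 256) -> dict:
    """Collect the parameters described above into the kwargs
    that would be forwarded to litellm.completion(**kwargs)."""
    return {
        "model": model,                # e.g. "openai/gpt-4o-mini"
        "messages": messages,          # OpenAI-style chat messages
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

kwargs = build_completion_kwargs(
    "openai/gpt-4o-mini",
    [{"role": "user", "content": "Hello!"}],
)
# response = litellm.completion(**kwargs)  # requires provider API keys
```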
Tool Calls
When processing requests with tools, the tool definitions are added to the completion parameters:
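A hedged sketch of attaching a tool definition (the get_weather tool is a hypothetical example; the schema is the OpenAI function-calling format that LiteLLM's tools parameter accepts):

```python
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def with_tools(kwargs: dict, tools: list) -> dict:
    """Return a copy of the completion parameters with tools attached."""
    out = dict(kwargs)
    out["tools"] = tools
    return out

kwargs = with_tools(
    {"model": "openai/gpt-4o-mini",
     "messages": [{"role": "user", "content": "Weather in Paris?"}]},
    [weather_tool],
)
```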
JSON-Formatted Responses
For JSON-formatted responses, the adapter adds the format parameter:
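A sketch, assuming the adapter maps its format option onto LiteLLM's OpenAI-style response_format parameter:

```python
def with_json_format(kwargs: dict) -> dict:
    """Return a copy of the completion parameters requesting JSON output."""
    out = dict(kwargs)
    # OpenAI-style convention accepted by litellm.completion()
    out["response_format"] = {"type": "json_object"}
    return out

kwargs = with_json_format(
    {"model": "openai/gpt-4o-mini",
     "messages": [{"role": "user", "content": "Reply in JSON."}]})
```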
Note that some providers, such as OpenAI and Anthropic, do not allow the "/" character in tool names. Therefore, a conversion function is required to rewrite tool names before they are passed.
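One possible conversion (a sketch; the separator choice is an assumption, and the original function's name is not shown in this excerpt):

```python
SEPARATOR = "__"  # assumed replacement for "/"; must not occur in real names

def sanitize_tool_name(name: str) -> str:
    """Replace "/" so providers like OpenAI and Anthropic accept the name."""
    return name.replace("/", SEPARATOR)

def restore_tool_name(name: str) -> str:
    """Invert sanitize_tool_name when reading tool calls back."""
    return name.replace(SEPARATOR, "/")
```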
The function processes the raw response to extract and format the tool calls.
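A sketch of that post-processing, assuming an OpenAI-shaped chat-completion payload like the one litellm.completion() returns (the dict access pattern here is illustrative):

```python
import json

def extract_tool_calls(response: dict) -> list:
    """Pull tool calls out of a raw chat-completion response and
    decode their JSON-encoded argument strings."""
    message = response["choices"][0]["message"]
    extracted = []
    for call in message.get("tool_calls") or []:
        extracted.append({
            "name": call["function"]["name"],
            "arguments": json.loads(call["function"]["arguments"]),
        })
    return extracted

# Example with a fabricated response payload:
raw = {"choices": [{"message": {"tool_calls": [
    {"function": {"name": "get_weather",
                  "arguments": "{\"city\": \"Paris\"}"}}]}}]}
calls = extract_tool_calls(raw)
```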