LiteLLM Compatible Backend
LiteLLM is a unified interface for various LLM providers, offering a consistent API across different model providers. Supported model providers can be found at https://docs.litellm.ai/docs/providers.
LiteLLM-compatible backends are initialized using string identifiers for models in the format {provider}/{model}, such as:
openai/gpt-4o-mini
anthropic/claude-3.5-sonnet
gemini/gemini-1.5-flash
In the code, these models are initialized and stored as string identifiers in the llms array:
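A minimal sketch of what such an array might look like; the exact variable name and structure in the codebase are assumptions, but the identifiers follow the {provider}/{model} format described above:

```python
# Model identifiers in {provider}/{model} form, as accepted by LiteLLM.
llms = [
    "openai/gpt-4o-mini",
    "anthropic/claude-3.5-sonnet",
    "gemini/gemini-1.5-flash",
]

# The provider prefix can be split off when needed:
provider, model = llms[0].split("/", 1)
```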
Standard Text Input
For standard text generation, LiteLLM backends process requests using the completion() function:
The system passes the messages array, temperature, and max_tokens parameters to the completion function.
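The parameters above can be assembled as keyword arguments for litellm.completion(). The helper name below is hypothetical; only the parameter names (model, messages, temperature, max_tokens) come from the text:

```python
def build_completion_kwargs(model, messages, temperature=0.0, max_tokens=1024):
    # Collect the parameters that are forwarded to litellm.completion().
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

kwargs = build_completion_kwargs(
    "openai/gpt-4o-mini",
    [{"role": "user", "content": "Hello!"}],
    temperature=0.7,
    max_tokens=256,
)

# The actual call requires an API key for the chosen provider:
# import litellm
# response = litellm.completion(**kwargs)
```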
Tool Calls
When processing requests with tools, the tool definitions are added to the completion parameters:
Note that some providers, such as OpenAI and Anthropic, do not allow the "/" character in tool names. Therefore, the slash_to_double_underscore function is used to convert tool names before they are passed to the provider.
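A sketch of what that conversion might look like; the actual implementation of slash_to_double_underscore is not shown in the source, and the inverse helper is hypothetical (the round trip assumes original names never contain "__"):

```python
def slash_to_double_underscore(name: str) -> str:
    # Providers such as OpenAI and Anthropic reject "/" in tool names,
    # so replace it with "__" before sending the tool definition.
    return name.replace("/", "__")

def double_underscore_to_slash(name: str) -> str:
    # Hypothetical inverse: restore the original "/"-separated name.
    return name.replace("__", "/")
```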
The decode_litellm_tool_calls() function processes the raw response to extract and format the tool calls.
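A sketch of what such decoding could involve; the real signature of decode_litellm_tool_calls() is not shown in the source, and the dict-shaped response below is a simplified stand-in for LiteLLM's OpenAI-style response object:

```python
import json

def decode_tool_calls(response):
    # Pull tool calls out of the first choice and normalize each one,
    # undoing the "/" -> "__" substitution applied to tool names.
    message = response["choices"][0]["message"]
    calls = []
    for call in message.get("tool_calls") or []:
        fn = call["function"]
        calls.append({
            "id": call["id"],
            "name": fn["name"].replace("__", "/"),
            "arguments": json.loads(fn["arguments"]),
        })
    return calls

# A fake OpenAI-style response for illustration:
fake_response = {
    "choices": [{
        "message": {
            "tool_calls": [{
                "id": "call_1",
                "function": {
                    "name": "fs__read_file",
                    "arguments": '{"path": "a.txt"}',
                },
            }],
        },
    }],
}
```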
JSON-Formatted Responses
For JSON-formatted responses, the adapter adds the format parameter:
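A sketch of adding that parameter; the field name response_format with {"type": "json_object"} is the OpenAI-style convention LiteLLM accepts for many providers, but whether the adapter uses exactly this name is an assumption:

```python
# Base completion parameters, as in the standard text case.
params = {
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Reply in JSON."}],
}

# Request JSON-formatted output (OpenAI-style field, assumed here).
params["response_format"] = {"type": "json_object"}
```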