vLLM Backend
vLLM backends are handled using the OpenAI client class due to compatibility issues with LiteLLM.
These backends are initialized as OpenAI client instances with a custom base URL:
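A minimal sketch (the localhost URL and port are assumptions; any vLLM OpenAI-compatible endpoint works):

```python
from openai import OpenAI

# Assumed: a vLLM server started with `vllm serve ...`, which exposes an
# OpenAI-compatible API at http://localhost:8000/v1 by default.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="sk-1234",  # dummy key; vLLM does not validate it by default
)
```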
Note that a dummy API key ("sk-1234") must be supplied because the OpenAI client requires one, even though these backends typically don't require authentication when run locally.
Standard Text Input
For standard text input, the OpenAI client is used directly:
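A hedged sketch of a plain completion request; the model name is illustrative and should match whatever model the vLLM server is serving:

```python
# Standard text input: a regular chat completion call against the
# vLLM endpoint, identical to calling the hosted OpenAI API.
response = client.chat.completions.create(
    model="meta-llama/Llama-3-8B-Instruct",  # illustrative model name
    messages=[{"role": "user", "content": "Summarize vLLM in one sentence."}],
)
print(response.choices[0].message.content)
```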
Tool Calls
When handling tool calls with OpenAI client-based backends, the response is decoded with the same decode_litellm_tool_calls function used for LiteLLM backends, as sketched below.
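A sketch under stated assumptions: the tool schema is hypothetical, and the exact signature of decode_litellm_tool_calls is whatever the project's LiteLLM path defines, so the decode step is shown as a comment:

```python
# Hypothetical tool definition using the OpenAI tools schema.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="meta-llama/Llama-3-8B-Instruct",  # illustrative model name
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=tools,
)

# The raw response is then normalized with the same helper used for the
# LiteLLM backends (call signature assumed):
# tool_calls = decode_litellm_tool_calls(response)
```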
JSON-Formatted Responses
JSON formatting uses the same approach as standard OpenAI clients:
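A minimal sketch, assuming the served model supports vLLM's OpenAI-compatible response_format parameter:

```python
# Request JSON-mode output exactly as with the hosted OpenAI API.
response = client.chat.completions.create(
    model="meta-llama/Llama-3-8B-Instruct",  # illustrative model name
    messages=[
        {"role": "user", "content": "Return a JSON object with keys "
                                    "'city' and 'country' for Paris."},
    ],
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)  # a JSON string
```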