vLLM Backend
vLLM backends are handled using the OpenAI client class due to compatibility issues with LiteLLM.
These backends are initialized as OpenAI client instances with a custom base URL:
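A minimal sketch of that setup (the host, port, and `/v1` path are assumptions based on vLLM's default OpenAI-compatible server; adjust them for your deployment):

```python
# Connection settings for an OpenAI client pointed at a local vLLM server.
# Host, port, and the "/v1" path are placeholders for your deployment.
def vllm_client_kwargs(host: str = "localhost", port: int = 8000) -> dict:
    return {
        "base_url": f"http://{host}:{port}/v1",
        # Dummy key: the client constructor requires one, but a local
        # vLLM server typically ignores it.
        "api_key": "sk-1234",
    }

# Usage (requires the `openai` package and a running vLLM server):
#   from openai import OpenAI
#   client = OpenAI(**vllm_client_kwargs())
```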
Note that a dummy API key (e.g., "sk-1234") must still be supplied when constructing the OpenAI client: the client requires an API key argument, even though locally hosted vLLM backends typically do not enforce authentication.
Standard Text Input
For standard text input, the OpenAI client is used directly:
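A sketch of the request shape (the helper and model name are illustrative, not part of the codebase; vLLM expects the model name the server was launched with):

```python
# Shape of a plain chat-completion request for a vLLM-served model.
def build_text_request(prompt: str, model: str = "my-model") -> dict:
    return {
        "model": model,  # placeholder; use your server's model name
        "messages": [{"role": "user", "content": prompt}],
    }

# Usage with an initialized client:
#   response = client.chat.completions.create(**build_text_request("Hello"))
#   text = response.choices[0].message.content
```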
Tool Calls
Tool calls with OpenAI client-based backends follow the standard OpenAI tool-calling flow: tool definitions are passed on the request, and any resulting tool calls are read from the response message.
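As a sketch, assuming OpenAI-style tool schemas (the weather tool and helper are purely illustrative):

```python
# Attach OpenAI-style tool definitions to a chat-completion request.
# Resulting calls appear on response.choices[0].message.tool_calls.
def build_tool_request(prompt: str, tools: list, model: str = "my-model") -> dict:
    return {
        "model": model,  # placeholder; use your server's model name
        "messages": [{"role": "user", "content": prompt}],
        "tools": tools,
        "tool_choice": "auto",  # let the model decide whether to call a tool
    }

# An illustrative tool definition in the OpenAI function-calling schema.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
```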
JSON-Formatted Responses
JSON formatting uses the same approach as standard OpenAI clients:
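A sketch using the OpenAI `response_format` option (the helper and model name are illustrative):

```python
# Request a JSON-formatted response via the OpenAI `response_format` option.
def build_json_request(prompt: str, model: str = "my-model") -> dict:
    return {
        "model": model,  # placeholder; use your server's model name
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},
    }

# The returned message content is a JSON string that can be parsed with
# json.loads, matching the handling used for LiteLLM backends.
```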
The response is processed using the same function as for LiteLLM backends.