Replies: 1 comment
This is a very interesting feature to add.
Summary
I would like to request support for LiteLLM as a global model provider in Langflow, alongside the existing providers such as Ollama, IBM, and OpenAI.
This would allow users to configure LiteLLM centrally in the Global Model Providers section and use it across flows, including for building Knowledge Bases powered by models served through LiteLLM.
Background
Langflow recently introduced Global Model Providers, allowing language model providers to be configured centrally and reused across the entire Langflow instance.
Instead of managing provider credentials and settings inside individual LLM components, users can now configure a provider once and reuse it across the entire instance.
Currently supported providers include Ollama, IBM, and OpenAI.
However, LiteLLM is not yet available as a global model provider.
Motivation
LiteLLM is widely used as a unified gateway to multiple model backends (OpenAI-compatible APIs, Azure, Anthropic, local models, etc.). Supporting LiteLLM as a global provider would bring all of those backends under Langflow's centralized provider configuration at once.
This is especially valuable for teams using LiteLLM as an abstraction layer to manage cost control, logging, routing, or multi-model strategies.
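To illustrate why a single provider entry covers so many backends: a LiteLLM proxy is typically configured with a small YAML file that maps public model names to different upstream providers. The sketch below uses placeholder model names and environment variables; the exact entries would depend on each team's setup.

```yaml
# Hypothetical LiteLLM proxy config: two public model names
# routed to different backends behind one OpenAI-compatible endpoint.
model_list:
  - model_name: gpt-4o                  # name a Langflow flow would select
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: local-llama             # placeholder for a local model
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
```

With a setup like this, Langflow would only need the proxy's base URL and key, essentially the same shape of settings the existing OpenAI provider already uses.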
Proposed Enhancement
Add LiteLLM as a selectable provider in the Global Model Providers configuration, configured the same way as the existing providers.
Ideally, LiteLLM would be usable anywhere other providers (OpenAI, Ollama, IBM) are currently supported, including Knowledge Base creation and individual LLM components in flows.
Expected Impact
Thank you for considering this feature request!