fix: set litellm.drop_params to support non-OpenAI providers (#63)#85

Open
smwaqas89 wants to merge 1 commit into lm-sys:main from smwaqas89:fix/drop-unsupported-litellm-params
Conversation

@smwaqas89
Summary

Fixes #63. Also addresses the issue described in #32.

Problem

When using non-OpenAI providers (Ollama, Groq, local models) as the weak model, litellm raises UnsupportedParamsError because parameters such as presence_penalty and frequency_penalty are not supported by every provider.

The ChatCompletionRequest model in openai_server.py defaults these parameters to 0.0, so model_dump(exclude_none=True) always includes them: they are 0.0, not None. When litellm forwards them to a provider that doesn't support them, the request crashes.
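A minimal pure-Python sketch of the root cause (this mimics the pydantic behavior with a plain dict; the field names are from the report above, the rest is illustrative):

```python
# The request model defaults the penalties to 0.0, not None.
request = {
    "model": "ollama/llama3",  # hypothetical non-OpenAI weak model
    "messages": [{"role": "user", "content": "hi"}],
    "presence_penalty": 0.0,   # default is 0.0, NOT None
    "frequency_penalty": 0.0,
}

# exclude_none-style filtering keeps the penalties, because 0.0 is not None,
# so they are always forwarded to the provider.
payload = {k: v for k, v in request.items() if v is not None}

assert "presence_penalty" in payload
assert "frequency_penalty" in payload
```

This is why the crash occurs even when the user never sets these parameters: the defaults survive serialization and reach the provider.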

Solution

Set litellm.drop_params = True at module level in controller.py. This is litellm's built-in mechanism for handling provider compatibility: parameters a provider does not support are silently dropped instead of raising an error.

This enables RouteLLM to work seamlessly with any provider supported by litellm (Ollama, Groq, Together AI, local models, etc.) without requiring users to manually configure parameter handling.

Changes

  • routellm/controller.py: Added import litellm and set litellm.drop_params = True
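Conceptually, drop_params filters the request down to what the target provider accepts. The sketch below is a pure-Python illustration of that idea, not litellm's internals; the provider/parameter table and the helper name are hypothetical:

```python
# Hypothetical per-provider allowlist, for illustration only.
SUPPORTED_PARAMS = {
    "ollama": {"model", "messages", "temperature", "max_tokens"},
}

def drop_unsupported(provider: str, params: dict) -> dict:
    """Keep only the parameters the provider accepts (what drop_params achieves)."""
    allowed = SUPPORTED_PARAMS[provider]
    return {k: v for k, v in params.items() if k in allowed}

cleaned = drop_unsupported("ollama", {
    "model": "ollama/llama3",
    "messages": [{"role": "user", "content": "hi"}],
    "presence_penalty": 0.0,  # would trigger UnsupportedParamsError if forwarded
})

assert "presence_penalty" not in cleaned
assert cleaned["model"] == "ollama/llama3"
```

With the flag set, the unsupported penalties are stripped before the provider call, so the same request that previously crashed now succeeds.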

Testing

  • Verified drop_params flag is set on module load
  • Tested with Ollama as the weak model: no more UnsupportedParamsError
  • No impact on OpenAI/Anyscale providers (they already support these params)

Fixes lm-sys#63. Also addresses the issue in lm-sys#32.

When using providers like Ollama or Groq as the weak model,
litellm raises UnsupportedParamsError for parameters like
presence_penalty that these providers don't support.

Setting litellm.drop_params = True causes litellm to silently
drop unsupported parameters, enabling seamless routing to any
provider supported by litellm.


Development

Successfully merging this pull request may close these issues.

litellm.drop_params error when running the openapi server
