
[Feature] Add LoRA and chat_template_kwargs support to VLLMwithChatTemplate #2310

Merged
mzr1996 merged 2 commits into open-compass:main from Jensen246:pr-lora-support
Nov 26, 2025
Conversation

@Jensen246
Contributor

Motivation

  • VLLMwithChatTemplate currently doesn't support LoRA evaluation
  • Cannot customize chat template parameters for model-specific features

Changes

  • Add lora_path parameter to support LoRA adapter evaluation
  • Add chat_template_kwargs to enable chat template customization
  • Fully backward compatible with existing configurations

Use Cases

  1. Evaluate LoRA fine-tuned chat models
  2. Pass template kwargs for model-specific modes, e.g. Qwen3 thinking mode: chat_template_kwargs={'enable_thinking': True}
  3. Support other model-specific template features
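The use cases above can be sketched as a model config. This is a hypothetical sketch only: the exact parameter names come from this PR's description (lora_path, chat_template_kwargs), but the surrounding config shape, model path, and adapter path are assumed for illustration and not taken from the actual diff.

```python
# Hypothetical OpenCompass-style model config (plain dict sketch; field
# names other than lora_path / chat_template_kwargs are assumptions).
model_cfg = dict(
    type="VLLMwithChatTemplate",        # model wrapper this PR extends
    path="Qwen/Qwen3-8B",               # example base model, assumed
    lora_path="/path/to/lora_adapter",  # new: LoRA adapter to evaluate
    chat_template_kwargs=dict(
        enable_thinking=True,           # new: e.g. Qwen3 thinking mode
    ),
)

# Backward compatibility: existing configs omit both new keys and
# behave exactly as before.
legacy_cfg = dict(
    type="VLLMwithChatTemplate",
    path="Qwen/Qwen3-8B",
)

print(sorted(model_cfg))
```

Keeping both parameters optional is what makes the change fully backward compatible: a config without them is indistinguishable from one written before this PR.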

- Add lora_path parameter to support LoRA adapter evaluation
- Add chat_template_kwargs parameter for chat template customization
- Enable control of model-specific template features (e.g. Qwen3 thinking mode)
- Fully backward compatible with existing configurations
@mzr1996 mzr1996 merged commit 31fd7ac into open-compass:main Nov 26, 2025
7 of 8 checks passed
iamkaia pushed a commit to iamkaia/opencompass that referenced this pull request Feb 4, 2026
…te (open-compass#2310)

