
[SW-225282] - Handle Batch Dimension for LoRA#1182

Merged
vivekgoe merged 2 commits into habana_main from dev/hlahkar/lora_rebase
May 5, 2025

Conversation

@hlahkar

@hlahkar hlahkar commented Apr 29, 2025

CustomOp Implementation for:

  1. ColumnParallelLinearWithLoRA
  2. RowParallelLinearWithLoRA
  3. VocabParallelEmbeddingWithLoRA

to support HPU, since HPU does not support flat tensors.
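The idea can be illustrated with a minimal sketch (the function name and argument shapes below are illustrative assumptions, not the actual CustomOp signatures from this PR): instead of flattening the (batch, seq, hidden) activations into a flat token tensor and gathering per-token adapter indices, the batch dimension is preserved and the LoRA A/B matmuls are applied per sequence with batched matmuls.

```python
import torch

def lora_forward_batched(x, lora_a, lora_b, lora_index, base_out):
    # Hypothetical sketch of batched LoRA application with the batch dim kept.
    # x:          (batch, seq, hidden) input activations
    # lora_a:     (num_loras, hidden, rank) stacked LoRA A weights
    # lora_b:     (num_loras, rank, out_features) stacked LoRA B weights
    # lora_index: (batch,) long tensor, which adapter each sequence uses
    # base_out:   (batch, seq, out_features) output of the base linear layer
    a = lora_a[lora_index]                  # (batch, hidden, rank)
    b = lora_b[lora_index]                  # (batch, rank, out_features)
    delta = torch.bmm(torch.bmm(x, a), b)   # batched matmuls, no flattening
    return base_out + delta
```

A flat-tensor formulation would instead view x as (batch*seq, hidden) and gather a per-token adapter index; keeping the batch dimension avoids that flattening step on HPU.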

@hlahkar hlahkar changed the title from [SW-225433] - Handle Batch Dimension for LoRA to [SW-225282] - Handle Batch Dimension for LoRA on Apr 29, 2025
@michalkuligowski

/run-gaudi-tests

@vivekgoe vivekgoe force-pushed the dev/hlahkar/lora_rebase branch from e27a9eb to 44cc37f on April 30, 2025 05:01
@hlahkar hlahkar force-pushed the dev/hlahkar/lora_rebase branch from 44cc37f to a6e7580 on May 2, 2025 07:54
@vivekgoe

vivekgoe commented May 2, 2025

/run-gaudi-tests

@vivekgoe vivekgoe enabled auto-merge (squash) May 2, 2025 10:06
@michalkuligowski

/skip-gaudi-tests - multimodal tests pass, it's a false negative

@vivekgoe vivekgoe merged commit 38186e1 into habana_main May 5, 2025
40 of 42 checks passed
@vivekgoe vivekgoe deleted the dev/hlahkar/lora_rebase branch May 5, 2025 07:40