Commit c9fcb7e

niehen6174 authored and committed
[diffusion] lora: fix LoRA dtype handling and weight attribute access for z-image model (sgl-project#14543)
Co-authored-by: niehen6174 <nihen6174@gmail.com>
1 parent a8e80dc commit c9fcb7e

File tree

  • python/sglang/multimodal_gen/runtime/layers/lora

1 file changed: +9 −1 lines changed

python/sglang/multimodal_gen/runtime/layers/lora/linear.py

Lines changed: 9 additions & 1 deletion

@@ -59,6 +59,14 @@ def __init__(
         self.lora_A = None
         self.lora_B = None
 
+    @property
+    def weight(self):
+        return self.base_layer.weight
+
+    @property
+    def bias(self):
+        return getattr(self.base_layer, "bias", None)
+
     @torch.compile()
     def forward(self, x: torch.Tensor) -> torch.Tensor:
         lora_A = self.lora_A
@@ -79,7 +87,7 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
             return out + delta, output_bias
         else:
             out, output_bias = self.base_layer(x)
-            return out.to(x), output_bias
+            return out, output_bias
 
     def slice_lora_a_weights(self, A: torch.Tensor) -> torch.Tensor:
         return A
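The added `weight` and `bias` properties follow a common delegation pattern: the LoRA wrapper forwards attribute access to the wrapped base layer, so code that reads `layer.weight` (for dtype or shape checks, weight loading, etc.) keeps working. A minimal sketch of this pattern, using plain Python stand-ins (the `BaseLinear` class and its attributes here are hypothetical, not part of the sglang codebase):

```python
class BaseLinear:
    """Stand-in for the wrapped base layer (e.g. an nn.Linear)."""

    def __init__(self, weight, bias=None):
        self.weight = weight
        if bias is not None:
            # Some base layers are created without a bias attribute at all,
            # which is why the wrapper uses getattr with a default below.
            self.bias = bias


class LoRALinearSketch:
    """Illustrates the property delegation added in this commit."""

    def __init__(self, base_layer):
        self.base_layer = base_layer
        self.lora_A = None
        self.lora_B = None

    @property
    def weight(self):
        # Delegate so callers inspecting `layer.weight` see the base
        # layer's tensor rather than hitting an AttributeError.
        return self.base_layer.weight

    @property
    def bias(self):
        # Tolerates base layers that were built without a bias.
        return getattr(self.base_layer, "bias", None)


layer = LoRALinearSketch(BaseLinear(weight=[1.0, 2.0]))
print(layer.weight)  # [1.0, 2.0]
print(layer.bias)    # None
```

Because these are read-only properties, the wrapper exposes the base layer's parameters without duplicating them, and the `getattr` default keeps `bias` access safe for bias-free layers.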

0 commit comments