
safetensors in subfolder not supported #1154

@yxli2123

Description


System Info

When I put adapter_model.safetensors in a subfolder of a Hugging Face Hub repository, for example LoftQ/Llama-2-7b-hf-4bit-64rank, PeftModel.from_pretrained("LoftQ/Llama-2-7b-hf-4bit-64rank", subfolder="loftq_init") is not able to find adapter_model.safetensors; it only supports adapter_model.bin. It would be great if you could support safetensors, since PeftModel.save_pretrained() saves in the safetensors format automatically. Thank you~

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

from transformers import (
    AutoConfig,
    AutoModelForCausalLM,
)
from peft import PeftModel


config = AutoConfig.from_pretrained("LoftQ/Llama-2-13b-hf-4bit-64rank")
model = AutoModelForCausalLM.from_pretrained(
            "LoftQ/Llama-2-7b-hf-4bit-64rank",
            config=config,
            )

peft_model = PeftModel.from_pretrained(model, "LoftQ/Llama-2-13b-hf-4bit-64rank", subfolder="loftq_init", is_trainable=True)

Expected behavior

peft_model = PeftModel.from_pretrained(model, "LoftQ/Llama-2-13b-hf-4bit-64rank", subfolder="loftq_init", would find the adapter_model.safetensors automatically.
