How do you handle cases where you already have LoRA weights and want to re-apply them to the model?


I see the model = FastLanguageModel.get_peft_model(...) method, but that seems to initialize brand-new LoRA weights.

What about cases where you already have the LoRA weights saved separately?

Would you load the base model with FastLanguageModel, then use model = PeftModel.from_pretrained(model, ...)?
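For reference, a minimal sketch of that approach, assuming the adapter was previously saved with save_pretrained (the model name and adapter path below are placeholders):

```python
from unsloth import FastLanguageModel
from peft import PeftModel

# Load the base model only -- not get_peft_model(), which would
# initialize fresh, randomly-initialized LoRA weights.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # placeholder base model
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach the previously saved LoRA adapter on top of the base model.
model = PeftModel.from_pretrained(model, "path/to/saved_lora_adapter")
```

This wraps the base model with the existing adapter weights instead of creating new ones, so fine-tuned behavior should be restored without re-training.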