Hi,
Let's say I have several LoRA adapters for Llama3. First, I merged them into the base model using merge_and_unload, and then I created a new LoRA adapter on top of the merged model with get_peft_model. If I train this new PEFT model and save a checkpoint, does inference require all the LoRA adapters I merged, the final saved LoRA adapter, and Llama3? Or does it just need the Llama3 model and the final saved LoRA adapter? Thank you guys.
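
For reference, this is roughly what I'm doing; the adapter paths, LoRA hyperparameters, and output directory below are placeholders, not my actual setup:

```python
# Rough sketch of the workflow described above. Adapter paths, LoRA
# hyperparameters, and the output directory are placeholders.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PeftModel, get_peft_model

base = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")

# Fold each existing LoRA adapter into the base weights, one after another.
for adapter_path in ["lora_adapter_a", "lora_adapter_b"]:
    wrapped = PeftModel.from_pretrained(base, adapter_path)
    base = wrapped.merge_and_unload()  # returns the base model with the adapter merged in

# Create a fresh LoRA adapter on top of the merged weights.
new_lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
peft_model = get_peft_model(base, new_lora_config)

# ... training loop for the new adapter ...

peft_model.save_pretrained("final_lora_checkpoint")  # save a checkpoint of the new adapter
```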