Issues: huggingface/peft
#2296: load_adapter fails with "Target module is not supported" when using Qwen2-VL (opened Dec 24, 2024 by bigmouthbabyguo-530)
#2295: PEFT model doesn't update parameters after the LoRA config is changed (opened Dec 23, 2024 by d-kleine)
#2292: Cannot import name 'EncoderDecoderCache' from 'transformers' (opened Dec 21, 2024 by Huang-jia-xuan)
#2291: get_peft_model() adds unwanted arguments to CLIPModel (opened Dec 20, 2024 by TimonKaeppel)
#2289: Inconsistent Parameter Mismatches After Merging PEFT and Base Models (opened Dec 19, 2024 by enhulu-ms)
#2283: TypeError when running inference with different LoRA adapters in the same batch (opened Dec 15, 2024 by yuxiang-guo)
#2281: Incompatibility of X-LoRA and MistralForSequenceClassification (opened Dec 13, 2024 by cyx96)
#2275: TypeError: LoraConfig.__init__() got an unexpected keyword argument 'eva_config' (opened Dec 11, 2024 by Mohankrish08; see the version-check sketch after this list)
#2270: Different results when predicting with multiple LoRA adapters in a loop vs. using only one LoRA (opened Dec 10, 2024 by beyondguo)
#2266: Can't use prompt tuning on multiple GPUs with DeepSpeed and Qwen2.5-14B-Instruct (opened Dec 9, 2024 by dongshou)
#2264: Guidance needed on two-stage fine-tuning with LoRA (SFT and DPO) for model adaptation (opened Dec 6, 2024 by none0663)
#2262: Could you provide example code for AdaLoRA fine-tuning of a decoder-only model? (opened Dec 5, 2024 by SpeeeedLee)
#2260: Is it possible to support Transformer Engine when using LoRA in Megatron? (opened Dec 5, 2024 by liulong11)
#2252: Adapter name conflict with tuner prefix leads to unclear warning during model loading (opened Dec 3, 2024 by pzdkn)
#2241: Request to add a LoRA implementation for Conv1d rather than transformers.utils.Conv1d [contributions-welcome] (opened Nov 28, 2024 by HelloWorldLTY)
#2208: TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules' (opened Nov 9, 2024 by imrankh46; see the version-check sketch after this list)
#2206: modules_to_save: incorrect overlap across multiple LoRA adapters (opened Nov 8, 2024 by saeid93)
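
Two of the issues above (#2275 and #2208) report TypeError: LoraConfig.__init__() got an unexpected keyword argument; this usually means the installed peft release predates the named parameter. A minimal diagnostic sketch, assuming only that peft and its LoraConfig are installed:

```python
# Minimal sketch for the "unexpected keyword argument" errors in #2275 and
# #2208: parameters such as `exclude_modules` (and EVA-related options) only
# exist in newer peft releases, so older installs raise TypeError when
# LoraConfig is constructed with them.
import peft
from peft import LoraConfig

# If this version predates the parameter you are passing, upgrade peft.
print(peft.__version__)

# A config that uses only long-standing arguments constructs on any
# recent release; add the newer keyword back in once peft is upgraded.
config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
)
print(config)
```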