Hello,
First of all, thank you for your work and the paper.
I have been trying to reproduce the adapter merging process directly on my GPTQ model. Could you provide an example of a GPTQ model with a qa_lora adapter where the merge is known to work correctly?
Additionally, have I understood correctly that the direct merge amounts to nothing more than a shift of the beta (qzeros) parameter?
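To make my understanding concrete, here is a rough sketch of what I am currently doing. The tensor names and shapes are my own assumptions (unpacked `zeros`/`scales` of shape `(n_groups, out_features)`, an adapter that average-pools the input over each quantization group, `lora_A` of shape `(r, n_groups)`, `lora_B` of shape `(out_features, r)`, scaling `s`), not something taken from your code:

```python
import torch

def merge_qalora_into_zeros(zeros: torch.Tensor,
                            scales: torch.Tensor,
                            lora_A: torch.Tensor,
                            lora_B: torch.Tensor,
                            s: float,
                            group_size: int) -> torch.Tensor:
    # The adapter contribution is constant within each input group,
    # because the adapter only ever sees the group-pooled input.
    delta = s * (lora_B @ lora_A)      # (out_features, n_groups)
    delta = delta / group_size         # average pooling; drop this if the adapter sums instead
    # Fold the constant per-group shift into the zero points (beta):
    #   W' = scale * (q - (zeros - delta / scale))
    return zeros - delta.t() / scales  # (n_groups, out_features)
```

Does this match the intended merge procedure? In particular, I am unsure whether the shifted zeros should be rounded back to integers and re-packed into qzeros, or kept in floating point.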
I'm encountering an issue where the perplexity of my merged model is consistently worse than before the merge. Is this expected, or could there be something wrong with my approach?
Best regards