Conversation

@l1ghtsource
Contributor

Problem Description

Currently, the `FastModel.get_peft_model` method does not support the `modules_to_save` argument.

If a user passes `modules_to_save=["score"]` when training an LLM classifier, the head (`score`) remains a regular `Linear` module and is not wrapped in `ModulesToSaveWrapper`.

As a result, the head's weights are not saved to `adapter_model.safetensors` alongside the LoRA adapters, making the trained classifier impossible to reuse or fine-tune further.
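For concreteness, a minimal sketch of the affected call pattern (the hyperparameters and target modules are illustrative, not taken from this PR):

```python
from unsloth import FastModel

# Assumes `model` was loaded as a sequence classifier, i.e. it carries a
# `score` head (the exact loading call is omitted here).
model = FastModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    modules_to_save=["score"],  # had no effect before this fix
)

# Before the fix: `score` stayed a plain nn.Linear, so its trained weights
# were missing from adapter_model.safetensors after save_pretrained().
```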

Solution

  • Added support for passing the `modules_to_save` argument through to the internal LoRA configuration.
  • The head (or any other specified modules) is now correctly wrapped in `ModulesToSaveWrapper`, so its weights are saved with the adapter (see the sketch below).
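A quick way to verify the new behavior; note the attribute path to the head is an assumption here and varies by architecture:

```python
from peft.utils import ModulesToSaveWrapper

# With the fix, the classification head is wrapped, which marks its weights
# for inclusion in adapter_model.safetensors.
head = model.base_model.model.score  # path is illustrative; depends on the model
assert isinstance(head, ModulesToSaveWrapper)

model.save_pretrained("lora_classifier")  # saves LoRA weights + head weights
```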

@danielhanchen
Contributor

Thanks - I'll also refactor to allow all other items!

@danielhanchen danielhanchen merged commit 3808d80 into unslothai:main Sep 16, 2025
