Conversation

@BenjaminBossan BenjaminBossan commented Feb 18, 2025

Supersedes #2383.

X-LoRA tests started failing after this transformers PR:

huggingface/transformers#35724

The solution appears to be to disable caching completely when calling generate on the X-LoRA model. This also makes some previously xfail-ing tests pass.

I tested this locally with transformers checked out before and after the mentioned PR and the tests pass in both circumstances. I also tested changing the base model from "facebook/opt-125m" to "trl-internal-testing/tiny-random-LlamaForCausalLM" and the tests passed with both.
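The fix described above can be sketched as a wrapper that forces caching off whenever `generate` is called on the X-LoRA model. This is a minimal illustrative sketch, not PEFT's actual implementation: `xlora_generate` and `fake_generate` are hypothetical names, and the real fix lives inside the X-LoRA model class.

```python
# Hypothetical sketch of the fix: force KV caching off for every generate()
# call on the X-LoRA model, regardless of what the caller passed.
def xlora_generate(base_generate, *args, **kwargs):
    # Overwrite use_cache unconditionally: X-LoRA's scaling pass is
    # incompatible with the cache behavior introduced by the
    # transformers PR referenced above.
    kwargs["use_cache"] = False
    return base_generate(*args, **kwargs)

# Minimal stand-in for a model's generate() method, just to demonstrate
# that the wrapper overrides the caller's setting.
def fake_generate(prompt, use_cache=True):
    return {"prompt": prompt, "use_cache": use_cache}

out = xlora_generate(fake_generate, "hello", use_cache=True)
print(out["use_cache"])  # False: caching is disabled even if requested
```

With a real transformers model, the equivalent call would be `model.generate(..., use_cache=False)`, which is a standard `generate` keyword argument.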

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Also, mark the X-LoRA save_load_function test as flaky. It was marked as xfail beforehand, but it is in fact just flaky.
@BenjaminBossan BenjaminBossan merged commit 1793a95 into huggingface:main Feb 18, 2025
14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-failing-xlora-test-transformers-4.49.0 branch February 18, 2025 16:29
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025
Also, mark X-LoRA save_load_function test as flaky.
It was marked as xfail beforehand, but it is in fact just flaky.
cyyever pushed a commit to cyyever/peft that referenced this pull request Sep 4, 2025