Conversation

@neph1 (Owner) commented Apr 20, 2025

No description provided.

@neph1 neph1 merged commit 7a7ec1c into main Apr 20, 2025
@FurkanGozukara

Your branch isn't working. I installed it directly and tested this LoRA, @neph1:

https://civitai.com/models/1084814/studio-ghibli-style-hunyuanvideo

High-VRAM Mode: False
Downloading shards: 100%|██████████████████████████████████████████████████████████████| 4/4 [00:00<00:00, 3935.54it/s]
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 4/4 [00:00<00:00,  8.60it/s]
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 3/3 [00:00<00:00, 86.96it/s]
Loading default_0 was unsucessful with the following error:
Target modules {'img_mod.linear', 'fc2', 'img_attn_qkv', 'linear2', 'img_attn_proj', 'txt_mod.linear', 'modulation.linear', 'linear1', 'txt_attn_proj', 'txt_attn_qkv', 'fc1'} not found in the base model. Please check the target modules and try again.
Traceback (most recent call last):
  File "E:\temp_test_lora\FramePack\demo_gradio.py", line 89, in <module>
    transformer = load_lora(transformer, lora_path, lora_name)
  File "E:\temp_test_lora\FramePack\diffusers_helper\load_lora.py", line 30, in load_lora
    transformer.load_lora_adapter(state_dict, network_alphas=None)
  File "E:\temp_test_lora\FramePack\venv\lib\site-packages\diffusers\loaders\peft.py", line 351, in load_lora_adapter
    inject_adapter_in_model(lora_config, self, adapter_name=adapter_name, **peft_kwargs)
  File "E:\temp_test_lora\FramePack\venv\lib\site-packages\peft\mapping.py", line 76, in inject_adapter_in_model
    peft_model = tuner_cls(model, peft_config, adapter_name=adapter_name, low_cpu_mem_usage=low_cpu_mem_usage)
  File "E:\temp_test_lora\FramePack\venv\lib\site-packages\peft\tuners\lora\model.py", line 142, in __init__
    super().__init__(model, config, adapter_name, low_cpu_mem_usage=low_cpu_mem_usage)
  File "E:\temp_test_lora\FramePack\venv\lib\site-packages\peft\tuners\tuners_utils.py", line 180, in __init__
    self.inject_adapter(self.model, adapter_name, low_cpu_mem_usage=low_cpu_mem_usage)
  File "E:\temp_test_lora\FramePack\venv\lib\site-packages\peft\tuners\tuners_utils.py", line 527, in inject_adapter
    raise ValueError(error_msg)
ValueError: Target modules {'img_mod.linear', 'fc2', 'img_attn_qkv', 'linear2', 'img_attn_proj', 'txt_mod.linear', 'modulation.linear', 'linear1', 'txt_attn_proj', 'txt_attn_qkv', 'fc1'} not found in the base model. Please check the target modules and try again.
Press any key to continue . . .
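
For context, the ValueError above means PEFT derived its target-module names from the LoRA state dict and found none of them in the loaded transformer, which usually points to a naming mismatch between how the LoRA was exported and the layout of the base model. A minimal diagnostic sketch (not part of this PR; the file path is hypothetical and `transformer` is assumed to be the model already loaded in demo_gradio.py) to compare the two sets of names:

# Minimal diagnostic sketch: compare the module names encoded in the LoRA
# keys with the module names of the loaded transformer.
from safetensors.torch import load_file

lora_path = "studio_ghibli_style_hunyuanvideo.safetensors"  # hypothetical path
state_dict = load_file(lora_path)

# Module names referenced by the LoRA (everything before ".lora_A"/".lora_B").
lora_modules = sorted({k.split(".lora_")[0] for k in state_dict if ".lora_" in k})
print(lora_modules[:10])

# Module names actually present in the base model; if the LoRA names carry a
# different prefix or layout (e.g. the original HunyuanVideo block names shown
# in the error), PEFT will not find them and raises the error above.
print([name for name, _ in transformer.named_modules()][:10])

If the two listings don't overlap, the state dict keys need to be remapped to the base model's module names before calling load_lora_adapter.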
