Commit 8fd7c16
fix: Correct embedding dimension logic in LoRA dummy creation
Fixed incorrect fallback logic for embedding layers where dimensions were reversed.
## Problem
For embedding layers with shape [vocab_size, embedding_dim]:
- input_dim should be vocab_size (shape[0])
- output_dim should be embedding_dim (shape[1])
- embeddings_tensor_dim should be embedding_dim (shape[1])
Previous code had:
- input_dim fallback: shape[1] ❌ (was getting embedding_dim instead of vocab_size)
- output_dim fallback: shape[0] ❌ (was getting vocab_size instead of embedding_dim)
- embeddings_tensor_dim: Used input_size instead of output_size ❌
## Fix
Corrected all fallback paths to use proper dimensions for embedding layers:
- input_dim: shape[0] (vocab_size)
- output_dim: shape[1] (embedding_dim)
- embeddings_tensor_dim: shape[1] (embedding_dim)
Also fixed the elif chain to check output_size instead of input_size when deriving embeddings_tensor_dim (see the sketch below).
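A minimal sketch of the corrected fallback logic, for illustration only: the standalone helper `get_dummy_lora_dims` is hypothetical, and only `input_size`, `output_size`, and the shape indices come from the commit description above, not from the actual diff body.

```python
# Illustrative sketch only: `get_dummy_lora_dims` is a hypothetical stand-in
# for the real dummy-LoRA creation path; the fallback rules mirror the fix.
import torch.nn as nn


def get_dummy_lora_dims(layer: nn.Module) -> tuple[int, int, int]:
    """Return (input_dim, output_dim, embeddings_tensor_dim) for a layer.

    For an embedding layer, weight.shape == [vocab_size, embedding_dim], so:
      input_dim             -> vocab_size     (shape[0])
      output_dim            -> embedding_dim  (shape[1])
      embeddings_tensor_dim -> embedding_dim  (shape[1])
    """
    weight = layer.weight

    if hasattr(layer, "input_size"):
        input_dim = layer.input_size
    else:
        # Before the fix this fallback used weight.shape[1] (embedding_dim).
        input_dim = weight.shape[0]  # vocab_size

    if hasattr(layer, "output_size"):
        output_dim = layer.output_size
    else:
        # Before the fix this fallback used weight.shape[0] (vocab_size).
        output_dim = weight.shape[1]  # embedding_dim

    # Before the fix this branch keyed off input_size, yielding vocab_size.
    if hasattr(layer, "output_size"):
        embeddings_tensor_dim = layer.output_size
    else:
        embeddings_tensor_dim = weight.shape[1]  # embedding_dim

    return input_dim, output_dim, embeddings_tensor_dim


if __name__ == "__main__":
    emb = nn.Embedding(num_embeddings=32000, embedding_dim=4096)
    print(get_dummy_lora_dims(emb))  # (32000, 4096, 4096)
```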
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <[email protected]>
Signed-off-by: sheikheddy <[email protected]>

1 parent 2a0f94e
1 file changed: 12 additions, 6 deletions (the hunk spans roughly lines 624–657 of the changed file; diff body not shown).