
Commit 7b2fee1

gdengk (Gao Deng) authored

pass in the arg (#13940)

Signed-off-by: Gao Deng <[email protected]>
Co-authored-by: Gao Deng <[email protected]>
1 parent: 716392b · commit: 7b2fee1

File tree

  • nemo/collections/vlm/llama4/model/base.py

1 file changed: 1 addition, 0 deletions

nemo/collections/vlm/llama4/model/base.py

Lines changed: 1 addition & 0 deletions
@@ -161,6 +161,7 @@ def configure_model(self, tokenizer, vp_stage: Optional[int] = None) -> "MCoreNe
         # set token_drop setting from config
         self.language_transformer_config.moe_pad_expert_input_to_capacity = self.moe_pad_expert_input_to_capacity
         self.language_transformer_config.moe_expert_capacity_factor = self.moe_expert_capacity_factor
+        self.language_transformer_config.tp_comm_overlap = self.tp_comm_overlap

         # During fake lightning initialization, pass 0 to bypass the assertion that vp_stage must be
         # non-None when using virtual pipeline model parallelism
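For context, the added line follows the same pattern as the two MoE lines above it: a flag set on the wrapper model config is forwarded onto the language transformer sub-config inside configure_model, so the core transformer actually receives the user's setting. A minimal, self-contained sketch of that forwarding pattern is below; the class and field names are illustrative placeholders, not NeMo's actual API.

    from dataclasses import dataclass, field
    from typing import Optional


    @dataclass
    class TransformerSubConfig:
        # Sub-config consumed by the core transformer (illustrative defaults).
        moe_pad_expert_input_to_capacity: bool = False
        moe_expert_capacity_factor: Optional[float] = None
        tp_comm_overlap: bool = False


    @dataclass
    class WrapperModelConfig:
        # User-facing wrapper config exposing the same flags at the top level.
        moe_pad_expert_input_to_capacity: bool = False
        moe_expert_capacity_factor: Optional[float] = None
        tp_comm_overlap: bool = False
        language_transformer_config: TransformerSubConfig = field(
            default_factory=TransformerSubConfig
        )

        def configure_model(self) -> TransformerSubConfig:
            # Forward the user-facing flags onto the sub-config.
            self.language_transformer_config.moe_pad_expert_input_to_capacity = (
                self.moe_pad_expert_input_to_capacity
            )
            self.language_transformer_config.moe_expert_capacity_factor = (
                self.moe_expert_capacity_factor
            )
            # The line added by this commit plays the same role: without it, a
            # tp_comm_overlap value set on the wrapper would never reach the
            # transformer sub-config and would be silently ignored.
            self.language_transformer_config.tp_comm_overlap = self.tp_comm_overlap
            return self.language_transformer_config


    if __name__ == "__main__":
        cfg = WrapperModelConfig(tp_comm_overlap=True)
        print(cfg.configure_model().tp_comm_overlap)  # True only because the flag is forwarded

This is only a sketch of the config-forwarding idea under the assumptions stated above; in the real file the assignments happen on self.language_transformer_config within the model class's configure_model method, as shown in the diff.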

0 commit comments