Commit afc13b6

Commit message: test=allcase
Parent: c5c4aa2

File tree: 1 file changed, 1 addition, 0 deletions

python/paddle/distributed/collective.py (1 addition, 0 deletions)

@@ -1013,6 +1013,7 @@ def _parallel_linear(x,
     main_block = paddle.static.default_main_program().current_block()
     startup_block._find_var_recursive(linear.weight.name).is_distributed = True
     main_block._find_var_recursive(linear.weight.name).is_distributed = True
+
     # set is_distributed for splited bias
     # if a linear layer is splited by row, each rank would hold a complete bias and they should be the same in each rank.
     # if a linear layer is splited by col, the bias would also be split into each rank as its weight
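
The change itself only inserts a blank line between the weight handling and the bias handling that the quoted comments describe. For context, below is a minimal, hypothetical sketch (not the actual collective.py implementation) of how a split bias could be flagged as distributed in the same way as the weight, reusing the _find_var_recursive / is_distributed pattern visible in the diff. The helper name _mark_parallel_linear_bias and the assumption that axis == 1 denotes a column-wise split are invented for illustration.

    # Hypothetical sketch, not the real _parallel_linear code.
    import paddle

    paddle.enable_static()

    def _mark_parallel_linear_bias(linear, axis):
        startup_block = paddle.static.default_startup_program().global_block()
        main_block = paddle.static.default_main_program().current_block()
        if axis == 1 and linear.bias is not None:
            # Column split (assumed axis == 1): each rank holds only a shard of
            # the bias, so the bias variable is flagged as distributed, just
            # like the weight in the diff above.
            startup_block._find_var_recursive(linear.bias.name).is_distributed = True
            main_block._find_var_recursive(linear.bias.name).is_distributed = True
        # Row split: every rank keeps a complete, identical copy of the bias,
        # so no distributed flag is needed for it.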
