Merged
4 changes: 4 additions & 0 deletions python/paddle/distributed/collective.py
100644 → 100755
@@ -1008,6 +1008,10 @@ def _parallel_linear(x,
main_block = paddle.static.default_main_program().global_block()
startup_block.vars[linear.weight.name].is_distributed = True
main_block.vars[linear.weight.name].is_distributed = True
# When the weight is column-split (axis=1), the bias is split the same
# way, so mark the split bias as distributed as well.
Contributor
The API docs could spell out more clearly how the weight is split for axis=0 versus axis=1. The comment could also explain more clearly why the bias must be split as well when splitting by column.

Contributor Author

done~

if axis == 1 and linear._bias_attr != False:
startup_block.vars[linear.bias.name].is_distributed = True
main_block.vars[linear.bias.name].is_distributed = True

if not gather_out: return linear_out
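The reviewer's question above (why the bias is split only for axis=1) can be sketched with plain NumPy. This is a minimal illustration of the two parallelism schemes, not Paddle's implementation: under column parallelism the bias is sliced along with the output columns, while under row parallelism the partial products are summed and the whole bias is added once.

```python
import numpy as np

np.random.seed(0)
x = np.random.rand(2, 4)   # batch of 2, in_features=4
W = np.random.rand(4, 6)   # full weight, out_features=6
b = np.random.rand(6)      # full bias

full = x @ W + b           # single-device reference output

# Column parallelism (axis=1): W *and* b are split along out_features;
# each "rank" produces a slice of the output, concatenated at the end.
W0, W1 = np.split(W, 2, axis=1)
b0, b1 = np.split(b, 2)
col = np.concatenate([x @ W0 + b0, x @ W1 + b1], axis=1)

# Row parallelism (axis=0): W is split along in_features; the partial
# results are summed (an allreduce in practice), so the bias stays
# whole and is added exactly once.
W0r, W1r = np.split(W, 2, axis=0)
x0, x1 = np.split(x, 2, axis=1)
row = (x0 @ W0r) + (x1 @ W1r) + b

assert np.allclose(full, col)
assert np.allclose(full, row)
```

If the bias were left whole in the column-parallel case, each rank would need a different slice of it anyway, which is why the PR marks the split bias as `is_distributed`.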

1 change: 1 addition & 0 deletions python/paddle/fluid/contrib/mixed_precision/fp16_lists.py
100644 → 100755
@@ -145,6 +145,7 @@ def _update_list(self):
'sign',
'cast',
'fused_bn_add_activation',
'c_identity',
}

# The set of ops that don't support fp16 calculation
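The second change adds `c_identity` to the set of ops allowed to run in fp16. A hypothetical sketch of how such white/black lists are typically consulted during mixed-precision conversion (the list entries besides those shown in the diff, and the function itself, are illustrative, not Paddle's actual logic):

```python
# Entries from the diff; 'exp' below is a hypothetical black-list example.
white_list = {'sign', 'cast', 'fused_bn_add_activation', 'c_identity'}
black_list = {'exp'}

def fp16_decision(op_type):
    """Return True (run fp16), False (keep fp32), or None (gray: decided
    by the precision of surrounding ops)."""
    if op_type in white_list:
        return True
    if op_type in black_list:
        return False
    return None

assert fp16_decision('c_identity') is True   # newly white-listed op
assert fp16_decision('exp') is False
assert fp16_decision('conv2d') is None       # not in either list here
```

Without this entry, a `c_identity` op inserted by the parallel linear layer would fall into the gray set and could force unnecessary casts around it.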