
Commit cb8a229

[BugFix][NPU] fix llama attn_mask astype error (#8528)

1 parent f8cdd3d

1 file changed: 1 addition & 1 deletion

paddlenlp/transformers/llama/modeling.py
@@ -1567,7 +1567,7 @@ def forward(
         if is_casual and alibi is None:
             attention_mask = None
         else:
-            attention_mask = attention_mask.astype("bool")
+            attention_mask = None if attention_mask is None else attention_mask.astype("bool")
         hidden_states = inputs_embeds
         # decoder layers
         all_hidden_states = () if output_hidden_states else None
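
Why the guard is needed: callers may pass `attention_mask=None`, and when the non-causal branch was taken the old code called `.astype("bool")` directly on `None`, raising `AttributeError: 'NoneType' object has no attribute 'astype'` (the error reported on NPU). Below is a minimal sketch of the fixed pattern, pulled out of the model's `forward()` into a hypothetical standalone helper so it can run on its own; `normalize_attention_mask` is not PaddleNLP API, and `is_casual` keeps the upstream spelling:

import paddle

# Hypothetical helper mirroring the inline logic in forward().
def normalize_attention_mask(attention_mask, is_casual, alibi=None):
    if is_casual and alibi is None:
        # Pure causal attention needs no explicit mask.
        return None
    # The fix: guard against None before casting to bool.
    return None if attention_mask is None else attention_mask.astype("bool")

mask = paddle.ones([1, 1, 4, 4], dtype="int64")
print(normalize_attention_mask(mask, is_casual=False).dtype)  # paddle.bool
print(normalize_attention_mask(None, is_casual=False))        # None, no AttributeError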
