Merged
2 changes: 1 addition & 1 deletion paddlenlp/transformers/llama/modeling.py
Expand Up @@ -1653,7 +1653,7 @@ def forward(
is_casual = True
else:
is_casual = is_casual_mask(attention_mask)
-        if get_env_device() != "npu" or get_env_device() != "mlu":
+        if get_env_device() != "npu" and get_env_device() != "mlu":
Contributor

Suggested change
-if get_env_device() != "npu" and get_env_device() != "mlu":
+if get_env_device() not in ["npu", "mlu"]:
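The original `or` condition was a classic boolean bug: no device string can equal both `"npu"` and `"mlu"` at once, so `x != "npu" or x != "mlu"` is true for every device and the branch always runs. A minimal sketch of the three variants (with `get_env_device` stubbed for illustration; the real function lives in PaddleNLP's utilities):

```python
# Stub standing in for paddlenlp's get_env_device(), for demonstration only.
def get_env_device():
    return "npu"

# Buggy original: true for EVERY device, because a device that equals
# "npu" still differs from "mlu", and vice versa.
buggy = get_env_device() != "npu" or get_env_device() != "mlu"

# Merged fix: true only when the device is neither "npu" nor "mlu".
fixed = get_env_device() != "npu" and get_env_device() != "mlu"

# Reviewer's equivalent suggestion, using a membership test.
suggested = get_env_device() not in ["npu", "mlu"]

print(buggy, fixed, suggested)  # True False False
```

For an `"npu"` device, the buggy form still evaluates to `True`, while both corrected forms evaluate to `False`; the `not in` spelling also scales more cleanly if a third device is added later.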

Contributor


Would this be a bit better?

Collaborator Author


Yeah, that's more explicit.

if is_casual and alibi is None:
attention_mask = None
else: