Conversation

@avjves
Contributor

@avjves avjves commented Sep 29, 2025

What

Small refactoring to gate the missing-flash_attn warning log behind AITER not being installed.

Why

Currently, when running xDiT on AMD devices, one can use AITER instead of flash_attn, which means flash_attn does not need to be installed at all.
With AITER installed, xDiT correctly logs that AITER is installed and will be used as the attention library, but the missing flash_attn still triggers a second warning:

Flash Attention library "flash_attn" not found, using pytorch attention implementation

Since SDPA will not be used when AITER is present, this PR gates that warning so it is only logged when AITER is not installed.
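The gating described above can be sketched roughly as follows. This is a hypothetical illustration, not xDiT's actual code; the function names (`flash_attn_warning_needed`, `log_attention_backend`) and the boolean flags are assumptions for the sketch.

```python
import logging

logger = logging.getLogger(__name__)


def flash_attn_warning_needed(has_flash_attn: bool, has_aiter: bool) -> bool:
    # The SDPA-fallback warning is only relevant when neither flash_attn
    # nor AITER is available; with AITER present, AITER is used instead.
    return not has_flash_attn and not has_aiter


def log_attention_backend(has_flash_attn: bool, has_aiter: bool) -> None:
    # Warn about falling back to pytorch attention only when the
    # fallback will actually happen.
    if flash_attn_warning_needed(has_flash_attn, has_aiter):
        logger.warning(
            'Flash Attention library "flash_attn" not found, '
            "using pytorch attention implementation"
        )
```

With this check in place, an AMD setup that has AITER but not flash_attn produces only the AITER log line, not the misleading fallback warning.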

Collaborator

@feifeibear feifeibear left a comment


LGTM

@feifeibear feifeibear merged commit e585d98 into xdit-project:main Sep 30, 2025
@jcaraban jcaraban mentioned this pull request Nov 12, 2025
