Upgrade Flux to new diffusers format #580
Merged
What?
Updated Flux to use the latest diffusers format.
Why?
The current Flux implementation depended on an old diffusers version, which made it impossible to upgrade diffusers without breaking Flux support. Upgrading diffusers is necessary for supporting future models, and it also lets us take advantage of new diffusers features in the current models.
How?
I took the new version of the Flux attention processor from diffusers and modified it to support both sequence parallelism (Ulysses/Ring) and pipeline parallelism. The transformer required a couple of changes as well, but nothing major. The Flux pipeline is left as-is so that it also supports pipeline parallelism. I also added gating mechanisms so that certain code paths are not loaded at all if the installed diffusers version isn't correct. This allows different models to require different diffusers versions.
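The gating described above can be sketched roughly as follows. This is an illustrative example, not the actual code from this PR: the function name, module layout, and the minimum-version threshold are all assumptions.

```python
# Hypothetical sketch of per-model diffusers version gating.
# The threshold and names below are illustrative, not taken from the PR.
from importlib.metadata import PackageNotFoundError, version

from packaging.version import Version

# Assumed minimum diffusers version for the new Flux attention processor.
MIN_DIFFUSERS_FOR_FLUX = Version("0.30.0")


def diffusers_supports_new_flux() -> bool:
    """Return True if the installed diffusers is new enough for Flux."""
    try:
        return Version(version("diffusers")) >= MIN_DIFFUSERS_FOR_FLUX
    except PackageNotFoundError:
        # diffusers not installed at all: the gated code paths stay unloaded.
        return False


if diffusers_supports_new_flux():
    # Only import the new-format attention processor when the installed
    # diffusers is recent enough; otherwise this code path is never loaded.
    pass
```

Because the check happens at import time, a model whose diffusers requirement isn't met simply never loads its new code paths, so other models with different version requirements keep working.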
The attention processor now also lives in the `transformer_flux.py` file, not in `attention_processor.py`. This follows the style diffusers uses and allows us to do clean version gating without having a bunch of if-else statements.
Tests
Tested both with Ulysses/Ring and with pipefusion:


Tested running:
Other