FLUX.2-klein-4B/9B not supported in custom models (text encoder mismatch) #4591

@lazariv

Description

Feature request

Request: support for black-forest-labs/FLUX.2-klein-9B and black-forest-labs/FLUX.2-klein-4B.

FLUX.2-klein uses Qwen3 as its text encoder, but Xinference's FLUX.2 integration
(as of v2.0.0) only supports Mistral3, the encoder used by FLUX.2-dev. When
loading FLUX.2-klein as a custom model, Diffusers' component validation fails
on the text-encoder type mismatch, leaving the pipeline with uninitialized
meta tensors.

First error: Expected types for text_encoder: Mistral3..., got Qwen3...
Followed by: Cannot copy out of meta tensor; no data!

Please add explicit support for FLUX.2-klein's Qwen3 text encoder.
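A possible shape of the fix, sketched below as plain Python. All names here (the registry dict, the validator function, the encoder class names) are illustrative assumptions, not Xinference's actual internals: the idea is simply to map each FLUX.2 variant to its expected text-encoder class instead of hard-coding Mistral3.

```python
# Sketch: per-variant text-encoder validation for FLUX.2 models.
# Names are hypothetical; Xinference's real registry and checks differ.

EXPECTED_TEXT_ENCODERS = {
    "FLUX.2-dev": {"Mistral3Model"},
    "FLUX.2-klein-4B": {"Qwen3Model"},
    "FLUX.2-klein-9B": {"Qwen3Model"},
}

def validate_text_encoder(model_name: str, encoder_cls_name: str) -> None:
    """Raise if the loaded text encoder class doesn't match the variant."""
    expected = EXPECTED_TEXT_ENCODERS.get(model_name)
    if expected is None:
        raise ValueError(f"Unknown FLUX.2 variant: {model_name}")
    if encoder_cls_name not in expected:
        raise TypeError(
            f"Expected types for text_encoder: {sorted(expected)}, "
            f"got {encoder_cls_name}"
        )

# FLUX.2-klein with Qwen3 now passes instead of failing validation:
validate_text_encoder("FLUX.2-klein-9B", "Qwen3Model")
```

With a lookup like this, adding a new variant becomes a one-line registry entry rather than a change to the validation logic.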

Motivation

The smaller klein variants are especially useful on GPUs with limited VRAM.

Your contribution

The model card documents Diffusers usage:
https://huggingface.co/black-forest-labs/FLUX.2-klein-9B#using-with-diffusers-🧨
