Replies: 4 comments
-
I think this should be fairly straightforward to do by training the model as if it were Schnell, similar to what is done with the de-distilled models. I'm going to experiment with that approach and see if it works.
-
I can confirm that forcing the model to be trained as Schnell, by changing the dev/schnell detection in library/flux_utils.py (line 62), seems to allow it to be finetuned directly.
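For reference, a minimal sketch of that change, assuming the kohya-ss sd-scripts layout where `analyze_checkpoint_state()` decides dev vs. schnell by looking for guidance-embedder keys (the exact key names and line number may differ between versions):

```python
# library/flux_utils.py (kohya-ss sd-scripts) -- a sketch, not exact source.
# Flex.1-alpha ships a guidance embedder, so the stock check classifies it
# as dev; forcing is_schnell = True makes the trainer treat it as schnell.

# Stock detection (approximate):
is_schnell = not (
    "guidance_in.in_layer.bias" in keys
    or "time_text_embed.guidance_embedder.linear_1.bias" in keys
)

# Forced override for Flex.1-alpha:
is_schnell = True
```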
-
I can also confirm that the new guidance module that Ostris created gets destroyed in the process. It will need to be explicitly excluded from training.
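One way to exclude it, as a sketch: freeze the guidance embedder's parameters before handing the model to the trainer. The module paths below are assumptions covering both the BFL-style naming (`guidance_in`) used by sd-scripts' Flux implementation and the Diffusers-format naming (`time_text_embed.guidance_embedder`):

```python
# Sketch: freeze the guidance embedder so "train as schnell" finetuning
# cannot update (and destroy) Flex.1-alpha's trained guidance module.
def freeze_guidance_embedder(model):
    frozen = []
    for name, param in model.named_parameters():
        # "guidance_in." is the BFL-style path; Diffusers-format models
        # use "time_text_embed.guidance_embedder." instead.
        if name.startswith("guidance_in.") or "guidance_embedder" in name:
            param.requires_grad_(False)
            frozen.append(name)
    return frozen  # inspect this list to confirm the right weights were caught
```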
-
This seems to be the key snippet of code for bypassing the guidance block in ai-toolkit.
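For the idea behind it, here is a sketch written against the Diffusers Flux modules rather than ai-toolkit's actual code: monkeypatch `time_text_embed` so the guidance embedder is never called, leaving the rest of the forward pass untouched. The function names below are mine, not ai-toolkit's.

```python
import types

def bypass_guidance_embedder(transformer):
    """Patch a Diffusers FluxTransformer2DModel so its guidance embedder is skipped."""
    embed = transformer.time_text_embed  # CombinedTimestepGuidanceTextProjEmbeddings

    def forward_without_guidance(self, timestep, guidance, pooled_projection):
        # Same as the stock forward, minus the guidance_embedder term.
        timesteps_proj = self.time_proj(timestep)
        timesteps_emb = self.timestep_embedder(
            timesteps_proj.to(dtype=pooled_projection.dtype)
        )
        pooled = self.text_embedder(pooled_projection)
        return timesteps_emb + pooled

    embed._orig_forward = embed.forward  # keep a handle so the patch can be undone
    embed.forward = types.MethodType(forward_without_guidance, embed)

def restore_guidance_embedder(transformer):
    embed = transformer.time_text_embed
    if hasattr(embed, "_orig_forward"):
        embed.forward = embed._orig_forward
        del embed._orig_forward
```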
-
https://huggingface.co/ostris/Flex.1-alpha
Flex.1-alpha is a pre-trained base model: an 8-billion-parameter rectified flow transformer capable of generating images from text descriptions. It has an architecture similar to FLUX.1-dev, but with fewer double transformer blocks (8 vs. 19). It began as a finetune of FLUX.1-schnell, which allows the model to retain the Apache 2.0 license. A guidance embedder has been trained for it so that it no longer requires CFG to generate images.
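For anyone who wants to try it, a usage sketch with Diffusers (the repo id comes from the link above; the pipeline class, guidance scale, and step count are my assumptions, so check the model card):

```python
import torch
from diffusers import FluxPipeline

# Flex.1-alpha publishes Diffusers-format weights; FluxPipeline is assumed
# to load them, since the architecture matches FLUX.1 minus some blocks.
pipe = FluxPipeline.from_pretrained("ostris/Flex.1-alpha", torch_dtype=torch.bfloat16)
pipe.to("cuda")

image = pipe(
    "a photo of a red fox standing in fresh snow",
    guidance_scale=3.5,        # consumed by the trained guidance embedder, not CFG
    num_inference_steps=28,
).images[0]
image.save("fox.png")
```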