Flux clean 3-in-1 model loader for low VRAM GPUs
Description
Flux-dev-fp8: (12G+ VRAM)
https://huggingface.co/Kijai/flux-fp8/resolve/main/flux1-dev-fp8.safetensors
Flux-dev-bnb-nf4-V2: (8G VRAM)
https://huggingface.co/lllyasviel/flux1-dev-bnb-nf4/resolve/main/flux1-dev-bnb-nf4-v2.safetensors
Flux-dev-Q2_K.gguf: (6G VRAM)
https://huggingface.co/city96/FLUX.1-dev-gguf/resolve/main/flux1-dev-Q2_K.gguf
ComfyUI-GGUF loader:
https://github.com/city96/ComfyUI-GGUF
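As a rough guide, the choice between the three variants above comes down to how much VRAM you have. Here is a minimal Python sketch of that decision; the thresholds are the ones listed above, and the helper name is purely illustrative:

```python
def pick_flux_variant(vram_gb: float) -> str:
    """Map available VRAM (in GB) to one of the Flux variants listed above.

    Thresholds follow the list: fp8 wants 12G+, bnb-nf4-v2 runs in ~8G,
    and the Q2_K GGUF fits in ~6G. Illustrative helper, not part of the workflow.
    """
    if vram_gb >= 12:
        return "flux1-dev-fp8.safetensors"
    if vram_gb >= 8:
        return "flux1-dev-bnb-nf4-v2.safetensors"
    if vram_gb >= 6:
        return "flux1-dev-Q2_K.gguf"
    raise ValueError("Under ~6G VRAM: even the Q2_K GGUF may not fit")
```

Note that the GGUF variant also needs the ComfyUI-GGUF custom node linked above, since the built-in loaders only handle safetensors checkpoints.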
*** If you're rocking an RTX 4090, you can safely ignore this post. This is for the rest of us who are still in the "low VRAM" club and just want to see if the Flux model can work its magic without making our GPUs cry for mercy.
*** I put together this workflow just for you guys. Don't forget to leave me a cute little heart ❤️! And if you can, swing by our Facebook page; it would make my day!
___________________
ComfyUI
https://www.facebook.com/groups/comfyui
Versions (2)
- latest (a year ago)
- v20240819-215320
Node Details
Primitive Nodes (7)
CLIPTextEncodeFlux (1)
CheckpointLoaderNF4 (2)
Note (1)
PrimitiveNode (2)
UnetLoaderGGUF (1)
Custom Nodes (11)
ComfyUI
- DualCLIPLoader (1)
- VAELoader (1)
- EmptyLatentImage (1)
- ConditioningZeroOut (1)
- VAEDecode (1)
- SaveImage (1)
- ImpactSwitch (3)
- ToBasicPipe (1)
- ImpactKSamplerBasicPipe (1)
Model Details
Checkpoints (2)
Flux\flux1-dev-bnb-nf4-v2.safetensors
Flux\flux1-dev-fp8.safetensors
LoRAs (0)