FLUX.2 Gradio UI

A lightweight, decoupled Gradio interface for running FLUX.2-dev locally with full VRAM management.

โš ๏ธ Hardware Requirements

This model is extremely resource-intensive. Please ensure your hardware meets the minimum requirements:

  • Text Encoder: Requires at least 15 GB VRAM (loaded on CUDA:0).
  • FLUX.2 Model: Requires at least 20 GB VRAM (loaded on CUDA:1 or separate).
  • Total System: A multi-GPU setup, or a single GPU with 48 GB or more of VRAM (e.g., RTX A6000 or A100), is recommended.
  • Note: Using input images (Image-to-Image) will require additional VRAM beyond these minimums.
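To confirm your setup meets these minimums before loading anything, you can list each GPU's total VRAM with a short snippet (this assumes PyTorch is installed; it falls back to a message if not):

```python
# Sketch: report per-GPU VRAM so you can check the 15 GB / 20 GB minimums.
def vram_report():
    try:
        import torch
    except ImportError:
        return ["PyTorch not installed"]
    if not torch.cuda.is_available():
        return ["No CUDA devices detected"]
    lines = []
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        gib = props.total_memory / 1024 ** 3  # bytes -> GiB
        lines.append(f"cuda:{i} {props.name}: {gib:.1f} GiB")
    return lines

for line in vram_report():
    print(line)
```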

๐Ÿ› ๏ธ Installation & Setup

1. Create Environment

You must create your own Python environment (Conda or venv) and install dependencies suitable for your specific hardware (CUDA version, PyTorch, etc.). This repo does not provide an automatic installer.

Recommended packages include:

  • torch
  • diffusers
  • transformers
  • gradio
  • huggingface_hub
  • facexlib (if using specific tools)
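As a rough sketch, a venv-based setup might look like the following (the environment name is arbitrary, and you should pick the torch build that matches your CUDA version from pytorch.org):

```shell
# Example only — adjust for your OS, Python, and CUDA version.
python -m venv flux2-env
. flux2-env/bin/activate        # on Windows: flux2-env\Scripts\activate.bat
pip install torch diffusers transformers gradio huggingface_hub
```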

2. Configure Launch Script

The startup script must activate your Python environment before launching the UI.

  1. Open run_flux_fast.bat in a text editor.
  2. Locate the line:
    call "C:\path\to\anaconda3\Scripts\activate.bat" your_env_name
  3. Edit this line to point to your actual Conda/Python activation script and your created environment name.
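For example, if Miniconda lives at `D:\miniconda3` and you named your environment `flux2` (both hypothetical), the edited line would look like:

```bat
call "D:\miniconda3\Scripts\activate.bat" flux2
```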

🚀 Usage

  1. Double-click run_flux_fast.bat.
  2. The Gradio UI will launch in your browser.
  3. Click "Load Models (Start Worker)": This launches the separate worker process and loads the heavy models into VRAM.
  4. Generate: Enter your prompt and settings.
  5. Click "Unload Models (Kill Worker)": This completely kills the worker process, ensuring 100% of VRAM is freed immediately.
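The Load/Unload buttons above follow a decoupled-worker pattern: the heavy models live in a separate child process, so terminating that process returns all of its VRAM to the OS at once. A minimal sketch of the idea (script name and details are illustrative, not this repo's actual implementation):

```python
import subprocess
import sys

class WorkerHandle:
    """Owns the child process that holds the models in VRAM."""

    def __init__(self):
        self.proc = None

    def load(self, cmd):
        # Start the worker if it is not already running.
        if self.proc is None or self.proc.poll() is not None:
            self.proc = subprocess.Popen(cmd)
        return self.proc.pid

    def unload(self):
        # Killing the process frees all of its memory (including VRAM) at once.
        if self.proc is not None and self.proc.poll() is None:
            self.proc.kill()
            self.proc.wait()
        self.proc = None

# Usage (hypothetical worker script name):
# handle = WorkerHandle()
# handle.load([sys.executable, "flux_worker.py"])
# ...generate images via IPC with the worker...
# handle.unload()
```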

🔗 Model Info
