Description
When attempting to set up ToonComposer on Windows 11, installation fails because the repository tries to install flash-attn, which is not supported on Windows.
Steps I followed:
Cloned the repo:
git clone https://github.com/TencentARC/ToonComposer.git
cd ToonComposer
Created a clean environment:
conda create -n tooncomposer python=3.10
conda activate tooncomposer
Installed dependencies:
pip install -r requirements.txt
Ran:
python app.py
Observed Behavior
During execution, ToonComposer tries to automatically install flash-attn:
subprocess.check_call([sys.executable, "-m", "pip", "install", "flash_attn==2.8.2", "--no-build-isolation"])
This fails on Windows, since flash-attn only supports Linux with specific CUDA builds.
Even after cleaning requirements, some modules (dit.py, image_encoder.py, util/env_resolver.py) hard-import or attempt to install flash_attn.
Expected Behavior
ToonComposer should be able to run without flash-attn (falling back to PyTorch’s scaled_dot_product_attention), since flash-attn is only an optimization.
The code should not force an installation on platforms where flash-attn is unavailable.
Suggested Fix
Wrap flash-attn imports in try/except and provide a safe fallback with torch.nn.functional.scaled_dot_product_attention.
Remove/disable automatic installation logic from env_resolver.py.
Provide an official Windows setup guide (requirements without flash-attn and deepspeed).
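The first suggestion could look roughly like this, assuming a single attention helper shared by the call sites in dit.py and image_encoder.py (the `attention` function and tensor layout here are illustrative, not the repo's actual code):

```python
import torch
import torch.nn.functional as F

# Import flash-attn if present; otherwise fall back to PyTorch's built-in
# scaled_dot_product_attention (available since PyTorch 2.0).
try:
    from flash_attn import flash_attn_func
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False

def attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """q, k, v: (batch, heads, seq_len, head_dim) tensors."""
    if HAS_FLASH_ATTN:
        # flash_attn_func expects (batch, seq_len, heads, head_dim),
        # so transpose in and out of that layout.
        out = flash_attn_func(
            q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
        )
        return out.transpose(1, 2)
    # Portable fallback that works on Windows, macOS, and CPU-only setups.
    return F.scaled_dot_product_attention(q, k, v)
```

Since flash-attn is purely an optimization, the fallback path produces the same attention output, just without the fused kernel's speed and memory savings.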
Additional Notes
This issue blocks ToonComposer from running on Windows 11 unless the source code is manually edited to bypass flash-attn.
Other repos (e.g. Hugging Face diffusers, PyTorch Lightning projects) typically handle optional dependencies gracefully — ToonComposer could adopt a similar approach.
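A common pattern in those projects is to probe for the optional package without importing it, e.g. with a small helper like this (a sketch, not taken from either library):

```python
import importlib.util

def is_flash_attn_available() -> bool:
    # find_spec locates the package without executing its import,
    # so a missing or broken flash-attn install never raises here.
    return importlib.util.find_spec("flash_attn") is not None
```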
Please consider releasing an update with fewer hard dependencies, using a fallback equivalent for flash-attn (and for DeepSpeed), so that ToonComposer becomes usable on Windows.
It would also be great to have a version of, or tutorial for, running ToonComposer with ComfyUI. Thanks!