
[Ray] Add Support for Disaggregating VAE and DiT #422

Merged
feifeibear merged 5 commits into xdit-project:main from lihuahua123:main
Jan 11, 2025

Conversation

@lihuahua123 (Contributor) commented Jan 5, 2025

This PR introduces support for disaggregating the VAE and DiT using Ray.
We introduce two new arguments, `--vae_parallel_size` and `--dit_parallel_size`. When `vae_parallel_size` is set to 0, the VAE and DiT remain combined and are not separated. If `vae_parallel_size` is set to 1 or greater, the VAE is placed in a separate group of parallel Ray workers, enabling it to run with independent parallelism.

```
ray_world_size = vae_parallel_size + dit_parallel_size
dit_parallel_size = (data_parallel_degree *
                     classifier_free_guidance_degree *
                     sequence_parallel_degree *
                     pipeline_parallel_degree *
                     tensor_parallel_degree)
```
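The arithmetic above can be sketched as a small helper; the function and parameter names below are illustrative, not part of the xDiT API.

```python
# Sketch of the world-size arithmetic described in this PR.
# Names (dp, cfg, sp, pp, tp) are hypothetical shorthands for the
# data/CFG/sequence/pipeline/tensor parallelism degrees.

def dit_parallel_size(dp: int, cfg: int, sp: int, pp: int, tp: int) -> int:
    """DiT workers = product of all DiT parallelism degrees."""
    return dp * cfg * sp * pp * tp

def ray_world_size(vae_parallel_size: int,
                   dp: int, cfg: int, sp: int, pp: int, tp: int) -> int:
    """Total Ray workers = dedicated VAE workers + DiT workers."""
    return vae_parallel_size + dit_parallel_size(dp, cfg, sp, pp, tp)

# Example: 2-way sequence-parallel DiT plus one dedicated VAE worker.
print(ray_world_size(1, 1, 1, 2, 1, 1))  # → 3
```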

Currently, only FluxPipeline is supported for testing; support for additional pipelines is in progress.
Separate loading of the VAE and DiT is still underway; for now, both modules are fully loaded within each Ray worker.
The latest DistVAE is required.
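As a minimal sketch of the placement rule described above (0 means combined, 1 or more means a dedicated VAE worker group), assuming a hypothetical `plan_workers` helper that is not part of the actual xDiT code:

```python
# Illustrative only: how vae_parallel_size selects between the combined
# and disaggregated modes described in this PR.

def plan_workers(vae_parallel_size: int, dit_parallel_size: int) -> dict:
    if vae_parallel_size == 0:
        # Combined mode: the VAE runs inside the DiT workers,
        # no dedicated VAE workers are spawned.
        return {"dit_workers": dit_parallel_size,
                "vae_workers": 0,
                "disaggregated": False}
    # Disaggregated mode: the VAE gets its own parallel Ray worker group.
    return {"dit_workers": dit_parallel_size,
            "vae_workers": vae_parallel_size,
            "disaggregated": True}

print(plan_workers(0, 2))  # combined: VAE stays with the 2 DiT workers
print(plan_workers(2, 2))  # disaggregated: 2 DiT workers + 2 VAE workers
```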

| Configuration | Average Time (seconds) |
| --- | --- |
| 2 DiT workers, 0 VAE workers | 11.41 |
| 2 DiT workers, 1 VAE worker | 11.53 |
| 2 DiT workers, 2 VAE workers | 11.50 |
Each configuration was tested three times.

@lihuahua123 lihuahua123 marked this pull request as ready for review January 10, 2025 08:33
@lihuahua123 lihuahua123 changed the title [WIP] Disaggregate the vae and dit [Ray] Add Support for Disaggregating VAE and DiT Jan 10, 2025
@feifeibear feifeibear merged commit 055b41f into xdit-project:main Jan 11, 2025
