Fix BucketBatchSampler cache alignment in DreamBooth scripts #13353
azolotenkov wants to merge 5 commits into huggingface:main from
Conversation
Pull request overview
Fixes cache misalignment in DreamBooth training examples by making BucketBatchSampler yield batches in a stable, precomputed order across __iter__() calls, preventing reshuffles that break step-indexed latent/prompt-embedding caches.
Changes:
- Removed `random.shuffle(self.batches)` from `BucketBatchSampler.__iter__()` in multiple DreamBooth example scripts.
- Added an explanatory comment describing why batch order must remain stable for cache alignment.
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 5 comments.
Show a summary per file
| File | Description |
|---|---|
| examples/dreambooth/train_dreambooth_lora_flux2.py | Keep precomputed batch order stable in BucketBatchSampler.__iter__() to avoid cache misalignment. |
| examples/dreambooth/train_dreambooth_lora_flux2_img2img.py | Same sampler iteration stabilization for cache alignment. |
| examples/dreambooth/train_dreambooth_lora_flux2_klein.py | Same sampler iteration stabilization for cache alignment. |
| examples/dreambooth/train_dreambooth_lora_flux2_klein_img2img.py | Same sampler iteration stabilization for cache alignment. |
| examples/dreambooth/train_dreambooth_lora_z_image.py | Same sampler iteration stabilization for cache alignment. |
Updated based on review: the sampler now shuffles precomputed batches once at construction time, while keeping iteration order fixed across epochs for cache alignment.
good catch @azolotenkov!
@linoytsaban yes, that would be the most appropriate!
Updated. I kept the per-epoch reshuffle in the normal path, and only switch to a fixed precomputed batch order when the script relies on step-indexed caches. I scoped that to
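That conditional behavior could look like the following sketch. This is an illustration only, and the `keep_batches_fixed` flag name is hypothetical, not the scripts' actual parameter:

```python
import random


class BucketBatchSampler:
    """Sketch: per-epoch reshuffle by default, fixed order when caching.

    The `keep_batches_fixed` flag is a hypothetical name for illustration.
    """

    def __init__(self, batches, keep_batches_fixed=False, seed=0):
        self.batches = list(batches)
        self.keep_batches_fixed = keep_batches_fixed
        self._rng = random.Random(seed)
        if keep_batches_fixed:
            # Shuffle once at construction; the order then stays stable,
            # so step-indexed latent/embedding caches line up every epoch.
            self._rng.shuffle(self.batches)

    def __iter__(self):
        if not self.keep_batches_fixed:
            # Normal path: reshuffle every epoch for better mixing.
            self._rng.shuffle(self.batches)
        return iter(self.batches)

    def __len__(self):
        return len(self.batches)
```

With `keep_batches_fixed=True`, repeated iteration yields an identical batch sequence; with the default, each epoch sees a fresh permutation of the same batches.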
What does this PR do?
This PR fixes a bug where `BucketBatchSampler` reshuffled precomputed batches on each `__iter__()` call. DreamBooth training scripts precompute latents and/or prompt embeddings and later consume them by dataloader step index, so changing batch order between the caching pass and the training pass can misalign the cached tensors with the current batch.
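The misalignment can be shown with a toy sketch, with plain lists standing in for cached tensors (illustration only, not code from the scripts):

```python
# Caching pass: per-step tensors are stored keyed by dataloader step index.
batches_cache_pass = [[0, 1], [2, 3], [4, 5]]
cache = {step: batch for step, batch in enumerate(batches_cache_pass)}

# If __iter__() reshuffles, the training pass sees a different order:
batches_train_pass = [[2, 3], [4, 5], [0, 1]]

# Step 0 now trains on batch [2, 3] but reads cache[0], which was
# computed for batch [0, 1] -> every cached entry is misaligned.
aligned = [cache[step] == batch for step, batch in enumerate(batches_train_pass)]
print(aligned)  # [False, False, False]
```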
This PR removes the per-iteration `random.shuffle(self.batches)` call from `BucketBatchSampler.__iter__()` and instead shuffles the precomputed batch list once at sampler construction time, keeping the batch order fixed across epochs. A future rework may still be needed to support true epoch-wise reshuffling, since batch membership is currently fixed once at sampler construction time.
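A minimal sketch of that fixed behavior follows; the bucket and constructor shapes are assumptions for illustration, not the scripts' exact code:

```python
import random


class BucketBatchSampler:
    """Sketch: group indices into resolution buckets, shuffle the batch
    list once at construction, then iterate in that fixed order."""

    def __init__(self, buckets, batch_size, seed=0):
        # buckets: list of index lists, one per resolution bucket (assumed shape).
        self.batches = []
        for bucket in buckets:
            for i in range(0, len(bucket), batch_size):
                self.batches.append(bucket[i : i + batch_size])
        # Shuffle ONCE here instead of inside __iter__(), so the order
        # seen by the caching pass matches the training pass.
        random.Random(seed).shuffle(self.batches)

    def __iter__(self):
        # No reshuffle: step-indexed latent/prompt-embedding caches stay aligned.
        return iter(self.batches)

    def __len__(self):
        return len(self.batches)
```

Because `__iter__()` only returns the precomputed list, every epoch replays the same batch sequence that the caching pass saw.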
Applied to:
- examples/dreambooth/train_dreambooth_lora_flux2.py
- examples/dreambooth/train_dreambooth_lora_flux2_img2img.py
- examples/dreambooth/train_dreambooth_lora_flux2_klein.py
- examples/dreambooth/train_dreambooth_lora_flux2_klein_img2img.py
- examples/dreambooth/train_dreambooth_lora_z_image.py

Before submitting
documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Training examples: @sayakpaul