
Fix multiple bugs in distributed utils, SpatialTransformer, and diffusion schedules#162

Open
Mr-Neutr0n wants to merge 1 commit into ali-vilab:main from Mr-Neutr0n:fix/multiple-bug-fixes

Conversation

@Mr-Neutr0n

Summary

  • DiffScatter missing forward method (utils/distributed.py): The DiffScatter class had two methods both named symbolic, so the second one overwrote the first, leaving no forward method. Renamed the second symbolic to forward.
  • reduce_dict missing parentheses (utils/distributed.py): keys = list(input_dict.keys) referenced the method object instead of calling it. Added parentheses: input_dict.keys().
  • SpatialTransformer proj_out swapped dims (tools/modules/unet/util.py): When use_linear=True, proj_out was initialized as nn.Linear(in_channels, inner_dim) but it should project from inner_dim back to in_channels. Fixed to nn.Linear(inner_dim, in_channels).
  • linear_schedule typo (tools/modules/diffusions/schedules.py): ast_beta = last_beta or scale * 0.02 assigned to a dead variable instead of updating last_beta. Fixed to last_beta = last_beta or scale * 0.02.
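The first bug is easy to reproduce in isolation: a Python class body that binds the same name twice silently keeps only the second definition, so the original `DiffScatter` ended up with a `symbolic` method and no `forward` at all. A minimal, torch-free sketch (class names here are illustrative, not from the repo):

```python
# Defining two methods with the same name in one class body is not an
# error in Python: the second binding silently replaces the first.
class Broken:
    def symbolic(self):        # first definition...
        return "graph export"

    def symbolic(self):        # ...is silently overwritten by this one,
        return "lost forward"  # leaving the class with no forward method

# The fix: keep symbolic for graph export and rename the second method
# to forward so the class actually implements the computation.
class Fixed:
    def symbolic(self):
        return "graph export"

    def forward(self):
        return "actual computation"
```

With `torch.autograd.Function` subclasses this failure mode is especially quiet, because `apply()` only fails at call time when no `forward` is found.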

Test plan

  • Verify DiffScatter.apply() works correctly in distributed settings
  • Verify reduce_dict works with OrderedDict inputs
  • Verify SpatialTransformer with use_linear=True produces correct output dimensions
  • Verify linear_schedule uses the computed default value for last_beta
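The `last_beta` check in the test plan can be exercised without torch. Below is a plain-Python sketch of just the default-value logic; the real `linear_schedule` in `tools/modules/diffusions/schedules.py` returns a tensor, and the `1000 / num_timesteps` scaling shown here is the common DDPM convention, assumed rather than quoted from the repo:

```python
def linear_schedule(num_timesteps, init_beta=None, last_beta=None):
    # Sketch of the fixed default handling only (assumed DDPM-style scaling).
    scale = 1000.0 / num_timesteps
    init_beta = init_beta or scale * 0.0001
    # Before the fix this line read `ast_beta = last_beta or scale * 0.02`,
    # so a last_beta of None was never replaced by the computed default.
    last_beta = last_beta or scale * 0.02
    step = (last_beta - init_beta) / max(num_timesteps - 1, 1)
    return [init_beta + i * step for i in range(num_timesteps)]
```

Calling it with `last_beta=None` now yields a schedule ending at the computed default instead of failing downstream.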

…sformer proj_out dims, linear_schedule typo

- Fix DiffScatter class having duplicate 'symbolic' method instead of 'forward'
- Fix reduce_dict missing parentheses on input_dict.keys call
- Fix SpatialTransformer proj_out Linear dimensions being swapped when use_linear=True
- Fix linear_schedule typo 'ast_beta' that should be 'last_beta'
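The `proj_out` fix is a shape round-trip property: with `use_linear=True` the transformer projects `in_channels -> inner_dim` on the way in, so the output projection must map `inner_dim -> in_channels`. A shape-only sketch with a hypothetical stand-in for `nn.Linear` (this `Linear` class is illustrative, not the repo's code; it only tracks feature dimensions the way `nn.Linear(in_features, out_features)` does):

```python
# Hypothetical shape-tracking stand-in for nn.Linear: checks the last
# dimension of an input shape and maps it to out_features.
class Linear:
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features

    def __call__(self, shape):
        assert shape[-1] == self.in_features, "feature dimension mismatch"
        return shape[:-1] + (self.out_features,)

in_channels, inner_dim = 320, 640
proj_in = Linear(in_channels, inner_dim)
proj_out = Linear(inner_dim, in_channels)  # fixed: arguments were swapped

x = (2, 77, in_channels)       # (batch, tokens, channels)
y = proj_out(proj_in(x))       # round-trips back to in_channels
```

With the swapped arguments, `proj_out` would assert on any tensor coming out of the transformer blocks, since their last dimension is `inner_dim`, not `in_channels`.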
@CLAassistant

CLAassistant commented Feb 11, 2026

CLA assistant check
All committers have signed the CLA.


2 participants