BUG in safe_divide #3

@JiatengLiu

Description

Hi, when I used the function safe_divide, I ran into a bug caused by an in-place operation. It seems to come from the following code:

a[(a < eps) & (a >= 0)] = eps
a[(a > -eps) & (a <= 0)] = -eps
b[(b < eps) & (b >= 0)] = eps
b[(b > -eps) & (b <= 0)] = -eps

I used torch.autograd.set_detect_anomaly(True) to see what was causing the problem, but got only this message:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: 
[torch.cuda.FloatTensor [684918, 3]], which is output 0 of IndexPutBackward0, is at version 2; expected version 0 instead. Hint: the 
backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or 
anywhere later. Good luck!
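For context, here is a minimal reproduction of this class of failure (my own example, not the safe_divide code): exp() saves its output for the backward pass, and mutating that saved tensor in place with a boolean-mask assignment bumps its version counter, so backward raises exactly this kind of RuntimeError.

```python
import torch

# exp()'s backward uses its saved output (grad = grad_out * exp(a)),
# so the output tensor c is tracked by autograd's version counter.
a = torch.tensor([-30.0, 1.0], requires_grad=True)
c = a.exp()          # c[0] is ~9.4e-14, well below the threshold

# In-place index_put on a tensor saved for backward: this increments
# c's version, invalidating the saved copy.
c[c < 1e-8] = 1e-8

try:
    c.sum().backward()
except RuntimeError as e:
    # "one of the variables needed for gradient computation has been
    # modified by an inplace operation ..."
    print("autograd error:", e)
```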

And I fixed it by replacing the in-place assignments with out-of-place torch.where calls:

a = torch.where((a < eps) & (a >= 0), torch.full_like(a, eps), a)
a = torch.where((a > -eps) & (a <= 0), torch.full_like(a, -eps), a)
b = torch.where((b < eps) & (b >= 0), torch.full_like(b, eps), b)
b = torch.where((b > -eps) & (b <= 0), torch.full_like(b, -eps), b)

(I use torch.full_like rather than torch.tensor(eps) so the replacement values match the dtype and device of the input, which matters here since the tensors are on CUDA.)

It seems to work; could you tell me whether this is correct?
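To make the fix concrete, here is a minimal sketch of what the patched safe_divide could look like (an assumption on my part: I am guessing the function just returns a / b after clamping both operands away from zero):

```python
import torch

def safe_divide(a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Divide a by b, clamping near-zero values away from zero.

    Sketch of the non-in-place fix: torch.where builds new tensors
    instead of mutating its inputs, so tensors saved by autograd for
    the backward pass keep their original version and backward() works.
    """
    a = torch.where((a < eps) & (a >= 0), torch.full_like(a, eps), a)
    a = torch.where((a > -eps) & (a <= 0), torch.full_like(a, -eps), a)
    b = torch.where((b < eps) & (b >= 0), torch.full_like(b, eps), b)
    b = torch.where((b > -eps) & (b <= 0), torch.full_like(b, -eps), b)
    return a / b
```

With this version, the earlier failure mode goes away: even when an operand is a tensor saved for backward (e.g. the output of exp()), nothing is modified in place, so gradients compute cleanly.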
