
One-hot encoding of the labels #1

@Xing-CHEN18

Description


Hi, thank you for the code!

I have a small question:
For the one-hot encoding of the labels, why are the encodings shifted by subtracting 0.5 (Ygood, Ybad)? In the accuracy test run every 10 epochs, the variable 'unq_oh' is the one-hot encoding without the 0.5 subtracted. Might this be a mistake?

I tested the code without subtracting 0.5 from the encodings, and it also works; the shift does not seem to affect the training accuracy.
Please let me know if I have misunderstood something, thank you!
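To illustrate the point, here is a minimal sketch (assuming NumPy-style arrays; the labels and class count are made up, and only the names Ygood/Ybad and unq_oh come from the repo) showing why an argmax-based accuracy check gives the same result with or without the 0.5 shift:

```python
import numpy as np

# Hypothetical integer labels and class count for illustration.
labels = np.array([0, 2, 1, 2])
num_classes = 3

# Plain one-hot encoding, like 'unq_oh' in the accuracy test.
onehot = np.eye(num_classes)[labels]

# Shifted encoding, like Ygood/Ybad: entries become +0.5 / -0.5.
shifted = onehot - 0.5

# argmax is invariant under adding a constant to every entry,
# so the predicted class (and hence the accuracy) is identical.
assert np.array_equal(onehot.argmax(axis=1), shifted.argmax(axis=1))
print(onehot.argmax(axis=1))  # → [0 2 1 2]
```

So even if the two encodings are mixed up between training and testing, any accuracy computed via argmax comparisons would be unaffected; the shift could still matter for the loss values during training, though.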

Best,
Xing
