About the Generator and Discriminator #8

@Fly2flies


Hello,
I'm new to PyTorch and GANs. Thank you for sharing such a good implementation. However, I'm confused about your training procedure for the generator (G) and discriminator (D). To the best of my knowledge, D should try to identify the different modalities while G does the opposite, and the two should be trained separately and alternately. But in your code:

if epoch % K:
    loss_total = loss_G - gamma * loss_D
else:
    loss_total = nu * (gamma * loss_D - loss_G)

loss_total.backward()
optimizer.step()

Here G and D are trained simultaneously, which causes the loss of D to become very high while G is being trained, and then the training of G becomes unstable because of D's large loss when it is D's turn.
When I try to apply the framework to other work, the loss quickly becomes NaN, so training cannot continue. Is there something wrong with my understanding, or with the code?
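For reference, this is a minimal sketch of the alternating scheme I had in mind, with separate optimizers for G and D updated in turn. The modules, the dummy batch, and the gamma value below are only placeholders I made up for illustration, not taken from your repository:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins; the real G and D would come from the repository's model definitions.
G = nn.Linear(128, 128)   # "generator" / feature projector (hypothetical)
D = nn.Linear(128, 2)     # discriminator over two modalities (hypothetical)

opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()
gamma = 1.0               # illustrative weight, not the repository's value

for step in range(100):
    # Dummy batch: random features with a modality label (0 or 1).
    features = torch.randn(32, 128)
    modality = torch.randint(0, 2, (32,))

    # --- Discriminator turn: learn to identify the modality ---
    opt_D.zero_grad()
    loss_D = criterion(D(G(features).detach()), modality)  # detach so G is not updated
    loss_D.backward()
    opt_D.step()

    # --- Generator turn: confuse the discriminator ---
    opt_G.zero_grad()
    loss_G = -gamma * criterion(D(G(features)), modality)  # negated (adversarial) loss
    loss_G.backward()
    opt_G.step()

With this separation, D's loss stays bounded while G is updated, which is why I expected it to be more stable than optimizing one combined loss_total with a single optimizer.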
Another question: is adding L2 regularization necessary to avoid overfitting?
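(By L2 regularization I mean the usual weight-decay term; in PyTorch one way to add it is the optimizer's weight_decay argument. The model and coefficient below are just an illustration on my part:)

import torch
import torch.nn as nn

# L2 regularization via the optimizer's weight_decay argument.
# The 1e-4 coefficient is only an illustrative value, not one from the repository.
model = nn.Linear(128, 64)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-4)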
Looking forward to your reply.

Thanks.
