Incorporate minibatch-training #11

Open
hsmaan wants to merge 1 commit into main from minibatch_Training

Conversation


@hsmaan hsmaan commented Jan 22, 2024

Currently, minibatch training is not incorporated because of limitations in the graph-based training setup.

The aims of this PR are to:

  • Enable minibatch training in the model
  • Test the stability of minibatch training, especially with the model's current hyperparameter setup
  • Once stability is ensured, incorporate explicit parallelization for multi-GPU training (DataParallel is already available in BaseTrainer, but DDP is not implemented)
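For context, the core of minibatch training is iterating over shuffled, fixed-size slices of the data rather than the full dataset at once. Below is a minimal, framework-agnostic sketch of such an iterator; it is illustrative only (the function name and signature are hypothetical, not part of this repository's BaseTrainer API):

```python
import random

def iter_minibatches(samples, batch_size, shuffle=True, seed=None):
    """Yield successive minibatches from a list of samples.

    Hypothetical helper for illustration; a real graph-based setup
    would additionally need to subsample or partition the graph so
    each minibatch carries the neighborhood structure it depends on.
    """
    indices = list(range(len(samples)))
    if shuffle:
        # Seedable RNG so runs are reproducible during stability testing
        random.Random(seed).shuffle(indices)
    for start in range(0, len(indices), batch_size):
        batch_idx = indices[start:start + batch_size]
        yield [samples[i] for i in batch_idx]

# Example: 10 samples with batch_size=4 yields batches of sizes 4, 4, 2
batches = list(iter_minibatches(list(range(10)), batch_size=4, seed=0))
```

A stability concern the PR alludes to: hyperparameters tuned for full-batch updates (e.g. learning rate) often need adjustment once gradients are estimated from minibatches, since the gradient noise scales with batch size.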

@hsmaan hsmaan added the enhancement New feature or request label Jan 22, 2024
@hsmaan hsmaan requested a review from subercui January 22, 2024 21:14
@hsmaan hsmaan mentioned this pull request Jan 22, 2024
