Conversation
- add optional argument `onehot` (default True) to indicate net type
- change loss function to MultiLabelSoftMarginLoss() if not `onehot`
- add argument to get_dataloader() to change y datatype if not `onehot`
- change predict() function to output y vectors if not `onehot`
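A minimal sketch of how the `onehot` flag could switch the loss function, as described in the list above. The class and layer sizes are illustrative, not the actual learner implementation:

```python
import torch.nn as nn

# Hypothetical learner sketch: only the `onehot` flag and the loss
# selection correspond to the change described in the PR.
class Learner:
    def __init__(self, n_input, n_classes, onehot=True):
        self.onehot = onehot
        self.net = nn.Sequential(
            nn.Linear(n_input, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )
        if onehot:
            # One-hot targets: standard multi-class classification.
            self.loss_fn = nn.CrossEntropyLoss()
        else:
            # Multi-label targets: each output is an independent 0/1 decision.
            self.loss_fn = nn.MultiLabelSoftMarginLoss()
```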
So I was running this and got a runtime warning. It's still running so I can't confirm, but I think the issue might be here: unlike the one-hot case, I need to wrap a sigmoid around the net outputs to get 0-1 values:
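The point above can be sketched as follows; the `predict` signature and threshold are assumptions for illustration, with the net returning raw logits:

```python
import torch

def predict(net, x, onehot=True):
    # Illustrative predict sketch; `net` is assumed to return raw logits.
    with torch.no_grad():
        logits = net(x)
    if onehot:
        # One-hot case: pick the single most likely class.
        return torch.argmax(logits, dim=-1)
    # Multi-label case: wrap a sigmoid around the net outputs,
    # then threshold to get independent 0/1 labels per component.
    return (torch.sigmoid(logits) > 0.5).int()
```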
I think it is just a warning. A solution could be this little trick, which needs to be adapted to the vectorized case. Another solution is to silence that warning inside the function call. In any case it should not affect the output of your code.
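One way to silence the warning locally, since the exact warning is not shown here, is the standard-library `warnings` context manager; this is a generic sketch, not the specific trick referred to above:

```python
import warnings

def call_silenced(fn, *args, **kwargs):
    # Suppress RuntimeWarning only around this single call, so the
    # rest of the run still reports warnings; the result is unchanged.
    with warnings.catch_warnings():
        warnings.simplefilter("ignore", category=RuntimeWarning)
        return fn(*args, **kwargs)
```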
Add option to have multiple labels in the PyTorch learner. This is helpful when the strategy for the integer variables corresponds to the actual integer variables vector. (The same holds for active constraints.)
This PR also adds `n_layers` as a parameter.

TODO
- `onehot` flag to initialize learner. This also adapts the learner cost function.
- `onehot` encoding or the multiple labels for active constraints and integer variables.
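The two target encodings in the TODO can be contrasted with a small sketch; the sample strategy vectors are made up for illustration:

```python
import torch

# Binary integer-variable (or active-constraint) vectors, one per sample.
strategies = [[1, 0, 1], [0, 1, 1]]

# Multi-label targets: use the vectors directly, as floats.
y_multilabel = torch.tensor(strategies, dtype=torch.float32)

# One-hot targets: each distinct strategy vector becomes one class index.
unique = {tuple(s): i for i, s in enumerate(strategies)}
labels = torch.tensor([unique[tuple(s)] for s in strategies])
y_onehot = torch.nn.functional.one_hot(labels, num_classes=len(unique))
```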