[optim] add missing 'maximize' parameter to LBFGS, NAdam and RAdam optimizers #126642
Labels
actionable
module: loss
Problem is related to loss function
module: optimizer
Related to torch.optim
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🚀 The feature, motivation and pitch
Currently, the LBFGS, NAdam and RAdam optimizers do not have the maximize parameter in their constructors, which means they cannot be used with some loss functions (like R2Score).

Alternatives
In the case maximize can't be implemented due to a constraint of the optimizer, it would be nice to make it clear that maximize is intentionally not supported. Some possible suggestions:

Additional context
No response
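To illustrate what the requested parameter does, here is a minimal, self-contained sketch (an assumption about the typical semantics, not PyTorch's actual implementation): optimizers that support `maximize` simply negate the gradient in their update step, turning descent into ascent, which is why a score-style metric like R2Score could be optimized directly instead of negating it by hand.

```python
# Minimal sketch (hypothetical, not torch's real code) of how a
# `maximize` flag usually works inside an optimizer step: when set,
# the gradient is negated, so gradient descent becomes ascent.
def sgd_step(params, grads, lr=0.1, maximize=False):
    out = []
    for p, g in zip(params, grads):
        if maximize:
            g = -g  # ascend the objective instead of descending it
        out.append(p - lr * g)
    return out

# Maximizing f(x) = -(x - 3)^2, whose gradient is -2 * (x - 3).
# With maximize=True the iterates climb toward the peak at x = 3.
x = [0.0]
for _ in range(100):
    grad = [-2.0 * (x[0] - 3.0)]
    x = sgd_step(x, grad, lr=0.1, maximize=True)
print(round(x[0], 3))  # converges to 3.0
```

With `maximize=False` the same loop would descend away from the peak; that one sign flip is the entire feature being requested for LBFGS, NAdam and RAdam.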
cc @vincentqb @jbschlosser @albanD @janeyx99 @crcrpar