[Bug] Arguments passed to BoTorch optimize_acqf don't work (in an intuitive way) or don't make sense #2467

Open · esantorella opened this issue May 16, 2024 · 1 comment
Labels: bug (Something isn't working)

@esantorella (Contributor) commented May 16, 2024:

I wanted to use nonlinear inequality constraints, which seems like it should be doable because BoTorch's optimize_acqf supports them. However, Ax calls optimize_acqf after applying its input transforms, so arguments such as nonlinear_inequality_constraints and batch_initial_conditions operate in the transformed space rather than the original search space, causing surprising behavior.

Example:

import torch
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.utils.testing.core_stubs import get_branin_data, get_branin_experiment

# Intended constraint: x1^2 + x2^2 <= 3 (the `True` marks it as intra-point)
inequality_constraints = [(lambda x: 3 - (x**2).sum(), True)]

botorch_gen_step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {
            "optimizer_kwargs": {
                "nonlinear_inequality_constraints": inequality_constraints,
                "batch_initial_conditions": torch.ones(((1, 1, 2))),
            }
        }
    },
)

constrained_gs = GenerationStrategy(steps=[botorch_gen_step])

# Fit on one completed Branin trial, then generate (setup reconstructed
# for a runnable example; exact stub usage may differ):
experiment = get_branin_experiment(with_trial=True)
experiment.attach_data(get_branin_data(trials=[experiment.trials[0]]))
experiment.trials[0].mark_running(no_runner_required=True).mark_completed()
generator_run = constrained_gs.gen(experiment=experiment, n=1)

# {'x1': 10.0, 'x2': 15.0} -- does not obey the constraint
generator_run.arms[0].parameters
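
The corner point is no accident: Ax normalizes Branin's search space, x1 in [-5, 10] and x2 in [0, 15], to the unit cube before calling optimize_acqf, so the user-supplied torch.ones(1, 1, 2) is interpreted as the transformed upper corner. A minimal sketch of the unnormalization arithmetic (hypothetical helper, not Ax's transform code):

def unnormalize(x_unit, lower, upper):
    # Map a point in [0, 1]^d back to the original box bounds.
    return [lo + xi * (hi - lo) for xi, lo, hi in zip(x_unit, lower, upper)]

# The all-ones initial condition lands on Branin's upper corner:
print(unnormalize([1.0, 1.0], lower=[-5.0, 0.0], upper=[10.0, 15.0]))
# [10.0, 15.0] -- exactly the parameterization of the generated arm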

Suggested resolution:

I suggest not surfacing optimize_acqf arguments to the user, possibly with a few exceptions added as needed. Although some of these arguments can be helpful when constructed by Ax, almost all of them are problematic when passed by the user:

- Redundant with Ax: acq_function, bounds, q, and inequality_constraints.
- Nonsensical when used with Ax: return_best_only.
- Operating in the transformed space: nonlinear_inequality_constraints and batch_initial_conditions are nearly impossible to use correctly without a detailed understanding of what Ax does under the hood; users with that understanding might as well use BoTorch directly. (See the sketch below for what "correct" usage entails today.)
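
To make the constraint behave as intended today, a user would have to express it in the transformed space themselves, composing with the unnormalization by hand. A sketch under the assumption that Ax maps this search space to the unit cube (the helper below is hypothetical, not an Ax API):

import torch

# Branin bounds; assumed to be normalized to [0, 1]^2 by Ax's transforms.
LOWER = torch.tensor([-5.0, 0.0])
UPPER = torch.tensor([10.0, 15.0])

def constraint_in_transformed_space(x_unit):
    # Undo the normalization before evaluating the original-space
    # constraint 3 - (x1^2 + x2^2) >= 0.
    x = LOWER + x_unit * (UPPER - LOWER)
    return 3 - (x**2).sum()

nonlinear_inequality_constraints = [(constraint_in_transformed_space, True)]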

I think this can be achieved by not constructing opt_options here, and instead raising an error when optimizer_kwargs are present in model_gen_options:

opt_options = checked_cast(
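
A sketch of what that validation might look like (names hypothetical; this is not the actual Ax code path):

# Hypothetical allowlist; extend with vetted exceptions as needed.
ALLOWED_OPTIMIZER_KWARGS = set()

def validate_model_gen_options(model_gen_options):
    # Raise if raw optimize_acqf kwargs are passed through Ax.
    optimizer_kwargs = model_gen_options.get("optimizer_kwargs") or {}
    unsupported = set(optimizer_kwargs) - ALLOWED_OPTIMIZER_KWARGS
    if unsupported:
        raise ValueError(
            f"Arguments {sorted(unsupported)} to optimize_acqf are not "
            "supported through Ax: they are either constructed by Ax itself "
            "or operate in the transformed space."
        )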

@Balandat (Contributor) commented:
Similar considerations arise when passing arguments to the acquisition function constructors; see #2401. It would be great if we could automatically apply the transforms to the arguments that need them, but doing this in a generic fashion seems very challenging.
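
Illustrative only (not an Ax or BoTorch API): even if the only transform were unit-cube normalization, a generic pass would need per-argument rules, since optimizer kwargs differ in kind (tensors of points, constraint callables, plain options):

def transform_optimizer_kwarg(name, value, lower, upper):
    # Hypothetical dispatch; assumes the only transform is normalizing
    # the search space to the unit cube.
    if name == "batch_initial_conditions":  # tensor of points: normalize
        return (value - lower) / (upper - lower)
    if name == "nonlinear_inequality_constraints":  # callables: compose
        return [
            (lambda x, f=f: f(lower + x * (upper - lower)), intra)
            for f, intra in value
        ]
    return value  # scalars and plain options need no transform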
