NotUniqueError encountered for constrained problems. #461

Closed
iffanh opened this issue Feb 27, 2024 · 4 comments

@iffanh

iffanh commented Feb 27, 2024

Describe the bug
Using bayesian-optimization==1.4.3, I'd like to run a constrained optimization problem. However, I encounter bayes_opt.util.NotUniqueError even though I have set allow_duplicate_points=True.

I checked bayesian_optimization.py and observed that allow_duplicate_points=True is not passed through for constrained problems. Adding this argument to the TargetSpace constructor call (line 149 in bayesian_optimization.py) resulted in a cycling effect, with no visible progress on the iterates. See the output below:

|   iter    |  target   |  allowed  |     x     |     y     |
-------------------------------------------------------------
| 4         | 0.9135    | True      | 2.195     | -1.897    |
| 7         | 1.524     | True      | 2.017     | 2.993     |
| 15        | 1.541     | True      | 2.0       | 2.88      |
| 16        | 1.556     | True      | 2.0       | -3.0      |
Data point [ 2. -3.] is not unique. 1 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 2 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 3 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 4 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 5 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 6 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 7 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 8 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 9 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 10 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 11 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 12 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 13 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 14 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 15 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 16 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 17 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 18 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 19 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 20 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 21 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 22 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 23 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 24 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 25 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 26 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 27 duplicates registered. Continuing ...
Data point [2. 3.] is not unique. 28 duplicates registered. Continuing ...
Data point [ 2. -3.] is not unique. 29 duplicates registered. Continuing ...

To Reproduce
To reproduce, I used one of the examples given in the repository and changed n_iter to 100.

Example:

import numpy as np
from bayes_opt import BayesianOptimization
from scipy.optimize import NonlinearConstraint


def target_function(x, y):
    # Gardner is looking for the minimum, but this package looks for maxima, thus the sign switch
    return np.cos(2*x)*np.cos(y) + np.sin(x)

def constraint_function_2_dim(x, y):
    return np.array([
        - np.cos(x) * np.cos(y) + np.sin(x) * np.sin(y),
        - np.cos(x) * np.cos(-y) + np.sin(x) * np.sin(-y)])

# Bounded region of parameter space
pbounds = {'x': (2, 4), 'y': (-3, 3)}

constraint_lower = np.array([-np.inf, -np.inf])
constraint_upper = np.array([0.6, 0.6])

constraint = NonlinearConstraint(constraint_function_2_dim, constraint_lower, constraint_upper)
optimizer = BayesianOptimization(
    f=target_function,
    constraint=constraint,
    pbounds=pbounds,
    allow_duplicate_points=True,
    verbose=1, # verbose = 1 prints only when a maximum is observed, verbose = 0 is silent
    random_state=1,
)

optimizer.maximize(
    init_points=2,
    n_iter=100 ## I changed this one
)

Expected behavior
The iterates should converge to the maximum, regardless of whether duplicate points are allowed.

Environment (please complete the following information):

  • OS: Ubuntu
  • python Version 3.8.10
  • numpy Version 1.24.4
  • scipy Version 1.10.1
  • bayesian-optimization Version 1.4.3
@till-m
Member

till-m commented Feb 27, 2024

Could you maybe check if this was fixed with #437? I think we haven't made a release since that PR, but you can install from master here.

E: If I understand correctly, you implemented exactly what the PR does. Could it be that you have simply found the maximum?

Sorry, I am on my phone, so I can't run anything currently.
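E2: to install from master, something along the lines of pip install --upgrade git+https://github.com/bayesian-optimization/BayesianOptimization.git should do it (I'm writing the URL from memory on my phone, so please double-check it against this repository).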

@iffanh
Author

iffanh commented Feb 27, 2024

Hi, I checked, and the fix in #437 is exactly what I had tested in my local environment.
Ideally, I would want the algorithm itself to detect that such a maximum has been found and terminate the program.

Perhaps I should raise another ticket for this?

@till-m
Member

till-m commented Feb 28, 2024

Something like this has also been raised in #381. I'm still of the opinion I expressed there, i.e. that .maximize should be kept simple and that we should leave these things to the user (see #373 for how to do this). OTOH I know that other people disagree with me here and I would be open to adding it, generally speaking.
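For concreteness, the user-side version I have in mind looks roughly like the sketch below: drive the optimizer manually with suggest/register and stop once the best allowed target stops improving. This is an untested sketch written from memory, so treat the details (in particular the constraint_value keyword of register and the behaviour of optimizer.max while no allowed point has been found yet) as things to double-check against the released code.

import numpy as np
from scipy.optimize import NonlinearConstraint
from bayes_opt import BayesianOptimization, UtilityFunction

def target_function(x, y):
    return np.cos(2*x)*np.cos(y) + np.sin(x)

def constraint_function_2_dim(x, y):
    return np.array([
        - np.cos(x) * np.cos(y) + np.sin(x) * np.sin(y),
        - np.cos(x) * np.cos(-y) + np.sin(x) * np.sin(-y)])

constraint = NonlinearConstraint(constraint_function_2_dim,
                                 np.array([-np.inf, -np.inf]),
                                 np.array([0.6, 0.6]))

optimizer = BayesianOptimization(
    f=None,  # we evaluate the target ourselves in the loop below
    constraint=constraint,
    pbounds={'x': (2, 4), 'y': (-3, 3)},
    allow_duplicate_points=True,
    verbose=0,
    random_state=1,
)
acq = UtilityFunction(kind="ucb")

best, stall = -np.inf, 0
for i in range(100):
    next_point = optimizer.suggest(acq)
    target = target_function(**next_point)
    # For constrained spaces register() also expects the constraint value;
    # the keyword name below is my recollection of the current source.
    optimizer.register(params=next_point, target=target,
                       constraint_value=constraint_function_2_dim(**next_point))
    current = optimizer.max  # best allowed point so far, or None if none allowed yet
    if current is not None and current["target"] > best + 1e-6:
        best, stall = current["target"], 0
    else:
        stall += 1
    if stall >= 10:  # user-chosen stagnation criterion
        print(f"No improvement for {stall} iterations, stopping at iteration {i}.")
        break

print(optimizer.max)

The stagnation threshold (10 iterations, 1e-6 improvement) is of course an arbitrary user choice.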

@iffanh
Author

iffanh commented Feb 28, 2024

I could see such automatic detection become very useful for problems with expensive evaluation.

Anyway, I'll close this ticket as it will be fixed in the next release (I presume). Thanks for answering!

iffanh closed this as completed on Feb 28, 2024