Question regarding convergence of integer optimization #542

Open
yolking opened this issue Jan 17, 2025 · 1 comment

yolking commented Jan 17, 2025

Hello! I am using the pre-release version because I am interested in integer optimization. So far, on the one hand, it gets me close to the global optimum pretty fast.

from itertools import count

from bayes_opt import BayesianOptimization, SequentialDomainReductionTransformer

# Integer parameters: each bound is (lower, upper, int)
pbounds = {
    "x1": (1, 1000, int),
    "x2": (1, 1000, int),
    "x3": (1, 1000, int),
}
optimizer = BayesianOptimization(
    f=opt_func,
    pbounds=pbounds,
    verbose=2,
)

best_target_ind = 0
best_target = -1000000000000
for i in count():
    next_point = optimizer.suggest()
    target = opt_func(**next_point)
    optimizer.register(params=next_point, target=target)
    if target > best_target:
        best_target = target
        best_target_ind = i
    # stop after 150 iterations without improvement
    if i - best_target_ind > 150:
        print(f"\t{next_point}")
        break
print("Best result:")
print(optimizer.max)

But BO seems to make very little effort to improve the found optimum and keeps sampling far away from the best point. I checked the docs for possible ways to make it search closer to the found optimum. SequentialDomainReductionTransformer may have been one of them, but it isn't currently available for integer optimization. Changing the acquisition function after N unsuccessful iterations to
acquisition.ExpectedImprovement(xi=0.0)
seems like another possible solution, but it doesn't seem to do much.
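For reference, here is roughly how I tried that switch, as a minimal sketch (I am assuming the pre-release still accepts an acquisition_function argument in the constructor and that replaying the observed points through optimizer.res is the right way to carry them over):

from bayes_opt import BayesianOptimization, acquisition

# Build a second optimizer with a purely exploitative Expected Improvement
# (xi=0.0), replay everything observed so far, and then keep calling
# exploit_optimizer.suggest() instead of optimizer.suggest().
exploit_optimizer = BayesianOptimization(
    f=opt_func,
    pbounds=pbounds,
    acquisition_function=acquisition.ExpectedImprovement(xi=0.0),
    verbose=0,
)
for res in optimizer.res:
    exploit_optimizer.register(params=res["params"], target=res["target"])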
Here is a chunk of the log of my optimization. The best optimum, 0.5739394489614946 at [129, 2, 740], was found on evaluation 270, and after that no better point was found for 150 iterations. The unreached global optimum is 0.5751567341679235 at [128, 2, 711].

  1. 0.5739394489614946 : ( 129, 2, 740)
  2. 0.562122923291349 : ( 141, 433, 755)
  3. 0.5637633205086482 : ( 133, 591, 706)
  4. 0.5671965653573532 : ( 137, 75, 682)
  5. 0.5648119388511881 : ( 135, 384, 718)
  6. 0.5556753869790383 : ( 168, 94, 804)
  7. 0.5391487427464875 : ( 201, 750, 693)
  8. 0.5620342532315347 : ( 135, 575, 757)
  9. 0.5094327951926114 : ( 441, 6, 710)
  10. 0.49312577749028136 : ( 325, 854, 978)
  11. 0.5717251849734983 : ( 129, 45, 700)
  12. 0.4993160246208407 : ( 389, 388, 934)
  13. 0.5616862869471217 : ( 112, 19, 756)
  14. 0.5668709566752652 : ( 141, 94, 728)
  15. 0.5412575931496266 : ( 120, 235, 856)
  16. 0.5260654968073734 : ( 293, 224, 772)
  17. 0.5607839319758643 : ( 164, 1, 808)
  18. 0.5570700592187176 : ( 154, 426, 727)
  19. 0.5679625492297719 : ( 132, 2, 789)
  20. 0.4135498252513754 : (1000, 646, 641)
  21. 0.5578181264865069 : ( 145, 265, 795)
  22. 0.4923969331624663 : ( 559, 1, 709)
  23. 0.5713064661872675 : ( 132, 29, 701)
  24. 0.5496693450599132 : ( 127, 287, 581)
  25. 0.5608862509741278 : ( 134, 475, 781)
  26. 0.5653586598894028 : ( 123, 395, 676)
  27. 0.4899158653866104 : ( 399, 95, 999)
  28. 0.5738924692628588 : ( 125, 1, 699)
  29. 0.5649884129475095 : ( 150, 6, 754)
  30. 0.5403166987731776 : ( 198, 545, 821)
  31. 0.5048640345902715 : ( 481, 590, 863)
  32. 0.5152179966179371 : ( 420, 8, 849)
  33. 0.5695567422933128 : ( 122, 76, 700)
  34. 0.5526048705799629 : ( 153, 687, 682)
  35. 0.551958875136273 : ( 189, 3, 680)
  36. 0.5220407644276102 : ( 329, 136, 877)
  37. 0.5624521509865348 : ( 139, 479, 704)
  38. 0.5624314035303094 : ( 133, 123, 790)
  39. 0.5726957026706154 : ( 132, 3, 743)
  40. 0.5574865183309355 : ( 147, 601, 733)
  41. -0.003711854927608495 : ( 991, 338, 14)
  42. 0.5717020699778077 : ( 133, 22, 703)
  43. 0.5659367577295359 : ( 135, 5, 633)
  44. 0.5712659572231308 : ( 125, 29, 678)
  45. 0.5639185645465573 : ( 146, 2, 798)
  46. 0.5677979749663973 : ( 119, 70, 674)
  47. 0.5721542636991069 : ( 129, 32, 692)
  48. 0.5728171813402892 : ( 124, 19, 694)
  49. 0.5727085645868546 : ( 137, 2, 727)
  50. 0.5733057101998734 : ( 128, 20, 693)
  51. 0.5622654692128538 : ( 117, 186, 651)
  52. 0.5730132343477814 : ( 132, 7, 716)
  53. 0.5713460763264467 : ( 131, 23, 684)
  54. 0.5722638791551792 : ( 122, 2, 741)
  55. 0.4259971144997654 : ( 997, 265, 991)
  56. 0.5712120332543718 : ( 139, 6, 709)
  57. 0.5669250140427018 : ( 139, 91, 716)
  58. 0.4380931357144683 : ( 543, 1, 495)
  59. 0.5702820490066987 : ( 136, 39, 720)
  60. 0.5714631035555393 : ( 119, 1, 726)
  61. 0.48784666007730465 : ( 539, 827, 739)
  62. 0.5711345693363578 : ( 132, 27, 693)
  63. 0.5640471909903313 : ( 134, 229, 767)
  64. 0.5713302396633932 : ( 136, 20, 718)
  65. 0.5705349537264771 : ( 121, 29, 674)
  66. 0.5685407764156462 : ( 122, 10, 769)
  67. 0.5709470591129177 : ( 130, 31, 748)
  68. 0.5717805631344037 : ( 127, 37, 711)
  69. 0.5706775254772627 : ( 133, 1, 767)
  70. 0.5654006178672313 : ( 127, 161, 760)
  71. 0.5696959478895336 : ( 115, 2, 706)
  72. 0.56588772894393 : ( 121, 203, 728)
Is this expected behavior? Other algorithms like DE and CMA-ES become much more focused on converging around the optimal point after some time. Is it currently possible, after, say, 100 unsuccessful iterations, to search for the optimum close to the best point found so far? Modifying the bounds by hand, or restarting with smaller bounds, is probably one possible solution.
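By "modifying bounds by hand" I mean something like the following sketch, which shrinks the search box around the best parameters found so far (I am assuming set_bounds accepts the same (low, high, int) triples as pbounds in the pre-release; the window size is arbitrary):

# Shrink the search box to a window around the best parameters found so far.
best_params = optimizer.max["params"]
window = 50  # half-width of the new box, chosen arbitrarily
new_bounds = {
    name: (max(1, int(value) - window), min(1000, int(value) + window), int)
    for name, value in best_params.items()
}
optimizer.set_bounds(new_bounds=new_bounds)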

till-m (Member) commented Jan 17, 2025

Hey @yolking,

would it be possible for you to post a complete snippet, including the function you're optimizing, so that I can run the code and check for myself?
