It would be very useful if Hyperactive had the ability to save the optimization backend (via pickle, dill, cloudpickle, ...) to disk and load it back into Hyperactive later to continue the optimization run.
The goal is that the optimizer can be saved during one code execution and loaded at a later time during a second code execution. The optimization run should behave as if there had been no break between the two runs.
The optimization backend of Hyperactive is Gradient-Free-Optimizers (GFO). So I first confirmed that GFO optimizer objects can be saved and loaded across two separate code executions. In the following script the optimizer object is saved if it does not exist yet. The script must then be executed a second time: the optimizer object is loaded and continues the search.
So let's now try to access the optimizer object from within Hyperactive, save it, and load it during a second code execution:
Save and load optimizer (GFO-wrapper) from within Hyperactive
```python
import os
import numpy as np
from hyperactive import Hyperactive
import dill as pkl

file_name = "./optimizer.pkl"


def load(file_name):
    if os.path.isfile(file_name):
        with open(file_name, "rb") as pickle_file:
            return pkl.load(pickle_file)
    else:
        print("---> Warning: No file found in path:", file_name)


def save(file_name, data):
    with open(file_name, "wb") as f:
        pkl.dump(data, f)


def parabola_function(para):
    loss = para["x"] * para["x"]
    return -loss


search_space = {"x": list(np.arange(-10, 10, 0.1))}

opt_loaded = load(file_name)

if opt_loaded:
    print("Optimizer loaded!")
    # do stuff
else:
    hyper = Hyperactive()
    hyper.add_search(parabola_function, search_space, n_iter=100)
    hyper.run()

    # access the optimizer attribute from the list of results
    optimizer = hyper.opt_pros[0]._optimizer  # not official API
    save(file_name, optimizer)
    print("Optimizer saved!")
```
If you execute the code above twice, you will probably encounter the error message further down. The reason why this error occurs is a mystery to me: there is a FileNotFoundError even though the file is present. I do not have expert knowledge about pickling processes/functions, so I would be very grateful for help with this problem.
If you take a look at the type of hyper.opt_pros[0]._optimizer from Hyperactive, you can see that it is the same GFO optimizer object as in the stand-alone GFO code (the first example).
My guess would be that the optimizer class in Hyperactive receives parameters that cannot be pickled by dill (or cloudpickle) for some reason. The source code where GFO receives parameters within Hyperactive can be found here.
```
Traceback (most recent call last):
  File "hyper_pkl_optimizer.py", line 33, in <module>
    opt_loaded = load(file_name)
  File "hyper_pkl_optimizer.py", line 15, in load
    return pkl.load(pickle_file)
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/site-packages/dill/_dill.py", line 373, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/site-packages/dill/_dill.py", line 646, in load
    obj = StockUnpickler.load(self)
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/multiprocessing/managers.py", line 959, in RebuildProxy
    return func(token, serializer, incref=incref, **kwds)
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/multiprocessing/managers.py", line 809, in __init__
    self._incref()
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/multiprocessing/managers.py", line 863, in _incref
    conn = self._Client(self._token.address, authkey=self._authkey)
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/multiprocessing/connection.py", line 502, in Client
    c = SocketClient(address)
  File "/home/simon/anaconda3/envs/dev/lib/python3.8/multiprocessing/connection.py", line 630, in SocketClient
    s.connect(address)
FileNotFoundError: [Errno 2] No such file or directory
```
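Judging from the traceback, the failing call is not the open() on the pickle file: dill has already read the file and is rebuilding a multiprocessing manager proxy (RebuildProxy), which then tries to reconnect to the manager's Unix socket from the first run. That socket no longer exists, hence the FileNotFoundError. The same failure mode can be reproduced with the standard library alone, without Hyperactive:

```python
import multiprocessing as mp
import pickle

if __name__ == "__main__":
    manager = mp.Manager()
    shared = manager.list([1, 2, 3])

    # Proxies pickle fine while the manager process is alive ...
    blob = pickle.dumps(shared)
    print(list(pickle.loads(blob)))  # works: the manager is still running

    # ... but after the manager is gone (simulating a second code
    # execution), unpickling tries to reconnect to the old socket
    # address and fails.
    manager.shutdown()
    try:
        pickle.loads(blob)
    except OSError as exc:  # FileNotFoundError on Linux
        print(type(exc).__name__, exc)
```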
So the goal is now to fix the problem in the second code example and enable correct saving and loading of the optimizer object from Hyperactive.