-
Thanks for the proposal, @worc4021!
-
I have been monitoring this discussion.
What is the problem with licensed stuff and vcpkg? In vcpkg you can have arbitrary ports defined in any directory, which you pass via the overlay ports mechanism. Also, spack may be convenient in the context of an "all deps from source" scenario, but I do not have a lot of experience with it.
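For reference, a hedged sketch of how overlay ports can be wired into a cmake build (all paths are placeholders, and `VCPKG_ROOT` is assumed to point at a vcpkg checkout):

```cmake
cmake_minimum_required(VERSION 3.15)
# Both variables must be set before the first project() call.
set(CMAKE_TOOLCHAIN_FILE "$ENV{VCPKG_ROOT}/scripts/buildsystems/vcpkg.cmake")
# A directory of custom portfiles, e.g. for licensed code that never leaves
# your machine; vcpkg resolves ports from here before its central registry.
set(VCPKG_OVERLAY_PORTS "${CMAKE_CURRENT_LIST_DIR}/private-ports")
project(uno LANGUAGES CXX)
```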
-
No opinion. We'll work around any changes and keep building via Yggdrasil. People could just download the prebuilt binaries from https://github.com/cvanaret/Uno/releases
-
Hello guys, thank you for your responses! Is the sentiment then that none of the three of us is a big fan of the uno build? @traversaro, you'd like uno to be managed in conda, whereas @odow, you build with Yggdrasil.

The point you raise about vcpkg is valid, @traversaro. I had not considered this option for uno, as it would require providing not only a cmake build for the dependencies, but also a portfile for each one, and on top of that we'd add vcpkg to the party, i.e. either as a submodule or as a required system-wide install, whereas cmake is already required for the build. I am not a heavy python user, but from my understanding conda is a package manager rather than a library management tool for a build. Superficially, conda-forge looks quite similar in that sense.

The reason to promote a build from source over downloading prebuilt binaries is control of dependencies. In my experience, nlp solvers are rarely deployed into standalone applications; they end up in some tool chain which does many other things and goes many other places. Before you know it, your hardware-in-the-loop setup needs to work out how to incorporate MPI, because someone introduced uno with a mumps dependency in some trajectory planner. Naturally, at this point the practitioner has to modify the builds, and he's kind of on his own. But at the moment we're all kind of on our own.

Another point is that the QP/LP interface is fairly simple, i.e. if one is paying for e.g. a gurobi or xpress licence, one might want to use those qp solvers instead of bqpd. Without being able to build uno, there is no way to incorporate them into it, even for the simplest of tools. Not sure where this leaves us, tbf.
-
As a user who recently tried to build Ipopt on Windows without the msys toolchain and eventually gave up, I deeply understand the frustration of building NLP solvers with dependencies on Windows. The main difficulties are:
For a standard binary build, all of these dependencies have to be available at link time. However, I want to point out that there is another way, called dynamic loading.

This approach is especially useful in the field of optimization, because there are many non-free dependencies that cannot be distributed or only exist in binary form (like HSL, BQPD and other commercial solvers). As I mentioned above, there are also open source dependencies that are difficult to build on Windows. Dynamic loading can solve these issues because the main executable/library does not need these dependencies to build (a dependency-free build), while keeping the ability to use them. The users can acquire these dependencies themselves and use them with the main executable/library. It even allows switching to customized binaries at runtime, if the user has an enhanced or modified version of a dependency.

This method has been used by Ipopt to load HSL. OR-Tools from Google and my project PyOptInterface also use similar techniques to load commercial optimizers.

As the QP/LP interfaces in Uno get implemented for multiple optimizers, I think that dynamic loading will become more appealing. For open source solvers, if Uno bundles the static libraries, it is difficult to switch to customized builds. For non-free dependencies, there are two options:
I am happy to provide assistance if Uno plans to use dynamic loading to manage some of its dependencies.
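On the build side, a minimal sketch of what such a dependency-free target could look like in cmake (file and target names here are hypothetical; the actual `dlopen`/`LoadLibrary` calls would live in the C++ sources):

```cmake
# The QP interface is compiled without linking against bqpd or HSL at all; the
# only link-time requirement is the platform's dynamic loader. CMAKE_DL_LIBS
# expands to libdl where needed and to nothing on Windows.
add_library(uno_qp_loader src/qp_loader.cpp)  # hypothetical source file
target_link_libraries(uno_qp_loader PRIVATE ${CMAKE_DL_LIBS})
```

The solver library is then located and loaded at runtime, e.g. from a path the user supplies, which is what makes swapping in a customized binary possible without rebuilding.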
-
Hello!
-
Hello!
I would like to propose the creation of a dependency build repository for Uno. I use the word repository in its conventional sense, not necessarily meaning a single git repository.
Personally, I really like that Uno is built with cmake; I think cmake is great at managing the platform tool chain on my behalf. However, when it comes to dependency management, cmake has multiple different options and not quite the one unifying approach.
In principle there are three different options:

1. `find_library`. The user somehow obtained whichever library, and we find the library as a file and make cmake aware of it. There is a little more going on, but essentially we're looking for a file path. If some macros have to be defined, or we have to build position-independent code, or whatever, we have to tell cmake, because this mechanism doesn't determine any of that. Similarly, additional dependencies that the library itself has remain our responsibility.
2. An `IMPORTED` library. Very similar to the previous option, only that we make the library into a full cmake target, so we can use syntax such as `target_link_libraries(<target> PRIVATE foo)` instead of `target_link_libraries(<target> PRIVATE ${foo_LIBRARY})` and `target_include_directories(<target> PRIVATE ${foo_INCLUDE_DIR})`. Slightly neater to use, but we still have to somehow obtain the library.
3. `install`ed targets. This is similar in usage to an imported target; however, cmake does all the flag defining for us. This is possible if we have source code access to `foo`. In this case cmake generates a file for us to use the library in any other cmake project, e.g. in uno.
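To make the contrast concrete, here is a hedged sketch of the three options as they would appear in a `CMakeLists.txt`, using `bqpd` as a placeholder dependency (in a real build exactly one of them would be used, and all paths and target names are assumptions):

```cmake
# Option 1: find_library -- locate a prebuilt file; every flag is on us.
find_library(BQPD_LIBRARY NAMES bqpd PATHS "$ENV{HOME}/local/lib")
target_link_libraries(uno PRIVATE "${BQPD_LIBRARY}")

# Option 2: IMPORTED target -- the same prebuilt file, wrapped in a target.
add_library(bqpd::bqpd STATIC IMPORTED)
set_target_properties(bqpd::bqpd PROPERTIES
    IMPORTED_LOCATION "${BQPD_LIBRARY}"
    INTERFACE_INCLUDE_DIRECTORIES "$ENV{HOME}/local/include")
target_link_libraries(uno PRIVATE bqpd::bqpd)

# Option 3: installed targets -- the dependency's own cmake build exported a
# package, so find_package() brings in a fully described target.
find_package(bqpd REQUIRED)
target_link_libraries(uno PRIVATE bqpd::bqpd)
```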
My proposition is to incorporate dependency builds to leverage cmake's installed targets. Whoever wants to build Uno will need source code access to all the solvers, so option 3 becomes available. This would entail the creation of repositories with `CMakeLists.txt` files for metis (and its support library GKlib), bqpd, ampl, ma57 and mumps. As and when additional solvers are added to the list of options, we'd have to add their dependency builds as well. But this would shift all the complexity into individually testable builds, which can address whatever platform- or toolchain-specific issues we might encounter.
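For illustration, a dependency build repository along these lines could be as small as the following, a hedged sketch using bqpd as the example (project layout, source file names and export names are all assumptions):

```cmake
cmake_minimum_required(VERSION 3.15)
project(bqpd LANGUAGES Fortran)

# Placeholder for whatever sources the bqpd distribution actually ships.
add_library(bqpd bqpd.f)

include(GNUInstallDirs)
install(TARGETS bqpd EXPORT bqpdTargets
        ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}
        LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR})
# Generates bqpdTargets.cmake with the usage requirements baked in, ready for
# consumption by any downstream cmake project.
install(EXPORT bqpdTargets
        NAMESPACE bqpd::
        DESTINATION ${CMAKE_INSTALL_LIBDIR}/cmake/bqpd)
```

A one-line `bqpdConfig.cmake` that `include()`s the generated `bqpdTargets.cmake`, installed next to it, is then enough for `find_package(bqpd)` to succeed downstream.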
The sequence of events for a fresh Uno build in my proposed scenario would be like this:

1. Obtain the sources of each dependency and build it via its dependency build repository.
2. `install` each dependency into a prefix of your choice.
3. Point cmake at that prefix, then configure and build Uno, which picks each dependency up as an installed target.
For point 3, I have added a single folder, where I keep installed libraries, to the environment and refer to it, rather than modifying `CMakeLists.txt` files every time.

I have examples/potential candidates of the installable cmake builds for all but ma57, e.g. here is one for AMPL. All but the mumps build turned out to be trivial.
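On the consuming side, a hedged sketch of what Uno's own `CMakeLists.txt` would then contain (package and namespace names are assumptions):

```cmake
# With CMAKE_PREFIX_PATH pointing at the shared install folder, e.g. set once
# in the environment, each dependency resolves to its installed targets.
find_package(bqpd REQUIRED)
find_package(ampl REQUIRED)
target_link_libraries(uno PRIVATE bqpd::bqpd ampl::ampl)
```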
What is done elsewhere? There aren't many other big solvers where we can see the build to compare. There are other, smaller solvers that rely e.g. on Metis, but usually the list of dependencies is short and the build itself is hardly ever what you'd call modern. Notably, Ipopt is the only sophisticated build that really comes to mind. Its dependencies are indeed managed with dependency builds similar to what I propose. The build tool chain is very unix-focused, and if you have ever had the pleasure of compiling Ipopt for commercial applications on Windows, outside the msys ecosystem, to maximise performance, you will know that it is a challenge. Having said that, I think that Uno has the opportunity to be the first big nonlinear solver to have a slick integration into modern build processes.
I would be keen to hear whether integration into other domains might require a different solution altogether, as I have no insight into e.g. the current Julia integration efforts.
I'd be happy to contribute to dependency builds, if we choose to go in that direction.
Cheers,
Manuel
PS. Technically, there is a fourth option for incorporating dependencies: including their build in the uno build itself, e.g. via `FetchContent` or `CPM`. But I discarded those, since they wouldn't scale well and are a rather poor compromise in my experience. If you thought about vcpkg: wouldn't it be cool if it worked for licensed stuff?
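For reference, the discarded fourth option would look roughly like this (a hedged sketch; repository URL and tag are merely illustrative):

```cmake
include(FetchContent)
# Downloads and builds metis as part of the Uno configure/build itself.
FetchContent_Declare(metis
    GIT_REPOSITORY https://github.com/KarypisLab/METIS.git
    GIT_TAG        v5.2.1)
FetchContent_MakeAvailable(metis)
```

Whether an upstream build works unmodified under `FetchContent`, and how this composes across many dependencies, is exactly the scaling problem mentioned above.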