
We need to find a way to be able to copy binary files into our backup system #2

Open
ypriverol opened this issue Sep 22, 2018 · 18 comments
Labels: enhancement (New feature or request)

@ypriverol (Member)

@osallou

One of the ideas behind having the FTP binary folder is to find a way to copy all the binaries needed to generate the containers into our internal file system. We need to find a way to allow BioContainers users or administrators, upon request, to copy binary files into the binary folder at https://containers.biocontainers.pro/s3/.

Best Regards
Yasset

ypriverol added the enhancement (New feature or request) label on Sep 22, 2018
@bgruening (Member)

> …to copy binary files into the binary folder at https://containers.biocontainers.pro/s3/.

You mean copy tarballs and binaries into this folder?

@ypriverol (Member, Author)

We now have the backup on your side, which is what we aimed for from the very beginning. However, we don't have a way to copy a tarball or binary into that folder in the S3. One possible option is that you, @bgruening, give us access to copy into the original folder, and this will then be reflected in the S3.

@bgruening (Member)

This has already existed for years. It's called cargo-port and you can find it here: https://github.com/galaxyproject/cargo-port

@ypriverol (Member, Author)

How can I copy a binary to the FTP using cargo-port?

Thanks @bgruening for the feedback

@bgruening (Member)

There is documentation in this repo. You need to edit a TSV file, add a temporary location of your binary, and create a PR. On merge, cargo-port will download the binary and archive it.

This is done this way to give everyone the same chance to add files and to prevent the uploading of malicious content. GitHub acts here as a review place for the links and binaries.
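For illustration, an entry in such a TSV file might look like the row below. The column names and layout here are assumptions for the sketch, not cargo-port's exact schema; check the repository's own documentation for the real format.

```tsv
# id	version	platform	arch	upstream_url	sha256sum
thermorawfileparser	2018_09_07	linux	x86_64	https://example.org/thermorawfileparser2018_09_07.zip	<sha256-of-the-file>
```

On merge, the archiving job would fetch the `upstream_url`, check it against the recorded checksum, and store the file permanently.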

@ypriverol (Member, Author)

OK, we can formally use this mechanism to add the binaries/tarballs to the system, thanks. Can we give @BioContainers/core members rights to merge there?

@bgruening (Member)

> Can we give @BioContainers/core members rights to merge there?

I think so. If it turns out to be useful and BioContainers/core keeps maintaining it, why not? Please note it is not my project.

@ypriverol (Member, Author)

Quick question: I have seen that the url.tsv file hasn't been modified in 3 months. How have the Bioconda recipes that rely on binaries been released recently?

Yasset

@ypriverol (Member, Author) commented Sep 22, 2018

@bgruening I have checked one recently added container, and you used this path: https://github.com/bgruening/download_store/raw/master/thermorawfileparser/thermorawfileparser2018_09_07.zip

Why didn't you use the cargo-port project? Any special reason?

@bgruening (Member)

Because you told me it's not official :-)

@ypriverol (Member, Author)

Then we'll go with PRs in that system.

@osallou (Contributor) commented Sep 23, 2018

@ypriverol, sorry, but I do not understand what you mean here by 'binaries needed to generate the containers'.
What do you need to save here for BioContainers?
Admins (we) can upload things there using S3 commands...
As for users: who would upload there, and what?

@ypriverol (Member, Author) commented Sep 23, 2018

@osallou Say I have a binary now and I would like to generate my container with it. First, I will need to copy the binary to a server like the S3, and then create the container. This can probably be achieved with a mechanism similar to what @bgruening proposed before, using a controlled TSV and PRs.

@osallou (Contributor) commented Sep 23, 2018 via email

@ypriverol (Member, Author)

We can require the creator to create a git repo with the binary. But when we accept the PR, we should copy the file to our internal S3 folder. With this we guarantee that we can reproduce the container in case the binary or its git repo is deleted. Is that OK, @osallou?
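A minimal sketch of that merge-time step: verify the binary against the checksum recorded in the reviewed PR, then copy it into the archive. The function name `archive_binary` and the local directory standing in for the S3 store are illustrative assumptions, not the actual BioContainers tooling.

```python
import hashlib
import shutil
from pathlib import Path


def archive_binary(src: Path, expected_sha256: str, archive_dir: Path) -> Path:
    """Copy a binary into the archive only if its SHA-256 digest matches
    the checksum recorded in the reviewed PR."""
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"checksum mismatch for {src.name}: got {digest}")
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / src.name
    # A local copy here; the real system would upload to the S3 store instead.
    shutil.copy2(src, dest)
    return dest
```

The checksum gate is what makes the review meaningful: the archived file is guaranteed to be the exact bytes the reviewers approved, even if the upstream link later changes or disappears.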

@osallou (Contributor) commented Sep 23, 2018 via email

@ypriverol (Member, Author)

> But binaries could be put in the containers repo itself.

That works when the container has been created by us. But in some cases the binaries are already on GitHub, and we don't need to copy them again into the containers repo. However, we should keep a copy of all the binaries in the S3, because if for some reason the binaries are no longer available in the future, we already have them in S3.

> Furthermore, should we accept containers from binaries only? Are sources
> not mandatory (open source)?

We already have containers built from binaries only. Even if the source code is available, a specific architecture is sometimes needed to produce the binary, and building it would take time on our side.

@osallou (Contributor) commented Sep 24, 2018

As cargo-port already exists and matches those requirements, I think we can close this issue.
Maybe a link to that repo (with an explanation why) on the BioContainers web site would be enough.
