Automatically add cargo-port URLs to recipes #21

Closed
daler opened this issue Oct 9, 2016 · 8 comments

Comments

daler (Member) commented Oct 9, 2016

Once a recipe's source tarball gets archived, the recipe should be updated to reflect the new URL (at least as a backup).

Maybe add a separate archive module, which could also check for and handle Bioconductor URL degradation.

xref bioconda/bioconda-recipes#1015 (comment)

epruesse changed the title from "cross-check galaxy depot for archived tarballs" to "Automatically add cargo-port URLs to recipes" on Mar 14, 2019

epruesse (Member) commented Mar 14, 2019

The cargo-port URL(s) should be added automatically to the meta.yaml source: url: section(s), e.g. as in the sketch below.
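
For illustration, a minimal sketch of what such a recipe could look like. It relies on conda-build accepting a list of URLs under source: url: and trying them in order; the package name is made up and the exact depot path scheme is an assumption that would need to be checked against cargo-port's actual layout.

```yaml
# Hypothetical recipe snippet; "mytool" and the depot path are placeholders.
package:
  name: mytool
  version: "1.0"

source:
  url:
    # upstream source, tried first
    - https://example.org/releases/mytool-1.0.tar.gz
    # cargo-port (Galaxy depot) mirror, used as a fallback if the upstream URL is gone
    - https://depot.galaxyproject.org/software/mytool/mytool_1.0_src_all.tar.gz
  sha256: "<sha256 of the tarball>"  # checksum also guards against a silently changed tarball
```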

Rationale:

  • Avoid failures if the source has gone away when the recipe is rebuilt for pinning updates.
  • Meet license requirements (e.g. GPL).

Status:

cargo-port uses its Jenkins instance to do a nightly scan of the bioconda-recipes master branch for new recipes/URLs and adds those to its repository.

(Incidentally, something is going wrong there; see galaxyproject/cargo-port#158.)

This means that the archived URLs only become live up to one day after a PR is merged.

Options:

We can add the URL(s) to the meta.yaml at various stages:

  • before the PR is merged (by the bot)
  • while the PR is built and uploaded (by the build system)
  • when the URL becomes active (by cargo-port)
  • in a cron job on the CI comparing our packages/URLs to cargo-port's data (see the sketch after this list)
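
As a rough sketch of the last option: assuming cargo-port's index is the urls.tsv file in galaxyproject/cargo-port and that it has an upstream-URL column (location and column names are assumptions), a CI cron job could flag recipes whose upstream tarball is already archived but whose meta.yaml lacks a depot fallback. Real meta.yaml files are Jinja2 templates and would need rendering first; that step is skipped here.

```python
#!/usr/bin/env python3
"""Sketch of the cron-job option: compare recipe source URLs against
cargo-port's index and report recipes that could gain a depot fallback URL."""
import csv
import glob
import io
import urllib.request

import yaml  # PyYAML; real meta.yaml files need Jinja2 rendering before parsing

# Assumed location of cargo-port's index; verify against the actual repository.
RAW_URLS_TSV = "https://raw.githubusercontent.com/galaxyproject/cargo-port/master/urls.tsv"


def load_archived_urls():
    """Return the set of upstream URLs cargo-port has archived (column name assumed)."""
    with urllib.request.urlopen(RAW_URLS_TSV) as resp:
        text = resp.read().decode("utf-8")
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return {row.get("Upstream Url", "").strip() for row in reader}


def recipe_urls(meta_path):
    """Yield source URLs from a (non-templated) meta.yaml."""
    with open(meta_path) as fh:
        meta = yaml.safe_load(fh) or {}
    src = meta.get("source") or {}
    sources = src if isinstance(src, list) else [src]
    for s in sources:
        urls = s.get("url") or []
        yield from (urls if isinstance(urls, list) else [urls])


def main():
    archived = load_archived_urls()
    for meta_path in glob.glob("recipes/*/meta.yaml"):
        for url in recipe_urls(meta_path):
            if url in archived and "depot.galaxyproject.org" not in url:
                print(f"{meta_path}: {url} is archived; a cargo-port fallback URL could be added")


if __name__ == "__main__":
    main()
```

Depending on which of the options above is picked, a job like this could either just report or go on to open a PR adding the depot URL.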

Alternatively, we could just keep the link in the docs (suggestion by @dpryan79).

If we want to be fancy, we can build any other workflow as well, e.g. a @biocondabot backup source command that has the bot send a webhook to the cargo-port Jenkins, which in turn processes the upload and sends a webhook back to the bot, which then adds the URL to the recipe.

@bgruening Cargo-Port is your baby. You get to decide. :)

daler (Member, Author) commented Mar 14, 2019

Is there a way to figure out how much the cargo-port URLs are actually used by conda-build in practice? I love the idea of archiving the source, but I'm not sure it needs to be in the recipe. If it's used rarely, we could simply have guidance in the docs that if the original source is missing, check cargo-port.

epruesse (Member) commented:

It's rare, but possible. I know of at least one package that was blacklisted because the source had gone away.

bgruening (Member) commented:

I treat cargo-port as a last-resort backup and would not expose it in the recipe; the exception is Bioconductor, where we know packages will disappear. Adding it to the docs and highlighting it as a community feature is my preferred solution.

epruesse (Member) commented:

What's the conda-forge (CF) approach to this? The GPL mandates that the source must be made available to everyone to whom the binaries are made available.

bgruening (Member) commented:

I don't think they have a solution in place for that. Tarballs can disappear :(

epruesse added this to the April 2019 Meeting milestone on Apr 6, 2019
epruesse (Member) commented Apr 8, 2019

> Tarballs can disappear :(

Or change... See bioconda/bioconda-recipes#14373 (trimmomatic 0.39 got a fix, but neither the URL nor the version changed).
