
Define method of installation validation of wmcore packages that are uploaded to Pypi #11331

Open
khurtado opened this issue Oct 10, 2022 · 7 comments


khurtado commented Oct 10, 2022

Impact of the new feature
We have the ability to build and upload packages to Pypi based on tags. We need a way to validate that the services we build do not have version conflicts among their dependencies, ideally before uploading them to Pypi. This is linked to the package versions defined below:

https://github.com/dmwm/WMCore/blob/master/requirements.txt

Is your feature request related to a problem? Please describe.
This is related to #11327. E.g.: different tags for wmagent are showing different pypi issues when installing, but AFAIK they are all related to the requirements.txt file, where we define the package version dependencies.

Describe the solution you'd like
There are different ways to go about this:

If we go with the first option, we could append something like .pypi_validated to the end of the tag, so that the GitHub Actions workflow only picks those tags to upload to Pypi.
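A minimal sketch of how such a tag filter could look as a shell step in the upload workflow (the .pypi_validated suffix convention is the hypothetical part; GITHUB_REF_NAME is provided by GitHub Actions on tag builds):

# only continue with the Pypi upload when the tag carries the validation suffix
case "${GITHUB_REF_NAME}" in
  *.pypi_validated) echo "tag validated, proceeding with upload" ;;
  *) echo "tag not validated, skipping upload"; exit 0 ;;
esac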

Describe alternatives you've considered
Fix #11327 and do nothing afterwards. We should be okay for a while, until we are not, at which point another issue reporting an installation bug would be needed to fix it.

@khurtado changed the title from "Define method of installation validation of wmcore packages that are uploaded to pypy" to "Define method of installation validation of wmcore packages that are uploaded to Pypi" Oct 10, 2022

vkuznet commented Oct 10, 2022

@khurtado, there is a third and easier option which I'll recommend. We should use a user virtual environment where packages can be installed directly. If I'm not mistaken, the build produces (s)dist Python packages and a venv can use them. @todor-ivanov, can you provide instructions on how to use local (s)dist Python packages (after python setup.py build) within a venv? If it is not possible, I would rather suggest uploading the packages to test.pypi.org and then using a venv to fetch/install them from there.
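A minimal sketch of that local route, assuming the standard setuptools flow and a throw-away venv path (both of which are assumptions here, not the project's actual build procedure):

python3 -m venv /tmp/wmcore-validate        # hypothetical throw-away virtual environment
source /tmp/wmcore-validate/bin/activate
python setup.py sdist bdist_wheel           # produces dist/*.tar.gz and dist/*.whl
pip install dist/*.whl                      # or: pip install dist/*.tar.gz
pip check                                   # reports broken or conflicting dependencies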

@amaltaro added the High Priority, deployment, and containerization labels Oct 10, 2022
@amaltaro added the CI/CD label Oct 10, 2022
@todor-ivanov

Hi @vkuznet, thanks for mentioning it here.
Hi @khurtado, please fetch the script for the virtual environment installation from here: #10980 and try to use it with the default options as a start. Test how it goes, and please give us your feedback as well. The script can be used in two modes:

  • Interactive - where it will guide you through the set of options and also provide you with some default values.
e.g.:
./deploy-centralvenv.sh -i test -l wmcore==2.1.3.pre1 -d /data/tmp/WMCore.venv/

Here:
-d  is the deployment destination
-i   is the Pypi index to be used for the deployment
  • Automated - where it will follow the whole deployment process based only on the set of parameters you provide on the command line:
e.g.:
./deploy-centralvenv.sh -i test -l wmcore==2.1.3.pre1 -d /data/tmp/WMCore.venv/ -y

Here:
-y assumes 'Yes' to all questions asked during the deployment process, meaning that if you have not provided a deployment parameter on the command line, the script will just use the default values.

The idea behind this script was precisely to be able to deploy our packages from the test Pypi repository/index as well, and to give us the ability to use it as a validation repository.

There is a minor problem that needs to be mentioned though: some of our external package dependencies exist only in the production Pypi repository/index. Because of that, the script will ask you at some point whether you would like to try to resolve those dependencies from the production Pypi repository. It is up to you to choose that path and let the script do it for you, or to take the list of unresolved dependencies and install them manually after the deployment.
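For reference, the same fallback can be reproduced with plain pip outside the script; a sketch assuming an install from the test index with the production index as a backup for the missing external dependencies (package name and version are only examples):

pip install --index-url https://test.pypi.org/simple/ \
            --extra-index-url https://pypi.org/simple/ \
            wmcore==2.1.3.pre1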

The script can also deploy just a single component instead of the full WMCore stack.

e.g.:
./deploy-centralvenv.sh -i test -l reqmgr2==2.1.3.pre1 -d /data/tmp/WMCore.venv/ -y

This will deploy reqmgr2 version 2.1.3.pre1, resolve all internal WMCore dependencies based on the requirements.txt file, and also search for all external Python dependencies in both the test and prod Pypi indexes.

For help on the full set of options you may use:

./deploy-centralvenv.sh -h

Please let me know if you need more instructions on this, or if you find something broken or anything that is needed but not worked out to the end. I will be extremely happy to work on fixing it and to help speed up this process.

@todor-ivanov

p.s.
If you skip the package version, it will just deploy the latest one uploaded to Pypi, e.g. reqmgr2 instead of reqmgr2==2.1.3.pre1.


vkuznet commented Oct 11, 2022

@todor-ivanov, I think we need an option to install packages whose wheel or dist tarballs are available locally, i.e. before uploading them to pypi. If I recall correctly, a whl file can be installed as simply as pip install <file.whl>.

@khurtado, I think the workflow should be the following:

  • build the new package: python setup.py sdist or python setup.py bdist_wheel --universal
  • this will create a dist area where the newly created wheel and tarball will reside
  • we need @todor-ivanov's script to provide an option to read from the dist area to install packages
  • install the packages locally and test them; once testing is completed
  • upload to Pypi

My point is that we should not upload a broken package to pypi, and we need the ability to test it before we upload it. Or, if there is no such option, we should upload the package to test.pypi.org and use it from there. Otherwise we may produce lots of garbage on pypi.
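Putting the steps above together, a rough sketch of that pre-upload validation loop (the venv path and the twine upload to the test index are assumptions, not the current GitHub Actions setup):

python setup.py sdist bdist_wheel --universal       # build into the dist/ area
python3 -m venv /tmp/pkg-test && source /tmp/pkg-test/bin/activate
pip install dist/*.whl                              # install the locally built wheel directly
pip check                                           # verify the installed dependency set is consistent
# only after the local install and tests pass:
twine upload --repository testpypi dist/*           # or the production index once validated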


todor-ivanov commented Oct 11, 2022

Hi @vkuznet:

we need @todor-ivanov's script to provide an option to read from the dist area to install packages

Agreed. I am on it already.

we should not upload a broken package to pypi, and we need the ability to test it before we upload it

Agreed again. Or, to put it in another perspective, we should at least make sure that the packages to be uploaded to Pypi (test index) are installable/deployable. As for later bugs and errors... that is where the proper validation process comes into the picture, to find and clean them up, and the Pypi test index should be used for that. Then, hopefully, we upload only bug- and error-free versions to the Pypi prod index.


todor-ivanov commented Oct 11, 2022

Sorry... the previous version of this comment was meant to end up in another issue, but it is for sure useful information here as well.
#11327 (comment)

@amaltaro

@khurtado maybe we could simplify it by actually installing all the packages/versions listed in the requirements.txt file. This should be the first job in our GH action, and the subsequent jobs would then use the needs syntax to enforce a dependency, so that we do not build packages if we fail to install the required software stack.
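A sketch of what that first validation job could run, assuming a plain virtual environment and the pinned requirements.txt from the repository (the later build jobs would then declare needs: on this job so they are skipped if the install fails):

python3 -m venv venv && source venv/bin/activate
pip install -r requirements.txt       # fails the job on any unresolvable version conflict
pip check                             # double-check the resolved dependency set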
