Publish releases to PyPI via a GitHub Action #674
Conversation
Codecov Report
```
@@           Coverage Diff           @@
##           master     #674   +/-   ##
=======================================
  Coverage   79.12%   79.12%
=======================================
  Files         126      126
  Lines        8895     8895
=======================================
  Hits         7038     7038
  Misses       1857     1857
```
Flags with carried forward coverage won't be shown.
Thanks @ascillitoe!
What else needs to be done to test the workflows with the real PyPI?
I like the idea of pushing to test PyPI unconditionally and then having a manual step of checking on the dev machine that things work as intended by installing the newly published version from test PyPI. The slight disadvantage up to now was that if things didn't work, one needs to invent a "non-existent" version before pushing to test PyPI again (e.g. if testing `v0.8.0` and for some reason it's broken, we now need something like `v0.8.0-rc` to be pushed to test PyPI even though eventually the real PyPI version will be `v0.8.0`; this can introduce mistakes in the process).
I wonder if we should have a compulsory step to release an "rc" version to test PyPI before releasing the real one? Compulsory here could mean it's just part of our release process; I don't think we have to programmatically enforce that an "rc" version exists on test PyPI before a non-rc version can be pushed (but we could in the future).
A couple of questions:

- is "rc" the right type of tag for this kind of validation?
- perhaps publishing to real PyPI could/should be done by a manual workflow trigger to add an extra check (e.g. if someone pushes a `v-` tag and we don't programmatically enforce an "rc" tag as a pre-requisite)?

Maybe for a follow-up: we could have a simple script that automatically bumps the versions (perhaps with an option for minor/patch release and rc/regular) and the dates in the various files.
pyproject.toml (outdated):
```toml
"setuptools~=62.0.0",
"wheel~=0.37.0",
```
What's the reasoning behind these versions?
Also, I assume this file is now read during the call to `python -m build`?
> What's the reasoning behind these versions?

These were the latest versions when I first wrote the `pyproject.toml` in #495. The fact that they are now out of date (`setuptools` is on 65.5.1) highlights the slight maintainability burden here. I feel like it is safer to pin (maybe to minor instead?) but then I can't see a way to avoid these going out of date...
> Also, I assume this file is now read during the call to `python -m build`?

Yep, the idea here is that `build` (and `pip install`) reads these deps and then installs them in an isolated virtual env, hence giving us more control over the build env. Since we are now building for release via GA, we could of course define the build env via a `requirements.txt`. However, the `pyproject.toml` approach gives more consistency between the build environment for PyPI and the one used when users install locally.
Sidenote: the caveat of this approach is that the above defines the backend. There isn't currently a way to define the frontend deps (i.e. the `build` version) in the `pyproject.toml`, hence why I am having to define the version for that in the `Makefile` and GA yaml.
I see what you mean wrt dependencies going out of date if pinned, but for release builds it shouldn't matter (and also note that the frontend `build` in the GA is now pinned exactly). On the other hand, the minimum version via `~=` makes sense to me too, though that could carry a small risk of breaking things. TBF I think leaving `~=` here is fine.

Btw, when you say the `pyproject.toml` deps are installed in an isolated virtual environment, how does that work exactly? Assuming I'm already in a virtualenv/conda env, would in this case, after calling `pip install .`, `setuptools` and `wheel` be installed in a new (not visible to the user?) virtualenv, the build take place in that env, and then the package be installed in the original user env?
> would `setuptools` and `wheel` be installed in a new (not visible to user?) virtualenv, build takes place in that env, and then the package is installed in the original user env?

Yes, from what I understand, exactly this. For packages conforming to PEP 518 (i.e. ones that specify their `build-system` in a `pyproject.toml`), running `pip install` creates a virtual env hidden from the user, installs the build deps, does the build, and then installs the built package in the user's own environment (i.e. their system Python, or conda/virtualenv etc.). When running `pip install .` with this PR we get:
```
Processing /home/ascillitoe/Software/alibi-detect
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
```
Note the extra lines compared to when we run the same with our v0.10.4:
```
Processing /home/ascillitoe/Software/alibi-detect
Preparing metadata (setup.py) ... done
```
Note: you can disable the above and use your already-installed build tools with pip's `--no-build-isolation` flag, described as:

> Disable isolation when building a modern source distribution. Build dependencies specified by PEP 518 must be already installed if this option is used.
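For completeness, a minimal sketch (not part of this PR; the step name is hypothetical, and the version pins are taken from the diff above) of what building without isolation would look like as a GA step:

```yaml
# Hypothetical step, for illustration only: building without PEP 518 isolation.
# The build deps from pyproject.toml must already be installed in the environment.
- name: Build without isolation (sketch)
  run: |
    python -m pip install "setuptools~=62.0.0" "wheel~=0.37.0"
    python -m pip install --no-build-isolation .
```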
Makefile (outdated):
```makefile
build: ## Build the Python package (for debugging, building for release is done via a GitHub Action)
	# (virtualenv installed due to Debian issue https://github.com/pypa/build/issues/224)
	pip install --upgrade pip build virtualenv
	python -m build --sdist --wheel
```
Should this command be identical to the one in the GA to minimize dev confusion?
.github/workflows/publish.yml (outdated):
```yaml
# Build
- name: Install pypa/build
  run: python -m pip install --user build==0.9.0
```
Assume the specific version is just the latest stable one?
Yep. I toyed with the idea of doing `build~=0.9` or something, but I'm wondering if it is safer to pin to a specific version since we will be releasing builds straight off of it?
LGTM, will we document the release process somewhere?
.github/workflows/publish.yml (outdated):
```yaml
password: ${{ secrets.TEST_PYPI_API_TOKEN }}
repository_url: https://test.pypi.org/legacy/

# TODO - Test below by:
```
> What else needs to be done to test the workflows with the real PyPI?

I think these steps should work. The thing we want to avoid is a non-stable tag triggering a release, so we want to test the `if: steps.version.outputs.prerelease == ''` line. As long as we try out the below before adding a `PYPI_API_TOKEN` to the SeldonIO/alibi-detect repo, we should be able to test with no risk of an accidental release.
p.s. just realised I am being silly here. We can test the condition bit by just adding a temporary `echo`...
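For example, something like this (a sketch; the `if:` condition is the one from this PR, the rest is a placeholder):

```yaml
# Temporary step to verify the prerelease gating without touching PyPI.
- name: Check prerelease gating (temporary)
  if: steps.version.outputs.prerelease == ''
  run: echo "This step would publish to the real PyPI."
```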
Agree, we should deffo do this.
I think so?
Agree we might want some way to avoid immediately releasing to PyPI if a
This would certainly be nice to further streamline the release process. I'm already doing this for the
@jklaise based on this comment, I've had another think, and have changed the workflow (subject to agreement of course). The new idea is to publish to Test PyPI whenever a tag is pushed, with publishing to the real PyPI done via a manually triggered release workflow. So, the release workflow would be something like:

In step 1 we could directly push. In my eyes, this process gives a bit more redundancy, in addition to ensuring one does not forget to publish a new release on the GitHub side too?

Edit: I tested this on my fork. The "v0.10.5alpha2" action was triggered by clicking through the release workflow, and would publish to real PyPI. The "Tmp change version" action was triggered by pushing a tag, and would publish to Test PyPI.
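A sketch of the two triggers being described (illustrative; the exact contents of `publish_test.yml` and `publish.yml` in this PR may differ):

```yaml
# publish_test.yml (sketch): pushing a version tag publishes to Test PyPI.
on:
  push:
    tags:
      - 'v*'

# publish.yml (sketch): publishing to the real PyPI is gated behind a manual
# trigger, e.g. publishing a GitHub release.
# on:
#   release:
#     types: [published]
```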
.github/workflows/publish.yml (outdated):
```yaml
run: |
  python -m pip install .
  echo "Dummy release of alibi-detect..."
  python -c "from alibi_detect.version import __version__; print(__version__)"
```
Good point, I'll update the release process wiki (to inc. our latest thinking on
@ascillitoe the proposed process makes sense to me and sounds quite nice! One thing to note is that, because of version uniqueness, deleting the tag and pushing the same one would trigger the GA but fail uploading to test PyPI, so every push would have to have a unique version tag (though I don't think it's a big deal, just something to keep in mind). I like the extra precaution before publishing on real PyPI and also tying it together with the GitHub release.

Edit1: perhaps we can also later automate the "manual" install and check step? I.e. have the GA action pushing to test PyPI also have a job that installs it, launches the Python interpreter and imports `alibi_detect`.

Edit2: separate to this PR, but I wonder if we should also consider making GitHub release artifacts the same as on PyPI. In reality this doesn't matter that much as pretty much everyone is using PyPI, but perhaps we need to publish a wheel and an `sdist`.
Yes, true. I guess we should have a clear instruction in the release process to use unique version tags.

I've changed a few things up yet again to achieve this. The main change is running
Possible follow-ups after this would be:
alibi_detect/version.py (outdated):
```diff
@@ -3,4 +3,4 @@
 # 2) we can import it in setup.py for the same reason
 # 3) we can import it into your module module

-__version__ = "0.11.0dev"
+__version__ = "0.10.5-alpha5"
```
TODO: revert before merging
.github/workflows/publish_test.yml (outdated):
```yaml
run: |
  make clean
  python -m pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple alibi-detect==${{ env.PACKAGE_VERS }}
  python -c "from alibi_detect.cd import KSDrift"
```
I would change this to have imports like `import alibi_detect`, `from alibi_detect.cd import *` etc. Also, does it capture any stdout/stderr and print it out? Asking as I didn't see it in the example CI run here: https://github.com/ascillitoe/alibi-detect/actions/runs/3490747918/jobs/5842565978
Re stdout, I think this line was stdout:

```
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
```

In b3aa94e I've changed to the imports you suggested and added `print()`'s so there is more obvious stdout. CI result here.
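For reference, a sketch of what the smoke-test step might look like after that change (illustrative; the exact step in b3aa94e may differ):

```yaml
# Smoke test (sketch): import the installed package and print to give visible stdout.
- name: Smoke test published package
  run: |
    python -c "import alibi_detect; print(alibi_detect.__version__)"
    python -c "from alibi_detect.cd import *; print('alibi_detect.cd imports OK')"
```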
Re stderr I'm not sure a python error would actually fail the GA step. Will look into this. Think we're pretty much there after that!
OK seems to fail as desired:
https://github.com/ascillitoe/alibi-detect/actions/runs/3496723210/jobs/5854969140
.github/workflows/publish.yml (outdated):
```yaml
# Wait for a while to give PyPI time to update (otherwise the pip install might fail)
- name: Sleep for 10 minutes
  shell: bash
  run: sleep 10m
```
10 minutes sounds long; usually I can install almost straight away. Not sure, though, if it's worth programmatically retrying the install (e.g. in a while loop) until it succeeds.
Would a `while` loop not be essentially the same thing though? I.e. wouldn't we have to set some sort of break condition to prevent an infinite loop, so we'd still have to set a time or number-of-attempts limit?

I've reduced it to 5 mins now. Perhaps we reduce it even more, to say 1 or 2 mins, and then increase it if we regularly get failures? An occasional failure would not be the end of the world, since the publish would still have been done and the user could check manually.
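For what it's worth, a retry loop with an attempt limit might look something like this (a sketch only, not part of this PR; it reuses the `PACKAGE_VERS` env var from the workflow above):

```yaml
# Sketch: retry the install with a capped number of attempts instead of a fixed sleep.
- name: Install from PyPI with retries
  shell: bash
  run: |
    for i in $(seq 1 10); do
      if python -m pip install alibi-detect==${{ env.PACKAGE_VERS }}; then
        exit 0  # success
      fi
      echo "Attempt $i failed; waiting 30s for PyPI to update..."
      sleep 30
    done
    exit 1  # give up after ~5 minutes of attempts
```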
Annoyingly 1 minute wasn't enough!
https://github.com/ascillitoe/alibi-detect/actions/runs/3496674516/jobs/5854858932
I've settled on 3 for now... we can add a "fetch a cup of tea" 🍵 stage to the instructions?!
Moving back to WIP until the final two TODOs are done.
This PR implements a GitHub Action workflow to publish releases. It is a simpler version of #495, with a `pyproject.toml` added to specify the `build-system` only (instead of replacing `setup.py`).

This removes the `make build_pypi` and `make push_pypi` steps (steps 7, 8, 10 in our wiki) from the release process. This simplifies the release process a little, but the real benefit is that the process is properly isolated (and repeatable). It prevents differences in a dev's system from affecting the release process. For example, we have had instances before where a release was broken because a dev followed the usual release process but in a conda env instead of a `virtualenv` (`setuptools` in conda for some reason includes different files in the `sdist`).

Proposed release workflow
The expected release workflow is as follows:

1. Create a `release/` branch, i.e. form a `release/` branch from a previous tag and cherry-pick fixes, or branch off `master` etc.
2. Update versions in the relevant files (`version.py`, `README.md` etc.).
3. `git commit -am "v0.x.x"; git push upstream master`
4. `git tag v0.x.x; git push upstream v0.x.x`

Upon pushing the tag in step 4, the GitHub Action will automatically build the release and publish it to PyPI.
Additional validation stage

An additional validation stage can be incorporated between steps 2 and 3:

`git commit -am "v0.x.x-rc1"; git push upstream master`

The `rc` (or `alpha` or `beta`) in the tag means the GitHub Action will instead publish to Test PyPI. We can then install it with `pip install -i https://test.pypi.org/simple/ alibi-detect==0.x.xrc1` and check it before pushing the final release tag.

Differences to #495
This PR is a watered-down version of #495, in case we'd rather think about this without committing to removing `setup.py`. #495 contains the following additional "improvements":

- `setuptools_scm` is implemented. This sets `alibi_detect`'s version number from the latest git tag, meaning we don't have to hardcode the version in `alibi_detect/version.py` (annoyingly we still have to manually update the `README.md` and `CITATION.cff`).
- `setup.py` is fully replaced by `pyproject.toml`. See "Publish releases to PyPI via a GitHub Action (and replace `setup.py` with `pyproject.toml`)" #495 for a full discussion on this. In short, this is the future!

TODOs