
Publish releases to PyPI via a GitHub Action #674

Closed
wants to merge 12 commits

Conversation

@ascillitoe (Contributor) commented Nov 11, 2022

This PR implements a GitHub Action workflow to publish releases. It is a simpler version of #495, with a pyproject.toml added only to specify the build-system (instead of replacing setup.py).

This removes the make build_pypi and make push_pypi steps (steps 7, 8 and 10 in our wiki) from the release process. This simplifies the release process a little, but the real benefit is that the process is properly isolated (and repeatable). It prevents differences in a dev's system from affecting the release process. For example, we have previously had instances where a release was broken because a dev followed the usual release process but in a conda env instead of a virtualenv (setuptools under conda includes different files in the sdist for some reason).

Proposed release workflow

The expected release workflow is as follows:

  1. Perform the usual git ops to get a release/ branch, i.e. create a release/ branch from a previous tag and cherry-pick fixes, or branch off master, etc.
  2. Make the usual changes to hard-coded version numbers (in version.py, README.md etc).
  3. Make the final release commit and push it: git commit -am "v0.x.x"; git push upstream master
  4. Tag the release commit and push it: git tag v0.x.x; git push upstream v0.x.x

Upon pushing the tag in step 4, the GitHub Action will automatically build the release and publish it to PyPI.
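Since the tag pushed in step 4 is what drives the publish, a local sanity check that the tag agrees with the hard-coded version can catch mistakes before anything is published. A minimal sketch (the helper and the accepted tag format are my own assumptions, not part of this PR):

```python
import re


def tag_matches_version(tag: str, version: str) -> bool:
    """Check that a pushed tag like 'v0.10.5' agrees with the version in version.py."""
    if not re.fullmatch(r"v\d+\.\d+\.\d+([-.]?(rc|a|b|alpha|beta)\d*)?", tag):
        raise ValueError(f"unexpected tag format: {tag}")
    return tag.removeprefix("v") == version


print(tag_matches_version("v0.10.5", "0.10.5"))  # True
print(tag_matches_version("v0.10.5", "0.10.4"))  # False
```

Such a check could run as a pre-push step or as an early job in the workflow itself.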

Additional validation stage

An additional validation stage can be incorporated between steps 2 and 3:

  • Make a pre-release commit and push it: git commit -am "v0.x.x-rc1"; git push upstream master

The rc (or alpha or beta) suffix in the tag means the GitHub Action will publish to Test PyPI instead. We can then install it with pip install -i https://test.pypi.org/simple/ alibi-detect==0.x.xrc1 and check it before pushing the final release tag.
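The suffix-based routing described above (a pre-release suffix goes to Test PyPI, a plain version goes to real PyPI) can be sketched in plain Python; in the PR itself this lives in a workflow condition, so the function below is only an illustrative stand-in:

```python
import re

# A version counts as a pre-release if it ends in an rc/alpha/beta-style suffix.
PRERELEASE = re.compile(r"(rc|a|b|alpha|beta)\d*$")


def publish_target(tag: str) -> str:
    """Route plain 'v0.x.x' tags to PyPI and suffixed tags to Test PyPI."""
    return "test-pypi" if PRERELEASE.search(tag.removeprefix("v")) else "pypi"


print(publish_target("v0.8.0"))      # pypi
print(publish_target("v0.8.0-rc1"))  # test-pypi
```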

Differences to #495

This PR is a watered-down version of #495, in case we'd rather think about this without committing to removing setup.py. #495 contains the following additional "improvements":

  1. setuptools_scm is implemented. This sets alibi_detect's version number from the latest git tag, meaning we don't have to hardcode the version in alibi_detect/version.py (annoyingly we still have to manually update the README.md and CITATION.cff).
  2. setup.py is fully replaced by pyproject.toml. See Publish releases to PyPI via a GitHub Action (and replace setup.py with pyproject.toml) #495 for a full discussion on this. In short, this is the future!

TODOs

  • Finish new release process instructions on wiki.
  • Update GitHub Actions to pull in PyPI secrets for user that triggered action (if they exist).

@codecov-commenter commented Nov 11, 2022

Codecov Report

Merging #674 (ce20eaa) into master (5748864) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #674   +/-   ##
=======================================
  Coverage   79.12%   79.12%           
=======================================
  Files         126      126           
  Lines        8895     8895           
=======================================
  Hits         7038     7038           
  Misses       1857     1857           
Flag                 Coverage Δ
macos-latest-3.10    76.23% <ø> (ø)
ubuntu-latest-3.10   79.01% <ø> (ø)
ubuntu-latest-3.7    78.91% <ø> (ø)
ubuntu-latest-3.8    78.95% <ø> (ø)
ubuntu-latest-3.9    78.95% <ø> (ø)
windows-latest-3.9   76.17% <ø> (ø)

Flags with carried forward coverage won't be shown.

@ascillitoe ascillitoe requested review from jklaise and mauicv November 15, 2022 10:25
@jklaise (Contributor) left a comment

Thanks @ascillitoe!

What else needs to be done to test the workflows with the real PyPI?

I like the idea of pushing to test PyPI unconditionally and then having a manual step of checking, on the dev machine, that installing the newly published version from test PyPI works as intended. A slight disadvantage up to now was that if things didn't work, one needs to invent a "non-existent" version before pushing to test PyPI again (e.g. if testing v0.8.0 and for some reason it's broken, we now need something like v0.8.0-rc to be pushed to test PyPI even though eventually the real PyPI version will be v0.8.0 - this can introduce mistakes in the process).

I wonder if we should have a compulsory step to release an "rc" version to test PyPI before releasing the real one? Compulsory here could just mean it's part of our release process; I don't think we have to programmatically enforce that an "rc" version exists on test PyPI before a non-rc version can be pushed (but we could in the future).

Couple of questions:

  • is "rc" the right type of tag for this kind of validation?
  • perhaps publishing to real PyPI could/should be done by a manual workflow trigger to add an extra check (e.g. if someone pushes a v- tag and we don't programmatically enforce a pre-requisite to have an "rc" tag)
  • maybe for a follow-up, we could have a simple script that automatically bumps the versions (perhaps an option with minor/patch release and rc/regular) and dates in the various files

pyproject.toml Outdated
Comment on lines 3 to 4
"setuptools~=62.0.0",
"wheel~=0.37.0",
Contributor:

What's the reasoning behind these versions?

Also, I assume this file is now read during the call to python -m build ?

@ascillitoe (Contributor, Author) commented Nov 17, 2022

What's the reasoning behind these versions?

These were the latest versions when I first wrote the pyproject.toml in #495. The fact that they are now out of date (setuptools is on 65.5.1) highlights the slight maintainability burden here. I feel like it is safer to pin (maybe to minor instead?) but can't see a way to avoid these going out of date then...

Also, I assume this file is now read during the call to python -m build ?

Yep, the idea here is that build (and pip install) reads these deps, and then installs them in an isolated virtual env, hence giving us more control over the build env. Since we are now building for release via GA, we could of course define the build env via a requirements.txt. However, the pyproject.toml approach gives more consistency between the build environment for PyPI and that used when users install locally.

@ascillitoe (Contributor, Author) commented Nov 17, 2022

Sidenote: the caveat of this approach is that the above only defines the backend. There isn't currently a way to define the frontend deps (i.e. the version of build itself) in the pyproject.toml, hence why I am having to pin that version in the Makefile and GA yaml.

Contributor:

I see what you mean wrt dependencies going out of date if pinned, but for release builds it shouldn't matter (and also note that the frontend build in the GA is now pinned exactly). On the other hand, the minimum version via ~ makes sense to me too, though that could have a small risk of breaking things. TBF I think leaving ~ here is fine.

Btw when you say the pyproject.toml deps are installed in an isolated virtual environment, how does that work exactly? Assuming I'm already in a virtualenv/conda env, would in this case, after calling pip install ., setuptools and wheel be installed in a new (not visible to the user?) virtualenv, the build take place in that env, and then the package be installed in the original user env?

Contributor (Author):

setuptools and wheel be installed in a new (not visible to user?) virtualenv, build takes place in that env, and then the package is installed in the original user env?

Yes, from what I understand, exactly this. For packages conforming to PEP 518 (i.e. ones that specify their build-system in a pyproject.toml), running pip install creates a virtual env hidden from the user, installs the build deps, does the build, and then installs the built package in the user's own environment (i.e. their system Python, or conda/virtualenv etc.). When running pip install . with this PR we get:

Processing /home/ascillitoe/Software/alibi-detect
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done

Note the extra lines compared to when we run the same with our v0.10.4:

Processing /home/ascillitoe/Software/alibi-detect
  Preparing metadata (setup.py) ... done

Note: you can disable the above and use your already-installed build tools with pip's --no-build-isolation flag. Described as:

Disable isolation when building a modern source distribution. Build dependencies specified by PEP 518 must be already installed if this option is used.

Makefile Outdated
build: ## Build the Python package (for debugging, building for release is done via a GitHub Action)
# (virtualenv installed due to Debian issue https://github.com/pypa/build/issues/224)
pip install --upgrade pip build virtualenv
python -m build --sdist --wheel
Contributor:

Should this command be identical to the one in the GA to minimize dev confusion?


# Build
- name: Install pypa/build
run: python -m pip install --user build==0.9.0
Contributor:

Assume the specific version is just the latest stable one?

Contributor (Author):

Yep. I toyed with the idea of doing build~=0.9 or something, but wondering if it is safer to pin to a specific version since we will be releasing builds straight off of it?

@mauicv (Collaborator) left a comment

LGTM, will we document the release process somewhere?

password: ${{ secrets.TEST_PYPI_API_TOKEN }}
repository_url: https://test.pypi.org/legacy/

# TODO - Test below by:
@ascillitoe (Contributor, Author) commented Nov 17, 2022

What else is needed to do to test the workflows with the real PyPI?

I think these steps should work. The thing we want to avoid is a non-stable tag triggering a release, so we want to test the if: steps.version.outputs.prerelease == '' line. As long as we try out the below before adding a PYPI_API_TOKEN to the SeldonIO/alibi-detect repo, we should be able to test with no risk of an accidental release.

Contributor (Author):

p.s. just realised I am being silly here. We can test the condition bit by just adding a temporary echo...

@ascillitoe (Contributor, Author):

I wonder if we should have a compulsory step to release an "rc" version of test PyPI before releasing the real one? Compulsory here could mean it's just our release process, I don't think we have to programmatically enforce that an "rc" version has to exist on test PyPI before a non-rc version can be pushed (but we could in the future).

Agree, we should deffo do this.

is "rc" the right type of tag for this kind of validation?

I think so? alpha and beta suggest that the code is not yet code complete, and still contains bugs etc., whereas rc suggests the code is release-ready and no entirely new source code will be added to this release (taking the definitions from Wikipedia!).

perhaps publishing to real PyPI could/should be done by a manual workflow trigger to add an extra check (e.g. if someone pushes a v- tag and we don't programmatically enforce a pre-requisite to have an "rc" tag)

Agree we might want some way to avoid immediately releasing to PyPI if a v- tag was pushed but the rc step was missed. I feel like it might be nicer to do this programmatically so will look into that first.

maybe for a follow-up, we could have a simple script that automatically bumps the versions (perhaps an option with minor/patch release and rc/regular) and dates in the various files

This would certainly be nice to further streamline the release process. I'm already doing this for the alibi_detect.__version__ in #495 via setuptools_scm (I now intend #495 to be a follow-up), but need to think about how to update CITATION.cff and README.md.

@ascillitoe (Contributor, Author) commented Nov 17, 2022

perhaps publishing to real PyPI could/should be done by a manual workflow trigger to add an extra check (e.g. if someone pushes a v- tag and we don't programmatically enforce a pre-requisite to have an "rc" tag)

@jklaise based on this comment, I've had another think, and have changed the workflow (subject to agreement of course). The new idea is to publish to Test PyPI whenever v* tags are pushed, and publish to real PyPI when we manually "publish a release" on the GitHub release page (this page).

So, the release workflow would be something like:

  1. Push v0.x.x-rc tag. This will be published to Test PyPI.
  2. pip install the rc release from Test PyPI and check it.
  3. Push the final v0.x.x tag.
  4. Go to the GitHub release page and create the release (inc. release notes), linking it to the v0.x.x tag.
  5. The new release will be published to real PyPI when Publish release is pressed.

In step 1 we could directly push v0.x.x (no rc!) and it would still only be published to Test PyPI. The downside of doing this would be that if a bug needed fixing, we'd have to do git push -d upstream v0.x.x before pushing the same tag again...

In my eyes, this process gives a bit more redundancy, in addition to ensuring one does not forget to publish a new release on the github side too?

Edit: I tested this on my fork here:
[screenshot: list of workflow runs on the fork]

The v.10.5alpha2 action was triggered by clicking through the release workflow, and would publish to real PyPI. The Tmp change version action was triggered by pushing a tag, and would publish to Test PyPI.

run: |
python -m pip install .
echo "Dummy release of alibi-detect..."
python -c "from alibi_detect.version import __version__; print(__version__)"
Contributor (Author):

Doing this temporarily to check that the correct commit is checked out (and thus published). If triggered by a tag being pushed, that tag is checked out. If triggered by a release, the tag linked to that release is checked out (see here). The test here confirms this, with 0.10.5alpha2 being printed.

@ascillitoe (Contributor, Author):

LGTM, will we document the release process somewhere?

Good point, I'll update the release process wiki (to inc. our latest thinking on release/ branches etc too) once this PR is merged.

@jklaise (Contributor) commented Nov 17, 2022

@ascillitoe the proposed process makes sense to me and sounds quite nice! One thing to note is that because of version uniqueness, deleting the tag and pushing the same one would trigger the GA but fail uploading to test PyPI, so every push would have to have a unique version tag - though I don't think it's a big deal, just something to keep in mind. I like the extra precaution before publishing on real PyPI and also tying it together with the Github release.

Edit1: perhaps we can also later automate the "manual" install and check step? I.e. have the GA action pushing to test PyPI also have a job that installs it, launches the Python interpreter and imports alibi, printing both stdout and stderr.

Edit2: separate to this PR, but wonder if we should also consider making Github release artifacts be the same as on PyPI. In reality this doesn't matter that much as pretty much everyone is using PyPI, but perhaps we need to publish a wheel and an sdist in the same way as we do on PyPI.

@ascillitoe (Contributor, Author):

@ascillitoe the proposed process makes sense to me and sounds quite nice! One thing to note is that because of version uniqueness, deleting the tag and pushing the same one would trigger the GA but fail uploading to test PyPI, so every push would have to have a unique version tag - though I don't think it's a big deal, just something to keep in mind. I like the extra precaution before publishing on real PyPI and also tying it together with the Github release.

Yes true. I guess we should have a clear instruction in the release process to use unique version tags.

Edit1: perhaps we can also later automate the "manual" install and check step? I.e. have the GA action pushing to test PyPI also have a job that installs it, launches the Python interpreter and imports alibi, printing both stdout and stderr.

I've changed a few things up yet again to achieve this. The main change is running pip install and a test import after publishing. Examples of it passing and failing. It failed in the latter example because PyPI was slow to update after the new release (I've added a sleep step to prevent this). I've also separated the GA into two smaller GAs for simplicity.
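The post-publish check described above boils down to importing the freshly installed package and surfacing its version. A generic sketch of such a smoke test (the helper name is my own; the module imported below is a stand-in, not alibi_detect itself):

```python
import importlib
from typing import Optional


def smoke_test(module_name: str) -> Optional[str]:
    """Import the module, raising if it is broken, and return its __version__ if any."""
    module = importlib.import_module(module_name)
    return getattr(module, "__version__", None)


# In CI, an ImportError here exits non-zero and fails the workflow step.
print(smoke_test("json"))
```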

@ascillitoe (Contributor, Author):

Possible follow-ups after this would be:

  1. Explore what @jklaise said: Edit 2: separate to this PR, but wonder if we should also consider making Github release artifacts be the same as on PyPI.
  2. Use setuptools_scm to set version to avoid need to hardcode (see Publish releases to PyPI via a GitHub Action (and replace setup.py with pyproject.toml) #495).
  3. Possibly transition setup.py to pyproject.toml (see Publish releases to PyPI via a GitHub Action (and replace setup.py with pyproject.toml) #495).
  4. Script to update version number etc in CITATION.cff and README.md.
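Follow-up 4 could start out as a simple regex substitution across the files that embed the version. A rough sketch, where the file list and function names are assumptions rather than an agreed design:

```python
import re
from pathlib import Path

# Hypothetical list of files that embed the version number.
FILES = ["README.md", "CITATION.cff", "alibi_detect/version.py"]


def bump_version(text: str, old: str, new: str) -> str:
    """Replace exact occurrences of the old version string with the new one."""
    return re.sub(re.escape(old), new, text)


def bump_files(old: str, new: str) -> None:
    for name in FILES:
        path = Path(name)
        path.write_text(bump_version(path.read_text(), old, new))


print(bump_version('__version__ = "0.10.4"', "0.10.4", "0.10.5"))  # __version__ = "0.10.5"
```

A real script would also want to update dates (e.g. in CITATION.cff) and validate the new version string first.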

@@ -3,4 +3,4 @@
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module module

__version__ = "0.11.0dev"
__version__ = "0.10.5-alpha5"
Contributor:

TODO: revert before merging

run: |
make clean
python -m pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple alibi-detect==${{ env.PACKAGE_VERS }}
python -c "from alibi_detect.cd import KSDrift"
Contributor:

I would change this to have imports like import alibi_detect, from alibi_detect.cd import * etc. Also, does it capture any stdout/stderr and print it out? Asking as I didn't see it in the example CI run here: https://github.com/ascillitoe/alibi-detect/actions/runs/3490747918/jobs/5842565978

Contributor (Author):

Re stdout I think this line was stdout:

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

In b3aa94e I've changed to the imports you suggested and added print()s so there is more obvious stdout. CI result here.

Contributor (Author):

Re stderr I'm not sure a python error would actually fail the GA step. Will look into this. Think we're pretty much there after that!

Comment on lines 39 to 42
# Wait for a while to give PyPI time to update (otherwise the pip install might fail)
- name: Sleep for 10 minutes
shell: bash
run: sleep 10m
Contributor:

10 minutes sounds long, usually I can install almost straight away. Not sure though if worth trying programmatically installing (e.g. in a while loop) until it succeeds.

Contributor (Author):

Would a while loop not be essentially the same thing though? i.e. wouldn't we have to set some sort of break condition to prevent an infinite loop, so we'd still have to set a time or number of attempts limit?

I've reduced it to 5 mins now. Perhaps we reduce it even more, to say 1 or 2 mins, and then increase it if we regularly get failures? An occasional failure would not be the end of the world since the publish would still have been done, and the user could check manually.
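For comparison, the bounded retry discussed above (poll until the install succeeds or the attempt budget runs out) behaves like a fixed sleep in the worst case but returns as soon as possible. A sketch, with illustrative attempt counts and delays:

```python
import time


def retry(check, attempts: int = 10, delay: float = 30.0) -> bool:
    """Call `check` until it returns True, up to `attempts` times, sleeping `delay` seconds between tries."""
    for i in range(attempts):
        if check():
            return True
        if i < attempts - 1:
            time.sleep(delay)
    return False


# Simulated flaky check that succeeds on the third attempt
# (in the workflow, `check` would wrap the pip install from Test PyPI).
calls = {"n": 0}
def flaky() -> bool:
    calls["n"] += 1
    return calls["n"] >= 3


print(retry(flaky, attempts=5, delay=0.01))  # True
```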

Contributor (Author):

Annoyingly 1 minute wasn't enough!
https://github.com/ascillitoe/alibi-detect/actions/runs/3496674516/jobs/5854858932

I've settled on 3 for now... we can add a fetch a cup of tea 🍵 stage to the instructions?!

@ascillitoe ascillitoe added this to the v0.11.0 milestone Nov 22, 2022
@ascillitoe (Contributor, Author):

Moving back to WIP until the final two TODOs are done.

@ascillitoe ascillitoe added the WIP PR is a Work in Progress label Nov 30, 2022
@ascillitoe ascillitoe modified the milestones: v0.11.0, v0.11.1 Jan 12, 2023
@ascillitoe ascillitoe closed this by deleting the head repository Apr 24, 2024