[Failing Test]: beam_PreCommit_Python_Coverage suite fails with ModuleNotFoundError: No module named 'pydantic._hypothesis_plugin' and pip check failures
#30852
Comments
Not sure how pydantic comes into the picture here yet, but for 'pydantic._hypothesis_plugin' to be available, it seems that we need 1.0.0 < pydantic < 2.0.0.
Beam test infra installs pre-release dependencies to detect possible issues ahead of releases. One command installs pydantic==2.0a4; the other installs pydantic==1.10.15. The tft requirement comes from: Line 316 in 21129a4
from pipdeptree:
This seems to be incorrect installer behavior; 2.0a4 shouldn't be installed under these constraints. Not sure how that happens. Likely what triggered the error for us was a recent release in https://pypi.org/project/google-cloud-aiplatform/#history , which added the pydantic dependency.
Actually, I misread this: it still fits the range, but the chosen version is strange; there might be more constraints.
Even though pip selects a bizarre version for pydantic (pydantic 2), this pydantic hypothesis plugin seems broken.
https://stackoverflow.com/questions/71394400/how-to-block-the-hypothesis-pytest-plugin has some discussion of how to disable it.
The hypothesis plugin might be provided by hypothesis itself, and we might need it for tests that use hypothesis.
That is the factor. The pydantic-2.0a4 distribution has the following: (py38b) :py38b$ cat lib/python3.8/site-packages/pydantic-2.0a4.dist-info/entry_points.txt
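The entry points referenced above can also be inspected programmatically, which is handy for confirming which installed distribution is registering the broken plugin. A sketch using only the standard library, assuming (as the comment suggests) that the relevant entry-point group is named `hypothesis`:

```python
from importlib.metadata import entry_points

def hypothesis_plugins():
    """List entry points in the 'hypothesis' group, which the
    hypothesis library loads when it is imported."""
    eps = entry_points()
    if hasattr(eps, "select"):              # Python 3.10+ API
        group = eps.select(group="hypothesis")
    else:                                   # Python 3.8/3.9 mapping API
        group = eps.get("hypothesis", [])
    return [(ep.name, ep.value) for ep in group]

for name, value in hypothesis_plugins():
    print(name, "->", value)
```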
One other problem that causes additional failures in this suite is that in the test environment we first install the test environment dependencies (e.g., tensorflow), then install the Beam package. This has implications for dependency resolution, and pip fails to resolve the conflicts. We might be able to prevent that if we install both sets of dependencies in the same command; asked on tox-dev/tox#2386 (comment) whether that is possible.
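For the `pip check` side of this, the consistency check pip performs can be approximated in a few lines, which makes it possible to assert on the conflicts from inside a test. A rough sketch (not pip's actual implementation) built on importlib.metadata and packaging:

```python
from importlib.metadata import PackageNotFoundError, distributions, version
from packaging.requirements import InvalidRequirement, Requirement

def find_conflicts():
    """Rough approximation of `pip check`: verify every installed
    distribution's declared requirements against what is installed."""
    problems = []
    for dist in distributions():
        for line in dist.requires or []:
            try:
                req = Requirement(line)
            except InvalidRequirement:
                continue
            # Skip extras and platform-conditional requirements that
            # don't apply in this environment:
            if req.marker is not None and not req.marker.evaluate({"extra": ""}):
                continue
            try:
                installed = version(req.name)
            except PackageNotFoundError:
                problems.append((dist.metadata["Name"], f"missing {req.name}"))
                continue
            if req.specifier and not req.specifier.contains(
                installed, prereleases=True
            ):
                problems.append(
                    (dist.metadata["Name"],
                     f"{req.name} {installed} violates {req.specifier}"))
    return problems

for owner, problem in find_conflicts():
    print(f"{owner}: {problem}")
```

Running this in the tox environment after both install steps should surface the same pydantic/tft conflict that `pip check` reports.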
pip check failures in other suites
What happened?
The 'coverage' suite runs some Beam unit tests in environments with different versions of a particular dependency; for example, we test several versions of pyarrow or tft. The py38-tft-113 suite currently fails, likely due to incompatible dependencies in the tox environment:
Clues are in the dependencies that were installed:
Issue Failure
Failure: Test is continually failing
Issue Priority
Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)
Issue Components