Test failure: test_running_real_assessment_job_ext_hms #3246

Open · github-actions bot opened this issue Nov 12, 2024 · 10 comments
Labels: bug (Something isn't working)

Comments

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 16 failures (15m23.538s). Every task failed with the same error:

Unknown: <task>: ImportError: cannot import name 'InvalidState' from 'databricks.sdk.errors' (/databricks/python/lib/python3.12/site-packages/databricks/sdk/errors/__init__.py)

for each of: assess_CLOUD_ENV_service_principals, assess_clusters, assess_dashboards, assess_global_init_scripts, assess_incompatible_submit_runs, assess_jobs, assess_pipelines, assess_workflows, crawl_cluster_policies, crawl_groups, crawl_mounts, crawl_tables, crawl_udfs, parse_logs, setup_tacl, workspace_listing
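
All 16 task failures share a single root cause: the Python 3.12 environment preinstalled on the job cluster (per the paths above) ships a databricks-sdk whose `databricks.sdk.errors` module does not export `InvalidState`, so every workflow task dies at import time. A minimal guarded-import sketch of the failure mode — the fallback class is a hypothetical illustration, not the actual UCX fix:

```python
try:
    from databricks.sdk.errors import InvalidState
except ImportError:
    # Fallback for SDK builds that do not export the symbol; hypothetical,
    # shown only to illustrate how the import-time failure could be tolerated.
    class InvalidState(Exception):
        """Local stand-in so `except InvalidState` clauses keep working."""


def is_invalid_state(err: BaseException) -> bool:
    # True for the real SDK class when it exists, else for the stand-in.
    return isinstance(err, InvalidState)
```
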
[gw6] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DpZPB_ra78a57c67: https://DATABRICKS_HOST/sql/dashboards/bc59541e-74d1-4171-a49f-448cf493030e
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DREHG_ra78a57c67: https://DATABRICKS_HOST/sql/dashboards/b2fe1a28-45f6-4707-a24e-ebb538e8cab8
05:06 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.a4X2/config.yml) doesn't exist.
05:06 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:06 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:06 INFO [databricks.labs.ucx.install] Fetching installations...
05:06 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:06 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:06 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:10 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:10 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1020241112051050
05:10 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:10 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:11 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:11 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:11 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:11 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:11 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:14 INFO [databricks.labs.ucx.install] Creating dashboards...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.a4X2/README for the next steps.
05:14 DEBUG [databricks.labs.ucx.installer.workflows] starting assessment job: https://DATABRICKS_HOST#job/37254605489178
05:14 INFO [databricks.labs.ucx.installer.workflows] Started assessment job: https://DATABRICKS_HOST#job/37254605489178/runs/95389356298218
05:14 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of assessment job: https://DATABRICKS_HOST#job/37254605489178/runs/95389356298218
05:21 INFO [databricks.labs.ucx.installer.workflows] ---------- REMOTE LOGS --------------
05:21 WARNING [databricks.labs.ucx.installer.workflows] Cannot fetch logs as folder /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.a4X2/logs/assessment does not exist
05:21 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ----------
05:21 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1020241112051050 from https://DATABRICKS_HOST
05:21 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_s4qja
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=338477512033184, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=635443392938814, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=907236380127905, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=182003173388770, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=757068275348552, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=420352026749638, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=832571309769051, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=42146361002385, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=374961209915942, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=799735299395736, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=37254605489178, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=709544506125265, as it is no longer needed
05:21 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=753654068991649, as it is no longer needed
05:21 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:21 INFO [databricks.labs.ucx.install] Deleting secret scope
05:21 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw6] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #264
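
For context on the `ManyError: Detected 16 failures` wrapper: UCX runs its workflow and installer tasks through databricks-labs-blueprint's `Threads.strict`, which executes the callables in a thread pool and re-raises every per-task exception as one aggregated `ManyError` (the tracebacks later in this thread show exactly this path through `parallel.py`). A small self-contained sketch of that aggregation, with hypothetical failing tasks standing in for the real ones:

```python
from functools import partial

from databricks.labs.blueprint.parallel import ManyError, Threads


def fail(task_name: str) -> None:
    # Hypothetical stand-in for the assess_*/crawl_* task bodies above.
    raise ImportError(f"cannot import name 'InvalidState' ({task_name})")


tasks = [partial(fail, name) for name in ("assess_jobs", "crawl_tables")]
try:
    Threads.strict("assessment", tasks)  # runs all tasks, collects errors
except ManyError as err:
    print(err)  # e.g. "Detected 2 failures: ImportError: ..."
```
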

github-actions bot added the bug (Something isn't working) label on Nov 12, 2024
github-actions bot (Author) commented:

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 16 failures (1m57.319s). Every task failed with the same error:

Unknown: <task>: run failed with error message
 Unexpected user error while preparing the cluster for the job. Cause: RESOURCE_DOES_NOT_EXIST: Can't find a cluster policy with id: 00140323ADB75B65.

for each of: assess_CLOUD_ENV_service_principals, assess_clusters, assess_dashboards, assess_global_init_scripts, assess_incompatible_submit_runs, assess_jobs, assess_pipelines, assess_workflows, crawl_cluster_policies, crawl_groups, crawl_mounts, crawl_tables, crawl_udfs, parse_logs, setup_tacl, workspace_listing
[gw3] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DdQH3_ra78a57ccb: https://DATABRICKS_HOST/sql/dashboards/ea85fe69-dfe0-4117-8192-29d8b5a585b8
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DAzMd_ra78a57ccb: https://DATABRICKS_HOST/sql/dashboards/16697a05-66c7-46d8-821f-43a6d0d8ca03
05:07 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.k4sl/config.yml) doesn't exist.
05:07 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:07 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:07 INFO [databricks.labs.ucx.install] Fetching installations...
05:07 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:07 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:07 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:07 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:07 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1320241113050716
05:07 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:07 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:08 INFO [databricks.labs.ucx.install] Creating dashboards...
05:08 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:08 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:08 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:08 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:08 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:08 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.k4sl/README for the next steps.
05:08 DEBUG [databricks.labs.ucx.installer.workflows] starting assessment job: https://DATABRICKS_HOST#job/954052104405946
05:08 INFO [databricks.labs.ucx.installer.workflows] Started assessment job: https://DATABRICKS_HOST#job/954052104405946/runs/895871531088781
05:08 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of assessment job: https://DATABRICKS_HOST#job/954052104405946/runs/895871531088781
05:08 INFO [databricks.labs.ucx.installer.workflows] ---------- REMOTE LOGS --------------
05:08 WARNING [databricks.labs.ucx.installer.workflows] Cannot fetch logs as folder /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.k4sl/logs/assessment does not exist
05:08 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ----------
05:08 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1320241113050716 from https://DATABRICKS_HOST
05:08 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_sf2ia
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=918414297691294, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=623620151831449, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=763574043154087, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=75233069681440, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=332971472558465, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1051198138656235, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=329414483581813, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=550998314862287, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=729584434720434, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=396536185105164, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1088448686759280, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=591026581394966, as it is no longer needed
05:08 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=954052104405946, as it is no longer needed
05:08 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:08 ERROR [databricks.labs.ucx.install] UCX Policy already deleted
05:08 INFO [databricks.labs.ucx.install] Deleting secret scope
05:08 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw3] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #265
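
In this run the jobs reference cluster policy 00140323ADB75B65, which no longer exists by the time each run prepares its cluster; the teardown log above even reports "UCX Policy already deleted", which points at the policy being removed (e.g. by concurrent test cleanup) while the jobs still referenced it. A minimal sketch to verify the referenced policy still exists before triggering the workflow, assuming standard environment-based auth:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import NotFound

ws = WorkspaceClient()  # standard env-based auth assumed


def policy_exists(policy_id: str) -> bool:
    """True if the cluster policy the job references still exists."""
    try:
        ws.cluster_policies.get(policy_id=policy_id)
        return True
    except NotFound:
        return False


# Guard before starting the assessment run; id taken from the failure above.
print(policy_exists("00140323ADB75B65"))
```
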

github-actions bot (Author) commented:

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path' (1m53.93s)
[gw6] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
15:52 INFO [tests.integration.conftest] Dashboard Created ucx_DfFer_ra78a57d39: https://DATABRICKS_HOST/sql/dashboards/e0eb3f01-5280-434b-bcf5-8f60ff954b62
15:52 INFO [tests.integration.conftest] Dashboard Created ucx_De7Jf_ra78a57d39: https://DATABRICKS_HOST/sql/dashboards/b32f2b22-4001-47d9-b68a-ba5ca0e620b0
15:52 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/config.yml) doesn't exist.
15:52 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
15:52 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
15:52 INFO [databricks.labs.ucx.install] Fetching installations...
15:52 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
15:52 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
15:52 DEBUG [tests.integration.conftest] Waiting for clusters to start...
15:52 DEBUG [tests.integration.conftest] Waiting for clusters to start...
15:52 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241114155231
15:52 INFO [databricks.labs.ucx.install] Creating ucx schemas...
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
15:52 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
15:53 INFO [databricks.labs.ucx.install] Creating dashboards...
15:53 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
15:53 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
15:53 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
15:53 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
15:53 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
15:53 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:01:11.758433
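
This run fails earlier, during installation: databricks-labs-lsql's `create_dashboard` passes `parent_path=` through to `self._ws.lakeview.create(...)`, but the databricks-sdk version resolved in the test venv no longer accepts that keyword, so all dashboard-install tasks raise the same `TypeError`. A minimal probe that reproduces the incompatibility without creating anything, again assuming environment-based auth:

```python
import inspect

from databricks.sdk import WorkspaceClient

ws = WorkspaceClient()  # standard env-based auth assumed

# lsql's create_dashboard calls ws.lakeview.create(..., parent_path=...);
# probe whether the installed SDK still accepts that keyword argument.
params = inspect.signature(ws.lakeview.create).parameters
if "parent_path" not in params:
    print(f"incompatible LakeviewAPI.create signature: {sorted(params)}")
```

Pinning mutually compatible `databricks-sdk` and `databricks-labs-lsql` versions is the usual remediation; the probe only confirms which signature the installed SDK exposes.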
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
15:53 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.RERJ/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
15:53 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
15:53 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:01:11.758433
15:53 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1420241114155231 from https://DATABRICKS_HOST
15:53 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_srx3b
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=544939141999906, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=227882613916020, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=608045537824853, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=366767314897189, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1071298574656839, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=929253787765645, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=382984155621239, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=982932705280342, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=412632390057609, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=791899961272090, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=499781101147960, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=167522932054843, as it is no longer needed
15:53 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=110607399073413, as it is no longer needed
15:53 INFO [databricks.labs.ucx.install] Deleting cluster policy
15:53 INFO [databricks.labs.ucx.install] Deleting secret scope
15:53 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw6] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #268
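All seven dashboard-creation tasks fail in the same frame: databricks-labs-lsql's Dashboards.create_dashboard (dashboards.py line 1136) forwards parent_path= into self._ws.lakeview.create(...), and the installed databricks-sdk rejects that keyword. This matches the databricks-sdk 0.38.0 breaking change that reworked LakeviewAPI.create() to take a Dashboard request object instead of flat keyword arguments, so an lsql build that still passes parent_path= raises this TypeError as soon as the newer SDK resolves into the environment. A minimal probe of whichever SDK is installed, using only stdlib inspect and no workspace auth (the parameter names in the comments are an assumption about the two signature generations):

```python
# Probe which generation of LakeviewAPI.create() is installed.
import inspect

from databricks.sdk.service.dashboards import LakeviewAPI

params = sorted(inspect.signature(LakeviewAPI.create).parameters)
print(params)
# Older SDKs: ['display_name', 'parent_path', 'self', 'serialized_dashboard', ...]
# Newer SDKs: ['dashboard', 'self'] -- passing parent_path= then raises the
# TypeError seen in the tracebacks above.
```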

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path' (9m45.09s)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DvbBk_ra78a57d93: https://DATABRICKS_HOST/sql/dashboards/4ee2a942-d6a6-45bd-8018-8a0e67e1c8fa
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DYsae_ra78a57d93: https://DATABRICKS_HOST/sql/dashboards/7a720ba7-8b48-45b8-8b63-34c70c7341af
05:07 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/config.yml) doesn't exist.
05:07 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:07 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:07 INFO [databricks.labs.ucx.install] Fetching installations...
05:07 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:07 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:07 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:08 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:08 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241115050848
05:08 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:08 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:16 INFO [databricks.labs.ucx.install] Creating dashboards...
05:16 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:16 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:16 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:16 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:16 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:16 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CNWE/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
05:16 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:16 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:07:32.395311
05:16 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1420241115050848 from https://DATABRICKS_HOST
05:16 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_sysru
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=216201780920778, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=147965178663226, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=83586394510429, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=548704490827662, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=745790775077077, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=896602870077349, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=487894134067735, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=221933905905146, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=278058099434376, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=975469619461389, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=722664828489659, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1088910635332112, as it is no longer needed
05:16 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1097035720740344, as it is no longer needed
05:16 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:16 INFO [databricks.labs.ucx.install] Deleting secret scope
05:16 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #269
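Two mitigations look plausible until lsql catches up with the SDK: pin databricks-sdk below the breaking release, or dispatch on the installed signature. The sketch below takes the second route; make_dashboard and its arguments are illustrative rather than UCX or lsql API, and the 0.38.0 version boundary is an assumption from the release timing:

```python
# Hedged compatibility shim: call whichever LakeviewAPI.create() generation
# is installed. Only the SDK imports are real; the helper name is illustrative.
import inspect

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard


def make_dashboard(w: WorkspaceClient, name: str, parent_path: str, spec: str) -> Dashboard:
    params = inspect.signature(w.lakeview.create).parameters
    if "parent_path" in params:
        # Pre-0.38 SDKs accept flat keyword arguments (what lsql line 1136 still sends).
        return w.lakeview.create(name, parent_path=parent_path, serialized_dashboard=spec)
    # Newer SDKs expect a Dashboard request object instead.
    return w.lakeview.create(
        dashboard=Dashboard(display_name=name, parent_path=parent_path, serialized_dashboard=spec)
    )
```

Pinning instead (databricks-sdk below 0.38.0 alongside the current lsql) avoids the shim entirely, at the cost of deferring the SDK upgrade.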

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path' (11m59.52s)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
[gw7] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DEDzC_ra78a57df7: https://DATABRICKS_HOST/sql/dashboards/131a4363-12f1-4ef1-ad66-60a74b3e849f
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_Dnwi0_ra78a57df7: https://DATABRICKS_HOST/sql/dashboards/73db689f-74c6-4772-88c8-0d4dcb35be9e
05:06 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/config.yml) doesn't exist.
05:06 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:06 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:06 INFO [databricks.labs.ucx.install] Fetching installations...
05:06 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:06 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:06 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241116051210
05:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:17 INFO [databricks.labs.ucx.install] Creating dashboards...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DEDzC_ra78a57df7: https://DATABRICKS_HOST/sql/dashboards/131a4363-12f1-4ef1-ad66-60a74b3e849f
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_Dnwi0_ra78a57df7: https://DATABRICKS_HOST/sql/dashboards/73db689f-74c6-4772-88c8-0d4dcb35be9e
05:06 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/config.yml) doesn't exist.
05:06 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:06 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:06 INFO [databricks.labs.ucx.install] Fetching installations...
05:06 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:06 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:06 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241116051210
05:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:17 INFO [databricks.labs.ucx.install] Creating dashboards...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:17 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:17 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:17 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qClm/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
05:17 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:17 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:05:37.232896
05:17 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1420241116051210 from https://DATABRICKS_HOST
05:17 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_shbi1
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=180433269774221, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=815341704457384, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=965451655954595, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=979598713100224, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=38110855367097, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=629479250998314, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=570884378124032, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=991682312869308, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=272783920165508, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=315895659101100, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1064629563042611, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=638868004404122, as it is no longer needed
05:17 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=344556713363472, as it is no longer needed
05:18 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:18 INFO [databricks.labs.ucx.install] Deleting secret scope
05:18 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw7] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
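All seven dashboard tasks in this run fail with the same TypeError, which points at a dependency mismatch rather than a flaky workspace: the nightly venv resolved a databricks-sdk release in which LakeviewAPI.create() no longer accepts keyword arguments such as parent_path, while databricks-labs-lsql (dashboards.py line 1136) still calls the old keyword form. Below is a minimal sketch of the suspected mismatch; it assumes the newer SDK's dataclass-based signature create(dashboard: Dashboard), with Dashboard coming from databricks.sdk.service.dashboards, and the display name and parent path are placeholders:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard

ws = WorkspaceClient()  # assumes workspace auth is already configured

# Old keyword form, as still emitted by databricks/labs/lsql/dashboards.py:1136.
# Against the newer SDK this raises:
#   TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
# ws.lakeview.create(
#     display_name="ucx_assessment_main",
#     parent_path="/Users/me/.ucx/dashboards",
#     serialized_dashboard="{}",
# )

# Newer dataclass form (assumption: this is the replacement signature):
created = ws.lakeview.create(
    Dashboard(
        display_name="ucx_assessment_main",
        parent_path="/Users/me/.ucx/dashboards",
        serialized_dashboard="{}",
    )
)
print(created.dashboard_id)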

Running from nightly #270

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path' (11m46.932s)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DZHUN_ra78a57e5b: https://DATABRICKS_HOST/sql/dashboards/5c9ed686-92c4-460e-b4f3-3a337cfcd931
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_D4GUc_ra78a57e5b: https://DATABRICKS_HOST/sql/dashboards/1855f45c-a79a-4db8-b9dd-b65723621d94
05:06 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/config.yml) doesn't exist.
05:06 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:06 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:06 INFO [databricks.labs.ucx.install] Fetching installations...
05:06 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:06 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:06 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241117051233
05:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:18 INFO [databricks.labs.ucx.install] Creating dashboards...
05:18 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:18 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:18 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:18 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:18 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:18 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.plWK/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
05:18 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:18 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:05:29.004821
05:18 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1420241117051233 from https://DATABRICKS_HOST
05:18 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_sjvj7
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=289644693214686, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=379193483552809, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=821912787312295, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=487419092463209, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=788813564283706, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=48265887839815, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=89860362502518, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1082872223854271, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1009233685145901, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=423122979785642, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=983559488133407, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=104511583977069, as it is no longer needed
05:18 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=986178405803317, as it is no longer needed
05:18 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:18 INFO [databricks.labs.ucx.install] Deleting secret scope
05:18 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
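Same failure signature on this run, so until databricks-labs-lsql ships a release that calls the new dataclass form, a plausible stopgap for the integration-test environment is to constrain the SDK below the suspected breaking release (assumption: the keyword form of LakeviewAPI.create() was dropped in databricks-sdk 0.38.0; the lsql version floor shown is hypothetical):

# constraints.txt sketch for the nightly venv
databricks-sdk<0.38.0           # keep the keyword-argument Lakeview signature
# ...or, once available, bump lsql instead of pinning the SDK:
# databricks-labs-lsql>=0.14.0  # hypothetical release that adopts the dataclass form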

Running from nightly #271

❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path' (7m46.755s)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DJfdz_ra78a57ebf: https://DATABRICKS_HOST/sql/dashboards/9d75b015-d171-4e5e-8e01-846f2c54286a
05:07 INFO [tests.integration.conftest] Dashboard Created ucx_DL5bu_ra78a57ebf: https://DATABRICKS_HOST/sql/dashboards/8968d9fd-678f-4414-8bf2-c390123dabff
05:07 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/config.yml) doesn't exist.
05:07 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:07 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:07 INFO [databricks.labs.ucx.install] Fetching installations...
05:07 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:07 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:07 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:13 INFO [databricks.labs.ucx.install] Installing UCX v0.49.1+1420241118051300
05:13 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
05:13 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
05:14 INFO [databricks.labs.ucx.install] Creating dashboards...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:14 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:14 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:14 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Ji6G/dashboards') task failed: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1136, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(
TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
05:14 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: TypeError: LakeviewAPI.create() got an unexpected keyword argument 'parent_path'
05:14 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:01:31.581854
05:14 INFO [databricks.labs.ucx.install] Deleting UCX v0.49.1+1420241118051300 from https://DATABRICKS_HOST
05:14 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_s4pxz
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=248361500770857, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=489868287868228, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=544046103063899, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=480536979125949, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=261350678755682, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=829407335632688, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=311491987834530, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=815415570746357, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=386293320373494, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=924716775713027, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=889137932269257, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=934234519011589, as it is no longer needed
05:14 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=751305824560841, as it is no longer needed
05:14 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:14 INFO [databricks.labs.ucx.install] Deleting secret scope
05:14 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
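
Triage note: all seven dashboard tasks fail on the same call. `databricks/labs/lsql/dashboards.py:1136` invokes `self._ws.lakeview.create(...)` with a `parent_path=` keyword that the installed `databricks-sdk` no longer accepts, so this looks like version skew between `databricks-labs-lsql` and `databricks-sdk` rather than a problem with the dashboard content itself. The sketch below is a hypothetical pre-flight check (not part of ucx or lsql) that classifies the installed `LakeviewAPI.create()` signature; it assumes only that `WorkspaceClient.lakeview.create` is an ordinary bound method.

```python
# Hypothetical pre-flight check for the lsql/SDK skew seen above; a diagnostic
# sketch only, not part of ucx or lsql.
import inspect

from databricks.sdk import WorkspaceClient


def lakeview_create_style(ws: WorkspaceClient) -> str:
    """Classify the installed SDK's LakeviewAPI.create() signature."""
    params = inspect.signature(ws.lakeview.create).parameters
    if "parent_path" in params:
        # Older SDKs: create(display_name, *, parent_path=..., serialized_dashboard=...)
        return "flat-kwargs"
    # Newer SDKs: create(dashboard: Dashboard) -- the shape this run hit
    return "dashboard-object"


if __name__ == "__main__":
    # Picks up workspace credentials from the environment.
    print(lakeview_create_style(WorkspaceClient()))
```

Running this against the environment above should report the newer, single-argument form, which is consistent with the follow-up failure in the next nightly run below.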

Running from nightly #272


❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: AttributeError: 'dict' object has no attribute 'as_dict' (1m31.771s)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: AttributeError: 'dict' object has no attribute 'as_dict'
[gw3] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
09:51 INFO [tests.integration.conftest] Dashboard Created ucx_DizYt_ra78a57f27: https://DATABRICKS_HOST/sql/dashboards/f21effc2-0214-4e3c-975f-9ca25c9a159c
09:51 INFO [tests.integration.conftest] Dashboard Created ucx_DrM5v_ra78a57f27: https://DATABRICKS_HOST/sql/dashboards/19959f23-d271-4b94-9a9a-93f75cdc1eae
09:51 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/config.yml) doesn't exist.
09:51 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
09:51 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
09:51 INFO [databricks.labs.ucx.install] Fetching installations...
09:51 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
09:51 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
09:51 DEBUG [tests.integration.conftest] Waiting for clusters to start...
09:51 DEBUG [tests.integration.conftest] Waiting for clusters to start...
09:51 INFO [databricks.labs.ucx.install] Installing UCX v0.50.1+320241119095113
09:51 INFO [databricks.labs.ucx.install] Creating ucx schemas...
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-legacy
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
09:51 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
09:51 INFO [databricks.labs.ucx.install] Creating dashboards...
09:51 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
09:51 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
09:51 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
09:51 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
09:51 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
09:51 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:51 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:51 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:51 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:51 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:51 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:52 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:52 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:52 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:52 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:52 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:52 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:52 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
09:52 ERROR [databricks.labs.blueprint.parallel] installing dashboards(PosixPath('/home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main'), parent_path='/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.2oAy/dashboards') task failed: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 653, in _create_dashboard
    dashboard = Dashboards(self._ws).create_dashboard(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/dashboards.py", line 1138, in create_dashboard
    sdk_dashboard = self._ws.lakeview.create(dashboard=dashboard_to_create.as_dict())  # type: ignore
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/dashboards.py", line 1147, in create
    body = dashboard.as_dict()
AttributeError: 'dict' object has no attribute 'as_dict'
09:52 CRITICAL [databricks.labs.blueprint.parallel] All 'installing dashboards' tasks failed!!!
09:52 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Detected 7 failures: AttributeError: 'dict' object has no attribute 'as_dict'
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/install.py", line 540, in _create_database_and_dashboards
    Threads.strict("installing dashboards", list(self._get_create_dashboard_tasks()))
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 63, in strict
    raise ManyError(errs)
databricks.labs.blueprint.parallel.ManyError: Detected 7 failures: AttributeError: 'dict' object has no attribute 'as_dict'
09:52 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:00:47.344407
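
This failure looks like the other half of the same version skew: `databricks/labs/lsql/dashboards.py:1138` has moved to the single-argument form but pre-serializes the payload (`dashboard_to_create.as_dict()`), while the installed SDK's `create()` (`databricks/sdk/service/dashboards.py:1147`) expects the `Dashboard` dataclass and calls `.as_dict()` on it itself, so the plain `dict` raises before any request is sent. A minimal reproduction sketch, assuming a databricks-sdk release where `LakeviewAPI.create(dashboard: Dashboard)` is the signature; the `display_name` and `parent_path` values are placeholders:

```python
# Sketch of the mismatch behind the AttributeError above. Assumes a databricks-sdk
# release whose LakeviewAPI.create() takes a single Dashboard dataclass.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard

ws = WorkspaceClient()
dashboard = Dashboard(
    display_name="ucx_assessment_main",            # placeholder name
    parent_path="/Workspace/Users/me/dashboards",  # placeholder path
)

# Correct for this SDK: pass the dataclass; create() serializes it via .as_dict().
created = ws.lakeview.create(dashboard=dashboard)

# What lsql's line 1138 effectively does, reproducing the failure: a plain dict
# has no .as_dict(), so create() raises AttributeError.
# ws.lakeview.create(dashboard=dashboard.as_dict())
```

Either way, the likely fix is to pin `databricks-labs-lsql` and `databricks-sdk` to versions where the caller and the `create()` signature agree.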
09:52 INFO [databricks.labs.ucx.install] Deleting UCX v0.50.1+320241119095113 from https://DATABRICKS_HOST
09:52 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_snofc
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=953645958524961, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=234959655430387, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=57047917607080, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=340000683331110, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=128639934004722, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=520831883759980, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=986161998730091, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=479805507876714, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=490021059530510, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=801905888307786, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=375736354199303, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=82197583787384, as it is no longer needed
09:52 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1091297107029717, as it is no longer needed
09:52 INFO [databricks.labs.ucx.install] Deleting cluster policy
09:52 INFO [databricks.labs.ucx.install] Deleting secret scope
09:52 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw3] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
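
For context on the repeated `AttributeError` above: the traceback shows `databricks-labs-lsql` pre-serializing the dashboard (`dashboard_to_create.as_dict()`, note the `# type: ignore`) before handing it to the SDK, whose `lakeview.create` then calls `.as_dict()` again on what is by then a plain dict. A minimal, self-contained sketch of that failure mode (the `Dashboard`/`create` stand-ins below are hypothetical, not the real SDK classes):

```python
from dataclasses import dataclass


@dataclass
class Dashboard:
    """Stand-in for the SDK's dashboard dataclass (hypothetical)."""
    display_name: str

    def as_dict(self) -> dict:
        return {"display_name": self.display_name}


def create(dashboard: Dashboard) -> dict:
    """Stand-in for lakeview.create: expects the dataclass and serializes it itself."""
    body = dashboard.as_dict()  # fails if the caller already serialized
    return body


dashboard_to_create = Dashboard("ucx-main")
create(dashboard_to_create)  # works: the dataclass is passed through

try:
    create(dashboard_to_create.as_dict())  # caller pre-serializes, callee serializes again
except AttributeError as err:
    print(err)  # 'dict' object has no attribute 'as_dict'
```

The double serialization points to an lsql/SDK version mismatch: either lsql should pass the dataclass straight through, or the installed SDK needs to be pinned to a version whose `create` still accepted a dict.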

Running from nightly #276

❌ test_running_real_assessment_job_ext_hms: databricks.sdk.errors.platform.InternalError: Failed to list databases due to Py4JSecurityException. Update or reinstall UCX to resolve this issue. (21m28.191s)
... (skipped 324448 bytes)
peline-ns9g-ra78a57f32",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_szdk6"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/6c862dbb-1c81-43e7-839a-f2d6ad30d08e
< 200 OK
< {
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "last_modified": 1732048325010,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-19T20:32:05.235Z",
<       "state": "FAILED",
<       "update_id": "81eebaf0-f269-4b5f-8263-774d72d5b482"
<     }
<   ],
<   "name": "pipeline-vm77-ra78a57f32",
<   "pipeline_id": "6c862dbb-1c81-43e7-839a-f2d6ad30d08e",
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "spec": {
<     "catalog": "dummy_cyamc",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024111922",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/45b44955-522e-4560-92ef-1c7b2ecc5354",
<       "pipelines.migration.hmsTarget": "dummy_s3cjb",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "6c862dbb-1c81-43e7-839a-f2d6ad30d08e",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-xKJM-ra78a57f32/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-vm77-ra78a57f32",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_s3cjb"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/92b17352-1dbc-4f93-bfac-3a1bca58290d
< 200 OK
< {
<   "creator_user_name": "[email protected]",
<   "last_modified": 1732045355553,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-19T19:42:35.836Z",
<       "state": "FAILED",
<       "update_id": "a8b0d2c4-6e98-46cd-849b-ecd29538fdf0"
<     }
<   ],
<   "name": "pipeline-zsrq-ra78a57f31",
<   "pipeline_id": "92b17352-1dbc-4f93-bfac-3a1bca58290d",
<   "run_as_user_name": "[email protected]",
<   "spec": {
<     "catalog": "dummy_cmmhb",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024111921",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/ad060e86-f3b1-4e72-8cd6-5cc1a6275a7e",
<       "pipelines.migration.hmsTarget": "dummy_sr7vd",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "92b17352-1dbc-4f93-bfac-3a1bca58290d",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/[email protected]/dummy-L40I-ra78a57f31/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-zsrq-ra78a57f31",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_sr7vd"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/94d35685-384b-470a-848b-be5747643123
< 200 OK
< {
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "last_modified": 1732046264259,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-19T19:57:44.470Z",
<       "state": "FAILED",
<       "update_id": "6c0d44c8-8f3b-443e-87ad-8e3f7fd26195"
<     }
<   ],
<   "name": "pipeline-jo0i-ra78a57f31",
<   "pipeline_id": "94d35685-384b-470a-848b-be5747643123",
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "spec": {
<     "catalog": "dummy_cs3sr",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024111921",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/e7f4d68d-9eb7-470e-8fa5-0bdd1faad470",
<       "pipelines.migration.hmsTarget": "dummy_su3dd",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "94d35685-384b-470a-848b-be5747643123",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Yb68-ra78a57f31/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-jo0i-ra78a57f31",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_su3dd"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/9608489c-4bdc-4bec-8223-f0ef77919390
< 200 OK
< {
<   "creator_user_name": "[email protected]",
<   "last_modified": 1732046326269,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-19T19:58:46.485Z",
<       "state": "FAILED",
<       "update_id": "67622001-b56f-4cf8-80a4-65fc4749e70f"
<     }
<   ],
<   "name": "pipeline-b7qg-ra78a57f31",
<   "pipeline_id": "9608489c-4bdc-4bec-8223-f0ef77919390",
<   "run_as_user_name": "[email protected]",
<   "spec": {
<     "catalog": "dummy_cdbvx",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024111921",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/a14f0583-08e0-4655-96d5-e6a09bade1bf",
<       "pipelines.migration.hmsTarget": "dummy_subz7",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "9608489c-4bdc-4bec-8223-f0ef77919390",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/[email protected]/dummy-PQEb-ra78a57f31/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-b7qg-ra78a57f31",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_subz7"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/a78ec1f9-eb26-4bcb-9373-4b81eeba38cf
< 200 OK
< {
<   "creator_user_name": "[email protected]",
<   "last_modified": 1732119772160,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-20T16:22:52.465Z",
<       "state": "FAILED",
<       "update_id": "9a93e22d-2fb9-4c9e-80fe-69f5c5637ce0"
<     }
<   ],
<   "name": "pipeline-twi1-ra78a57f92",
<   "pipeline_id": "a78ec1f9-eb26-4bcb-9373-4b81eeba38cf",
<   "run_as_user_name": "[email protected]",
<   "spec": {
<     "catalog": "dummy_c0io3",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024112018",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/3b3d0159-18ef-4bf9-a43a-690101618f4c",
<       "pipelines.migration.hmsTarget": "dummy_sgw9k",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "a78ec1f9-eb26-4bcb-9373-4b81eeba38cf",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/[email protected]/dummy-MAox-ra78a57f92/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-twi1-ra78a57f92",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_sgw9k"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/c4a57e44-7bc7-44d1-9e60-5d6487fb7252
< 200 OK
< {
<   "creator_user_name": "[email protected]",
<   "last_modified": 1732118670607,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-20T16:04:30.841Z",
<       "state": "FAILED",
<       "update_id": "03e9d5ed-7a50-432d-a5e3-3537dbdc92da"
<     }
<   ],
<   "name": "pipeline-wg9g-ra78a57f92",
<   "pipeline_id": "c4a57e44-7bc7-44d1-9e60-5d6487fb7252",
<   "run_as_user_name": "[email protected]",
<   "spec": {
<     "catalog": "dummy_c7qag",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024112018",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/90d0f7a6-9962-4892-be8f-210772ee8759",
<       "pipelines.migration.hmsTarget": "dummy_shrb7",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "c4a57e44-7bc7-44d1-9e60-5d6487fb7252",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/[email protected]/dummy-DEBi-ra78a57f92/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-wg9g-ra78a57f92",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_shrb7"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/c94b3ab7-418b-4b92-966f-49e862dcbeed
< 200 OK
< {
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "last_modified": 1732046671091,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-19T20:04:31.294Z",
<       "state": "FAILED",
<       "update_id": "8cdbf0ba-03a4-4bbf-80c3-617894e26baa"
<     }
<   ],
<   "name": "pipeline-htoo-ra78a57f32",
<   "pipeline_id": "c94b3ab7-418b-4b92-966f-49e862dcbeed",
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "spec": {
<     "catalog": "dummy_c1i1h",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024111922",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/5a475b50-bc02-438a-9b00-bc67d9c1872e",
<       "pipelines.migration.hmsTarget": "dummy_svcqz",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "c94b3ab7-418b-4b92-966f-49e862dcbeed",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-xx7s-ra78a57f32/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-htoo-ra78a57f32",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_svcqz"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.sdk:assess_pipelines] GET /api/2.0/pipelines/cf327397-c654-4d14-a3e9-8101a4480105
< 200 OK
< {
<   "creator_user_name": "[email protected]",
<   "last_modified": 1732118848262,
<   "latest_updates": [
<     {
<       "creation_time": "2024-11-20T16:07:28.537Z",
<       "state": "FAILED",
<       "update_id": "09e80195-c607-4138-a8f1-23f6dbef9252"
<     }
<   ],
<   "name": "pipeline-u752-ra78a57f92",
<   "pipeline_id": "cf327397-c654-4d14-a3e9-8101a4480105",
<   "run_as_user_name": "[email protected]",
<   "spec": {
<     "catalog": "dummy_cjexi",
<     "clusters": [
<       {
<         "custom_tags": {
<           "RemoveAfter": "2024112018",
<           "cluster_type": "TEST_SCHEMA"
<         },
<         "label": "TEST_SCHEMA",
<         "node_type_id": "Standard_D4pds_v6",
<         "num_workers": 1
<       }
<     ],
<     "configuration": {
<       "pipelines.migration.hmsStorage": "dbfs:/pipelines/8de5ed83-a9a4-48a7-b589-86b4682d06b1",
<       "pipelines.migration.hmsTarget": "dummy_sn48b",
<       "pipelines.migration.ignoreExplicitPath": "true"
<     },
<     "id": "cf327397-c654-4d14-a3e9-8101a4480105",
<     "libraries": [
<       {
<         "notebook": {
<           "path": "/Users/[email protected]/dummy-0bo6-ra78a57f92/dlt_notebook.py"
<         }
<       }
<     ],
<     "name": "pipeline-u752-ra78a57f92",
<     "pipeline_type": "WORKSPACE",
<     "target": "dummy_sn48b"
<   },
<   "state": "IDLE"
< }
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:assess_pipelines] [hive_metastore.dummy_saq7k.pipelines] found 17 new records for pipelines
05:27 INFO [databricks.labs.ucx:crawl_mounts] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/crawl_mounts.log
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_mounts] [hive_metastore.dummy_saq7k.mounts] fetching mounts inventory
05:27 DEBUG [databricks.labs.lsql.backends:crawl_mounts] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`mounts`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_mounts] [hive_metastore.dummy_saq7k.mounts] crawling new set of snapshot data for mounts
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_mounts] [hive_metastore.dummy_saq7k.mounts] found 8 new records for mounts
05:27 INFO [databricks.labs.ucx:assess_workflows] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/assess_workflows.log
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list
< 200 OK
< {
<   "has_more": true,
<   "jobs": [
<     {
<       "created_time": 1732166541916,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 69833295128531,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[OCPA] migrate-tables",
<         "tags": {
<           "RemoveAfter": "2024112109",
<           "version": "v0.50.1+720241121052214"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (19 additional elements)"
<   ],
<   "next_page_token": "CAEorZyt6bQyMInFwenM9zE="
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=69833295128531
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=829099560200595
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=154610563329649
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=360528009852719
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=878658693773912
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=922732365922699
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=188707081133857
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1101224239376047
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=650343664119188
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=696156555742785
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1006710622635165
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1064365143046786
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=755683372170208
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=605441179638801
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=533237508805192
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=850905022783821
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=906229698663301
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=840988217236253
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=547569964699010
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=219613710279305
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list?page_token=CAEorZyt6bQyMInFwenM9zE=
< 200 OK
< {
<   "has_more": true,
<   "jobs": [
<     {
<       "created_time": 1732166503610,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 824191750181297,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[KMYS] migrate-groups",
<         "tags": {
<           "RemoveAfter": "2024112109",
<           "version": "v0.50.1+720241121052138"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (19 additional elements)"
<   ],
<   "next_page_token": "CAEood-n6bQyMPXc5qa712U=",
<   "prev_page_token": "CAAoupmt6bQyMLGj_MORs7sB"
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=824191750181297
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=828169549869394
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=510125214893722
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=897571195813915
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=663205049239907
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=665929046228890
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=946190212077690
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1112199966487772
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=776579258624269
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=628527340757
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1047110507954570
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=463032459793911
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=289758399718803
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=325406761702440
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=618918064434139
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=403138757339967
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=591557043254101
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=923798566571511
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=630407480598789
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=447207913926261
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list?page_token=CAEood-n6bQyMPXc5qa712U=
< 200 OK
< {
<   "has_more": false,
<   "jobs": [
<     {
<       "created_time": 1732165978703,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 548098714908813,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[PFHC] migrate-external-hiveserde-tables-in-place-experimental",
<         "tags": {
<           "RemoveAfter": "2024112109",
<           "version": "v0.50.1+720241121051244"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (16 additional elements)"
<   ],
<   "prev_page_token": "CAAoz5SN6bQyMI3x0Oniz3w="
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=548098714908813
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1064290974752324
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=605129136231567
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=753437250657406
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=251198955203898
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1086250968066529
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=329035031235111
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=426873636246364
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=580680049988311
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=964372515693499
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=106355500733655
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=787913020673530
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=488006148415052
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=1125113270283935: dummy-jHHtG
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=841001801377893: dummy-jY4XL
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=152846511905881: dummy-jAXqm
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=13862932966175: dummy-j3qYZ
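
The three `GET /api/2.1/jobs/list` requests above page through the jobs list by chaining `next_page_token` until `has_more` is false; each job is then either skipped or queued for linting. With the Databricks SDK that pagination is handled by the generator `jobs.list()` returns, so crawler code never touches the tokens directly, e.g.:

```python
from databricks.sdk import WorkspaceClient

ws = WorkspaceClient()
# The generator transparently follows next_page_token across pages.
for job in ws.jobs.list():
    name = job.settings.name if job.settings else None
    print(job.job_id, name)
```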
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Running 4 linting tasks in parallel...
05:27 DEBUG [databricks.labs.blueprint.parallel:assess_workflows] Starting 4 tasks in 16 threads
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=841001801377893
< 200 OK
< {
<   "created_time": 1732165629985,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 841001801377893,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jY4XL",
<     "tags": {
<       "RemoveAfter": "2024112107"
<     },
<     "tasks": [
<       {
<         "description": "B61s",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "run_if": "ALL_SUCCESS",
<         "spark_python_task": {
<           "python_file": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py"
<         },
<         "task_key": "2zdn",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering 2zdn entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=1125113270283935
< 200 OK
< {
<   "created_time": 1732165631145,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 1125113270283935,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jHHtG",
<     "tags": {
<       "RemoveAfter": "2024112107"
<     },
<     "tasks": [
<       {
<         "description": "Ytyn",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "run_if": "ALL_SUCCESS",
<         "spark_python_task": {
<           "python_file": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py"
<         },
<         "task_key": "bsGn",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering bsGn entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=152846511905881
< 200 OK
< {
<   "created_time": 1732165628778,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 152846511905881,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jAXqm",
<     "tags": {
<       "RemoveAfter": "2024112107"
<     },
<     "tasks": [
<       {
<         "description": "mIKY",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "notebook_task": {
<           "notebook_path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb",
<           "source": "WORKSPACE"
<         },
<         "run_if": "ALL_SUCCESS",
<         "task_key": "47Nj",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering 47Nj entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=13862932966175
< 200 OK
< {
<   "created_time": 1732165627784,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 13862932966175,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-j3qYZ",
<     "tags": {
<       "RemoveAfter": "2024112107"
<     },
<     "tasks": [
<       {
<         "description": "36iD",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "notebook_task": {
<           "notebook_path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb",
<           "source": "WORKSPACE"
<         },
<         "run_if": "ALL_SUCCESS",
<         "task_key": "BfDE",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering BfDE entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py
< 200 OK
< {
<   "created_at": 1732165629592,
<   "modified_at": 1732165629592,
<   "object_id": 2626665947003134,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py",
<   "resource_id": "2626665947003134"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165628468,
<   "language": "PYTHON",
<   "modified_at": 1732165628468,
<   "object_id": 2626665947003132,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb",
<   "resource_id": "2626665947003132"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165627455,
<   "language": "PYTHON",
<   "modified_at": 1732165627455,
<   "object_id": 2626665947003125,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb",
<   "resource_id": "2626665947003125"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py
< 200 OK
< {
<   "created_at": 1732165629592,
<   "modified_at": 1732165629592,
<   "object_id": 2626665947003134,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py",
<   "resource_id": "2626665947003134"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py
< 200 OK
< {
<   "created_at": 1732165630792,
<   "modified_at": 1732165630792,
<   "object_id": 2626665947003135,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py",
<   "resource_id": "2626665947003135"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165627455,
<   "language": "PYTHON",
<   "modified_at": 1732165627455,
<   "object_id": 2626665947003125,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb",
<   "resource_id": "2626665947003125"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165628468,
<   "language": "PYTHON",
<   "modified_at": 1732165628468,
<   "object_id": 2626665947003132,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb",
<   "resource_id": "2626665947003132"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py
< 200 OK
< {
<   "created_at": 1732165630792,
<   "modified_at": 1732165630792,
<   "object_id": 2626665947003135,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py",
<   "resource_id": "2626665947003135"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165627455,
<   "language": "PYTHON",
<   "modified_at": 1732165627455,
<   "object_id": 2626665947003125,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb",
<   "resource_id": "2626665947003125"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb
< 200 OK
< {
<   "created_at": 1732165628468,
<   "language": "PYTHON",
<   "modified_at": 1732165628468,
<   "object_id": 2626665947003132,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb",
<   "resource_id": "2626665947003132"
< }
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:27 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting 2zdn dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py>
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting bsGn dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py>
05:27 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-AXI3-ra78a57feb.py:0 [direct-filesystem-access] The use of direct filesystem references is deprecated: dbfs://mnt/file/
05:27 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-0z1h-ra78a57feb.py:0 [TEST_SCHEMA-format-changed-in-dbr8] The TEST_SCHEMA format changed in Databricks Runtime 8.0, from Parquet to Delta
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting 47Nj dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb>
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting BfDE dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb>
05:27 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-KHVR-ra78a57feb:1 [TEST_SCHEMA-format-changed-in-dbr8] The TEST_SCHEMA format changed in Databricks Runtime 8.0, from Parquet to Delta
05:27 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-Pjqk-ra78a57feb:1 [direct-filesystem-access] The use of direct filesystem references is deprecated: dbfs://mnt/notebook/
05:27 INFO [databricks.labs.blueprint.parallel:assess_workflows] linting workflows 4/4, rps: 1.126/sec
05:27 INFO [databricks.labs.blueprint.parallel:assess_workflows] Finished 'linting workflows' tasks: 100% results available (4/4). Took 0:00:03.559575
05:27 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Saving 4 linting problems...
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:assess_workflows] [hive_metastore.dummy_saq7k.directfs_in_paths] found 2 new records for directfs_in_paths
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:assess_workflows] [hive_metastore.dummy_saq7k.used_tables_in_paths] found 2 new records for used_tables_in_paths
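
The four findings above come from two advisory rules: `direct-filesystem-access` (direct `dbfs:/` references) and the DBR 8.0 format change (`TEST_SCHEMA` appears to be this log's redaction placeholder for the rule's real keyword). A hedged illustration of the kind of code that would trigger each rule; these snippets are hypothetical, not the actual test fixtures:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# [direct-filesystem-access]: direct DBFS/mount references are deprecated
# in favour of Unity Catalog tables and volumes.
df = spark.read.format("parquet").load("dbfs://mnt/file/")

# [...-format-changed-in-dbr8]: creating a table without an explicit format
# relies on the default, which changed from Parquet to Delta in DBR 8.0
# (assumed trigger for this rule).
spark.sql("CREATE TABLE my_schema.my_table (id INT)")
```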
05:27 INFO [databricks.labs.ucx:assess_dashboards] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/assess_dashboards.log
05:27 DEBUG [databricks.sdk:assess_dashboards] GET /api/2.0/preview/sql/dashboards/2aa774db-d540-49b5-9e37-b4c7d6f8e21d
< 200 OK
< {
<   "can_edit": true,
<   "color_palette": null,
<   "created_at": "2024-11-21T05:07:12Z",
<   "dashboard_filters_enabled": false,
<   "data_source_id": null,
<   "id": "2aa774db-d540-49b5-9e37-b4c7d6f8e21d",
<   "is_archived": false,
<   "is_draft": false,
<   "is_favorite": false,
<   "name": "ucx_DcSOS_ra78a57feb",
<   "options": {
<     "folder_node_internal_name": "tree/2626665947003141",
<     "folder_node_status": "ACTIVE",
<     "parent": "folders/3865756826903956",
<     "run_as_role": "owner"
<   },
<   "parent": "folders/3865756826903956",
<   "permission_tier": "CAN_MANAGE",
<   "run_as_role": "owner",
<   "run_as_service_principal_id": null,
<   "slug": "ucx_dcsos_ra78a57feb",
<   "tags": [
<     "original_dashboard_tag"
<   ],
<   "updated_at": "2024-11-21T05:07:13Z",
<   "user": {
<     "email": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<     "id": 481119220561874,
<     "name": "labs-account-admin-identity"
<   },
<   "user_id": 481119220561874,
<   "version": 2,
<   "warehouse_id": null,
<   "widgets": [
<     {
<       "created_at": "2024-11-21T05:07:13Z",
<       "dashboard_id": "2aa774db-d540-49b5-9e37-b4c7d6f8e21d",
<       "id": "ab0216b2-f0c5-4a89-acb1-48560279f665",
<       "options": {
<         "position": {
<           "autoHeight": null,
<           "col": 0,
<           "maxSizeX": null,
<           "maxSizeY": null,
<           "minSizeX": null,
<           "minSizeY": null,
<           "row": 0,
<           "sizeX": 3,
<           "sizeY": 3
<         },
<         "title": ""
<       },
<       "text": "",
<       "updated_at": "2024-11-21T05:07:13Z",
<       "visualization": {
<         "created_at": "2024-11-21T05:07:12Z",
<         "description": "",
<         "id": "45b50b1e-1888-4296-a85a-cd7ac1d8cc9f",
<         "name": "",
<         "options": {
<           "columns": [
<             {
<               "allowSearch": true,
<               "name": "id",
<               "title": "id"
<             }
<           ],
<           "condensed": true,
<           "itemsPerPage": 1,
<           "version": 2,
<           "withRowNumber": false
<         },
<         "query": {
<           "created_at": "2024-11-21T05:07:12Z",
<           "data_source_id": null,
<           "description": "Test query",
<           "id": "ebbc2cc4-6212-41a7-9c02-c9aaa52e172a",
<           "is_draft": false,
<           "is_safe": true,
<           "name": "dummy_query_QjZ65",
<           "options": {
<             "catalog": null,
<             "folder_node_internal_name": "tree/2626665947003138",
<             "folder_node_status": "ACTIVE",
<             "parameters": [],
<             "parent": "folders/3865756826903956",
<             "run_as_role": "owner",
<             "schema": null,
<             "visualization_control_order": []
<           },
<           "query": "SELECT * from parquet.`dbfs://mnt/foo2/bar2`",
<           "run_as_role": "owner",
<           "run_as_service_principal_id": null,
<           "tags": [
<             "{\"key\": \"RemoveAfter\", \"value\": \"2024112107\"}"
<           ],
<           "updated_at": "2024-11-21T05:07:12Z",
<           "user_id": 481119220561874,
<           "version": 1
<         },
<         "query_plan": null,
<         "type": "TABLE",
<         "updated_at": "2024-11-21T05:07:12Z"
<       },
<       "width": 1
<     }
<   ]
< }
05:27 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Linting dashboard_id=2aa774db-d540-49b5-9e37-b4c7d6f8e21d: ucx_DcSOS_ra78a57feb
05:27 DEBUG [databricks.sdk:assess_dashboards] GET /api/2.0/preview/sql/dashboards/bccf44e3-374d-4775-aac8-24398e68d193
< 200 OK
< {
<   "can_edit": true,
<   "color_palette": null,
<   "created_at": "2024-11-21T05:07:14Z",
<   "dashboard_filters_enabled": false,
<   "data_source_id": null,
<   "id": "bccf44e3-374d-4775-aac8-24398e68d193",
<   "is_archived": false,
<   "is_draft": false,
<   "is_favorite": false,
<   "name": "ucx_DefCp_ra78a57feb",
<   "options": {
<     "folder_node_internal_name": "tree/2626665947003143",
<     "folder_node_status": "ACTIVE",
<     "parent": "folders/3865756826903956",
<     "run_as_role": "owner"
<   },
<   "parent": "folders/3865756826903956",
<   "permission_tier": "CAN_MANAGE",
<   "run_as_role": "owner",
<   "run_as_service_principal_id": null,
<   "slug": "ucx_defcp_ra78a57feb",
<   "tags": [
<     "original_dashboard_tag"
<   ],
<   "updated_at": "2024-11-21T05:07:14Z",
<   "user": {
<     "email": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<     "id": 481119220561874,
<     "name": "labs-account-admin-identity"
<   },
<   "user_id": 481119220561874,
<   "version": 2,
<   "warehouse_id": null,
<   "widgets": [
<     {
<       "created_at": "2024-11-21T05:07:14Z",
<       "dashboard_id": "bccf44e3-374d-4775-aac8-24398e68d193",
<       "id": "d3b1f62e-2c31-4247-a776-68d295bb2cfd",
<       "options": {
<         "position": {
<           "autoHeight": null,
<           "col": 0,
<           "maxSizeX": null,
<           "maxSizeY": null,
<           "minSizeX": null,
<           "minSizeY": null,
<           "row": 0,
<           "sizeX": 3,
<           "sizeY": 3
<         },
<         "title": ""
<       },
<       "text": "",
<       "updated_at": "2024-11-21T05:07:14Z",
<       "visualization": {
<         "created_at": "2024-11-21T05:07:13Z",
<         "description": "",
<         "id": "9a5725f7-4912-46cd-ba3f-bd022aae277c",
<         "name": "",
<         "options": {
<           "columns": [
<             {
<               "allowSearch": true,
<               "name": "id",
<               "title": "id"
<             }
<           ],
<           "condensed": true,
<           "itemsPerPage": 1,
<           "version": 2,
<           "withRowNumber": false
<         },
<         "query": {
<           "created_at": "2024-11-21T05:07:13Z",
<           "data_source_id": null,
<           "description": "Test query",
<           "id": "8509d40c-c5f8-4e99-b666-d07cc967879e",
<           "is_draft": false,
<           "is_safe": true,
<           "name": "dummy_query_QnMaW",
<           "options": {
<             "catalog": null,
<             "folder_node_internal_name": "tree/2626665947003142",
<             "folder_node_status": "ACTIVE",
<             "parameters": [],
<             "parent": "folders/3865756826903956",
<             "run_as_role": "owner",
<             "schema": null,
<             "visualization_control_order": []
<           },
<           "query": "SELECT * from my_schema.my_table",
<           "run_as_role": "owner",
<           "run_as_service_principal_id": null,
<           "tags": [
<             "{\"key\": \"RemoveAfter\", \"value\": \"2024112107\"}"
<           ],
<           "updated_at": "2024-11-21T05:07:13Z",
<           "user_id": 481119220561874,
<           "version": 1
<         },
<         "query_plan": null,
<         "type": "TABLE",
<         "updated_at": "2024-11-21T05:07:13Z"
<       },
<       "width": 1
<     }
<   ]
< }
05:27 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Linting dashboard_id=bccf44e3-374d-4775-aac8-24398e68d193: ucx_DefCp_ra78a57feb
05:27 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Saving 1 linting problems...
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:assess_dashboards] [hive_metastore.dummy_saq7k.directfs_in_queries] found 1 new records for directfs_in_queries
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:assess_dashboards] [hive_metastore.dummy_saq7k.used_tables_in_queries] found 1 new records for used_tables_in_queries
05:27 INFO [databricks.labs.ucx:estimate_table_size_for_migration] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/estimate_table_size_for_migration.log
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.table_size] fetching table_size inventory
05:27 DEBUG [databricks.labs.lsql.backends:estimate_table_size_for_migration] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`table_size`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.table_size] crawling new set of snapshot data for table_size
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.tables] fetching tables inventory
05:27 DEBUG [databricks.labs.lsql.backends:estimate_table_size_for_migration] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`tables`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.tables] crawling new set of snapshot data for tables
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] Scanning dummy_sep3k
05:27 DEBUG [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] Starting 1 tasks in 16 threads
05:27 WARNING [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] failed-table-crawl: listing tables from database -> dummy_sep3k : [SCHEMA_NOT_FOUND] The schema `dummy_sep3k` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 269, in deco
    raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [SCHEMA_NOT_FOUND] The schema `dummy_sep3k` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
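
Note that the crawler downgrades this to a `failed-table-crawl` warning rather than failing the task: the schema was dropped (here, most likely by a concurrent test's cleanup) between being enumerated and being listed. A minimal sketch of that tolerant-listing pattern, assuming a simplified `_list_tables`; the names are illustrative, not the real UCX signature:

```python
import logging

from pyspark.errors import AnalysisException

logger = logging.getLogger(__name__)


def list_tables_tolerantly(external_catalog, database: str) -> list:
    """List tables in a database, treating a vanished schema as an empty result."""
    try:
        return list(external_catalog.listTables(database))
    except AnalysisException as e:
        # e.g. [SCHEMA_NOT_FOUND] when the schema was dropped mid-crawl
        logger.warning(f"failed-table-crawl: listing tables from database -> {database}: {e}")
        return []
```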
05:27 INFO [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] listing tables 1/1, rps: 12.413/sec
05:27 INFO [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.081725
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] Finished scanning 0 tables
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.tables] found 0 new records for tables
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_saq7k.table_size] found 0 new records for table_size
05:27 INFO [databricks.labs.ucx:guess_external_locations] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/guess_external_locations.log
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.external_locations] fetching external_locations inventory
05:27 DEBUG [databricks.labs.lsql.backends:guess_external_locations] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`external_locations`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.external_locations] crawling new set of snapshot data for external_locations
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.tables] fetching tables inventory
05:27 DEBUG [databricks.labs.lsql.backends:guess_external_locations] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`tables`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.tables] crawling new set of snapshot data for tables
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] Scanning dummy_sep3k
05:27 DEBUG [databricks.labs.blueprint.parallel:guess_external_locations] Starting 1 tasks in 16 threads
05:27 WARNING [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] failed-table-crawl: listing tables from database -> dummy_sep3k : [SCHEMA_NOT_FOUND] The schema `dummy_sep3k` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 269, in deco
    raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [SCHEMA_NOT_FOUND] The schema `dummy_sep3k` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
05:27 INFO [databricks.labs.blueprint.parallel:guess_external_locations] listing tables 1/1, rps: 11.820/sec
05:27 INFO [databricks.labs.blueprint.parallel:guess_external_locations] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.086021
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] Finished scanning 0 tables
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.tables] found 0 new records for tables
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_saq7k.external_locations] found 0 new records for external_locations
05:27 INFO [databricks.labs.ucx:crawl_grants] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/crawl_grants.log
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.grants] fetching grants inventory
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`grants`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.grants] crawling new set of snapshot data for grants
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.tables] fetching tables inventory
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`tables`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.tables] crawling new set of snapshot data for tables
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Scanning dummy_sep3k
05:27 DEBUG [databricks.labs.blueprint.parallel:crawl_grants] Starting 1 tasks in 16 threads
05:27 ERROR [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Failed to list databases due to Py4JSecurityException. Update or reinstall UCX to resolve this issue.
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/functools.py", line 995, in __get__
    val = self.func(instance)
          ^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 530, in _external_catalog
    return self._spark._jsparkSession.sharedState().externalCatalog()  # pylint: disable=protected-access
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 263, in deco
    return f(*a, **kw)
           ^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o406.sharedState. Trace:
py4j.security.Py4JSecurityException: Method public org.apache.spark.sql.internal.SharedState org.apache.spark.sql.SparkSession.sharedState() is not whitelisted on class class org.apache.spark.sql.SparkSession
	at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
	at py4j.Gateway.invoke(Gateway.java:305)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:840)


05:27 INFO [databricks.labs.blueprint.parallel:crawl_grants] listing tables 1/1, rps: 131.337/sec
05:27 INFO [databricks.labs.blueprint.parallel:crawl_grants] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.008822
05:27 INFO [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Finished scanning 0 tables
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.tables] found 0 new records for tables
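
The `Py4JSecurityException` above is a different failure mode: on clusters where the Py4J security manager is active (shared access mode), `SparkSession.sharedState()` is not whitelisted, so the JVM-side metastore catalog that the crawler prefers is unreachable. A hedged sketch of a SQL fallback (illustrative only; the log's own advice to update or reinstall UCX is the real fix):

```python
from py4j.protocol import Py4JError


def list_tables(spark, database: str) -> list[str]:
    try:
        # Fast path from the traceback above: JVM-side metastore catalog.
        # Blocked by the Py4J security manager on shared-access clusters.
        state = spark._jsparkSession.sharedState()  # pylint: disable=protected-access
        jseq = state.externalCatalog().listTables(database)
        return [jseq.apply(i) for i in range(jseq.size())]  # Scala Seq -> Python list
    except Py4JError:
        # Whitelisted fallback: stay in plain SQL through the session.
        return [row.tableName for row in spark.sql(f"SHOW TABLES IN `{database}`").collect()]
```
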
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.udfs] fetching udfs inventory
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_saq7k`.`udfs`
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.udfs] crawling new set of snapshot data for udfs
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][execute] USE CATALOG `hive_metastore`;
05:27 DEBUG [databricks.labs.ucx.hive_metastore.udfs:crawl_grants] [hive_metastore.dummy_sep3k] listing udfs
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW USER FUNCTIONS FROM `hive_metastore`.`dummy_sep3k`;
05:27 WARNING [databricks.labs.ucx.hive_metastore.udfs:crawl_grants] Schema hive_metastore.dummy_sep3k no longer existed
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.udfs] found 0 new records for udfs
05:27 DEBUG [databricks.labs.blueprint.parallel:crawl_grants] Starting 4 tasks in 16 threads
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON CATALOG `hive_metastore`
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON ANY FILE 
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION 
05:27 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON DATABASE `hive_metastore`.`dummy_sep3k`
05:27 ERROR [databricks.labs.ucx.hive_metastore.grants:crawl_grants] Couldn't fetch grants for object DATABASE hive_metastore.dummy_sep3k: An error occurred while calling o406.sql.
: org.apache.spark.SparkSecurityException: Database(dummy_sep3k,Some(hive_metastore)) does not exist.
	at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79)
	at scala.Option.getOrElse(Option.scala:189)
	at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79)
	at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75)
	at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226)
	at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
	at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:181)
	at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:192)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:387)
	at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:387)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:193)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:387)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:453)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:738)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:334)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:205)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:675)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:383)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1125)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:379)
	at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:329)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:377)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:431)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:426)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:379)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:375)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:481)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:426)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:436)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:426)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:288)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:285)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:383)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:132)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1280)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1280)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:123)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:969)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:933)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:992)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:840)

05:27 INFO [databricks.labs.blueprint.parallel:crawl_grants] listing grants for hive_metastore 4/4, rps: 11.426/sec
05:27 INFO [databricks.labs.blueprint.parallel:crawl_grants] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:00.351583
05:27 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_saq7k.grants] found 1 new records for grants
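
Worth noting how the run still reports "100% results available (4/4)" even though one of the four grant listings errored: each task's failure is caught and logged, and only the successful results flow into the snapshot (the single grants record above). Roughly what databricks.labs.blueprint.parallel's gather does, sketched with the standard library (not the library's actual code):

```python
from concurrent.futures import ThreadPoolExecutor


def gather(tasks, num_threads: int = 16):
    """Run independent tasks in a pool, separating results from per-task errors."""

    def run(task):
        try:
            return task(), None
        except Exception as e:  # noqa: BLE001 - isolate any single task failure
            return None, e

    results, errors = [], []
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        for result, error in pool.map(run, tasks):
            if error is None:
                results.append(result)
            else:
                errors.append(error)
    return results, errors
```

The caller can then persist the partial results and raise a ManyError-style aggregate if `errors` is non-empty, which matches what the failure summaries in this issue show.
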
05:27 INFO [databricks.labs.ucx:crawl_permissions] UCX v0.50.1+720241121051244 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.pFhc/logs/assessment/run-1036948703931009-0/crawl_permissions.log
05:27 INFO [databricks.labs.ucx.assessment.workflows:crawl_permissions] Skipping permission crawling as legacy permission migration is disabled.
05:27 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ----------
05:27 INFO [databricks.labs.ucx.install] Deleting UCX v0.50.1+720241121051244 from https://DATABRICKS_HOST
05:27 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_saq7k
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=488006148415052, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=787913020673530, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=106355500733655, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=964372515693499, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=580680049988311, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=426873636246364, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=329035031235111, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1086250968066529, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=251198955203898, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=753437250657406, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=605129136231567, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1064290974752324, as it is no longer needed
05:27 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=548098714908813, as it is no longer needed
05:27 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:27 INFO [databricks.labs.ucx.install] Deleting secret scope
05:27 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw2] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #280


❌ test_running_real_assessment_job_ext_hms: databricks.labs.blueprint.parallel.ManyError: Detected 2 failures: InternalError: Failed to list databases due to Py4JSecurityException. Update or reinstall UCX to resolve this issue., Unknown: assess_pipelines: OSError: [Errno 5] Input/output error: '/Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/README.md.lock' (20m14.536s)
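
Of the two failures reported here, the OSError (`[Errno 5] Input/output error` on a `README.md.lock` under `/Workspace`) looks like transient flakiness on the FUSE-mounted workspace filesystem rather than a code bug. If it keeps recurring, a retry wrapper along these lines could paper over it (hypothetical sketch, not something this log shows UCX doing):

```python
import time


def with_io_retry(fn, attempts: int = 3, base_delay: float = 1.0):
    """Retry a callable on transient OSError, e.g. EIO from the /Workspace FUSE mount."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2**attempt)  # simple exponential backoff


# usage (lock_path is hypothetical): with_io_retry(lambda: open(lock_path, "x"))
```
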
... (skipped 400124 bytes)
 \"spark_conf.spark.hadoop.javax... (803 more bytes)",
<   "description": "Custom cluster policy for Unity Catalog Migration (UCX)",
<   "is_TEST_SCHEMA": false,
<   "name": "Unity Catalog Migration (dummy_sdtha) (0a330eb5-dd51-4d97-b6e4-c474356b1d5d)",
<   "policy_id": "000F2E704E8230F0"
< }
05:26 DEBUG [databricks.sdk:assess_jobs] GET /api/2.0/policies/clusters/get?policy_id=000F2E704E8230F0
< 200 OK
< {
<   "created_at_timestamp": 1732252032000,
<   "definition": "{\"spark_version\": {\"type\": \"fixed\", \"value\": \"16.0.x-scala2.12\"}, \"spark_conf.spark.hadoop.javax... (803 more bytes)",
<   "description": "Custom cluster policy for Unity Catalog Migration (UCX)",
<   "is_TEST_SCHEMA": false,
<   "name": "Unity Catalog Migration (dummy_sdtha) (0a330eb5-dd51-4d97-b6e4-c474356b1d5d)",
<   "policy_id": "000F2E704E8230F0"
< }
[... the identical GET /api/2.0/policies/clusters/get?policy_id=000F2E704E8230F0 request and 200 OK response repeat 17 more times ...]
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:assess_jobs] [hive_metastore.dummy_sdtha.jobs] found 59 new records for jobs
05:26 INFO [databricks.labs.ucx:assess_workflows] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/assess_workflows.log
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list
< 200 OK
< {
<   "has_more": true,
<   "jobs": [
<     {
<       "created_time": 1732252855371,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 375782975821524,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "dummy-jK3Hi",
<         "tags": {
<           "RemoveAfter": "2024112207"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (19 additional elements)"
<   ],
<   "next_page_token": "CAEow7i7krUyMJLcjLeHw20="
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=375782975821524
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=652606796772608
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=699729189967897
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=617772007871753
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=603920134411537
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=27118679064911
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=349814724028095
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1052732832957508
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=990228115660926
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=216600176050141
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=836423790870481
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=718000796121160
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=969913570639479
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1085796820173824
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=639364042566608
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1089601177143695
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=455795842377788
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=858989082094947
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=964191129701919
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=481691166780946
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list?page_token=CAEow7i7krUyMJLcjLeHw20=
< 200 OK
< {
<   "has_more": true,
<   "jobs": [
<     {
<       "created_time": 1732252719604,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 1026698072293330,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[OVOL] migrate-tables",
<         "tags": {
<           "RemoveAfter": "2024112209",
<           "version": "v0.50.1+720241122051833"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (19 additional elements)"
<   ],
<   "next_page_token": "CAEonMO5krUyMOX_soTtpNQB",
<   "prev_page_token": "CAAo9LO7krUyMNL_j6_suOkB"
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1026698072293330
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=986533835741072
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=461732327793193
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=32628696302936
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=638027033703006
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=971056330135422
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=683074699740278
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=652235873715035
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=927756565876427
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=97978819832749
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=870144919540250
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=254063148394033
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=266746182391033
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1122948191426082
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=825067008664161
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1001802015886096
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=71299472669391
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=587164698388225
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=802684617383906
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=933652079624165
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list?page_token=CAEonMO5krUyMOX_soTtpNQB
< 200 OK
< {
<   "has_more": true,
<   "jobs": [
<     {
<       "created_time": 1732252688348,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 3584022694909,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[IOQM] migrate-data-reconciliation",
<         "tags": {
<           "RemoveAfter": "2024112209",
<           "version": "v0.50.1+720241122051805"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (19 additional elements)"
<   ],
<   "next_page_token": "CAEoktSmkrUyMMu7wdWCxRo=",
<   "prev_page_token": "CAAo3L-5krUyMP2XmcOnaA=="
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=3584022694909
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=273606788111002
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=136407440762647
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1006072252815533
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=573251780842546
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=19909822359120
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1017599333528951
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1049473743549185
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=96309291471861
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=95590317771301
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=908914918229672
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=158637781452735
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=57351708208425
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=495878641078113
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=445581436899177
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=576347590665439
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=487967856149373
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=158890630391480
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=61137511780157
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=116720747437515
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/list?page_token=CAEoktSmkrUyMMu7wdWCxRo=
< 200 OK
< {
<   "has_more": false,
<   "jobs": [
<     {
<       "created_time": 1732252379113,
<       "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "job_id": 118405263079872,
<       "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<       "settings": {
<         "email_notifications": {},
<         "format": "MULTI_TASK",
<         "max_concurrent_runs": 1,
<         "name": "[QCEP] validate-groups-permissions",
<         "tags": {
<           "RemoveAfter": "2024112209",
<           "version": "v0.50.1+720241122051248"
<         },
<         "timeout_seconds": 0
<       }
<     },
<     "... (14 additional elements)"
<   ],
<   "prev_page_token": "CAAo6c-mkrUyMMDzlP2F9ho="
< }
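
The pages above are stitched together via `next_page_token`; with the Python SDK the caller never touches the token, because `jobs.list()` returns an auto-paginating iterator. Standard SDK usage, for reference:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment

# jobs.list() follows next_page_token under the hood and yields every job.
for job in w.jobs.list():
    name = job.settings.name if job.settings else None
    print(job.job_id, name)
```
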
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=118405263079872
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=675259360849584
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=863323240184881
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=235355368297948
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=823577848945095
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=580602565800703
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1089187274465298
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1024858039202568
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=936975718698707
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=1056936420508344
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Skipping job_id=824558880345542
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=489279085504437: dummy-j1Lqb
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=246531855140129: dummy-jhoUY
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=730186695541121: dummy-jEWft
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job_id=84706922026799: dummy-jFLUZ
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Running 4 linting tasks in parallel...
05:26 DEBUG [databricks.labs.blueprint.parallel:assess_workflows] Starting 4 tasks in 16 threads
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=246531855140129
< 200 OK
< {
<   "created_time": 1732252022041,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 246531855140129,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jhoUY",
<     "tags": {
<       "RemoveAfter": "2024112207"
<     },
<     "tasks": [
<       {
<         "description": "IRav",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "run_if": "ALL_SUCCESS",
<         "spark_python_task": {
<           "python_file": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py"
<         },
<         "task_key": "GbUK",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering GbUK entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=489279085504437
< 200 OK
< {
<   "created_time": 1732252023266,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 489279085504437,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-j1Lqb",
<     "tags": {
<       "RemoveAfter": "2024112207"
<     },
<     "tasks": [
<       {
<         "description": "jgbR",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "run_if": "ALL_SUCCESS",
<         "spark_python_task": {
<           "python_file": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py"
<         },
<         "task_key": "Tpl5",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering Tpl5 entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=730186695541121
< 200 OK
< {
<   "created_time": 1732252020464,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 730186695541121,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jEWft",
<     "tags": {
<       "RemoveAfter": "2024112207"
<     },
<     "tasks": [
<       {
<         "description": "sZMG",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "notebook_task": {
<           "notebook_path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f",
<           "source": "WORKSPACE"
<         },
<         "run_if": "ALL_SUCCESS",
<         "task_key": "Jf5p",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering Jf5p entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py
< 200 OK
< {
<   "created_at": 1732252022864,
<   "modified_at": 1732252022864,
<   "object_id": 3399139300163634,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py",
<   "resource_id": "3399139300163634"
< }
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f
< 200 OK
< {
<   "created_at": 1732252020003,
<   "language": "PYTHON",
<   "modified_at": 1732252020003,
<   "object_id": 3399139300163626,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f",
<   "resource_id": "3399139300163626"
< }
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py
< 200 OK
< {
<   "created_at": 1732252021405,
<   "modified_at": 1732252021405,
<   "object_id": 3399139300163628,
<   "object_type": "FILE",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py",
<   "resource_id": "3399139300163628"
< }
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.1/jobs/get?job_id=84706922026799
< 200 OK
< {
<   "created_time": 1732252019286,
<   "creator_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "job_id": 84706922026799,
<   "run_as_owner": true,
<   "run_as_user_name": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<   "settings": {
<     "email_notifications": {},
<     "format": "MULTI_TASK",
<     "max_concurrent_runs": 1,
<     "name": "dummy-jFLUZ",
<     "tags": {
<       "RemoveAfter": "2024112207"
<     },
<     "tasks": [
<       {
<         "description": "B7me",
<         "email_notifications": {},
<         "new_cluster": {
<           "CLOUD_ENV_attributes": {
<             "availability": "ON_DEMAND_AZURE"
<           },
<           "enable_elastic_disk": true,
<           "node_type_id": "Standard_D4pds_v6",
<           "num_workers": 1,
<           "spark_version": "16.0.x-scala2.12"
<         },
<         "notebook_task": {
<           "notebook_path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f",
<           "source": "WORKSPACE"
<         },
<         "run_if": "ALL_SUCCESS",
<         "task_key": "aGy3",
<         "timeout_seconds": 0
<       }
<     ],
<     "timeout_seconds": 0,
<     "webhook_notifications": {}
<   }
< }
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Discovering aGy3 entrypoint: /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f
[... the get-status calls for dummy-ZYpN-ra78a5804f.py, dummy-3eEp-ra78a5804f (twice) and dummy-nXV4-ra78a5804f.py repeat with responses identical to those above ...]
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/get-status?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f
< 200 OK
< {
<   "created_at": 1732252018952,
<   "language": "PYTHON",
<   "modified_at": 1732252018952,
<   "object_id": 3399139300163624,
<   "object_type": "NOTEBOOK",
<   "path": "/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f",
<   "resource_id": "3399139300163624"
< }
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
[... two more identical get-status request/response pairs for dummy-7Psd-ra78a5804f ...]
05:26 DEBUG [databricks.sdk:assess_workflows] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f&direct_download=true&format=AUTO
< 200 OK
< [raw stream]
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting Tpl5 dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py>
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting Jf5p dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f>
05:26 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-ZYpN-ra78a5804f.py:0 [TEST_SCHEMA-format-changed-in-dbr8] The TEST_SCHEMA format changed in Databricks Runtime 8.0, from Parquet to Delta
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting GbUK dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py>
05:26 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-3eEp-ra78a5804f:1 [TEST_SCHEMA-format-changed-in-dbr8] The TEST_SCHEMA format changed in Databricks Runtime 8.0, from Parquet to Delta
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Linting aGy3 dependency: Dependency</Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f>
05:26 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-7Psd-ra78a5804f:1 [direct-filesystem-access] The use of direct filesystem references is deprecated: dbfs://mnt/notebook/
05:26 WARNING [databricks.labs.ucx.source_code.jobs:assess_workflows] Found job problems:
/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/dummy-nXV4-ra78a5804f.py:0 [direct-filesystem-access] The use of direct filesystem references is deprecated: dbfs://mnt/file/
05:26 INFO [databricks.labs.blueprint.parallel:assess_workflows] linting workflows 4/4, rps: 1.456/sec
05:26 INFO [databricks.labs.blueprint.parallel:assess_workflows] Finished 'linting workflows' tasks: 100% results available (4/4). Took 0:00:02.748081
05:26 INFO [databricks.labs.ucx.source_code.jobs:assess_workflows] Saving 4 linting problems...
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:assess_workflows] [hive_metastore.dummy_sdtha.directfs_in_paths] found 2 new records for directfs_in_paths
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:assess_workflows] [hive_metastore.dummy_sdtha.used_tables_in_paths] found 2 new records for used_tables_in_paths
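
The two direct-filesystem-access findings flag the `dbfs:/mnt/...` references the dummy jobs were seeded with; under Unity Catalog those become table or volume reads. The kind of rewrite the linter is nudging towards (table and volume names hypothetical, `spark` as provided in a notebook or job):

```python
# Flagged pattern: direct filesystem reference through a DBFS mount.
df = spark.read.format("parquet").load("dbfs:/mnt/file/some_data")

# UC-friendly alternatives: a governed table, or a Unity Catalog volume path.
df = spark.read.table("main.my_schema.some_table")
df = spark.read.format("parquet").load("/Volumes/main/my_schema/my_volume/some_data")
```
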
05:26 INFO [databricks.labs.ucx:assess_dashboards] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/assess_dashboards.log
05:26 DEBUG [databricks.sdk:assess_dashboards] GET /api/2.0/preview/sql/dashboards/55571b3b-63b6-4894-bff5-bae401c12653
< 200 OK
< {
<   "can_edit": true,
<   "color_palette": null,
<   "created_at": "2024-11-22T05:07:04Z",
<   "dashboard_filters_enabled": false,
<   "data_source_id": null,
<   "id": "55571b3b-63b6-4894-bff5-bae401c12653",
<   "is_archived": false,
<   "is_draft": false,
<   "is_favorite": false,
<   "name": "ucx_Ddg1u_ra78a5804f",
<   "options": {
<     "folder_node_internal_name": "tree/3399139300163640",
<     "folder_node_status": "ACTIVE",
<     "parent": "folders/3865756826903956",
<     "run_as_role": "owner"
<   },
<   "parent": "folders/3865756826903956",
<   "permission_tier": "CAN_MANAGE",
<   "run_as_role": "owner",
<   "run_as_service_principal_id": null,
<   "slug": "ucx_ddg1u_ra78a5804f",
<   "tags": [
<     "original_dashboard_tag"
<   ],
<   "updated_at": "2024-11-22T05:07:05Z",
<   "user": {
<     "email": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<     "id": 481119220561874,
<     "name": "labs-account-admin-identity"
<   },
<   "user_id": 481119220561874,
<   "version": 2,
<   "warehouse_id": null,
<   "widgets": [
<     {
<       "created_at": "2024-11-22T05:07:05Z",
<       "dashboard_id": "55571b3b-63b6-4894-bff5-bae401c12653",
<       "id": "68f13ff6-ec25-4a22-9c07-b6eab37c85ea",
<       "options": {
<         "position": {
<           "autoHeight": null,
<           "col": 0,
<           "maxSizeX": null,
<           "maxSizeY": null,
<           "minSizeX": null,
<           "minSizeY": null,
<           "row": 0,
<           "sizeX": 3,
<           "sizeY": 3
<         },
<         "title": ""
<       },
<       "text": "",
<       "updated_at": "2024-11-22T05:07:05Z",
<       "visualization": {
<         "created_at": "2024-11-22T05:07:04Z",
<         "description": "",
<         "id": "55871dab-60be-4777-b778-c7e10b5b6b9b",
<         "name": "",
<         "options": {
<           "columns": [
<             {
<               "allowSearch": true,
<               "name": "id",
<               "title": "id"
<             }
<           ],
<           "condensed": true,
<           "itemsPerPage": 1,
<           "version": 2,
<           "withRowNumber": false
<         },
<         "query": {
<           "created_at": "2024-11-22T05:07:04Z",
<           "data_source_id": null,
<           "description": "Test query",
<           "id": "9219ad4e-a7e7-4fa3-b067-5dff090dc5d4",
<           "is_draft": false,
<           "is_safe": true,
<           "name": "dummy_query_Q1cXt",
<           "options": {
<             "catalog": null,
<             "folder_node_internal_name": "tree/3399139300163637",
<             "folder_node_status": "ACTIVE",
<             "parameters": [],
<             "parent": "folders/3865756826903956",
<             "run_as_role": "owner",
<             "schema": null,
<             "visualization_control_order": []
<           },
<           "query": "SELECT * from parquet.`dbfs://mnt/foo2/bar2`",
<           "run_as_role": "owner",
<           "run_as_service_principal_id": null,
<           "tags": [
<             "{\"key\": \"RemoveAfter\", \"value\": \"2024112207\"}"
<           ],
<           "updated_at": "2024-11-22T05:07:04Z",
<           "user_id": 481119220561874,
<           "version": 1
<         },
<         "query_plan": null,
<         "type": "TABLE",
<         "updated_at": "2024-11-22T05:07:04Z"
<       },
<       "width": 1
<     }
<   ]
< }
05:26 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Linting dashboard_id=55571b3b-63b6-4894-bff5-bae401c12653: ucx_Ddg1u_ra78a5804f
05:26 DEBUG [databricks.sdk:assess_dashboards] GET /api/2.0/preview/sql/dashboards/8bad5389-15a7-4ddd-8bba-8d397dd4a0c0
< 200 OK
< {
<   "can_edit": true,
<   "color_palette": null,
<   "created_at": "2024-11-22T05:07:05Z",
<   "dashboard_filters_enabled": false,
<   "data_source_id": null,
<   "id": "8bad5389-15a7-4ddd-8bba-8d397dd4a0c0",
<   "is_archived": false,
<   "is_draft": false,
<   "is_favorite": false,
<   "name": "ucx_DAfDy_ra78a5804f",
<   "options": {
<     "folder_node_internal_name": "tree/3399139300163644",
<     "folder_node_status": "ACTIVE",
<     "parent": "folders/3865756826903956",
<     "run_as_role": "owner"
<   },
<   "parent": "folders/3865756826903956",
<   "permission_tier": "CAN_MANAGE",
<   "run_as_role": "owner",
<   "run_as_service_principal_id": null,
<   "slug": "ucx_dafdy_ra78a5804f",
<   "tags": [
<     "original_dashboard_tag"
<   ],
<   "updated_at": "2024-11-22T05:07:06Z",
<   "user": {
<     "email": "0a330eb5-dd51-4d97-b6e4-c474356b1d5d",
<     "id": 481119220561874,
<     "name": "labs-account-admin-identity"
<   },
<   "user_id": 481119220561874,
<   "version": 2,
<   "warehouse_id": null,
<   "widgets": [
<     {
<       "created_at": "2024-11-22T05:07:06Z",
<       "dashboard_id": "8bad5389-15a7-4ddd-8bba-8d397dd4a0c0",
<       "id": "bb652e99-4876-4aac-a5f7-ea19e2caf9ef",
<       "options": {
<         "position": {
<           "autoHeight": null,
<           "col": 0,
<           "maxSizeX": null,
<           "maxSizeY": null,
<           "minSizeX": null,
<           "minSizeY": null,
<           "row": 0,
<           "sizeX": 3,
<           "sizeY": 3
<         },
<         "title": ""
<       },
<       "text": "",
<       "updated_at": "2024-11-22T05:07:06Z",
<       "visualization": {
<         "created_at": "2024-11-22T05:07:05Z",
<         "description": "",
<         "id": "d19b44e1-e118-4ede-95a2-0aa23b3a438f",
<         "name": "",
<         "options": {
<           "columns": [
<             {
<               "allowSearch": true,
<               "name": "id",
<               "title": "id"
<             }
<           ],
<           "condensed": true,
<           "itemsPerPage": 1,
<           "version": 2,
<           "withRowNumber": false
<         },
<         "query": {
<           "created_at": "2024-11-22T05:07:05Z",
<           "data_source_id": null,
<           "description": "Test query",
<           "id": "b50fcaee-1fde-47cf-8b7e-c7a7c4841652",
<           "is_draft": false,
<           "is_safe": true,
<           "name": "dummy_query_QQI34",
<           "options": {
<             "catalog": null,
<             "folder_node_internal_name": "tree/3399139300163641",
<             "folder_node_status": "ACTIVE",
<             "parameters": [],
<             "parent": "folders/3865756826903956",
<             "run_as_role": "owner",
<             "schema": null,
<             "visualization_control_order": []
<           },
<           "query": "SELECT * from my_schema.my_table",
<           "run_as_role": "owner",
<           "run_as_service_principal_id": null,
<           "tags": [
<             "{\"key\": \"RemoveAfter\", \"value\": \"2024112207\"}"
<           ],
<           "updated_at": "2024-11-22T05:07:05Z",
<           "user_id": 481119220561874,
<           "version": 1
<         },
<         "query_plan": null,
<         "type": "TABLE",
<         "updated_at": "2024-11-22T05:07:05Z"
<       },
<       "width": 1
<     }
<   ]
< }
05:26 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Linting dashboard_id=8bad5389-15a7-4ddd-8bba-8d397dd4a0c0: ucx_DAfDy_ra78a5804f
05:26 INFO [databricks.labs.ucx.source_code.queries:assess_dashboards] Saving 1 linting problems...
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:assess_dashboards] [hive_metastore.dummy_sdtha.directfs_in_queries] found 1 new records for directfs_in_queries
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:assess_dashboards] [hive_metastore.dummy_sdtha.used_tables_in_queries] found 1 new records for used_tables_in_queries
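
For the dashboard crawl, the first query (`SELECT * from parquet.`dbfs://mnt/foo2/bar2``) lands in directfs_in_queries while the second (`my_schema.my_table`) lands in used_tables_in_queries. A crude illustration of the first detection (the real linter parses the SQL properly; this regex is only a sketch):

```python
import re

# Sketch only: shows the shape of a direct-filesystem-access check on query text.
DIRECT_FS = re.compile(r"(?:dbfs:/|s3://|abfss://|gs://)[^\s`]*")


def find_direct_fs(query: str) -> list[str]:
    return DIRECT_FS.findall(query)


print(find_direct_fs("SELECT * from parquet.`dbfs://mnt/foo2/bar2`"))
# -> ['dbfs://mnt/foo2/bar2']
```
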
05:26 INFO [databricks.labs.ucx:estimate_table_size_for_migration] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/estimate_table_size_for_migration.log
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.table_size] fetching table_size inventory
05:26 DEBUG [databricks.labs.lsql.backends:estimate_table_size_for_migration] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`table_size`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.table_size] crawling new set of snapshot data for table_size
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.tables] fetching tables inventory
05:26 DEBUG [databricks.labs.lsql.backends:estimate_table_size_for_migration] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`tables`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.tables] crawling new set of snapshot data for tables
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] Scanning dummy_skooy
05:26 DEBUG [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] Starting 1 tasks in 16 threads
05:26 WARNING [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] failed-table-crawl: listing tables from database -> dummy_skooy : [SCHEMA_NOT_FOUND] The schema `dummy_skooy` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 269, in deco
    raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [SCHEMA_NOT_FOUND] The schema `dummy_skooy` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
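Note: the failed-table-crawl warning above is a fixture race rather than a crawler bug: the test's dummy schema is dropped while the assessment tasks are still running, and the crawl deliberately continues and finishes with 0 tables. A minimal sketch of that tolerance pattern, assuming a hypothetical helper around the external catalog (not the actual UCX code):

    from pyspark.errors import AnalysisException

    def list_tables_tolerant(external_catalog, database: str) -> list:
        # listTables() raises AnalysisException with SCHEMA_NOT_FOUND when the
        # schema was dropped between enumeration and listing; treat as empty.
        try:
            return list(external_catalog.listTables(database))
        except AnalysisException as e:
            if "SCHEMA_NOT_FOUND" in str(e):
                return []  # schema vanished mid-crawl; nothing to report
            raise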
05:26 INFO [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] listing tables 1/1, rps: 15.419/sec
05:26 INFO [databricks.labs.blueprint.parallel:estimate_table_size_for_migration] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.066184
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:estimate_table_size_for_migration] Finished scanning 0 tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.tables] found 0 new records for tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:estimate_table_size_for_migration] [hive_metastore.dummy_sdtha.table_size] found 0 new records for table_size
05:26 INFO [databricks.labs.ucx:guess_external_locations] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/guess_external_locations.log
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.external_locations] fetching external_locations inventory
05:26 DEBUG [databricks.labs.lsql.backends:guess_external_locations] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`external_locations`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.external_locations] crawling new set of snapshot data for external_locations
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.tables] fetching tables inventory
05:26 DEBUG [databricks.labs.lsql.backends:guess_external_locations] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`tables`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.tables] crawling new set of snapshot data for tables
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] Scanning dummy_skooy
05:26 DEBUG [databricks.labs.blueprint.parallel:guess_external_locations] Starting 1 tasks in 16 threads
05:26 WARNING [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] failed-table-crawl: listing tables from database -> dummy_skooy : [SCHEMA_NOT_FOUND] The schema `dummy_skooy` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 269, in deco
    raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [SCHEMA_NOT_FOUND] The schema `dummy_skooy` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a catalog, verify the current_schema() output, or qualify the name with the correct catalog.
To tolerate the error on drop use DROP SCHEMA IF EXISTS. SQLSTATE: 42704
05:26 INFO [databricks.labs.blueprint.parallel:guess_external_locations] listing tables 1/1, rps: 8.975/sec
05:26 INFO [databricks.labs.blueprint.parallel:guess_external_locations] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.113742
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:guess_external_locations] Finished scanning 0 tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.tables] found 0 new records for tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:guess_external_locations] [hive_metastore.dummy_sdtha.external_locations] found 0 new records for external_locations
05:26 INFO [databricks.labs.ucx:setup_tacl] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/setup_tacl.log
05:26 INFO [databricks.labs.ucx:crawl_grants] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/crawl_grants.log
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.grants] fetching grants inventory
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`grants`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.grants] crawling new set of snapshot data for grants
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.tables] fetching tables inventory
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`tables`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.tables] crawling new set of snapshot data for tables
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Scanning dummy_skooy
05:26 DEBUG [databricks.labs.blueprint.parallel:crawl_grants] Starting 1 tasks in 16 threads
05:26 ERROR [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Failed to list databases due to Py4JSecurityException. Update or reinstall UCX to resolve this issue.
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 559, in _list_tables
    return list(self._iterator(self._external_catalog.listTables(database)))
                               ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/functools.py", line 995, in __get__
    val = self.func(instance)
          ^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.12/site-packages/databricks/labs/ucx/hive_metastore/tables.py", line 530, in _external_catalog
    return self._spark._jsparkSession.sharedState().externalCatalog()  # pylint: disable=protected-access
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 263, in deco
    return f(*a, **kw)
           ^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o406.sharedState. Trace:
py4j.security.Py4JSecurityException: Method public org.apache.spark.sql.internal.SharedState org.apache.spark.sql.SparkSession.sharedState() is not whitelisted on class class org.apache.spark.sql.SparkSession
	at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
	at py4j.Gateway.invoke(Gateway.java:305)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:840)
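Note: unlike the SCHEMA_NOT_FOUND races above, this failure is a Py4J whitelist restriction: SparkSession.sharedState() is blocked by the Py4J security manager on this cluster, so the JVM external-catalog path is unusable there. A hedged sketch of a SQL-only fallback (hypothetical helper; not necessarily how UCX should resolve this):

    from py4j.protocol import Py4JError

    def list_tables_with_fallback(spark, database: str) -> list[str]:
        # Prefer the fast JVM external-catalog path; fall back to SHOW TABLES
        # when the Py4J security manager blocks sharedState() on this cluster.
        try:
            catalog = spark._jsparkSession.sharedState().externalCatalog()
            return list(catalog.listTables(database))
        except Py4JError:
            rows = spark.sql(f"SHOW TABLES IN `{database}`").collect()
            return [row.tableName for row in rows]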


05:26 INFO [databricks.labs.blueprint.parallel:crawl_grants] listing tables 1/1, rps: 146.092/sec
05:26 INFO [databricks.labs.blueprint.parallel:crawl_grants] Finished 'listing tables' tasks: 100% results available (1/1). Took 0:00:00.007989
05:26 INFO [databricks.labs.ucx.hive_metastore.tables:crawl_grants] Finished scanning 0 tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.tables] found 0 new records for tables
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.udfs] fetching udfs inventory
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SELECT * FROM `hive_metastore`.`dummy_sdtha`.`udfs`
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.udfs] crawling new set of snapshot data for udfs
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][execute] USE CATALOG `hive_metastore`;
05:26 DEBUG [databricks.labs.ucx.hive_metastore.udfs:crawl_grants] [hive_metastore.dummy_skooy] listing udfs
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW USER FUNCTIONS FROM `hive_metastore`.`dummy_skooy`;
05:26 WARNING [databricks.labs.ucx.hive_metastore.udfs:crawl_grants] Schema hive_metastore.dummy_skooy no longer existed
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.udfs] found 0 new records for udfs
05:26 DEBUG [databricks.labs.blueprint.parallel:crawl_grants] Starting 4 tasks in 16 threads
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON CATALOG `hive_metastore`
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON ANY FILE 
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION 
05:26 DEBUG [databricks.labs.lsql.backends:crawl_grants] [spark][fetch] SHOW GRANTS ON DATABASE `hive_metastore`.`dummy_skooy`
05:26 ERROR [databricks.labs.ucx.hive_metastore.grants:crawl_grants] Couldn't fetch grants for object DATABASE hive_metastore.dummy_skooy: An error occurred while calling o406.sql.
: org.apache.spark.SparkSecurityException: Database(dummy_skooy,Some(hive_metastore)) does not exist.
	at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79)
	at scala.Option.getOrElse(Option.scala:189)
	at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79)
	at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75)
	at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226)
	at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
	at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:181)
	at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:192)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$5(QueryExecution.scala:387)
	at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$4(QueryExecution.scala:387)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:193)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$3(QueryExecution.scala:387)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:453)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:738)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:334)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:205)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:675)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$2(QueryExecution.scala:383)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1125)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:379)
	at org.apache.spark.sql.execution.QueryExecution.withMVTagsIfNecessary(QueryExecution.scala:329)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$eagerlyExecute$1(QueryExecution.scala:377)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:431)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$8$1.applyOrElse(QueryExecution.scala:426)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:379)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:375)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:481)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$8(QueryExecution.scala:426)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:436)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:426)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:288)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:285)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:383)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:132)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1280)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1280)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:123)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:969)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1273)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:933)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:992)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
	at java.base/java.lang.Thread.run(Thread.java:840)
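Note: the SHOW GRANTS failure is the same drop-while-crawling race seen earlier; the grant crawler logs the error and finishes with 0 new records. A minimal sketch of that skip-if-gone behavior, with a hypothetical fetch callable standing in for the SQL backend:

    def grants_if_exists(fetch, object_type: str, object_key: str):
        # SHOW GRANTS fails with "... does not exist." for objects dropped
        # between listing and grant lookup; return None so callers skip them.
        try:
            return list(fetch(f"SHOW GRANTS ON {object_type} {object_key}"))
        except Exception as e:
            if "does not exist" in str(e):
                return None
            raise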

05:26 INFO [databricks.labs.blueprint.parallel:crawl_grants] listing grants for hive_metastore 4/4, rps: 16.118/sec
05:26 INFO [databricks.labs.blueprint.parallel:crawl_grants] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:00.249819
05:26 DEBUG [databricks.labs.ucx.framework.crawlers:crawl_grants] [hive_metastore.dummy_sdtha.grants] found 0 new records for grants
05:26 INFO [databricks.labs.ucx:crawl_permissions] UCX v0.50.1+720241122051248 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.qceP/logs/assessment/run-847776580427997-0/crawl_permissions.log
05:26 INFO [databricks.labs.ucx.assessment.workflows:crawl_permissions] Skipping permission crawling as legacy permission migration is disabled.
05:26 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ----------
05:26 INFO [databricks.labs.ucx.install] Deleting UCX v0.50.1+720241122051248 from https://DATABRICKS_HOST
05:26 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_sdtha
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=824558880345542, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1056936420508344, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=936975718698707, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1024858039202568, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1089187274465298, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=580602565800703, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=823577848945095, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=235355368297948, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=863323240184881, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=675259360849584, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=118405263079872, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=116720747437515, as it is no longer needed
05:26 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=61137511780157, as it is no longer needed
05:26 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:26 INFO [databricks.labs.ucx.install] Deleting secret scope
05:26 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw4] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #281

❌ test_running_real_assessment_job_ext_hms: databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 0004CE93075B14E9. (8m8.422s)
databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 0004CE93075B14E9.
[gw4] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_D12ty_ra78a580b3: https://DATABRICKS_HOST/sql/dashboards/1e8b7ef0-95bd-4618-b481-9b8ac9b356b7
05:06 INFO [tests.integration.conftest] Dashboard Created ucx_DCM9R_ra78a580b3: https://DATABRICKS_HOST/sql/dashboards/008e66dc-ea36-489e-8635-16854434bca9
05:06 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.vak3/config.yml) doesn't exist.
05:06 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
05:06 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
05:06 INFO [databricks.labs.ucx.install] Fetching installations...
05:06 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
05:06 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
05:06 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
05:12 INFO [databricks.labs.ucx.install] Installing UCX v0.50.1+920241123051216
05:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
05:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
05:12 ERROR [databricks.labs.blueprint.parallel] installing components task failed: Can't find a cluster policy with id: 0004CE93075B14E9.
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/parallel.py", line 158, in inner
    return func(*args, **kwargs), None
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/installer/workflows.py", line 617, in create_jobs
    self._deploy_workflow(workflow_name, settings)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/installer/workflows.py", line 761, in _deploy_workflow
    new_job = self._ws.jobs.create(**settings)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/service/jobs.py", line 5809, in create
    res = self._api.do('POST', '/api/2.1/jobs/create', body=body, headers=headers)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/core.py", line 77, in do
    return self._api_client.do(method=method,
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/_base_client.py", line 172, in do
    response = call(method,
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 54, in wrapper
    raise err
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/retries.py", line 33, in wrapper
    return func(*args, **kwargs)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/sdk/_base_client.py", line 278, in _perform
    raise error from None
databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 0004CE93075B14E9.
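Note: this second failure mode is different: the installer created its cluster policy at 05:06, the policy was gone by the time jobs.create ran at 05:12 (possibly removed by a concurrent cleanup, consistent with the later "UCX Policy already deleted" message), and every workflow deployment referencing the stale id then failed. A hedged pre-check sketch using the SDK's cluster_policies.get (recreating the policy on a miss would be a further assumption):

    from databricks.sdk import WorkspaceClient
    from databricks.sdk.errors.platform import ResourceDoesNotExist

    def policy_exists(ws: WorkspaceClient, policy_id: str) -> bool:
        # Confirm the installer's policy id is still valid before deploying
        # workflows whose job clusters reference it.
        try:
            ws.cluster_policies.get(policy_id=policy_id)
            return True
        except ResourceDoesNotExist:
            return False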
05:13 INFO [databricks.labs.ucx.install] Creating dashboards...
05:13 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
05:13 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
05:13 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
05:13 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
05:13 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
05:13 ERROR [databricks.labs.blueprint.parallel] More than half 'installing components' tasks failed: 0% results available (0/2). Took 0:01:34.253713
05:13 INFO [databricks.labs.ucx.install] Deleting UCX v0.50.1+920241123051216 from https://DATABRICKS_HOST
05:13 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_skek3
05:13 INFO [databricks.labs.ucx.install] Deleting cluster policy
05:13 ERROR [databricks.labs.ucx.install] UCX Policy already deleted
05:13 INFO [databricks.labs.ucx.install] Deleting secret scope
05:13 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw4] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Running from nightly #282
