Environment
Delta-rs version: 0.17.4
Binding: python (pyarrow engine)
Environment:
Bug
What happened:
From a Python application running on a Databricks cluster, I want to write to an append-only Delta table.
The function is called as follows:
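The original call isn't preserved in this copy of the issue; a minimal sketch of what it looks like, assuming the deltalake Python package and a pandas DataFrame, would be:

```python
import pandas as pd
from deltalake import write_deltalake

# Hypothetical sample data; the real application supplies its own records.
df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

# Append to a Delta table stored on a Unity Catalog volume (a FUSE mount).
write_deltalake(
    "/Volumes/catalog/schema/volume_path/table_path",
    df,
    mode="append",
)
```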
However, I am getting the below error:
OSError: Generic LocalFileSystem error: Unable to copy file from /Volumes/catalog/schema/volume_path/table_path/_delta_log/_commit_e964ab56-f56c-403a-b06d-fe2b6bcabf9d.json.tmp to /Volumes/catalog/schema/volume_path/table_path/_delta_log/00000000000000000000.json: Function not implemented (os error 38)
What you expected to happen:
Since Databricks supports copy/rename/delete operations, I would expect it to work.
As far as I know, Databricks uses a local file system API that emulates a filesystem on top of cloud storage.
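Not part of the original report, but a quick probe can show which filesystem operations the mount actually implements. The "os error 38" in the traceback is ENOSYS ("Function not implemented"), and as far as I can tell the local object store backend performs copy-if-not-exists via a hard link, which many FUSE mounts reject:

```python
import errno
import os

base = "/Volumes/catalog/schema/volume_path"  # assumed writable volume path from the report

src = os.path.join(base, "probe_src.txt")
dst = os.path.join(base, "probe_dst.txt")

with open(src, "w") as f:
    f.write("probe")

# A plain rename is usually supported by the FUSE mount.
os.rename(src, dst)

# A hard link is what an atomic copy-if-not-exists relies on; on FUSE
# mounts it often fails with ENOSYS ("Function not implemented", error 38).
try:
    os.link(dst, src)
    print("hard link supported")
except OSError as e:
    print(f"hard link failed: {e} (ENOSYS: {e.errno == errno.ENOSYS})")
finally:
    for p in (src, dst):
        if os.path.exists(p):
            os.remove(p)
```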
How to reproduce it:
I made the notebook below to reproduce the error. It needs to be run on a Databricks Runtime.
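The notebook itself isn't included in this copy; a minimal cell reproducing the failure under the same assumptions (deltalake 0.17.4 on a Databricks cluster, a writable Unity Catalog volume) would be:

```python
# Databricks notebook cell; assumes deltalake==0.17.4 is installed and
# that the /Volumes path below points at a writable Unity Catalog volume.
import pandas as pd
from deltalake import write_deltalake

table_path = "/Volumes/catalog/schema/volume_path/table_path"
df = pd.DataFrame({"id": [1], "value": ["x"]})

# The first commit fails while publishing the _delta_log entry:
# OSError: Generic LocalFileSystem error: Unable to copy file ...
# Function not implemented (os error 38)
write_deltalake(table_path, df, mode="append")
```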
More details:
AFAIK, Databricks volumes are FUSE-mounted, so this is not a bug. If you want to write to mounted storage that doesn't support CopyIfNotExists, you can pass this to the writer:
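The exact snippet the comment refers to isn't preserved here; if I'm reading delta-rs's handling of mounted storage right, the option in question is most likely allow_unsafe_rename, along these lines (my sketch, not the commenter's exact code):

```python
import pandas as pd
from deltalake import write_deltalake

df = pd.DataFrame({"id": [1], "value": ["x"]})

# allow_unsafe_rename lets the writer fall back to a plain, non-atomic
# rename, which FUSE mounts such as /Volumes do support.
write_deltalake(
    "/Volumes/catalog/schema/volume_path/table_path",
    df,
    mode="append",
    storage_options={"allow_unsafe_rename": "true"},
)
```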