I am using arrow2 to create a simple record batch that contains one Dictionary array. When writing it to Parquet with Snappy or gzip compression, the output cannot be read back with pyarrow. If I remove either the compression or the dictionary, there is no problem.
pyarrow errors:
Corrupt snappy compressed data
GZipCodec failed: incorrect header check
By the way, I keep track of performance benchmarks here and here. If you find something that does not fit the general numbers, please let us know: we are always looking for edge cases and the like ^_^