Commit

fix: Inserting multiple small batches now works, even if the second batch triggers rebinding the buffer due to element size. Previously in this scenario not all values already inserted were correctly copied into the new buffer. This caused strings to be replaced with `null` bytes.
pacman82 committed Dec 9, 2024
1 parent 471bd54 commit e9f807b
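The commit message describes a buffer that is rebound (grown) when a later batch contains elements larger than the current element size, and a bug where values already inserted were not copied into the new, larger buffer. The following is a minimal, hypothetical Python sketch of that logic — it is not the actual arrow-odbc implementation (which lives in Rust), and the class and method names are invented for illustration:

```python
class StringColumnBuffer:
    """Hypothetical model of a fixed-element-size string column buffer."""

    def __init__(self, element_size: int = 0):
        self.element_size = element_size
        self.values: list[bytes] = []

    def insert_batch(self, batch: list[bytes]) -> None:
        longest = max((len(v) for v in batch), default=0)
        if longest > self.element_size:
            # A batch with longer elements forces rebinding to a larger size.
            self._rebind(longest)
        self.values.extend(v.ljust(self.element_size, b"\x00") for v in batch)

    def _rebind(self, new_size: int) -> None:
        # The fix described in the commit: re-pad values already inserted
        # into the larger buffer. The bug failed to carry them over, so
        # earlier strings ended up as runs of null bytes.
        self.values = [v.ljust(new_size, b"\x00") for v in self.values]
        self.element_size = new_size


buf = StringColumnBuffer()
buf.insert_batch([b"ab", b"cd"])      # element size becomes 2
buf.insert_batch([b"longer value"])   # triggers rebinding to size 12
assert buf.values[0].rstrip(b"\x00") == b"ab"  # earlier value survives
```

With the buggy `_rebind` (one that discarded `self.values`), the final assertion would fail: the first batch's strings would read back as null bytes, which is exactly the symptom the two re-enabled tests below exercise.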
Showing 2 changed files with 65 additions and 67 deletions.
129 changes: 65 additions & 64 deletions Cargo.lock


3 changes: 0 additions & 3 deletions tests/test_arrow_odbc.py
@@ -769,8 +769,6 @@ def iter_record_batches():
assert "a\n1\n2\n3\n1\n2\n3\n" == actual.decode("utf8")


-@pytest.mark.xfail(reason="We do not know why this fails yet, maybe the second batch overwrites "
-                          "values for the first one. We'll try to reproduce it upstream in arrow-odbc")
def test_insert_multiple_small_batches():
"""
Insert multiple batches into the database, using one roundtrip.
@@ -934,7 +932,6 @@ def test_into_pyarrow_record_batch_reader_transfers_ownership():
next(iter(arrow_reader))


-@pytest.mark.xfail(reason="This likely fails for the same reason as test_multiple_small_batches")
def test_chunked_arrays_of_variable_length_strings():
"""
See issue: <https://github.com/pacman82/arrow-odbc-py/issues/115>
