copying from s3 cache buffers entire download in memory, causes segfault #12403

andrewhamon opened this issue Feb 3, 2025 · 0 comments

Describe the bug

When copying a large path from an s3-based cache, the nix binary appears to buffer the entire download into memory.

When the path is large enough (above approximately 3.5 GB), this also reliably causes nix to segfault.

I can only reproduce this with an s3-based Nix cache. If I use an HTTP cache, memory usage stays low and constant.
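
As a hypothetical illustration of the two cases being compared (the store URLs below are placeholders, not the actual caches from this report, and $path stands for the large store path from the steps that follow):

nix copy --to local --from 's3://example-nix-cache?region=us-east-1' $path   # s3 cache: memory grows with the download
nix copy --to local --from 'https://cache.example.com' $path                 # HTTP cache: memory stays low and constant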

Steps To Reproduce

cd $(mktemp -d)
dd if=/dev/urandom of=./random_4g.bin bs=1M count=4096
path=$(nix store add-path . --name large-random)

# copy path to s3 store

nix store delete $path

# attempt to re-import
nix copy --to local --from <the s3 cache> $path

# experience segfault
74861 segmentation fault  nix copy --to local /nix/store/rv559vmhs7751xizmfnxk5bwyjhfizpa-large-random
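
(Hypothetical sketch, not part of the original report.) One way to observe the buffering itself, rather than just the final segfault, is to watch the resident set size of the nix process while the copy runs; the s3 URL is a placeholder:

nix copy --to local --from 's3://example-nix-cache?region=us-east-1' $path &
pid=$!
# On an s3 cache, the RSS (in KiB) keeps growing roughly in step with the amount downloaded
while kill -0 $pid 2>/dev/null; do ps -o rss= -p $pid; sleep 1; done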

Expected behavior

Nix uses a fixed amount of memory and does not segfault.

Metadata

nix-env (Nix) 2.22.0

I have also experienced this with Nix 2.25.0.

Additional context

