Memory allocation issue while reading many large files #103

Open · enp opened this issue Feb 2, 2024 · 2 comments

enp commented Feb 2, 2024

Hi,

Reading https://github.com/yandex-cloud/geesefs?tab=readme-ov-file#memory-limit about the ENOMEM error in cases where read-ahead-large * process count > memory-limit:

this looks like a problematic limitation for production use, where we can't limit the number of processes.

Maybe it's possible to check whether the read-ahead-large buffer can actually be allocated, and if not, skip the allocation so that at least slow reading without read-ahead still works?
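
For illustration, here is a minimal sketch (in Go, since geesefs is written in Go) of what that fallback could look like: try to reserve the read-ahead buffer under the memory limit, and degrade to a plain read if it doesn't fit. The pool type and names (tryAlloc, readAheadLarge, etc.) are made up for this sketch and are not the actual geesefs API:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// Illustrative only: a memory-limited pool where an oversized read-ahead
// allocation is skipped instead of failing the whole read with ENOMEM.

var errOverLimit = errors.New("allocation would exceed memory-limit")

type pool struct {
	mu    sync.Mutex
	used  int64
	limit int64
}

// tryAlloc reserves n bytes, or reports an error if the limit would be exceeded.
func (p *pool) tryAlloc(n int64) ([]byte, error) {
	p.mu.Lock()
	defer p.mu.Unlock()
	if p.used+n > p.limit {
		return nil, errOverLimit
	}
	p.used += n
	return make([]byte, n), nil
}

func (p *pool) free(b []byte) {
	p.mu.Lock()
	p.used -= int64(len(b))
	p.mu.Unlock()
}

// read tries to use read-ahead, but falls back to a slow direct read
// of just the requested range instead of returning ENOMEM.
func read(p *pool, readAheadLarge int64) string {
	buf, err := p.tryAlloc(readAheadLarge)
	if err != nil {
		return "direct read without read-ahead"
	}
	defer p.free(buf)
	return "read with read-ahead"
}

func main() {
	p := &pool{limit: 4 << 20}  // 4 MiB limit for the sketch
	fmt.Println(read(p, 8<<20)) // over the limit -> falls back to a direct read
	fmt.Println(read(p, 1<<20)) // fits -> uses read-ahead
}
```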

vitalif (Collaborator) commented Feb 6, 2024

Hi, I'm not sure it's easy to implement it that way, but we can think about other ways to solve this problem.
For example, we could dynamically increase the memory limit when there are too many readers, or just allow readers to wait for buffers to be unlocked.
That is, right now a reader may get ENOMEM when, at the moment of the read, the memory cache is full and all data in the cache is marked as required by some in-flight read or write requests. It should be possible to simply make the reader wait until that data is unmarked...
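
For illustration, a minimal sketch of the "wait instead of ENOMEM" idea using a condition variable: when the cache is full of buffers still pinned by in-flight requests, a new reader blocks until some memory is released. The cache type and names here are made up and are not the actual geesefs buffer code:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

type cache struct {
	mu    sync.Mutex
	cond  *sync.Cond
	used  int64
	limit int64
}

func newCache(limit int64) *cache {
	c := &cache{limit: limit}
	c.cond = sync.NewCond(&c.mu)
	return c
}

// alloc blocks until n bytes fit under the limit instead of returning ENOMEM.
func (c *cache) alloc(n int64) []byte {
	c.mu.Lock()
	defer c.mu.Unlock()
	for c.used+n > c.limit {
		c.cond.Wait() // sleep until free() signals released memory
	}
	c.used += n
	return make([]byte, n)
}

func (c *cache) free(b []byte) {
	c.mu.Lock()
	c.used -= int64(len(b))
	c.mu.Unlock()
	c.cond.Broadcast() // wake readers waiting for memory
}

func main() {
	c := newCache(1 << 20)
	first := c.alloc(1 << 20) // takes the whole limit
	go func() {
		time.Sleep(100 * time.Millisecond)
		c.free(first) // simulates a request unpinning its buffer
	}()
	second := c.alloc(512 << 10) // blocks until `first` is freed
	fmt.Println("second allocation succeeded:", len(second))
}
```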

enp (Author) commented Feb 6, 2024

Any of the solutions above sounds better than ENOMEM, really :)
