Memory leak in readMIMEHeader #1650
Hey @nuttert, thanks for finding this. I'll let you know once we have a fix.
BTW, here is how I create the JetStream context and subscription:

js, err := nc.JetStream()
............
if config.SyncHandler {
    jetSub, err = js.Subscribe(channel, func(msg *nats.Msg) {
        c.HandleMessage(ctx, msg, sub)
    }, opt...)
} else {
    jetSub, err = js.Subscribe(channel, func(msg *nats.Msg) {
        go c.HandleMessage(ctx, msg, sub)
    }, opt...)
}
Hey @piotrpio, any news? We have to restart our client once a week to flush out the memory allocated by the nats.Msg objects. Fortunately this is still a small service and we can afford that without much impact.
Hey @nuttert, sorry for taking so long on this. I have a hard time reproducing it - I can get to a profile similar to the one you provided, but everything gets cleaned up nicely by GC. What sizes of payloads/headers are you usually consuming? Do you discard the messages after handling them (i.e. aren't they kept in memory)? And finally, which Go version are you using?
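For anyone who wants to check the "cleaned up nicely by GC" behaviour against their own handler, here is a minimal sketch (mine, not from the thread) that forces a collection and compares heap usage before and after a batch of messages has been handled; consumeBatch is a placeholder name for running the subscription handler over a known number of messages:

package main

import (
	"fmt"
	"runtime"
)

// heapInUse returns the bytes currently held in in-use heap spans.
func heapInUse() uint64 {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return m.HeapInuse
}

// consumeBatch is a placeholder: in a real check it would run the
// js.Subscribe handler from the snippet above over N messages.
func consumeBatch() {}

func main() {
	before := heapInUse()

	consumeBatch()

	runtime.GC() // force a collection so only retained memory remains
	after := heapInUse()
	fmt.Printf("heap in use: before=%d bytes, after=%d bytes\n", before, after)
}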
Observed behavior
I saw that my client was using 67 GB of memory. First I checked pprof and got the following result:
Also, in the client logs I sometimes see this:
context deadline exceeded
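The report does not say how the profile was captured; for reference, here is a minimal sketch of exposing net/http/pprof so a heap profile can be pulled from the running client (the port is an arbitrary assumption, not from the report):

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers the /debug/pprof/* handlers on the default mux
)

func main() {
	// The heap profile can then be fetched with:
	//   go tool pprof http://localhost:6060/debug/pprof/heap
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	// ... the NATS client would run here ...
	select {}
}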
Expected behavior
I expect the memory to be released after a message has been processed.
Server and client version
Client libs:
Server:
Host environment
I pinned my client to 2 cores and did not set any memory limits (the host has 400+ GB of memory).
The server is pinned to 8 cores and does not consume much (Virt 13.5G, RSS 3929M).
Steps to reproduce
I created a lot of subscriptions from the client side within a single connection.
After 2 days the memory had increased even though the consumers were passive (they did not send a lot of messages).
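A rough, self-contained sketch of that setup, assuming the same legacy JetStream API shown above; the server URL, subject pattern, subscription count, and the assumption that a stream already covers those subjects are illustrative, not taken from the report:

package main

import (
	"fmt"
	"log"

	"github.com/nats-io/nats.go"
)

func main() {
	nc, err := nats.Connect(nats.DefaultURL)
	if err != nil {
		log.Fatal(err)
	}
	defer nc.Drain()

	js, err := nc.JetStream()
	if err != nil {
		log.Fatal(err)
	}

	// Many mostly idle subscriptions on a single connection; a stream covering
	// "bench.sub.*" is assumed to exist already.
	for i := 0; i < 500; i++ {
		subject := fmt.Sprintf("bench.sub.%d", i)
		if _, err := js.Subscribe(subject, func(msg *nats.Msg) {
			_ = msg.Ack()
		}, nats.ManualAck()); err != nil {
			log.Printf("subscribe %s: %v", subject, err)
		}
	}

	// Leave the process running and watch RSS / heap profiles over time.
	select {}
}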