pubsub: pull message slow after update #2540
Comments
Hi, thanks for taking the time to file an issue. What's your publish rate, and how long does it take for you to handle an individual message? Also, there's been a change recently where the Pub/Sub version is no longer tied to the general …
I am really sorry for the confusion. Yes, I am using pubsub 1.4. The publishing rate is around 150k-200k messages per second, with 3 processes each with 4 CPUs, but none of the processes are using full CPU or memory. Handling is good at the start, but after a while it processes only around half of the messages.
Can you try increasing …
@hongalex We've recently had a similar situation, where an update to the latest pubsub version (v1.4.0, although we saw the same in v1.3.1) caused our subscribers to stop pulling messages. We've experienced this in services which have a reasonably high number of subscribers (> 70). In these cases, …

We then tried manually pulling messages (via a standalone SubscriberClient), making the gRPC calls ourselves. In this case, we were always receiving a fraction of the messages we were asking for. Eventually, we found out that increasing the number of connections available to the SubscriberClient (a bigger connection pool) makes the issue go away (probably meaning the connection was saturated).

Which brings us to this: why is the library leaving the …?

The way we've worked around this is by instantiating:
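(The code block that originally followed this comment isn't captured above. As a rough sketch only, assuming the workaround enlarges the gRPC connection pool handed to the standalone SubscriberClient via option.WithGRPCConnectionPool; the pool size and names below are placeholders, not the commenter's actual code.)

```go
package main

import (
	"context"
	"log"

	pubsubv1 "cloud.google.com/go/pubsub/apiv1"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// Sketch: give the low-level subscriber client a larger gRPC connection
	// pool so pulls for many subscriptions are not funneled through a single
	// saturated connection. The pool size of 16 is a placeholder.
	subClient, err := pubsubv1.NewSubscriberClient(ctx,
		option.WithGRPCConnectionPool(16),
	)
	if err != nil {
		log.Fatalf("NewSubscriberClient: %v", err)
	}
	defer subClient.Close()

	// ... issue Pull/Acknowledge calls with subClient (see the pull sketch
	// further down the thread) ...
}
```

The same option can also be passed to the high-level pubsub.NewClient constructor, since both accept the standard client options.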
Would you consider this a typical use case? Or would you expect most people to stick with just the …
@jesushernandez Yes, we have around 200+ subscriptions and I can see that it is not pulling from some of the subscriptions at all.
@jesushernandez can you please tell me whether you needed to set this config higher after that numConnection change?
@smit-aterlo We are not using the … It used to be much simpler with the streaming pull via subscription.Receive, but we were having the issues we discussed above with it.
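(The code referenced here isn't included in the thread. The following is a minimal, self-contained sketch of manual synchronous pulling with the low-level cloud.google.com/go/pubsub/apiv1 client; the subscription name, batch size, and error handling are illustrative placeholders, not the commenter's actual implementation.)

```go
package main

import (
	"context"
	"log"

	pubsubv1 "cloud.google.com/go/pubsub/apiv1"
	pubsubpb "google.golang.org/genproto/googleapis/pubsub/v1"
)

func main() {
	ctx := context.Background()

	// Low-level client: streaming pull, flow control, and ack management are
	// NOT handled for you, unlike subscription.Receive.
	subClient, err := pubsubv1.NewSubscriberClient(ctx)
	if err != nil {
		log.Fatalf("NewSubscriberClient: %v", err)
	}
	defer subClient.Close()

	subName := "projects/my-project/subscriptions/my-subscription" // placeholder

	for {
		// Unary Pull: ask for up to MaxMessages; the server may return fewer.
		resp, err := subClient.Pull(ctx, &pubsubpb.PullRequest{
			Subscription: subName,
			MaxMessages:  100,
		})
		if err != nil {
			log.Fatalf("Pull: %v", err)
		}

		ackIDs := make([]string, 0, len(resp.ReceivedMessages))
		for _, m := range resp.ReceivedMessages {
			// ... process m.Message.Data ...
			ackIDs = append(ackIDs, m.AckId)
		}
		if len(ackIDs) == 0 {
			continue
		}

		// Acknowledge processed messages so they are not redelivered.
		if err := subClient.Acknowledge(ctx, &pubsubpb.AcknowledgeRequest{
			Subscription: subName,
			AckIds:       ackIDs,
		}); err != nil {
			log.Fatalf("Acknowledge: %v", err)
		}
	}
}
```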
OK, thank you very much for the help @jesushernandez.
No problem. Use that code as inspiration, as it is still under active development and lacks proper testing. In any case, this is an interim solution. I'm still waiting to hear whether there's anything else we could have tried, or anything we're doing wrong.
Yes, that is true. Meanwhile, @hongalex, if you have any solution with …
Related #2593
Hiya, apologies for the delay. Can y'all try updating to …
I can confirm our subscriptions are now pulling messages much faster and no messages are stuck. Thank you, @hongalex!
Yes, it resolved the issue we were having. Thank you for the help!
Client
PubSub v1.4.0
Environment
GKE
Go Environment
$ go version
go version go1.14.4 linux/amd64
Code
I used the following config when I was using v1.2.0, and it was reading all the messages with high throughput, but with the same config and Pub/Sub v1.4.0 (yes, I upgraded recently) the read throughput is really slow.
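(The config file itself isn't reproduced here. The following is a minimal sketch of the kind of high-level subscriber setup and ReceiveSettings tuning such a config typically controls; the project ID, subscription ID, and values are placeholders, not the reporter's actual configuration.)

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/pubsub"
)

func main() {
	ctx := context.Background()

	client, err := pubsub.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		log.Fatalf("pubsub.NewClient: %v", err)
	}
	defer client.Close()

	sub := client.Subscription("my-subscription") // placeholder subscription ID

	// Flow-control knobs for the high-level streaming pull; the values shown
	// here are illustrative only.
	sub.ReceiveSettings.NumGoroutines = 4             // parallel streaming pulls
	sub.ReceiveSettings.MaxOutstandingMessages = 1000 // unacked messages in flight
	sub.ReceiveSettings.MaxOutstandingBytes = 1 << 30 // ~1 GiB of unacked data

	// Receive blocks until ctx is done or an unrecoverable error occurs.
	err = sub.Receive(ctx, func(ctx context.Context, m *pubsub.Message) {
		// ... process m.Data ...
		m.Ack()
	})
	if err != nil {
		log.Fatalf("Receive: %v", err)
	}
}
```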
Expected behavior
The message read rate is very high, which leads to high throughput.
Actual behavior
The message read rate is very slow and some of the subscriptions are not pulling any messages.