Make event cache size based #3294
Conversation
* Introduce a random processing queue split policy for testing purposes
* Refactor the existing split policy implementation
* Add lookAheadFunc to stuckTaskSplitPolicy so that the timer queue can control the new queue's maxLevel
LGTM overall. Some minor comments.
@@ -131,14 +136,28 @@ func New(maxSize int, opts *Options) Cache {
 		opts = &Options{}
 	}

-	return &lru{
+	cache := &lru{
To me, size and count are either/or: if the cache is size based, we don't care about the count, as long as it stays within the size limit.
Agreed: if maxSize is provided the cache is size based only, and count based otherwise. Even when size based, let's still enforce an upper limit on the count to prevent unbounded growth, for safety's sake.
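The behavior discussed above could be sketched roughly as follows. This is a minimal illustration, not Cadence's actual `cache` package: the names (`newLRU`, `entry`, the byte-size accounting via string lengths) are hypothetical, and the key idea is that eviction is driven by total size when `maxSize > 0`, with a count cap kept as a safety backstop.

```go
package main

import (
	"container/list"
	"fmt"
)

// entry holds a cached value and its reported size in bytes.
type entry struct {
	key  string
	val  string
	size int
}

// lru is a hypothetical sketch: evicts by total byte size when maxSize > 0,
// by element count otherwise. Even in size-based mode, maxCount guards
// against unbounded growth from many tiny entries.
type lru struct {
	maxSize  int // 0 means count based
	maxCount int
	curSize  int
	order    *list.List // front = most recently used
	items    map[string]*list.Element
}

func newLRU(maxSize, maxCount int) *lru {
	return &lru{
		maxSize:  maxSize,
		maxCount: maxCount,
		order:    list.New(),
		items:    make(map[string]*list.Element),
	}
}

func (c *lru) Put(key, val string) {
	if el, ok := c.items[key]; ok {
		c.curSize -= el.Value.(*entry).size
		c.order.Remove(el)
		delete(c.items, key)
	}
	e := &entry{key: key, val: val, size: len(key) + len(val)}
	c.items[key] = c.order.PushFront(e)
	c.curSize += e.size
	c.evict()
}

func (c *lru) Get(key string) (string, bool) {
	el, ok := c.items[key]
	if !ok {
		return "", false
	}
	c.order.MoveToFront(el)
	return el.Value.(*entry).val, true
}

// evict drops least recently used entries until both limits are respected.
func (c *lru) evict() {
	for c.order.Len() > 0 {
		overSize := c.maxSize > 0 && c.curSize > c.maxSize
		overCount := c.order.Len() > c.maxCount
		if !overSize && !overCount {
			return
		}
		back := c.order.Back()
		e := back.Value.(*entry)
		c.curSize -= e.size
		c.order.Remove(back)
		delete(c.items, e.key)
	}
}

func main() {
	c := newLRU(20, 100) // size based: 20 bytes total, count cap 100
	c.Put("a", "12345678") // 9 bytes
	c.Put("b", "12345678") // 18 bytes total, still under limit
	c.Put("c", "12345678") // 27 > 20, so the LRU entry "a" is evicted
	_, okA := c.Get("a")
	_, okC := c.Get("c")
	fmt.Println(okA, okC)
}
```

Note that this sketch charges each entry its string length; a real implementation would need a caller-supplied size function, since Go cannot cheaply measure arbitrary values.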
* Make event cache size based
What changed?
Make event cache size based
Why?
Events portion of #2350
How did you test it?
Performance tests
Potential risks
Increased latency for history-events-related APIs