I need to get/set values from an LFU cache directly, rather than through a function decorator. The need is as follows:
```python
def slow_function(*args, **kwargs):
    cache = choose_cache_out_of_many(*args)
    found = cache.get(*args, **kwargs)
    if found:
        return found
    result = slow_code()
    cache.set(result, *args, **kwargs)
    return result
```
This pattern of having multiple caches and only knowing which one to leverage inside the function that is to be cached means I cannot use a decorator.
How can I access `memoization` caches directly?
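For illustration, here is a minimal sketch of the kind of directly accessible LFU cache the pseudocode above assumes. This is a hand-rolled stand-in, not an API exposed by `memoization`; the class name, eviction strategy, and key scheme are all assumptions made only to show the desired get/set interface:

```python
from collections import defaultdict


class SimpleLFUCache:
    """Hand-rolled LFU cache with an explicit get/set interface.

    Not part of the memoization library -- only a stand-in that shows
    the direct-access pattern used by slow_function above.
    """

    def __init__(self, max_size=128):
        self.max_size = max_size
        self._values = {}              # key -> cached result
        self._freq = defaultdict(int)  # key -> access count

    def get(self, *args, **kwargs):
        key = self._make_key(args, kwargs)
        if key in self._values:
            self._freq[key] += 1
            return self._values[key]
        return None

    def set(self, result, *args, **kwargs):
        key = self._make_key(args, kwargs)
        if key not in self._values and len(self._values) >= self.max_size:
            # Evict the least frequently used entry (a linear scan is fine for a sketch).
            evict = min(self._freq, key=self._freq.get)
            del self._values[evict]
            del self._freq[evict]
        self._values[key] = result
        self._freq[key] += 1

    @staticmethod
    def _make_key(args, kwargs):
        # Arguments must be hashable for this simple key scheme.
        return (args, tuple(sorted(kwargs.items())))
```

With one instance per bucket (e.g. `caches = [SimpleLFUCache() for _ in range(16)]`), `choose_cache_out_of_many` can pick a cache and the rest of `slow_function` works as written.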
I have 16 different caches, and I need to keep them separate so that overflowing cache 3 does not evict values from cache 1, and so on.
I was able to come up with a hacky way of handling this, but a natural implementation would still be welcome. Pseudocode:
```python
import memoization as mem

def __init__(self):
    # One independently decorated copy of get_next_state per cache.
    self.caches = []
    for _ in range(16):
        new_cache = mem.cached(custom_key_maker=self._custom_keys)(self.get_next_state)
        self.caches.append(new_cache)

def get_next_state(self, arg1, arg2, arg3, use_cache=True):
    if use_cache:
        cache_idx = get_cache_id(arg1)
        cache = self.caches[cache_idx]
        # Re-enter through the decorated wrapper; use_cache=False stops the recursion.
        return cache(arg1, arg2, arg3, use_cache=False)
    # ... actual slow computation of the next state goes here ...
```
This creates a once-recursive call that uses a cache from the list of caches; I also had to leverage `custom_key_maker` to ignore the `use_cache` bit.
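For reference, a sketch of what that key maker might look like, assuming `custom_key_maker` is called with the same arguments as the wrapped function (the `_custom_keys` name comes from the pseudocode above; the exact call signature is an assumption):

```python
def _custom_keys(self, arg1, arg2, arg3, use_cache=True):
    # Drop use_cache from the key so the re-entrant call
    # (use_cache=False) maps to the same cache entry as the
    # original call (use_cache=True).
    return (arg1, arg2, arg3)
```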
Yes, upvote for sure. Any sophisticated caching usage quickly escapes the boundaries imposed by decorators.